
Introduction to Probability and Statistics

Probability & Statistics for Engineers & Scientists, 9th Ed.


2009

Handout #4

Lingzhou Xue

The pdf file for this class is available on the class web page.

Chapter 5
Some Discrete Probability Distributions

Name               Notation        P.D.F. f(x) = P(X = x)
Uniform            —               1/k,  x = x1, . . . , xk
Bernoulli          Ber(p)          p^x (1 − p)^(1−x),  x = 0, 1
Binomial           Bin(n, p)       C(n, x) p^x (1 − p)^(n−x),  x = 0, 1, . . . , n
Geometric          Geo(p)          p q^(x−1),  x = 1, 2, 3, . . .
Negative Binomial  NB(k, p)        C(x−1, k−1) p^k q^(x−k),  x = k, k+1, . . .
Multinomial        —               (n; x1, . . . , xk) p1^x1 p2^x2 · · · pk^xk
Hypergeometric     Hyp(N, n, k)    C(k, x) C(N−k, n−x) / C(N, n)
Poisson            Poi(λt)         e^(−λt) (λt)^x / x!,  x = 0, 1, 2, . . .

Here C(n, x) denotes the binomial coefficient "n choose x" and (n; x1, . . . , xk) denotes the multinomial coefficient n!/(x1! x2! · · · xk!).

Discrete Uniform Distribution

If the random variable X assumes the values x1, x2, . . . , xk with equal probabilities, then the discrete uniform distribution is given by

f(x; k) = P(X = x) = 1/k,  x = x1, . . . , xk,

μ = E(X) = (1/k) Σ_{i=1}^{k} xi,

and

σ² = Var(X) = (1/k) Σ_{i=1}^{k} (xi − μ)².

The random variable assumes each of its values with equal probability. We have used the notation f(x; k) instead of f(x) to indicate that the uniform distribution depends on the parameter k.

[Figure: probability density function for a discrete uniform random variable, f(x; 10) = 1/10, x = 0, 1, . . . , 9.]

Example 1
When a fair die is tossed, each element of the sample space S = {1, 2, 3, 4, 5, 6} occurs with probability 1/6. Therefore, we have a uniform distribution with

f(x; 6) = 1/6,  x = 1, 2, 3, 4, 5, 6.

μ = E(X) = (1/6) Σ_{i=1}^{6} xi = 3.5;   (1)

σ² = Var(X) = (1/6) Σ_{i=1}^{6} (xi − 3.5)² = 35/12.   (2)
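Equations (1) and (2) can be checked directly from the definitions of μ and σ² (a quick sketch in Python, not part of the original text):

```python
# Discrete uniform distribution on the faces of a fair die (Example 1):
# mu = (1/k) * sum(x_i), sigma^2 = (1/k) * sum((x_i - mu)**2).
faces = [1, 2, 3, 4, 5, 6]
k = len(faces)
mean = sum(faces) / k
var = sum((x - mean) ** 2 for x in faces) / k
print(mean, var)  # 3.5 and 35/12 ≈ 2.9167
```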

Parameter
A parameter is a value, usually unknown (and which therefore
has to be estimated), used to represent a certain population
characteristic. For example, the population mean is a parameter
that is often used to indicate the average value of a quantity.

Bernoulli, Binomial and Multinomial Distributions

Binomial Experiment
An experiment often consists of repeated trials, each with two
possible outcomes that may be labelled success or failure. Each
trial is called a Bernoulli trial.

Bernoulli Process
Strictly speaking, the Bernoulli process must possess the following properties:

1. The experiment consists of n repeated trials.

2. Each trial results in an outcome that may be classified as a success or a failure.

3. The probability of success, denoted by p, remains constant from trial to trial.

4. The repeated trials are independent.

Binomial Experiment
A binomial experiment is a probability experiment that satisfies
the following conditions.
1. The experiment is repeated a fixed number of trials.
2. Each trial is independent of the other trials.
3. There are only two possible outcomes of interest for each
trial. The outcomes can be classified as a success S or as a
failure F .
4. The probability of a success, P (S) = p, is the same for each
trial.

The number X of successes in n Bernoulli trials is called a binomial random variable. The probability distribution of this discrete random variable is called the binomial distribution, and its values will be denoted by b(x; n, p), since they depend on the number of trials, n, and the probability of a success on a given trial, p.

Bernoulli Distribution
Let X be a random variable associated with a Bernoulli trial by defining it as follows:

X = 1, if Success;  X = 0, if Failure.

The probability density function (p.d.f.) of X can be written as

P(X = x) = p^x (1 − p)^(1−x),  x = 0, 1,

where p is the probability of a success in each trial, that is, P(X = 1) = p.

E(X) = p,  Var(X) = p(1 − p).

Example 2
Suppose a fair coin is tossed 3 times. Let X denote the number of heads in these 3 tosses. What is the probability distribution of the random variable X?

P(X = 0) = P({TTT}) = (1/2)(1/2)(1/2) = (1/2)^3 = C(3, 0) (1/2)^0 (1/2)^3 = 1/8.

P(X = 1) = P({HTT}, {THT}, {TTH})
         = (1/2)(1/2)(1/2) + (1/2)(1/2)(1/2) + (1/2)(1/2)(1/2)
         = 3 (1/2)^3 = C(3, 1) (1/2)^1 (1/2)^2 = 3/8.

P(X = 2) = P({HHT}, {THH}, {HTH})
         = (1/2)(1/2)(1/2) + (1/2)(1/2)(1/2) + (1/2)(1/2)(1/2)
         = 3 (1/2)^3 = C(3, 2) (1/2)^2 (1/2)^1 = 3/8.

P(X = 3) = P({HHH}) = (1/2)(1/2)(1/2) = (1/2)^3 = C(3, 3) (1/2)^3 (1/2)^0 = 1/8.

In general,

P(X = x) = C(3, x) (1/2)^x (1/2)^(3−x),  x = 0, 1, 2, 3.
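The calculation above can be verified by brute force: list all 2^3 equally likely outcomes and count heads (a sketch, not part of the original text):

```python
# Enumerate the 8 equally likely outcomes of 3 fair coin tosses
# and tabulate the number of heads (Example 2).
from itertools import product
from math import comb

counts = {x: 0 for x in range(4)}
for outcome in product("HT", repeat=3):
    counts[outcome.count("H")] += 1

dist = {x: c / 8 for x, c in counts.items()}
print(dist)  # {0: 0.125, 1: 0.375, 2: 0.375, 3: 0.125}

# Matches b(x; 3, 1/2) = C(3, x) (1/2)^3 for every x.
for x in range(4):
    assert dist[x] == comb(3, x) / 8
```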

Binomial Distribution
Let X be the number of successes in n independent Bernoulli trials. The probability density function (p.d.f.) of X can be written as

P(X = x) = b(x; n, p) = C(n, x) p^x (1 − p)^(n−x),  x = 0, 1, . . . , n,

where p is the probability of a success on each trial.

E(X) = np,  Var(X) = np(1 − p).

[Figure: probability density function for a binomial random variable with n = 20 and p = 0.5.]

Example 3
The probability that a certain kind of component will survive a shock test is 3/4. Find the probability that exactly 2 of the next 4 components tested survive.
Solution:
Let X be the number of components that survive among the next 4 tested. Assuming that the tests are independent and p = 3/4 for each of the 4 tests, we obtain X ~ Bin(4, 3/4), so

b(2; 4, 3/4) = P(X = 2) = C(4, 2) (3/4)^2 (1/4)^2 = 27/128.

Example 4
The probability that a patient recovers from a rare blood disease
is 0.4. If 15 people are known to have contracted this disease,
what is the probability that

1. at least 10 survive,

2. from 3 to 8 survive, and

3. exactly 5 survive?


Solution:
Let X be the number of people that survive. X ~ Bin(15, 0.4).

1. P(X ≥ 10) = 1 − P(X < 10) = 1 − Σ_{x=0}^{9} b(x; 15, 0.4) = 0.0338.

2. P(3 ≤ X ≤ 8) = Σ_{x=3}^{8} b(x; 15, 0.4) = 0.8779.

3. P(X = 5) = b(5; 15, 0.4) = 0.1859.
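The binomial probabilities in Examples 3 and 4 can be computed directly from the p.d.f. (a sketch in Python; the helper name b is ours):

```python
# b(x; n, p) = C(n, x) * p**x * (1-p)**(n-x)
from math import comb

def b(x, n, p):
    """Binomial p.d.f. b(x; n, p)."""
    return comb(n, x) * p**x * (1 - p) ** (n - x)

# Example 3: exactly 2 of 4 components survive, p = 3/4.
print(b(2, 4, 0.75))                              # 27/128 ≈ 0.2109

# Example 4: X ~ Bin(15, 0.4).
print(1 - sum(b(x, 15, 0.4) for x in range(10)))  # ≈ 0.0338
print(sum(b(x, 15, 0.4) for x in range(3, 9)))    # ≈ 0.8779
print(b(5, 15, 0.4))                              # ≈ 0.1859
```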

Where Does the Name Binomial Come From?

By the binomial theorem, with q = 1 − p,

(q + p)^n = C(n, 0) q^n + C(n, 1) p q^(n−1) + C(n, 2) p^2 q^(n−2) + · · · + C(n, n) p^n
          = b(0; n, p) + b(1; n, p) + b(2; n, p) + · · · + b(n; n, p)
          = Σ_{x=0}^{n} b(x; n, p)
          = 1   (when p + q = 1).

Areas of Application of Binomial Distribution


The binomial distribution finds applications in many scientific fields. An industrial engineer is keenly interested in the proportion defective in an industrial process. Often, quality control measures and sampling schemes for processes are based on the binomial distribution. The binomial applies to any industrial situation where an outcome of a process is dichotomous and the results of the process are independent, with the probability of a success being constant from trial to trial. The binomial distribution is also used extensively for medical and military applications. In both cases, a success or failure result is important. For example, "cure" or "no cure" is important in pharmaceutical work, while "hit" or "miss" is often the interpretation of the result of firing a guided missile.

Multinomial Experiment
The binomial experiment becomes a multinomial experiment if
we let each trial have more than 2 possible outcomes. In general, if a given trial can result in any one of k possible outcomes
E1, E2, . . . , Ek with probabilities p1, p2, . . . , pk , then the multinomial distribution will give the probability that E1 occurs x1 times,
E2 occurs x2 times, . . . and Ek occurs xk times in n independent
trials, where
x1 + x2 + . . . + xk = n.
The total number of orders yielding similar outcomes for the n trials is equal to the number of partitions of n items into k groups with x1 in the first group, x2 in the second group, . . . , and xk in the kth group. This can be done in

(n; x1, x2, . . . , xk) = n! / (x1! x2! · · · xk!)

ways.

Multinomial Distribution
If a given trial can result in the k outcomes E1, E2, . . . , Ek with probabilities p1, p2, . . . , pk, then the probability distribution of the random variables X1, X2, . . . , Xk, representing the numbers of occurrences of E1, E2, . . . , Ek in n independent trials, is

f(x1, x2, . . . , xk; p1, p2, . . . , pk, n) = (n; x1, x2, . . . , xk) p1^x1 p2^x2 · · · pk^xk,

with

Σ_{i=1}^{k} xi = n  and  Σ_{i=1}^{k} pi = 1.

Example 5
According to a genetics theory, a certain cross of guinea pigs will result in red, black, and white offspring in the ratio 8:4:4. Find the probability that among 8 offspring 5 will be red, 2 black, and 1 white.
Solution:
Let X1, X2, and X3 represent the numbers of red, black, and white offspring, respectively. So

(X1, X2, X3) ~ Multinomial(8/16, 4/16, 4/16; 8).

P(X1 = 5, X2 = 2, X3 = 1) = (8; 5, 2, 1) (8/16)^5 (4/16)^2 (4/16)^1 = 21/256 ≈ 0.0820.
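The multinomial probability in Example 5 can be evaluated numerically (a sketch; the helper name multinomial_pmf is ours):

```python
# f(x1,...,xk; p1,...,pk, n) = (n; x1,...,xk) * p1**x1 * ... * pk**xk
from math import factorial

def multinomial_pmf(xs, ps):
    """Multinomial p.d.f. with n = sum(xs)."""
    coeff = factorial(sum(xs))
    prob = 1.0
    for x, p in zip(xs, ps):
        coeff //= factorial(x)  # builds n!/(x1!...xk!)
        prob *= p**x
    return coeff * prob

# 8 offspring: 5 red, 2 black, 1 white, with ratio 8:4:4.
print(multinomial_pmf([5, 2, 1], [8/16, 4/16, 4/16]))  # 21/256 ≈ 0.0820
```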

Example 6
The complexity of arrivals and departures at an airport is such that computer simulation is often used to model the ideal conditions. For a certain airport containing three runways it is known that in the ideal setting the following are the probabilities that the individual runways are accessed by a randomly arriving commercial jet:

Runway 1: p1 = 2/9,
Runway 2: p2 = 1/6,
Runway 3: p3 = 11/18.

What is the probability that 6 randomly arriving airplanes are distributed in the following fashion?

Runway 1: 2 airplanes,
Runway 2: 1 airplane,
Runway 3: 3 airplanes.

Solution:
Using the multinomial distribution, we have

f(2, 1, 3; 2/9, 1/6, 11/18, 6) = (6; 2, 1, 3) (2/9)^2 (1/6)^1 (11/18)^3 ≈ 0.1127.

Example 6
A fair four-sided die is rolled 6 times. What is the probability that we get two 1s, two 2s, and two 4s?
Solution:
Let pi denote the probability of getting the number i, so pi = 1/4 for i = 1, 2, 3, 4. Using the multinomial distribution, we have

f(2, 2, 0, 2; 1/4, 1/4, 1/4, 1/4, 6) = (6; 2, 2, 0, 2) (1/4)^2 (1/4)^2 (1/4)^0 (1/4)^2 = 90/4096 ≈ 0.0220.

Negative Binomial and Geometric Distributions

Negative Binomial Experiment

Let us consider an experiment where the properties are the same as those listed for a binomial experiment, with the exception that the trials are repeated until a fixed number of successes occurs. Therefore, instead of finding the probability of x successes in n trials, where n is fixed, we are now interested in the probability that the kth success occurs on the xth trial. Experiments of this kind are called negative binomial experiments.

Geometric Distribution
If repeated independent trials can result in a success with probability p and a failure with probability q = 1 − p, then the probability distribution of the random variable X, the number of the trial on which the first success occurs, is

P(X = x) = g(x; p) = p q^(x−1),  x = 1, 2, 3, . . . .

E(X) = 1/p  and  σ² = Var(X) = (1 − p)/p².

[Figure: probability density functions for geometric random variables with different values of p.]

Example
Toss a die repeatedly. What's the probability that we observe a 3 for the first time on the 8th toss?
Solution:
Let X be the number of tosses required to produce the first 3. So X ~ Geo(p = 1/6). Using the geometric distribution with x = 8 and p = 1/6, we have

P(X = 8) = g(8; 1/6) = (1/6)(5/6)^7 ≈ 0.0465.

How many tosses would we expect it to take?

E(X) = 1/p = 6 tosses.
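Both geometric examples in this section reduce to a one-line p.d.f. evaluation (a sketch; the helper name g is ours):

```python
# Geometric p.d.f.: g(x; p) = p * q**(x-1), q = 1 - p.
def g(x, p):
    return p * (1 - p) ** (x - 1)

print(g(8, 1/6))   # ≈ 0.0465: first 3 on the 8th toss of a fair die
print(g(5, 0.01))  # ≈ 0.0096: first defective on the 5th item inspected
print(1 / (1/6))   # expected number of tosses: 6.0
```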

Example 6
In a certain manufacturing process it is known that, on average, 1 in every 100 items is defective. What is the probability that the fifth item inspected is the first defective item found?
Solution:
Let X be the number of trials required to produce the first defective. So X ~ Geo(p = 0.01). Using the geometric distribution with x = 5 and p = 0.01, we have

P(X = 5) = g(5; 0.01) = (0.01)(0.99)^4 = 0.0096.

Memorylessness: P(X > t + s | X > t) = P(X > s)

Suppose X has a geometric distribution with parameter p, that is, X ~ Geo(p). The probability density function of X is

P(X = x) = g(x; p) = p q^(x−1),  x = 1, 2, 3, . . . .

Since

P(X ≤ t) = Σ_{x=1}^{t} p q^(x−1) = 1 − q^t,

we have P(X > t) = 1 − P(X ≤ t) = q^t. By the definition of conditional probability,

P(X > t + s | X > t) = P(X > t + s, X > t) / P(X > t)
                     = P(X > t + s) / P(X > t)
                     = q^(t+s) / q^t = q^s = P(X > s).
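The identity above is easy to check numerically from the tail formula P(X > t) = q^t (a sketch; the chosen values p = 0.3, t = 4, s = 3 are arbitrary):

```python
# Memoryless property of the geometric distribution:
# P(X > t+s | X > t) = P(X > s), since P(X > t) = q**t.
p = 0.3
q = 1 - p
tail = lambda t: q**t  # P(X > t)

t, s = 4, 3
lhs = tail(t + s) / tail(t)  # conditional probability
rhs = tail(s)
print(lhs, rhs)  # both ≈ 0.343 (= 0.7**3)
```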

Memoryless Property
A random variable X has the memoryless property when the future is independent of the past.
In mathematics:

P(X > t + s | X > t) = P(X > s).

In words: if an event hasn't occurred by time t, the probability that it will occur after an additional s time units is the same as the (unconditional) probability that it will occur after time s. It forgot that it made it past time t!

Memoryless Property
Some lifetime problems exhibit a very important property called the memoryless property. For example, consider a very large number of identical radioactive atoms, and observe their decay.

- During the first t seconds, a proportion p of these atoms disintegrate.

- Now leave the lab, and come back any time later. Consider the atoms that have not disintegrated yet, and observe the radioactive process for another t seconds. During this time, the proportion of these atoms that will disintegrate is also p.

Now translate these observations into terms of probabilities for individual atoms.

- We first state that the probability for an atom to disintegrate in the first t seconds is p.

- We return to the lab after s seconds. Then the probability for a (still not disintegrated) atom to disintegrate within the next t seconds is also p.

Negative Binomial Distribution

If repeated independent trials can result in a success with probability p and a failure with probability q = 1 − p, then the probability distribution of the random variable X, the number of the trial on which the kth success occurs, is

P(X = x) = nb(x; k, p) = C(x−1, k−1) p^k q^(x−k),  x = k, k+1, k+2, . . . .

E(X) = k/p  and  σ² = Var(X) = k(1 − p)/p².

Negative Binomial Distribution

X = the number of the trial on which the kth success occurs if and only if we get exactly k − 1 successes in the first x − 1 trials, and then the kth success on trial x:

P(X = x) = [C(x−1, k−1) p^(k−1) q^(x−k)] p
         = C(x−1, k−1) p^k q^(x−k),  x = k, k+1, k+2, . . . .

Example 7
Find the probability that a person flipping a fair coin gets the second head on the sixth flip.
Solution:
Let X be the number of flips needed to get the 2nd head. X ~ NB(2, 0.5), so

nb(6; 2, 0.5) = P(X = 6) = C(5, 1) (0.5)^2 (0.5)^4 = 5/64 ≈ 0.0781.

Example 8
In an NBA (National Basketball Association) championship series, the team that wins four games out of seven will be the winner. Suppose that team A has probability 0.55 of winning over team B, and both teams face each other in the championship games.
1. What is the probability that team A will win the series in six games?
2. What is the probability that team A will win the series?
Solution:
Let X represent the number of games that team A needs to play to win the championship. X ~ NB(4, 0.55).

1. P(X = 6) = nb(6; 4, 0.55) = C(5, 3) (0.55)^4 (1 − 0.55)^(6−4) = 0.1853.

2. P(team A wins the championship series)
   = P(X = 4) + P(X = 5) + P(X = 6) + P(X = 7)
   = nb(4; 4, 0.55) + nb(5; 4, 0.55) + nb(6; 4, 0.55) + nb(7; 4, 0.55)
   = 0.6083.
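The negative binomial probabilities in Examples 7 and 8 follow directly from the p.d.f. (a sketch; the helper name nb is ours):

```python
# nb(x; k, p) = C(x-1, k-1) * p**k * q**(x-k)
from math import comb

def nb(x, k, p):
    """Negative binomial p.d.f.: kth success on trial x."""
    return comb(x - 1, k - 1) * p**k * (1 - p) ** (x - k)

print(nb(6, 2, 0.5))                             # 5/64 ≈ 0.0781 (second head on 6th flip)
print(nb(6, 4, 0.55))                            # ≈ 0.1853 (A wins the series in 6 games)
print(sum(nb(x, 4, 0.55) for x in range(4, 8)))  # ≈ 0.6083 (A wins the series)
```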

Hypergeometric Distribution

Hypergeometric Experiment
In general, we are interested in the probability of selecting x successes from the k items labeled successes and n − x failures from the N − k items labeled failures when a random sample of size n is selected from N items. This is known as a hypergeometric experiment, that is, one that possesses the following two properties:

1. A random sample of size n is selected without replacement from N items.

2. k of the N items may be classified as successes and N − k are classified as failures.

Hypergeometric Distribution
The probability distribution of the hypergeometric random variable X, the number of successes in a random sample of size n selected from N items of which k are labeled success and N − k labeled failure, is

P(X = x) = h(x; N, n, k) = C(k, x) C(N−k, n−x) / C(N, n),

max{0, n − (N − k)} ≤ x ≤ min{n, k}.

E(X) = nk/N  and  Var(X) = [(N − n)/(N − 1)] · n · (k/N) · (1 − k/N).

Example 9
Suppose we have a lot of 100 items of which 12 are defective. What is the probability that in a sample of 10, 3 are defective?
Solution:
Let X be the number of defectives in a sample of 10. So X ~ Hyp(100, 10, 12). Using the hypergeometric probability function, we have

h(3; 100, 10, 12) = C(12, 3) C(88, 7) / C(100, 10) = 0.08.
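Example 9 is a single ratio of binomial coefficients (a sketch; the helper name h is ours):

```python
# h(x; N, n, k) = C(k, x) * C(N-k, n-x) / C(N, n)
from math import comb

def h(x, N, n, k):
    """Hypergeometric p.d.f.: x successes in a sample of n from N items, k of which are successes."""
    return comb(k, x) * comb(N - k, n - x) / comb(N, n)

print(h(3, 100, 10, 12))  # ≈ 0.08
```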

Poisson Process and Poisson Distribution

Poisson Experiment
Experiments yielding numerical values of a random variable X, the number of outcomes occurring during a given time interval or in a specified region, are called Poisson experiments.
The given time interval may be of any length, such as a minute, a day, a week, a month, or even a year.

- The number of telephone calls per hour received by an office.

- The number of postponed games due to rain during a baseball season.

The specified region could be a line segment, an area, a volume, or perhaps a piece of material.

- The number of field mice per acre.

- The number of bacteria in a given culture.

- The number of typing errors per page.

Poisson Process
Let the number of outcomes occurring in a given continuous interval be counted. We have a Poisson process with parameter λ > 0 if the following are satisfied:

1. The numbers of outcomes occurring in nonoverlapping intervals are independent.

2. The probability of exactly one change in a sufficiently short interval of length h is approximately λh.

3. The probability of two or more outcomes in a sufficiently short interval is essentially zero.

Properties of Poisson Process

1. (No memory) The number of outcomes occurring in one time interval or specified region is independent of the number that occurs in any other disjoint time interval or region of space.

2. The probability that a single outcome will occur during a very short time interval or in a small region is proportional to the length of the time interval or the size of the region and does not depend on the number of outcomes occurring outside this time interval or region.

3. The probability that more than one outcome will occur in such a short time interval or fall in such a small region is negligible.

Poisson Distribution
The probability distribution of the Poisson random variable X, representing the number of outcomes occurring in a given time interval or specified region denoted by t, is

P(X = x) = p(x; λt) = e^(−λt) (λt)^x / x!,  x = 0, 1, 2, . . . ,

where λ is the average number of outcomes per unit time, distance, area, or volume, and e = 2.71828 . . . .

E(X) = Var(X) = λt.

[Figure: probability distributions for Poisson random variables. The horizontal axis is the index x and the vertical axis is the corresponding probability f(x).]

If events in a Poisson process occur at a mean rate of λ per unit, then the expected number of occurrences in an interval of length t is λt. For example, if phone calls arrive at a switchboard following a Poisson process at a mean rate of three per minute, then the expected number of phone calls in a 5-minute period is

3 × 5 = 15.

Or if calls arrive at a mean rate of 22 in a 5-minute period, the expected number of calls per minute is

(1/5) × 22 = 4.4.

Example 10
The mean number of accidents per month at a certain intersection is 1. What is the probability that in any given year 10 accidents will occur at this intersection?
Solution:
Let X be the number of accidents per month at the intersection. Then X ~ Poi(1). The unit time for this example is 1 month, but the time interval we consider is 1 year = 12 months. Therefore λt = 12. The probability that in any given year 10 accidents will occur at this intersection is

p(10; λt = 12) = e^(−12) 12^10 / 10! ≈ 0.1048.

Example 11
During a laboratory experiment the average number of radioactive particles passing through a counter in 1 millisecond is 4. What is the probability that 6 particles enter the counter in a given millisecond?
Solution:
Let X be the number of radioactive particles passing through the counter in 1 millisecond; then X ~ Poi(4). Using the Poisson distribution with x = 6 and λt = 4 and Table A.2, we have

P(X = 6) = p(6; 4) = e^(−4) 4^6 / 6! = 0.1042.

Example 12
Ten is the average number of oil tankers arriving each day at a certain port city. The facilities at the port can handle at most 15 tankers per day. What is the probability that on a given day tankers have to be turned away?
Solution:
Let X be the number of tankers arriving each day. X ~ Poi(10).

P(X > 15) = 1 − P(X ≤ 15) = 1 − Σ_{x=0}^{15} p(x; 10) = 1 − Σ_{x=0}^{15} e^(−10) 10^x / x! = 0.0487.
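The Poisson probabilities in Examples 10-12 can be evaluated from the p.d.f. (a sketch; the helper name p is ours, and mu stands for λt):

```python
# Poisson p.d.f.: p(x; mu) = exp(-mu) * mu**x / x!, with mu = lambda * t.
from math import exp, factorial

def p(x, mu):
    return exp(-mu) * mu**x / factorial(x)

print(p(10, 12))                             # ≈ 0.1048 (Example 10)
print(p(6, 4))                               # ≈ 0.1042 (Example 11)
print(1 - sum(p(x, 10) for x in range(16)))  # ≈ 0.0487 (Example 12)
```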

Poisson Approximation to the Binomial

Theorem
Let X be a binomial random variable with probability distribution b(x; n, p). When n → ∞, p → 0, and μ = np remains constant,

b(x; n, p) → p(x; μ).

Example 13
In a manufacturing process where glass products are produced, defects or bubbles occur, occasionally rendering the piece undesirable for marketing. It is known that, on average, 1 in every 1000 of these items produced has one or more bubbles. What is the probability that a random sample of 8000 will yield fewer than 7 items possessing bubbles?
Solution:
This is essentially a binomial experiment with n = 8000 and p = 0.001. Since p is very close to zero and n is quite large, we shall approximate with the Poisson distribution using

μ = 8000 × 0.001 = 8.

Hence, if X represents the number of items with bubbles, we have

P(X < 7) = Σ_{x=0}^{6} b(x; 8000, 0.001) ≈ Σ_{x=0}^{6} p(x; 8) = 0.3134.
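How good is the approximation in Example 13? Comparing the exact binomial sum with its Poisson counterpart shows they agree to about three decimal places (a sketch; the helper names b and pois are ours):

```python
# Compare the exact Bin(8000, 0.001) probability with the
# Poisson approximation using mu = n*p = 8 (Example 13).
from math import comb, exp, factorial

def b(x, n, p):
    return comb(n, x) * p**x * (1 - p) ** (n - x)

def pois(x, mu):
    return exp(-mu) * mu**x / factorial(x)

exact = sum(b(x, 8000, 0.001) for x in range(7))
approx = sum(pois(x, 8) for x in range(7))
print(exact, approx)  # both ≈ 0.313
```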