
Applied Statistics and Probability for Engineers
Sixth Edition
Douglas C. Montgomery and George C. Runger

Chapter 3
Discrete Random Variables and Probability Distributions
Copyright © 2014 John Wiley & Sons, Inc. All rights reserved.
Chapter 3
Discrete Random Variables and Probability Distributions

CHAPTER OUTLINE
3-1 Discrete Random Variables
3-2 Probability Distributions and Probability Mass Functions
3-3 Cumulative Distribution Functions
3-4 Mean and Variance of a Discrete Random Variable
3-5 Discrete Uniform Distribution
3-6 Binomial Distribution
3-7 Geometric and Negative Binomial Distributions
3-8 Hypergeometric Distribution
3-9 Poisson Distribution

Chapter 3 Title and Outline 2
Learning Objectives for Chapter 3

After careful study of this chapter, you should be able to do the following:
1. Determine probabilities from probability mass
functions and the reverse.
2. Determine probabilities and probability mass
functions from cumulative distribution functions
and the reverse.
3. Calculate means and variances for discrete random
variables.
4. Understand the assumptions for discrete
probability distributions.
5. Select an appropriate discrete probability
distribution to calculate probabilities.
6. Calculate probabilities, means and variances for
discrete probability distributions.
Chapter 3 Learning Objectives 3
Discrete Random Variables

Physical systems can be modeled by the same or similar random experiments and random variables. The distribution of the random variable involved in each of these common systems can be analyzed, and the results can be used in different applications and examples.

We often omit a discussion of the underlying sample space of the random experiment and directly describe the distribution of a particular random variable.

Sec 3-1 Discrete Random Variables 4
Example 3-2: Camera Flash Tests

The time to recharge the flash is tested in three cell-phone cameras. The probability that a camera passes the test is 0.8, and the cameras perform independently. Table 3-1 lists the sample space for the experiment and the associated probabilities. For example, because the cameras are independent, the probability that the first and second cameras pass the test and the third one fails, denoted ppf, is

P(ppf) = (0.8)(0.8)(0.2) = 0.128

The random variable X denotes the number of cameras that pass the test. The last column of the table gives the value of X for each outcome.

Table 3-1 Camera Flash Tests
Camera 1 | Camera 2 | Camera 3 | Probability | X
Pass     | Pass     | Pass     | 0.512       | 3
Fail     | Pass     | Pass     | 0.128       | 2
Pass     | Fail     | Pass     | 0.128       | 2
Fail     | Fail     | Pass     | 0.032       | 1
Pass     | Pass     | Fail     | 0.128       | 2
Fail     | Pass     | Fail     | 0.032       | 1
Pass     | Fail     | Fail     | 0.032       | 1
Fail     | Fail     | Fail     | 0.008       | 0
Total                            1.000

Sec 3-1 Discrete Random Variables 5
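As a cross-check of Table 3-1, the distribution of X can be reproduced by enumerating the eight pass/fail outcomes. A minimal Python sketch (not from the text; the helper name flash_test_distribution is illustrative):

```python
from itertools import product

def flash_test_distribution(p_pass=0.8, n_cameras=3):
    """Enumerate pass/fail outcomes for independent camera tests and
    accumulate P(X = x), where X is the number of cameras that pass."""
    dist = {x: 0.0 for x in range(n_cameras + 1)}
    for outcome in product(("Pass", "Fail"), repeat=n_cameras):
        prob = 1.0
        for result in outcome:
            prob *= p_pass if result == "Pass" else (1 - p_pass)
        dist[outcome.count("Pass")] += prob
    return dist

print(flash_test_distribution())
# Aggregating the table rows by X gives {0: 0.008, 1: 0.096, 2: 0.384, 3: 0.512}
```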
Probability Distributions

A random variable is a function that assigns a real number to each outcome in the sample space of a random experiment.

The probability distribution of a random variable X gives the probability for each value of X.

Sec 3-2 Probability Distributions & Probability Mass Functions 6
Example 3-4: Digital Channel

• There is a chance that a bit transmitted through a digital transmission channel is received in error.
• Let X equal the number of bits received in error in the next 4 bits transmitted.
• The associated probability distribution of X is shown in the table below.
• The probability distribution of X is given by the possible values of X along with their probabilities (Figure 3-1 shows the probability distribution for bits in error).

P(X = 0) = 0.6561
P(X = 1) = 0.2916
P(X = 2) = 0.0486
P(X = 3) = 0.0036
P(X = 4) = 0.0001
Total      1.0000

Sec 3-2 Probability Distributions & Probability Mass Functions 7
Probability Mass Function

For a discrete random variable X with possible values x1, x2, …, xn, a probability mass function is a function such that:

(1) f(xi) ≥ 0
(2) Σ f(xi) = 1, summed over i = 1, …, n
(3) f(xi) = P(X = xi)

Sec 3-2 Probability Distributions & Probability Mass Functions 8
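These conditions are easy to check numerically. A small sketch, assuming the digital-channel probabilities from Example 3-4, that verifies nonnegativity and that the masses sum to 1:

```python
pmf = {0: 0.6561, 1: 0.2916, 2: 0.0486, 3: 0.0036, 4: 0.0001}  # Example 3-4

# Property (1): every mass is nonnegative.
assert all(p >= 0 for p in pmf.values())
# Property (2): the masses sum to 1 (within rounding of the tabled values).
assert abs(sum(pmf.values()) - 1.0) < 1e-6
```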
Example 3-5: Wafer Contamination

• Let the random variable X denote the number of wafers that need to be analyzed to detect a large particle of contamination. Assume that the probability that a wafer contains a large particle is 0.01, and that the wafers are independent. Determine the probability distribution of X.
• Let p denote a wafer in which a large particle is present, and let a denote a wafer in which it is absent.
• The sample space is: S = {p, ap, aap, aaap, …}
• The range of the values of X is: x = 1, 2, 3, 4, …

Probability distribution:
P(X = 1) = 0.01                 = 0.0100
P(X = 2) = (0.99)(0.01)         = 0.0099
P(X = 3) = (0.99)²(0.01)        = 0.0098
P(X = 4) = (0.99)³(0.01)        = 0.0097
In general, P(X = x) = (0.99)^(x−1)(0.01) for x = 1, 2, 3, …

Sec 3-2 Probability Distributions & Probability Mass Functions 9
Cumulative Distribution Functions

Example 3-6: Consider the probability distribution for the digital channel example.

 x     P(X = x)
 0     0.6561
 1     0.2916
 2     0.0486
 3     0.0036
 4     0.0001
Total  1.0000

Find the probability of three or fewer bits in error.

• The event (X ≤ 3) is the union of the events (X = 0), (X = 1), (X = 2), and (X = 3).
• From the table:

P(X ≤ 3) = P(X = 0) + P(X = 1) + P(X = 2) + P(X = 3) = 0.9999

Sec 3-3 Cumulative Distribution Functions 10
Cumulative Distribution Function and Properties

The cumulative distribution function is the probability that a random variable X, with a given probability distribution, takes a value less than or equal to x.

Symbolically,   F(x) = P(X ≤ x) = Σ f(xi), summed over all xi ≤ x

For a discrete random variable X, F(x) satisfies the following properties:

(1) F(x) = P(X ≤ x) = Σ f(xi), summed over all xi ≤ x
(2) 0 ≤ F(x) ≤ 1
(3) If x ≤ y, then F(x) ≤ F(y)

Sec 3-3 Cumulative Distribution Functions 11
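The defining sum translates directly into a cumulative table. A short sketch, reusing the Example 3-4 probabilities (the helper name cdf_from_pmf is illustrative):

```python
def cdf_from_pmf(pmf):
    """Return F(x) = P(X <= x) for each possible value x, by accumulating f."""
    total, cdf = 0.0, {}
    for x in sorted(pmf):
        total += pmf[x]
        cdf[x] = total
    return cdf

pmf = {0: 0.6561, 1: 0.2916, 2: 0.0486, 3: 0.0036, 4: 0.0001}
print(cdf_from_pmf(pmf))   # F(3) = 0.9999, F(4) = 1.0
```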
Example 3-8: Sampling without Replacement

A day’s production of 850 parts contains 50 defective parts. Two parts are selected at random without replacement. Let the random variable X equal the number of defective parts in the sample. Find the cumulative distribution function of X.

The probability mass function is calculated as follows:

P(X = 0) = (800/850)(799/849)     = 0.886
P(X = 1) = 2(800/850)(50/849)     = 0.111
P(X = 2) = (50/850)(49/849)       = 0.003

Therefore,
F(0) = P(X ≤ 0) = 0.886
F(1) = P(X ≤ 1) = 0.997
F(2) = P(X ≤ 2) = 1.000

Figure 3-4 (cumulative distribution function):

F(x) = 0        for x < 0
     = 0.886    for 0 ≤ x < 1
     = 0.997    for 1 ≤ x < 2
     = 1        for 2 ≤ x

Sec 3-3 Cumulative Distribution Functions 12
Variance Formula Derivations

σ² = V(X) = Σ (x − μ)² f(x)   is the definitional formula.

σ² = Σ x² f(x) − μ² = E(X²) − μ²   is the computational formula.

The computational formula is easier to calculate manually.

Sec 3-4 Mean & Variance of a Discrete Random Variable 13
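The computational formula follows from the definitional one by expanding the square and using Σ f(x) = 1 and Σ x f(x) = μ; a short derivation:

```latex
\sigma^2 = \sum_x (x-\mu)^2 f(x)
         = \sum_x x^2 f(x) \;-\; 2\mu \sum_x x f(x) \;+\; \mu^2 \sum_x f(x)
         = \sum_x x^2 f(x) - 2\mu^2 + \mu^2
         = \sum_x x^2 f(x) - \mu^2 \;=\; E(X^2) - \mu^2
```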
Example 3-9: Digital Channel

In Example 3-4, there is a chance that a bit transmitted through a digital transmission channel is received in error. X is the number of bits received in error of the next 4 transmitted. The probabilities are

P(X = 0) = 0.6561   P(X = 1) = 0.2916   P(X = 2) = 0.0486
P(X = 3) = 0.0036   P(X = 4) = 0.0001

Use the table to calculate the mean and variance.

 x     f(x)      x·f(x)    (x−0.4)²   (x−0.4)²·f(x)   x²·f(x)
 0     0.6561    0.0000     0.160      0.1050          0.0000
 1     0.2916    0.2916     0.360      0.1050          0.2916
 2     0.0486    0.0972     2.560      0.1244          0.1944
 3     0.0036    0.0108     6.760      0.0243          0.0324
 4     0.0001    0.0004    12.960      0.0013          0.0016
Total            0.4000                0.3600          0.5200
                 = Mean μ              = Variance σ²   = E(X²)

Computational formula: σ² = E(X²) − μ² = 0.5200 − (0.4000)² = 0.3600

Sec 3-4 Mean & Variance of a Discrete Random Variable 14
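The table arithmetic is easy to double-check in a few lines of Python; both the definitional and computational forms should give 0.36:

```python
pmf = {0: 0.6561, 1: 0.2916, 2: 0.0486, 3: 0.0036, 4: 0.0001}

mu = sum(x * p for x, p in pmf.items())                     # 0.4
var_def = sum((x - mu) ** 2 * p for x, p in pmf.items())    # 0.36 (definitional)
var_comp = sum(x * x * p for x, p in pmf.items()) - mu**2   # 0.36 (computational)
print(mu, var_def, var_comp)
```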
Expected Value of a Function of a Discrete Random Variable

If X is a discrete random variable with probability mass function f(x), then the expected value of a function h(X) of X is

E[h(X)] = Σ h(x) f(x), summed over all possible values x.

In the special case h(X) = (X − μ)², its expectation is the variance of X.

Sec 3-4 Mean & Variance of a Discrete Random Variable 15
Example 3-12: Digital Channel

In Example 3-9, X is the number of bits in error in the next four bits transmitted. What is the expected value of the square of the number of bits in error?

 X      0        1        2        3        4
 f(X)   0.6561   0.2916   0.0486   0.0036   0.0001

Here h(X) = X².

Answer:
E(X²) = Σ x²·f(x) = 0²(0.6561) + 1²(0.2916) + 2²(0.0486) + 3²(0.0036) + 4²(0.0001) = 0.5200

Sec 3-4 Mean & Variance of a Discrete Random Variable 16
Discrete Uniform Distribution

If the random variable X assumes the values x1, x2, …, xn with equal probability, then the discrete uniform distribution is given by

f(xi) = 1/n,   i = 1, 2, …, n

Sec 3-5 Discrete Uniform Distribution 17
Discrete Uniform Distribution

• Let X be a discrete uniform random variable on the consecutive integers a, a+1, a+2, …, b, for a ≤ b. There are b − (a − 1) = b − a + 1 values in the inclusive interval. Therefore:

  f(x) = 1/(b − a + 1)

• Its measures are:

  μ = E(X) = (b + a)/2
  σ² = V(X) = [(b − a + 1)² − 1]/12

Note that the mean is the midpoint of a and b.

Sec 3-5 Discrete Uniform Distribution 18
Example 3-14: Number of Voice Lines

Let the random variable X denote the number of the 48 voice lines that are in use at a particular time. Assume that X is a discrete uniform random variable with a range of 0 to 48. Find E(X) and σ.

Answer:
E(X) = (48 + 0)/2 = 24
V(X) = [(48 − 0 + 1)² − 1]/12 = 2400/12 = 200
σ = √200 = 14.14

Sec 3-5 Discrete Uniform Distribution 19
Binomial Distribution

The random variable X that equals the number of trials that result in a success is a binomial random variable with parameters 0 < p < 1 and n = 1, 2, ….

The probability mass function is:

f(x) = C(n, x) p^x (1 − p)^(n−x),   x = 0, 1, …, n        (3-7)

For constants a and b, the binomial expansion is

(a + b)^n = Σ C(n, k) a^k b^(n−k), summed over k = 0, 1, …, n

Sec 3-6 Binomial Distribution 20
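Equation (3-7) translates directly into code. A minimal sketch using Python's math.comb (the function name binomial_pmf is illustrative):

```python
from math import comb

def binomial_pmf(x, n, p):
    """P(X = x) for a binomial random variable with parameters n and p."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

# Sanity check: the masses over x = 0, ..., n sum to 1, which is just the
# binomial expansion of (p + (1 - p))^n.
assert abs(sum(binomial_pmf(x, 18, 0.1) for x in range(19)) - 1.0) < 1e-12
```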
Example 3-17: Binomial Coefficients

Exercises in binomial coefficient calculation:

C(10, 3)  = 10!/(3! 7!)   = (10 · 9 · 8)/(3 · 2 · 1)             = 120
C(15, 10) = 15!/(10! 5!)  = (15 · 14 · 13 · 12 · 11)/(5 · 4 · 3 · 2 · 1) = 3,003
C(100, 4) = 100!/(4! 96!) = (100 · 99 · 98 · 97)/(4 · 3 · 2 · 1) = 3,921,225

Sec 3-6 Binomial Distribution 21
Exercise 3-18: Organic Pollution-1

Each sample of water has a 10% chance of containing a particular organic pollutant. Assume that the samples are independent with regard to the presence of the pollutant. Find the probability that, in the next 18 samples, exactly 2 contain the pollutant.

Answer:
Let X denote the number of samples that contain the pollutant in the next 18 samples analyzed. Then X is a binomial random variable with p = 0.1 and n = 18.

P(X = 2) = C(18, 2)(0.1)²(0.9)^16 = 153(0.01)(0.1853) = 0.284

Sec 3-6 Binomial Distribution 22
Exercise 3-18: Organic Pollution-2

Determine the probability that at least 4 samples contain the pollutant.

Answer:
P(X ≥ 4) = 1 − P(X ≤ 3) = 1 − Σ C(18, x)(0.1)^x(0.9)^(18−x), summed over x = 0, 1, 2, 3
         = 1 − 0.902 = 0.098

Sec 3-6 Binomial Distribution 23
Exercise 3-18: Organic Pollution-3

Now determine the probability that 3 ≤ X < 7.

Answer:
P(3 ≤ X < 7) = Σ C(18, x)(0.1)^x(0.9)^(18−x), summed over x = 3, 4, 5, 6
             = 0.168 + 0.070 + 0.022 + 0.005 = 0.265

Appendix A, Table II (p. 705) is a cumulative binomial table for selected values of p and n.

Sec 3-6 Binomial Distribution 24
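The three answers above can be verified with the same pmf sketch; a short check (the values agree with 0.284, 0.098, and 0.265 to three decimals):

```python
from math import comb

def binomial_pmf(x, n=18, p=0.1):
    return comb(n, x) * p**x * (1 - p)**(n - x)

print(binomial_pmf(2))                                # ~0.284, exactly 2 samples
print(1 - sum(binomial_pmf(x) for x in range(4)))     # ~0.098, at least 4 samples
print(sum(binomial_pmf(x) for x in range(3, 7)))      # ~0.265, 3 <= X < 7
```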
Binomial Mean and Variance

If X is a binomial random variable with parameters p and n, then

μ = E(X) = np
and
σ² = V(X) = np(1 − p)

Sec 3-6 Binomial Distribution 25
Example 3-19: Digital Channel

For the number of transmitted bits received in error in Example 3-16, n = 4 and p = 0.1. Find the mean and variance of the binomial random variable.

Answer:
μ = E(X) = np = 4(0.1) = 0.4
σ² = V(X) = np(1 − p) = 4(0.1)(0.9) = 0.36
σ = SD(X) = 0.6

Sec 3-6 Binomial Distribution 26
Geometric Distribution
• Binomial distribution has:
– Fixed number of trials.
– Random number of successes.

• Geometric distribution has reversed roles:


– Random number of trials.
– Fixed number of successes, in this case 1.

• The probability mass function of the geometric distribution is

  f(x) = p(1 − p)^(x−1),   x = 1, 2, …

  where x is the number of trials until (and including) the 1st success, and 0 < p < 1 is the probability of success on each trial.

Sec 3-7 Geometric & Negative Binomial Distributions 27
Example 3-21: Wafer Contamination

The probability that a wafer contains a large particle of contamination is 0.01. Assume that the wafers are independent. What is the probability that exactly 125 wafers need to be analyzed before a large particle is detected?

Answer:
Let X denote the number of samples analyzed until a large particle is detected. Then X is a geometric random variable with parameter p = 0.01.

P(X = 125) = (0.99)^124 (0.01) = 0.00288

Sec 3-7 Geometric & Negative Binomial Distributions 28
Geometric Mean & Variance

If X is a geometric random variable with parameter p, then

μ = E(X) = 1/p
and
σ² = V(X) = (1 − p)/p²

Sec 3-7 Geometric & Negative Binomial Distributions 29
Exercise 3-22: Mean and Standard Deviation

The probability that a bit transmitted through a digital transmission channel is received in error is 0.1. Assume that the transmissions are independent events, and let the random variable X denote the number of bits transmitted until the first error. Find the mean and standard deviation.

Answer:
Mean = μ = E(X) = 1/p = 1/0.1 = 10
Variance = σ² = V(X) = (1 − p)/p² = 0.9/0.01 = 90
Standard deviation = σ = √90 = 9.49

Sec 3-7 Geometric & Negative Binomial Distributions 30
Lack of Memory Property

For a geometric random variable, the trials are independent. Thus the count of the number of trials until the next success can be started at any trial without changing the probability distribution of the random variable.

The implication of using a geometric model is that the system presumably will not wear out: the probability of a success remains constant for all trials.

Sec 3-7 Geometric & Negative Binomial Distributions 31
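In symbols, lack of memory means P(X > s + t | X > s) = P(X > t). Since P(X > k) = (1 − p)^k for a geometric random variable (no success in the first k trials), the property follows in one line:

```latex
P(X > s+t \mid X > s) = \frac{P(X > s+t)}{P(X > s)}
                      = \frac{(1-p)^{s+t}}{(1-p)^{s}}
                      = (1-p)^{t} = P(X > t)
```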
Example 3-23: Lack of Memory Property

As in Exercise 3-22, the probability that a bit is received in error is 0.1. Suppose 50 bits have already been transmitted. What is the mean number of bits transmitted until the next error?

Answer:
The mean number of bits transmitted until the next error, after 50 bits have already been transmitted, is 1/0.1 = 10, the same result as the mean number of bits until the first error.

Sec 3-7 Geometric & Negative Binomial Distributions 32
Negative Binomial Distribution

In a series of independent trials with constant probability of success p, the random variable X that equals the number of trials until r successes occur is a negative binomial random variable with parameters 0 < p < 1 and r = 1, 2, 3, ….

The probability mass function is:

f(x) = C(x − 1, r − 1) p^r (1 − p)^(x−r),   x = r, r + 1, r + 2, …        (3-11)

Sec 3-7 Geometric & Negative Binomial Distributions 33
Mean & Variance of Negative Binomial

If X is a negative binomial random variable with parameters p and r, then

μ = E(X) = r/p
and
σ² = V(X) = r(1 − p)/p²

Sec 3-7 Geometric & Negative Binomial Distributions 34
Example 3-25: Camera Flashes

The probability that a camera passes a particular test is 0.8, and the cameras perform independently. What is the probability that the third failure is obtained in five or fewer tests?

Answer:
Let X denote the number of cameras tested until three failures have been obtained. The requested probability is P(X ≤ 5). Here X has a negative binomial distribution with p = 0.2 and r = 3. Therefore,

P(X ≤ 5) = Σ C(x − 1, 2)(0.2)³(0.8)^(x−3), summed over x = 3, 4, 5
         = (0.2)³ + C(3, 2)(0.2)³(0.8) + C(4, 2)(0.2)³(0.8)²
         = 0.008 + 0.019 + 0.031 = 0.058

Sec 3-7 Geometric & Negative Binomial Distributions 35
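A quick numerical check of P(X ≤ 5) using equation (3-11); a minimal sketch (the sum comes out to about 0.058):

```python
from math import comb

p, r = 0.2, 3  # "success" here means a camera failing the test
prob = sum(comb(x - 1, r - 1) * p**r * (1 - p)**(x - r) for x in range(r, 6))
print(prob)    # ~0.0579
```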
Hypergeometric Distribution

• A set of N objects contains:
  K objects classified as successes
  N − K objects classified as failures
• A sample of size n objects is selected randomly, without replacement, from the N objects, where K ≤ N and n ≤ N.
• Let the random variable X denote the number of successes in the sample. Then X is a hypergeometric random variable with probability mass function

f(x) = C(K, x) C(N − K, n − x) / C(N, n),
       x = max(0, n + K − N), …, min(K, n)        (3-13)

Sec 3-8 Hypergeometric Distribution 36
Example 3-27: Parts from Suppliers-1

A batch of parts contains 100 parts from Supplier A and 200 parts from Supplier B. If 4 parts are selected randomly, without replacement, what is the probability that they are all from Supplier A?

Answer:
Let X equal the number of parts in the sample from Supplier A. Then X is a hypergeometric random variable with N = 300, K = 100, and n = 4.

P(X = 4) = C(100, 4) C(200, 0) / C(300, 4) = 3,921,225 / 330,791,175 = 0.0119

Sec 3-8 Hypergeometric Distribution 37
Example 3-27: Parts from Suppliers-2

What is the probability that two or more parts are from Supplier A?

Answer:
P(X ≥ 2) = 1 − P(X = 0) − P(X = 1)
         = 1 − C(200, 4)/C(300, 4) − C(100, 1) C(200, 3)/C(300, 4)
         = 1 − 0.196 − 0.397 = 0.407

Sec 3-8 Hypergeometric Distribution 38
Example 3-27: Parts from Suppliers-3

What is the probability that at least one part in the sample is from Supplier A?

Answer:
P(X ≥ 1) = 1 − P(X = 0) = 1 − C(200, 4)/C(300, 4) = 1 − 0.196 = 0.804

Sec 3-8 Hypergeometric Distribution 39
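All three answers follow from the hypergeometric pmf in equation (3-13); a compact check (the helper name hypergeom_pmf is illustrative):

```python
from math import comb

def hypergeom_pmf(x, N=300, K=100, n=4):
    """P(X = x) successes in a sample of size n drawn without replacement."""
    return comb(K, x) * comb(N - K, n - x) / comb(N, n)

print(hypergeom_pmf(4))                          # ~0.0119, all four from Supplier A
print(1 - hypergeom_pmf(0) - hypergeom_pmf(1))   # ~0.407, two or more from A
print(1 - hypergeom_pmf(0))                      # ~0.804, at least one from A
```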
Hypergeometric Mean & Variance

If X is a hypergeometric random variable with parameters N, K, and n, then

μ = E(X) = np   and   σ² = V(X) = np(1 − p)[(N − n)/(N − 1)]        (3-14)

where p = K/N, and (N − n)/(N − 1) is the finite population correction factor.

σ² approaches the binomial variance as n/N becomes small.

Sec 3-8 Hypergeometric Distribution 40
Example 3-29: Customer Sample-1

A list of customer accounts at a large company contains 1,000 customers. Of these, 700 have purchased at least one of the company’s products in the last 3 months. To evaluate a new product, 50 customers are sampled at random from the list. What is the probability that more than 45 of the sampled customers have purchased from the company in the last 3 months?

Answer:
Let X denote the number of customers in the sample who have purchased from the company in the last 3 months. Then X is a hypergeometric random variable with N = 1,000, K = 700, and n = 50.

P(X > 45) = Σ C(700, x) C(300, 50 − x) / C(1000, 50), summed over x = 46, …, 50

This is a lengthy calculation by hand!

Sec 3-8 Hypergeometric Distribution 41
Example 3-29: Customer Sample-2

Using the binomial approximation to the distribution of X results in

P(X > 45) ≈ Σ C(50, x)(0.7)^x (0.3)^(50−x), summed over x = 46, …, 50, = 0.00017

In Excel:
0.000172 = 1 - BINOMDIST(45, 50, 0.7, TRUE)

Sec 3-8 Hypergeometric Distribution 42
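Both the exact hypergeometric sum and its binomial approximation are straightforward with exact integer arithmetic; a sketch comparing the two:

```python
from math import comb

N, K, n = 1000, 700, 50

# Exact hypergeometric tail P(X > 45).
exact = sum(comb(K, x) * comb(N - K, n - x) for x in range(46, 51)) / comb(N, n)
# Binomial approximation with p = K/N = 0.7.
approx = sum(comb(n, x) * 0.7**x * 0.3**(n - x) for x in range(46, 51))
print(exact, approx)   # the binomial approximation is ~0.00017
```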
Poisson Distribution

The random variable X that equals the number of events in a Poisson process is a Poisson random variable with parameter λ > 0, and the probability mass function is:

f(x) = e^(−λ) λ^x / x!,   x = 0, 1, 2, 3, …        (3-16)

Sec 3-9 Poisson Distribution 43
Example 3-31: Calculations for Wire Flaws-1

For the case of the thin copper wire, suppose that the number of flaws follows a Poisson distribution with a mean of 2.3 flaws per mm. Find the probability of exactly 2 flaws in 1 mm of wire.

Answer:
Let X denote the number of flaws in 1 mm of wire. Then E(X) = 2.3 flaws and

P(X = 2) = e^(−2.3)(2.3)²/2! = 0.265

Sec 3-9 Poisson Distribution 44
Example 3-31: Calculations for Wire Flaws-2

Determine the probability of 10 flaws in 5 mm of wire.

Answer:
Let X denote the number of flaws in 5 mm of wire. Then X is a Poisson random variable with E(X) = 5 mm × 2.3 flaws/mm = 11.5 flaws, and

P(X = 10) = e^(−11.5)(11.5)^10/10! = 0.113

Sec 3-9 Poisson Distribution 45
Example 3-31: Calculations for Wire Flaws-3

Determine the probability of at least 1 flaw in 2 mm of wire.

Answer:
Let X denote the number of flaws in 2 mm of wire. Then X is a Poisson random variable with E(X) = 2 mm × 2.3 flaws/mm = 4.6 flaws. Note that computing P(X ≥ 1) directly would require infinitely many terms, so use the complement:

P(X ≥ 1) = 1 − P(X = 0) = 1 − e^(−4.6) = 0.9899

Sec 3-9 Poisson Distribution 46
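The three wire-flaw probabilities can be confirmed directly from equation (3-16); a short sketch:

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    """P(X = x) for a Poisson random variable with mean lam."""
    return exp(-lam) * lam**x / factorial(x)

print(poisson_pmf(2, 2.3))      # ~0.265, exactly 2 flaws in 1 mm
print(poisson_pmf(10, 11.5))    # ~0.113, 10 flaws in 5 mm
print(1 - poisson_pmf(0, 4.6))  # ~0.9899, at least 1 flaw in 2 mm
```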
Poisson Mean & Variance

If X is a Poisson random variable with parameter λ, then

μ = E(X) = λ   and   σ² = V(X) = λ

The mean and variance of the Poisson model are the same.

For example, if particle counts follow a Poisson distribution with a mean of 25 particles per square centimeter, the variance is also 25 and the standard deviation of the counts is 5 per square centimeter.

If the variance of the data is much greater than the mean, then the Poisson distribution would not be a good model for the distribution of the random variable.

Sec 3-9 Poisson Distribution 47
Important Terms & Concepts of Chapter 3

Bernoulli trial
Binomial distribution
Cumulative probability distribution – discrete random variable
Discrete uniform distribution
Expected value of a function of a random variable
Finite population correction factor
Geometric distribution
Hypergeometric distribution
Lack of memory property – discrete random variable
Mean – discrete random variable
Mean – function of a discrete random variable
Negative binomial distribution
Poisson distribution
Poisson process
Probability distribution – discrete random variable
Probability mass function
Standard deviation – discrete random variable
Variance – discrete random variable

Chapter 3 Summary 48
