Chap 3 Some Special Distributions
E-mail: khanhvanphan@hcmut.edu.vn
July 2, 2020
(Phan Thi Khanh Van) Chap 3: Some special distributions July 2, 2020 1 / 79
Table of Contents
1 Bernoulli distribution
2 Binomial distribution
3 Geometric distributions
4 Hypergeometric distribution
5 Negative binomial distributions
6 Poisson distributions. Poisson processes
7 Uniform distributions.
8 Normal distributions. Central limit theorems
9 Log-normal distributions
10 Exponential distributions
11 Erlang distributions and Gamma distributions
12 Weibull distributions
13 Beta distributions
Bernoulli distribution
1 Flip a coin. X is the number of heads. P(X = 1) = 0.5 = P(X = 0).
2 A worn machine tool produces 1% defective parts. Let X = number
of defective part produced in one trial. P(X = 1) = 0.01,
P(X = 0) = 0.99
3 Each sample of air has a 10% chance of containing a particular rare
molecule. Let X = the number of air samples that contain the rare
molecule when the next sample is analyzed. P(X = 1) = 0.1.
4 Of all bits transmitted through a digital transmission channel, 10%
are received in error. Let X = the number of bits in error when the
next bit is transmitted. P(X = 1) = 0.1, P(X = 0) = 0.9.
5 Of all patients suffering a particular illness, 35% experience
improvement from a particular medication. A patient is administered
the medication, let X = the number of patients who experience
improvement. P(X = 1) = 0.35, P(X = 0) = 0.65.
Bernoulli distribution
The random variable X is called a Bernoulli random variable if it takes
the value 1 with probability p and the value 0 with probability q = 1 − p.
µ = E(X) = p, σ² = V(X) = p(1 − p).
Example
A worn machine tool produces 1% defective parts. Let X = number of
defective part produced in one trial. P(X = 1) = 0.01,
P(X = 0) = 0.99.
E(X) = 0.01
V(X) = 0.01 × 0.99 = 0.0099.
Binomial distribution
1 Flip a coin 3 times. Let X = number of heads obtained.
2 A worn machine tool produces 1% defective parts. Let X = number
of defective parts in the next 25 parts produced.
3 Each sample of air has a 10% chance of containing a particular rare
molecule. Let X = the number of air samples that contain the rare
molecule in the next 18 samples analyzed.
4 Of all bits transmitted through a digital transmission channel, 10%
are received in error. Let X = the number of bits in error in the next
5 bits transmitted.
5 A multiple-choice test contains 10 questions, each with four choices,
and you guess at each question. Let X = the number of questions
answered correctly.
6 In the next 20 births at a hospital, let X = the number of female
births.
Binomial distribution
Flip 3 coins
Flip 3 coins, consider the number of heads:
x: 0, 1, 2, 3
P(X = x): 1/8, 3/8, 3/8, 1/8
Binomial distribution
Binomial distribution.
A random experiment consists of n Bernoulli trials such that
1 The trials are independent.
2 Each trial results in only two possible outcomes, labeled as “success”
and “failure.”
3 The probability of a success in each trial, denoted as p, remains
constant.
The random variable X that equals the number of trials that result in a
success is a binomial random variable with parameters 0 < p < 1 and
n = 1, 2, · · · .
The probability mass function (PMF) of X is
f(x) = C(n, x) pˣ (1 − p)ⁿ⁻ˣ, x = 0, 1, ..., n.
Digital Channel
The chance that a bit transmitted through a digital transmission channel
is received in error is 0.1. Also, assume that the transmission trials are
independent. Let X = the number of bits in error in the next four bits
transmitted. Determine P(X = 2).
Outcome x Outcome x
OOOO 0 EOOO 1
OOOE 1 EOOE 2
OOEO 1 EOEO 2
OOEE 2 EOEE 3
OEOO 1 EEOO 2
OEOE 2 EEOE 3
OEEO 2 EEEO 3
OEEE 3 EEEE 4
P(X = 2) = 6 · P(E)² · P(O)² = 6 · 0.1² · 0.9² = 0.0486.
Mean and Variance
If X is a binomial random variable with parameters n and p, then
µ = E(X) = np and σ² = V(X) = np(1 − p).
Digital Channel
PMF of X :
X 0 1 2 3 4
P(X = x) 0.6561 0.2916 0.0486 0.0036 0.0001
µ = Σ x · P(X = x) = 0.4.
σ² = Σ x² · P(X = x) − µ² = 4 × 0.1 × 0.9 = 0.36.
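The table and both moments for n = 4, p = 0.1 can be recomputed from the binomial PMF; this short Python sketch is an addition for checking, not part of the original slides:

```python
from math import comb

def binom_pmf(x, n, p):
    """Binomial PMF: C(n, x) p^x (1-p)^(n-x)."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

n, p = 4, 0.1  # 4 transmitted bits, error probability 0.1
pmf = [binom_pmf(x, n, p) for x in range(n + 1)]  # ≈ 0.6561, 0.2916, 0.0486, 0.0036, 0.0001
mu = sum(x * f for x, f in enumerate(pmf))            # E(X) = np = 0.4
var = sum(x**2 * f for x, f in enumerate(pmf)) - mu**2  # V(X) = np(1-p) = 0.36
```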
Example
A multiple-choice test contains 25 questions, each with four answers.
Assume that a student just guesses on each question. Find the probability
(a) that the student answers more than 20 questions correctly?
(b) that the student answers fewer than 5 questions correctly?
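Both tail probabilities follow from summing the binomial PMF with n = 25 and p = 1/4 (pure guessing among four choices). A Python sketch, added for illustration:

```python
from math import comb

n, p = 25, 0.25  # 25 questions, guessing among 4 choices

def binom_pmf(x):
    return comb(n, x) * p**x * (1 - p)**(n - x)

p_more_than_20 = sum(binom_pmf(x) for x in range(21, 26))  # (a) vanishingly small
p_fewer_than_5 = sum(binom_pmf(x) for x in range(0, 5))    # (b) ≈ 0.214
```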
Geometric distributions
Digital Channel
The probability that a bit transmitted through a digital transmission
channel is received in error is 0.1. Assume that the transmissions are
independent events, and let the random variable X denote the number of
bits transmitted until the first error.
Then P(X = 5) is the probability that the first four bits are transmitted
correctly and the fifth bit is in error: {OOOOE }, where O denotes an
okay bit.
P(X = 5) = P(OOOOE) = 0.9⁴ · 0.1 = 0.066
Note that there is some probability that X will equal any integer value.
Also, if the first trial is a success, X = 1. Therefore, the range of X is
{1, 2, 3, ...}, that is, all positive integers.
Geometric distributions
Geometric distributions
In a series of Bernoulli trials (independent trials with constant probability
p of a success), the random variable X that equals the number of trials
until the first success is a geometric random variable with parameter
0 < p < 1 and
f(x) = (1 − p)ˣ⁻¹ p, x = 1, 2, 3, ...
µ = E(X) = 1/p, σ² = V(X) = (1 − p)/p².
Example
Assume that each of your calls to a popular radio station has a probability
of 0.02 of connecting, that is, of not obtaining a busy signal. Assume that
your calls are independent.
(a) What is the probability that your first call that connects is your 10th
call?
(b) What is the probability that it requires more than five calls for you to
connect?
(c) What is the mean number of calls needed to connect?
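Each part is a one-line geometric computation with p = 0.02; the sketch below (Python, not part of the original slides) spells them out:

```python
p = 0.02  # probability that a single call connects

p_10th = (1 - p)**9 * p      # (a) nine busy signals, then a connection ≈ 0.0167
p_more_than_5 = (1 - p)**5   # (b) first five calls all get a busy signal ≈ 0.904
mean_calls = 1 / p           # (c) E(X) = 1/p = 50 calls
```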
Example
An array of 30 LED bulbs is used in an automotive light. The probability
that a bulb is defective is 0.001 and defective bulbs occur independently.
Determine the following:
a) Probability that an automotive light has two or more defective bulbs.
b) Expected number of automotive lights to check to obtain one with
two or more defective bulbs.
a) Let X be the number of defective bulbs. X is a binomial random
variable.
P(X ≥ 2) = 1 − P(X = 0) − P(X = 1)
= 1 − C(30, 0) · 0.999³⁰ − C(30, 1) · 0.001 · 0.999²⁹ ≈ 0.000423
b) Let Y be the number of automotive lights to check to obtain one
with two or more defective bulbs. Y is a geometric random variable
with the parameter p = P(X ≥ 2) = 0.000423. Therefore, the
expected number of automotive lights to check to obtain one with
two or more defective bulbs: E (Y ) = 1/p ≈ 2364 (lights).
Hypergeometric distribution
Example
A batch of 300 parts contains 100 parts from a local supplier and 200
parts from a second supplier. Four parts are selected at random, without
replacement. Let X equal the number of parts in the sample from the
local supplier.
(a) P(X = 4) = C(100, 4) · C(200, 0)/C(300, 4) = 0.0119.
(b) P(X ≥ 2) = P(X = 2) + P(X = 3) + P(X = 4) = 0.408.
(c) P(X ≥ 1) = 1 − P(X = 0) = 0.804.
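The three answers can be verified with `math.comb`, taking N = 300, K = 100, n = 4 as read off the slide's binomial coefficients (Python sketch, an addition to the slides):

```python
from math import comb

N, K, n = 300, 100, 4  # 300 parts total, 100 local, sample of 4

def hyper_pmf(x):
    """Hypergeometric PMF: C(K, x) C(N-K, n-x) / C(N, n)."""
    return comb(K, x) * comb(N - K, n - x) / comb(N, n)

p4 = hyper_pmf(4)                              # (a) ≈ 0.0119
p_ge2 = sum(hyper_pmf(x) for x in (2, 3, 4))   # (b) ≈ 0.408
p_ge1 = 1 - hyper_pmf(0)                       # (c) ≈ 0.804
```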
Hypergeometric distribution
Hypergeometric distribution
A set of N objects contains K objects classified as successes and N − K
objects classified as failures. A sample of size n objects is selected
randomly (without replacement) from the N objects where K ≤ N and
n ≤ N. The random variable X that equals the number of successes in the
sample is a hypergeometric random variable and
f(x) = C(K, x) · C(N − K, n − x)/C(N, n), x = max{0, n + K − N} to min{K, n}.
Mean and Variance
If X is a hypergeometric random variable with parameters N, K , and n,
then
µ = E(X) = np and
σ² = V(X) = np(1 − p) · (N − n)/(N − 1), where p = K/N.
In a state lottery, a player chooses 6 numbers each week and the state
randomly samples 6 of 40 numbers. Let X be the number of numbers chosen
by a player that match the numbers in the state's sample. X is a
hypergeometric random variable with N = 40, K = 6, n = 6.
a) P(X = 6) = C(6, 6) · C(34, 0)/C(40, 6) = 2.6053 × 10⁻⁷
b) P(X = 5) = C(6, 5) · C(34, 1)/C(40, 6)
c) P(X = 4) = C(6, 4) · C(34, 2)/C(40, 6)
d) Assume that the "success" is the event that the player matches all 6
numbers in the state's sample.
Let Y be the number of weeks until the first "success". Then Y is a
geometric random variable with the parameter
p = P(X = 6) = 2.6053 × 10⁻⁷.
The expected number of weeks until a player matches all 6 numbers
in the state's sample: E(Y) = 1/p = 3,838,380.
Negative Binomial Distribution
Camera flashes
Consider the time to recharge the flash of cameras. The probability that a
camera passes the test is 0.8, and the cameras perform independently.
What is the probability that the third failure is obtained in five or fewer
tests?
Let X denote the number of cameras tested until 3 failures have been
obtained.
The requested probability is P(X ≤ 5). Here X has a negative binomial
distribution with p = 0.2 and r = 3.
Therefore,
P(X ≤ 5) = P(X = 3) + P(X = 4) + P(X = 5)
= C(2, 2) · 0.2³ + C(3, 2) · 0.2³ · 0.8 + C(4, 2) · 0.2³ · 0.8² = 0.058.
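The three-term sum is easy to mistype; a quick Python check (not part of the original slides) using the negative binomial PMF with p = 0.2, r = 3:

```python
from math import comb

p, r = 0.2, 3  # failure probability per test, number of failures sought

def negbinom_pmf(x):
    """P(the r-th failure occurs on test x) = C(x-1, r-1)(1-p)^(x-r) p^r."""
    return comb(x - 1, r - 1) * (1 - p)**(x - r) * p**r

p_le5 = sum(negbinom_pmf(x) for x in range(3, 6))  # P(X <= 5) ≈ 0.058
```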
Negative Binomial Distribution
In a series of Bernoulli trials (independent trials with constant probability
p of a success), the random variable X that equals the number of trials
until r successes occur is a negative binomial random variable with
parameters 0 < p < 1 and r = 1, 2, 3, ..., and
f(x) = C(x − 1, r − 1)(1 − p)ˣ⁻ʳ pʳ, x = r, r + 1, r + 2, ...
µ = E (X ) = r /p, σ 2 = V (X ) = r (1 − p)/p 2 .
Example
In a clinical study, volunteers are tested for a gene that has been found to
increase the risk for a disease. The probability that a person carries the
gene is 0.1.
a) What is the probability that four or more people need to be tested to
detect two with the gene?
b) What is the expected number of people to test to detect two with the
gene?
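Here X, the number of people tested until two carriers are found, is negative binomial with r = 2, p = 0.1; a Python sketch of both answers (an addition, not part of the original slides):

```python
from math import comb

p, r = 0.1, 2  # probability of carrying the gene, carriers sought

def negbinom_pmf(x):
    return comb(x - 1, r - 1) * (1 - p)**(x - r) * p**r

# (a) "four or more tested" is the complement of finding both in 2 or 3 tests
p_four_or_more = 1 - negbinom_pmf(2) - negbinom_pmf(3)  # = 0.972
expected_tests = r / p                                  # (b) E(X) = r/p = 20
```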
Random processes
Let N(t) ∈ {0, 1, 2...} be the number of customers who have visited a
bank from t = 9 am until time t, on a given day, for t ∈ [9, 16] -
continuous-time.
Let W (t) ∈ R be the thermal noise voltage generated across a resistor
in an electric circuit at time t, for t ∈ [0, ∞) - continuous-time.
Let X(tₙ) = Xₙ ∈ [25, 39] be the temperature in HCM city on the
n-th day of May, n = 1, 2, ..., 31 - discrete-time.
Poisson processes
Events occur at random throughout an interval at a constant mean rate
λ > 0 per unit length, and the numbers of events in disjoint subintervals
are independent; the resulting counting process is a Poisson process.
Poisson random variable
The random variable X that equals the number of events that occur in the
interval of length T in a Poisson process is a Poisson random variable
with parameter λ > 0, and
f(x) = e⁻λᵀ (λT)ˣ/x!, x = 0, 1, 2, ...
µ = E(X) = λT, σ² = V(X) = λT.
Example 1
The number of telephone calls that arrive at a phone exchange is often
modeled as a Poisson random variable. Assume that on the average there
are 10 calls per hour.
a) What is the probability that there are exactly 5 calls in one hour?
b) What is the probability that there are 3 or fewer calls in one hour?
c) What is the probability that there are exactly 15 calls in two hours?
d) What is the probability that there are exactly 5 calls in 30 minutes?
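All four parts use the Poisson PMF with λT rescaled to the stated interval (10, 20, and 5 expected calls); a Python sketch added for checking, not part of the original slides:

```python
from math import exp, factorial

def poisson_pmf(x, lam_t):
    """Poisson PMF with mean lam_t: e^(-λT) (λT)^x / x!"""
    return exp(-lam_t) * lam_t**x / factorial(x)

rate = 10  # calls per hour
a = poisson_pmf(5, rate * 1)                         # (a) ≈ 0.0378
b = sum(poisson_pmf(x, rate * 1) for x in range(4))  # (b) P(X <= 3) ≈ 0.0103
c = poisson_pmf(15, rate * 2)                        # (c) two hours: λT = 20, ≈ 0.0516
d = poisson_pmf(5, rate * 0.5)                       # (d) 30 minutes: λT = 5, ≈ 0.1755
```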
Uniform distributions
A continuous uniform random variable X is a continuous random
variable with:
PDF: f(x) = 1/(b − a) for a ≤ x ≤ b, and f(x) = 0 for x < a or x > b.
CDF: F(x) = 0 for x < a; F(x) = (x − a)/(b − a) for a ≤ x ≤ b; F(x) = 1 for x > b.
µ = E(X) = (a + b)/2, σ² = V(X) = (b − a)²/12.
Example 1
An e-mail message will arrive at a time uniformly distributed between
9:00 a.m. and 11:00 a.m. You check e-mail at 9:15 a.m. and every 30
minutes afterward.
a) What is the standard deviation of arrival time (in minutes)?
b) What is the probability that the message arrives less than 10 minutes
before you view it?
c) What is the probability that the message arrives more than 15
minutes before you view it?
Example 2
The volume of a shampoo filled into a container is uniformly distributed
between 374 and 380 milliliters.
a) What are the mean and standard deviation of the volume of
shampoo?
b) What is the probability that the container is filled with less than the
advertised target of 375 milliliters?
c) What is the volume of shampoo that is exceeded by 95% of the
containers?
d) Every milliliter of shampoo costs the producer $0.002. Any shampoo
more than 375 milliliters in the container is an extra cost to the
producer. What is the mean extra cost?
a) Let X be the volume of shampoo filled into a container. X is a
uniform random variable with a = 374, b = 380.
E(X) = (a + b)/2 = 377
σ = √V(X) = (b − a)/√12 = 6/√12 = √3 ≈ 1.73.
b) P(X < 375) = ∫₃₇₄³⁷⁵ (1/6) dt = 1/6.
c) P(X > x₀) = 0.95 ⇔ ∫ₓ₀³⁸⁰ (1/6) dt = 0.95
⇔ (380 − x₀)/6 = 0.95 ⇔ x₀ = 380 − 6 · 0.95 = 374.3.
d) The mean extra cost:
C = (E(X) − 375) · 0.002 = (377 − 375) · 0.002 = 0.004 ($)
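These uniform-distribution computations reduce to one-liners; a Python restatement for checking (an addition, not part of the original slides, and reproducing the slide's (E(X) − 375) approximation for the extra cost):

```python
from math import sqrt

a, b = 374, 380  # shampoo volume limits, milliliters

mean = (a + b) / 2                    # (a) 377 ml
sd = (b - a) / sqrt(12)               # (a) √3 ≈ 1.73 ml
p_under_375 = (375 - a) / (b - a)     # (b) 1/6
x0 = b - (b - a) * 0.95               # (c) volume exceeded by 95%: 374.3 ml
extra_cost = (mean - 375) * 0.002     # (d) slide's mean extra cost: $0.004
```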
Normal distributions
A random variable X with probability density function
f(x) = (1/(σ√(2π))) e^(−(x−µ)²/(2σ²)), −∞ < x < ∞,
is a normal random variable with parameters µ and σ > 0, and
E(X) = µ and V(X) = σ².
For any normal random variable:
P(µ − σ < X < µ + σ) = 0.68
P(µ − 2σ < X < µ + 2σ) = 0.95
P(µ − 3σ < X < µ + 3σ) = 0.997
Normal distributions
Example 1
Assume that the current measurements in a strip of wire follow a normal
distribution with a mean of 10 milliamperes and a variance of 4
(milliamperes)². What is the probability that a measurement exceeds 13
milliamperes?
A normal random variable with µ = 0, σ² = 1 is called a standard normal
random variable, Z ∼ N(0, 1).
Standardizing with Z = (X − µ)/σ = (13 − 10)/2 = 1.5:
P(X > 13) = P(Z > 1.5) = 1 − 0.9332 = 0.0668.
Normal distributions
Example 2
The time until recharge for a battery in a laptop computer under common
conditions is normally distributed with a mean of 260 minutes and a
standard deviation of 50 minutes. What is the probability that a battery
lasts more than four hours?
Let X be the time until recharge for a battery in a laptop computer under
common conditions. X ∼ N(260, 502 )
Change the variable: Z = (X − µ)/σ = (X − 260)/50; then Z ∼ N(0, 1).
P(X ≥ 240) = P(Z ≥ −0.4) = ∫₋₀.₄^∞ (1/√(2π)) e^(−z²/2) dz = 0.6554
Or we can use the table: P(Z ≥ −0.4) = 1 − P(Z < −0.4) = 1 − 0.3446 = 0.6554.
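Instead of the table, the standard normal CDF can be evaluated with the error function, Φ(z) = ½(1 + erf(z/√2)); a Python check of this example (not part of the original slides):

```python
from math import erf, sqrt

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

mu, sigma = 260, 50
z = (240 - mu) / sigma    # = -0.4
p = 1 - phi(z)            # P(X >= 240) = P(Z >= -0.4) ≈ 0.6554
```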
Normal distributions
Example 3
The weight of a sophisticated running shoe is normally distributed with a
mean of 12 ounces and a standard deviation of 0.5 ounce.
a) What is the probability that a shoe weighs more than 13 ounces?
b) What value of weight is exceeded with 95% probability?
c) What must the standard deviation of weight be in order for the
company to state that 99.9% of its shoes weigh less than 13 ounces?
d) If the standard deviation remains at 0.5 ounce, what must the mean
weight be for the company to state that 99.9% of its shoes weigh less
than 13 ounces?
a) X = 13 ⇒ Z = (13 − 12)/0.5 = 2.
P(Z > 2) = 1 − P(Z ≤ 2) (use table) = 1 − 0.9773 = 0.0227
b) P(Z > z₀) = 0.95 ⇒ P(Z ≤ z₀) = 0.05 (use table) ⇒ z₀ = −1.64
⇒ x₀ = 12 + z₀ · 0.5 = 11.18
c) In order for the company to state that 99.9% of its shoes weigh less
than x = 13 ounces: (use table) z = 3.1
⇒ σ = (x − µ)/z = (13 − 12)/3.1 = 0.3226.
d) If σ = 0.5, then µ = x − zσ = 13 − 3.1 · 0.5 = 11.45.
Central limit theorem
Let X1 , X2 , · · · be a sequence of independent and identically distributed
(i.i.d.) random variables, each having mean µ and variance σ 2 .
Then the distribution of
(X₁ + · · · + Xₙ − nµ)/(σ√n)
tends to the standard normal N(0, 1) as n → ∞. That is, for
−∞ < a < ∞,
P{(X₁ + · · · + Xₙ − nµ)/(σ√n) ≤ a} → (1/√(2π)) ∫₋∞^a e^(−x²/2) dx, as n → ∞.
Normal Approximation to the Binomial Distribution
Example
Assume that in a digital communication channel, the number of bits
received in error can be modeled by a binomial random variable, and
assume that the probability that a bit is received in error is 10⁻⁵. If 16
million bits are transmitted, what is the probability that 150 or fewer errors
occur?
Normal Approximation to the Binomial Distribution
If X is a binomial random variable with parameters n and p, then
Z = (X − np)/√(np(1 − p))
is approximately a standard normal random variable. With the continuity
correction,
P(X ≤ x) = P(X ≤ x + 0.5) ≈ P(Z ≤ (x + 0.5 − np)/√(np(1 − p))),
P(X ≥ x) = P(X ≥ x − 0.5) ≈ P(Z ≥ (x − 0.5 − np)/√(np(1 − p))).
Return to the previous example
Assume that in a digital communication channel, the number of bits
received in error can be modeled by a binomial random variable, and
assume that the probability that a bit is received in error is 10⁻⁵. If 16
million bits are transmitted, what is the probability that 150 or fewer errors
occur?
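With np = 160 the exact binomial sum is unwieldy, but the normal approximation with continuity correction is a few lines of Python (an added check, not part of the original slides):

```python
from math import erf, sqrt

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

n, p = 16_000_000, 1e-5
mu = n * p                                 # np = 160
sigma = sqrt(n * p * (1 - p))              # ≈ 12.65
p_le_150 = phi((150 + 0.5 - mu) / sigma)   # continuity correction, ≈ 0.23
```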
Normal Approximation to Poisson Distribution
If X is a Poisson random variable with E(X) = λ and V(X) = λ, then
Z = (X − λ)/√λ
is approximately a standard normal random variable for large λ, and
P(X ≥ x) = P(X ≥ x − 0.5) ≈ P(Z ≥ (x − 0.5 − λ)/√λ).
Normal Approximation to Poisson Distribution
Example 1
The number of students who enroll in a psychology course is a Poisson
random variable with mean µ = 100. The professor in charge of the course
has decided that if the number enrolling is 120 or more, he will teach the
course in two separate sections, whereas if fewer than 120 students enroll,
he will teach all of the students together in a single section. What is the
probability that the professor will have to teach two sections?
The exact solution: P(X ≥ 120) = e⁻¹⁰⁰ Σᵢ₌₁₂₀^∞ 100ⁱ/i!, which is
difficult to compute!
If X denotes the number of students who enroll in the course, X is
approximately normal with µ = λ = 100 and σ = √λ = 10, so
P(X ≥ 120) = P(X ≥ 119.5) ≈ P(Z ≥ (119.5 − 100)/10) = P(Z ≥ 1.95) = 0.0256.
Normal Approximation to Poisson Distribution
Example 2
Assume that the number of asbestos particles in a square meter of dust
on a surface follows a Poisson distribution with a mean of 1000. If a
square meter of dust is analyzed, what is the probability that 950 or
fewer particles are found?
Log-normal distributions
Let W ∼ N(θ, ω²). Then X = e^W is a log-normal random variable with PDF
f(x) = 1/(xω√(2π)) exp(−(ln x − θ)²/(2ω²)), for x > 0,
E(X) = e^(θ + ω²/2) and V(X) = e^(2θ + ω²)(e^(ω²) − 1).
Example
Suppose that the length of stay (in hours) at a hospital emergency
department is modeled with a lognormal random variable X with θ = 1.5
and ω = 0.4. Determine the following in parts (a) and (b):
a) Mean and variance
b) P(X < 8)
a) Mean: E(X) = e^(θ + ω²/2) = e^(1.5 + 0.4²/2) = 4.855.
Variance: V(X) = e^(2θ + ω²)(e^(ω²) − 1) = e^(3 + 0.4²)(e^(0.4²) − 1) = 4.0898.
b) P(X < 8) = ∫₀⁸ 1/(x · 0.4 · √(2π)) exp(−(ln x − 1.5)²/(2 · 0.4²)) dx = 0.9263
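Since ln X ∼ N(1.5, 0.4²), the integral in (b) equals Φ((ln 8 − 1.5)/0.4); a Python check of all three numbers (an addition, not part of the original slides):

```python
from math import exp, log, erf, sqrt

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

theta, omega = 1.5, 0.4
mean = exp(theta + omega**2 / 2)                          # ≈ 4.855
var = exp(2*theta + omega**2) * (exp(omega**2) - 1)       # ≈ 4.0898
p_lt_8 = phi((log(8) - theta) / omega)                    # P(X < 8) ≈ 0.9263
```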
Exponential Distribution
The random variable X that equals the distance between successive events
of a Poisson process with mean number of events λ > 0 per unit interval
is an exponential random variable with parameter λ. The PDF of X is
f(x) = λe^(−λx), for x ≥ 0,
and µ = E(X) = 1/λ, σ² = V(X) = 1/λ².
Memoryless
If the random variable X has an exponential distribution with parameter λ,
then X is a memoryless random variable, that is,
P(X > x + a | X > a) = P(X > x), for a, x ≥ 0
Exponential Distribution
Computer Usage
In a large corporate computer network, user log-ons to the system can be
modeled as a Poisson process with a mean of 25 log-ons per hour. What is
the probability that there are no log-ons in an interval of six minutes?
Let X denote the time in hours from the start of the interval until the first
log-on. Then X has an exponential distribution with λ = 25 log-ons per
hour.
We are interested in the probability that X exceeds 6 minutes. Because λ
is given in log-ons per hour, we express all time units in hours. That is, 6
minutes = 0.1 hour.
Therefore, P(X > 0.1) = ∫₀.₁^∞ 25e^(−25x) dx = e^(−25·0.1) = e^(−2.5) = 0.082.
The CDF also can be used to obtain the same result as follows:
P(X > 0.1) = 1 − F(0.1) = e^(−25·0.1) = e^(−2.5) = 0.082.
Exponential Distribution
Example 2
Web crawlers need to estimate the frequency of changes to Web sites to
maintain a current index for Web searches. Assume that the changes to a
Web site follow a Poisson process with a mean time between changes of 3.5 days.
a) What is the probability that the next change occurs in less than 2.0
days?
b) What is the probability that the time until the next change is greater
than 7.0 days?
c) What is the time to the next change that is exceeded with probability
90%?
d) What is the probability that the next change occurs in less than 10.0
days, given that it has not yet occurred after 3.0 days?
CDF: F(x) = ∫₀ˣ (1/3.5) e^(−t/3.5) dt = 1 − e^(−x/3.5)
a) P(X < 2) = F(2) = 0.4353.
b) P(X > 7) = 1 − P(X < 7) = 1 − F(7) = 0.1353
c) P(X > x₀) = 1 − P(X < x₀) = e^(−x₀/3.5) = 0.9
⇒ x₀ = −3.5 ln 0.9 = 0.3688
d) P(X < 10|X > 3) = P(3 < X < 10)/P(X > 3)
= ∫₃¹⁰ (1/3.5)e^(−t/3.5) dt / ∫₃^∞ (1/3.5)e^(−t/3.5) dt = 0.8647
Or we can use: P(X < 10|X > 3) = (F(10) − F(3))/(1 − F(3)) = 0.8647.
Another method: Using the memoryless property of the exponential
random variable:
P(X < 10|X > 3) = P(X < (10 − 3)) = F(7) = 1 − e^(−7/3.5) = 0.8647.
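All four parts follow from the single exponential CDF F(x) = 1 − e^(−x/3.5), part (d) via memorylessness; a Python check, added to the slides:

```python
from math import exp, log

mean = 3.5  # mean time between changes, days

def F(x):
    """Exponential CDF with mean 3.5."""
    return 1 - exp(-x / mean)

a = F(2)                 # (a) ≈ 0.4353
b = 1 - F(7)             # (b) ≈ 0.1353
c = -mean * log(0.9)     # (c) time exceeded with probability 0.9 ≈ 0.3688
d = F(10 - 3)            # (d) memoryless: P(X<10 | X>3) = P(X<7) ≈ 0.8647
```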
Erlang distributions and Gamma distributions
Processor Failure
The failures of the central processor units of large computer systems are
often modeled as a Poisson process. Typically, failures are not caused by
components wearing out but by more random failures of the large number
of semiconductor circuits in the units. Assume that the units that fail are
immediately repaired, and assume that the mean number of failures per
hour is 0.0001. Let X denote the time until four failures occur in a
system. Determine the probability that X exceeds 40,000 hours.
Let the random variable N denote the number of failures in 40,000 hours
of operation. The time until four failures occur exceeds 40,000 hours if
and only if the number of failures in 40,000 hours is three or less.
The assumption that the failures follow a Poisson process implies that N
has a Poisson distribution with
E(N) = 40,000 × 0.0001 = 4.
Therefore,
P(X > 40,000) = P(N ≤ 3) = Σₖ₌₀³ e⁻⁴ 4ᵏ/k! = 0.433
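The Poisson sum P(N ≤ 3) with mean 4 can be reproduced directly (Python sketch, an addition to the slides):

```python
from math import exp, factorial

lam, t = 0.0001, 40_000
mu = lam * t   # expected failures in 40,000 hours = 4

# P(time to 4th failure > 40,000 h) = P(at most 3 failures in 40,000 h)
p = sum(exp(-mu) * mu**k / factorial(k) for k in range(4))  # ≈ 0.433
```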
Gamma functions
The gamma function is
Γ(r) = ∫₀^∞ x^(r−1) e^(−x) dx, for r > 0.
Gamma distributions
The random variable X with PDF
f(x) = λʳ x^(r−1) e^(−λx)/Γ(r), for x > 0,
is a gamma random variable with parameters λ > 0 and r > 0, and
µ = E(X) = r/λ, σ² = V(X) = r/λ².
Gamma distributions
Example 1
The time between failures of a laser in a cytogenics machine is
exponentially distributed with a mean of 25,000 hours.
(a) What is the expected time until the second failure?
(b) What is the probability that the time until the third failure exceeds
50,000 hours?
Gamma distributions
Example 2
Raw materials are studied for contamination. Suppose that the number of
particles of contamination per pound of material is a Poisson random
variable with a mean of 0.01 particle per pound.
(a) What is the expected number of pounds of material required to obtain
15 particles of contamination?
(b) What is the standard deviation of the pounds of materials required to
obtain 15 particles of contamination?
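Here the "pounds until the 15th particle" is gamma with r = 15 and λ = 0.01, so the moment formulas give both answers (Python check, added to the slides):

```python
lam, r = 0.01, 15    # particles per pound; particles sought

expected_pounds = r / lam              # E(X) = r/λ = 1500 pounds
sd_pounds = (r / lam**2) ** 0.5        # σ = √(r/λ²) = √150000 ≈ 387.3
```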
Weibull distributions
The Weibull distribution is often used to model the time until failure of
many different physical systems: bearing wear, semiconductor failures,
failures caused by external shocks to the system.
Weibull distributions
The random variable X with PDF
f(x) = (β/δ)(x/δ)^(β−1) exp[−(x/δ)^β], for x > 0,
is a Weibull random variable with scale parameter δ > 0 and shape
parameter β > 0.
CDF, Mean and Variance
If X is a Weibull random variable with parameters δ and β,
CDF: F(x) = 1 − e^(−(x/δ)^β).
Mean: µ = E(X) = δ Γ(1 + 1/β).
Variance: σ² = V(X) = δ² Γ(1 + 2/β) − δ² [Γ(1 + 1/β)]².
Weibull distributions
Example
Assume that the life of a packaged magnetic disk exposed to corrosive
gases has a Weibull distribution with β = 0.5 and the mean life is 600
hours. Determine the following:
a) Probability that a disk lasts at least 500 hours.
b) Probability that a disk fails before 400 hours.
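The slides give only β = 0.5 and the mean, so δ must be inferred from µ = δΓ(1 + 1/β) = 2δ, giving δ = 300; with that assumption, both probabilities follow from the Weibull CDF (Python sketch, an addition to the slides):

```python
from math import exp, gamma

beta = 0.5
delta = 600 / gamma(1 + 1 / beta)   # mean = δΓ(3) = 2δ, so δ = 300 (inferred)

def F(x):
    """Weibull CDF: 1 - exp(-(x/δ)^β)."""
    return 1 - exp(-(x / delta) ** beta)

p_at_least_500 = 1 - F(500)   # (a) ≈ 0.275
p_before_400 = F(400)         # (b) ≈ 0.685
```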
Beta distributions
A continuous distribution that is flexible but bounded over a finite range is
useful for probability models. For example: The proportion of solar
radiation (∈ [0, 1]) absorbed by a material or the proportion (of the
maximum time) required to complete a task in a project.
Beta distributions
The random variable X with probability density function
f(x) = [Γ(α + β)/(Γ(α)Γ(β))] x^(α−1) (1 − x)^(β−1), for x in [0, 1],
is a beta random variable with parameters α > 0 and β > 0.
Beta distributions
If X has a beta distribution with parameters α and β,
E(X) = α/(α + β) and V(X) = αβ/[(α + β)²(α + β + 1)]
Beta distributions
Example 1
The maximum time to complete a task in a project is 2.5 days. Suppose
that the completion time as a proportion of this maximum is a beta
random variable with α = 2 and β = 3. What is the probability that the
task requires more than two days to complete?
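"More than two days" means a proportion above 2/2.5 = 0.8, so the answer is the integral of the Beta(2, 3) density, f(x) = 12x(1 − x)², over [0.8, 1]; a numerical Python check (an addition, not part of the original slides):

```python
from math import gamma

alpha, beta_ = 2, 3
c = gamma(alpha + beta_) / (gamma(alpha) * gamma(beta_))  # normalizing constant = 12

# midpoint-rule integration of f(x) = 12 x (1-x)^2 over [0.8, 1]
n = 100_000
h = (1 - 0.8) / n
total = 0.0
for i in range(n):
    x = 0.8 + (i + 0.5) * h
    total += c * x * (1 - x) ** 2 * h
# total ≈ 0.0272
```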
Beta distributions
Example 2
An allele is an alternate form of a gene, and the proportion of alleles in a
population is of interest in genetics. An article in BMC Genetics
[“Calculating Expected DNA Remnants From Ancient Founding Events in
Human Population Genetics” (2008, Vol. 9:66)] used a beta distribution
with mean 0.3 and standard deviation 0.17 to model initial allele
proportions in a genetic simulation. Determine the parameters α and β for
this beta distribution.
We have
µ = α/(α + β) = 0.3 ⇒ α = (3/7)β,
σ = √(αβ/[(α + β)²(α + β + 1)]) = 0.17 ⇔ αβ/[(α + β)²(α + β + 1)] = 0.0289
⇔ 1.47/(10β + 7) = 0.0289
⇒ α = 1.8799, β = 4.3865.
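The fitted parameters can be verified by plugging them back into the beta mean and variance formulas (Python check, added to the slides):

```python
alpha, beta_ = 1.8799, 4.3865  # fitted beta parameters

mean = alpha / (alpha + beta_)                                   # ≈ 0.3
var = alpha * beta_ / ((alpha + beta_) ** 2 * (alpha + beta_ + 1))
sd = var ** 0.5                                                  # ≈ 0.17
```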
Thank you for your attention!