Random Processes
Title : Probability Theory and Random Processes
Course Code : 07B41MA106 (310), 4 Credits
Prerequisite : Nil
Objectives :
To study
● Probability: its applications in studying the outcomes of random experiments
● Random variables: types, characteristics, modeling random data
● Stochastic systems: their reliability
● Random Processes: types, properties and characteristics with special
reference to signal processing and trunking theory.
Learning Outcomes: Students will be able to
(i) model real life random processes using appropriate statistical distributions;
(ii) compute the reliability of different stochastic systems;
(iii) apply the knowledge of random processes in signal processing and trunking
theory.
Evaluation Scheme
Evaluation Component                                                     Weightage (in percent)
Teacher Assessment (based on assignments, quizzes, attendance, etc.)     25
T1 (1 hour)                                                              20
T2 (1 hour 15 min)                                                       25
T3 (1 hour 30 min)                                                       30
Total                                                                    100
Reference Material:
1. T. Veerarajan, Probability, Statistics and Random Processes, Tata McGraw-Hill.
2. J. I. Aunon & V. Chandrasekhar, Introduction to Probability and Random Processes, McGraw-Hill International Ed.
3. A. Papoulis & S. U. Pillai, Probability, Random Variables and Stochastic Processes, Tata McGraw-Hill.
4. H. Stark & J. M. Woods, Probability and Random Processes with Applications to Signal Processing.
Origins of Probability
The study of probability originally came from gambling!
Why are Probabilities Important?
• They help you to make good decisions, e.g., decision theory.
• They help you to minimize risk, e.g., insurance.
• They are used in average-case time complexity analyses of computer algorithms.
• They are used to model processes in engineering.
Random Experiments
• An experiment whose outcome or result can be predicted with certainty is called a deterministic experiment.
• Although all possible outcomes of an experiment may be known in advance, the outcome of a particular performance of the experiment cannot be predicted owing to a number of unknown causes. Such an experiment is called a random experiment.
• A random experiment is an experiment that can be repeated over and over, possibly giving different results.
• E.g., throwing a fair 6-faced cubic die, or the number of telephone calls received at a board in a 5-minute interval.
Probability theory is the study of random or unpredictable experiments and is helpful in investigating the important features of these random experiments.
Probability Definitions
•
For discrete math, we focus on the
discrete version of probabilities.
•
For each random experiment, there is
assumed to be a finite set of discrete
possible results, called outcomes. Each
time the experiment is run, one outcome
occurs. The set of all possible outcomes
is called the sample space.
Example.
If the experiment consists of flipping two
coins, then the sample space is:
S = {(H, H), (H, T), (T, H), (T, T)}
Example.
If the experiment consists of tossing two
dice, then the sample space is:
S = {(i, j) | i, j = 1, 2, 3, 4, 5, 6}
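The two sample spaces above can be enumerated directly. This is a quick sketch (the variable names `coins` and `dice` are mine, not from the slides):

```python
from itertools import product

# Sample space for flipping two coins: ordered pairs of H/T
coins = list(product("HT", repeat=2))

# Sample space for tossing two dice: ordered pairs (i, j)
dice = list(product(range(1, 7), repeat=2))

print(len(coins))  # 4
print(len(dice))   # 36
```

Enumerating the full space this way is only practical for small experiments, but it makes the later event-probability computations trivial.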
More Probability Definitions
•
A subset (say E) of the sample space is
called an event. In other words, events
are sets of outcomes.
( If the outcome of the experiment is
contained in E, then we say E has
occurred.)
For each event, we assign a number between 0
and 1, which is the probability that the event
occurs.
Example.
If the experiment consists of flipping two
coins, and E is the event that a head
appears on the first coin, then E is:
E = {(H, H), (H, T)}
Example.
If the experiment consists of tossing two
dice, and E is the event that the sum of the
two dice equals 7, then E is:
E = {(1, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, 1)}
Union: E ∪ F
Intersection: E ∩ F, also denoted as EF.
(If E ∩ F = ∅, then E and F are said to be mutually exclusive.)
NOTE
It may or may not be true that all
outcomes are equally likely. If they
are, then we assume their
probabilities are the same.
Definition of Probability in Text
The probability of an event E is the sum of the probabilities of the outcomes in E. When all outcomes are equally likely,
p(E) = n(E)/n(S),
where n(E) is the number of cases favourable to E and n(S) is the number of exhaustive cases in S.
Example
•
Rolling a die is a random experiment.
•
The outcomes are: 1, 2, 3, 4, 5, and 6,
presumably each having an equal
probability of occurrence (1/6).
•
One event is “odd numbers”, which
consists of outcomes 1, 3, and 5. The
probability of this event is 1/6 + 1/6 +
1/6 = 3/6 = 0.5.
Example
• An urn contains 4 green balls and 6 red balls.
What is the probability that a ball chosen from the
urn will be green?
• There are 10 possible outcomes, and all are
assumed to be equally likely. Of these, 4 of them
yield a green ball. So probability is 4/10 = 0.4.
Example
•
What is the probability that a person wins the
lottery by picking the correct 6 lucky numbers out
of 40? It is assumed that every number has the
same probability of being picked (equally likely).
Using combinatorics, recall that the total
number of ways we can choose 6 numbers out
of 40 is:
C(40,6) = 40! / (34! 6!) = 3,838,380.
Therefore, the probability is 1/3,838,380.
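The count C(40, 6) can be checked with Python's standard-library combinatorics:

```python
from math import comb

# Number of ways to choose 6 numbers out of 40
total = comb(40, 6)
print(total)        # 3838380

# Probability of winning with a single ticket
print(1 / total)
```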
Examples
• Consider an experiment in which a coin is tossed twice.
• Sample space: { HH, HT, TH, TT }
• Let E be the event that at least one head shows up on the
two tosses. Then E = { HH, HT, TH }
• Let F be the event that heads occurs on the first toss. Then
F = { HH, HT }
• A natural assumption is that all four possible outcomes in the sample space are equally likely, i.e., each has probability ¼.
Then the P(E) = ¾ and P(F) = ½.
Probability as a Frequency
Let a random experiment be repeated n times and let an event A occur n_A times out of the n trials. The ratio n_A/n is called the relative frequency of the event A. As n increases, n_A/n shows a tendency to stabilise and to approach a constant value. This value, denoted by P(A), is called the probability of the event A, i.e.,
P(A) = lim (n → ∞) n_A/n.
Frequency Definition of Probability
• Consider probability as a measure of the frequency of occurrence.
– For example, the probability of "heads" in a coin flip is essentially equal to the number of heads observed in T trials, divided by T, as T approaches infinity:
Pr(heads) ≈ lim (T → ∞) (number of heads)/T.
Probability as a Frequency
• Consider a random experiment with possible outcomes
w
1
, w
2
, …,w
n
. For example, we roll a die and the possible
outcomes are 1,2,3,4,5,6 corresponding to the side that
turns up. Or we toss a coin with possible outcomes H
(heads) or T (tails).
•
We assign a probability p(w
j
) to each possible outcome w
j
in such a way that:
p(w
1
) + p(w
2
) +… + p(w
n
) = 1
• For the dice, each outcome has probability 1/6. For the
coin, each outcome has probability ½.
Example
To find the probability that a spare part produced by a machine is defective:
If, out of 10,000 items produced, 500 are defective, it is assumed that the probability of a defective item is 500/10,000 = 0.05.
Axioms of Probability
Axioms (where A and B are events):
• 0 ≤ P(A) ≤ 1
• P(S) = 1; P(∅) = 0
• P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
• If A and B are disjoint (mutually exclusive events), then P(A ∪ B) = P(A) + P(B)
Example
•
Rolling a die is a random experiment.
•
The outcomes are: 1, 2, 3, 4, 5, and 6. Suppose the die is
“loaded” so that 3 appears twice as often as every other number.
All other numbers are equally likely. Then to figure out the
probabilities, we need to solve:
p(1) + p(2) + p(3) + p(4) + p(5) + p(6) = 1 and p(3) = 2*p(1)
and p(1) = p(2) = p(4) = p(5) = p(6).
Solving, we get p(1) = p(2) = p(4) = p(5) = p(6) = 1/7 and p(3) = 2/7.
•
One event is “odd numbers”, which consists of outcomes 1, 3,
and 5. The probability of this event is:
p(odd) = p(1) + p(3) + p(5) = 1/7 + 2/7 + 1/7 = 4/7.
Theorem 1
The probability of the impossible event is zero, i.e., if Φ is the subset (event) containing no sample point, P(Φ) = 0.
Proof:
The certain event S and the impossible event Φ are mutually exclusive.
Hence P(S ∪ Φ) = P(S) + P(Φ). But S ∪ Φ = S.
∴ P(S) = P(S) + P(Φ), so P(Φ) = 0.
Theorem 2
If A and B are any 2 events,
P(A ∪ B) = P(A) + P(B) − P(A ∩ B) ≤ P(A) + P(B).
Proof:
A is the union of the mutually exclusive events AB̄ and AB, and B is the union of the mutually exclusive events ĀB and AB. A ∪ B is the union of the mutually exclusive events AB̄, AB and ĀB.
P(A) = P(AB̄) + P(AB) & P(B) = P(ĀB) + P(AB).
∴ P(A ∪ B) = P(AB̄) + P(AB) + P(ĀB)
= [P(AB̄) + P(AB)] + [P(ĀB) + P(AB)] − P(AB)
= P(A) + P(B) − P(AB).
Hence proved.
Conditional Probability
If event F occurs, what is the probability that event E also occurs? This probability is called the conditional probability and is denoted as p(E|F).
Definition of Conditional Probability
If p(F) > 0, then
p(E|F) = p(E ∩ F) / p(F).
Example.
An urn contains 8 red balls and 4 white
balls. We draw 2 balls from the urn
without replacement. What is the
probability that both balls are red?
Solution:
Let E be the event that both balls drawn are red. Then
p(E) = C(8, 2)/C(12, 2).
Or, we can solve the problem using the conditional probability approach. Let E1 and E2 denote, respectively, the events that the first and second balls drawn are red. Then
p(E1 ∩ E2) = p(E1) p(E2 | E1) = (8/12)(7/11).
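Both routes give the same number, which is easy to confirm (variable names are mine):

```python
from math import comb

# Direct count: choose 2 of the 8 red balls out of all C(12, 2) pairs
p_direct = comb(8, 2) / comb(12, 2)

# Chain rule: p(E1) * p(E2 | E1)
p_chain = (8 / 12) * (7 / 11)

print(p_direct, p_chain)   # both equal 14/33 ≈ 0.4242
```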
Multiplication Rule
p(E1 E2 E3 … En) = p(E1) p(E2 | E1) p(E3 | E1 E2) … p(En | E1 E2 … E(n−1)).
Example.
A deck of 52 playing cards is randomly
divided into 4 piles of 13 cards each.
Compute the probability that each pile
has exactly one ace.
Solution:
Define events
E1 = the first pile has exactly one ace
E2 = the second pile has exactly one ace
E3 = the third pile has exactly one ace
E4 = the fourth pile has exactly one ace
p(E1 E2 E3 E4) = p(E1) p(E2 | E1) p(E3 | E1 E2) p(E4 | E1 E2 E3)
p(E1) = C(4,1)C(48,12)/C(52,13)
p(E2 | E1) = C(3,1)C(36,12)/C(39,13)
p(E3 | E1 E2) = C(2,1)C(24,12)/C(26,13)
p(E4 | E1 E2 E3) = C(1,1)C(12,12)/C(13,13)
p(E1 E2 E3 E4) ≈ 0.1055
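The product of the four conditional probabilities can be evaluated directly (the loop variables are mine):

```python
from math import comb

# Multiply the four conditional probabilities pile by pile
p = 1.0
aces, others, cards = 4, 48, 52
for pile in range(4):
    p *= comb(aces, 1) * comb(others, 12) / comb(cards, 13)
    aces, others, cards = aces - 1, others - 12, cards - 13

print(round(p, 4))   # 0.1055
```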
Let E and F be events. We can express E as
E = EF ∪ EF^c,
where F^c is the complementary event of F. Therefore, we have
p(E) = p(EF) + p(EF^c) = p(E|F)p(F) + p(E|F^c)p(F^c).
Independent Events
Two events E and F are independent if
p(EF) = p(E)p(F).
Two events that are not independent are said to be dependent.
p(EF) = p(E)p(F) if and only if p(E|F) = p(E).
If E and F are independent, then so are E and F^c.
Theorem
If the events A and B are independent, the events A and B̄ are also independent.
Proof:
The events A ∩ B and A ∩ B̄ are mutually exclusive such that
P(A) = P(A ∩ B) + P(A ∩ B̄) (by addition theorem).
∴ P(A ∩ B̄) = P(A) − P(A ∩ B)
= P(A) − P(A)P(B) (by product theorem)
= P(A)[1 − P(B)] = P(A)P(B̄).
Three events E, F, and G are independent if all of the following hold:
p(EFG) = p(E)p(F)p(G)
p(EF) = p(E)p(F)
p(EG) = p(E)p(G)
p(FG) = p(F)p(G)
Example.
Two fair dice are thrown. Let E denote
the event that the sum of the dice is 7.
Let F denote the event that the first
die is 4 and let G be the event that the
second die is 3. Is E independent of
F? Is E independent of G? Is E
independent of FG?
p(E) = 6/36 = 1/6
p(F) = 1/6
p(G) = 1/6
p(EF) = 1/36
⇒ E and F are independent.
p(EG) = 1/36
⇒ E and G are independent.
p(FG) = 1/36
⇒ F and G are independent.
p(EFG) = 1/36 ≠ p(E)p(FG) = (1/6)(1/36)
⇒ but E and FG are NOT independent.
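All of these checks can be done by exhaustive enumeration of the 36 outcomes. A sketch using exact fractions (the helper `prob` is mine):

```python
from itertools import product
from fractions import Fraction

space = list(product(range(1, 7), repeat=2))

def prob(event):
    """Exact probability of an event over the equally likely 36 outcomes."""
    return Fraction(sum(1 for w in space if event(w)), len(space))

E = lambda w: w[0] + w[1] == 7   # sum of the dice is 7
F = lambda w: w[0] == 4          # first die shows 4
G = lambda w: w[1] == 3          # second die shows 3

pE, pF, pG = prob(E), prob(F), prob(G)
pEF = prob(lambda w: E(w) and F(w))
pEFG = prob(lambda w: E(w) and F(w) and G(w))

print(pEF == pE * pF)            # True: E and F are independent
print(pEFG == pE * pF * pG)      # False: E is not independent of FG
```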
Bayes' Theorem
Let B1, B2, …, Bk be a partition of the sample space S. Let P(A/Bi) & P(Bi/A) be the conditional probabilities for i = 1 to k. Then
P(Bi/A) = P(A/Bi) P(Bi) / Σ(j = 1 to k) P(A/Bj) P(Bj),  i = 1, 2, …, k.
Proof:
Given B1, B2, …, Bk is a partition of the sample space S, i.e.,
(i) Bi ∩ Bj = ∅ ∀ i ≠ j, (ii) ∪(i = 1 to k) Bi = S, (iii) P(Bi) > 0 ∀ i.
Let A be the event associated with S. Then
A = A ∩ S = A ∩ (B1 ∪ B2 ∪ … ∪ Bk)
= (A ∩ B1) ∪ (A ∩ B2) ∪ … ∪ (A ∩ Bk)  (by distributive law).
Also all the events (A ∩ B1), (A ∩ B2), …, (A ∩ Bk) are pairwise mutually exclusive. For,
(A ∩ Bi) ∩ (A ∩ Bj) = A ∩ (Bi ∩ Bj) = A ∩ ∅ = ∅, i ≠ j.
Then P(A) = P(A ∩ B1) + P(A ∩ B2) + … + P(A ∩ Bk).
However, each term P(A ∩ Bj) may be expressed as P(A/Bj) P(Bj), and hence we obtain the theorem on total probability:
P(A) = P(A/B1)P(B1) + P(A/B2)P(B2) + … + P(A/Bk)P(Bk) = Σ(j = 1 to k) P(A/Bj) P(Bj).
Now P(Bi ∩ A) = P(Bi) P(A/Bi) = P(A) P(Bi/A).
∴ P(Bi/A) = P(Bi) P(A/Bi) / P(A) = P(Bi) P(A/Bi) / Σ(j = 1 to k) P(Bj) P(A/Bj).
Example.
In answering a question on a multiple-choice test, a student either knows the answer or guesses. Let p be the probability that the student knows the answer and 1 − p the probability that the student guesses. Assume that a student who guesses at the answer will be correct with probability 1/m, where m is the number of multiple-choice alternatives. What is the (conditional) probability that a student knew the answer to a question, given that his answer is correct?
Solution:
C = the student answers the question correctly,
K = the student actually knows the answer.
p(K|C) = p(C|K)p(K) / [p(C|K)p(K) + p(C|K̄)p(K̄)]
= p / [p + (1/m)(1 − p)]
= mp / [1 + (m − 1)p].
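Plugging in sample numbers shows how strongly a correct answer shifts belief toward "knew it". A sketch (function name is mine):

```python
def p_knew_given_correct(p, m):
    """Bayes: p(K|C) = p / (p + (1-p)/m), equivalently m*p / (1 + (m-1)*p)."""
    return p / (p + (1 - p) / m)

# e.g. the student knows half the material and there are 4 alternatives
print(p_knew_given_correct(0.5, 4))   # 0.8
```

Both algebraic forms agree: with p = 1/2 and m = 4, mp/(1 + (m − 1)p) = 2/2.5 = 0.8.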
Example.
When coin A is flipped it comes up heads
with probability ¼, whereas when coin B
is flipped it comes up heads with probability ¾.
Suppose that one coin is randomly chosen and
is flipped twice. If both flips land heads, what is
the probability that coin B was the one chosen?
Solution:
C = coin B is chosen
H = both flips show heads
p(C|H) = p(H|C)p(C) / [p(H|C)p(C) + p(H|C̄)p(C̄)]
= (¾ × ¾ × ½) / (¾ × ¾ × ½ + ¼ × ¼ × ½)
= 9/10 = 0.9.
Example.
A laboratory test is 95 percent correct in detecting a certain disease when the disease is actually present. However, the test also yields a false positive result for 1 percent of the healthy people tested. If 0.5 percent of the population has the disease, what is the probability a person has the disease given that his test result is positive?
Solution:
D = the person has the disease
E = the test result is positive
p(D|E) = p(E|D)p(D) / [p(E|D)p(D) + p(E|D̄)p(D̄)]
= (0.95 × 0.005) / (0.95 × 0.005 + 0.01 × 0.995)
= 95/294 ≈ 0.323.
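This computation generalises to any prior, sensitivity, and false-positive rate. A sketch (the function and parameter names are mine):

```python
def posterior(prior, sensitivity, false_pos):
    """p(D | positive test) by Bayes' theorem."""
    num = sensitivity * prior
    return num / (num + false_pos * (1 - prior))

p = posterior(prior=0.005, sensitivity=0.95, false_pos=0.01)
print(p)   # 95/294 ≈ 0.323
```

Despite the accurate test, the posterior is only about 32%, because the disease is rare: false positives from the large healthy population outnumber true positives.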
Example.
A suspect is believed 60 percent guilty. Suppose now a new piece of evidence shows the criminal is left-handed. If 20 percent of the population is left-handed, and it turns out that the suspect is also left-handed, then does this change the guilt probability of the suspect? By how much?
Solution:
G = the suspect is guilty
LH = the suspect is left-handed
p(G|LH) = p(LH|G)p(G) / [p(LH|G)p(G) + p(LH|Ḡ)p(Ḡ)]
= (1.0 × 0.6) / (1.0 × 0.6 + 0.2 × 0.4)
= 60/68 ≈ 0.88.
Random Variable
Definition:
A random variable (RV) is a function X : S → R that assigns a real number X(s) to every element s ∈ S, where S is the sample space corresponding to a random experiment E.
Discrete Random Variable
If X is a random variable (RV) which can take a finite number or countably infinite number of values, X is called a discrete RV.
E.g.: 1. The number shown when a die is thrown.
2. The number of alpha particles emitted by a radioactive source.
These are discrete RVs.
Example
Suppose that we toss two coins and consider the sample space associated with this experiment. Then
S = {HH, HT, TH, TT}.
Define the random variable X as follows:
X is the number of heads obtained in the two tosses.
Hence X(HH) = 2, X(HT) = 1 = X(TH) & X(TT) = 0.
Note that to every s in S there corresponds exactly one value X(s). Different values of s may lead to the same value of X.
E.g., X(HT) = X(TH).
Probability Function
If X is a discrete RV which can take the values x1, x2, x3, … such that P(X = xi) = pi, then pi is called the probability function or probability mass function or point probability function, provided pi (i = 1, 2, 3, …) satisfy the following conditions:
(i) pi ≥ 0 ∀ i, and
(ii) Σi pi = 1.
Example of a Discrete PDF
•
Suppose that 10% of all households have
no children, 30% have one child, 40%
have two children, and 20% have three
children.
•
Select a household at random and let X =
number of children.
•
What is the pmf of X?
Example of a Discrete PDF
•
We may list each value.
–
P(X = 0) = 0.10
–
P(X = 1) = 0.30
–
P(X = 2) = 0.40
–
P(X = 3) = 0.20
Example of a Discrete PDF
•
Or we may present it as a chart.
x P(X = x)
0 0.10
1 0.30
2 0.40
3 0.20
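The chart above maps directly onto a dictionary, which also lets us verify the pmf conditions and compute the mean (variable names are mine):

```python
pmf = {0: 0.10, 1: 0.30, 2: 0.40, 3: 0.20}

# pmf conditions: every probability non-negative, and the total is 1
assert all(p >= 0 for p in pmf.values())
assert abs(sum(pmf.values()) - 1) < 1e-12

# Expected number of children
mean = sum(x * p for x, p in pmf.items())
print(mean)   # 1.7
```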
Example of a Discrete PDF
• Or we may present it as a stick graph or as a histogram. [Plot: x = 0, 1, 2, 3 on the horizontal axis; P(X = x) = 0.10, 0.30, 0.40, 0.20 on the vertical axis.]
Example
The pmf of a random variable X is given by
p(i) = c λ^i / i!,  i = 0, 1, 2, …,
where λ is some positive value. Find (i) P(X = 0), (ii) P(X > 2).
Solution.
Since Σ(i = 0 to ∞) p(i) = 1, we have c Σ(i = 0 to ∞) λ^i/i! = 1.
As Σ(i = 0 to ∞) λ^i/i! = e^λ, we have c e^λ = 1, i.e., c = e^(−λ).
Hence P(X = 0) = e^(−λ) λ^0/0! = e^(−λ).
P(X > 2) = 1 − P(X ≤ 2) = 1 − {e^(−λ) + λ e^(−λ) + (λ²/2) e^(−λ)}.
In general, P(X = x) = e^(−λ) λ^x / x!.
If X represents the total number of heads obtained when a fair coin is tossed 5 times, find the probability distribution of X.
X : 0     1     2     3     4     5
P : 1/32  5/32  10/32 10/32 5/32  1/32
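The table follows the binomial pattern P(X = k) = C(5, k)/2⁵, which can be generated directly (note `Fraction` reduces terms, so 10/32 appears as 5/16):

```python
from math import comb
from fractions import Fraction

# P(X = k) = C(5, k) / 2^5 for a fair coin tossed 5 times
dist = {k: Fraction(comb(5, k), 2**5) for k in range(6)}

print(dist)
print(sum(dist.values()))   # 1
```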
Continuous Random Variable
If X is an RV which can take all values (i.e., an infinite number of values) in an interval, then X is called a continuous RV.
E.g.,
X(x) = { 0,  x < 0
         x,  0 ≤ x < 1
         1,  x ≥ 1 }
Probability Density Function
If X is a continuous RV, then f is said to be the probability density function (pdf) of X if it satisfies the following conditions:
(i) f(x) ≥ 0, ∀ x ∈ R, and (ii) ∫(−∞ to ∞) f(x) dx = 1.
(iii) For any a, b with −∞ < a < b < ∞,
P(a ≤ X ≤ b) = ∫(a to b) f(x) dx.
When X is a continuous RV,
P(X = a) = P(a ≤ X ≤ a) = ∫(a to a) f(x) dx = 0.
This means that it is almost impossible that a continuous RV assumes a specific value.
Hence P(a ≤ X ≤ b) = P(a ≤ X < b) = P(a < X < b).
Probability Density Function
[Plot of the pdf f_X(x) = 0 if x < 0; (1/α) e^(−x/α) if 0 ≤ x.]
Example
Check whether the function f(x) = 4x³, 0 ≤ x ≤ 1, is a probability density function or not.
Solution. Clearly f(x) ≥ 0 ∀ x ∈ [0, 1]. Also ∫(0 to 1) f(x) dx = ∫(0 to 1) 4x³ dx = 1.
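The integral condition can also be checked numerically with a midpoint Riemann sum, a rough stand-in for the exact calculus above:

```python
# Midpoint-rule check that f(x) = 4x^3 integrates to 1 on [0, 1]
n = 100_000
dx = 1 / n
total = sum(4 * ((i + 0.5) * dx) ** 3 * dx for i in range(n))
print(round(total, 6))   # 1.0 (up to discretisation error)
```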
A random variable X has the density function
f(x) = 1/4, −2 < x < 2; 0 elsewhere.
Obtain (i) P(X < 1), (ii) P(|X| > 1), (iii) P(2X + 3 > 5).
(Ans. ¾, ½, ¼)
Find the formula for the probability distribution
of the number of heads when a fair coin is
tossed 4 times.
Cumulative Distribution Function (cdf)
If X is an RV, discrete or continuous, then P(X ≤ x) is called the cumulative distribution function of X, or the distribution function of X, and is denoted as F(x).
If X is discrete, F(x) = Σ(j : xj ≤ x) pj.
If X is continuous, F(x) = P(−∞ < X ≤ x) = ∫(−∞ to x) f(t) dt.
Probability Density Function
[Plot: f_X(x) = 0 if x < 0; (1/α) e^(−x/α) if 0 ≤ x.]
Cumulative Distribution Function
[Plot: F_X(x) = 0 if x < 0; 1 − e^(−x/α) if 0 ≤ x.]
/α
Probability Density Function
[Plot: f_X(x) = 0 if x < 0; 1/u if 0 ≤ x ≤ u; 0 if u < x.]
Cumulative Distribution Function
[Plot: F_X(x) = 0 if x < 0; x/u if 0 ≤ x ≤ u; 1 if u < x.]
If the probability density of a random variable is given by
f(x) = K(1 − x²), 0 < x < 1; 0 otherwise,
find (i) K, (ii) the cumulative distribution function of the random variable.
(Ans. K = 3/2; F(x) = (3/2)[x − x³/3] for 0 < x < 1, 0 for x ≤ 0, and 1 for x ≥ 1.)
A random variable X has the density function
f(x) = K · 1/(1 + x²), −∞ < x < ∞; 0 otherwise.
Determine K and the distribution function.
(Ans. K = 1/π; F(x) = (1/π)(tan⁻¹x + π/2), −∞ < x < ∞.)
Properties of the cdf F(x)
1. F(x) is a non-decreasing function of x, i.e., if x1 < x2, then F(x1) ≤ F(x2).
2. F(−∞) = 0 & F(∞) = 1.
3. If X is a discrete RV taking values x1, x2, …, where x1 < x2 < x3 < … < x(i−1) < xi < …, then P(X = xi) = F(xi) − F(x(i−1)).
4. If X is a continuous RV, then d/dx F(x) = f(x), at all points where F(x) is differentiable.
Definitions
If X is a discrete RV, then the expected value or the mean value of g(X) is defined as
E{g(X)} = Σi g(xi) pi,
where pi = P(X = xi) is the probability mass function of X.
If X is a continuous RV with pdf f(x), then
E{g(X)} = ∫(Rx) g(x) f(x) dx.
Two expected values which are most commonly used for characterising an RV X are its mean μx and variance σx².
μx = E(X) = Σi xi pi if X is discrete; ∫(Rx) x f(x) dx if X is continuous.
Var(X) = σx² = E{(X − μx)²} = Σi (xi − μx)² pi if X is discrete; ∫(Rx) (x − μx)² f(x) dx if X is continuous.
The square root of the variance is called the standard deviation.
Var(X) = E(X²) − {E(X)}².
Proof: Var(X) = E{(X − μx)²} = E{X² − 2μx X + μx²}
= E(X²) − 2μx E(X) + μx² (since μx is a constant)
= E(X²) − 2μx² + μx² = E(X²) − μx² (since E(X) = μx).
Example
Find the expected value of the number on a die when thrown. (Ans. 7/2)
Example
A random variable X has the following probability distribution:
x    : −2   −1   0    1    2    3
p(x) : 0.1  K    0.2  2K   0.3  3K
(a) Find K, (b) evaluate P(X < 2) and P(−2 < X < 2), (c) find the cdf of X, and (d) evaluate the mean of X.
Example
The probability function of an infinite discrete distribution is given by P(X = j) = 1/2^j (j = 1, 2, 3, …). Verify that the total probability is 1, and find the mean and variance of the distribution. Also find P(X is even), P(X ≥ 5) & P(X is divisible by 3).
Example
If p(x) = x e^(−x²/2) for x ≥ 0, and 0 for x < 0,
(a) show that p(x) is a pdf (of a continuous RV X), (b) find its distribution function F(x).
(Ans. F(x) = 1 − e^(−x²/2), x ≥ 0.)
Example
A continuous random variable has the probability density function defined by
f(x) = k x e^(−λx) for x ≥ 0, λ > 0; 0 otherwise.
Determine the constant k and find the mean and variance.
(Ans. k = λ², mean = 2/λ, variance = 2/λ².)
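For the die example, the mean and variance follow directly from the definitions; exact fractions keep the arithmetic clean (variable names are mine):

```python
from fractions import Fraction

faces = range(1, 7)
p = Fraction(1, 6)          # each face equally likely

mean = sum(x * p for x in faces)
var = sum((x - mean) ** 2 * p for x in faces)

print(mean)  # 7/2
print(var)   # 35/12
```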
Moments
If X is a discrete or continuous RV, E(X^r) is called the rth-order raw moment of X about the origin and is denoted by μ'_r:
μ'_r = E(X^r) = Σx x^r f(x) if X is discrete; ∫(−∞ to ∞) x^r f(x) dx if X is continuous.
Since the first and second moments about the origin are given by μ'_1 = E(X) & μ'_2 = E(X²),
Var(X) = E(X²) − (E(X))² = μ'_2 − (μ'_1)².
For the central moments,
μ_1 = E(X − E(X)) = E(X) − E(X) = 0,
μ_2 = E{(X − E(X))²} = Var(X).
E{(X − μx)^n} is called the nth-order central moment of X and is denoted by μ_n.
E{|X − μx|^n} & E{|X|^n} are called absolute moments of X.
E{(X − a)^n} & E{|X − a|^n} are called generalised moments of X.
Two-Dimensional Random Variables
Definitions: Let S be the sample space associated with a random experiment E. Let X = X(s) and Y = Y(s) be two functions, each assigning a real number to each outcome s ∈ S. Then (X, Y) is called a two-dimensional random variable.
Two-dimensional discrete RV: (X, Y) = (xi, yj), i = 1, 2, 3, …, m, …; j = 1, 2, 3, …, n, …
Two-dimensional continuous RV.
Probability Function of (X, Y)
If (X, Y) is a two-dimensional discrete RV such that P(X = xi, Y = yj) = pij, then pij is called the probability mass function of (X, Y), provided
(i) pij ≥ 0 ∀ i & j,
(ii) Σj Σi pij = 1.
The set of triplets {xi, yj, pij}, i = 1, 2, …; j = 1, 2, 3, …, is called the joint probability distribution of (X, Y).
• If (X, Y) is a two-dimensional continuous RV, the joint probability density function (pdf) f is a function satisfying the following conditions:
f(x, y) ≥ 0, −∞ < x < ∞, −∞ < y < ∞;
∫(−∞ to ∞) ∫(−∞ to ∞) f(x, y) dx dy = 1;
Pr[x1 < X ≤ x2, y1 < Y ≤ y2] = ∫(x1 to x2) ∫(y1 to y2) f(x, y) dy dx.
F(x, y) = Σ(xi ≤ x) Σ(yj ≤ y) pij (discrete case)
= ∫(−∞ to x) ∫(−∞ to y) f(u, v) dv du (continuous case).
Cumulative Distribution Function
If (X, Y) is a two-dimensional RV (discrete or continuous), then F(x, y) = P{X ≤ x & Y ≤ y} is called the cdf of (X, Y).
0 ≤ F(x, y) ≤ 1, −∞ < x < ∞, −∞ < y < ∞.
F(−∞, y) = F(x, −∞) = F(−∞, −∞) = 0.
F(∞, ∞) = 1.
F(x, y) is a non-decreasing function as either x, or y, or both increase.
F(x, ∞) = F_X(x), F(∞, y) = F_Y(y).
• Properties of joint PDF
F(x, y) = Pr(X ≤ x, Y ≤ y)
f(x, y) = ∂²F(x, y) / ∂x∂y.
Examples
Tossing two coins:
X … random variable associated with the first coin
Y … random variable associated with the second coin
Sample space {(0, 0), (0, 1), (1, 0), (1, 1)}, 0 for tail, 1 for head.
F(0, 0) = 1/4
F(0, 1) = 2/4
F(1, 0) = 3/4
F(1, 1) = 1
[Sketch of the staircase cdf F(x, y) with levels 1/4, 1/2, 1.]
Example
Three balls are drawn at random without replacement from a box containing 2 white, 3 red and 4 black balls. If X denotes the number of white balls drawn and Y denotes the number of red balls drawn, find the joint probability distribution of (X, Y).
Solution
As there are only 2 white balls in the box, X can take the values 0, 1, 2 and Y can take the values 0, 1, 2, 3.
P(X = 0, Y = 0) = P(drawing 3 balls none of which is white or red) = P(all the 3 balls drawn are black)
= C(4, 3)/C(9, 3) = 1/21.
P(X = 0, Y = 1) = 3/14, P(X = 0, Y = 2) = 1/7, ………
X \ Y   0      1      2      3
0       1/21   3/14   1/7    1/84
1       1/7    2/7    1/14   0
2       1/21   1/28   0      0
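Every cell of the table comes from the same hypergeometric count, so the whole distribution can be generated and checked in a few lines (the helper `joint` is mine):

```python
from math import comb
from fractions import Fraction

# P(X = x white, Y = y red) when 3 balls are drawn from 2 white + 3 red + 4 black
def joint(x, y):
    b = 3 - x - y                     # number of black balls drawn
    if b < 0:
        return Fraction(0)
    return Fraction(comb(2, x) * comb(3, y) * comb(4, b), comb(9, 3))

print(joint(0, 0))   # 1/21
print(joint(0, 1))   # 3/14
total = sum(joint(x, y) for x in range(3) for y in range(4))
print(total)         # 1
```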
For the bivariate probability distribution of (X, Y) given below, find P(X ≤ 1), P(X ≤ 1, Y ≤ 3), P(X ≤ 1 / Y ≤ 3), P(Y ≤ 3 / X ≤ 1) & P(X + Y ≤ 4).
X \ Y   1     2     3     4     5     6
0       0     0     1/32  2/32  2/32  3/32
1       1/16  1/16  1/8   1/8   1/8   1/8
2       1/32  1/32  1/64  1/64  0     2/64
(Ans. 7/8, 9/32, 18/32, 9/28, 13/32)
Example
The joint density function of (X, Y) is given by
f(x, y) = 2 e^(−x) e^(−2y), 0 < x < ∞, 0 < y < ∞; 0 otherwise.
Compute (i) P(X > 1, Y < 1), (ii) P(X < Y), (iii) P(X < a).
(Ans. e^(−1)(1 − e^(−2)), 1/3, 1 − e^(−a).)
Marginal probability density function
For every fixed j,
p(xj, y1) + p(xj, y2) + p(xj, y3) + … = p{X = xj} = f(xj),
and for every fixed k,
p(x1, yk) + p(x2, yk) + p(x3, yk) + … = p{Y = yk} = g(yk).
The probability functions f(xj) and g(yk) are also called marginal probability density functions.
In the continuous case,
f_X(x) = ∫(−∞ to ∞) f(x, y) dy,  f_Y(y) = ∫(−∞ to ∞) f(x, y) dx.
As an illustrative example, consider a joint pdf of the form
f(x, y) = (6/5)(1 − x²y) for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1; 0 elsewhere.
Integrating this wrt y alone and wrt x alone gives the two marginal pdfs
f_X(x) = (6/5)(1 − x²/2), 0 ≤ x ≤ 1,
f_Y(y) = (6/5)(1 − y/3), 0 ≤ y ≤ 1.
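A quick numeric cross-check of the marginalisation, assuming the joint pdf f(x, y) = (6/5)(1 − x²y) above (helper names are mine):

```python
# Midpoint-rule check of the marginal pdf of X for f(x, y) = (6/5)(1 - x^2 y)
n = 2000
dy = 1 / n

def f_X(x):
    # claimed marginal: (6/5)(1 - x^2 / 2)
    return 1.2 * (1 - x**2 / 2)

def f_X_numeric(x):
    # integrate the joint pdf over y on [0, 1]
    return sum(1.2 * (1 - x**2 * ((j + 0.5) * dy)) * dy for j in range(n))

x = 0.7
print(abs(f_X(x) - f_X_numeric(x)) < 1e-9)                               # True
print(abs(sum(f_X((i + 0.5) * dy) * dy for i in range(n)) - 1) < 1e-6)   # True
```

The second check confirms the marginal itself integrates to 1, as any pdf must.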
Independent Random Variables
Two random variables X and Y with joint probability density function f(x, y) and marginal probability functions f_X(x) and f_Y(y): if
F(x, y) = F_X(x) F_Y(y),
or p(x, y) = f_X(x) f_Y(y),
for all x, y, then X and Y are independent.
Example
A machine is used for a particular job in the forenoon
and for a different job in the afternoon. The joint
probability of (X,Y), where X and Y represent the
number of times the machine breaks down in the
forenoon and in the afternoon respectively, is given
in the following table. Examine if X and Y are
independent RVs.
X Y 0 1 2
0 0.1 0.04 0.06
1 0.2 0.08 0.12
2 0.2 0.08 0.12
X & Y are independent if P_{i*} × P_{*j} = P_{ij} ∀ i, j.
Row marginals: P_{0*} = f(0) = 0.1 + 0.04 + 0.06 = 0.2; P_{1*} = 0.4; P_{2*} = 0.4.
Column marginals: P_{*0} = 0.5; P_{*1} = 0.2; P_{*2} = 0.3.
P_{0*} × P_{*0} = 0.2 × 0.5 = 0.1 = P_{00}
P_{0*} × P_{*1} = 0.2 × 0.2 = 0.04 = P_{01}
and similarly for every other cell.
Hence the RVs X and Y are independent.
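The cell-by-cell check is mechanical, so it is worth automating for any joint table (variable names are mine):

```python
# Joint distribution of breakdowns (rows: X = 0, 1, 2; cols: Y = 0, 1, 2)
P = [[0.10, 0.04, 0.06],
     [0.20, 0.08, 0.12],
     [0.20, 0.08, 0.12]]

row = [sum(r) for r in P]                                  # marginal of X
col = [sum(P[i][j] for i in range(3)) for j in range(3)]   # marginal of Y

# Independence: every cell equals the product of its marginals
independent = all(abs(P[i][j] - row[i] * col[j]) < 1e-9
                  for i in range(3) for j in range(3))
print(independent)   # True
```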
Example
The cumulative distribution function of the continuous random variable (X, Y) is given by
F(x, y) = 1 − e^(−x) − e^(−y) + e^(−(x+y)), x, y > 0; 0 otherwise.
Prove that X and Y are independent.
f(x, y) = ∂²F(x, y)/∂x∂y = e^(−(x+y)), x, y ≥ 0; 0 otherwise.
The marginal densities are
f1(x) = e^(−x), x ≥ 0; 0 otherwise, and
f2(y) = e^(−y), y ≥ 0; 0 otherwise.
F1(x) = 0 for x < 0; 1 − e^(−x) for x ≥ 0, and
F2(y) = 0 for y < 0; 1 − e^(−y) for y ≥ 0.
F1(x) F2(y) = (1 − e^(−x))(1 − e^(−y)) = F(x, y).
Hence X and Y are independent.
Example
Let X and Y be the lifetimes of two electronic devices. Suppose that their joint pdf is given by f(x, y) = e^(−(x+y)), x ≥ 0, y ≥ 0. Then X and Y are independent.
Example
Suppose that f(x, y) = 8xy, 0 ≤ x ≤ y ≤ 1. Check the independence of X and Y.
Expectation of Product of random variables
If X and Y are mutually independent random variables, then the expectation of their product exists and is
E(XY) = E(X) E(Y).
Example
Assuming that the lifetime X and the brightness Y of a lightbulb are being modeled as continuous random variables, let the pdf be given by
f(x, y) = λ1 λ2 e^(−(λ1 x + λ2 y)), 0 < x < ∞, 0 < y < ∞.
Find the joint distribution function.
Example
A line of length a units is divided into two parts. If the first part is of length X, find E(X), Var(X) and E{X(a − X)}.
Expectation of Sum of random variables
If X1, X2, …, Xn are random variables, then the expectation of their sum exists and is
E(X1 + X2 + … + Xn) = E(X1) + E(X2) + … + E(Xn).
For two variables,
E(X) + E(Y) = Σ(j,k) xj p(xj, yk) + Σ(j,k) yk p(xj, yk)
= Σ(j,k) (xj + yk) p(xj, yk)
= E(X + Y).
Example
What is the mathematical expectation of the sum of points on n dice? (Ans. (7/2)n)
A box contains 2^n tickets among which C(n, r) tickets bear the number r (r = 0, 1, 2, …, n). A group of m tickets is drawn. Let S denote the sum of their numbers. Find E(S) and Var S. (Ans. E(S) = (n/2)m)
E(XY) = Σ(j,k) xj yk p(xj, yk)
= Σ(j,k) xj yk f(xj) g(yk)  (by independence)
= (Σj xj f(xj)) (Σk yk g(yk))
= E(X) E(Y).
Example
If the joint pdf of (X, Y) is given by f(x, y) = 24y(1 − x), 0 ≤ y ≤ x ≤ 1, find E(XY).
E(XY) = ∫(0 to 1) ∫(y to 1) x y f(x, y) dx dy.
(Ans. 4/15)
Binomial Distribution (revisit)
Suppose that n Bernoulli trials, each of which results in a success with probability p and in a failure with probability 1 − p, are performed. If Sn represents the number of successes that occur in the n Bernoulli trials, then Sn is said to be a binomial random variable with parameters n and p.
Let Xk be the number of successes scored at the kth trial. Since Xk assumes only the values 0 and 1 with corresponding probabilities q and p, we have
E(Xk) = 0 · q + 1 · p = p.
Since
Sn = X1 + X2 + … + Xn,
we have
E(Sn) = E(X1 + X2 + … + Xn)
= E(X1) + E(X2) + … + E(Xn)
= np.
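The same mean falls out of the binomial pmf directly, which cross-checks the linearity argument (parameter values are mine, for illustration):

```python
from math import comb

n, p = 10, 0.3

# E(S_n) computed term by term from the binomial pmf
mean = sum(k * comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))
print(round(mean, 10))   # 3.0, i.e. n*p
```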
Conditional Probability
Now, consider the case where the event M depends on some other random variable Y.
F(x | M) = Pr[X ≤ x | M] = Pr{X ≤ x, M} / Pr(M), Pr(M) > 0.
Conditional Probability Density Function
f(x | y) = f(x, y) / f_Y(y)
or
f(y | x) = f(x, y) / f_X(x)
– the continuous version of Bayes' theorem:
f(y | x) = f(x | y) f_Y(y) / f_X(x)
– another expression of the marginal pdf:
f_X(x) = ∫(−∞ to ∞) f(x, y) dy = ∫(−∞ to ∞) f(x | y) f_Y(y) dy
f_Y(y) = ∫(−∞ to ∞) f(x, y) dx = ∫(−∞ to ∞) f(y | x) f_X(x) dx.
Suppose that p(x, y), the joint probability mass function of X and Y, is given by p(0,0) = .4, p(0,1) = .2, p(1,0) = .1, p(1,1) = .3. Calculate the conditional probability mass function of X given that Y = 1.
(Ans. 2/5, 3/5)
Example
Suppose that 15 percent of the families in a certain
community have no children, 20% have 1, 35% have
2, & 30% have 3 children; suppose further that each
child is equally likely (and independently) to be a
boy or a girl. If a family is chosen at random from this
community, then B, the number of boys, and G , the
number of girls, in this family will have the joint
probability mass function shown below.

P(B = i, G = j):
i \ j    0      1      2      3
0       .15    .10    .0875  .0375
1       .10    .175   .1125  0
2       .0875  .1125  0      0
3       .0375  0      0      0

For example,
P(B = 2, G = 1) = P(BBG or BGB or GBB) = P(BBG) + P(BGB) + P(GBB)
                = 3 × .30 × (1/2)³ = .1125
P(B = 1, G = 1) = P(BG or GB) = P(BG) + P(GB) = 2 × .35 × (1/2)² = .175
If the family chosen has one girl, compute the
conditional probability mass function of the
number of boys in the family
Ans.8/31,14/31,9/31,0
Example
The joint density of X and Y is given by
f(x, y) = (12/5) x(2 − x − y), 0 < x < 1, 0 < y < 1; 0 otherwise.
Compute the conditional density of X, given that Y = y, where 0 < y < 1.

f(x | y) = f(x, y) / f_Y(y)

Ans. f(x | y) = 6x(2 − x − y) / (4 − 3y)
Example
The joint pdf of a two-dimensional RV (X, Y) is given by
f(x, y) = xy² + x²/8, 0 ≤ x ≤ 2, 0 ≤ y ≤ 1.
Compute P(X > 1), P(Y < 1/2), P(X > 1 | Y < 1/2), P(Y < 1/2 | X > 1), P(X < Y) and P(X + Y ≤ 1).
(Ans. 19/24, 1/4, 5/6, 5/19, 53/480, 13/480)
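The first of these answers can be checked with a crude midpoint Riemann sum; this is an illustrative numerical sketch, not part of the original problem.

```python
# Midpoint Riemann sum check of P(X > 1) = 19/24 for
# f(x, y) = x*y**2 + x**2/8 on 0 <= x <= 2, 0 <= y <= 1.
def f(x, y):
    return x * y**2 + x**2 / 8

def prob(x_lo, x_hi, y_lo, y_hi, n=200):
    hx, hy = (x_hi - x_lo) / n, (y_hi - y_lo) / n
    total = 0.0
    for i in range(n):
        for j in range(n):
            total += f(x_lo + (i + 0.5) * hx, y_lo + (j + 0.5) * hy)
    return total * hx * hy

print(round(prob(0, 2, 0, 1), 4))  # 1.0  (total probability)
print(round(prob(1, 2, 0, 1), 4))  # 0.7917  (= 19/24)
```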
Variance of a Sum of Random Variables
If X and Y are random variables, then the variance of their sum is
Var(X + Y) = E({(X + Y) − (μ_X + μ_Y)}²)
           = E((X − μ_X)²) + E((Y − μ_Y)²) + 2E((X − μ_X)(Y − μ_Y))
           = Var(X) + Var(Y) + 2E((X − μ_X)(Y − μ_Y))
           = Var(X) + Var(Y) + 2(E(XY) − μ_X μ_Y)

The covariance of X and Y is defined by
Cov(X, Y) = E((X − μ_X)(Y − μ_Y)) = E(XY) − μ_X μ_Y
•
If X and Y are mutually independent, then Cov(X, Y) = 0.
Q: Is the converse true?
•
If X and Y are mutually independent, then Var(X + Y) = Var(X) + Var(Y)
•
If X_1, …, X_n are mutually independent, and S_n = X_1 + … + X_n, then
Var(S_n) = Var(X_1) + … + Var(X_n)
Q: Let S_n be a binomial random variable with parameters n and p. Show that Var(S_n) = np(1 − p).
Example
Compute Var(X) when X represents the outcome
when we roll a fair die.
Solution
Since P(X = i) = 1/6, i = 1, 2, 3, 4, 5, 6, we obtain
E(X²) = Σ_{i=1}^{6} i² P[X = i]
      = 1²(1/6) + 2²(1/6) + 3²(1/6) + 4²(1/6) + 5²(1/6) + 6²(1/6) = 91/6
Var(X) = E(X²) − (E(X))² = 91/6 − (7/2)² = 35/12
Compute the variance of the sum obtained when 10
independent rolls of a fair die are made.
Ans 175/6
Compute the variance of the number of heads resulting
from 10 independent tosses of a fair coin.
The coefficient of correlation between X and Y, denoted by ρ_xy, is defined as
ρ_xy = C_xy / (σ_x σ_y)
The correlation coefficient is a measure of dependence between the RVs X and Y.
If ρ_xy = 0, we say that X and Y are uncorrelated.
If E(XY) = 0, X and Y are said to be orthogonal RVs.
|ρ_xy| ≤ 1, i.e. |C_xy| ≤ σ_x σ_y
Example
Calculate the correlation coefficient for the following
heights (in inches) of fathers (X) & their sons (Y):
X Y
65 67
66 68
67 65
67 68
68 72
69 72
70 69
72 71
(Ans .603)
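The stated answer can be reproduced with the raw-sum form of ρ_xy (an illustrative computation on the table above):

```python
import math

# Father/son heights from the table above.
x = [65, 66, 67, 67, 68, 69, 70, 72]
y = [67, 68, 65, 68, 72, 72, 69, 71]
n = len(x)

sx, sy = sum(x), sum(y)
sxy = sum(a * b for a, b in zip(x, y))
sxx = sum(a * a for a in x)
syy = sum(b * b for b in y)

# rho = (Sxy/n - (Sx/n)(Sy/n)) / sqrt((Sxx/n - (Sx/n)^2)(Syy/n - (Sy/n)^2))
num = sxy / n - (sx / n) * (sy / n)
den = math.sqrt((sxx / n - (sx / n) ** 2) * (syy / n - (sy / n) ** 2))
rho = num / den
print(round(rho, 3))  # 0.603
```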
In terms of raw sums,
ρ_xy = C_xy / (σ_x σ_y)
     = [Σxy/n − (Σx/n)(Σy/n)] / √{[Σx²/n − (Σx/n)²][Σy²/n − (Σy/n)²]}
Example
If X,Y and Z are uncorrelated RVs with zero means
and standard deviations 5, 12 and 9 respectively
and if U=X+Y and V=Y+Z find the correlation
coefficient between U and V.
(ans 48/65)
Example
Let the random variables X and Y have the joint pdf
f(x, y) = x + y, 0 < x < 1, 0 < y < 1; 0 elsewhere.
Find the correlation coefficient between X and Y. (Ans. −1/11)
Conditional Expected Values
E{g(X, Y) / Y = y_j} = Σ_i g(x_i, y_j) P(X = x_i / Y = y_j)
                     = Σ_i g(x_i, y_j) P{X = x_i, Y = y_j} / P{Y = y_j}
                     = Σ_i g(x_i, y_j) p_ij / p_{*j}

For continuous RVs,
E{g(X, Y) / Y} = ∫_{−∞}^{∞} g(x, y) f(x / y) dx
E{g(X, Y) / X} = ∫_{−∞}^{∞} g(x, y) f(y / x) dy

Conditional means:
μ_{y/x} = E(Y / X) = ∫_{−∞}^{∞} y f(y / x) dy

Conditional variances are
σ²_{y/x} = E{(Y − μ_{y/x})² / X} = ∫_{−∞}^{∞} (y − μ_{y/x})² f(y / x) dy
Example
The joint pdf of (X, Y) is given by f(x, y) = 24xy, x > 0, y > 0, x + y ≤ 1, and 0 elsewhere. Find the conditional mean and variance of Y, given X.
Ans. f(y / x) = 2y / (1 − x)², E(Y / X) = (2/3)(1 − x), Var(Y / X) = (1 − x)²/18

If (X, Y) is uniformly distributed over the semicircle bounded by y = √(1 − x²) and y = 0, find E(Y / X) and E(X / Y). Also verify that E{E(X / Y)} = E(X) and E{E(Y / X)} = E(Y).
Properties
(1) If X and Y are independent RVs, then E(Y / X) = E(Y) and E(X / Y) = E(X).
(2) E[E{g(X, Y) / X}] = E{g(X, Y)}; in particular, E{E(X / Y)} = E(X).
(3) E(XY) = E[X · E(Y / X)]; similarly E(X²Y²) = E[X² E(Y² / X)].
Example
Three coins are tossed. Let X denote the number of heads on the first two coins, Y the number of tails on the last two, and Z the number of heads on the last two. Find
(a) The joint distribution of (i) X and Y (ii) X and Z
(b) The conditional distribution of Y given X = 1
(c) The covariance of X, Y and of X, Z
(d) E(Z / X = 1)
(e) A joint distribution that is not the joint distribution of X and Z in (a), but has the same marginals as in (a)
RVs (X_1, X_2, …, X_n) are independent if
f(x_1, x_2, …, x_n) = f(x_1) × f(x_2) × … × f(x_n)

Conditional densities:
f(x_1, x_2 / x_3) = f(x_1, x_2, x_3) / f(x_3)
f(x_1 / x_2, x_3) = f(x_1, x_2, x_3) / f(x_2, x_3)
Definition
Let X denote a random variable with probability
density function f(x) if continuous (probability
mass function p(x) if discrete)
Then
M(t) = the moment generating function of X
     = E(e^{tX}) = ∫_{−∞}^{∞} e^{tx} f(x) dx   if X is continuous
     = Σ_x e^{tx} p(x)                          if X is discrete
Example
If f(x, y) = x e^{−x(y+1)}, x > 0, y > 0; 0 otherwise, find the moment-generating function of XY.

Solution: M_XY(t) = E(e^{tXY}) = ∫_0^∞ ∫_0^∞ e^{xyt} x e^{−x(y+1)} dy dx
= ∫_0^∞ x e^{−x} { ∫_0^∞ e^{−xy(1−t)} dy } dx
= ∫_0^∞ x e^{−x} · 1/(x(1 − t)) dx = 1/(1 − t),  t < 1
Properties
M_X(0) = 1

M_X(t) = Σ e^{tx} f(x) = Σ (1 + tx + (tx)²/2! + …) f(x)
M′_X(t) = Σ (x + 2x²t/2! + …) f(x);  M′_X(0) = Σ x f(x) = E(X) = μ_1
M″_X(t) = Σ (x² + 6x³t/3! + …) f(x);  M″_X(0) = Σ x² f(x) = μ_2

In general, μ_k = M_X^{(k)}(0), the kth derivative of M_X(t) at t = 0, and

M_X(t) = 1 + μ_1 t + μ_2 t²/2! + μ_3 t³/3! + … + μ_k t^k/k! + …

where μ_k = E(X^k) = ∫ x^k f(x) dx if X is continuous, Σ_x x^k p(x) if X is discrete.
Let X be a random variable with moment generating function M_X(t). Let Y = bX + a. Then
M_Y(t) = M_{bX+a}(t) = E(e^{(bX+a)t}) = e^{at} M_X(bt)

Let X and Y be two independent random variables with moment generating functions M_X(t) and M_Y(t). Then
M_{X+Y}(t) = M_X(t) M_Y(t)

Let X and Y be two random variables with moment generating functions M_X(t) and M_Y(t) and distribution functions F_X(x) and F_Y(y) respectively. If M_X(t) = M_Y(t), then F_X(x) = F_Y(x).
This ensures that the distribution of a random variable can be identified by its moment generating function.
Example
If X represents the outcome when a fair die is tossed, find the MGF of X and hence find E(X) and Var(X). (Ans. 7/2, 35/12)
If a RV X has the MGF M(t) = 3/(3 − t), obtain the standard deviation of X. (Ans. 1/3)
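For the die example, the moments can also be pulled out of M(t) numerically by finite differences at t = 0. This is an illustrative check of Ans. 7/2 and 35/12, not the derivation the slides ask for.

```python
import math

# M(t) for a fair die: (1/6) * (e^t + e^{2t} + ... + e^{6t}).
def M(t):
    return sum(math.exp(t * i) for i in range(1, 7)) / 6

# Raw moments from numerical derivatives of M at t = 0.
h = 1e-4
m1 = (M(h) - M(-h)) / (2 * h)             # E(X)
m2 = (M(h) - 2 * M(0) + M(-h)) / h**2     # E(X^2)
print(round(m1, 4))           # 3.5
print(round(m2 - m1**2, 4))   # 2.9167 (= 35/12)
```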
Find the p.d.f. given
M(t) = (1/10) e^t + (2/10) e^{2t} + (3/10) e^{3t} + (4/10) e^{4t}

Since M(t) = Σ_x e^{tx} f(x) = f(a) e^{at} + f(b) e^{bt} + …, matching coefficients in
(1/10) e^t + (2/10) e^{2t} + (3/10) e^{3t} + (4/10) e^{4t} = f(a) e^{at} + f(b) e^{bt} + …
gives
f(x) = x/10, x = 1, 2, 3, 4;  0, otherwise.
If X has the p.d.f.
f(x) = 1/x², 1 < x < ∞;  0, otherwise,
find the mean of X.
The characteristic function of a random variable X is defined by
φ_X(w) = E(e^{iwX}) = Σ_x e^{iwx} f(x)        if X is discrete
       = ∫_{−∞}^{∞} e^{iwx} f(x) dx           if X is continuous

Since |e^{iwx}| = |cos wx + i sin wx| = (cos² wx + sin² wx)^{1/2} = 1,
|φ_X(w)| = |∫_{−∞}^{∞} e^{iwx} f(x) dx| ≤ ∫_{−∞}^{∞} |e^{iwx}| f(x) dx = ∫_{−∞}^{∞} f(x) dx = 1.
Hence the characteristic function always exists, even when the moment-generating function may not exist.
Properties of Characteristic Function
1. E(X^n) = μ′_n = the coefficient of (iω)^n / n! in the expansion of φ(ω) in a series of ascending powers of iω:
φ_X(w) = E(e^{iwX}) = Σ_x e^{iwx} f(x)
       = Σ_x (1 + iwx + (iwx)²/2! + (iwx)³/3! + …) f(x)
       = Σ f(x) + iw Σ x f(x) + (iw)²/2! Σ x² f(x) + …

2. μ′_n = (1/i^n) [d^n φ(ω) / dω^n]_{ω=0}

3. If the characteristic function of a RV X is φ_x(ω) and if Y = aX + b, then φ_y(ω) = e^{ibω} φ_x(aω).

4. If X and Y are independent RVs, then φ_{x+y}(ω) = φ_x(ω) × φ_y(ω).

5. If the characteristic function of a continuous RV X with density function f(x) is φ(ω), then
f(x) = (1/2π) ∫_{−∞}^{∞} φ(ω) e^{−iωx} dω.

6. If the density function of X is known, the density function of Y = g(X) can be found from the CF of Y, provided Y = g(X) is one-one.
The characteristic function of a random variable X is given by
φ_x(w) = 1 − |w|, |w| ≤ 1;  0, |w| > 1.
Find the pdf of X.

The pdf of X is
f(x) = (1/2π) ∫_{−∞}^{∞} φ(w) e^{−iwx} dw
     = (1/2π) ∫_{−1}^{1} (1 − |w|) e^{−iwx} dw
     = (1/2π) [ ∫_{−1}^{0} (1 + w) e^{−iwx} dw + ∫_{0}^{1} (1 − w) e^{−iwx} dw ]
     = (1/2πx²)(2 − e^{ix} − e^{−ix}) = (1/πx²)(1 − cos x)
     = (1/2π) [sin(x/2) / (x/2)]², −∞ < x < ∞
Show that the distribution for which the characteristic function is e^{−|ω|} has the density function
f(x) = (1/π) · 1/(1 + x²), −∞ < x < ∞,
using f(x) = (1/2π) ∫_{−∞}^{∞} φ(ω) e^{−iωx} dω.
Reproductive property of Poisson distribution
If X_1 & X_2 are two independent RVs that follow Poisson distributions with parameters λ_1 & λ_2, prove that (X_1 + X_2) also follows a Poisson distribution with parameter (λ_1 + λ_2).

φ_{x_1}(ω) = e^{λ_1(e^{iω} − 1)},  φ_{x_2}(ω) = e^{λ_2(e^{iω} − 1)}
Since X_1 & X_2 are independent RVs,
φ_{x_1+x_2}(ω) = e^{(λ_1+λ_2)(e^{iω} − 1)},
which is the CF of a Poisson RV with parameter λ_1 + λ_2.
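The reproductive property can also be seen in simulation. The sketch below is illustrative (λ_1 = 2 and λ_2 = 3 are arbitrary choices): it checks that X_1 + X_2 has mean and variance λ_1 + λ_2 = 5, as a Poisson(5) variable must.

```python
import math
import random

def poisson(lam):
    # Knuth's method: count factors of U(0,1) until the product drops below e^{-lam}.
    L, k, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= random.random()
        if prod <= L:
            return k
        k += 1

random.seed(1)
n = 20_000
sums = [poisson(2) + poisson(3) for _ in range(n)]
mean = sum(sums) / n
var = sum((s - mean) ** 2 for s in sums) / n
print(round(mean, 2), round(var, 2))  # both near lambda1 + lambda2 = 5
```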
Joint Characteristic Function
If (X, Y) is a two-dimensional RV, then E(e^{iω_1 X + iω_2 Y}) is called the joint characteristic function of (X, Y), denoted by φ_xy(ω_1, ω_2).

φ_xy(ω_1, ω_2) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} e^{iω_1 x + iω_2 y} f(x, y) dx dy = Σ_i Σ_j e^{iω_1 x_i + iω_2 y_j} p(x_i, y_j)

(i) φ_xy(0, 0) = 1
(ii) E{X^m Y^n} = (1/i^{m+n}) [∂^{m+n} φ_xy(ω_1, ω_2) / ∂ω_1^m ∂ω_2^n]_{ω_1 = 0, ω_2 = 0}
(iii) φ_x(ω) = φ_xy(ω, 0) & φ_y(ω) = φ_xy(0, ω)
(iv) If X and Y are independent, then φ_xy(ω_1, ω_2) = φ_x(ω_1) × φ_y(ω_2), and conversely.
Compute the characteristic function of the discrete r.v.'s X and Y if the joint PMF is
P_XY = 1/3, x = y = 0
     = 1/6, x = ±1, y = 0
     = 1/6, x = y = ±1
     = 0, else.

φ_XY(w_1, w_2) = Σ_{k=−1}^{1} Σ_{l=−1}^{1} e^{i(w_1 k + w_2 l)} P_XY
= (1/3) e^{i(0·w_1 + 0·w_2)} + (1/6) e^{iw_1} + (1/6) e^{−iw_1} + (1/6) e^{i(w_1+w_2)} + (1/6) e^{−i(w_1+w_2)}
= (1/3) + (1/6)(cos w_1 + i sin w_1) + (1/6)(cos w_1 − i sin w_1)
  + (1/6)(cos(w_1 + w_2) + i sin(w_1 + w_2)) + (1/6)(cos(w_1 + w_2) − i sin(w_1 + w_2))
= 1/3 + (1/3) cos w_1 + (1/3) cos(w_1 + w_2)
Example
Two RVs X and Y have the joint characteristic function φ_xy(ω_1, ω_2) = e^{−2ω_1² − 8ω_2²}. Show that X and Y are both zero-mean RVs and also that they are uncorrelated.

By the property of the joint CF,
E(X) = (1/i) [∂ e^{−2ω_1² − 8ω_2²} / ∂ω_1]_{ω_1=0, ω_2=0}
     = (1/i) [e^{−2ω_1² − 8ω_2²} (−4ω_1)]_{ω_1=0, ω_2=0} = 0
E(Y) = (1/i) [∂ e^{−2ω_1² − 8ω_2²} / ∂ω_2]_{ω_1=0, ω_2=0}
     = (1/i) [e^{−2ω_1² − 8ω_2²} (−16ω_2)]_{ω_1=0, ω_2=0} = 0
E(XY) = (1/i²) [∂² e^{−2ω_1² − 8ω_2²} / ∂ω_1 ∂ω_2]_{ω_1=0, ω_2=0}
      = −[64 ω_1 ω_2 e^{−2ω_1² − 8ω_2²}]_{ω_1=0, ω_2=0} = 0
C_xy = E(XY) − E(X) × E(Y) = 0
Compute the joint characteristic function of X and Y if
f_xy = (1/2π) exp[−(x² + y²)/2]

φ_xy(w_1, w_2) = (1/2π) ∫_{−∞}^{∞} ∫_{−∞}^{∞} e^{−(x² + y²)/2} e^{iω_1 x + iω_2 y} dx dy
Ans. e^{−(ω_1² + ω_2²)/2}
Random Variable
Binomial Distribution
The Bernoulli probability mass function is the density function of a discrete variable X having 0 and 1 as the only possible values. The pmf of X is given by
P(0) = P{X = 0} = 1 − p
P(1) = P{X = 1} = p
where p, 0 ≤ p ≤ 1, is the probability that the trial is a success.
A random variable X is said to be a Bernoulli random variable if its pmf satisfies the above equation for some p ∈ (0, 1).
An experiment consists of performing a sequence of
subexperiments. If each subexperiment is
identical, then the subexperiments are called
trials.
Bernoulli Trials
•
Each trial of an experiment that has only two possible
outcomes (success or failure) is called a “Bernoulli trial.”
•
If p is the probability of success, then (1 − p) is the probability of failure.
•
The probability of exactly k successes in n independent Bernoulli trials, with probability of success p and probability of failure q = 1 − p, is given by a formula called the Binomial Distribution:
C(n, k) p^k q^{n−k}
Example of Bernoulli Trials
•
Suppose I roll a die and I consider a 3 to be
success and any other number to be a failure.
•
What is the probability of getting exactly 5
successes if I roll the die 20 times?
•
Solution: C(20, 5) (1/6)^5 (5/6)^{15}
•
What is the probability of getting 5 or more
successes?
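The two questions above can be answered directly from the binomial formula (an illustrative sketch using Python's math.comb):

```python
from math import comb

p = 1 / 6  # probability that a single roll is a 3

# Exactly 5 successes in 20 rolls.
p5 = comb(20, 5) * p**5 * (1 - p)**15
print(round(p5, 4))  # 0.1294

# 5 or more successes: complement of 0..4 successes.
p_ge5 = 1 - sum(comb(20, k) * p**k * (1 - p)**(20 - k) for k in range(5))
print(round(p_ge5, 4))
```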
Theorem
If the probability of occurrence of an event (probability of success) in a single trial of a Bernoulli experiment is p, then the probability that the event occurs exactly r times out of n independent trials is equal to nC_r q^{n−r} p^r, where q = 1 − p is the probability of failure of the event.

Proof
Getting exactly r successes means getting r successes and (n − r) failures simultaneously.
∴ P(getting r successes and n − r failures) = p^r q^{n−r}
The trials from which the successes are obtained are not specified. There are nC_r ways of choosing r trials for successes. Once the r trials are chosen for successes, the remaining (n − r) trials should result in failures.
These nC_r ways are mutually exclusive. In each of these nC_r ways, P(getting exactly r successes) = p^r q^{n−r}.
∴ By the addition theorem, the required probability = nC_r q^{n−r} p^r.
Example
If war breaks out on the average once in 25 years, find the probability that in 50 years at a stretch, there will be no war.
p = 1/25, q = 24/25, n = 50;
P = C(50, 0) (1/25)^0 (24/25)^{50} = (24/25)^{50}
Example
Each of two persons A and B tosses 3 fair coins.
What is the probability that they obtain the same
number of heads?
P(A & B get the same no. of heads)
= P(they get no head each or 1 head each or 2 heads each or 3 heads each)
= P(A gets 0 heads) P(B gets 0 heads) + …
= [C(3, 0)(1/2)³]² + [C(3, 1)(1/2)³]² + [C(3, 2)(1/2)³]² + [C(3, 3)(1/2)³]²
= 5/16
Example.
A game consists of 5 matches between two players, A and B. The player who first wins 3 matches is the winner of the game. Player A wins each match with probability 2/3, and matches are independent. What is the probability that player A wins the game?
Solution:
Player A wins 3 matches or more out of the 5 matches.
This event has probability equal to:
C(5, 3) (2/3)³ (1/3)² + C(5, 4) (2/3)⁴ (1/3) + C(5, 5) (2/3)⁵ = 64/81
Example.
A sequence of independent trials is to be performed.
Each trial results in a success with probability p and
a failure with probability 1–p. What is the probability
that
(a) at least 1 success occurs in the first n trials;
(b) exactly k successes occur in the first n trials?
Solution:
(a) The probability of no success in the first n trials is (1 − p)^n. Thus, the answer is 1 − (1 − p)^n.
(b) C(n, k) p^k (1 − p)^{n−k}
Assuming that p remains the same for all repetitions,
if we consider n independent repetitions (or trials) of
E and if the random variable X denotes the
number of times the event A has occurred, then
X is called a binomial random variable with
parameters n and p
The pmf of a binomial random variable having parameters (n, p) is given by
P(i) = nC_i p^i q^{n−i}, i = 0, 1, …, n, where p + q = 1.
−
Example
It is known that any car produced by an automobile company will be defective with probability 0.01, independently of the others. The company sells the cars in packages of 10 and offers a money-back guarantee that at most 1 of the 10 cars is defective. What proportion of packages sold must the company replace?

P(X > 1) = 1 − {P(X ≤ 1)} = 1 − {P(X = 0) + P(X = 1)}
         = 1 − {10C_0 (.01)^0 (.99)^{10} + 10C_1 (.01)(.99)^9} ≈ .004
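A one-line numerical check of this answer (illustrative):

```python
from math import comb

# Package replaced when more than 1 of the 10 cars is defective, p = 0.01.
p = 0.01
p_replace = 1 - (comb(10, 0) * (1 - p)**10 + comb(10, 1) * p * (1 - p)**9)
print(round(p_replace, 4))  # 0.0043
```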
Mean: E(X) = Σ_{x=0}^{n} x nC_x p^x (1 − p)^{n−x}
= Σ_{x=1}^{n} x nC_x p^x (1 − p)^{n−x}
= Σ_{x=1}^{n} x [n(n − 1)! / (x!(n − x)!)] p^x (1 − p)^{n−x}
= Σ_{x=1}^{n} [n(n − 1)! / ((x − 1)!((n − 1) − (x − 1))!)] p · p^{x−1} (1 − p)^{n−x}
= np Σ_{x=1}^{n} (n−1)C_{x−1} p^{x−1} (1 − p)^{(n−1)−(x−1)}
= np (p + (1 − p))^{n−1} = np,
since Σ_{x=0}^{n} nC_x p^x (1 − p)^{n−x} = (p + (1 − p))^n.

Second moment: μ_2′ = E(X²) = Σ_{x=0}^{n} x² nC_x p^x (1 − p)^{n−x}
Show that the variance is np(1 − p).
Example
For a binomial distribution with mean 6 and standard deviation √2, find the first two terms of the distribution. (Ans. (1/3)^9, 2/2187)
The mean and variance of a binomial distribution are 4 and 3 respectively. Find P(X ≥ 1). (Ans. 1 − (3/4)^{16} = .9899)
Poisson Distribution
The probability density function of the Poisson variate can be obtained as a limiting case of the binomial probability density function under the following assumptions:
(i) The number of trials is increased indefinitely (n → ∞).
(ii) The probability of success in a single trial is very small (p → 0).
(iii) np is a finite constant, say np = λ.

Consider the probability density function of a binomial random variable X:
P(X = x) = nC_x p^x (1 − p)^{n−x} = [n! / (x!(n − x)!)] p^x (1 − p)^{n−x}
= [n(n − 1)…(n − x + 1) / x!] (λ/n)^x (1 − λ/n)^{n−x}
= [n(n − 1)…(n − x + 1) / n^x] (λ^x / x!) (1 − λ/n)^n (1 − λ/n)^{−x}
= (n/n) ((n − 1)/n) … ((n − x + 1)/n) (λ^x / x!) (1 − λ/n)^n (1 − λ/n)^{−x}

For given x, as n → ∞, the terms 1, 1 − 1/n, 1 − 2/n, …, 1 − (x − 1)/n and (1 − λ/n)^{−x} all tend to 1.
Also, Lt_{n→∞} (1 − λ/n)^n = e^{−λ}.
Hence, Lt_{n→∞} P(X = x) = (λ^x / x!) e^{−λ}.
Poisson random variable
A random variable X taking on one of the values 0, 1, 2, … is said to be a Poisson random variable with parameter λ if, for some λ > 0,
P(x) = P(X = x) = e^{−λ} λ^x / x!, x = 0, 1, 2, …

Σ_{x=0}^{∞} P(x) = Σ_{x=0}^{∞} e^{−λ} λ^x / x! = e^{−λ} Σ_{x=0}^{∞} λ^x / x! = e^{−λ} e^{λ} = 1
Mean: E(X) = Σ_{x=0}^{∞} x P(X = x) = Σ_{x=0}^{∞} x e^{−λ} λ^x / x! = Σ_{x=1}^{∞} e^{−λ} λ^x / (x − 1)!
= λ e^{−λ} Σ_{x=1}^{∞} λ^{x−1} / (x − 1)! = λ e^{−λ} [1 + λ/1! + λ²/2! + …] = λ e^{−λ} e^{λ} = λ

Var(X) = λ
Example
The average number of radioactive particles passing through a counter during 1 millisecond in a laboratory experiment is 4. What is the probability that 6 particles enter the counter in a given millisecond?
If the probability of a defective fuse from a manufacturing
unit is 2%, in a box of 200 fuses, find the probability
that exactly 4 fuses are defective.
Ans. e^{−4} 4^6 / 6!,  e^{−4} 4^4 / 4!
Example
At a busy traffic intersection the probability p of an
Individual car having an accident is very small say
p=0.0001. However, during a certain peak hours of
the day say between 4 p.m. and 6 p.m., a large
number of cars (say 1000) pass through the
intersection. Under these conditions what is the
probability of two or more accidents occurring during
that period.
In a component manufacturing industry, there is a small
probability of 1/500 for any component to be defective.
The components are supplied in packets of 10. Use
Poisson distribution to calculate the app no of packets
containing (i) no defective (ii)one defective components
in a consignment of 10,000 packets.
Ans. .9802 × 10000, .0196 × 10000
Binomial Distribution
P(X = r) = nC_r p^r q^{n−r}; r = 0, 1, 2, …, n
If we assume that n trials constitute a set and if we consider N sets, the frequency function of the binomial distribution is given by
f(r) = N p(r) = N nC_r p^r q^{n−r}
Example
Fit a binomial distribution for the following data and hence find the theoretical frequencies:
x: 0 1 2 3 4
f: 5 29 36 25 5
Ans. 7, 26, 37, 24, 6

The following data are the number of seeds germinating out of 10 on damp filter paper for 80 sets of seeds. Fit a binomial distribution to these data:
x: 0 1 2 3 4 5 6 7 8 9 10
f: 6 20 28 12 8 6 0 0 0 0 0
Ans. 6.89, 19.14, 23.94, 17.74, 8.63, 2.88, 0.67, 0.1, 0.01, 0, 0

Fit a Poisson distribution for the following distribution:
x: 0 1 2 3 4 5
f: 142 156 69 27 5 1
Ans. 147, 147, 74, 25, 6, 1

Fit a Poisson distribution to the following data, which give the number of yeast cells per square for 400 squares:
No. of cells per square (x): 0 1 2 3 4 5 6 7 8 9 10
No. of squares (f): 103 143 98 42 8 4 2 0 0 0 0
Ans. 107, 141, 93, 41, 4, 0, 0, 0, 0, 0
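The fitting procedure for the first data set can be sketched as follows (an illustrative method-of-moments fit: estimate p from the sample mean with n = 4, then scale the pmf by N = 100; rounding gives the theoretical frequencies):

```python
from math import comb

# Observed frequencies from the first example above.
x = [0, 1, 2, 3, 4]
f = [5, 29, 36, 25, 5]
N = sum(f)                                    # 100 observations
mean = sum(xi * fi for xi, fi in zip(x, f)) / N
n = 4
p = mean / n                                  # method-of-moments estimate of p

theoretical = [round(N * comb(n, r) * p**r * (1 - p)**(n - r)) for r in range(n + 1)]
print(theoretical)  # [7, 26, 37, 24, 6]
```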
GEOMETRIC DISTRIBUTION
Suppose that independent trials, each having a probability p of success, are performed until a success occurs. If we let X = n, then P(X = n) = (1 − p)^{n−1} p, n = 1, 2, …
Since, in order that X = n, it is necessary and sufficient that the first (n − 1) trials are failures and the nth trial is a success,
Σ_{n=1}^{∞} P(X = n) = Σ_{n=1}^{∞} (1 − p)^{n−1} p = p / (1 − (1 − p)) = 1.
Geometric Random Variable
In a chemical engineering process industry it is
known that , on the average, 1 in every 100 items
is defective. What is the probability that the fifth
item inspected is the first defective item found.
Ans. (.01)(.99)^4 = .0096
At busy times, a telephone exchange works at full capacity, so people cannot get a line to
use immediately. It may be of interest to know the
number of attempts necessary in order to get a
connection. Suppose that p = 0.05, then find the
probability that 5 attempts are necessary for a
successful call connection.
Ans .041
Mean
E(X) = Σ_{n=1}^{∞} n (1 − p)^{n−1} p = p Σ_{n=1}^{∞} n t^{n−1},  t = 1 − p
= p [1 + 2t + 3t² + 4t³ + …]
= p [1 − t]^{−2} = p [1 − (1 − p)]^{−2} = p/p² = 1/p   (since |t| < 1)

E(X²) = Σ_{n=1}^{∞} n² (1 − p)^{n−1} p = Σ_{n=1}^{∞} n² t^{n−1} p
= Σ_{n=1}^{∞} [n(n − 1) + n] t^{n−1} p
= pt Σ_{n=1}^{∞} n(n − 1) t^{n−2} + Σ_{n=1}^{∞} n t^{n−1} p
= pt [2 + 3·2 t + 4·3 t² + 5·4 t³ + …] + 1/p
= 2pt [1 + 3t + 6t² + …] + 1/p = 2pt [1 − t]^{−3} + 1/p
= 2p(1 − p)/p³ + 1/p = 2(1 − p)/p² + 1/p

Var(X) = E(X²) − (E(X))² = 2(1 − p)/p² + 1/p − 1/p² = (1 − p)/p²
Negative Binomial Distribution
Trials are repeated until a fixed number of successes occur: instead of finding the probability of r successes in n trials where n is fixed, we want the probability that the rth success occurs on the xth trial.
The number of trials x needed to produce r successes in a negative binomial experiment is called a negative binomial random variable, and its probability distribution is called the negative binomial distribution.
The negative binomial distribution is used when the number of successes is fixed and we're interested in the number of failures before reaching the fixed number of successes. An experiment which follows a negative binomial distribution satisfies the following requirements:
1. The experiment consists of a sequence of independent trials.
2. Each trial has two possible outcomes, S or F.
3. The probability of success is constant from one trial to another.
4. The experiment continues until a total of r successes are observed, where r is fixed in advance.
Suppose we repeatedly throw a die, and consider a "1" to be a "success". The probability of success on each trial is 1/6. The number of trials needed to get three successes belongs to the infinite set { 3, 4, 5, 6, … }. That number of trials is a (displaced) negative-binomially distributed random variable.
The number of failures before the third success belongs to the infinite set { 0, 1, 2, 3, … }. That number of failures is also a negative-binomially distributed random variable.
A Bernoulli process is a discrete time process, and so the number of trials, failures, and successes are integers. For the special case where r is an integer, the negative binomial distribution is known as the Pascal distribution.
A further specialization occurs when r = 1: in this case we get the probability distribution of failures before the first success (i.e. the probability of success on the (k+1)th trial), which is a geometric distribution.
Let the random variable Y denote the number of failures before the occurrence of the rth success. Then Y + r denotes the number of trials necessary to produce exactly r successes and y failures, with the rth success occurring at the (y + r)th trial.
The pmf of Y is
g(y) = (y+r−1)C_(r−1) p^{r−1} q^y · p = (y+r−1)C_(r−1) p^r q^y,  q = 1 − p, y = 0, 1, 2, …
Pat is required to sell candy bars to raise money for
the 6th grade field trip. There are thirty houses in the
neighborhood, and Pat is not supposed to return home
until five candy bars have been sold. So the child goes
door to door, selling candy bars. At each house, there
is a 0.4 probability of selling one candy bar and a 0.6
probability of selling nothing.
Example
What’s the probability that Pat finishes on the tenth
house?
A fair die is cast on successive independent trials until the second six is observed. The probability of observing exactly ten non-sixes before the second six is cast is
C(11, 1) (1/6)² (5/6)^{10}

Find the probability that a person tossing three coins will get either all heads or all tails for the second time on the fifth toss.
Ans. C(4, 1) (1/4)² (3/4)³
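These negative-binomial probabilities (Pat's tenth house, the ten non-sixes, and the coin question) can all be evaluated with one helper. An illustrative sketch:

```python
from math import comb

def neg_binomial_pmf(y, r, p):
    # P(y failures before the r-th success) = C(y+r-1, r-1) p^r q^y
    return comb(y + r - 1, r - 1) * p**r * (1 - p)**y

# Pat's 5th sale on the 10th house: 5 failures before the 5th success, p = 0.4.
p_pat = neg_binomial_pmf(5, 5, 0.4)
# Ten non-sixes before the second six, p = 1/6.
p_die = neg_binomial_pmf(10, 2, 1 / 6)
# Second "all heads or all tails" on the fifth toss: 3 failures before the 2nd success, p = 1/4.
p_coins = neg_binomial_pmf(3, 2, 0.25)
print(round(p_pat, 4), round(p_die, 4), round(p_coins, 4))  # ~0.1003, ~0.0493, ~0.1055
```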
Relationship between the binomial and negative
binomial
Let X have a binomial distribution with parameters
n,p. Let Y have a negative binomial distribution with
parameters r and p. (That is Y = no of trials required
to obtain r successes with probability of success p).
Then
(i) P(Y ≤ n) = P(X ≥ r)
(ii) P(Y > n) = P(X < r)
The probability that an experiment will succeed is 0.8.
If the experiment is repeated until four successful
outcomes have occurred, what is the expected number
of repetitions required?
Ans. 1
Check that the pmf sums to 1:
Σ_{y=0}^{∞} g(y) = Σ_{y=0}^{∞} (y+r−1)C_(r−1) p^r q^y
= p^r Σ_{y=0}^{∞} [(y + r − 1)! / ((r − 1)! y!)] q^y
= p^r Σ_{y=0}^{∞} (y+r−1)C_y q^y
= p^r (1 − q)^{−r} = p^r p^{−r} = 1.
Mean = E(Y) = rq/p,  Variance = rq/p²

(These follow from the expansion
p^r [1 + rq + r(r + 1)/2! q² + …] = p^r (1 − q)^{−r} = 1,
differentiating the series term by term.)
PROBABILITY
DISTRIBUTIONS
Probability Density Function
[Figure: pdf of a uniform distribution on [0, u].]

f_X(x) = 0    if x < 0
       = 1/u  if 0 ≤ x ≤ u
       = 0    if u < x
Cumulative Distribution Function
[Figure: cdf of a uniform distribution on [0, u].]

F_X(x) = 0    if x < 0
       = x/u  if 0 ≤ x ≤ u
       = 1    if u < x
If X is uniformly distributed over the interval [0,10]
compute the probability (a) 2<X<9(b) 1<X<4
(c)X<5 (d) X>6
Ans. 7/10,3/10,5/10,4/10
mean: E(X) = ∫_a^b x f(x) dx = ∫_a^b x · 1/(b − a) dx = (a + b)/2
Buses arrive at a specific stop at 15-min. intervals
starting at 7 A.M., that is, they arrive at 7, 7:15, 7:30,
and so on. If a passenger arrives at the stop at a
random time that is uniformly distributed between
7 and 7:30A.M., find the probability that he waits
(a) less than 5 min (b) at least 12 min. for a bus.
Ans. 1/3,1/5
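The bus example can be checked by simulation (an illustrative sketch: the wait is 15 − (t mod 15) for an arrival t minutes after 7:00):

```python
import random

# Passenger arrival time uniform on (0, 30) minutes after 7:00; buses at 0, 15, 30.
random.seed(0)
n = 100_000
waits = [15 - random.uniform(0, 30) % 15 for _ in range(n)]

p_less5 = sum(w < 5 for w in waits) / n        # exact value 1/3
p_atleast12 = sum(w >= 12 for w in waits) / n  # exact value 1/5
print(round(p_less5, 2), round(p_atleast12, 2))
```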
If X has uniform distribution in (−3, 3), find P(|X − 2| < 2). (Ans. 1/2)
If X has uniform distribution in (−a, a), a > 0, find a such that P(|X| < 1) = P(|X| > 1). (Ans. a = 2)
Example:
The total time it takes First Bank to process a loan application is
uniformly distributed between 3 and 7 days. What is the
probability that the application will be processed in less than 4
days?
What is the probability that it will take more than 6.5 days?
P(3 ≤ x ≤ 4) = (4 − 3) · 1/(7 − 3) = .25
P(6.5 ≤ x ≤ 7) = (7 − 6.5) · 1/(7 − 3) = .125

[Figure: uniform density of height 1/(b − a) between a = 3 and b = 7; total area (b − a) · 1/(b − a) = 1.]
variance = E(X²) − (E(X))²
E(X) = (a + b)/2
E(X²) = ∫_a^b x² f(x) dx = [x³ / (3(b − a))]_a^b = (b³ − a³) / (3(b − a)) = (b² + ab + a²)/3

var = (a² + ab + b²)/3 − (a² + 2ab + b²)/4 = (a² + b² − 2ab)/12 = (b − a)²/12
moments: E(X^n) = (b^{n+1} − a^{n+1}) / ((n + 1)(b − a))
central moments: μ_r = E{(X − E(X))^r}
= ∫_a^b (x − (a + b)/2)^r · 1/(b − a) dx
= [1/(b − a)] [ (x − (a + b)/2)^{r+1} / (r + 1) ]_a^b
= [((b − a)/2)^{r+1} − ((a − b)/2)^{r+1}] / ((r + 1)(b − a))
= 0                          if r is odd
= [1/(r + 1)] ((b − a)/2)^r  if r is even

i.e. μ_{2n−1} = 0, μ_{2n} = [1/(2n + 1)] ((b − a)/2)^{2n}, for n = 1, 2, 3, …
−
dx
a b
1
*
2
b a
x
E(X)  X E( deviation mean
b
a
∫
−
+
− ·
·
( ) a b
4
1
− ·
The gamma function, denoted by Γ, is defined as
Γ(x) = ∫_0^∞ e^{−t} t^{x−1} dt, x > 0

Properties
1. Γ(x) = [−e^{−t} t^{x−1}]_0^∞ + (x − 1) ∫_0^∞ e^{−t} t^{x−2} dt
        = (x − 1) ∫_0^∞ e^{−t} t^{(x−1)−1} dt = (x − 1) Γ(x − 1)
2. Γ(1) = ∫_0^∞ e^{−t} dt = 1
3. Put x = n: Γ(n) = (n − 1) Γ(n − 1) = (n − 1)(n − 2) Γ(n − 2) = …
             = (n − 1)(n − 2) … 2 · 1 = (n − 1)!
   Γ(1) = 0! = 1
Exponential Distribution
If the occurrences of events over non-overlapping intervals are independent, such as arrival times of telephone calls or bus arrival times at a bus stop, then the waiting time distribution of these events can be shown to be exponential.
•
Time between arrivals to a queue (e.g. time between people arriving at a line to check out in a department store; people, machines, or telephone calls may wait in a queue)
•
Lifetime of components in a machine

Exponential distribution:
f(x) = λ e^{−λx}, x ≥ 0;  0 otherwise,  with parameter λ > 0
moments: μ_r′ = E(X^r) = ∫_0^∞ x^r λ e^{−λx} dx
= (1/λ^r) ∫_0^∞ y^r e^{−y} dy = Γ(r + 1)/λ^r = r!/λ^r
Example
Let X have an exponential distribution with mean of
100 . Find the probability that X<90
Ans. 0.593
Customers arrive in a certain shop according to an
approximate Poisson process at a mean rate of 20
per hour. What is the probability that the shopkeeper
will have to wait for more than 5 minutes for his first
customer to arrive?
Ans. e^{−5/3}
Memoryless Property of the Exponential Distribution
If X is exponentially distributed, then
P(X > s + t / X > s) = P(X > t), for any s, t > 0.

P(X > k) = ∫_k^∞ λ e^{−λx} dx = [−e^{−λx}]_k^∞ = e^{−λk}

Now P(X > s + t / X > s) = P{X > s + t & X > s} / P{X > s}
= P{X > s + t} / P{X > s} = e^{−λ(s+t)} / e^{−λs}
= e^{−λt} = P(X > t).
If X represents the lifetime of a piece of equipment, then the
above property states that if the equipment has
been working for time s, the probability that it
will survive an additional time t depends only on t
(not on s) and is identical to the probability of survival
for time t of a new piece of equipment.
The equipment does not remember that it has been in
use for time s.
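The memoryless property is easy to verify from the survival function P(X > x) = e^{-λx}; the rate λ = 0.5 and the times s, t below are illustrative choices, not values from the text:

```python
import math

lam = 0.5  # illustrative rate parameter

def survival(x):
    """P(X > x) for an exponential RV with rate lam."""
    return math.exp(-lam * x)

s, t = 3.0, 2.0
# P(X > s+t | X > s) = P(X > s+t) / P(X > s) should equal P(X > t)
cond = survival(s + t) / survival(s)
assert math.isclose(cond, survival(t))
print(round(cond, 4))  # 0.3679, i.e. e^{-1}
```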
A crew of workers has 3 interchangeable machines,
of which 2 must be working for the crew to do its job.
When in use, each machine will function for an
exponentially distributed time with parameter λ
before breaking down. The workers decide to initially
use machines A and B and keep machine C in
reserve to replace whichever of A or B breaks down
first. They will then be able to continue working until
one of the remaining machines breaks down. When
the crew is forced to stop working because only
one of the machines has not yet broken down, what
is the probability that the still operable machine is
machine C?
Suppose the life length of an appliance has an
exponential distribution with mean 10 years.
A used appliance is bought by someone. What is
the probability that it will not fail in the next 5 years?
Ans. By the memoryless property the appliance's prior use
does not matter: e^{-5/10} ≈ 0.607.

Suppose that the amount of waiting time a customer
spends at a restaurant has an exponential distribution
with a mean value of 5 minutes. Then find the
probability that a customer will spend more than
10 minutes in the restaurant.
Ans. e^{-2} ≈ 0.1353
Example

Suppose that the length of a phone call in minutes is
an exponential random variable with parameter λ = 1/10.
If A arrives immediately ahead of B at a public
telephone booth, find the probability that B will have
to wait (i) more than 10 minutes, and (ii) between
10 and 20 minutes.
(Ans. (i) e^{-1} ≈ 0.368; (ii) e^{-1} - e^{-2} ≈ 0.233)
Erlang distribution or General Gamma distribution

A continuous RV X is said to follow an Erlang distribution
(or General Gamma distribution) with parameters λ > 0, k > 0
if its pdf is given by

    f(x) = λ^k x^{k-1} e^{-λx} / Γ(k),   for x ≥ 0
         = 0,                            otherwise.

Note that ∫_0^∞ f(x) dx = 1.
When λ = 1, the pdf reduces to

    f(x) = x^{k-1} e^{-x} / Γ(k),   x ≥ 0, k > 0,

the Gamma distribution (or simple gamma distribution)
with parameter k.

When k = 1, the Erlang distribution reduces to the
exponential distribution with parameter λ > 0.
Example

The random variable X has the gamma distribution with
density function

    f(x) = 2 e^{-2x},   x > 0
         = 0,           x ≤ 0.

Find the probability that X is not smaller than 3.

    P(X ≥ 3) = ∫_3^∞ 2 e^{-2x} dx = [ -e^{-2x} ]_3^∞ = e^{-6}.
Suppose that an average of 30 customers per hour
arrive at a shop in accordance with a Poisson process;
that is, if a minute is our unit, then λ = 1/2. What
is the probability that the shopkeeper waits more than
5 minutes before both of the first two customers arrive?

Solution.
If X denotes the waiting time in minutes until the
second customer arrives, then X has an Erlang (Gamma)
distribution with k = 2, λ = 1/2, and

    P(X > 5) = ∫_5^∞ λ^k x^{k-1} e^{-λx} / Γ(k) dx ≈ 0.287.
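For integer k this integral has the closed form P(X > x) = e^{-λx} Σ_{j=0}^{k-1} (λx)^j / j!, which reproduces the 0.287 above; a sketch:

```python
import math

def erlang_survival(x, k, lam):
    """P(X > x) for Erlang(k, lam) with integer k:
    e^{-lam*x} * sum_{j=0}^{k-1} (lam*x)^j / j!"""
    return math.exp(-lam * x) * sum(
        (lam * x) ** j / math.factorial(j) for j in range(k))

# Waiting time until the second arrival: k = 2, lam = 1/2 per minute
print(round(erlang_survival(5, 2, 0.5), 3))  # 0.287
```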
Mean and Variance of the Erlang Distribution

    µ'_r = E(X^r) = ∫_0^∞ λ^k x^{k+r-1} e^{-λx} / Γ(k) dx
         = λ^k / (Γ(k) λ^{k+r}) ∫_0^∞ t^{k+r-1} e^{-t} dt     (t = λx)
         = Γ(k+r) / (λ^r Γ(k)).

Hence

    mean E(X) = k/λ,    var(X) = k/λ².
m.g.f.

    M_X(t) = E(e^{tX})
           = λ^k / Γ(k) ∫_0^∞ x^{k-1} e^{tx} e^{-λx} dx
           = λ^k / Γ(k) ∫_0^∞ x^{k-1} e^{-(λ-t)x} dx
           = λ^k / (Γ(k) (λ-t)^k) ∫_0^∞ y^{k-1} e^{-y} dy     (y = (λ-t)x)
           = λ^k / (λ-t)^k
           = (1 - t/λ)^{-k},   t < λ.
For independent Erlang variables X_1, X_2, ....., X_n with
parameters (λ, k_1), (λ, k_2), ....., (λ, k_n),

    M_{X_1 + X_2 + ... + X_n}(t) = M_{X_1}(t) M_{X_2}(t) ..... M_{X_n}(t)
        = (1 - t/λ)^{-k_1} (1 - t/λ)^{-k_2} ..... (1 - t/λ)^{-k_n}
        = (1 - t/λ)^{-(k_1 + k_2 + ... + k_n)}.
Reproductive Property

The sum of a finite number of Erlang variables is also
an Erlang variable: if X_1, X_2, ...., X_n are independent
Erlang variables with parameters (λ, k_1), (λ, k_2), ......,
(λ, k_n), then X_1 + X_2 + .... + X_n is also an Erlang
variable with parameters (λ, k_1 + k_2 + ..... + k_n).
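A simulation sketch of the reproductive property (the parameter values below are illustrative): sums of draws from Erlang(λ, k_1) and Erlang(λ, k_2) should match the mean (k_1+k_2)/λ and variance (k_1+k_2)/λ² of Erlang(λ, k_1+k_2):

```python
import random
import statistics

random.seed(1)
lam = 2.0
k1, k2 = 3, 4  # illustrative shape parameters

# random.gammavariate(shape, scale) draws a gamma/Erlang variate; scale = 1/lam
samples = [random.gammavariate(k1, 1 / lam) + random.gammavariate(k2, 1 / lam)
           for _ in range(200_000)]
mean = statistics.fmean(samples)
var = statistics.variance(samples)

# Erlang(lam, k1+k2) has mean (k1+k2)/lam = 3.5, variance (k1+k2)/lam^2 = 1.75
assert abs(mean - (k1 + k2) / lam) < 0.02
assert abs(var - (k1 + k2) / lam ** 2) < 0.05
print(round(mean, 2), round(var, 2))
```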
If a company employs n salespersons, its gross sales in
thousands of rupees may be regarded as a RV having an
Erlang distribution with λ = 1/2 and k = 80n. If the sales
cost is Rs. 8000 per person, how many salespersons should
the company employ to maximise the expected profit?

Let X represent the gross sales (in thousands of rupees) by
the n salespersons. X has an Erlang distribution, so the
expected gross sales are E(X) = k/λ = 160n.
Weibull Distribution

    f(x) = α β x^{β-1} e^{-α x^β},   x > 0,   parameters α, β > 0.

When β = 1, the Weibull distribution reduces to the
exponential distribution with parameter α.
Moments

    µ'_r = E(X^r) = ∫_0^∞ x^r α β x^{β-1} e^{-α x^β} dx
         = α^{-r/β} ∫_0^∞ y^{r/β} e^{-y} dy        (y = α x^β)
         = α^{-r/β} Γ(1 + r/β).

Hence

    Mean E(X) = µ'_1 = α^{-1/β} Γ(1 + 1/β)

    Var(X) = α^{-2/β} [ Γ(1 + 2/β) - { Γ(1 + 1/β) }² ].
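A small check of these formulas: with β = 1 the Weibull reduces to the exponential with rate α, so the mean and variance must come out to 1/α and 1/α² (α = 2 is an arbitrary test value):

```python
import math

alpha, beta = 2.0, 1.0  # beta = 1 reduces the Weibull to exponential(alpha)

mean = alpha ** (-1 / beta) * math.gamma(1 + 1 / beta)
var = alpha ** (-2 / beta) * (math.gamma(1 + 2 / beta)
                              - math.gamma(1 + 1 / beta) ** 2)

assert math.isclose(mean, 1 / alpha)      # exponential mean 1/alpha
assert math.isclose(var, 1 / alpha ** 2)  # exponential variance 1/alpha^2
print(mean, var)  # 0.5 0.25
```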
Each of the 6 tubes of a radio set has a life length (in
years) which may be considered as a RV that follows a
Weibull distribution with parameters α = 25 and β = 2.
If these tubes function independently of one another,
what is the probability that no tube will have to be
replaced during the first 2 months of service?

If X represents the life length of each tube, then its
density function is given by

    f(x) = α β x^{β-1} e^{-α x^β} = 50 x e^{-25x²},   x > 0.

    P(X > 1/6) = ∫_{1/6}^∞ 50 x e^{-25x²} dx
               = [ -e^{-25x²} ]_{1/6}^∞ = e^{-25/36}

is the probability that a single tube is not to be replaced
during the first 2 months. Hence

    P(all 6 tubes are not to be replaced during the first
      2 months) = (e^{-25/36})^6 ≈ 0.0155.
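The final arithmetic, (e^{-25/36})^6 = e^{-25/6}, in code:

```python
import math

p_one = math.exp(-25 / 36)  # one tube survives the first 2 months (1/6 year)
p_all = p_one ** 6          # all six tubes survive, by independence
print(round(p_all, 4))      # 0.0155
```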
Properties of a
Normal Distribution
•
Continuous Random Variable
•
Symmetrical in shape (Bell shaped)
•
The probability of any given range of
numbers is represented by the area under
the curve for that range.
•
Probabilities for all normal distributions are
determined using the Standard Normal
Distribution.
Probability for a
Continuous Random Variable
Probability Density Function for
Normal Distribution

    f(x) = (1 / (σ √(2π))) e^{ -(1/2) ((x-µ)/σ)² },
           -∞ < x < ∞,  -∞ < µ < ∞,  σ > 0,

written N(µ, σ).
Example:

    f(x) = (1 / √(32π)) e^{ -(1/2) ((x+7)/4)² },
           -∞ < x < ∞,

is the pdf of N(-7, 4). As for every pdf,

    ∫_{-∞}^∞ f(x) dx = 1.
Standard Normal Distribution
N(0,1)

Putting µ = 0, σ = 1 and changing x and f respectively
into z and φ:

    φ(z) = (1/√(2π)) e^{-z²/2},   -∞ < z < ∞.

If X has distribution N(µ, σ) and Z = (X - µ)/σ,
then Z has distribution N(0,1).

Values of φ(z) and ∫_0^z φ(z) dz are tabulated.
Mean of X ~ N(µ, σ):

    E(X) = ∫_{-∞}^∞ x f(x) dx
         = (1/(σ√(2π))) ∫_{-∞}^∞ x e^{-(x-µ)²/(2σ²)} dx
         = (1/√π) ∫_{-∞}^∞ (µ + √2 σ t) e^{-t²} dt      (t = (x-µ)/(√2 σ))
         = (µ/√π) ∫_{-∞}^∞ e^{-t²} dt + (√2 σ/√π) ∫_{-∞}^∞ t e^{-t²} dt
         = (µ/√π) √π + 0 = µ.

Similarly, var(X) = σ².
Figure 6.3
Figure 6.5
Determining the Probability for a
Standard Normal Random Variable
•
P(-∞ < Z ≤ 1.62) = .5 + .4474 = .9474
•
P(Z > 1.62) = 1 - P(-∞ < Z ≤ 1.62) =
1 - .9474 = .0526
z 0.00 0.01 0.02 0.03 0.04 0.05 0.06 0.07 0.08 0.09
0.0 0.0000 0.0040 0.0080 0.0120 0.0160 0.0199 0.0239 0.0279 0.0319 0.0359
0.1 0.0398 0.0438 0.0478 0.0517 0.0557 0.0596 0.0636 0.0675 0.0714 0.0753
0.2 0.0793 0.0832 0.0871 0.0910 0.0948 0.0987 0.1026 0.1064 0.1103 0.1141
0.3 0.1179 0.1217 0.1255 0.1293 0.1331 0.1368 0.1406 0.1443 0.1480 0.1517
0.4 0.1554 0.1591 0.1628 0.1664 0.1700 0.1736 0.1772 0.1808 0.1844 0.1879
0.5 0.1915 0.1950 0.1985 0.2019 0.2054 0.2088 0.2123 0.2157 0.2190 0.2224
0.6 0.2257 0.2291 0.2324 0.2357 0.2389 0.2422 0.2454 0.2486 0.2517 0.2549
0.7 0.2580 0.2611 0.2642 0.2673 0.2704 0.2734 0.2764 0.2794 0.2823 0.2852
0.8 0.2881 0.2910 0.2939 0.2967 0.2995 0.3023 0.3051 0.3078 0.3106 0.3133
0.9 0.3159 0.3186 0.3212 0.3238 0.3264 0.3289 0.3315 0.3340 0.3365 0.3389
1.0 0.3413 0.3438 0.3461 0.3485 0.3508 0.3531 0.3554 0.3577 0.3599 0.3621
1.1 0.3643 0.3665 0.3686 0.3708 0.3729 0.3749 0.3770 0.3790 0.3810 0.3830
1.2 0.3849 0.3869 0.3888 0.3907 0.3925 0.3944 0.3962 0.3980 0.3997 0.4015
1.3 0.4032 0.4049 0.4066 0.4082 0.4099 0.4115 0.4131 0.4147 0.4162 0.4177
1.4 0.4192 0.4207 0.4222 0.4236 0.4251 0.4265 0.4279 0.4292 0.4306 0.4319
1.5 0.4332 0.4345 0.4357 0.4370 0.4382 0.4394 0.4406 0.4418 0.4429 0.4441
1.6 0.4452 0.4463 0.4474 0.4484 0.4495 0.4505 0.4515 0.4525 0.4535 0.4545
1.7 0.4554 0.4564 0.4573 0.4582 0.4591 0.4599 0.4608 0.4616 0.4625 0.4633
1.8 0.4641 0.4649 0.4656 0.4664 0.4671 0.4678 0.4686 0.4693 0.4699 0.4706
1.9 0.4713 0.4719 0.4726 0.4732 0.4738 0.4744 0.4750 0.4756 0.4761 0.4767
2.0 0.4772 0.4778 0.4783 0.4788 0.4793 0.4798 0.4803 0.4808 0.4812 0.4817
2.1 0.4821 0.4826 0.4830 0.4834 0.4838 0.4842 0.4846 0.4850 0.4854 0.4857
2.2 0.4861 0.4864 0.4868 0.4871 0.4875 0.4878 0.4881 0.4884 0.4887 0.4890
2.3 0.4893 0.4896 0.4898 0.4901 0.4904 0.4906 0.4909 0.4911 0.4913 0.4916
2.4 0.4918 0.4920 0.4922 0.4925 0.4927 0.4929 0.4931 0.4932 0.4934 0.4936
2.5 0.4938 0.4940 0.4941 0.4943 0.4945 0.4946 0.4948 0.4949 0.4951 0.4952
2.6 0.4953 0.4955 0.4956 0.4957 0.4959 0.4960 0.4961 0.4962 0.4963 0.4964
2.7 0.4965 0.4966 0.4967 0.4968 0.4969 0.4970 0.4971 0.4972 0.4973 0.4974
2.8 0.4974 0.4975 0.4976 0.4977 0.4977 0.4978 0.4979 0.4979 0.4980 0.4981
2.9 0.4981 0.4982 0.4982 0.4983 0.4984 0.4984 0.4985 0.4985 0.4986 0.4986
3.0 0.4987 0.4987 0.4987 0.4988 0.4988 0.4989 0.4989 0.4989 0.4990 0.4990
3.1 0.4990 0.4991 0.4991 0.4991 0.4992 0.4992 0.4992 0.4992 0.4993 0.4993
3.2 0.4993 0.4993 0.4994 0.4994 0.4994 0.4994 0.4994 0.4995 0.4995 0.4995
3.3 0.4995 0.4995 0.4995 0.4996 0.4996 0.4996 0.4996 0.4996 0.4996 0.4997
3.4 0.4997 0.4997 0.4997 0.4997 0.4997 0.4997 0.4997 0.4997 0.4997 0.4998
Determining the probability of any Normal Random
Variable
Interpreting Z
•
In the figure, Z = -0.8 means that the value 360
is .8 standard deviations below the mean.
•
A positive value of Z designates how many
standard deviations (σ ) X is to the right of
the mean (µ ).
•
A negative value of Z designates how many
standard deviations (σ ) X is to the left of the
mean (µ ).
Example: A group of achievement scores are
normally distributed with a mean of 76 and a
standard deviation of 4. If one score is randomly
selected what is the probability that it is at least 80.
With µ = 76 and σ = 4:

    Z = (x - µ)/σ = (80 - 76)/4 = 1

    P(x ≥ 80) = P(z ≥ 1) = .5 - P(0 ≤ z ≤ 1)
              = .5 - .3413 = .1587
Continuing, what is the probability that it is less than 70?

    Z = (x - µ)/σ = (70 - 76)/4 = -1.5

    P(x ≤ 70) = P(z ≤ -1.5) = .5 - P(-1.5 ≤ z ≤ 0.0)
              = .5 - .4332 = .0668
What proportion of the scores occurs between 70 and 85?

    Z = (70 - 76)/4 = -1.5,    Z = (85 - 76)/4 = 2.25

    P(70 ≤ x ≤ 85) = P(-1.5 ≤ z ≤ 2.25)
                   = P(-1.5 ≤ z ≤ 0.0) + P(0.0 ≤ z ≤ 2.25)
                   = .4332 + .4878 = .9210
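The three probabilities above can also be reproduced without tables via the error function, since Φ(z) = ½(1 + erf(z/√2)); a sketch for the µ = 76, σ = 4 example:

```python
import math

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

mu, sigma = 76, 4
print(f"{1 - phi((80 - mu) / sigma):.4f}")                       # 0.1587
print(f"{phi((70 - mu) / sigma):.4f}")                           # 0.0668
print(f"{phi((85 - mu) / sigma) - phi((70 - mu) / sigma):.4f}")  # 0.9210
```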
Time required to finish an exam is known to be normally
distributed with a mean of 60 min. and a std. dev. of 12
minutes. How much time should be allowed in order for
90% of the students to finish?

We need x with P(X ≤ x) = .9, i.e. z = (x - µ)/σ = 1.28, so

    x = µ + z σ = 60 + 1.28(12) = 75.36 minutes.
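The same inverse calculation in code (z = 1.28 is the 90th-percentile value read from the standard normal table):

```python
mu, sigma = 60, 12
z90 = 1.28  # z with P(Z <= z) = 0.90, from the standard normal table
x = mu + z90 * sigma
print(round(x, 2))  # 75.36 minutes
```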
An automated machine that fills sugar sacks has an adjusting device to
change the mean fill per sack. It is now being operated at a setting that
results in a mean fill of 81.5 oz. If only 1% of the sacks filled at this
setting contain less than 80.0 oz, what is the value of the variance for this
population of fill weights? (Assume normality.)

P(X < 80.0) = .01 corresponds to z = -2.33, so

    (80.0 - 81.5)/σ = -2.33
    σ = 1.5/2.33 = .6437,    σ² = .4144
Moment generating function of N(0,1)

    M_Z(t) = E(e^{tZ}) = ∫_{-∞}^∞ e^{tz} φ(z) dz
           = (1/√(2π)) ∫_{-∞}^∞ e^{tz} e^{-z²/2} dz
           = (1/√(2π)) ∫_{-∞}^∞ e^{-(z² - 2tz)/2} dz
           = (1/√(2π)) ∫_{-∞}^∞ e^{-{(z-t)² - t²}/2} dz
           = e^{t²/2} (1/√(2π)) ∫_{-∞}^∞ e^{-(z-t)²/2} dz
           = e^{t²/2},

since the last integral is that of the N(t, 1) density and so
equals 1 (equivalently, the substitution u = (z-t)²/2 turns it
into (1/√π) Γ(1/2) = 1).
The moment generating function of N(µ, σ):

    M_X(t) = M_{σZ+µ}(t) = E(e^{t(σz+µ)})
           = e^{µt} E(e^{tσz}) = e^{µt} M_Z(σt)
           = e^{µt} e^{σ²t²/2} = e^{µt + σ²t²/2}
           = 1 + (t/1!)(µ + σ²t/2) + (t²/2!)(µ + σ²t/2)² + ..... ∞
If X has the distribution N(µ, σ), then Y = aX + b has the
distribution N(aµ + b, aσ):

    M_X(t) = e^{µt + σ²t²/2}

    M_Y(t) = M_{aX+b}(t) = e^{bt} M_X(at)
           = e^{bt} e^{µ(at) + σ²(at)²/2}
           = e^{(aµ+b)t + (aσ)²t²/2},

which is the MGF of N(aµ + b, aσ).
In particular, if X has distribution N(µ, σ) and
Z = (X - µ)/σ = (1/σ)X - µ/σ, then Z has the distribution
N(µ/σ - µ/σ, σ·(1/σ)) = N(0, 1).
Additive property of the normal distribution

If X_i (i = 1, 2, ...., n) are n independent normal RVs
with mean µ_i and variance σ_i², then Σ_{i=1}^n a_i X_i is
also a normal RV with mean Σ_{i=1}^n a_i µ_i and
variance Σ_{i=1}^n a_i² σ_i².

    M_{Σ a_i X_i}(t) = M_{a_1 X_1}(t) M_{a_2 X_2}(t) ...... M_{a_n X_n}(t)   (by independence)
        = e^{a_1 µ_1 t + a_1² σ_1² t²/2} . e^{a_2 µ_2 t + a_2² σ_2² t²/2} ........
        = e^{(Σ a_i µ_i) t + (Σ a_i² σ_i²) t²/2}.
Normal approximation to the binomial

When n is very large and neither p nor q is very small,
X ~ B(n, p), and the standardised binomial variable Z is
given by

    Z = (X - np) / √(npq).

As X varies from 0 to n with step size 1, Z varies from
-np/√(npq) to nq/√(npq) with step size 1/√(npq), and

    P(Z ≤ z) = F(z) → (1/√(2π)) ∫_{-∞}^z e^{-t²/2} dt,   -∞ < z < ∞.
Let X be the number of times that a fair coin, flipped 40
times, lands heads. Find P(X = 20). Use the normal
approximation and compare it to the exact solution.

Here np = 20 and npq = 10, so with the continuity correction

    P(X = 20) ≈ P(19.5 < X < 20.5)
             = P( (19.5-20)/√10 < (X-20)/√10 < (20.5-20)/√10 )
             = φ(.16) - φ(-.16) = .1272
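The approximation can be compared against the exact binomial probability; using the unrounded z = 0.5/√10 (the table rounds it to .16) gives 0.1256 against the exact 0.1254:

```python
import math

n, p = 40, 0.5
mu, sigma = n * p, math.sqrt(n * p * (1 - p))  # 20 and sqrt(10)

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

approx = phi((20.5 - mu) / sigma) - phi((19.5 - mu) / sigma)
exact = math.comb(n, 20) * p ** n  # C(40,20) / 2^40
print(round(approx, 4), round(exact, 4))  # 0.1256 0.1254
```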
If 20% of the memory chips made in a certain plant
are defective, what are the probabilities that in a
lot of 100 randomly chosen for inspection
(a) at most 15 will be defective?
(b) exactly 15 will be defective?

    µ = 100(.20) = 20,   σ = √(100 × .2 × .8) = 4

(a) F( (15.5 - 20)/4 ) = 0.1292

(b) F( (15.5 - 20)/4 ) - F( (14.5 - 20)/4 ) = 0.0454
Fit a normal distribution to the following distribution
and hence find the theoretical frequencies:

Class    Freq
60-65       3
65-70      21
70-75     150
75-80     335
80-85     336
85-90     135
90-95      26
95-100      4
         ----
         1000
Rayleigh distribution

If the pdf of a continuous RV X is

    f(x) = (x/σ²) e^{-x²/(2σ²)},   0 < x < ∞,

then X follows a Rayleigh distribution with parameter σ².

In communication systems, the signal amplitude values
of a randomly received signal usually can be modeled
as a Rayleigh distribution.

The special case of the Weibull distribution
f(x) = α β x^{β-1} e^{-α x^β} with α = 1/(2σ²) and β = 2
is known as the Rayleigh distribution. Thus the Rayleigh
distribution has a linear failure rate.
Chi-square distribution

Let X have a gamma distribution with λ = 1/2 and k = r/2,
where r is a positive integer. The pdf of X is

    f(x) = x^{r/2 - 1} e^{-x/2} / ( Γ(r/2) 2^{r/2} ),   0 ≤ x < ∞
         = 0,                                           x < 0.

X has a chi-square distribution χ²(r) with r degrees of
freedom.

    E(X) = k/λ = (r/2)/(1/2) = r

    Var(X) = k/λ² = (r/2)/(1/4) = 2r

The mean equals the number of degrees of freedom and
the variance equals twice the number of degrees of
freedom.

    M_X(t) = (1 - 2t)^{-r/2},   t < 1/2.
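A crude numerical check that the χ²(r) pdf above yields the stated mean r and variance 2r (r = 6 is an arbitrary test value; the Riemann grid and its step are implementation choices):

```python
import math

def chi2_pdf(x, r):
    """f(x) = x^{r/2-1} e^{-x/2} / (Gamma(r/2) * 2^{r/2}), x >= 0."""
    return x ** (r / 2 - 1) * math.exp(-x / 2) / (math.gamma(r / 2) * 2 ** (r / 2))

r = 6
h = 0.001
xs = [i * h for i in range(1, 120_000)]        # Riemann grid on (0, 120)
mean = sum(x * chi2_pdf(x, r) * h for x in xs)
second = sum(x * x * chi2_pdf(x, r) * h for x in xs)

assert abs(mean - r) < 0.01                    # E(X) = r
assert abs(second - mean ** 2 - 2 * r) < 0.05  # Var(X) = 2r
print(round(mean, 2))  # 6.0
```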
df \p .005 .01 .025 .05 .10 .90 .95 .975 .99 .995
1 .00004 .00016 .00098 .0039 .0158 2.71 3.84 5.02 6.63 7.88
2 .0100 .0201 .0506 .1026 .2107 4.61 5.99 7.38 9.21 10.60
3 .0717 .115 .216 .352 .584 6.25 7.81 9.35 11.34 12.84
4 .207 .297 .484 .711 1.064 7.78 9.49 11.14 13.28 14.86
5 .412 .554 .831 1.15 1.61 9.24 11.07 12.83 15.09 16.75
6 .676 .872 1.24 1.64 2.20 10.64 12.59 14.45 16.81 18.55
7 .989 1.24 1.69 2.17 2.83 12.02 14.07 16.01 18.48 20.28
8 1.34 1.65 2.18 2.73 3.49 13.36 15.51 17.53 20.09 21.96
9 1.73 2.09 2.70 3.33 4.17 14.68 16.92 19.02 21.67 23.59
10 2.16 2.56 3.25 3.94 4.87 15.99 18.31 20.48 23.21 25.19
11 2.60 3.05 3.82 4.57 5.58 17.28 19.68 21.92 24.73 26.76
12 3.07 3.57 4.40 5.23 6.30 18.55 21.03 23.34 26.22 28.30
13 3.57 4.11 5.01 5.89 7.04 19.81 22.36 24.74 27.69 29.82
14 4.07 4.66 5.63 6.57 7.79 21.06 23.68 26.12 29.14 31.32
15 4.6 5.23 6.26 7.26 8.55 22.31 25 27.49 30.58 32.80
16 5.14 5.81 6.91 7.96 9.31 23.54 26.30 28.85 32.00 34.27
18 6.26 7.01 8.23 9.39 10.86 25.99 28.87 31.53 34.81 37.16
20 7.43 8.26 9.59 10.85 12.44 28.41 31.41 34.17 37.57 40.00
24 9.89 10.86 12.40 13.85 15.66 33.20 36.42 39.36 42.98 45.56
30 13.79 14.95 16.79 18.49 20.60 40.26 43.77 46.98 50.89 53.67
40 20.71 22.16 24.43 26.51 29.05 51.81 55.76 59.34 63.69 66.77
60 35.53 37.48 40.48 43.19 46.46 74.40 79.08 83.30 88.38 91.95
120 83.85 86.92 91.58 95.70 100.62 140.23 146.57 152.21 158.95 163.64
df \p .005 .01 .025 .05 .10 .90 .95 .975 .99 .995
Let X be χ²(10). Find P(3.25 ≤ X ≤ 20.5).
Ans. 0.95

If (1 - 2t)^{-6}, t < 1/2, is the m.g.f. of the random
variable X, find P(X < 5.23).
Ans. 0.05

If X is χ²(5), determine the constants c and d so that
P(c < X < d) = 0.95 and P(X < c) = 0.025.
Ans. c = 0.831, d = 12.8
Beta Distribution

The random variable X is said to have a beta distribution
with nonnegative parameters α & β if

    f(x) = (1/B(α, β)) x^{α-1} (1 - x)^{β-1},   0 < x < 1
         = 0,                                   otherwise,

where

    B(α, β) = ∫_0^1 x^{α-1} (1 - x)^{β-1} dx
            = 2 ∫_0^{π/2} sin^{2α-1}θ cos^{2β-1}θ dθ
            = Γ(α) Γ(β) / Γ(α + β).

When α = β = 1, the beta distribution is the uniform
distribution on (0,1).
Depending on the values of α & β, the beta distribution
takes on a variety of shapes; the beta function provides
greater flexibility than the uniform distribution on (0,1).

    µ = α/(α + β),    σ² = αβ / ( (α + β)² (α + β + 1) ).
In a certain country, the proportion of highway requiring
repairs in any given year is a random variable having the
beta distribution with α = 3 and β = 2. Find
(a) on the average, what percentage of the highway
sections require repairs in any given year;
(b) the probability that at most half of the highway
sections will require repairs in any given year.

(a)  µ = 3/(3 + 2) = 0.60, i.e. 60% of the highway
     sections require repairs in any given year.

(b)  P(X ≤ 1/2) = ∫_0^{1/2} 12 x² (1 - x) dx = 5/16.
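Both answers in code, using B(3,2) = Γ(3)Γ(2)/Γ(5) = 1/12:

```python
import math

a, b = 3, 2
B = math.gamma(a) * math.gamma(b) / math.gamma(a + b)  # B(3,2) = 1/12

mean = a / (a + b)                                     # 0.60, i.e. 60%

# P(X <= 1/2) = (1/B) * [x^3/3 - x^4/4] evaluated at 1/2 = 5/16
p_half = (1 / B) * (0.5 ** 3 / 3 - 0.5 ** 4 / 4)
assert math.isclose(p_half, 5 / 16)
print(mean, round(p_half, 4))  # 0.6 0.3125
```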
The lognormal distribution

If X = log T follows a normal distribution N(µ, σ), then T
follows a lognormal distribution, whose pdf is given by

    f(t) = (1/(√(2π) s t)) exp{ -(1/(2s²)) ( log(t/t_m) )² },   t ≥ 0,

where s = σ is a shape parameter and t_m, the median
time to failure, is the location parameter, given by
log t_m = µ.
    MTTF = E(T) = t_m exp(s²/2)

    var(T) = σ_T² = t_m² exp(s²) [ exp(s²) - 1 ]
    F(t) = P(T ≤ t) = P{ log T ≤ log t }
         = P{ (log T - µ)/σ ≤ (log t - µ)/σ }
         = P{ Z ≤ (1/s) log(t/t_m) }.

From F(t) we can compute R(t) and λ(t).
Fatigue wearout of a component has a lognormal
distribution with t_m = 5000 hours and s = 0.20.
(a) Compute the MTTF and SD.
(b) Find the reliability of the component for 3000 hours.
(c) Find the design life of the component for a
    reliability of 0.95.

(a) MTTF = 5101 hours, SD = 1030 hours.

(b) R(3000) = ∫_{3000}^∞ f(t) dt
            = ∫_{(1/0.2) log(3000/5000)}^∞ φ(z) dz
            = ∫_{-2.55}^∞ φ(z) dz = 0.9946.

(c) t = t_m exp(s z) with z = -1.645 gives 3598.2 hours.
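All three parts in code (natural logs throughout; z = -1.645 is the standard-normal point with Φ(z) = 0.05):

```python
import math

tm, s = 5000.0, 0.20  # median life (hours) and shape parameter

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

mttf = tm * math.exp(s ** 2 / 2)                # (a) ~5101 hours
sd = mttf * math.sqrt(math.exp(s ** 2) - 1)     # (a) ~1030 hours
r3000 = 1 - phi((1 / s) * math.log(3000 / tm))  # (b) ~0.995
t_design = tm * math.exp(s * (-1.645))          # (c) ~3598.2 hours

print(round(mttf), round(sd), round(r3000, 4), round(t_design, 1))
```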