
MA8391 - PROBABILITY AND STATISTICS

UNIT-I
PROBABILITY AND RANDOM VARIABLES

2 IFETCE/H&S UNIT-II/T. SOUPRAMANIEN/IIYEAR/IV SEM/MA6451/PRP/UNIT-1/VER 1.1 6-Apr-21


Random variable:
A random variable X can be considered as a function that maps all elements in the sample space S into points on the real line. The notation X(s) = x means that x is the real value associated with the outcome s by the random variable X.

Example:
In the experiment of throwing a coin twice, the sample space is S = {HH, HT, TH, TT}.

Let X be the random variable giving the number of heads in the outcome. Then X takes the values X = {0, 1, 2}.
Probability distribution:
The values assumed by the random variable X, together with their corresponding probabilities, form the probability distribution of X.

Example:
Consider the experiment of tossing a coin twice; the possible outcomes are S = {HH, HT, TH, TT}.

Let the random variable X denote the number of heads; then X = {0, 1, 2} and the probability distribution of X is given by

x      0    1    2
P(x)  1/4  2/4  1/4
Random Variable
  Discrete RV: Probability Mass Function, Cumulative Distribution Function
  Continuous RV: Probability Density Function, Cumulative Distribution Function
Discrete random variable:
A random variable X is said to be discrete if it takes only a finite or countably infinite set of values.

The cumulative distribution function F(x) = P(X ≤ x) is a non-decreasing function of x; for a continuous random variable the density is f(x) = dF(x)/dx.
Problem - 2
Let X be a random variable such that P(X=-2) = P(X=-1) = P(X=1) = P(X=2) and P(X<0) = P(X=0) = P(X>0). Determine the probability mass function of X and the distribution function of X.

Solution: X takes the values -2, -1, 0, 1, 2.

Given that P(X=-2) = P(X=-1) = P(X=1) = P(X=2).
Let P(X=-2) = P(X=-1) = P(X=1) = P(X=2) = k (say).

Given P(X<0) = P(X=0) = P(X>0).
P(X<0) = P(X=-2) + P(X=-1) = k + k = 2k.
Therefore P(X=0) = 2k.

The probability mass function of X is

x     -2   -1    0    1    2
P(x)   k    k   2k    k    k

Since the total probability is 1, k + k + 2k + k + k = 6k = 1, so k = 1/6.

x     -2   -1    0    1    2
P(x)  1/6  1/6  2/6  1/6  1/6

The distribution function of X, F(x) = P(X ≤ x), is

x     F(x)
-2    1/6        F(-2) = P(X=-2) = 1/6
-1    2/6        F(-1) = P(X=-2) + P(X=-1) = 2/6
 0    4/6
 1    5/6
 2    6/6 = 1
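As a quick check of this solution, a short Python sketch (standard library only; the dictionary below simply encodes the PMF table above with k = 1/6):

```python
from fractions import Fraction

# PMF from the table above: k = 1/6 and P(X=0) = 2k
pmf = {-2: Fraction(1, 6), -1: Fraction(1, 6), 0: Fraction(2, 6),
       1: Fraction(1, 6), 2: Fraction(1, 6)}

# the probabilities must add up to 1
assert sum(pmf.values()) == 1

def F(x):
    """Distribution function F(x) = P(X <= x)."""
    return sum(p for v, p in pmf.items() if v <= x)

print(F(-1), F(0), F(2))  # 1/3 2/3 1
```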
PROBLEM - 3
A random variable X has the probability function P(X = x) = (2x + 1)a, x = 0, 1, 2, ..., 8. Find the value of a and P(X < 3).

Solution: Since the total probability is 1,

a + 3a + 5a + 7a + 9a + 11a + 13a + 15a + 17a = 1
⇒ 81a = 1
⇒ a = 1/81

P(X < 3) = P(0) + P(1) + P(2) = a + 3a + 5a = 9a = 9 × 1/81 = 1/9
PROBLEM - 4
A fair die is thrown; let X denote the number obtained. Find E(X), E(X²) and Var(X).

Solution:
x     1    2    3    4    5    6
P(x) 1/6  1/6  1/6  1/6  1/6  1/6

E(X) = Σ x p(x) = (1 × 1/6) + (2 × 1/6) + (3 × 1/6) + (4 × 1/6) + (5 × 1/6) + (6 × 1/6)
     = (1/6)(1 + 2 + 3 + 4 + 5 + 6) = 7/2

E(X²) = Σ x² p(x) = (1² × 1/6) + (2² × 1/6) + (3² × 1/6) + (4² × 1/6) + (5² × 1/6) + (6² × 1/6)
      = (1/6)(1² + 2² + 3² + 4² + 5² + 6²) = (1/6) × 6(6+1)(2×6+1)/6 = 91/6

Var(X) = E(X²) − [E(X)]² = 91/6 − (7/2)² = 35/12
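The same computation can be checked in a few lines of Python with exact arithmetic:

```python
from fractions import Fraction

p = Fraction(1, 6)                         # fair die: P(X = x) = 1/6
ex  = sum(x * p for x in range(1, 7))      # E(X)
ex2 = sum(x * x * p for x in range(1, 7))  # E(X^2)
var = ex2 - ex ** 2                        # Var(X) = E(X^2) - [E(X)]^2
print(ex, ex2, var)  # 7/2 91/6 35/12
```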
The first moment about the origin, namely the mean, is given by

μ₁′ = M_X′(0) = [d/dt M_X(t)]_(t=0) = E(X)

The second moment about the origin is given by

μ₂′ = M_X″(0) = [d²/dt² M_X(t)]_(t=0)

In general, we get

μᵣ′ = [dʳ/dtʳ M_X(t)]_(t=0)

Note:
σ² = Var(X) = μ₂ = μ₂′ − (μ₁′)²
BERNOULLI RANDOM VARIABLES
A Bernoulli random variable X takes only the values 0 and 1, with

P[X = x] = pˣ(1 − p)^(1−x),  x = 0, 1

where p is the probability of success.
Moment Generating function of binomial distribution:
We know that

M_X(t) = Σ_(x=0)^n e^(tx) p(x) = Σ_(x=0)^n e^(tx) nCx pˣ q^(n−x) = Σ_(x=0)^n nCx (pe^t)ˣ q^(n−x)

Using the binomial theorem (a + b)ⁿ = nC₀aⁿ + nC₁a^(n−1)b + nC₂a^(n−2)b² + … + nCₙbⁿ,

= nC₀ qⁿ + nC₁ (pe^t) q^(n−1) + nC₂ (pe^t)² q^(n−2) + …… + (pe^t)ⁿ
= (q + pe^t)ⁿ

∴ M_X(t) = (q + pe^t)ⁿ
Mean and Variance of Binomial distribution using M.G.F.

M_X(t) = (pe^t + q)ⁿ

Mean = E(X) = μ₁′ = [d/dt M_X(t)]_(t=0)
     = [n(pe^t + q)^(n−1) pe^t]_(t=0)
     = np(p + q)^(n−1)
     = np                         [∵ p + q = 1]

E(X²) = μ₂′ = [d²/dt² M_X(t)]_(t=0)
      = [d/dt {n(pe^t + q)^(n−1) pe^t}]_(t=0) = np [d/dt {(pe^t + q)^(n−1) e^t}]_(t=0)
      = np[e^t (n−1)(pe^t + q)^(n−2) pe^t + (pe^t + q)^(n−1) e^t]_(t=0)
      = np[(n−1)(p + q)^(n−2) p + (p + q)^(n−1)] = np[(n−1)p + 1]
      = np(np − p + 1)
      = np(np + q)
      = n²p² + npq

We know that

Var(X) = E(X²) − [E(X)]² = n²p² + npq − (np)² = npq
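A small sketch confirming np and npq directly from the binomial PMF (exact arithmetic with fractions; the n and p below are arbitrary test values):

```python
from math import comb
from fractions import Fraction

def binomial_mean_var(n, p):
    """Mean and variance computed directly from the PMF nCx p^x q^(n-x)."""
    p = Fraction(p)
    q = 1 - p
    pmf = [comb(n, x) * p**x * q**(n - x) for x in range(n + 1)]
    mean = sum(x * px for x, px in enumerate(pmf))
    ex2 = sum(x * x * px for x, px in enumerate(pmf))
    return mean, ex2 - mean**2

n, p = 10, Fraction(3, 10)
mean, var = binomial_mean_var(n, p)
assert mean == n * p            # np
assert var == n * p * (1 - p)   # npq
```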
Problem - 1

P[X > 3] =

P[X > 2] =

⇒ 2p − 1 ≥ 0
⇒ p ≥ 1/2
Problem - 2

With n = 6 trials and success probability p = 1/3, the probability of at least three successes is

P(X ≥ 3) = P(3) + P(4) + P(5) + P(6)
= 6C3 (1/3)³(2/3)³ + 6C4 (1/3)⁴(2/3)² + 6C5 (1/3)⁵(2/3)¹ + 6C6 (1/3)⁶(2/3)⁰
= (160 + 60 + 12 + 1)/3⁶ = 233/3⁶ = 233/729
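The value 233/729 can be verified mechanically (standard library; n = 6 and p = 1/3 as in the problem):

```python
from math import comb
from fractions import Fraction

n, p = 6, Fraction(1, 3)
q = 1 - p
# P(X >= 3) = P(3) + P(4) + P(5) + P(6)
tail = sum(comb(n, x) * p**x * q**(n - x) for x in range(3, n + 1))
print(tail)  # 233/729
```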
POISSON DISTRIBUTION
Moment generating function of the Poisson distribution
We know that the m.g.f. of a random variable X is given by

M_X(t) = Σ_x e^(tx) p(x) = Σ_(x=0)^∞ e^(tx) (e^(−λ) λˣ / x!)

= Σ_(x=0)^∞ e^(−λ) (λe^t)ˣ / x! = e^(−λ) Σ_(x=0)^∞ (λe^t)ˣ / x!

= e^(−λ) {1 + λe^t + (λe^t)²/2! + …}

= e^(−λ) e^(λe^t)

= e^(λ(e^t − 1))

∴ M_X(t) = e^(λ(e^t − 1))
Mean and Variance of Poisson distribution using M.G.F.

M_X(t) = e^(λ(e^t − 1))

Mean = E(X) = μ₁′ = [d/dt M_X(t)]_(t=0)
= [e^(λ(e^t − 1)) λe^t]_(t=0)
= e^(λ(e⁰ − 1)) λe⁰ = e^(λ(1−1)) λ = λ

E(X²) = μ₂′ = [d²/dt² M_X(t)]_(t=0)
= λ [d/dt (e^(λ(e^t − 1)) e^t)]_(t=0)
= λ [e^(λ(e^t − 1)) λe^t · e^t + e^(λ(e^t − 1)) e^t]_(t=0)
= λ[λ + 1] = λ² + λ

Var(X) = μ₂ = μ₂′ − (μ₁′)² = λ² + λ − λ² = λ

Hence in the Poisson distribution, Mean = Variance = λ.
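A numeric sanity check that the Poisson mean and variance coincide (the rate 2.5 is an arbitrary choice; truncating the series at 100 terms leaves a negligible tail):

```python
from math import exp, factorial

lam = 2.5
pmf = [exp(-lam) * lam**x / factorial(x) for x in range(100)]
mean = sum(x * px for x, px in enumerate(pmf))
var = sum(x * x * px for x, px in enumerate(pmf)) - mean**2
# both should equal lam up to floating-point error
print(round(mean, 6), round(var, 6))  # 2.5 2.5
```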


Given P(X=2) = 9P(X=4) + 90P(X=6), find λ.

e^(−λ)λ²/2! = 9 e^(−λ)λ⁴/4! + 90 e^(−λ)λ⁶/6! = e^(−λ)λ² (9λ²/4! + 90λ⁴/6!)

Cancelling e^(−λ)λ² on both sides,

1/2 = 9λ²/4! + 90λ⁴/6!
1/2 = 3λ²/8 + λ⁴/8
⇒ λ⁴ + 3λ² − 4 = 0

λ² = (−3 ± √(9 + 16))/2 = (−3 ± 5)/2 = 1 or −4

Since λ² cannot be negative, λ² = 1 and, as λ > 0, λ = 1.
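Substituting λ = 1 back into the original condition confirms the root (a quick Python check):

```python
from math import exp, factorial

def poisson(lam, x):
    """Poisson PMF: e^(-lam) lam^x / x!"""
    return exp(-lam) * lam**x / factorial(x)

lam = 1.0  # the admissible root found above
lhs = poisson(lam, 2)
rhs = 9 * poisson(lam, 4) + 90 * poisson(lam, 6)
print(abs(lhs - rhs) < 1e-12)  # True
```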
GEOMETRIC DISTRIBUTION
The geometric random variable describes the number of Bernoulli trials until the first success occurs.

Let X be a random variable that denotes the number of Bernoulli trials up to and including the trial that results in the first success. If the first success occurs on the x-th trial, then the first x − 1 trials resulted in failures.

Since the trials are independent, the PMF of a geometric random variable X is given by

P[X = x] = (1 − p)^(x−1) p,  x = 1, 2, 3, …

Equivalently,

P[X = x] = q^(x−1) p,  x = 1, 2, 3, …

where q = 1 − p.
Moment generating function of Geometric distribution

M_X(t) = pe^t / (1 − qe^t),  for qe^t < 1

Mean and variance of a geometric distribution:

Mean = 1/p        Var(X) = q/p²
Problem:
A die is cast until 6 appears. What is the probability that it must be cast more than five times?

Solution:
Here p = 1/6, q = 1 − p = 5/6.

We know that P(X = x) = q^(x−1) p, so P(X = x) = (5/6)^(x−1) (1/6).

P(X > 5) = 1 − P(X ≤ 5) = 1 − {P(1) + P(2) + P(3) + P(4) + P(5)}
= 1 − {(1/6) + (5/6)(1/6) + (5/6)²(1/6) + (5/6)³(1/6) + (5/6)⁴(1/6)}
= 1 − (1/6){1 + 5/6 + (5/6)² + (5/6)³ + (5/6)⁴}
= (5/6)⁵ ≈ 0.4019
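Both routes to the answer agree, as a short sketch shows; P(X > 5) is simply the probability that the first five casts all fail:

```python
p, q = 1/6, 5/6
direct = q**5                                              # first five trials all fail
complement = 1 - sum(q**(x - 1) * p for x in range(1, 6))  # 1 - P(X <= 5)
print(round(direct, 4), abs(direct - complement) < 1e-12)  # 0.4019 True
```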
UNIFORM DISTRIBUTION
A continuous random variable X is said to have a uniform distribution if its probability density function is given by

f(x) = k,  a < x < b
     = 0,  otherwise

where a and b are the two parameters of the uniform distribution and k is a constant.

The p.d.f of a uniform variate X in (a, b) is given by

f(x) = 1/(b − a),  a < x < b
     = 0,          otherwise
Moment generating function of uniform distribution

M_X(t) = (e^(bt) − e^(at)) / ((b − a)t),  t ≠ 0

Mean and variance of uniform distribution

Mean = (a + b)/2        Var(X) = (b − a)²/12
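These closed forms can be checked by numeric integration of the density (midpoint rule; the interval (2, 10) is an arbitrary test case):

```python
a, b = 2.0, 10.0
n = 100_000
dx = (b - a) / n
f = 1 / (b - a)  # uniform density on (a, b)
mean = sum((a + (i + 0.5) * dx) * f * dx for i in range(n))
ex2 = sum((a + (i + 0.5) * dx) ** 2 * f * dx for i in range(n))
var = ex2 - mean**2
# compare with (a + b)/2 = 6 and (b - a)^2 / 12 = 16/3
print(round(mean, 6), round(var, 6))
```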
Problem: 1
A bus arrives every 20 minutes, at a specified stop, beginning at 6.40 a.m. and continuing until 8.40 a.m. A passenger arrives randomly between 7.00 a.m. and 7.30 a.m. What is the probability that the passenger has to wait for more than 5 minutes for a bus?

Solution:
Let X be the number of minutes past 7.00 a.m. when the passenger arrives. The bus timings between 7.00 and 7.30 a.m. are 7.00 a.m. and 7.20 a.m.

If the passenger has to wait for more than five minutes, he must arrive between 7.00 a.m. and 7.15 a.m. or between 7.20 a.m. and 7.30 a.m.

Also X is a uniformly distributed random variable between 7.00 a.m. and 7.30 a.m., so

f(x) = 1/30,  0 < X < 30
     = 0,     otherwise
The required probability is P[0 < X < 15] + P[20 < X < 30]

= ∫₀¹⁵ f(x) dx + ∫₂₀³⁰ f(x) dx
= (1/30)[15 − 0] + (1/30)[30 − 20]
= 15/30 + 10/30 = 25/30 = 5/6

∴ The probability that the passenger has to wait for more than 5 minutes for a bus = 5/6 ≈ 0.833.
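The answer can be reproduced exactly with fractions:

```python
from fractions import Fraction

# X ~ U(0, 30): minutes past 7.00 a.m. at which the passenger arrives.
# The wait exceeds 5 minutes iff X falls in (0, 15) or (20, 30).
prob = Fraction(15 - 0, 30) + Fraction(30 - 20, 30)
print(prob)  # 5/6
```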
Exponential Distribution

Moment generating function of Exponential distribution

M_X(t) = E[e^(tX)] = ∫₀^∞ e^(tx) f(x) dx

= ∫₀^∞ e^(tx) λe^(−λx) dx = λ ∫₀^∞ e^(−(λ−t)x) dx

= λ [e^(−(λ−t)x) / (−(λ − t))]₀^∞ = λ/(λ − t),  for t < λ        ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ ⋅ (1)

Moments about the origin:

E(Xʳ) = ∫₀^∞ xʳ λe^(−λx) dx

Put y = λx ⇒ dy = λ dx ⇒ dx = dy/λ:

= λ ∫₀^∞ (y/λ)ʳ e^(−y) dy/λ = (1/λʳ) ∫₀^∞ yʳ e^(−y) dy = Γ(r + 1)/λʳ = r!/λʳ

[∵ Γ(r) = ∫₀^∞ x^(r−1) e^(−x) dx and Γ(r + 1) = r!]
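A numeric check of M_X(t) = λ/(λ − t) and E(Xʳ) = r!/λʳ (midpoint rule on (0, 30), where the tail of e^(−λx) is negligible; λ = 2 and t = 0.5 are arbitrary test values):

```python
from math import exp, factorial

lam, t = 2.0, 0.5
dx = 1e-3
xs = [(i + 0.5) * dx for i in range(int(30 / dx))]

# M_X(t) = integral of e^(tx) * lam * e^(-lam x) dx over (0, inf)
mgf = sum(exp(t * x) * lam * exp(-lam * x) * dx for x in xs)
assert abs(mgf - lam / (lam - t)) < 1e-4          # lambda / (lambda - t)

# E(X^r) = integral of x^r * lam * e^(-lam x) dx = r! / lam^r
for r in (1, 2, 3):
    moment = sum(x**r * lam * exp(-lam * x) * dx for x in xs)
    assert abs(moment - factorial(r) / lam**r) < 1e-4
```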