By
Mrs. V.Sumathi
ASSISTANT PROFESSOR
This is to certify that the course material is being prepared by me and meets the knowledge requirement of the university curriculum.
Name: V. Sumathi
Designation: AP
This is to certify that the course material prepared by Mrs. V. Sumathi is of adequate
quality. She has referred to more than five books, among them at least one by a foreign author.
Signature of HOD
Name: Mr.N.Ramkumar
SEAL
S.NO CONTENTS Page.NO
UNIT I
RANDOM VARIABLES
1. Introduction 1
2. Discrete Random Variables 1
3 Continuous Random Variables 5
4 Moments 14
5 Moment generating functions 14
6 Binomial distribution 18
7 Poisson distribution 21
8 Geometric distribution 25
9 Uniform distribution 27
10 Exponential distribution 29
11 Gamma distribution 31
UNIT II
TWO-DIMENSIONAL RANDOM VARIABLES
11 Introduction 37
12 Joint distribution 37
13 Marginal and Conditional Distribution 38
14 Covariance 43
15 Correlation Coefficient 44
16 Problems 41
17 Linear Regression 45
18 Transformation of random variables 46
19 Problems 47
UNIT III
RANDOM PROCESSES
20 Introduction 49
21 Classification 50
22 Stationary processes 51
23 Markov processes 55
24 Poisson processes 56
25 Random Telegraph processes 57
UNIT IV
CORRELATION AND SPECTRAL DENSITIES
26 Introduction 60
27 Auto Correlation functions 60
28 Properties 61
29 Cross Correlation functions 63
30 Properties 64
31 Power spectral density 65
32 Properties 66
33 Cross spectral density 66
34 Properties 67
UNIT V
LINEAR SYSTEMS WITH RANDOM INPUTS
35 Introduction 71
36 Linear time invariant systems 72
37 Problems 72
38 Linear systems with random inputs 73
39 Auto Correlation and Cross Correlation functions of inputs and outputs 74
40 System transfer function 75
41 Problems 76
MA6451 PROBABILITY AND RANDOM PROCESSES    L T P C 3 1 0 4
OBJECTIVES: To provide necessary basic concepts in probability and random processes for
applications such as random signals, linear systems etc in communication engineering.
UNIT I RANDOM VARIABLES 9+3 Discrete and continuous random variables – Moments –
Moment generating functions – Binomial, Poisson, Geometric, Uniform, Exponential, Gamma
and Normal distributions.
UNIT II TWO - DIMENSIONAL RANDOM VARIABLES 9+3 Joint distributions – Marginal and
conditional distributions – Covariance – Correlation and Linear regression – Transformation of
random variables.
UNIT III RANDOM PROCESSES 9+3 Classification – Stationary process – Markov process -
Poisson process – Random telegraph process.
UNIT IV CORRELATION AND SPECTRAL DENSITIES 9+3 Auto correlation functions – Cross
correlation functions – Properties – Power spectral density – Cross spectral density –
Properties.
UNIT V LINEAR SYSTEMS WITH RANDOM INPUTS 9+3
Linear time invariant system – System transfer function – Linear systems with random inputs –
Auto correlation and Cross correlation functions of input and output.
TOTAL (L:45+T:15): 60 PERIODS
OUTCOMES:
The students will have exposure to various distribution functions, will acquire skills in
handling situations involving more than one random variable, and will be able to analyze the
response of linear time invariant systems to random inputs.
TEXT BOOKS:
1. Ibe.O.C., “Fundamentals of Applied Probability and Random Processes", Elsevier, 1st Indian
Reprint, 2007.
2. Peebles. P.Z., "Probability, Random Variables and Random Signal Principles", Tata McGraw
Hill, 4th Edition, New Delhi, 2002.
REFERENCES:
1. Yates. R.D. and Goodman.D.J., "Probability and Stochastic Processes", 2nd Edition, Wiley
India Pvt. Ltd., Bangalore, 2012.
2. Stark. H., and Woods. J.W., "Probability and Random Processes with Applications to Signal
Processing", 3rd Edition,Pearson Education, Asia, 2002.
3. Miller. S.L. and Childers.D.G., "Probability and Random Processes with Applications to Signal
Processing and Communications", Academic Press, 2004.
4. Hwei Hsu, "Schaum's Outline of Theory and Problems of Probability, Random Variables and
Random Processes", Tata McGraw Hill Edition, New Delhi, 2004.
5. Cooper. G.R., McGillem. C.D., "Probabilistic Methods of Signal and System Analysis", 3rd
Indian Edition, Oxford University Press, New Delhi, 2012.
MA6451 PROBABILITY AND RANDOM PROCESSES
UNIT - I
RANDOM VARIABLES
Introduction
A random variable X can be considered as a function that maps all elements of the sample
space S into points on the real line. The notation X(s) = x means that x is the value
associated with the outcome s by the random variable X.
The function p(x) satisfying the above two conditions is called the probability mass
function (or) probability distribution of the R.V.X. The probability distribution {x i , p i } can be
displayed in the form of table as shown below.
X = xi x1 x2 ……. xi
P(X = xi ) = p i p1 p2 ……. pi
Notation
Let ‘S’ be a sample space. The set of all outcomes ‘S’ in S such that
X(S) = x is denoted by writing X = x.
P(X = x) = P{S : X(s) = x}
Similarly, P(X ≤ a) = P{s : X(s) ∈ (−∞, a]}
and P(a < X ≤ b) = P{s : X(s) ∈ (a, b]}
P(X = a or X = b) = P{(X = a) ∪ (X = b)}
P(X = a and X = b) = P{(X = a) ∩ (X = b)} and so on.
Theorem 2
If ‘X’ is a random variable and f(•) is a continuous function, then f(X) is a random
variable.
Note
If F(x) is the distribution function of one dimensional random variable then
I. 0 ≤ F(x) ≤ 1
II. If x < y, then F(x) ≤ F(y)
III. F(−∞) = lim (x → −∞) F(x) = 0
IV. F(∞) = lim (x → ∞) F(x) = 1
V. If X is a discrete R.V. taking values x1, x2, x3, … where x1 < x2 < … < x(i−1) < x(i) < …, then
P(X = x(i)) = F(x(i)) − F(x(i−1))
Example:1.2.1
A random variable X has the following probability function
Values of X 0 1 2 3 4 5 6 7 8
Probability P(X) a 3a 5a 7a 9a 11a 13a 15a 17a
Since total probability is 1, ∑ p(x(i)) = a + 3a + 5a + … + 17a = 81a = 1, so a = 1/81.
Remark
1. In the case of a discrete R.V., the probability at a point, say x = c, need not be zero. But in the case
of a continuous R.V. X the probability at a point is always zero:
P(X = c) = ∫_c^c f(x) dx = 0
The r-th raw moment (about the origin) is denoted by
µ′r = ∫_−∞^∞ x^r f(x) dx
Thus
µ′1 = E(X)   (first moment about the origin)
µ′2 = E(X²)   (second moment about the origin)
∴ Mean = X̄ = µ′1 = E(X)
and Variance = µ′2 − µ′1² = E(X²) − [E(X)]²   (a)
* r-th moment (about the mean)
Now
E[{X − E(X)}^r] = ∫_−∞^∞ {x − E(X)}^r f(x) dx = ∫_−∞^∞ (x − X̄)^r f(x) dx
Thus
µr = ∫_−∞^∞ (x − X̄)^r f(x) dx   (B)
where µr = E[(X − E(X))^r].
Put r = 1 in (B):
µ1 = ∫_−∞^∞ (x − X̄) f(x) dx = ∫_−∞^∞ x f(x) dx − X̄ ∫_−∞^∞ f(x) dx
   = X̄ − X̄ = 0   [since ∫_−∞^∞ f(x) dx = 1]
∴ µ1 = 0
Put r = 2 in (B):
µ2 = ∫_−∞^∞ (x − X̄)² f(x) dx
Variance = µ2 = E[(X − E(X))²]
which gives the variance in terms of expectations.
Note
Let g(x) = K (Constant), then
E[g(X)] = E(K) = ∫_−∞^∞ K f(x) dx = K ∫_−∞^∞ f(x) dx = K · 1 = K   [since ∫_−∞^∞ f(x) dx = 1]
Thus E(K) = K, i.e. E[a constant] = constant.
1.3.4 EXPECTATIONS (Discrete R.V.’s)
Let ‘X’ be a discrete random variable with P.M.F p(x)
Then
E(X) = ∑_x x p(x)
For a discrete random variable X,
E(X^r) = ∑_x x^r p(x)   (by definition)
If we denote E(X^r) = µ′r, then
µ′r = E[X^r] = ∑_x x^r p(x)
Put r = 1: Mean = µ′1 = ∑_x x p(x)
Put r = 2: µ′2 = E[X²] = ∑_x x² p(x)
∴ µ2 = µ′2 − µ′1² = E(X²) − {E(X)}²
Theorem 3
If X is a random variable with p.d.f. f(x) and 'a' is a constant, then
(i) E[a G(X)] = a E[G(X)]
(ii) E[G(X) + a] = E[G(X)] + a
where G(X) is a function of X which is also a random variable.
Theorem 4
If ‘X’ is a random variable with p.d.f. f(x) and ‘a’ and ‘b’ are constants, then
E[ax + b] = a E(X) + b
Cor 1:
If we take a = 1 and b = –E(X) = – X , then we get
E(X- X ) = E(X) – E(X) = 0
Note
E(1/X) ≠ 1/E(X)
E[log(X)] ≠ log E(X)
E(X²) ≠ [E(X)]²
1.3.7 EXPECTATION OF A LINEAR COMBINATION OF RANDOM VARIABLES
Let X 1 , X 2 , ……, X n be any ‘n’ random variable and if a 1 , a 2 , ……, a n are constants, then
E[a 1 X 1 + a 2 X 2 + ……+ a nX n ] = a 1 E(X 1 ) + a 2 E(X 2 )+ ……+ a nE(X n )
Result
If X is a random variable, then
Var (aX + b) = a2Var(X) ‘a’ and ‘b’ are constants.
Covariance :
If X and Y are random variables, then covariance between them is defined as
Cov(X, Y) = E{[X - E(X)] [Y - E(Y)]}
= E{XY - XE(Y) – E(X)Y + E(X)E(Y)}
Cov(X, Y) = E(XY) − E(X) · E(Y)   (A)
If X and Y are independent, then
E(XY) = E(X) E(Y)   (B)
Sub (B) in (A), we get Cov(X, Y) = 0.
∴ If X and Y are independent, then Cov(X, Y) = 0.
Note
(i) Cov(aX, bY) = ab Cov(X, Y)
(ii) Cov(X+a, Y+b) = Cov(X, Y)
(iii) Cov(aX+b, cY+d) = ac Cov(X, Y)
(iv) Var (X 1 + X 2 ) = Var(X 1 ) + Var(X 2 ) + 2 Cov(X 1 , X 2 )
If X 1 , X 2 are independent
Var (X 1 + X 2 ) = Var(X 1 ) + Var(X 2 )
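These covariance and variance identities can be spot-checked numerically; a minimal sketch in Python (the sample data and the constants a, b, c, d are illustrative):

```python
import random

random.seed(0)
n = 100_000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [x + random.gauss(0, 1) for x in xs]  # Y correlated with X

def mean(v):
    return sum(v) / len(v)

def cov(u, v):
    # sample version of Cov(U, V) = E(UV) - E(U)E(V)
    return mean([a * b for a, b in zip(u, v)]) - mean(u) * mean(v)

a, b, c, d = 2.0, 3.0, -1.0, 4.0
lhs = cov([a * x + b for x in xs], [c * y + d for y in ys])
rhs = a * c * cov(xs, ys)
# Cov(aX+b, cY+d) = ac Cov(X, Y) holds exactly for sample covariances
print(abs(lhs - rhs) < 1e-6)  # True
```

The identity is exact for sample covariances (up to floating point), not just in the limit, because shifting by b and d cancels in the centered products.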
EXPECTATION TABLE

Discrete R.V.'s:
1. E(X) = ∑ x p(x)
2. E(X^r) = µ′r = ∑ x^r p(x)
3. Mean = µ′1 = ∑ x p(x)
4. µ′2 = ∑ x² p(x)
5. Variance = µ′2 − µ′1² = E(X²) − {E(X)}²

Continuous R.V.'s:
1. E(X) = ∫_−∞^∞ x f(x) dx
2. E(X^r) = µ′r = ∫_−∞^∞ x^r f(x) dx
3. Mean = µ′1 = ∫_−∞^∞ x f(x) dx
4. µ′2 = ∫_−∞^∞ x² f(x) dx
5. Variance = µ′2 − µ′1² = E(X²) − {E(X)}²
For a fair die, X takes the values 1, …, 6, each with probability 1/6. Now
E(X) = ∑_{i=1}^6 x(i) p(x(i))
= x1 p(x1) + x2 p(x2) + x3 p(x3) + x4 p(x4) + x5 p(x5) + x6 p(x6)
= 1(1/6) + 2(1/6) + 3(1/6) + 4(1/6) + 5(1/6) + 6(1/6)
= 21/6 = 7/2   (1)
E(X²) = ∑_{i=1}^6 x(i)² p(x(i))
= 1(1/6) + 4(1/6) + 9(1/6) + 16(1/6) + 25(1/6) + 36(1/6)
= (1 + 4 + 9 + 16 + 25 + 36)/6 = 91/6   (2)
Variance(X) = Var(X) = E(X²) − [E(X)]²
= 91/6 − (7/2)² = 91/6 − 49/4 = 35/12
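The die computation above can be reproduced by direct enumeration with exact fractions; a minimal sketch (assuming a fair six-sided die):

```python
from fractions import Fraction

# Fair die: x = 1..6, each with probability 1/6
p = Fraction(1, 6)
mean = sum(x * p for x in range(1, 7))
second = sum(x * x * p for x in range(1, 7))
var = second - mean ** 2
print(mean, second, var)  # 7/2 91/6 35/12
```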
Example 2
Find (i) the value of C and (ii) the mean of the distribution given by
f(x) = C(x − x²), 0 < x < 1
     = 0, otherwise

Solution
Given f(x) = C(x − x²), 0 < x < 1   (1)
Since ∫_−∞^∞ f(x) dx = 1,
∫_0^1 C(x − x²) dx = 1   [using (1), since 0 < x < 1]
C[x²/2 − x³/3]_0^1 = 1
C(1/2 − 1/3) = 1
C(1/6) = 1 ⇒ C = 6   (2)
Sub (2) in (1): f(x) = 6(x − x²), 0 < x < 1   (3)
Mean = E(X) = ∫_−∞^∞ x f(x) dx = ∫_0^1 x · 6(x − x²) dx   [from (3)]
= ∫_0^1 (6x² − 6x³) dx = [6x³/3 − 6x⁴/4]_0^1 = 2 − 3/2 = 1/2
∴ Mean = 1/2, C = 6
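The normalisation constant and the mean can be checked with exact arithmetic; a short sketch (the moment integrals ∫x dx, ∫x² dx, ∫x³ dx on (0, 1) are hard-coded from the power rule):

```python
from fractions import Fraction

# On (0, 1): ∫ x dx = 1/2, ∫ x^2 dx = 1/3, ∫ x^3 dx = 1/4
I1, I2, I3 = Fraction(1, 2), Fraction(1, 3), Fraction(1, 4)

# Normalisation: C * (I1 - I2) = 1  ->  C = 6
C = 1 / (I1 - I2)
print(C)  # 6

# Mean = ∫ x * C(x - x^2) dx = C * (I2 - I3)
mean = C * (I2 - I3)
print(mean)  # 1/2
```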
For the density f(x) = 6x(1 − x), 0 < x < 1, the c.d.f. F(x) = ∫_−∞^x f(x) dx is obtained case by case.
(ii) When 0 < x < 1:
F(x) = ∫_−∞^0 f(x) dx + ∫_0^x f(x) dx
= 0 + ∫_0^x 6x(1 − x) dx = 6[x²/2 − x³/3]_0^x = 3x² − 2x³
(iii) When x > 1:
F(x) = ∫_−∞^0 0 dx + ∫_0^1 6x(1 − x) dx + ∫_1^x 0 dx
= 6 ∫_0^1 (x − x²) dx = 1
Using (1), (2) & (3) we get
F(x) = 0, x < 0
     = 3x² − 2x³, 0 ≤ x ≤ 1
     = 1, x > 1
Example 1.4.2
(i) Is the function defined as follows a density function?
f(x) = e^(−x), x ≥ 0
     = 0, x < 0
(ii) If so, determine the probability that a variate having this density falls in the interval (1, 2).

Solution
Given f(x) = e^(−x), x ≥ 0; 0, x < 0.
(a) In (0, ∞), e^(−x) is positive, so f(x) ≥ 0.
(b) ∫_−∞^∞ f(x) dx = ∫_−∞^0 f(x) dx + ∫_0^∞ f(x) dx
= ∫_−∞^0 0 dx + ∫_0^∞ e^(−x) dx
= [−e^(−x)]_0^∞ = −(0 − 1) = 1
Hence f(x) is a p.d.f.
(ii) We know that
P(a ≤ X ≤ b) = ∫_a^b f(x) dx
P(1 ≤ X ≤ 2) = ∫_1^2 e^(−x) dx = [−e^(−x)]_1^2
= −e^(−2) + e^(−1) = −0.135 + 0.368 = 0.233
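Both checks in this example can be verified numerically; a brief sketch (the upper limit 50 stands in for ∞, since e^(−50) is negligible):

```python
import math

# f(x) = e^{-x} on [0, ∞): total mass and P(1 <= X <= 2)
total = 1 - math.exp(-50)           # ∫_0^M e^{-x} dx = 1 - e^{-M} -> 1
p_12 = math.exp(-1) - math.exp(-2)  # [-e^{-x}] evaluated from 1 to 2
print(total, round(p_12, 3))  # 1.0 0.233
```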
Example 1.4.3
A probability curve y = f(x) has range 0 to ∞. If f(x) = e^(−x), find the mean, the variance,
and the third moment about the mean.

Solution
Mean = ∫_0^∞ x f(x) dx = ∫_0^∞ x e^(−x) dx = [x(−e^(−x)) − e^(−x)]_0^∞ = 1
Mean = 1
Variance µ2 = ∫_0^∞ (x − Mean)² f(x) dx = ∫_0^∞ (x − 1)² e^(−x) dx = 1
µ2 = 1
Third moment about the mean:
µ3 = ∫_0^∞ (x − 1)³ e^(−x) dx
= [(x − 1)³(−e^(−x)) − 3(x − 1)²(e^(−x)) + 6(x − 1)(−e^(−x)) − 6(e^(−x))]_0^∞
= −1 + 3 − 6 + 6 = 2
µ3 = 2
1.5 MOMENT GENERATING FUNCTION
Def: The moment generating function (MGF) of a random variable X (about the origin) with
probability function f(x) is given by
M_X(t) = E[e^(tX)]
M_X(t) = µ′0 + µ′1 t/1! + µ′2 t²/2! + … + µ′r t^r/r! + …   (A)
Differentiating (A) w.r.t. t, we get
M′_X(t) = µ′1 + µ′2 t + µ′3 t²/2! + …   (B)
Put t = 0 in (B): M′_X(0) = µ′1 = Mean
∴ Mean = M′_X(0), i.e. Mean = [d/dt M_X(t)] evaluated at t = 0.
Similarly, µ′2 = M″_X(0), and in general µ′r = [d^r/dt^r M_X(t)] evaluated at t = 0.
Result:
M_CX(t) = E[e^(tCX)]   (1)
M_X(Ct) = E[e^((Ct)X)]   (2)
From (1) & (2) we get
M_CX(t) = M_X(Ct)
Example 1.5.4
If X1, X2, …, Xn are independent random variables, prove that
M_{X1+X2+…+Xn}(t) = M_{X1}(t) M_{X2}(t) … M_{Xn}(t).

Solution
M_{X1+X2+…+Xn}(t) = E[e^(t(X1+X2+…+Xn))]
= E[e^(tX1) · e^(tX2) ⋯ e^(tXn)]
= E(e^(tX1)) · E(e^(tX2)) ⋯ E(e^(tXn))   [since X1, X2, …, Xn are independent]
= M_{X1}(t) M_{X2}(t) … M_{Xn}(t)
Example 1.5.5
Prove that if U = (X − a)/h, then M_U(t) = e^(−at/h) · M_X(t/h), where a, h are constants.

Proof
By definition, M_U(t) = E[e^(tU)] and M_X(t) = E[e^(tX)].
M_U(t) = E[e^(t(X−a)/h)]
= E[e^(tX/h) · e^(−ta/h)]
= e^(−ta/h) E[e^((t/h)X)]
= e^(−ta/h) · M_X(t/h)   [by definition]
∴ M_U(t) = e^(−at/h) · M_X(t/h), where U = (X − a)/h and M_X(t) is the MGF about the origin.
Example 1.5.6
Find the MGF for the distribution where
f(x) = 2/3 at x = 1
     = 1/3 at x = 2
     = 0 otherwise

Solution
Given f(1) = 2/3, f(2) = 1/3, f(3) = f(4) = … = 0
The MGF of a R.V. X is given by
M_X(t) = E[e^(tx)] = ∑_x e^(tx) f(x)
= e^t f(1) + e^(2t) f(2)
= (2/3)e^t + (1/3)e^(2t)
∴ MGF is M_X(t) = (e^t/3)(2 + e^t)
1.6 Discrete Distributions
The important discrete distributions of a random variable X are:
1. Binomial Distribution
2. Poisson Distribution
3. Geometric Distribution
1.6.1 BINOMIAL DISTRIBUTION
Def: A random variable X is said to follow the binomial distribution if its probability law is given
by
P(x) = P(X = x successes) = nCx p^x q^(n−x)
where x = 0, 1, 2, …, n and p + q = 1.
Note
Assumptions of the Binomial distribution:
i) There are only two possible outcomes for each trial (success or failure).
ii) The probability of success is the same for each trial.
iii) There are 'n' trials, where 'n' is a constant.
iv) The 'n' trials are independent.
Example 1.6.1
Find the Moment Generating Function (MGF) of the binomial distribution about the origin.

Solution
Let X be a random variable following the binomial distribution. Then the MGF about the origin is
M_X(t) = E[e^(tX)] = ∑_{x=0}^n e^(tx) p(x)
= ∑_{x=0}^n e^(tx) nCx p^x q^(n−x)   [since p(x) = nCx p^x q^(n−x)]
= ∑_{x=0}^n (e^t)^x p^x nCx q^(n−x)
= ∑_{x=0}^n (pe^t)^x nCx q^(n−x)
∴ M_X(t) = (q + pe^t)^n
Example 1.6.2
Find the mean and variance of the binomial distribution.

Solution
M_X(t) = (q + pe^t)^n
∴ M′_X(t) = n(q + pe^t)^(n−1) · pe^t
Put t = 0: M′_X(0) = n(q + p)^(n−1) · p = np   [since q + p = 1]
Mean = E(X) = M′_X(0) = np
M″_X(t) = np[(q + pe^t)^(n−1) e^t + e^t (n − 1)(q + pe^t)^(n−2) · pe^t]
Put t = 0:
M″_X(0) = np[(q + p)^(n−1) + (n − 1)(q + p)^(n−2) p]
= np[1 + (n − 1)p]
= np + n²p² − np²
= n²p² + np(1 − p)
M″_X(0) = n²p² + npq   [since 1 − p = q]
E(X²) = M″_X(0) = n²p² + npq
Var(X) = E(X²) − [E(X)]² = n²p² + npq − n²p² = npq
Var(X) = npq
S.D. = √(npq)
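Mean = np and Var = npq can be sanity-checked by simulation; a sketch with illustrative parameters n = 20, p = 0.3 (so np = 6 and npq = 4.2):

```python
import random

random.seed(1)
n, p = 20, 0.3
trials = 100_000

def binom_sample():
    # count successes in n Bernoulli(p) trials
    return sum(random.random() < p for _ in range(n))

samples = [binom_sample() for _ in range(trials)]
m = sum(samples) / trials
v = sum((s - m) ** 2 for s in samples) / trials
print(m, v)  # close to np = 6.0 and npq = 4.2
```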
Example 1.6.3
Find the MGF of the binomial distribution about its mean (np).

Solution
WKT the MGF of a random variable X about any point 'a' is
M_X(t) (about X = a) = E[e^(t(X−a))]
Here a = np, the mean of the binomial distribution.
M_X(t) (about X = np) = E[e^(t(X−np))]
= e^(−tnp) · E[e^(tX)]
= e^(−tnp) · (q + pe^t)^n
= (e^(−tp))^n · (q + pe^t)^n
∴ MGF about the mean = (e^(−tp))^n (q + pe^t)^n
Example 1.6.4
Additive property of the binomial distribution.

Solution
Let X and Y be independent binomial variates with MGFs
M_X(t) = (q1 + p1 e^t)^(n1), M_Y(t) = (q2 + p2 e^t)^(n2)
∴ M_{X+Y}(t) = M_X(t) · M_Y(t)   [X & Y are independent R.V.'s]
= (q1 + p1 e^t)^(n1) · (q2 + p2 e^t)^(n2)

Example 1.6.5
If M_X(t) = (q + pe^t)^(n1) and M_Y(t) = (q + pe^t)^(n2), then
M_{X+Y}(t) = (q + pe^t)^(n1+n2), so X + Y is a binomial variate with parameters n1 + n2 and p
(the additive property).
n = 25
∴ The binomial distribution is
P(X = x) = p(x) = nCx p^x q^(n−x) = 25Cx (1/5)^x (4/5)^(25−x), x = 0, 1, 2, …, 25
The MGF of the Poisson distribution is M_X(t) = e^(λ(e^t − 1)). The mean and variance can
also be obtained directly from the p.m.f.:
Mean = E[X] = ∑_{x=0}^∞ x · e^(−λ)λ^x/x!
= e^(−λ) ∑_{x=1}^∞ λ^x/(x − 1)!
= λ e^(−λ) ∑_{x=1}^∞ λ^(x−1)/(x − 1)!
= λ e^(−λ) [1 + λ + λ²/2! + …]
= λ e^(−λ) · e^λ = λ
Mean = λ
µ′2 = E[X²] = ∑_{x=0}^∞ x² p(x) = ∑_{x=0}^∞ x² · e^(−λ)λ^x/x!
= ∑_{x=0}^∞ {x(x − 1) + x} · e^(−λ)λ^x/x!
= ∑_{x=0}^∞ x(x − 1) e^(−λ)λ^x/x! + ∑_{x=0}^∞ x e^(−λ)λ^x/x!
= e^(−λ)λ² ∑_{x=2}^∞ λ^(x−2)/(x − 2)! + λ
= e^(−λ)λ² [1 + λ/1! + λ²/2! + …] + λ
= e^(−λ)λ² e^λ + λ
= λ² + λ
Variance µ2 = E(X²) − [E(X)]² = λ² + λ − λ² = λ
Variance = λ
Hence Mean = Variance = λ.
Note: The sum of independent Poisson variates is also a Poisson variate.
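Mean = Variance = λ can be confirmed by summing the p.m.f. directly; a sketch with an illustrative λ = 2.5 (the series is truncated at x = 99, where the terms are negligible):

```python
import math

lam = 2.5
# Poisson p.m.f. e^{-λ} λ^x / x!, truncated far into the tail
pmf = [math.exp(-lam) * lam**x / math.factorial(x) for x in range(100)]
mean = sum(x * p for x, p in enumerate(pmf))
second = sum(x * x * p for x, p in enumerate(pmf))
var = second - mean**2
print(round(mean, 6), round(var, 6))  # 2.5 2.5
```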
Given P(X = 1) = 3/10 and P(X = 2) = 1/5 for a Poisson variate X:
P(X = 1) = e^(−λ)λ = 3/10   (1)
P(X = 2) = e^(−λ)λ²/2! = 1/5, i.e. e^(−λ)λ² = 2/5   (2)
(1)/(2) ⇒ 1/λ = (3/10)/(2/5) = 3/4
∴ λ = 4/3
∴ P(X = 0) = e^(−λ)λ⁰/0! = e^(−4/3)
P(X = 3) = e^(−λ)λ³/3! = e^(−4/3)(4/3)³/3!
Example 1.7.2
If X is a Poisson variable with
P(X = 2) = 9 P(X = 4) + 90 P(X = 6),
find (i) the mean of X (ii) the variance of X.

Solution
P(X = x) = e^(−λ)λ^x/x!, x = 0, 1, 2, …
Given P(X = 2) = 9 P(X = 4) + 90 P(X = 6)
e^(−λ)λ²/2! = 9 e^(−λ)λ⁴/4! + 90 e^(−λ)λ⁶/6!
1/2 = 9λ²/4! + 90λ⁴/6!
1/2 = 3λ²/8 + λ⁴/8
1 = 3λ²/4 + λ⁴/4
λ⁴ + 3λ² − 4 = 0
(λ² − 1)(λ² + 4) = 0 ⇒ λ² = 1 or λ² = −4
λ = ±1 or λ = ±2i
Since λ > 0, λ = 1.
∴ Mean = λ = 1, Variance = λ = 1
∴ Standard Deviation = 1
1.7.3 Derive the probability mass function of the Poisson distribution as a limiting case of the
Binomial distribution.

Solution
We know that the Binomial distribution is
P(X = x) = nCx p^x q^(n−x) = [n!/((n − x)! x!)] p^x (1 − p)^(n−x)
Put λ = np, i.e. p = λ/n:
P(X = x) = [n(n − 1)(n − 2) ⋯ (n − x + 1)/x!] (λ/n)^x (1 − λ/n)^(n−x)
= (λ^x/x!) · 1(1 − 1/n)(1 − 2/n) ⋯ (1 − (x − 1)/n) · (1 − λ/n)^(n−x)
When n → ∞,
lim_{n→∞} (1 − λ/n)^(n−x) = e^(−λ)
and lim_{n→∞} (1 − 1/n) = lim_{n→∞} (1 − 2/n) = … = lim_{n→∞} (1 − (x − 1)/n) = 1
∴ P(X = x) = λ^x e^(−λ)/x!, x = 0, 1, 2, … ∞
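The limiting behaviour can be observed numerically: holding λ = np fixed while n grows, the binomial p.m.f. approaches the Poisson p.m.f. A sketch (λ = 3 and x = 2 are illustrative choices):

```python
import math

def binom_pmf(x, n, p):
    return math.comb(n, x) * p**x * (1 - p)**(n - x)

def poisson_pmf(x, lam):
    return math.exp(-lam) * lam**x / math.factorial(x)

lam, x = 3.0, 2
for n in (10, 100, 10_000):
    print(n, round(binom_pmf(x, n, lam / n), 5))
print(round(poisson_pmf(x, lam), 5))  # limiting value
```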
For the geometric distribution P[X = x] = q^(x−1)p, x = 1, 2, …, the MGF is
M_X(t) = pe^t/(1 − qe^t). Then
M′_X(t) = [(1 − qe^t)pe^t − pe^t(−qe^t)]/(1 − qe^t)² = pe^t/(1 − qe^t)²
∴ E(X) = M′_X(0) = p/(1 − q)² = 1/p
∴ Mean = 1/p
M″_X(t) = d/dt [pe^t/(1 − qe^t)²]
= [(1 − qe^t)² pe^t − pe^t · 2(1 − qe^t)(−qe^t)]/(1 − qe^t)⁴
= [(1 − qe^t)² pe^t + 2pe^t qe^t (1 − qe^t)]/(1 − qe^t)⁴
M″_X(0) = (1 + q)/p²
Var(X) = E(X²) − [E(X)]² = (1 + q)/p² − 1/p² = q/p²
Var(X) = q/p²
Note:
Another form of the geometric distribution is
P[X = x] = q^x p; x = 0, 1, 2, …
with M_X(t) = p/(1 − qe^t), Mean = q/p, Variance = q/p².
Example 1.8.2
If the MGF of X is (5 − 4e^t)^(−1), find the distribution of X and P(X = 5).

Solution
Let the geometric distribution be P(X = x) = q^x p, x = 0, 1, 2, …
Its MGF is
M_X(t) = p/(1 − qe^t)   (1)
Here M_X(t) = (5 − 4e^t)^(−1) = (1/5)/(1 − (4/5)e^t)   (2)
Comparing (1) & (2), we get q = 4/5, p = 1/5.
∴ P(X = x) = pq^x = (1/5)(4/5)^x, x = 0, 1, 2, 3, …
P(X = 5) = (1/5)(4/5)⁵ = 4⁵/5⁶
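The answer P(X = 5) = 4⁵/5⁶ and the identification of the MGF can both be checked numerically; a sketch (t = 0.1 is an arbitrary point inside the MGF's domain of convergence):

```python
import math

# P(X = x) = p q^x with p = 1/5, q = 4/5
p, q = 1 / 5, 4 / 5
px5 = p * q**5
print(round(px5, 6))  # 4^5 / 5^6 = 0.065536

t = 0.1
# compare the series  Σ p q^x e^{tx}  with the closed form (5 - 4e^t)^{-1}
mgf_series = sum(p * (q * math.exp(t))**x for x in range(2000))
mgf_closed = 1 / (5 - 4 * math.exp(t))
print(abs(mgf_series - mgf_closed) < 1e-12)  # True
```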
1.9 CONTINUOUS DISTRIBUTIONS
If X is a continuous random variable, we have the following distributions:
1. Uniform (Rectangular) Distribution
2. Exponential Distribution
3. Gamma Distribution
4. Normal Distribution
1.9.1 Uniform Distribution (Rectangular Distribution)
Def: A random variable X is said to follow the uniform distribution if its p.d.f. is
f(x) = 1/(b − a), a < x < b
     = 0, otherwise
* To find MGF
M_X(t) = ∫_−∞^∞ e^(tx) f(x) dx = ∫_a^b e^(tx) · 1/(b − a) dx
= (1/(b − a)) [e^(tx)/t]_a^b
= (e^(bt) − e^(at))/((b − a)t)
∴ The MGF of the uniform distribution is
M_X(t) = (e^(bt) − e^(at))/((b − a)t), t ≠ 0
* To find Mean and Variance
E(X) = ∫_−∞^∞ x f(x) dx = ∫_a^b x/(b − a) dx = (1/(b − a))[x²/2]_a^b
= (b² − a²)/(2(b − a)) = (a + b)/2
Mean µ′1 = (a + b)/2
Putting r = 2 in (A), we get
µ′2 = ∫_a^b x² f(x) dx = ∫_a^b x²/(b − a) dx = (1/(b − a))[x³/3]_a^b = (a² + ab + b²)/3
∴ Variance = µ′2 − µ′1²
= (a² + ab + b²)/3 − ((a + b)/2)² = (b − a)²/12
Variance = (b − a)²/12
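Mean = (a + b)/2 and Variance = (b − a)²/12 can be verified with exact arithmetic; a sketch with illustrative endpoints a = 2, b = 10:

```python
from fractions import Fraction

a, b = Fraction(2), Fraction(10)
# E(X) = ∫_a^b x/(b-a) dx = (b² - a²) / (2(b - a))
mean = (b**2 - a**2) / (2 * (b - a))
# µ′2 = ∫_a^b x²/(b-a) dx = (b³ - a³) / (3(b - a)) = (a² + ab + b²)/3
second = (b**3 - a**3) / (3 * (b - a))
var = second - mean**2
print(mean, var)  # (a+b)/2 = 6 and (b-a)²/12 = 16/3
```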
Note:
1. The distribution function F(x) is given by
F(x) = 0, −∞ < x < a
     = (x − a)/(b − a), a ≤ x ≤ b
     = 1, b < x < ∞
2. The p.d.f. of a uniform variate X in (−a, a) is given by
f(x) = 1/(2a), −a < x < a
     = 0, otherwise
For the exponential distribution f(x) = λe^(−λx), x > 0, the MGF is
M_X(t) = λ/(λ − t) = 1/(1 − t/λ) = (1 − t/λ)^(−1), t < λ
= 1 + t/λ + t²/λ² + … + t^r/λ^r + … = ∑_{r=0}^∞ (t/λ)^r
∴ Mean µ′1 = coefficient of t/1! = 1/λ
µ′2 = coefficient of t²/2! = 2/λ²
Variance µ2 = µ′2 − µ′1² = 2/λ² − 1/λ² = 1/λ²
Variance = 1/λ², Mean = 1/λ
Example 1.10.1
Let X be a random variable with p.d.f.
f(x) = (1/3)e^(−x/3), x > 0
     = 0, otherwise
Find (1) P(X > 3) (2) the MGF of X.

Solution
WKT the exponential distribution is f(x) = λe^(−λx), x > 0. Here λ = 1/3.
P(X > 3) = ∫_3^∞ f(x) dx = ∫_3^∞ (1/3)e^(−x/3) dx = [−e^(−x/3)]_3^∞ = e^(−1)
∴ P(X > 3) = e^(−1)
MGF: M_X(t) = λ/(λ − t) = (1/3)/((1/3) − t) = 1/(1 − 3t)
∴ M_X(t) = 1/(1 − 3t)
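P(X > 3) = e^(−1) follows from the exponential survival function P(X > x) = e^(−λx); a one-line numerical check:

```python
import math

lam = 1 / 3
# P(X > 3) = ∫_3^∞ λ e^{-λx} dx = e^{-3λ} = e^{-1}
p = math.exp(-lam * 3)
print(abs(p - math.exp(-1)) < 1e-12)  # True
```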
Note
If X is exponentially distributed, then
P(X > s + t | X > s) = P(X > t), for any s, t > 0 (the memoryless property).
TUTORIAL QUESTIONS
Example 1
The p.d.f. of a continuous random variable X is
f(x) = 6x(1 − x), 0 < x < 1
     = 0, otherwise
Find the c.d.f. of X.

Solution
The c.d.f. is F(x) = ∫_−∞^x f(x) dx, −∞ < x < ∞.
(i) When x < 0:
F(x) = ∫_−∞^x 0 dx = 0
(ii) When 0 < x < 1:
F(x) = ∫_−∞^0 f(x) dx + ∫_0^x f(x) dx
= 0 + ∫_0^x 6x(1 − x) dx = 6[x²/2 − x³/3]_0^x = 3x² − 2x³
(iii) When x > 1:
F(x) = ∫_−∞^0 0 dx + ∫_0^1 6x(1 − x) dx + ∫_1^x 0 dx
= 6 ∫_0^1 (x − x²) dx = 1
Example :2
A random variable X has the following probability function
Values of X 0 1 2 3 4 5 6 7 8
Probability P(X) a 3a 5a 7a 9a 11a 13a 15a 17a
Since total probability is 1, ∑ p(x(i)) = a + 3a + … + 17a = 81a = 1, so a = 1/81.
Example 3
The mean and SD of a binomial distribution are 5 and 2; determine the distribution.

Solution
Given Mean = np = 5   (1)
SD = √(npq) = 2 ⇒ npq = 4   (2)
(2)/(1) ⇒ npq/np = 4/5 ⇒ q = 4/5
∴ p = 1 − 4/5 = 1/5   (3)
Sub (3) in (1): n × 1/5 = 5 ⇒ n = 25
∴ The binomial distribution is
P(X = x) = p(x) = nCx p^x q^(n−x) = 25Cx (1/5)^x (4/5)^(25−x), x = 0, 1, 2, …, 25
UNIT – II
Introduction
In the previous chapter we studied various aspects of the theory of a single R.V. In this
chapter we extend the theory to include two R.V.'s, one for each coordinate axis X and Y
of the XY plane.
DEFINITION: Let S be the sample space. Let X = X(s) & Y = Y(s) be two functions, each
assigning a real number to each outcome s ∈ S. Then (X, Y) is a two dimensional random
variable.
2.1 Types of random variables
1. Discrete R.V.’s
2. Continuous R.V.’s
Discrete R.V.’s (Two Dimensional Discrete R.V.’s)
If the possible values of (X, Y) are finite, then (X, Y) is called a two dimensional discrete
R.V., and its values can be represented by (x(i), y(j)), i = 1, 2, …, m; j = 1, 2, …, n.
In the study of two dimensional discrete R.V.'s we have the following five important terms:
• Joint Probability Function (JPF) (or) Joint Probability Mass Function.
• Joint Probability Distribution.
• Marginal Probability Function of X.
• Marginal Probability Function of Y.
• Conditional Probability Function.
P_Y(y) = 15/28, y = 0
       = 3/7,  y = 1
       = 1/28, y = 2
Example 2.3.1
Show that the function
f(x, y) = (2/5)(2x + 3y), 0 < x < 1, 0 < y < 1
        = 0, otherwise
is a joint density function of X and Y.

Solution
Given f(x, y) = (2/5)(2x + 3y), 0 < x < 1, 0 < y < 1; 0 otherwise.
(i) f(x, y) ≥ 0 in the given region 0 < x, y < 1.
(ii) ∫_−∞^∞ ∫_−∞^∞ f(x, y) dx dy = ∫_0^1 ∫_0^1 (2/5)(2x + 3y) dx dy
= (2/5) ∫_0^1 [x² + 3xy]_0^1 dy = (2/5) ∫_0^1 (1 + 3y) dy
= (2/5)[y + 3y²/2]_0^1 = (2/5)(1 + 3/2) = (2/5)(5/2) = 1
Since f(x, y) satisfies both conditions, it is a joint density function.
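That the density integrates to 1 can be confirmed with a midpoint rule on the unit square (the midpoint rule is exact per cell for a linear integrand, so only rounding error remains):

```python
# Midpoint-rule check that f(x, y) = (2/5)(2x + 3y) integrates to 1 on the unit square
N = 500
h = 1.0 / N
total = 0.0
for i in range(N):
    for j in range(N):
        x = (i + 0.5) * h  # cell midpoints
        y = (j + 0.5) * h
        total += (2 / 5) * (2 * x + 3 * y) * h * h
print(round(total, 6))  # 1.0
```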
Example 2.3.2
The joint p.d.f. of the random variables X and Y is
f(x, y) = 8xy, 0 < x < 1, 0 < y < x
        = 0, otherwise
Find (i) f_X(x) (ii) f_Y(y) (iii) f(y/x).

Solution
We know that
(i) the marginal p.d.f. of X is
f_X(x) = ∫_−∞^∞ f(x, y) dy = ∫_0^x 8xy dy = 4x³
f_X(x) = 4x³, 0 < x < 1
Result
Marginal p.d.f. of X: f_X(x) = 4x³, 0 < x < 1
Marginal p.d.f. of Y: f_Y(y) = ∫_y^1 8xy dx = 4y(1 − y²), 0 < y < 1
Conditional density: f(y/x) = f(x, y)/f_X(x) = 2y/x², 0 < y < x, 0 < x < 1
2.4 REGRESSION
* Lines of regression
The line of regression of X on Y is given by
x − x̄ = r · (σx/σy)(y − ȳ)
The line of regression of Y on X is given by
y − ȳ = r · (σy/σx)(x − x̄)
* Angle between the two lines of regression:
tan θ = [(1 − r²)/r] · σx σy/(σx² + σy²)
* Regression coefficients
Regression coefficient of Y on X: b_YX = r · σy/σx
Regression coefficient of X on Y: b_XY = r · σx/σy
∴ Correlation coefficient r = ±√(b_XY × b_YX)
Example 2.4.1
From the following data, find
(i) the two regression equations,
(ii) the coefficient of correlation between the marks in Economics and Statistics,
(iii) the most likely marks in Statistics when the marks in Economics are 30.
Marks in Economics 25 28 35 32 31 36 29 38 34 32
Marks in Statistics 43 46 49 41 36 32 31 30 33 39

Solution
X   Y   X − X̄ = X − 32   Y − Ȳ = Y − 38   (X − X̄)²   (Y − Ȳ)²   (X − X̄)(Y − Ȳ)
25  43  −7   5    49   25   −35
28  46  −4   8    16   64   −32
35  49   3  11     9  121    33
32  41   0   3     0    9     0
31  36  −1  −2     1    4     2
36  32   4  −6    16   36   −24
29  31  −3  −7     9   49    21
38  30   6  −8    36   64   −48
34  33   2  −5     4   25   −10
32  39   0   1     0    1     0
Totals: 320  380  0  0  140  398  −93
Here X̄ = ∑X/n = 320/10 = 32 and Ȳ = ∑Y/n = 380/10 = 38.
Coefficient of regression of Y on X:
b_YX = ∑(X − X̄)(Y − Ȳ)/∑(X − X̄)² = −93/140 = −0.6643
Coefficient of regression of X on Y:
b_XY = ∑(X − X̄)(Y − Ȳ)/∑(Y − Ȳ)² = −93/398 = −0.2337
Equation of the line of regression of X on Y:
X − X̄ = b_XY (Y − Ȳ)
X − 32 = −0.2337(Y − 38)
X = −0.2337Y + 40.8806
Equation of the line of regression of Y on X:
Y − Ȳ = b_YX (X − X̄)
Y − 38 = −0.6643(X − 32)
Y = −0.6643X + 59.2576
Coefficient of correlation:
r² = b_YX × b_XY = (−0.6643)(−0.2337) = 0.1552
r = ±0.394; since both regression coefficients are negative, r = −0.394.
The most likely marks in Statistics (Y) when the marks in Economics (X) are 30:
Y = −0.6643(30) + 59.2576 = 39.33 ≈ 39
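The regression computation can be reproduced from the raw marks; a sketch (data taken from the example above):

```python
xs = [25, 28, 35, 32, 31, 36, 29, 38, 34, 32]  # Economics
ys = [43, 46, 49, 41, 36, 32, 31, 30, 33, 39]  # Statistics
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
sxx = sum((x - mx) ** 2 for x in xs)
syy = sum((y - my) ** 2 for y in ys)
b_yx = sxy / sxx  # regression coefficient of Y on X
b_xy = sxy / syy  # regression coefficient of X on Y
r = -((b_yx * b_xy) ** 0.5)  # both coefficients negative -> r negative
y_at_30 = my + b_yx * (30 - mx)
print(round(b_yx, 4), round(b_xy, 4), round(r, 3), round(y_at_30))
# -0.6643 -0.2337 -0.394 39
```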
2.5 COVARIANCE
Def : If X and Y are random variables, then Covariance between X and Y is defined as
Cov (X, Y) = E(XY) – E(X) . E(Y)
Cov (X, Y) = 0 [If X & Y are independent]
2.6 CORRELATION
Types of Correlation
• Positive Correlation
(If two variables deviate in same direction)
• Negative Correlation
(If two variables constantly deviate in opposite direction)
Solution
X   Y   U = X − 68   V = Y − 68   UV   U²   V²
65  67  −3  −1   3   9   1
66  68  −2   0   0   4   0
67  65  −1  −3   3   1   9
67  68  −1   0   0   1   0
68  72   0   4   0   0  16
69  72   1   4   4   1  16
70  69   2   1   2   4   1
72  71   4   3  12  16   9
Totals: ∑U = 0, ∑V = 8, ∑UV = 24, ∑U² = 36, ∑V² = 52
Now
Ū = ∑U/n = 0/8 = 0
V̄ = ∑V/n = 8/8 = 1
Cov(X, Y) = Cov(U, V) = ∑UV/n − ŪV̄ = 24/8 − 0 = 3   (1)
σ_U = √(∑U²/n − Ū²) = √(36/8 − 0) = 2.121   (2)
σ_V = √(∑V²/n − V̄²) = √(52/8 − 1) = 2.345   (3)
∴ r(X, Y) = r(U, V) = Cov(U, V)/(σ_U σ_V) = 3/(2.121 × 2.345) = 0.6031   (by 1, 2, 3)
Example 2.6.2
Let X be a random variable with p.d.f. f(x) = 1/2, −1 ≤ x ≤ 1, and let Y = X². Find the
correlation coefficient between X and Y.

Solution
E(X) = ∫_−∞^∞ x f(x) dx = ∫_−1^1 x · (1/2) dx = (1/2)[x²/2]_−1^1 = 0
E(X) = 0
E(Y) = E(X²) = ∫_−1^1 x² · (1/2) dx = (1/2)[x³/3]_−1^1 = 1/3
E(XY) = E(X · X²) = E(X³) = ∫_−1^1 x³ · (1/2) dx = (1/2)[x⁴/4]_−1^1 = 0
E(XY) = 0
∴ r(X, Y) = ρ(X, Y) = Cov(X, Y)/(σ_X σ_Y) = 0
ρ = 0
Note: Since E(X) = 0 and E(XY) = 0, Cov(X, Y) = 0 and we need not compute σ_X and σ_Y.
Under a transformation (X, Y) → (U, V), the joint p.d.f. transforms as
f_UV(u, v) = f_XY(x, y) |∂(x, y)/∂(u, v)|
Example 1
If the joint p.d.f. of (X, Y) is given by f_XY(x, y) = x + y, 0 ≤ x, y ≤ 1, find the p.d.f. of U = XY.

Solution
Given f_XY(x, y) = x + y and U = XY. Let V = Y.
Then x = u/v and y = v.
∂x/∂u = 1/v, ∂x/∂v = −u/v², ∂y/∂u = 0, ∂y/∂v = 1   (1)
J = ∂(x, y)/∂(u, v) = | 1/v  −u/v² ; 0  1 | = 1/v
⇒ |J| = 1/v   (2)
The joint p.d.f. of (U, V) is
f_UV(u, v) = f_XY(x, y)|J| = (x + y)(1/v) = (1/v)(u/v + v)   (3)
The range of v: since 0 ≤ y ≤ 1, we have 0 ≤ v ≤ 1 (since V = Y).
The range of u: 0 ≤ x ≤ 1 ⇒ 0 ≤ u/v ≤ 1 ⇒ 0 ≤ u ≤ v.
Hence the p.d.f. of (U, V) is
f_UV(u, v) = (1/v)(u/v + v), 0 ≤ u ≤ v, 0 ≤ v ≤ 1
Now
f_U(u) = ∫_−∞^∞ f_UV(u, v) dv = ∫_u^1 (u/v² + 1) dv
= [−u/v + v]_u^1 = (−u + 1) − (−1 + u) = 2(1 − u)
∴ f_U(u) = 2(1 − u), 0 < u < 1
Summary: p.d.f. of (U, V): f_UV(u, v) = (1/v)(u/v + v), 0 ≤ u ≤ v, 0 ≤ v ≤ 1;
p.d.f. of U = XY: f_U(u) = 2(1 − u), 0 < u < 1.
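The derived density f_U(u) = 2(1 − u) can be checked by Monte Carlo: sample (X, Y) from f(x, y) = x + y by rejection, form U = XY, and compare an empirical probability with the theoretical P(U ≤ 0.5) = ∫_0^0.5 2(1 − u) du = 0.75. A sketch:

```python
import random

random.seed(7)

def sample_xy():
    # rejection sampling from f(x, y) = x + y on the unit square (max value 2)
    while True:
        x, y = random.random(), random.random()
        if random.random() * 2.0 < x + y:
            return x, y

n = 200_000
us = [x * y for x, y in (sample_xy() for _ in range(n))]
emp = sum(u <= 0.5 for u in us) / n
print(abs(emp - 0.75) < 0.01)  # True with high probability
```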
TUTORIAL QUESTIONS
WORKEDOUT EXAMPLES
Example 1
The joint p.d.f. of the random variables X and Y is
f(x, y) = 8xy, 0 < x < 1, 0 < y < x
        = 0, otherwise
Find (i) f_X(x) (ii) f_Y(y) (iii) f(y/x).

Solution
We know that
(i) the marginal p.d.f. of X is
f_X(x) = ∫_−∞^∞ f(x, y) dy = ∫_0^x 8xy dy = 4x³
f_X(x) = 4x³, 0 < x < 1
Solution
E(X) = ∫_−∞^∞ x f(x) dx = ∫_−1^1 x · (1/2) dx = (1/2)[x²/2]_−1^1 = 0
E(X) = 0
E(Y) = E(X²) = ∫_−1^1 x² · (1/2) dx = (1/2)[x³/3]_−1^1 = 1/3
E(XY) = E(X · X²) = E(X³) = ∫_−1^1 x³ · (1/2) dx = (1/2)[x⁴/4]_−1^1 = 0
E(XY) = 0
∴ r(X, Y) = ρ(X, Y) = Cov(X, Y)/(σ_X σ_Y) = 0
ρ = 0
Note: Since E(X) = 0 and E(XY) = 0, Cov(X, Y) = 0 and we need not compute σ_X and σ_Y.
Result
Marginal p.d.f. of X: f_X(x) = 4x³, 0 < x < 1
Marginal p.d.f. of Y: f_Y(y) = 4y(1 − y²), 0 < y < 1
Conditional density: f(y/x) = 2y/x², 0 < y < x, 0 < x < 1
UNIT - III
RANDOM PROCESSES
Introduction
In chapter 1 we discussed random variables. A random variable is a function of the possible
outcomes of an experiment, but it does not include the concept of time. In real situations
we come across many time varying functions which are random in nature. In electrical and
electronics engineering we study signals.
Generally, signals are classified into two types:
(i) Deterministic
(ii) Random
Both deterministic and random signals are functions of time. For a deterministic signal it
is possible to determine the value at any given time, but this is not possible for a random
signal, since some element of uncertainty is always associated with it. The probability model
used for characterizing a random signal is called a random process or stochastic process.
REMARK
i) If t is fixed, then {X(s, t)} is a random variable.
ii) If s and t are both fixed, {X(s, t)} is a number.
iii) If s is fixed, {X(s, t)} is a single time function (a sample function).
NOTATION
Hereafter we denote the random process {X(s, t)} by {X(t)}, where the index set T is assumed
to be continuous. A discrete-time process is denoted by {X(n)} or {Xn}.
Deterministic Process
A process is called deterministic if future value of any sample function can be predicted
from past values.
Example 3.3.1
Show that a first order stationary process has a constant mean.

Solution
Consider the random process {X(t)} at two different times t1 and t2.
E[X(t1)] = ∫_−∞^∞ x f(x, t1) dx   [f(x, t1) is the first order density of X(t1)]
E[X(t2)] = ∫_−∞^∞ x f(x, t2) dx   [f(x, t2) is the first order density of X(t2)]
Let t2 = t1 + C. For a first order stationary process, f(x, t1 + C) = f(x, t1), so
E[X(t2)] = ∫_−∞^∞ x f(x, t1 + C) dx = ∫_−∞^∞ x f(x, t1) dx = E[X(t1)]
Thus E[X(t2)] = E[X(t1)]: the mean of the random process {X(t)} is constant.

Definition 2:
If the process is first order stationary, then Mean = E[X(t)] = constant.
3.3.4 Second Order Stationary Process
A random process is said to be second order stationary if the second order density
function is stationary, i.e.
    f(x1, x2; t1, t2) = f(x1, x2; t1 + C, t2 + C)  ∀ x1, x2 and C.
In this case E(X1²), E(X2²) and E(X1 X2) do not change with time, where
X1 = X(t1), X2 = X(t2).
3.3.5 Strongly Stationary Process
A random process is called a strongly stationary process or Strict Sense Stationary
process (SSS process) if all its finite dimensional distributions are invariant under translation of
time 't':
    f_X(x1, x2; t1, t2) = f_X(x1, x2; t1 + C, t2 + C)
    f_X(x1, x2, x3; t1, t2, t3) = f_X(x1, x2, x3; t1 + C, t2 + C, t3 + C)
In general,
    f_X(x1, x2, …, xn; t1, t2, …, tn) = f_X(x1, x2, …, xn; t1 + C, t2 + C, …, tn + C)
for any t1, …, tn and any real number C.
Let X(t1) and X(t2) be two random variables of the random process {X(t)}. The auto
correlation function is
    R_XX(t1, t2) = E{X(t1) X(t2)}    (1)
Mean Square Value
Putting t1 = t2 = t in (1), we get
    R_XX(t, t) = E[X(t) X(t)]
⇒ R_XX(t, t) = E[X²(t)], the mean square value of the random process.
3.3.8 Auto Covariance of a Random Process
    C_XX(t1, t2) = E{[X(t1) − E(X(t1))] [X(t2) − E(X(t2))]}
                 = R_XX(t1, t2) − E[X(t1)] E[X(t2)]
Correlation Coefficient
The correlation coefficient of the random process {X(t)} is defined as
    ρ_XX(t1, t2) = C_XX(t1, t2) / √(Var X(t1) · Var X(t2))
where C_XX(t1, t2) denotes the auto covariance.
3.7 ERGODIC RANDOM PROCESS
Time Average
The time average of a random process {X(t)} is defined as
    X̄_T = (1/2T) ∫_{−T}^{T} X(t) dt
Ensemble Average
The ensemble average of a random process {X(t)} is the expected value of the random
variable X at time t
Ensemble Average = E[X(t)]
Ergodic Random Process
{X(t)} is said to be mean Ergodic
If lim_{T→∞} X̄_T = µ,
i.e. ⇒ lim_{T→∞} (1/2T) ∫_{−T}^{T} X(t) dt = µ
Mean Ergodic Theorem
Let {X(t)} be a random process with constant mean µ and let X T be its time average.
Then {X(t)} is mean ergodic if
    lim_{T→∞} Var(X̄_T) = 0
Correlation Ergodic Process
The stationary process {X(t)} is said to be correlation ergodic if the process {Y(t)} is
mean ergodic where
    Y(t) = X(t) X(t + λ)
E[Y(t)] = lim_{T→∞} Ȳ_T, where Ȳ_T is the time average of Y(t).
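As a numeric sketch of mean ergodicity (assumed example, not from the text): for the process X(t) = cos(ωt + θ) the ensemble mean is 0 for every t, and the time average of a single sample path also tends to 0 as T grows.

```python
import math
import random

def time_average(theta, omega=2.0, T=500.0, n=200_000):
    """Riemann-sum approximation of X_T = (1/2T) * integral_{-T}^{T} cos(omega*t + theta) dt."""
    dt = 2.0 * T / n
    total = 0.0
    for k in range(n):
        t = -T + (k + 0.5) * dt
        total += math.cos(omega * t + theta)
    return total * dt / (2.0 * T)

theta = random.Random(0).uniform(0.0, 2.0 * math.pi)  # one fixed outcome of the process
x_bar = time_average(theta)  # close to the ensemble mean E[X(t)] = 0
```

The exact time average is bounded by 1/(ωT), so it shrinks as T grows, matching the mean ergodic theorem.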
1. The probability of it raining today depends only on the weather conditions that existed
during the last two days, and not on earlier weather conditions.
2. A difference equation is Markovian.
Markov Process
    Pn(t) = e^{−λt} (λt)^n / n!
P[X(t1) = n1, X(t2) = n2],  t2 > t1
= P[X(t1) = n1] · P[the event occurs n2 − n1 times in the interval (t2 − t1)]
= ( e^{−λt1} (λt1)^{n1} / n1! ) · ( e^{−λ(t2 − t1)} {λ(t2 − t1)}^{n2 − n1} / (n2 − n1)! ),  n2 ≥ n1
= e^{−λt2} λ^{n2} t1^{n1} (t2 − t1)^{n2 − n1} / ( n1! (n2 − n1)! ),  n2 ≥ n1
= 0, otherwise
Also,
    P[N(t) = r] = e^{−λt} (λt)^r / r!,  r = 0, 1, 2, …
[Figure: a typical sample function of the random telegraph process, alternating between the levels +1 and −1.]
TUTORIAL QUESTIONS
Example 2: Give an example of a stationary random process and justify your claim.
Solution:
Let us consider the random process X(t) = A cos(ωt + θ), where A and ω are constants and θ is
a random variable uniformly distributed in the interval (0, 2π).
Since θ is uniformly distributed in (0, 2π), we have
    f(θ) = 1/(2π),  0 < θ < 2π
         = 0,  otherwise
∴ E[X(t)] = ∫_{−∞}^{∞} X(t) f(θ) dθ
    = ∫_{0}^{2π} A cos(ωt + θ) (1/(2π)) dθ
    = (A/(2π)) [sin(ωt + θ)]_{0}^{2π}
    = (A/(2π)) [sin(2π + ωt) − sin(ωt + 0)]
    = (A/(2π)) [sin ωt − sin ωt]
    = 0, a constant
Since E[X(t)] is a constant, the process X(t) is a stationary random process.
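The same conclusion can be checked numerically (an illustrative sketch; the constants below are arbitrary): averaging A cos(ωt + θ) over many independent draws of θ ~ U(0, 2π), at a fixed t, gives approximately zero.

```python
import math
import random

rng = random.Random(123)
A, omega, t = 2.0, 3.0, 0.7   # arbitrary illustrative constants
N = 200_000                   # number of Monte Carlo draws of theta
total = 0.0
for _ in range(N):
    theta = rng.uniform(0.0, 2.0 * math.pi)
    total += A * math.cos(omega * t + theta)
ensemble_mean = total / N     # should be close to E[X(t)] = 0
```

Repeating with a different fixed t gives the same answer, which is exactly what a constant mean requires.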
Example 3: Examine whether the Poisson process {X(t)}, given by the probability law
    P[X(t) = n] = e^{−λt} (λt)^n / n!,  n = 0, 1, 2, …,
is stationary or not.
Solution
We know that the mean is given by
    E[X(t)] = Σ_{n=0}^{∞} n Pn(t)
            = Σ_{n=0}^{∞} n e^{−λt} (λt)^n / n!
            = Σ_{n=1}^{∞} e^{−λt} (λt)^n / (n − 1)!
            = e^{−λt} Σ_{n=1}^{∞} (λt)^n / (n − 1)!
            = e^{−λt} (λt) [ 1 + λt/1! + (λt)²/2! + … ]
            = (λt) e^{−λt} e^{λt}
            = λt, which depends on t.
Hence Poisson process is not a stationary process.
UNIT - 4
CORRELATION AND SPECTRAL DENSITY
Introduction
The power spectrum of a time series x(t) describes how the variance of the data x(t) is
distributed over the frequency components into which x(t) may be decomposed. This
distribution of the variance may be described either by a measure µ or by a statistical
cumulative distribution function S(f) = the power contributed by frequencies from 0 up to
f. Given a band of frequencies [a, b) the amount of variance contributed to x(t) by
frequencies lying within the interval [a,b) is given by S(b) - S(a). Then S is called the
spectral distribution function of x.
The spectral density at a frequency f gives the rate of variance contributed by
frequencies in the immediate neighbourhood of f to the variance of x per unit frequency.
PROPERTY: 1
The mean square value of the random process may be obtained from the auto correlation
function R_XX(τ) by putting τ = 0:
    E[X²(t)] = R_XX(0),
which is known as the average power of the random process {X(t)}.
PROPERTY: 2
R XX(τ) is an even function of τ.
R XX (τ) = R XX (-τ)
PROPERTY: 3
If the process {X(t)} contains a periodic component, then the auto correlation function
R_XX(τ) also contains a periodic component of the same period.
PROPERTY: 4
If a random process {X(t)} has no periodic components, and E[X(t)] = X̄, then
    lim_{|τ|→∞} R_XX(τ) = X̄²   (or)   X̄ = √( lim_{|τ|→∞} R_XX(τ) )
i.e., when τ → ∞, the auto correlation function represents the square of the mean of the random
process.
PROPERTY: 5
The auto correlation function of a random process cannot have an arbitrary shape.
Auto Covariance
The auto covariance of the process {X(t)} denoted by C XX (t 1 , t 2 ) or C(t 1 , t 2 ) is defined as
    C_XX(t1, t2) = E{[X(t1) − E(X(t1))] [X(t2) − E(X(t2))]}
4.2 CORRELATION COEFFICIENT
    ρ_XX(t1, t2) = C_XX(t1, t2) / √(Var X(t1) · Var X(t2))
where C_XX(t1, t2) denotes the auto covariance.
PROPERTY : 1
R XY (τ) = R YX (–τ)
PROPERTY : 2
If {X(t)} and {Y(t)} are two random processes, then |R_XY(τ)| ≤ √(R_XX(0) R_YY(0)), where
R_XX(τ) and R_YY(τ) are their respective auto correlation functions.
PROPERTY : 3
If {X(t)} and {Y(t)} are two random processes, then
    |R_XY(τ)| ≤ (1/2) [R_XX(0) + R_YY(0)]
The Fourier transform of x(t) is
    F[x(t)] = X(ω) = ∫_{−∞}^{∞} x(t) e^{−iωt} dt
Here X(ω) is called the "spectrum of x(t)".
Hence x(t) = inverse Fourier transform of X(ω)
    = (1/(2π)) ∫_{−∞}^{∞} X(ω) e^{iωt} dω.
Definition
The average power P(T) of x(t) over the interval (-T, T) is given by
    P(T) = (1/2T) ∫_{−T}^{T} x²(t) dt
    P(T) = (1/(2π)) ∫_{−∞}^{∞} |X_T(ω)|² / (2T) dω    (1)
Definition
The average power P_XX for the random process {X(t)} is given by
    P_XX = lim_{T→∞} (1/2T) ∫_{−T}^{T} E[X²(t)] dt
         = (1/(2π)) ∫_{−∞}^{∞} lim_{T→∞} E[|X_T(ω)|²] / (2T) dω    (2)
The power spectral density of {X(t)} is
    S_XX(ω) = ∫_{−∞}^{∞} R_XX(τ) e^{−iωτ} dτ
Thus, in terms of the frequency f,
    S_XX(f) = ∫_{−∞}^{∞} R_XX(τ) e^{−i2πfτ} dτ
To find R XX(τ) if S XX(ω) or S XX(f) is given
    R_XX(τ) = (1/(2π)) ∫_{−∞}^{∞} S_XX(ω) e^{iωτ} dω   [inverse Fourier transform of S_XX(ω)]
(or)
    R_XX(τ) = ∫_{−∞}^{∞} S_XX(f) e^{i2πfτ} df   [inverse Fourier transform of S_XX(f)]
The value of the spectral density function at zero frequency is equal to the total area
under the graph of the auto correlation function:
    S_XX(f) = ∫_{−∞}^{∞} R_XX(τ) e^{−i2πfτ} dτ
Taking f = 0, we get
    S_XX(0) = ∫_{−∞}^{∞} R_XX(τ) dτ
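A numeric illustration of this relation (an assumed example, not from the text): for the ACF R(τ) = e^{−|τ|}, the total area is ∫ e^{−|τ|} dτ = 2, which must equal S_XX(0).

```python
import math

def Rxx(tau):
    # illustrative autocorrelation function: R(tau) = exp(-|tau|)
    return math.exp(-abs(tau))

# midpoint-rule approximation of S_XX(0) = integral of R(tau) over (-L, L)
L, n = 40.0, 400_000
h = 2.0 * L / n
area = 0.0
for k in range(n):
    tau = -L + (k + 0.5) * h
    area += Rxx(tau)
area *= h   # should approach 2, the exact total area under exp(-|tau|)
```

The truncation at |τ| = 40 costs only e^{−40}, so the numeric area matches the analytic one closely.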
TUTORIAL QUESTIONS
1. Find the ACF of {Y(t)} = A X(t) cos(ω₀t + θ), where X(t) is a zero mean stationary random
process with ACF R_XX(τ), A and ω₀ are constants, and θ is uniformly distributed over (0, 2π)
and independent of X(t).
2. Find the ACF of the periodic time function X(t) = A sin ωt.
3. If X(t) is a WSS process and if Y(t) = X(t + a) − X(t − a), prove that
R_YY(τ) = 2R_XX(τ) − R_XX(τ + 2a) − R_XX(τ − 2a).
4. If X(t) = A sin(ωt + θ), where A and ω are constants and θ is a random variable uniformly
distributed over (−π, π), find the ACF of {Y(t)} where Y(t) = X²(t).
5. Let X(t) and Y(t) be defined by X(t) = A cos ωt + B sin ωt and Y(t) = B cos ωt − A sin ωt,
where ω is a constant and A and B are independent random variables both having zero mean and
variance σ². Find the cross correlation of X(t) and Y(t). Are X(t) and Y(t) jointly W.S.S
processes?
6. Two random processes X(t) and Y(t) are given by X(t) = A cos(ωt + θ), Y(t) = A sin(ωt + θ),
where A and ω are constants and θ is uniformly distributed over (0, 2π). Find the cross
correlation of X(t) and Y(t) and verify that R_XY(−τ) = R_YX(τ).
7. If U(t) = X cos t + Y sin t and V(t) = Y cos t + X sin t, where X and Y are independent random
variables such that E(X) = 0 = E(Y) and E[X²] = E[Y²] = 1, show that U(t) and V(t) are not jointly
W.S.S but are individually stationary in the wide sense.
8. Random processes X(t) and Y(t) are defined by X(t) = A cos(ωt + θ), Y(t) = B cos(ωt + θ),
where A, B and ω are constants and θ is uniformly distributed over (0, 2π). Find the cross
correlation and show that X(t) and Y(t) are jointly W.S.S.
WORKEDOUT EXAMPLES
Example 1. Check whether the following functions are valid auto correlation functions:
(i) 5 sin nπ  (ii) 1/(1 + 9τ²)
Solution:
(i) Given R_XX(τ) = 5 sin nπ
    R_XX(−τ) = 5 sin n(−π) = −5 sin nπ
Since R_XX(τ) ≠ R_XX(−τ), the given function is not a valid auto correlation function.
(ii) Given R_XX(τ) = 1/(1 + 9τ²)
    R_XX(−τ) = 1/(1 + 9(−τ)²) = R_XX(τ)
Since R_XX(τ) is an even function, it can be a valid auto correlation function.
Example : 2
Find the mean and variance of a stationary random process whose auto correlation
function is given by
    R_XX(τ) = 18 + 2/(6 + τ²)
Solution
Given R_XX(τ) = 18 + 2/(6 + τ²)
    X̄² = lim_{|τ|→∞} R_XX(τ) = lim_{|τ|→∞} [18 + 2/(6 + τ²)] = 18 + 0 = 18
    ∴ X̄ = E[X(t)] = √18
We know that
    E[X²(t)] = R_XX(0) = 18 + 2/(6 + 0) = 55/3
    Var{X(t)} = E[X²(t)] − {E[X(t)]}² = 55/3 − 18 = 1/3
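A direct numeric check of this worked example (a sketch, not part of the original solution):

```python
import math

def R(tau):
    # autocorrelation from the example: R(tau) = 18 + 2/(6 + tau^2)
    return 18.0 + 2.0 / (6.0 + tau * tau)

mean_sq = R(1e9)                      # ~ lim_{|tau|->inf} R(tau) = 18
mean = math.sqrt(mean_sq)             # so the mean is sqrt(18)
second_moment = R(0.0)                # E[X^2] = R(0) = 18 + 1/3 = 55/3
variance = second_moment - mean_sq    # ~ 55/3 - 18 = 1/3
```

Evaluating R at a very large τ stands in for the limit, since 2/(6 + τ²) vanishes there.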
Example : 3
Express the autocorrelation function of the process {X′(t)} in terms of the auto correlation
function of the process {X(t)}.
Solution
Consider R_XX′(t1, t2) = E{X(t1) X′(t2)}
    = E[ X(t1) · lim_{h→0} (X(t2 + h) − X(t2)) / h ]
    = lim_{h→0} E[ (X(t1) X(t2 + h) − X(t1) X(t2)) / h ]
    = lim_{h→0} [R_XX(t1, t2 + h) − R_XX(t1, t2)] / h
⇒ R_XX′(t1, t2) = ∂R_XX(t1, t2)/∂t2    (1)
Similarly, R_X′X′(t1, t2) = ∂R_XX′(t1, t2)/∂t1
⇒ R_X′X′(t1, t2) = ∂²R_XX(t1, t2)/∂t1∂t2, by (1)
Example : 4
Two random processes {X(t)} and {Y(t)} are given by
X(t) = A cos(ωt + θ), Y(t) = A sin(ωt + θ), where A and ω are constants and θ is a uniform
random variable over 0 to 2π. Find the cross correlation function.
Solution
By definition, we have
    R_XY(τ) = R_XY(t, t + τ)    (1)
Now R_XY(t, t + τ) = E[X(t) · Y(t + τ)]
    = E[A cos(ωt + θ) · A sin(ω(t + τ) + θ)]
    = A² E[sin(ω(t + τ) + θ) cos(ωt + θ)]    (2)
E[sin(ω(t + τ) + θ) cos(ωt + θ)]
    = (1/(2π)) ∫_{0}^{2π} sin(ωt + ωτ + θ) cos(ωt + θ) dθ
    = (1/(2π)) ∫_{0}^{2π} (1/2) {sin(ωt + ωτ + θ + ωt + θ) + sin(ωt + ωτ + θ − ωt − θ)} dθ
    = (1/(2π)) ∫_{0}^{2π} [sin(2ωt + ωτ + 2θ) + sin(ωτ)] / 2 dθ
    = (1/(4π)) [ −cos(2ωt + ωτ + 2θ)/2 + sin(ωτ) · θ ]_{0}^{2π}
    = (1/(4π)) [ −cos(2ωt + ωτ + 4π)/2 + cos(2ωt + ωτ + 0)/2 + sin(ωτ)(2π − 0) ]
    = (1/(4π)) [0 + 2π sin ωτ]
    = (1/2) sin ωτ    (3)
Substituting (3) in (2), we get
    R_XY(t, t + τ) = (A²/2) sin ωτ
UNIT – 5
LINEAR SYSTEMS WITH RANDOM INPUTS
    Input X(t) → [ LTI System, h(t) ] → Output Y(t)
a) Shows a general single input - output linear system
b) Shows a linear time invariant system
    Y(t) = ∫_{−∞}^{∞} h(u) X(t − u) du
         = ∫_{−∞}^{∞} h(t − u) X(u) du
5.4 UNIT IMPULSE RESPONSE TO THE SYSTEM
If the input of the system is the unit impulse function, then the output or response is the
system weighting function.
Y(t) = h(t)
Which is the system weight function.
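In discrete time the input–output relation becomes a convolution sum; the sketch below (hypothetical names, unit sampling step) also shows that a unit impulse input reproduces the weighting sequence h.

```python
def lti_output(h, x, dt=1.0):
    """Discrete approximation of Y(t) = integral h(u) X(t - u) du."""
    y = []
    for t in range(len(x)):
        acc = 0.0
        for u in range(len(h)):
            if 0 <= t - u < len(x):
                acc += h[u] * x[t - u]
        y.append(acc * dt)
    return y

h = [1.0, 0.5, 0.25]            # sampled impulse response (weighting function)
x = [0.0, 1.0, 0.0, 0.0, 0.0]   # unit impulse at index 1
y = lti_output(h, x)            # the weighting sequence re-appears, shifted to index 1
```

This is exactly the discrete analogue of the statement above: driving the system with an impulse returns its weighting function.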
5.4.1 PROPERTIES OF LINEAR SYSTEMS WITH RANDOM INPUT
Property 1:
If the input X(t) and its output Y(t) are related by
    Y(t) = ∫_{−∞}^{∞} h(u) X(t − u) du, then the system is a linear time - invariant system.
Property 2:
If the input to a time - invariant, stable linear system is a WSS process, then the output
will also be a WSS process, i.e To show that if {X(t)} is a WSS process then the output {Y(t)} is
a WSS process.
Property 3:
If {X(t)} is a WSS process and if Y(t) = ∫_{−∞}^{∞} h(u) X(t − u) du, then
    R_XY(τ) = R_XX(τ) * h(τ)
where * denotes convolution.
Property 4 :
If {X(t)} is a WSS process and if Y(t) = ∫_{−∞}^{∞} h(u) X(t − u) du, then
    R_YY(τ) = R_XY(τ) * h(−τ)
72
MA6451 PROBABILITY AND RANDOM PROCESSES
Property 5:
If {X(t)} is a WSS process and if Y(t) = ∫_{−∞}^{∞} h(u) X(t − u) du, then
    R_YY(τ) = R_XX(τ) * h(τ) * h(−τ)
Property 6:
If {X(t)} is a WSS process and if Y(t) = ∫_{−∞}^{∞} h(u) X(t − u) du, then
    S_XY(ω) = S_XX(ω) H(ω)
Property 7:
If {X(t)} is a WSS process and if Y(t) = ∫_{−∞}^{∞} h(u) X(t − u) du, then
    S_YY(ω) = S_XX(ω) |H(ω)|²
Note:
Instead of taking R_XY(τ) = E[X(t) Y(t + τ)] in properties (3), (4) & (5), if we start
with R_XY(τ) = E[X(t − τ) Y(t)], then the above properties can also be stated as
a) R_XY(τ) = R_XX(τ) * h(−τ)
b) R_YY(τ) = R_XY(τ) * h(τ)
c) R_YY(τ) = R_XX(τ) * h(−τ) * h(τ)
REMARK :
(i) We have written H(ω) H*(ω) = |H(ω)|² because
    H(ω) = F[h(τ)] and H*(ω) = F[h(−τ)]  (for a real impulse response h),
so that F[h(τ)] · F[h(−τ)] = H(ω) H*(ω) = |H(ω)|².
(ii) Equation (c) gives a relationship between the spectral densities of the input and output
processes of the system.
(iii) System transfer function: we call H(ω) = F{h(τ)} the power transfer function or system
transfer function.
SOLVED PROBLEMS ON AUTO CROSS CORRELATION FUNCTIONS OF INPUT
AND OUTPUT
Example :5.4.1
Find the power spectral density of the random telegraph signal.
Solution
We know the auto correlation of the telegraph signal process X(t) is
    R_XX(τ) = e^{−2λ|τ|}
    S_XX(ω) = ∫_{−∞}^{∞} R_XX(τ) e^{−iωτ} dτ
Since |τ| = −τ when τ < 0 and |τ| = τ when τ > 0,
    = ∫_{−∞}^{0} e^{2λτ} e^{−iωτ} dτ + ∫_{0}^{∞} e^{−2λτ} e^{−iωτ} dτ
    = ∫_{−∞}^{0} e^{(2λ−iω)τ} dτ + ∫_{0}^{∞} e^{−(2λ+iω)τ} dτ
    = [ e^{(2λ−iω)τ} / (2λ−iω) ]_{−∞}^{0} + [ e^{−(2λ+iω)τ} / −(2λ+iω) ]_{0}^{∞}
    = (1/(2λ−iω)) (e^{0} − e^{−∞}) − (1/(2λ+iω)) (e^{−∞} − e^{0})
    = (1/(2λ−iω)) (1 − 0) − (1/(2λ+iω)) (0 − 1)
    = 1/(2λ−iω) + 1/(2λ+iω)
    = (2λ + iω + 2λ − iω) / ((2λ−iω)(2λ+iω))
    S_XX(ω) = 4λ / (4λ² + ω²)
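The closed form above can be checked numerically (an illustrative sketch with arbitrary constants): since R(τ) is even, S_XX(ω) reduces to a cosine transform.

```python
import math

lam, omega = 1.5, 2.0   # arbitrary illustrative constants

def Rxx(tau):
    # ACF of the random telegraph signal: exp(-2*lam*|tau|)
    return math.exp(-2.0 * lam * abs(tau))

# midpoint-rule approximation of S_XX(omega) = integral R(tau) cos(omega*tau) dtau
L, n = 30.0, 600_000
h = 2.0 * L / n
S_num = 0.0
for k in range(n):
    tau = -L + (k + 0.5) * h
    S_num += Rxx(tau) * math.cos(omega * tau)
S_num *= h
S_formula = 4.0 * lam / (4.0 * lam ** 2 + omega ** 2)   # the result derived above
```

The numeric transform and the closed form agree to several decimal places, confirming the derivation.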
Example : 5.4.2
A linear time invariant system has an impulse response h(t) = e^{−βt} U(t). Find the power
spectral density of the output Y(t) corresponding to the input X(t).
Solution:
Given X(t) — input, Y(t) — output.
    S_YY(ω) = |H(ω)|² S_XX(ω)
Now H(ω) = ∫_{−∞}^{∞} h(t) e^{−iωt} dt
    = ∫_{0}^{∞} e^{−βt} e^{−iωt} dt
    = [ e^{−(β+iω)t} / −(β+iω) ]_{0}^{∞}
    = −(1/(β+iω)) (e^{−∞} − e^{0})
    = 1/(β+iω)
    |H(ω)| = 1/|β+iω| = 1/√(β² + ω²)
∴ S_YY(ω) = (1/(β² + ω²)) S_XX(ω)
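The magnitude-squared step in this example can be verified numerically (a sketch; β and ω are arbitrary): H(ω) = ∫_{0}^{∞} e^{−βt} e^{−iωt} dt should equal 1/(β + iω), giving |H(ω)|² = 1/(β² + ω²).

```python
import math

beta, omega = 0.8, 1.3                 # arbitrary illustrative constants
H_closed = 1.0 / complex(beta, omega)  # 1/(beta + i*omega)

# midpoint-rule approximation of H(omega) = integral_0^inf e^{-beta t} e^{-i omega t} dt
n, T = 400_000, 40.0
h = T / n
H_num = complex(0.0, 0.0)
for k in range(n):
    t = (k + 0.5) * h
    H_num += math.exp(-beta * t) * complex(math.cos(omega * t), -math.sin(omega * t))
H_num *= h

gain_sq = abs(H_closed) ** 2           # |H|^2 = 1/(beta^2 + omega^2)
```

Truncating the integral at T = 40 only discards a tail of size e^{−32}, so the numeric and closed forms agree closely.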
TUTORIAL QUESTIONS
λ is a real positive constant. Find the ACF of the network’s response Y(t). Find the cross
correlation .
Probability & Random Process
Formulas
UNIT-I (RANDOM VARIABLES)
2) F(x) = P[X ≤ x]  (discrete);   F(x) = P[X ≤ x] = ∫_{−∞}^{x} f(x) dx  (continuous)
3) Mean = E[X] = Σᵢ xᵢ p(xᵢ)  (discrete);   Mean = E[X] = ∫_{−∞}^{∞} x f(x) dx  (continuous)
4) E[X²] = Σᵢ xᵢ² p(xᵢ)  (discrete);   E[X²] = ∫_{−∞}^{∞} x² f(x) dx  (continuous)
5) Var(X) = E(X²) − [E(X)]²
6) Moment = E[Xʳ] = Σᵢ xᵢʳ pᵢ  (discrete);   E[Xʳ] = ∫_{−∞}^{∞} xʳ f(x) dx  (continuous)
7) M.G.F.:  M_X(t) = E[e^{tX}] = Σₓ e^{tx} p(x)  (discrete);
   M_X(t) = E[e^{tX}] = ∫_{−∞}^{∞} e^{tx} f(x) dx  (continuous)
4) E ( aX + b ) = aE ( X ) + b
5) Var ( aX + b ) = a 2 Var ( X )
6) Var ( aX ± bY ) = a 2 Var ( X ) + b 2Var (Y )
7) Standard Deviation = Var ( X )
8) f ( x ) = F ′( x )
9) p( X > a ) = 1 − p( X ≤ a )
p( A ∩ B)
10) p ( A / B ) = , p( B) ≠ 0
p( B)
11) If A and B are independent, then p ( A ∩ B ) = p ( A ) ⋅ p ( B ) .
2) Poisson:  P(X = x) = e^{−λ} λˣ / x!;   M.G.F. e^{λ(eᵗ − 1)};   mean λ;   variance λ
3) Geometric:  P(X = x) = q^{x−1} p (or qˣ p);   M.G.F. peᵗ/(1 − qeᵗ);   mean 1/p;   variance q/p²
4) Uniform:  f(x) = 1/(b − a), a < x < b; 0 otherwise;   M.G.F. (e^{bt} − e^{at})/((b − a)t);
   mean (a + b)/2;   variance (b − a)²/12
5) Exponential:  f(x) = λe^{−λx}, x > 0, λ > 0; 0 otherwise;   M.G.F. λ/(λ − t);
   mean 1/λ;   variance 1/λ²
6) Gamma:  f(x) = e^{−x} x^{λ−1} / Γ(λ), 0 < x < ∞, λ > 0;   M.G.F. 1/(1 − t)^λ;
   mean λ;   variance λ
7) Normal:  f(x) = (1/(σ√(2π))) e^{−(1/2)((x − µ)/σ)²};   M.G.F. e^{µt + t²σ²/2};
   mean µ;   variance σ²
16) Memoryless property of exponential distribution
P ( X > S + t / X > S ) = P ( X > t ) .
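A numeric check of this property (a sketch; the constants are arbitrary), using the survival function P(X > x) = e^{−λx} of the exponential distribution:

```python
import math

lam, s, t = 0.7, 2.0, 1.5              # arbitrary illustrative constants

def survival(x):
    # P(X > x) for X ~ Exponential(lam)
    return math.exp(-lam * x)

lhs = survival(s + t) / survival(s)    # P(X > s + t | X > s)
rhs = survival(t)                      # P(X > t)
```

The equality holds for every choice of s and t, because e^{−λ(s+t)}/e^{−λs} = e^{−λt}.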
17) Function of a random variable:  f_Y(y) = f_X(x) |dx/dy|
1) Σᵢ Σⱼ pᵢⱼ = 1  (discrete random variable);
   ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1  (continuous random variable)
2) Conditional probability function of X given Y:  P{X = xᵢ / Y = yⱼ} = P(x, y) / P(y).
   Conditional probability function of Y given X:  P{Y = yⱼ / X = xᵢ} = P(x, y) / P(x).
3) Conditional density function of X given Y:  f(x / y) = f(x, y) / f(y).
   Conditional density function of Y given X:  f(y / x) = f(x, y) / f(x).
4) If X and Y are independent random variables then
f ( x , y ) = f ( x ). f ( y ) (for continuous random variable)
P ( X = x , Y = y ) = P ( X = x ) . P (Y = y ) (for discrete random variable)
5) Joint probability:  P(a ≤ X ≤ b, c ≤ Y ≤ d) = ∫_c^d ∫_a^b f(x, y) dx dy.
6) Marginal density function of X:  f(x) = f_X(x) = ∫_{−∞}^{∞} f(x, y) dy
   Marginal density function of Y:  f(y) = f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx
7) P ( X + Y ≥ 1) = 1 − P ( X + Y < 1)
8) Correlation coefficient (discrete):  ρ(x, y) = Cov(X, Y) / (σ_X σ_Y), where
   Cov(X, Y) = (1/n) Σ XY − X̄ Ȳ,   σ_X = √((1/n) Σ X² − X̄²),   σ_Y = √((1/n) Σ Y² − Ȳ²)
9) Correlation coefficient (continuous):  ρ(x, y) = Cov(X, Y) / (σ_X σ_Y)
   Regression line Y on X:  y − ȳ = b_yx (x − x̄),   b_yx = Σ(x − x̄)(y − ȳ) / Σ(x − x̄)²
13) Regression for Continuous random variable:
σx
Regression line X on Y is x − E ( x ) = bxy ( y − E ( y ) ) , bxy = r
σy
σy
Regression line Y on X is y − E ( y ) = b yx ( x − E ( x ) ) , b yx = r
σx
   Regression curve X on Y:  x = E(x / y) = ∫_{−∞}^{∞} x f(x / y) dx
   Regression curve Y on X:  y = E(y / x) = ∫_{−∞}^{∞} y f(y / x) dy
14) f_Y(y) = f_X(x) |dx/dy|  (one dimensional random variable)
15) f_UV(u, v) = f_XY(x, y) |J|  (two dimensional random variable), where J is the Jacobian
        J = ∂(x, y)/∂(u, v) = | ∂x/∂u  ∂x/∂v |
                              | ∂y/∂u  ∂y/∂v |
16) Central limit theorem (Lindeberg–Levy form)
If X1, X2, …, Xn is a sequence of independent identically distributed R.V.s with E[Xᵢ] = µ
and Var(Xᵢ) = σ², i = 1, 2, …, n, and if Sn = X1 + X2 + … + Xn, then under certain
general conditions, Sn follows a normal distribution with mean nµ and variance
nσ² as n → ∞.
Note:  z = (Sn − nµ)/(σ√n)  (for the sum of n variables),   z = (X̄ − µ)/(σ/√n)  (for the sample mean)
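A simulation sketch of the theorem (illustrative; Uniform(0, 1) summands are an assumption): the standardized sum z = (Sn − nµ)/(σ√n) should have mean ≈ 0 and variance ≈ 1.

```python
import math
import random

rng = random.Random(7)
n, trials = 1000, 2000
mu, var = 0.5, 1.0 / 12.0              # mean and variance of Uniform(0, 1)
z = []
for _ in range(trials):
    s = sum(rng.random() for _ in range(n))
    z.append((s - n * mu) / math.sqrt(n * var))

z_mean = sum(z) / trials                             # should be near 0
z_var = sum((v - z_mean) ** 2 for v in z) / trials   # should be near 1
```

A histogram of z would also look approximately standard normal, which is the content of the theorem.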
UNIT-III (MARKOV PROCESSES AND MARKOV CHAINS)
1) Random Process:
A random process is a collection of random variables {X(s,t)} that are
functions of a real variable, namely time ‘t’ where s Є S and t Є T.
2) Classification of Random Processes:
We can classify the random process according to the characteristics of time t
and the random variable X. We shall consider only four cases based on t and X
having values in the ranges -∞< t <∞ and -∞ < x < ∞.
Continuous random process
Continuous random sequence
Discrete random process
Discrete random sequence
RXX (τ ) - Auto correlation function
S XX ( ω ) - Power spectral density (or) Spectral density
RXY (τ ) - Cross correlation function
S XY ( ω ) - Cross power spectral density
S_XX(ω) = ∫_{−∞}^{∞} R_XX(τ) e^{−iωτ} dτ
C_XX(τ) = R_XX(τ) − E[X(t)] E[X(t + τ)]
R_XY(τ) = (1/(2π)) ∫_{−∞}^{∞} S_XY(ω) e^{iωτ} dω
5) General formulas:
i) ∫ e^{ax} cos bx dx = e^{ax} (a cos bx + b sin bx) / (a² + b²)
ii) ∫ e^{ax} sin bx dx = e^{ax} (a sin bx − b cos bx) / (a² + b²)
iii) x² + ax = (x + a/2)² − a²/4
iv) sin θ = (e^{iθ} − e^{−iθ}) / (2i)
v) cos θ = (e^{iθ} + e^{−iθ}) / 2
UNIT-V (LINEAR SYSTEMS WITH RANDOM INPUTS)
1) Linear system:
f is called a linear system if it satisfies
    f[a₁X₁(t) ± a₂X₂(t)] = a₁ f[X₁(t)] ± a₂ f[X₂(t)]
2) Time invariant system: if Y(t + h) = f[X(t + h)] for every h, then f is called a time
invariant system.
3) Relation between input X ( t ) and output Y ( t ) :
∞
Y (t ) = ∫ h(u) X (t − u) du
−∞
Where h( u) system weighting function.
4) Relation between power spectrum of X(t) and output Y(t):
    S_YY(ω) = S_XX(ω) |H(ω)|²
If H(ω) is not given, use the formula H(ω) = ∫_{−∞}^{∞} e^{−jωt} h(t) dt.
5) Contour integral:
    ∫_{−∞}^{∞} e^{imx} / (a² + x²) dx = (π/a) e^{−ma}  (one useful result)
6) F^{−1}[1/(a² + ω²)] = e^{−a|τ|} / (2a)  (from the Fourier transform)
PART B QUESTIONS
1. A random variable X has the following probability distribution:
       X    :  0   1    2    3    4    5     6      7
       P(X) :  0   k   2k   2k   3k   k²   2k²   7k² + k
Find (i) the value of k, (ii) P[1.5 < X < 4.5 / X > 2] and (iii) the smallest value of λ
for which P(X ≤ λ) > 1/2.
2. A bag contains 5 balls and it is not known how many of them are white. Two balls
are drawn at random from the bag and they are noted to be white. What is the
chance that all the balls in the bag are white?
3. Let the random variable X have the PDF f(x) = (1/2) e^{−x/2}, x > 0. Find the moment
generating function, mean and variance.
4. A die is tossed until 6 appears. What is the probability that it must be tossed
more than 4 times?
5. A man draws 3 balls from an urn containing 5 white and 7 black balls. He
gets Rs. 10 for each white ball and Rs 5 for each black ball. Find his
expectation.
6. In a certain binary communication channel, owing to noise, the probability
that a transmitted zero is received as zero is 0.95 and the probability that a
transmitted one is received as one is 0.9. If the probability that a zero is
transmitted is 0.4, find the probability that (i) a one is received (ii) a one was
transmitted given that one was received
7. Find the MGF and rth moment for the distribution whose PDF is f(x) = k e –x ,
x >0. Find also standard deviation.
8. The first bag contains 3 white balls, 2 red balls and 4 black balls. Second
bag contains 2 white, 3 red and 5 black balls and third bag contains 3
white, 4 red and 2 black balls. One bag is chosen at random and from it 3
balls are drawn. Out of 3 balls, 2 balls are white and 1 is red. What are the
probabilities that they were taken from first bag, second bag and third bag.
9. A random variable X has the PDF f(x) = 2x, 0 < x < 1 find (i) P (X < ½) (ii) P (
¼ < X < ½) (iii) P ( X > ¾ / X > ½ )
10. If the density function of a continuous random variable X is given by
ax 0≤x≤1
a 1≤x≤2
f(x) = 3a – ax 2≤x≤3
0 otherwise
14. Find the moment generating function of the geometric random variable
with the pdf f(x) = p q x-1, x = 1,2,3.. and hence find its mean and variance.
15. A box contains 5 red and 4 white balls. A ball from the box is taken our at
random and kept outside. If once again a ball is drawn from the box, what
is the probability that the drawn ball is red?
16. A discrete random variable X has moment generating function
M_X(t) = ((1/4) eᵗ + 3/4)⁵. Find E(X), Var(X) and P(X = 2).
17. The pdf of the samples of the amplitude of a speech waveform is found to
decay exponentially at rate α, so the following pdf is proposed:
f(x) = C e^{−α|x|}, −∞ < x < ∞. Find C and E(X).
18. Find the MGF of a binomial distribution and hence find the mean and
variance.
Find the recurrence relation of central moments for a binomial distribution.
22. If X and Y are two independent poisson random variable, then show that
probability distribution of X given X+Y follows binomial distribution.
23. Find MGF and hence find mean and variance of a geometric distribution.
2. The joint p.d.f of a two dimensional random variable (X, Y) is given by f(x, y) =
(8/9) xy, 1 ≤ x ≤ y ≤ 2. Find the marginal density functions of X and Y. Find also the
conditional density functions of Y / X = x and X / Y = y.
5. Two random variables have the joint p.d.f. f(x1, x2) = (2/3)(x1 + 2x2), 0 < x1 < 1,
0 < x2 < 1.
6. Find the value of k, if f(x,y) = k xy for 0 < x,y < 1 is to be a joint density function.
Find P(X + Y < 1 ) . Are X and Y independent.
7. If two random variables have the joint p.d.f. f(x, y) = (6/5)(x + y²), 0 < x < 1, 0 < y
< 1, find P(0.2 < X < 0.5) and P(0.4 < Y < 0.6).
8. Two random variables X and Y have the p.d.f f(x, y) = x² + xy/3, 0 ≤ x ≤ 1,
0 ≤ y ≤ 2. Prove that X and Y are not independent. Find the conditional density
function.
9. X and Y are two random variables with joint p.d.f. f(x, y) = 4xy e^{−(x² + y²)}, x, y ≥ 0.
Find the p.d.f. of X² + Y².
10. Two random variables X and Y have joint p.d.f f(x, y) = 2 − x − y, 0 < x < 1, 0 < y < 1.
Find the marginal probability density functions of X and Y. Also find the conditional
density functions and the covariance between X and Y.
11. Let X and Y be two random variables each taking three values −1, 0 and
1 and having the joint p.d.f.
          X:   −1    0    1
    Y = −1:    0   0.1  0.1
    Y =  0:   0.2  0.2  0.2
    Y =  1:    0   0.1  0.1
Prove that X and Y have different expectations. Also prove that X and Y are
uncorrelated, and find Var X and Var Y.
12. 20 dice are thrown. Find the approximate probability tat the sum obtained is
between 65 and 75 using central limit theorem.
13. Examine whether the variables X and Y are independent, whose joint density is
f(x, y) = x e^{−xy − x}, 0 < x, y < ∞.
14. Let X and Y be independent standard normal random variables. Find the pdf of
z =X / Y.
15. Let X and Y be independent uniform random variables over (0,1) . Find the
PDF of Z = X + Y
A random process {X(t)} has the probability distribution
    P[X(t) = n] = (at)^{n−1} / (1 + at)^{n+1},  n = 1, 2, …
                = at / (1 + at),  n = 0
Show that it is not stationary.
( V2 ) = 1 (1) Find the auto covariance function of X (t) (2) IS X (t) wide sense
stationary? Explain your answer.
6. Discuss the pure birth process and hence obtain its probabilities, mean and
variance.
7. At the receiver of an AM radio, the received signal contains a cosine carrier
signal at the carrier frequency ω₀ with a random phase θ that is uniformly distributed
over (0, 2π). The received carrier signal is X(t) = A cos(ω₀t + θ). Show that the
process is second order stationary.
8. Assume that a computer system is in any one of the three states busy, idle and
under repair, respectively denoted by 0, 1, 2. Observing its state at 2 pm each day,
we get the transition probability matrix
        P = | 0.6  0.2  0.2 |
            | 0.1  0.8  0.1 |
            | 0.6  0    0.4 |
Find out the 3rd step transition probability matrix. Determine the limiting probabilities.
9. Given a random variable Ω with density f(ω), and another random variable θ
uniformly distributed in (−π, π) and independent of Ω, and X(t) = a cos(Ωt + θ),
prove that {X(t)} is a WSS process.
10. A man either drives a car or catches a train to go to office each day. He never
goes 2 days in a row by train but if he drives one day, then the next day he is just
as likely to drive again as he is to travel by train. Now suppose that on the first
day of the week, the man tossed a fair die and drove to work iff a 6 appeared.
Find (1) the probability that he takes a train on the 3 rd day. (2) the probability that
he drives to work in the long run.
11. Show that the process X (t) = A cost + B sin t (where A and B are random
variables) is wide sense stationary, if (1) E (A ) = E(B) = 0 (2) E(A2) = E (B2 ) and
E(AB) = 0
12. Find probability distribution of Poisson process.
13. Prove that sum of two Poisson processes is again a Poisson process.
14. Write classifications of random processes with example
Unit 4
Correlation and Spectrum Densities
B.E./B.Tech. DEGREE EXAMINATION, APRIL/MAY 2010
Fourth Semester
(Regulation 2008)
1. If the p.d.f of a random variable X is f(x) = x/2 in 0 ≤ x ≤ 2, find P(X > 1.5 / X > 1).
4. A random variable X has mean 10 and variance 16. Find the lower bound for P (5 < X < 15) .
7. Find the mean of the stationary process {x(t)}, whose autocorrelation function is given by
    R(τ) = 16 + 9/(1 + 16τ²).
8. Find the power spectral density function of the stationary process whose autocorrelation
function is given by e^{−|τ|}.
11. (a) (i) The probability mass function of random variable X is defined as P ( X = 0) = 3C 2 ,
r ≠ 0,1, 2 . Find
1
(4) The largest value of X for which F ( x ) < .
2
(ii) If the probability that an applicant for a driver’s license will pass the road test on any
given trial is 0.8. What is the probability that he will finally pass the test
Or
(b) (i) Find the MGF of the two parameter exponential distribution whose density function is
(ii) The marks obtained by a number of students in a certain subject are assumed to be
normally distributed with mean 65 and standard deviation 5. If 3 students are selected at
random from this group, what is the probability that two of them will have marks over 70?
12. (a) (i) Find the bivariate probability distribution of (X,Y) given below:
Y 1 2 3 4 5 6
(ii) Find the covariance of X and Y, if the random variable (X,Y) has the joint p.d.f
f ( x , y ) = x + y , 0 ≤ x ≤ 1, 0 ≤ y ≤ 1 and f ( x , y ) = 0 , otherwise.
Or
8
(b) (i) The joint p.d.f of two dimensional random variable (X,Y) is given by f ( x , y ) = xy ,
9
(ii) A sample of size 100 is taken from a population whose mean is 60 and variance is 400.
Using Central Limit Theorem, find the probability with which the mean of the sample will
13. (a) (i) Examine whether the random process { X ( t )} = A cos(ω t + θ ) is a wide sense
stationary if A and ω are constants and θ is uniformly distributed random variable in (0,2π).
(ii) Assume that the number of messages input to a communication channel in an interval
of duration t seconds, is a Poisson process with mean λ = 0.3 . Compute
(1) The probability that exactly 3 messages will arrive during 10 second interval
(2) The probability that the number of message arrivals in an interval of duration 5
seconds is between 3 and 7.
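The two requested probabilities follow directly from the Poisson pmf with means λt = 3 and λt = 1.5 respectively. A minimal numerical sketch, assuming "between 3 and 7" is read as inclusive:

```python
from math import exp, factorial

def poisson_pmf(k, mean):
    # P(N = k) for a Poisson count with the given mean
    return exp(-mean) * mean ** k / factorial(k)

lam = 0.3  # arrival rate, messages per second (from the question)

# (1) exactly 3 messages in a 10-second interval: mean = lam * 10 = 3
p_exactly_3 = poisson_pmf(3, lam * 10)

# (2) 3 to 7 messages (inclusive) in a 5-second interval: mean = lam * 5 = 1.5
p_3_to_7 = sum(poisson_pmf(k, lam * 5) for k in range(3, 8))

print(round(p_exactly_3, 4))  # ≈ 0.224
print(round(p_3_to_7, 4))     # ≈ 0.191
```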
Or
(b) (i) The random binary transmission process {X(t)} is a wide sense stationary process with zero mean and autocorrelation function R(τ) = 1 − |τ|/T, where T is a constant. Find the mean and variance of the time average of {X(t)} over (0, T). Is {X(t)} mean-ergodic?
(ii) The transition probability matrix of a Markov chain {X_n}, n = 1, 2, 3, ..., having three
14. (a) (i) Find the autocorrelation function of the periodic time function {X(t)} = A sin ωt.
R(τ) = 1 − |τ|/T for |τ| ≤ T and R(τ) = 0 for |τ| > T. Find the power spectrum of the process {X(t)}.
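The triangular autocorrelation R(τ) = 1 − |τ|/T transforms to the familiar sinc-squared spectrum S(ω) = 4 sin²(ωT/2)/(ω²T). A numerical cosine-transform check, with T = 2 assumed for illustration:

```python
from math import cos, sin

T = 2.0  # assumed value of the constant T

def S(w, n=50000):
    # S(w) = 2 * Int_0^T (1 - tau/T) cos(w*tau) dtau  (even integrand),
    # evaluated by a midpoint Riemann sum
    dt = T / n
    return 2 * sum((1 - ((k + 0.5) * dt) / T) * cos(w * (k + 0.5) * dt) * dt
                   for k in range(n))

w = 1.3
closed = (4 / (w * w * T)) * sin(w * T / 2) ** 2
print(round(S(w), 4), round(closed, 4))  # the two agree
```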
Or
(b) (i) {X(t)} and {Y(t)} are zero mean and stochastically independent random processes having autocorrelation functions R_XX(τ) = e^(−|τ|) and R_YY(τ) = cos 2πτ respectively. Find
(ii) Find the autocorrelation function of the process { X ( t )} for which the power spectral
15. (a) (i) A wide sense stationary random process {X(t)} with autocorrelation R_XX(τ) = A e^(−a|τ|), where A and a are real positive constants, is applied to the input of a linear time-invariant system with impulse response h(t) = e^(−bt) u(t), where b is a real positive constant. Find the autocorrelation of the output Y(t) of the system.
(ii) If {X(t)} is a band-limited process such that S_XX(ω) = 0 when |ω| > σ, prove that
Or
(b) (i) Assume a random process X(t) is given as input to a system with transfer function H(ω) = 1 for −ω₀ < ω < ω₀. If the autocorrelation function of the input process is (N₀/2) δ(τ), find the autocorrelation function of the output process.
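For white noise of PSD N₀/2 through this ideal low-pass filter, the output autocorrelation works out to R_YY(τ) = (N₀/2π) sin(ω₀τ)/τ. A numerical inverse-transform sketch, assuming N₀ = 2 and ω₀ = 5 for concreteness:

```python
from math import cos, sin, pi

N0, w0 = 2.0, 5.0  # assumed values for the check

def Ryy(tau, n=100000):
    # inverse Fourier transform of S_YY(w) = N0/2 on (-w0, w0):
    # R_YY(tau) = (1/2pi) * Int_{-w0}^{w0} (N0/2) cos(w*tau) dw
    dw = 2 * w0 / n
    return sum((N0 / 2) * cos((-w0 + (k + 0.5) * dw) * tau) * dw
               for k in range(n)) / (2 * pi)

tau = 0.7
closed = (N0 / (2 * pi)) * sin(w0 * tau) / tau
print(round(Ryy(tau), 4), round(closed, 4))  # the two agree
```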
Electronics and Communication Engineering
(Common to Bio-Medical Engineering)
(Regulation 2008)
Time : Three hours Maximum : 100 marks
Answer ALL Questions
PART A – (10 x 2 = 20 Marks)
1. The CDF of a continuous random variable is given by F(x) = 0 for x < 0, and F(x) = 1 − e^(−x/5) for 0 ≤ x < ∞. Find the PDF and mean of X.
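Differentiating the CDF gives the PDF f(x) = (1/5) e^(−x/5), an exponential density with mean 5. A minimal numerical sketch of both facts:

```python
from math import exp

def F(x):
    # CDF: F(x) = 1 - e^(-x/5) for x >= 0, and 0 for x < 0
    return 1 - exp(-x / 5) if x >= 0 else 0.0

# PDF by central difference: f(x) = F'(x) = (1/5) e^(-x/5)
h = 1e-6
f_at_2 = (F(2 + h) - F(2 - h)) / (2 * h)
print(round(f_at_2, 4))  # matches (1/5) e^(-2/5) ≈ 0.1341

# mean = Int_0^inf x f(x) dx, approximated by a Riemann sum up to x = 200
dx = 0.001
mean = sum(x * (1 / 5) * exp(-x / 5) * dx
           for x in (i * dx for i in range(1, 200000)))
print(round(mean, 2))    # ≈ 5.0
```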
2. Establish the memoryless property of the exponential distribution.
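The memoryless property states P(X > s + t | X > s) = P(X > t) for an exponential X. It can be checked numerically for any rate; a sketch assuming rate 0.2, s = 3, t = 4:

```python
from math import exp, isclose

def surv(x, rate=0.2):
    # survival function of Exp(rate): P(X > x) = e^(-rate * x)
    return exp(-rate * x)

s, t = 3.0, 4.0
lhs = surv(s + t) / surv(s)  # P(X > s+t | X > s)
rhs = surv(t)                # P(X > t)
print(isclose(lhs, rhs))     # the exponential "forgets" the elapsed time s
```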
4. Find the acute angle between the two lines of regression.
5. Prove that a first order stationary process has a constant mean.
6. State the postulates of a Poisson process.
7. The autocorrelation function of a stationary random process is R(τ) = 16 + 9/(1 + 16τ²). Find the mean and variance of the process.
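The standard route: (mean)² = lim_{τ→∞} R(τ) = 16, so the mean is 4; E[X²] = R(0) = 25, so the variance is 25 − 16 = 9. A sketch checking this:

```python
def R(tau):
    # autocorrelation from the question: R(tau) = 16 + 9 / (1 + 16 tau^2)
    return 16 + 9 / (1 + 16 * tau ** 2)

mean_sq = R(1e6)       # lim R(tau) as tau -> inf gives (mean)^2
second_moment = R(0)   # E[X^2] = R(0)
variance = second_moment - mean_sq
print(round(mean_sq ** 0.5, 3), round(variance, 3))  # 4.0 and 9.0
```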
9. Find the system transfer function, if a linear time-invariant system has impulse response h(t) = 1/(2c) for |t| ≤ c and h(t) = 0 for |t| > c.
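For this rectangular impulse response the transfer function works out to H(ω) = sin(ωc)/(ωc). A numerical sketch of the Fourier integral, with c = 2 assumed for illustration:

```python
from cmath import exp as cexp
from math import sin

c, omega = 2.0, 1.3  # assumed values for the check

# H(omega) = Int_{-c}^{c} (1/(2c)) e^{-j omega t} dt, by midpoint Riemann sum
N = 20000
dt = 2 * c / N
H = sum((1 / (2 * c)) * cexp(-1j * omega * (-c + (k + 0.5) * dt)) * dt
        for k in range(N))

print(round(H.real, 4))                      # matches sin(omega*c)/(omega*c)
print(round(sin(omega * c) / (omega * c), 4))
```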
10. Define white noise.
PART B – (5 x 16 = 80 marks)
f_X(x) = x for 0 < x < 1, k(2 − x) for 1 ≤ x ≤ 2, and 0 otherwise.
Or
(b) (i) Derive the m.g.f of Poisson distribution and hence or otherwise deduce its mean and
variance.
(ii) The marks obtained by a number of students in a certain subject are assumed to be
normally distributed with mean 65 and standard deviation 5. If 3 students are selected at
random from this set, what is the probability that exactly 2 of them will have marks over 70?
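Here P(one student scores over 70) = P(Z > (70 − 65)/5) = P(Z > 1), and the count of such students among 3 is binomial. A numerical sketch of the two-step calculation:

```python
from math import erf, sqrt, comb

def phi(z):
    # standard normal CDF via the error function
    return 0.5 * (1 + erf(z / sqrt(2)))

p = 1 - phi((70 - 65) / 5)            # P(a single mark exceeds 70) = P(Z > 1)
p_two_of_three = comb(3, 2) * p ** 2 * (1 - p)  # binomial: exactly 2 of 3

print(round(p, 4))               # ≈ 0.1587
print(round(p_two_of_three, 4))  # ≈ 0.0635
```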
correlation coefficient.
Or
(ii) The life time of a particular variety of electric bulb may be considered as a random
variable with mean 1200 hours and standard deviation 250 hours. Using central limit
theorem, find the probability that the average life time of 60 bulbs exceeds 1250 hours.
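By the CLT the sample mean is approximately normal with mean 1200 and standard deviation 250/√60, so the required probability is P(Z > 50/(250/√60)). A numerical sketch:

```python
from math import erf, sqrt

mu, sigma, n = 1200, 250, 60
z = (1250 - mu) / (sigma / sqrt(n))    # standardize the sample mean
p = 1 - 0.5 * (1 + erf(z / sqrt(2)))   # P(sample mean > 1250)

print(round(z, 3), round(p, 4))        # z ≈ 1.549, p ≈ 0.0607
```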
stationary.
Or
probability that
(1) X (10) ≤ 8
(ii) Prove that the interval between two successive occurrences of a Poisson process with parameter λ has an exponential distribution with mean 1/λ.
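The result can also be seen by simulation: conditioned on the count, Poisson occurrence times in (0, T) are iid uniform, and the sorted gaps behave like Exp(λ) samples. A seeded sketch, with λ = 2 assumed:

```python
import random
random.seed(7)

lam, T = 2.0, 10000.0
n = int(lam * T)  # expected number of occurrences in (0, T)

# Conditioned on the count, occurrence times are iid Uniform(0, T);
# sorting and differencing gives the interarrival intervals.
times = sorted(random.uniform(0, T) for _ in range(n))
gaps = [b - a for a, b in zip(times, times[1:])]

mean_gap = sum(gaps) / len(gaps)
x = 0.5
emp_survival = sum(g > x for g in gaps) / len(gaps)

print(round(mean_gap, 3))      # ≈ 1/lam = 0.5
print(round(emp_survival, 3))  # ≈ e^(-lam*x) = e^(-1) ≈ 0.368
```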
the power spectral density of the process.
Or
(b) (i) State and prove the Wiener–Khintchine theorem.
15. (a) (i) Consider a system with transfer function 1/(1 + jω). An input signal with autocorrelation function m δ(τ) + m² is fed as input to the system. Find the mean and mean-square value of the output.
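The input PSD splits into a white part m and a DC impulse 2πm²δ(ω); since |H(ω)|² = 1/(1 + ω²) and H(0) = 1, the output mean is m and the mean-square value is m/2 + m². A numerical sketch, with m = 4 assumed for concreteness:

```python
from math import pi

m = 4.0  # assumed value of the constant m

def noise_power(W=1000.0, n=200000):
    # (1/2pi) * Int |H(w)|^2 * m dw over (-W, W), |H(w)|^2 = 1/(1 + w^2)
    dw = 2 * W / n
    return sum(m / (1 + (-W + (k + 0.5) * dw) ** 2) * dw
               for k in range(n)) / (2 * pi)

mean_out = m * 1.0                     # E[Y] = H(0) * E[X] = m
mean_square = noise_power() + m ** 2   # R_YY(0) = m/2 + m^2
print(mean_out, round(mean_square, 2)) # 4.0 and ≈ 18.0
```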
(ii) A stationary random process X(t), whose autocorrelation function is an impulse function, is applied to a linear system with impulse response h(t) = e^(−bt) u(t), where b is a constant, producing the output Y(t).
Or
(b) (i) A linear system is described by the impulse response h(t) = (1/RC) e^(−t/RC) u(t). Assume an input process whose autocorrelation function is B δ(τ). Find the mean and autocorrelation function of the output process.
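For the white-noise input the output autocorrelation works out to R_YY(τ) = (B/2RC) e^(−|τ|/RC), and the output mean is H(0) times the (zero) input mean. A numerical sketch of the defining integral, with RC = 0.5 and B = 3 assumed:

```python
from math import exp

RC, B = 0.5, 3.0  # assumed values for the check

def h(t):
    # impulse response h(t) = (1/RC) e^(-t/RC) u(t)
    return (1 / RC) * exp(-t / RC) if t >= 0 else 0.0

def Ryy(tau, du=2e-4, upper=10.0):
    # white-noise input R_XX = B*delta  =>  R_YY(tau) = B * Int h(u) h(u+tau) du
    n = int(upper / du)
    return B * sum(h((k + 0.5) * du) * h((k + 0.5) * du + tau) * du
                   for k in range(n))

tau = 0.3
closed_form = (B / (2 * RC)) * exp(-abs(tau) / RC)
print(round(Ryy(tau), 4), round(closed_form, 4))  # the two agree
```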
S_NN(ω) = N₀/2 for |ω − ω₀| < ω_B, and 0 elsewhere. Find the autocorrelation of {N(t)}.