UNIT IV
CORRELATION AND SPECTRAL DENSITIES
26 Introduction
27 Auto correlation functions
28 Properties
29 Cross correlation functions
30 Properties
31 Power spectral density
32 Properties
33 Cross spectral density
34 Properties
UNIT V
LINEAR SYSTEMS WITH RANDOM INPUTS
35 Introduction
36 Linear time invariant systems
37 Problems
38 Linear systems with random inputs
39 Auto correlation and cross correlation functions of inputs and outputs
40 System transfer function
41 Problems
OBJECTIVES: To provide the necessary basic concepts in probability and random processes for applications such as random signals and linear systems in communication engineering.
UNIT I RANDOM VARIABLES 9+3 Discrete and continuous random variables – Moments –
Moment generating functions – Binomial, Poisson, Geometric, Uniform, Exponential, Gamma
and Normal distributions.
UNIT II TWO - DIMENSIONAL RANDOM VARIABLES 9+3 Joint distributions – Marginal and
conditional distributions – Covariance – Correlation and Linear regression – Transformation of
random variables.
UNIT III RANDOM PROCESSES 9+3 Classification – Stationary process – Markov process -
Poisson process – Random telegraph process.
UNIT IV CORRELATION AND SPECTRAL DENSITIES 9+3 Auto correlation functions – Cross
correlation functions – Properties – Power spectral density – Cross spectral density –
Properties.
UNIT V LINEAR SYSTEMS WITH RANDOM INPUTS 9+3
Linear time invariant system – System transfer function – Linear systems with random inputs –
Auto correlation and Cross correlation functions of input and output.
TOTAL (L:45+T:15): 60 PERIODS
OUTCOMES:
The students will have an exposure to various distribution functions and will acquire skills in handling situations involving more than one random variable. They will be able to analyze the response of linear time invariant systems to random inputs.
TEXT BOOKS:
1. Ibe, O.C., "Fundamentals of Applied Probability and Random Processes", Elsevier, 1st Indian Reprint, 2007.
2. Peebles, P.Z., "Probability, Random Variables and Random Signal Principles", Tata McGraw Hill, 4th Edition, New Delhi, 2002.
REFERENCES:
1. Yates, R.D. and Goodman, D.J., "Probability and Stochastic Processes", 2nd Edition, Wiley India Pvt. Ltd., Bangalore, 2012.
2. Stark, H. and Woods, J.W., "Probability and Random Processes with Applications to Signal Processing", 3rd Edition, Pearson Education, Asia, 2002.
3. Miller, S.L. and Childers, D.G., "Probability and Random Processes with Applications to Signal Processing and Communications", Academic Press, 2004.
4. Hwei Hsu, "Schaum's Outline of Theory and Problems of Probability, Random Variables and Random Processes", Tata McGraw Hill Edition, New Delhi, 2004.
5. Cooper, G.R. and McGillem, C.D., "Probabilistic Methods of Signal and System Analysis", 3rd Indian Edition, Oxford University Press, New Delhi, 2012.
UNIT - I
RANDOM VARIABLES
Introduction
A random variable X can be considered as a function that maps all elements of the sample space S into points on the real line. The notation X(s) = x means that x is the value associated with the outcome s by the random variable X.
SAMPLE SPACE
Consider an experiment of throwing a coin twice. The outcomes
S = {HH, HT, TH, TT} constitute the sample space.
RANDOM VARIABLE
In this sample space each of these outcomes can be associated with a number by specifying a rule of association. Such a rule of association is called a random variable.
Eg: Number of heads.
We denote random variables by capital letters (X, Y, etc.) and any particular value of the random variable by x or y.
S = {HH, HT, TH, TT}
X(S) = {2, 1, 1, 0}
Example
In the experiment of throwing a coin twice the sample space S is
S = {HH,HT,TH,TT}. Let X be a random variable chosen such that X(S) = x (the number of
heads).
Note
Any random variable whose only possible values are 0 and 1 is called a Bernoulli random
variable.
The function p(x) satisfying the above two conditions is called the probability mass function (or) probability distribution of the R.V. X. The probability distribution {xi, pi} can be displayed in the form of a table as shown below.
X = xi        : x1  x2  ...  xi  ...
P(X = xi) = pi : p1  p2  ...  pi  ...
Notation
Let S be a sample space. The set of all outcomes s in S such that X(s) = x is denoted by writing X = x.
P(X = x) = P{s : X(s) = x}
Similarly P(X ≤ a) = P{s : X(s) ∈ (-∞, a]}
and P(a < X ≤ b) = P{s : X(s) ∈ (a, b]}
P(X = a or X = b) = P{(X = a) ∪ (X = b)}
P(X = a and X = b) = P{(X = a) ∩ (X = b)}, and so on.
Theorem 2
If X is a random variable and f(·) is a continuous function, then f(X) is a random variable.
Note
If F(x) is the distribution function of a one dimensional random variable, then
I. 0 ≤ F(x) ≤ 1
II. If x < y, then F(x) ≤ F(y)
III. F(-∞) = lim_{x→-∞} F(x) = 0
Table 1
Values of X : 0   1   2   3   4   5    6    7    8
p(x)        : a  3a  5a  7a  9a  11a  13a  15a  17a
Since p(x) is a probability mass function, Σ_{i=0}^{8} p(x_i) = 1.
PROPERTIES OF P.D.F
The p.d.f. f(x) of a R.V. X has the following properties:
(i) f(x) ≥ 0, -∞ < x < ∞   (ii) ∫_{-∞}^{∞} f(x) dx = 1
Remark
1. In the case of a discrete R.V., the probability at a point, say x = c, need not be zero. But in the case of a continuous R.V. X, the probability at a point is always zero:
P(X = c) = ∫_c^c f(x) dx = 0
Moments of a continuous R.V. X with p.d.f. f(x) defined in (a, b):
(iv) Moments about the origin: μ'_r = ∫_a^b x^r f(x) dx
(v) Moments about any point A: ∫_a^b (x - A)^r f(x) dx
(vi) Moments about the mean: μ_r = ∫_a^b (x - mean)^r f(x) dx
(vii) Variance: μ_2 = ∫_a^b (x - mean)² f(x) dx
(viii) Mean deviation about the mean: M.D. = ∫_a^b |x - mean| f(x) dx
Mathematical Expectations
Def: Let X be a continuous random variable with probability density function f(x). Then the mathematical expectation of X is denoted by E(X) and is given by
E(X) = ∫_{-∞}^{∞} x f(x) dx
The rth moment about the origin is denoted by μ'_r:
μ'_r = ∫_{-∞}^{∞} x^r f(x) dx   ... (A)
Thus Mean = μ'_1 = ∫ x f(x) dx, and Variance = μ'_2 - (μ'_1)².
The rth moment about the mean is
μ_r = ∫_{-∞}^{∞} (x - X̄)^r f(x) dx   ... (B)
where μ_r = E[(X - E(X))^r] and, in particular, μ_1 = E[X - X̄] = 0.
Put r = 2 in (B), we get
μ_2 = ∫_{-∞}^{∞} (x - X̄)² f(x) dx = Variance.
Expectation of a constant: if K is a constant, then
E(K) = ∫ K f(x) dx = K ∫ f(x) dx = K·1 = K   [since ∫ f(x) dx = 1]
Thus E(K) = K, i.e. E[a constant] = the constant.
EXPECTATIONS (Discrete R.V.'s)
Let X be a discrete random variable with P.M.F. p(x). Then
E(X) = Σ_x x p(x)
For a discrete random variable X,
E(X^r) = Σ_x x^r p(x)   (by definition)
If we denote E(X^r) = μ'_r, then
μ'_r = E[X^r] = Σ_x x^r p(x)
Put r = 1, we get Mean = μ'_1 = Σ_x x p(x)
Put r = 2, we get μ'_2 = E[X²] = Σ_x x² p(x)
Variance = E(X²) - [E(X)]², i.e. μ_2 = μ'_2 - (μ'_1)²
The rth moment about the mean is
μ_r = E[{X - E(X)}^r] = Σ_x (x - X̄)^r p(x), where E(X) = X̄
Put r = 2, we get Variance = μ_2 = Σ_x (x - X̄)² p(x)
Theorem 3
If X is a random variable with p.d.f. f(x) and a is a constant, then
(i) E[a G(X)] = a E[G(X)]
(ii) E[G(X) + a] = E[G(X)] + a
where G(X) is a function of X which is also a random variable.
Theorem 4
If ‘X’ is a random variable with p.d.f. f(x) and ‘a’ and ‘b’ are constants, then
E[ax + b] = a E(X) + b
Cor 1:
If we take a = 1 and b = -E(X) = -X̄, then we get
E(X - X̄) = E(X) - E(X) = 0
Note
E(1/X) ≠ 1/E(X)
E[log X] ≠ log E(X)
E(X²) ≠ [E(X)]²
EXPECTATION OF A LINEAR COMBINATION OF RANDOM VARIABLES
Let X1, X2, ..., Xn be any n random variables and let a1, a2, ..., an be constants; then
E[a1X1 + a2X2 + ... + anXn] = a1E(X1) + a2E(X2) + ... + anE(Xn)
Result
If X is a random variable, then
Var(aX + b) = a² Var(X), where a and b are constants.
Covariance :
If X and Y are random variables, then covariance between them is defined as
Cov(X, Y) = E{[X - E(X)][Y - E(Y)]}
= E{XY - X E(Y) - E(X) Y + E(X)E(Y)}
Cov(X, Y) = E(XY) - E(X)·E(Y)   ... (A)
If X and Y are independent, then
E(XY) = E(X) E(Y)   ... (B)
Substituting (B) in (A), we get
Cov(X, Y) = 0
Note
(i) Cov(aX, bY) = ab Cov(X, Y)
(ii) Cov(X+a, Y+b) = Cov(X, Y)
(iii) Cov(aX+b, cY+d) = ac Cov(X, Y)
(iv) Var (X1 + X2) = Var(X1) + Var(X2) + 2 Cov(X1, X2 )
If X1 , X2 are independent
Var (X1+ X2) = Var(X1) + Var(X2)
EXPECTATION TABLE
Discrete R.V.'s:
1. E(X) = Σ x p(x)
2. E(X^r) = μ'_r = Σ x^r p(x)
3. Mean = μ'_1 = Σ x p(x)
4. μ'_2 = Σ x² p(x)
Continuous R.V.'s:
1. E(X) = ∫ x f(x) dx
2. E(X^r) = μ'_r = ∫ x^r f(x) dx
3. Mean = μ'_1 = ∫ x f(x) dx
4. μ'_2 = ∫ x² f(x) dx
Example 1: Find the mean and variance of the number shown when a fair die is thrown, i.e. x_i = 1, 2, ..., 6 with p(x_i) = 1/6.
Now
E(X) = Σ_{i=1}^{6} x_i p(x_i)
= x1 p(x1) + x2 p(x2) + x3 p(x3) + x4 p(x4) + x5 p(x5) + x6 p(x6)
= 1(1/6) + 2(1/6) + 3(1/6) + 4(1/6) + 5(1/6) + 6(1/6)
= 21/6 = 7/2   ... (1)
E(X²) = Σ_{i=1}^{6} x_i² p(x_i)
= 1(1/6) + 4(1/6) + 9(1/6) + 16(1/6) + 25(1/6) + 36(1/6)
= (1 + 4 + 9 + 16 + 25 + 36)/6 = 91/6   ... (2)
Variance(X) = Var(X) = E(X²) - [E(X)]²
= 91/6 - (7/2)² = 91/6 - 49/4 = 35/12
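These sums are easy to verify mechanically; a minimal Python sketch (not part of the original text):

```python
# Sanity check of the fair-die example: E(X) = 7/2, E(X^2) = 91/6, Var = 35/12
xs = [1, 2, 3, 4, 5, 6]
p = 1 / 6  # fair die: p(x) = 1/6 for each face

mean = sum(x * p for x in xs)               # E(X)
second_moment = sum(x * x * p for x in xs)  # E(X^2)
variance = second_moment - mean ** 2        # E(X^2) - [E(X)]^2

print(mean, second_moment, variance)  # 3.5  15.1666...  2.9166...
```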
Example 2
Find the value of (i) C and (ii) the mean of the following distribution:
f(x) = C(x - x²), 0 < x < 1; 0 otherwise.
Solution
Given f(x) = C(x - x²), 0 < x < 1   ... (1)
Since f(x) is a density, ∫ f(x) dx = 1, i.e.
∫_0^1 C(x - x²) dx = 1   [using (1), 0 < x < 1]
C[x²/2 - x³/3]_0^1 = 1
C(1/2 - 1/3) = 1
C(3 - 2)/6 = 1
C/6 = 1, so C = 6   ... (2)
Substituting (2) in (1): f(x) = 6(x - x²), 0 < x < 1   ... (3)
Mean = E(X) = ∫ x f(x) dx = ∫_0^1 (6x² - 6x³) dx = [6x³/3 - 6x⁴/4]_0^1 = 2 - 3/2 = 1/2
Thus Mean = 1/2 and C = 6.
To find the distribution function of X:
F(x) = ∫_{-∞}^x f(t) dt
For 0 < x < 1:
F(x) = ∫_{-∞}^0 f(t) dt + ∫_0^x f(t) dt = 0 + ∫_0^x 6t(1 - t) dt = 6[t²/2 - t³/3]_0^x = 3x² - 2x³
For x ≥ 1:
F(x) = 6 ∫_0^1 (t - t²) dt = 1
Using these results, we get
F(x) = 0, x ≤ 0;  3x² - 2x³, 0 < x < 1;  1, x ≥ 1.
Example 1.4.2
(i) Is the function defined as follows a density function?
f(x) = e^{-x}, x ≥ 0; 0, x < 0
(ii) If so, determine the probability that the variate having this density falls in the interval (1, 2).
Solution
Given f(x) = e^{-x}, x ≥ 0; 0, x < 0.
(a) In (0, ∞), e^{-x} is positive, so f(x) ≥ 0 in (0, ∞).
(b) ∫_{-∞}^{∞} f(x) dx = ∫_{-∞}^0 f(x) dx + ∫_0^∞ f(x) dx
= 0 + ∫_0^∞ e^{-x} dx
= [-e^{-x}]_0^∞ = 0 + 1 = 1
Hence f(x) is a p.d.f.
(ii) We know that
P(a ≤ X ≤ b) = ∫_a^b f(x) dx
P(1 ≤ X ≤ 2) = ∫_1^2 e^{-x} dx = [-e^{-x}]_1^2
= -e^{-2} + e^{-1} = -0.135 + 0.368 = 0.233
Example 1.4.3
A probability curve y = f(x) has a range from 0 to ∞. If f(x) = e^{-x}, find the mean, the variance, and the third moment about the mean.
Solution
Mean = ∫_0^∞ x f(x) dx = ∫_0^∞ x e^{-x} dx = [x(-e^{-x}) - e^{-x}]_0^∞ = 1
Mean = 1
Variance: μ_2 = ∫_0^∞ (x - Mean)² f(x) dx = ∫_0^∞ (x - 1)² e^{-x} dx = 1
μ_2 = 1
Third moment about the mean:
μ_3 = ∫_a^b (x - Mean)³ f(x) dx, here a = 0, b = ∞
μ_3 = ∫_0^∞ (x - 1)³ e^{-x} dx
= [-(x - 1)³ e^{-x} - 3(x - 1)² e^{-x} - 6(x - 1) e^{-x} - 6 e^{-x}]_0^∞   (integrating by parts repeatedly)
= -(-1 + 3 - 6 + 6)·(-1) = -1 + 3 - 6 + 6 = 2
μ_3 = 2
MOMENT GENERATING FUNCTION
Def: The moment generating function (MGF) of a random variable X (about the origin), whose probability function is f(x), is given by
M_X(t) = E[e^{tX}]
= ∫_{-∞}^{∞} e^{tx} f(x) dx, for a continuous probability function
= Σ_x e^{tx} p(x), for a discrete probability function
where t is a real parameter and the integration or summation extends over the entire range of x.
Example 1.5.1
Prove that the rth moment of the R.V. X about the origin is the coefficient of t^r/r! in M_X(t), i.e. M_X(t) = Σ_{r=0}^{∞} (t^r/r!) μ'_r.
Proof
WKT M_X(t) = E(e^{tX})
= E[1 + tX/1! + (tX)²/2! + (tX)³/3! + ... + (tX)^r/r! + ...]
= E[1] + t E(X) + (t²/2!) E(X²) + ... + (t^r/r!) E(X^r) + ...
M_X(t) = 1 + t μ'_1 + (t²/2!) μ'_2 + (t³/3!) μ'_3 + ... + (t^r/r!) μ'_r + ...   [using μ'_r = E(X^r)]
Thus the rth moment μ'_r = coefficient of t^r/r! in M_X(t).
Note
1. The above result gives the MGF in terms of the moments.
2. Since M_X(t) generates moments, it is known as the moment generating function.
Example 1.5.2
Find μ'_1 and μ'_2 from M_X(t).
Proof
WKT M_X(t) = Σ_{r=0}^{∞} (t^r/r!) μ'_r
M_X(t) = μ'_0 + (t/1!) μ'_1 + (t²/2!) μ'_2 + ... + (t^r/r!) μ'_r + ...   ... (A)
Differentiating (A) w.r.t. t, we get
M'_X(t) = μ'_1 + (2t/2!) μ'_2 + (3t²/3!) μ'_3 + ...   ... (B)
Put t = 0 in (B), we get
M'_X(0) = μ'_1 = Mean
Mean = μ'_1 = [d/dt M_X(t)]_{t=0}
Similarly, μ'_2 = [d²/dt² M_X(t)]_{t=0}; in general, μ'_r = [d^r/dt^r M_X(t)]_{t=0}.
MGF about a point a:
M_X(t) (about X = a) = E[e^{t(X - a)}]
= E[1 + t(X - a) + (t²/2!)(X - a)² + ... + (t^r/r!)(X - a)^r + ...]
= 1 + t E(X - a) + (t²/2!) E(X - a)² + ... + (t^r/r!) E(X - a)^r + ...
= 1 + t μ'_1 + (t²/2!) μ'_2 + ... + (t^r/r!) μ'_r + ...
where μ'_r = E[(X - a)^r] is the rth moment about the point a.
Result:
M_{cX}(t) = E[e^{tcX}]   ... (1)
M_X(ct) = E[e^{ctX}]   ... (2)
From (1) & (2) we get M_{cX}(t) = M_X(ct).
Example 1.5.4
If X1, X2, ..., Xn are independent random variables, then prove that M_{X1+X2+...+Xn}(t) = M_{X1}(t) M_{X2}(t) ... M_{Xn}(t).
Proof
M_{X1+X2+...+Xn}(t) = E[e^{t(X1 + X2 + ... + Xn)}]
= E[e^{tX1}·e^{tX2}·...·e^{tXn}]
= E(e^{tX1})·E(e^{tX2})·...·E(e^{tXn})   [X1, X2, ..., Xn are independent]
= M_{X1}(t) M_{X2}(t) ... M_{Xn}(t)
Example 1.5.5
Prove that M_{(X-a)/h}(t) = e^{-at/h} M_X(t/h), where a and h are constants and M_X(t) is the MGF about the origin.
Proof
M_{(X-a)/h}(t) = E[e^{t(X-a)/h}]
= E[e^{tX/h} · e^{-ta/h}]
= e^{-ta/h} E[e^{tX/h}]   [by definition]
= e^{-ta/h} M_X(t/h)
Hence M_{(X-a)/h}(t) = e^{-at/h} M_X(t/h).
Example 1.5.6
Find the MGF for the distribution where
f(x) = 2/3 at x = 1; 1/3 at x = 2; 0 otherwise.
Solution
Given f(1) = 2/3, f(2) = 1/3, f(3) = f(4) = ... = 0.
The MGF of a R.V. X is given by
M_X(t) = E[e^{tX}] = Σ_x e^{tx} f(x)
= e^t f(1) + e^{2t} f(2)
= (2/3)e^t + (1/3)e^{2t}
Thus M_X(t) = (2e^t + e^{2t})/3.
Example 1.6.1
Find the Moment Generating Function (MGF) of the binomial distribution about the origin.
Solution
WKT M_X(t) = Σ_{x=0}^{n} e^{tx} p(x)
Let X be a random variable which follows the binomial distribution; then the MGF about the origin is given by
M_X(t) = E[e^{tX}] = Σ_{x=0}^{n} e^{tx} nCx p^x q^{n-x}   [p(x) = nCx p^x q^{n-x}]
= Σ_{x=0}^{n} (pe^t)^x nCx q^{n-x}
= (q + pe^t)^n   [by the binomial theorem]
M_X(t) = (q + pe^t)^n
Example 1.6.2
Find the mean and variance of the binomial distribution.
Solution
M_X(t) = (q + pe^t)^n
M'_X(t) = n(q + pe^t)^{n-1}·pe^t
Put t = 0, we get
M'_X(0) = n(q + p)^{n-1} p = np   [since q + p = 1]
Mean = E(X) = M'_X(0) = np
M''_X(t) = np[(q + pe^t)^{n-1} e^t + e^t (n - 1)(q + pe^t)^{n-2} pe^t]
Put t = 0, we get
M''_X(0) = np[(q + p)^{n-1} + (n - 1)(q + p)^{n-2} p]
= np[1 + (n - 1)p]
= np + n²p² - np²
= n²p² + np(1 - p)
= n²p² + npq   [since 1 - p = q]
E(X²) = M''_X(0) = n²p² + npq
Variance = E(X²) - [E(X)]² = n²p² + npq - n²p² = npq
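The same differentiation can be carried out symbolically; a minimal sketch assuming SymPy is available (not part of the original text):

```python
import sympy as sp

t, n, p = sp.symbols('t n p', positive=True)
q = 1 - p
M = (q + p * sp.exp(t)) ** n           # binomial MGF (q + p e^t)^n

mean = sp.diff(M, t).subs(t, 0)        # M'(0)
second = sp.diff(M, t, 2).subs(t, 0)   # M''(0)

print(sp.simplify(mean))               # n*p
print(sp.simplify(second - mean**2))   # simplifies to n*p*(1 - p) = npq
```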
Example 1.6.4
State the additive property of the binomial distribution.
Solution
If X and Y are independent binomial variates with parameters (n1, p1) and (n2, p2) respectively, then
M_{X+Y}(t) = M_X(t)·M_Y(t) = (q1 + p1 e^t)^{n1} · (q2 + p2 e^t)^{n2}
which is not of the form (q + pe^t)^n; hence X + Y is not in general a binomial variate.
Example 1.6.5
If M_X(t) = (q + pe^t)^{n1} and M_Y(t) = (q + pe^t)^{n2}, with X and Y independent, then
M_{X+Y}(t) = (q + pe^t)^{n1 + n2}
so X + Y is a binomial variate with parameters n1 + n2 and p. Thus the binomial distribution possesses the additive property when p is the same.
Example
The mean and SD of a binomial distribution are 5 and 2; determine the distribution.
Solution: np = 5 and npq = 4 give q = 4/5, p = 1/5 and n = 25.
The binomial distribution is
P(X = x) = p(x) = nCx p^x q^{n-x} = 25Cx (1/5)^x (4/5)^{25-x}, x = 0, 1, 2, ..., 25
Poisson Distribution
Def:
A random variable X is said to follow a Poisson distribution with parameter λ if its probability law is given by
P(X = x) = p(x) = e^{-λ} λ^x / x!, x = 0, 1, 2, ..., ∞
WKT the MGF is M_X(t) = e^{λ(e^t - 1)}, so
M'_X(t) = e^{λ(e^t - 1)}·λe^t and M'_X(0) = λ.
Directly,
μ'_1 = E(X) = Σ_{x=0}^{∞} x p(x) = Σ_{x=0}^{∞} x e^{-λ} λ^x / x!
= λ e^{-λ} Σ_{x=1}^{∞} λ^{x-1}/(x - 1)!
= λ e^{-λ} [1 + λ + λ²/2! + ...]
= λ e^{-λ} e^{λ} = λ
Mean = λ
μ'_2 = E[X²] = Σ_{x=0}^{∞} x² p(x) = Σ_{x=0}^{∞} {x(x - 1) + x} e^{-λ} λ^x / x!
= Σ_{x=0}^{∞} x(x - 1) e^{-λ} λ^x / x! + Σ_{x=0}^{∞} x e^{-λ} λ^x / x!
= λ² e^{-λ} Σ_{x=2}^{∞} λ^{x-2}/(x - 2)! + λ
= λ² e^{-λ} [1 + λ/1! + λ²/2! + ...] + λ
= λ² e^{-λ} e^{λ} + λ = λ² + λ
Variance = μ_2 = μ'_2 - (μ'_1)² = λ² + λ - λ² = λ
Thus for the Poisson distribution, Mean = Variance = λ.
Example
If X is a Poisson variate such that P(X = 1) = 3/10 and P(X = 2) = 1/5, find P(X = 0) and P(X = 3).
Solution
P(X = 1) = 3/10 (given) gives λe^{-λ} = 3/10   ... (1)
P(X = 2) = 1/5 (given) gives λ²e^{-λ}/2! = 1/5, i.e. λ²e^{-λ} = 2/5   ... (2)
Dividing (1) by (2): 1/λ = (3/10)/(2/5) = 3/4, so λ = 4/3.
P(X = 0) = e^{-λ} λ⁰/0! = e^{-4/3}
P(X = 3) = e^{-4/3} (4/3)³/3!
Example 1.7.2
If X is a Poisson variable with
P(X = 2) = 9 P(X = 4) + 90 P(X = 6),
find (i) the mean of X and (ii) the variance of X.
Solution
P(X = x) = e^{-λ} λ^x / x!, x = 0, 1, 2, ...
Given P(X = 2) = 9 P(X = 4) + 90 P(X = 6):
e^{-λ}λ²/2! = 9 e^{-λ}λ⁴/4! + 90 e^{-λ}λ⁶/6!
1/2 = 9λ²/4! + 90λ⁴/6!
1/2 = (3/8)λ² + (1/8)λ⁴
4 = 3λ² + λ⁴
λ⁴ + 3λ² - 4 = 0, so (λ² - 1)(λ² + 4) = 0
λ² = 1 or λ² = -4, i.e. λ = 1 or λ = ±2i
Since λ > 0, λ = 1.
Mean = λ = 1, Variance = λ = 1
Standard deviation = 1
1.7.3 Derive the probability mass function of the Poisson distribution as a limiting case of the binomial distribution.
Solution
We know that the binomial distribution is
P(X = x) = nCx p^x q^{n-x} = [n!/((n - x)! x!)] p^x (1 - p)^{n-x}
= [n(n - 1)(n - 2)...(n - x + 1)/x!] p^x (1 - p)^n / (1 - p)^x
Put λ = np, i.e. p = λ/n:
P(X = x) = [n(n - 1)...(n - x + 1)/x!] (λ/n)^x (1 - λ/n)^n (1 - λ/n)^{-x}
= (λ^x/x!) [1·(1 - 1/n)(1 - 2/n)...(1 - (x - 1)/n)] (1 - λ/n)^n (1 - λ/n)^{-x}
Letting n → ∞ with λ fixed, we know that
lim_{n→∞} (1 - λ/n)^n = e^{-λ}
lim_{n→∞} (1 - k/n) = 1 for each fixed k, and lim_{n→∞} (1 - λ/n)^{-x} = 1.
Hence
P(X = x) = e^{-λ} λ^x / x!, x = 0, 1, 2, ...
which is the Poisson probability law.
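The limiting behaviour can be seen numerically: with λ = np held fixed and n large, the binomial pmf is close to the Poisson pmf. A short sketch (λ = 2 and n = 1000 are arbitrary illustrative choices):

```python
from math import comb, exp, factorial

lam, n = 2.0, 1000       # lambda = np fixed, n large
p = lam / n

for x in range(5):
    binom = comb(n, x) * p**x * (1 - p)**(n - x)       # binomial pmf
    poisson = exp(-lam) * lam**x / factorial(x)        # Poisson pmf
    print(x, round(binom, 6), round(poisson, 6))       # nearly equal
```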
Example 1.8.2
If the MGF of X is (5 - 4e^t)^{-1}, find the distribution of X and P(X = 5).
Solution
Let the geometric distribution be P(X = x) = q^x p, x = 0, 1, 2, ....
The MGF of the geometric distribution is given by
M_X(t) = p/(1 - qe^t)   ... (1)
Here M_X(t) = (5 - 4e^t)^{-1} = (1/5)/(1 - (4/5)e^t)   ... (2)
Comparing (1) & (2) we get q = 4/5, p = 1/5.
P(X = x) = p q^x = (1/5)(4/5)^x, x = 0, 1, 2, 3, ...
P(X = 5) = (1/5)(4/5)^5 = 4^5/5^6
CONTINUOUS DISTRIBUTIONS
If X is a continuous random variable, then we have the following distributions:
1. Uniform (Rectangular) Distribution
2. Exponential Distribution
3. Gamma Distribution
4. Normal Distribution
1.9.1 Uniform Distribution (Rectangular Distribution)
Def: A random variable X is said to follow a uniform distribution if its density is
f(x) = 1/(b - a), a < x < b; 0, otherwise.
* To find the MGF
M_X(t) = ∫ e^{tx} f(x) dx = ∫_a^b e^{tx}/(b - a) dx
= (1/(b - a)) [e^{tx}/t]_a^b
= (e^{bt} - e^{at}) / ((b - a)t)
The MGF of the uniform distribution is
M_X(t) = (e^{bt} - e^{at}) / ((b - a)t), t ≠ 0.
* To find the mean and variance
E(X) = ∫ x f(x) dx = (1/(b - a)) ∫_a^b x dx = (1/(b - a)) [x²/2]_a^b
= (b² - a²)/(2(b - a)) = (a + b)/2
Mean = μ'_1 = (a + b)/2
Putting r = 2 in (A), we get
μ'_2 = ∫_a^b x² f(x) dx = (1/(b - a)) ∫_a^b x² dx = (b³ - a³)/(3(b - a)) = (a² + ab + b²)/3
Variance = μ'_2 - (μ'_1)²
= (a² + ab + b²)/3 - ((a + b)/2)² = (b - a)²/12
Variance = (b - a)²/12
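The formulas Mean = (a + b)/2 and Variance = (b - a)²/12 can be checked by simulation; the endpoints a = 2, b = 5 below are an arbitrary illustration:

```python
import random

a, b = 2.0, 5.0
samples = [random.uniform(a, b) for _ in range(200_000)]

mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)

print(mean, (a + b) / 2)         # ~3.5 in both cases
print(var, (b - a) ** 2 / 12)    # ~0.75 in both cases
```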
Example
If X is uniformly distributed over (-α, α), α > 0, find α so that
(i) P(X > 1) = 1/3  (ii) P(|X| < 1) = P(|X| > 1).
Solution
Since X is uniform over (-α, α), f(x) = 1/(2α), -α < x < α; 0 otherwise.
(i) P(X > 1) = 1/3 gives
∫_1^α (1/(2α)) dx = 1/3
(α - 1)/(2α) = 1/3
3α - 3 = 2α
α = 3
(ii) P(|X| < 1) = P(|X| > 1) = 1 - P(|X| < 1)
P(|X| < 1) + P(|X| < 1) = 1
2 P(|X| < 1) = 1
2 P(-1 < X < 1) = 1
2 ∫_{-1}^{1} (1/(2α)) dx = 1
2·(2/(2α)) = 1, i.e. 2/α = 1
α = 2
Note:
1. The distribution function F(x) is given by
F(x) = 0, -∞ < x < a;  (x - a)/(b - a), a ≤ x ≤ b;  1, b < x < ∞.
2. The p.d.f. of a uniform variate X in (-a, a) is given by
f(x) = 1/(2a), -a < x < a; 0, otherwise.
Exponential Distribution
The density of the exponential distribution with parameter λ > 0 is f(x) = λe^{-λx}, x > 0; 0 otherwise, and its MGF is
M_X(t) = ∫_0^∞ e^{tx} λe^{-λx} dx = λ/(λ - t), λ > t
To find the mean and variance:
M_X(t) = λ/(λ - t) = (1 - t/λ)^{-1}
= 1 + t/λ + (t/λ)² + ... + (t/λ)^r + ...
= 1 + (t/1!)(1/λ) + (t²/2!)(2/λ²) + ... + (t^r/r!)(r!/λ^r) + ...
Mean = μ'_1 = coefficient of t/1! = 1/λ
μ'_2 = coefficient of t²/2! = 2/λ²
Variance = μ'_2 - (μ'_1)² = 2/λ² - 1/λ² = 1/λ²
Example 1.10.1
Let X be a random variable with p.d.f.
f(x) = (1/3)e^{-x/3}, x > 0; 0 otherwise.
Find 1) P(X > 3)  2) the MGF of X.
Solution
WKT the exponential distribution is f(x) = λe^{-λx}, x > 0. Here λ = 1/3.
P(X > 3) = ∫_3^∞ f(x) dx = ∫_3^∞ (1/3)e^{-x/3} dx = [-e^{-x/3}]_3^∞ = e^{-1}
P(X > 3) = e^{-1}
MGF: M_X(t) = λ/(λ - t) = (1/3)/((1/3) - t) = (1/3)/((1 - 3t)/3) = 1/(1 - 3t)
M_X(t) = 1/(1 - 3t)
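A simulation check of P(X > 3) = e^{-1} for this density; note that random.expovariate takes the rate λ = 1/3 as its argument (an illustrative sketch only):

```python
import math, random

lam = 1 / 3  # rate of the density (1/3)e^(-x/3)
samples = [random.expovariate(lam) for _ in range(100_000)]

frac = sum(s > 3 for s in samples) / len(samples)
print(frac, math.exp(-1))  # both ~0.3679
```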
Note
If X is exponentially distributed, then
P(X > s + t / X > s) = P(X > t), for any s, t > 0 (the memoryless property).
TUTORIAL QUESTIONS
Example 1
Given that the p.d.f. of a continuous random variable X is
f(x) = 6x(1 - x), 0 < x < 1; 0 otherwise,
find the c.d.f. of X.
Solution
Given f(x) = 6x(1 - x), 0 < x < 1; 0 otherwise.
The c.d.f. is F(x) = ∫_{-∞}^x f(t) dt, -∞ < x < ∞.
(i) When x < 0:
F(x) = ∫_{-∞}^x 0 dt = 0
(ii) When 0 < x < 1:
F(x) = ∫_{-∞}^0 f(t) dt + ∫_0^x f(t) dt
= 0 + ∫_0^x 6t(1 - t) dt = 6[t²/2 - t³/3]_0^x
= 3x² - 2x³
(iii) When x > 1:
F(x) = ∫_{-∞}^0 0 dt + ∫_0^1 6t(1 - t) dt + ∫_1^x 0 dt
= 6 ∫_0^1 (t - t²) dt = 1
Example 2
A random variable X has the following probability function:
Values of X      : 0   1   2   3   4   5    6    7    8
Probability P(X) : a  3a  5a  7a  9a  11a  13a  15a  17a
Find the value of a.
Solution
Since Σ_{i=0}^{8} p(x_i) = 1, a(1 + 3 + 5 + 7 + 9 + 11 + 13 + 15 + 17) = 81a = 1, so a = 1/81.
Example 3
The mean and SD of a binomial distribution are 5 and 2; determine the distribution.
Solution
Given Mean = np = 5   ... (1)
SD = √(npq) = 2, so npq = 4   ... (2)
(2)/(1): npq/np = 4/5, so q = 4/5
p = 1 - q = 1 - 4/5 = 1/5   ... (3)
Substituting (3) in (1): n × 1/5 = 5, so n = 25.
The binomial distribution is
P(X = x) = p(x) = nCx p^x q^{n-x} = 25Cx (1/5)^x (4/5)^{25-x}, x = 0, 1, 2, ..., 25
UNIT – II
Introduction
In the previous chapter we studied various aspects of the theory of a single R.V. In this chapter we extend our theory to include two R.V.'s, one for each coordinate axis X and Y of the XY plane.
DEFINITION: Let S be the sample space. Let X = X(s) & Y = Y(s) be two functions, each assigning a real number to each outcome s ∈ S. Then (X, Y) is a two dimensional random variable.
2.1 Types of random variables
1. Discrete R.V.’s
2. Continuous R.V.’s
Discrete R.V.’s (Two Dimensional Discrete R.V.’s)
If the possible values of (X, Y) are finite, then (X, Y) is called a two dimensional discrete R.V., and its values can be represented by (x_i, y_j), i = 1, 2, ..., m; j = 1, 2, ..., n.
In the study of two dimensional discrete R.V.’s we have the following
5 important terms.
Joint Probability Function (JPF) (or) Joint Probability Mass Function.
Joint Probability Distribution.
Marginal Probability Function of X.
Marginal Probability Function of Y.
Conditional Probability Function.
The set {x_i, p_ij/p_.j}, i = 1, 2, 3, ... is called the conditional probability distribution of X given Y = y_j.
The conditional probability function of Y given X = x_i is given by
P[Y = y_j / X = x_i] = P[X = x_i, Y = y_j] / P[X = x_i] = p_ij / p_i.
The set {y_j, p_ij/p_i.}, j = 1, 2, 3, ... is called the conditional probability distribution of Y given X = x_i.
P_Y(y) = 15/28, y = 0;  3/7, y = 1;  1/28, y = 2.
Example 2.3.1
Show that the function
f(x, y) = (2/5)(2x + 3y), 0 < x < 1, 0 < y < 1; 0 otherwise
is a joint density function of X and Y.
Solution
Given f(x, y) = (2/5)(2x + 3y), 0 < x < 1, 0 < y < 1; 0 otherwise.
(i) f(x, y) ≥ 0 in the given region 0 < x < 1, 0 < y < 1.
(ii) ∫∫ f(x, y) dx dy = ∫_0^1 ∫_0^1 (2/5)(2x + 3y) dx dy
= (2/5) ∫_0^1 [x² + 3xy]_{x=0}^{x=1} dy
= (2/5) ∫_0^1 (1 + 3y) dy
= (2/5) [y + 3y²/2]_0^1 = (2/5)(5/2) = 1
Since f(x, y) satisfies the two conditions, it is a joint density function.
Example 2.3.2
The joint density function of the random variables X and Y is given by
f(x, y) = 8xy, 0 < x < 1, 0 < y < x; 0, otherwise.
Find (i) f_X(x) (ii) f_Y(y) (iii) f(y/x).
Solution
We know that
(i) The marginal p.d.f. of X is
f_X(x) = ∫_0^x 8xy dy = 8x[y²/2]_0^x = 4x³
f_X(x) = 4x³, 0 < x < 1
(ii) The marginal p.d.f. of Y is
f_Y(y) = ∫_y^1 8xy dx = 4y(1 - y²), 0 < y < 1
(iii) f(y/x) = f(x, y)/f_X(x) = 8xy/4x³ = 2y/x², 0 < y < x.
Result
Marginal p.d.f. of X: f_X(x) = 4x³, 0 < x < 1
Marginal p.d.f. of Y: f_Y(y) = 4y(1 - y²), 0 < y < 1
f(y/x) = 2y/x², 0 ≤ y ≤ x, 0 < x < 1
REGRESSION
* Lines of regression
The line of regression of X on Y is given by
x - x̄ = r·(σ_x/σ_y)(y - ȳ)
The line of regression of Y on X is given by
y - ȳ = r·(σ_y/σ_x)(x - x̄)
* Angle between the two lines of regression
tan θ = [(1 - r²)/r] · σ_x σ_y / (σ_x² + σ_y²)
* Regression coefficients
Regression coefficient of Y on X: b_YX = r·(σ_y/σ_x)
Regression coefficient of X on Y: b_XY = r·(σ_x/σ_y)
Correlation coefficient: r = ±√(b_XY · b_YX), taking the common sign of the regression coefficients.
Example 2.4.1
From the following data, find
(i) the two regression equations,
(ii) the coefficient of correlation between the marks in Economics and Statistics,
(iii) the most likely marks in Statistics when the marks in Economics are 30.
Marks in Economics : 25 28 35 32 31 36 29 38 34 32
Marks in Statistics : 43 46 49 41 36 32 31 30 33 39
Solution
X   Y   (X - X̄)  (Y - Ȳ)  (X - X̄)²  (Y - Ȳ)²  (X - X̄)(Y - Ȳ)
25  43    -7        5        49        25        -35
28  46    -4        8        16        64        -32
35  49     3       11         9       121         33
32  41     0        3         0         9          0
31  36    -1       -2         1         4          2
36  32     4       -6        16        36        -24
29  31    -3       -7         9        49         21
38  30     6       -8        36        64        -48
34  33     2       -5         4        25        -10
32  39     0        1         0         1          0
Totals: ΣX = 320, ΣY = 380, Σ(X - X̄)² = 140, Σ(Y - Ȳ)² = 398, Σ(X - X̄)(Y - Ȳ) = -93
Here X̄ = ΣX/n = 320/10 = 32 and Ȳ = ΣY/n = 380/10 = 38.
Coefficient of regression of Y on X:
b_YX = Σ(X - X̄)(Y - Ȳ) / Σ(X - X̄)² = -93/140 = -0.6643
Coefficient of regression of X on Y:
b_XY = Σ(X - X̄)(Y - Ȳ) / Σ(Y - Ȳ)² = -93/398 = -0.2337
Equation of the line of regression of X on Y:
X - X̄ = b_XY (Y - Ȳ)
X - 32 = -0.2337(Y - 38)
X = -0.2337Y + 0.2337 × 38 + 32 = -0.2337Y + 40.8806
Equation of the line of regression of Y on X:
Y - Ȳ = b_YX (X - X̄)
Y - 38 = -0.6643(X - 32)
Y = -0.6643X + 38 + 0.6643 × 32 = -0.6643X + 59.2576
Coefficient of correlation:
r² = b_YX · b_XY = (-0.6643)(-0.2337) = 0.1552
r = -0.394 (negative, since both regression coefficients are negative)
Most likely marks in Statistics (Y) when the marks in Economics (X) are 30:
Y = -0.6643 × 30 + 59.2576 = 39.33 ≈ 39.
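The regression coefficients and the predicted value can be reproduced directly from the data table; a small sketch in plain Python (not part of the original solution):

```python
# Regression coefficients b_YX, b_XY and the prediction at X = 30
X = [25, 28, 35, 32, 31, 36, 29, 38, 34, 32]
Y = [43, 46, 49, 41, 36, 32, 31, 30, 33, 39]

n = len(X)
xbar, ybar = sum(X) / n, sum(Y) / n
sxy = sum((x - xbar) * (y - ybar) for x, y in zip(X, Y))  # -93
sxx = sum((x - xbar) ** 2 for x in X)                     # 140
syy = sum((y - ybar) ** 2 for y in Y)                     # 398

b_yx, b_xy = sxy / sxx, sxy / syy
print(b_yx, b_xy)                          # -0.6643, -0.2337
print(b_yx * 30 + (ybar - b_yx * xbar))    # Y when X = 30 -> ~39.3
```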
COVARIANCE
Def : If X and Y are random variables, then Covariance between X and Y is defined as
Cov (X, Y) = E(XY) – E(X) . E(Y)
Cov (X, Y) = 0 [If X & Y are independent]
CORRELATION
Types of Correlation
Positive Correlation
(If two variables deviate in same direction)
Negative Correlation
(If two variables constantly deviate in opposite direction)
www.BrainKart.com
Click Here for Probability and Random Processes full study material.
Example
Calculate the correlation coefficient between X and Y from the following data, using U = X - 68 and V = Y - 68:
X : 65 66 67 67 68 69 70 72
Y : 67 68 65 68 72 72 69 71
Solution
X Y U = X – 68 V = Y – 68 UV U2 V2
65 67 -3 -1 3 9 1
66 68 -2 0 0 4 0
67 65 -1 -3 3 1 9
67 68 -1 0 0 1 0
68 72 0 4 0 0 16
69 72 1 4 4 1 16
70 69 2 1 2 4 1
72 71 4 3 12 16 9
Totals: ΣU = 0, ΣV = 8, ΣUV = 24, ΣU² = 36, ΣV² = 52
Now
Ū = ΣU/n = 0/8 = 0
V̄ = ΣV/n = 8/8 = 1
Cov(X, Y) = Cov(U, V) = ΣUV/n - Ū·V̄ = 24/8 - 0 × 1 = 3   ... (1)
σ_U = √(ΣU²/n - Ū²) = √(36/8 - 0) = 2.121   ... (2)
σ_V = √(ΣV²/n - V̄²) = √(52/8 - 1) = √5.5 = 2.345   ... (3)
r(X, Y) = r(U, V) = Cov(U, V)/(σ_U σ_V) = 3/(2.121 × 2.345)
= 0.6031   (by (1), (2), (3))
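A quick numeric check of r(U, V) from the table (illustrative only):

```python
import math

U = [-3, -2, -1, -1, 0, 1, 2, 4]
V = [-1, 0, -3, 0, 4, 4, 1, 3]
n = len(U)

ubar, vbar = sum(U) / n, sum(V) / n                         # 0 and 1
cov = sum(u * v for u, v in zip(U, V)) / n - ubar * vbar    # 3
su = math.sqrt(sum(u * u for u in U) / n - ubar ** 2)       # 2.121
sv = math.sqrt(sum(v * v for v in V) / n - vbar ** 2)       # 2.345
print(cov / (su * sv))  # ~0.6031
```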
Example 2.6.2
Let X be a random variable with p.d.f. f(x) = 1/2, -1 ≤ x ≤ 1, and let Y = X². Find the correlation coefficient between X and Y.
Solution
E(X) = ∫_{-1}^{1} x f(x) dx = ∫_{-1}^{1} x·(1/2) dx = (1/2)[x²/2]_{-1}^{1} = 0
E(X) = 0
E(Y) = E(X²) = ∫_{-1}^{1} x²·(1/2) dx = (1/2)[x³/3]_{-1}^{1} = (1/2)(2/3) = 1/3
E(XY) = E(X·X²) = E(X³) = ∫_{-1}^{1} x³·(1/2) dx = (1/2)[x⁴/4]_{-1}^{1} = 0
E(XY) = 0
Cov(X, Y) = E(XY) - E(X)E(Y) = 0
r(X, Y) = ρ(X, Y) = Cov(X, Y)/(σ_X σ_Y) = 0
Note: Since E(X) = 0 and E(XY) = 0, Cov(X, Y) = 0 and hence ρ(X, Y) = 0 without computing σ_X and σ_Y. The variables are uncorrelated even though Y = X² is completely determined by X.
Example 1
If the joint p.d.f. of (X, Y) is given by f_XY(x, y) = x + y, 0 ≤ x, y ≤ 1, find the p.d.f. of U = XY.
Solution
Given f_XY(x, y) = x + y and U = XY. Let V = Y.
Then x = u/v and y = v.
∂x/∂u = 1/v, ∂x/∂v = -u/v², ∂y/∂u = 0, ∂y/∂v = 1   ... (1)
J = ∂(x, y)/∂(u, v) = | 1/v   -u/v² ; 0   1 | = 1/v
|J| = 1/v   ... (2)
The joint p.d.f. of (U, V) is given by
f_UV(u, v) = f_XY(x, y)·|J| = (x + y)(1/v) = (u/v + v)(1/v)   ... (3)
The range of v: since 0 ≤ y ≤ 1, we have 0 ≤ v ≤ 1 (v = y).
The range of u: since 0 ≤ x ≤ 1, we have 0 ≤ u/v ≤ 1, i.e. 0 ≤ u ≤ v.
Hence the p.d.f. of (U, V) is
f_UV(u, v) = u/v² + 1, 0 ≤ u ≤ v, 0 ≤ v ≤ 1
Now the marginal p.d.f. of U is
f_U(u) = ∫ f_UV(u, v) dv = ∫_u^1 (u/v² + 1) dv
= [-u/v + v]_u^1 = (1 - u) - (-1 + u) = 2(1 - u)
f_U(u) = 2(1 - u), 0 < u < 1
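The density f_U(u) = 2(1 - u) can also be checked by simulation: sample (X, Y) from f(x, y) = x + y by rejection sampling and compare the empirical CDF of U = XY with ∫_0^{u0} 2(1 - t) dt = 2u0 - u0². A sketch using only the standard library (not part of the original solution):

```python
import random

def sample_xy():
    # rejection sampling from f(x, y) = x + y on the unit square (f <= 2)
    while True:
        x, y, u = random.random(), random.random(), random.random() * 2
        if u <= x + y:
            return x, y

us = [x * y for x, y in (sample_xy() for _ in range(100_000))]
for u0 in (0.2, 0.5, 0.8):
    emp = sum(u <= u0 for u in us) / len(us)   # empirical CDF of U
    exact = 2 * u0 - u0 ** 2                   # CDF from f_U(u) = 2(1 - u)
    print(u0, round(emp, 3), round(exact, 3))  # should be close
```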
TUTORIAL QUESTIONS
WORKEDOUT EXAMPLES
Example 1
The joint density function of the random variables X and Y is given by
f(x, y) = 8xy, 0 < x < 1, 0 < y < x; 0, otherwise.
Find (i) f_X(x) (ii) f_Y(y) (iii) f(y/x).
Solution
We know that
(i) The marginal p.d.f. of X is
f_X(x) = ∫_0^x 8xy dy = 4x³, 0 < x < 1
(ii) The marginal p.d.f. of Y is
f_Y(y) = ∫_y^1 8xy dx = 4y(1 - y²), 0 < y < 1
(iii) f(y/x) = f(x, y)/f_X(x) = 8xy/4x³ = 2y/x², 0 < y < x.
Example 2
Let X be a random variable with p.d.f. f(x) = 1/2, -1 ≤ x ≤ 1, and let Y = X². Find the correlation coefficient between X and Y.
Solution
E(X) = ∫_{-1}^{1} x·(1/2) dx = 0
E(Y) = E(X²) = ∫_{-1}^{1} x²·(1/2) dx = 1/3
E(XY) = E(X³) = ∫_{-1}^{1} x³·(1/2) dx = 0
Cov(X, Y) = E(XY) - E(X)E(Y) = 0, so r(X, Y) = ρ(X, Y) = 0.
Note: Since E(X) = 0 and E(XY) = 0, ρ = 0 without computing σ_X and σ_Y.
Result (for Example 1)
Marginal p.d.f. of X: f_X(x) = 4x³, 0 < x < 1
Marginal p.d.f. of Y: f_Y(y) = 4y(1 - y²), 0 < y < 1
f(y/x) = 2y/x², 0 ≤ y ≤ x, 0 < x < 1
UNIT - III
RANDOM PROCESSES
Introduction
In Chapter 1 we discussed random variables. A random variable is a function of the possible outcomes of an experiment, but it does not include the concept of time. In real situations we come across many time-varying functions which are random in nature. In electrical and electronics engineering we study signals.
Generally, signals are classified into two types:
(i) Deterministic
(ii) Random
Both deterministic and random signals are functions of time. For a deterministic signal it is possible to determine the value of the signal at any given time, but this is not possible in the case of a random signal, since some element of uncertainty is always associated with it. The probability model used for characterizing a random signal is called a random process or stochastic process.
REMARK
i) If t is fixed, then {X(s, t)} is a random variable.
ii) If s and t are both fixed, {X(s, t)} is a number.
iii) If s is fixed, {X(s, t)} is a single time function (a sample function).
NOTATION
Hereafter we denote the random process {X(s, t)} by {X(t)}, where the index set T is assumed to be continuous. A discrete-parameter process is denoted by {X(n)} or {Xn}.
Deterministic Process
A process is called deterministic if future value of any sample function can be predicted
from past values.
STATIONARY PROCESS
A random process is said to be stationary if its mean, variance, moments etc are constant.
Other processes are called non stationary.
Example 3.3.1
Show that a first order stationary process has a constant mean.
Solution
Let us consider the random process {X(t)} at two different times t1 and t2.
E[X(t1)] = ∫ x f(x, t1) dx, where f(x, t1) is the density of the random process X(t1)
E[X(t2)] = ∫ x f(x, t2) dx, where f(x, t2) is the density of the random process X(t2)
Let t2 = t1 + C. Since the process is first order stationary, f(x, t1 + C) = f(x, t1), so
E[X(t2)] = ∫ x f(x, t1 + C) dx = ∫ x f(x, t1) dx = E[X(t1)]
Thus E[X(t2)] = E[X(t1)], i.e. the mean of the process {X(t1)} = the mean of the process {X(t2)}: the mean is constant.
Definition 2:
If the process is first order stationary, then Mean = E[X(t)] = constant.
Second Order Stationary Process
A random process is said to be second order stationary if the second order density function is invariant under time translation:
f(x1, x2; t1, t2) = f(x1, x2; t1 + C, t2 + C) for all x1, x2 and C.
Then E[X1²], E[X2²] and E[X1X2] do not change with time, where X1 = X(t1), X2 = X(t2).
Strongly Stationary Process
A random process is called a strongly stationary process or Strict Sense Stationary process (SSS process) if all its finite dimensional distributions are invariant under translation of time t.
fX(x1, x2 ; t1, t2) = fX(x1, x2 ; t1+C, t2+C)
fX(x1, x2, x3; t1, t2, t3) = fX(x1, x2, x3; t1+C, t2+C, t3+C)
In general
fX(x1, x2..xn; t1, t2…tn) = fX(x1, x2..xn; t1+C, t2+C..tn+C) for any t1 and any real number
C.
Let X(t1) and X(t2) be two members of the random process {X(t)}. The auto correlation is
R_XX(t1, t2) = E[X(t1) X(t2)]   ... (1)
Mean Square Value
Putting t1 = t2 = t in (1), we get
R_XX(t, t) = E[X(t) X(t)] = E[X²(t)], the mean square value of the random process.
Auto Covariance of a Random Process
C_XX(t1, t2) = E{[X(t1) - E(X(t1))][X(t2) - E(X(t2))]}
= R_XX(t1, t2) - E[X(t1)] E[X(t2)]
Correlation Coefficient
The correlation coefficient of the random process {X(t)} is defined as
ρ_XX(t1, t2) = C_XX(t1, t2) / √(Var X(t1) × Var X(t2))
CROSS CORRELATION
The cross correlation of the two random process {X(t)} and {Y(t)} is defined by
RXY (t1, t2) = E[X(t1) Y (t2)]
EVOLUTIONARY PROCESS
A random process that is not stationary in any sense is called as evolutionary process.
Example
Consider X(t) = A cos(ωt + θ), where A and ω are constants and θ is uniformly distributed over (0, 2π), so that
f(θ) = 1/(2π), 0 < θ < 2π; 0, otherwise
E[X(t)] = ∫ X(t) f(θ) dθ = ∫_0^{2π} A cos(ωt + θ)·(1/2π) dθ
= (A/2π) [sin(ωt + θ)]_0^{2π}
= (A/2π) [sin(ωt + 2π) - sin ωt]
= (A/2π) [sin ωt - sin ωt] = 0, a constant
Since E[X(t)] = a constant, the process X(t) is a stationary random process.
Example
Examine whether the Poisson process {X(t)}, given by P[X(t) = n] = e^{-λt}(λt)^n/n!, n = 0, 1, 2, ..., is stationary.
E[X(t)] = Σ_{n=0}^{∞} n e^{-λt}(λt)^n/n!
= e^{-λt} Σ_{n=1}^{∞} (λt)^n/(n - 1)!
= λt e^{-λt} Σ_{n=1}^{∞} (λt)^{n-1}/(n - 1)!
= λt e^{-λt} [1 + λt/1! + (λt)²/2! + ...]
= λt e^{-λt} e^{λt}
= λt, which depends on t.
Hence the Poisson process is not a stationary process.
ERGODIC RANDOM PROCESS
Time Average
The time average of a random process {X(t)} is defined as
X̄_T = (1/2T) ∫_{-T}^{T} X(t) dt
Ensemble Average
The ensemble average of a random process {X(t)} is the expected value of the random variable X at time t:
Ensemble average = E[X(t)]
Mean Ergodic Process
{X(t)} is said to be mean ergodic if
lim_{T→∞} X̄_T = μ, i.e. lim_{T→∞} (1/2T) ∫_{-T}^{T} X(t) dt = E[X(t)]
Mean Ergodic Theorem
Let {X(t)} be a random process with constant mean μ and let X̄_T be its time average. Then {X(t)} is mean ergodic if
lim_{T→∞} Var(X̄_T) = 0
Correlation Ergodic Process
The stationary process {X(t)} is said to be correlation ergodic if the process {Y(t)} is mean ergodic, where
Y(t) = X(t) X(t + τ)
i.e. lim_{T→∞} Ȳ_T = E[Y(t)], where Ȳ_T is the time average of Y(t).
MARKOV PROCESS
Definition
A random process {X(t)} is said to be Markovian if
P[X(t_{n+1}) ≤ x_{n+1} / X(t_n) = x_n, X(t_{n-1}) = x_{n-1}, ..., X(t_0) = x_0] = P[X(t_{n+1}) ≤ x_{n+1} / X(t_n) = x_n]
where t_0 ≤ t_1 ≤ t_2 ≤ ... ≤ t_n ≤ t_{n+1}.
Examples of Markov Process
1. The probability of raining today depends only on the weather conditions of the last two days and not on past weather conditions.
2. A difference equation is Markovian.
MARKOV CHAIN
Definition
If P{Xn = an / Xn-1 = an-1, Xn-2 = an-2, ..., X0 = a0} = P{Xn = an / Xn-1 = an-1} for all n, then the process {Xn}, n = 0, 1, 2, ... is called a Markov chain.
1. a1, a2, a3, ..., an, ... are called the states of the Markov chain.
2. The conditional probability P{Xn = aj / Xn-1 = ai} = Pij(n-1, n) is called the one step transition probability from state ai to state aj at the nth step.
3. The t.p.m. of a Markov chain is a stochastic matrix:
i) Pij ≥ 0
ii) Σj Pij = 1 [the sum of the elements of any row is 1]
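The two defining conditions of a t.p.m. are easy to check mechanically. Below is a minimal Python sketch with a hypothetical two-state weather chain (the matrix values are invented for illustration):

```python
# A hypothetical 2-state weather chain and its t.p.m. conditions
P = [[0.7, 0.3],   # from state 0: stay with prob 0.7, move with 0.3
     [0.4, 0.6]]   # from state 1

assert all(p >= 0 for row in P for p in row)        # P_ij >= 0
assert all(abs(sum(row) - 1) < 1e-12 for row in P)  # each row sums to 1
```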
POISSON PROCESS
The Poisson process is a continuous parameter, discrete state process which is a very useful model for many practical situations. It describes the number of times an event has occurred up to time t, when occurrences happen randomly as the experiment proceeds in time.
Probability Law for the Poisson Process
Let λ be the rate of occurrence (the number of occurrences per unit time) and let Pn(t) be the probability of n occurrences of the event in the interval (0, t). Then X(t) follows a Poisson distribution with parameter λt:
Pn(t) = e^{-λt}(λt)^n / n!
The second order probability function of the Poisson process (for t2 > t1) is
P[X(t1) = n1, X(t2) = n2] = e^{-λt2} λ^{n2} t1^{n1} (t2 - t1)^{n2 - n1} / [n1! (n2 - n1)!], n2 ≥ n1; 0, otherwise.
TUTORIAL QUESTIONS
Example 2: Give an example of a stationary random process and justify your claim.
Solution:
Let us consider the random process X(t) = A cos(ωt + θ), where A and ω are constants and θ is a random variable uniformly distributed in the interval (0, 2π).
Since θ is uniformly distributed in (0, 2π), we have
f(θ) = 1/(2π), 0 < θ < 2π; 0, otherwise
E[X(t)] = ∫ X(t) f(θ) dθ = ∫_0^{2π} A cos(ωt + θ)·(1/2π) dθ
= (A/2π) [sin(ωt + θ)]_0^{2π}
= (A/2π) [sin(ωt + 2π) - sin ωt]
= (A/2π) [sin ωt - sin ωt] = 0, a constant
Since E[X(t)] = a constant, the process X(t) is a stationary random process.
Example 3: Examine whether the Poisson process {X(t)}, given by the probability law P[X(t) = n] = e^{-λt}(λt)^n/n!, n = 0, 1, 2, ..., is stationary.
Solution
We know that the mean is given by
E[X(t)] = Σ_{n=0}^{∞} n Pn(t)
= Σ_{n=0}^{∞} n e^{-λt}(λt)^n/n!
= e^{-λt} Σ_{n=1}^{∞} (λt)^n/(n - 1)!
= λt e^{-λt} Σ_{n=1}^{∞} (λt)^{n-1}/(n - 1)!
= λt e^{-λt} [1 + λt/1! + (λt)²/2! + ...]
= λt e^{-λt} e^{λt}
= λt, which depends on t.
Hence the Poisson process is not a stationary process.
UNIT - 4
CORRELATION AND SPECTRAL DENSITY
Introduction
The power spectrum of a time series x(t) describes how the variance of the data x(t) is
distributed over the frequency components into which x(t) may be decomposed. This
distribution of the variance may be described either by a measure or by a statistical
cumulative distribution function S(f) = the power contributed by frequencies from 0 up to
f. Given a band of frequencies [a, b) the amount of variance contributed to x(t) by
frequencies lying within the interval [a,b) is given by S(b) - S(a). Then S is called the
spectral distribution function of x.
The spectral density at a frequency f gives the rate of variance contributed by
frequencies in the immediate neighbourhood of f to the variance of x per unit frequency.
PROPERTY: 1
The mean square value of the random process may be obtained from the auto correlation function R_XX(τ) by putting τ = 0:
E[X²(t)] = R_XX(0), which is known as the average power of the random process {X(t)}.
PROPERTY: 2
R_XX(τ) is an even function of τ:
R_XX(τ) = R_XX(-τ)
PROPERTY: 3
If the process X(t) contains a periodic component, then R_XX(τ) also contains a periodic component of the same period.
PROPERTY: 4
If a random process {X(t)} has no periodic components and E[X(t)] = X̄, then
lim_{|τ|→∞} R_XX(τ) = X̄²
i.e., as |τ| → ∞, the auto correlation function represents the square of the mean of the random process.
PROPERTY: 5
The auto correlation function of a random process cannot have an arbitrary shape.
Example 1
Check whether the following functions are valid auto correlation functions: (i) 5 sin nπτ (ii) 1/(1 + 9τ²).
Solution:
(i) Given R_XX(τ) = 5 sin nπτ
R_XX(-τ) = 5 sin(-nπτ) = -5 sin nπτ
Since R_XX(τ) ≠ R_XX(-τ), the given function is not an auto correlation function.
(ii) Given R_XX(τ) = 1/(1 + 9τ²)
R_XX(-τ) = 1/(1 + 9τ²) = R_XX(τ), so the given function is even and can be an auto correlation function.
Example 2
Find the mean and variance of a stationary random process whose auto correlation function is given by
R_XX(τ) = 18 + 2/(6 + τ²)
Solution
Given R_XX(τ) = 18 + 2/(6 + τ²)
X̄² = lim_{|τ|→∞} R_XX(τ)
= lim_{|τ|→∞} [18 + 2/(6 + τ²)] = 18 + 0 = 18
E[X(t)] = X̄ = √18
Var{X(t)} = E[X²(t)] - {E[X(t)]}²
We know that
E[X²(t)] = R_XX(0) = 18 + 2/6 = 55/3
Var{X(t)} = 55/3 - 18 = 1/3
Example 3
Express the auto correlation function of the process {X'(t)} in terms of the auto correlation function of the process {X(t)}.
Solution
Consider R_XX'(t1, t2) = E{X(t1) X'(t2)}
= E{X(t1) lim_{h→0} [X(t2 + h) - X(t2)]/h}
= lim_{h→0} E{[X(t1)X(t2 + h) - X(t1)X(t2)]/h}
= lim_{h→0} [R_XX(t1, t2 + h) - R_XX(t1, t2)]/h
R_XX'(t1, t2) = ∂R_XX(t1, t2)/∂t2   ... (1)
Similarly R_X'X'(t1, t2) = ∂R_XX'(t1, t2)/∂t1 = ∂²R_XX(t1, t2)/(∂t1 ∂t2), by (1).
Auto Covariance
The auto covariance of the process {X(t)}, denoted by C_XX(t1, t2) or C(t1, t2), is defined as
C_XX(t1, t2) = E{[X(t1) - E(X(t1))][X(t2) - E(X(t2))]}
CORRELATION COEFFICIENT
ρ_XX(t1, t2) = C_XX(t1, t2) / √(Var X(t1) × Var X(t2))
where C_XX(t1, t2) denotes the auto covariance.
CROSS CORRELATION
Cross correlation between the two random processes {X(t)} and {Y(t)} is defined as
R_XY(t1, t2) = E[X(t1) Y(t2)], where X(t1), Y(t2) are random variables.
CROSS COVARIANCE
Let {X(t)} and {Y(t)} be any two random processes. Then the cross covariance is defined as
C_XY(t1, t2) = E{[X(t1) - E(X(t1))][Y(t2) - E(Y(t2))]}
The relation between cross correlation and cross covariance is as follows:
C_XY(t1, t2) = R_XY(t1, t2) - E[X(t1)] E[Y(t2)]
Definition
Two random processes {X(t)} and {Y(t)} are said to be uncorrelated if
C_XY(t1, t2) = 0 for all t1, t2.
Hence from the above remark we have, for uncorrelated processes,
R_XY(t1, t2) = E[X(t1)] E[Y(t2)].
CROSS CORRELATION COEFFICIENT
ρ_XY(t1, t2) = C_XY(t1, t2) / √(Var X(t1) × Var Y(t2))
CROSS CORRELATION AND ITS PROPERTIES
Let {X(t)} and {Y(t)} be two random processes. Then the cross correlation between them is also defined as
R_XY(t, t + τ) = E[X(t) Y(t + τ)] = R_XY(τ)
PROPERTY: 1
R_XY(τ) = R_YX(-τ)
PROPERTY: 2
If {X(t)} and {Y(t)} are two random processes, then |R_XY(τ)| ≤ √(R_XX(0) R_YY(0)), where R_XX(τ) and R_YY(τ) are their respective auto correlation functions.
PROPERTY: 3
If {X(t)} and {Y(t)} are two random processes, then
|R_XY(τ)| ≤ (1/2)[R_XX(0) + R_YY(0)]
Example
Two random processes {X(t)} and {Y(t)} are given by X(t) = A cos(ωt + θ), Y(t) = A sin(ωt + θ), where A and ω are constants and θ is a uniform random variable over 0 to 2π. Find the cross correlation function.
Solution
By definition, we have R_XY(τ) = R_XY(t, t + τ)   ... (1)
Now, R_XY(t, t + τ) = E[X(t)·Y(t + τ)]
= E[A cos(ωt + θ)·A sin(ω(t + τ) + θ)]
= A² E[sin(ω(t + τ) + θ) cos(ωt + θ)]
= (A²/2) E[sin(2ωt + ωτ + 2θ) + sin ωτ]   [2 sin B cos C = sin(B + C) + sin(B - C)]
= (A²/2) {(1/2π) ∫_0^{2π} sin(2ωt + ωτ + 2θ) dθ + sin ωτ}
= (A²/2) {(1/2π) [-cos(2ωt + ωτ + 2θ)/2]_0^{2π} + sin ωτ}
= (A²/2) {0 + sin ωτ}   [cos(2ωt + ωτ + 4π) = cos(2ωt + ωτ)]
= (A²/2) sin ωτ   ... (3)
Substituting (3) in (1), we get
R_XY(t, t + τ) = (A²/2) sin ωτ
The Fourier transform of x(t) is
X(ω) = F[x(t)] = ∫_{-∞}^{∞} x(t) e^{-iωt} dt
Here X(ω) is called the "spectrum of x(t)".
Hence x(t) = inverse Fourier transform of X(ω)
= (1/2π) ∫_{-∞}^{∞} X(ω) e^{iωt} dω
Definition
The average power P(T) of x(t) over the interval (-T, T) is given by
P(T) = (1/2T) ∫_{-T}^{T} x²(t) dt
= (1/2π) ∫_{-∞}^{∞} (|X_T(ω)|²/2T) dω   ... (1)
Definition
The average power P_XX for the random process {X(t)} is given by
P_XX = lim_{T→∞} (1/2T) ∫_{-T}^{T} E[X²(t)] dt
= (1/2π) ∫_{-∞}^{∞} lim_{T→∞} (E|X_T(ω)|²/2T) dω   ... (2)
The power spectral density is
S_XX(ω) = ∫_{-∞}^{∞} R_XX(τ) e^{-iωτ} dτ
(or) S_XX(f) = ∫_{-∞}^{∞} R_XX(τ) e^{-i2πfτ} dτ
To find R_XX(τ) if S_XX(ω) or S_XX(f) is given:
R_XX(τ) = (1/2π) ∫_{-∞}^{∞} S_XX(ω) e^{iωτ} dω   [inverse Fourier transform of S_XX(ω)]
(or) R_XX(τ) = ∫_{-∞}^{∞} S_XX(f) e^{i2πfτ} df   [inverse Fourier transform of S_XX(f)]
The value of the spectral density function at zero frequency is equal to the total area under the graph of the auto correlation function:
S_XX(0) = ∫_{-∞}^{∞} R_XX(τ) dτ
TUTORIAL QUESTIONS
1. Find the ACF of {Y(t)} = A X(t) cos(ω0 t + θ), where X(t) is a zero mean stationary random process with ACF R_XX(τ), A and ω0 are constants, and θ is uniformly distributed over (0, 2π) and independent of X(t).
2. Find the ACF of the periodic time function X(t) = A sin ωt.
3. If X(t) is a WSS process and if Y(t) = X(t + a) - X(t - a), prove that R_YY(τ) = 2R_XX(τ) - R_XX(τ + 2a) - R_XX(τ - 2a).
4. If X(t) = A sin(ωt + θ), where A and ω are constants and θ is a random variable uniformly distributed over (-π, π), find the ACF of {Y(t)}, where Y(t) = X²(t).
5. Let X(t) and Y(t) be defined by X(t) = A cos ωt + B sin ωt and Y(t) = B cos ωt - A sin ωt, where ω is a constant and A and B are independent random variables both having zero mean and variance σ². Find the cross correlation of X(t) and Y(t). Are X(t) and Y(t) jointly W.S.S. processes?
6. Two random processes X(t) and Y(t) are given by X(t) = A cos(ωt + θ), Y(t) = A sin(ωt + θ), where A and ω are constants and θ is uniformly distributed over (0, 2π). Find the cross correlation of X(t) and Y(t) and verify that R_XY(-τ) = R_YX(τ).
7. If U(t) = X cos t + Y sin t and V(t) = Y cos t + X sin t, where X and Y are independent random variables such that E(X) = 0 = E(Y), E[X²] = E[Y²] = 1, show that U(t) and V(t) are not jointly W.S.S. but are individually stationary in the wide sense.
8. Random processes X(t) and Y(t) are defined by X(t) = A cos(ωt + θ), Y(t) = B cos(ωt + θ), where A, B and ω are constants and θ is uniformly distributed over (0, 2π). Find the cross correlation and show that X(t) and Y(t) are jointly W.S.S.
WORKEDOUT EXAMPLES
Example 1. Check whether the following functions are valid auto correlation functions: (i) 5 sin nπτ (ii) 1/(1 + 9τ²).
Solution:
(i) Given R_XX(τ) = 5 sin nπτ
R_XX(-τ) = 5 sin(-nπτ) = -5 sin nπτ
Since R_XX(τ) ≠ R_XX(-τ), the given function is not an auto correlation function.
(ii) Given R_XX(τ) = 1/(1 + 9τ²)
R_XX(-τ) = 1/(1 + 9τ²) = R_XX(τ), so the given function is even and can be an auto correlation function.
Example 2
Find the mean and variance of a stationary random process whose auto correlation function is given by
R_XX(τ) = 18 + 2/(6 + τ²)
Solution
Given R_XX(τ) = 18 + 2/(6 + τ²)
X̄² = lim_{|τ|→∞} R_XX(τ)
= lim_{|τ|→∞} [18 + 2/(6 + τ²)] = 18 + 0 = 18
E[X(t)] = X̄ = √18
Var{X(t)} = E[X²(t)] - {E[X(t)]}²
We know that
E[X²(t)] = R_XX(0) = 18 + 2/6 = 55/3
Var{X(t)} = 55/3 - 18 = 1/3
Example 3
Express the auto correlation function of the process {X'(t)} in terms of the auto correlation function of the process {X(t)}.
Solution
Consider R_XX'(t1, t2) = E{X(t1) X'(t2)}
= E{X(t1) lim_{h→0} [X(t2 + h) - X(t2)]/h}
= lim_{h→0} E{[X(t1)X(t2 + h) - X(t1)X(t2)]/h}
= lim_{h→0} [R_XX(t1, t2 + h) - R_XX(t1, t2)]/h
R_XX'(t1, t2) = ∂R_XX(t1, t2)/∂t2   ... (1)
Similarly R_X'X'(t1, t2) = ∂R_XX'(t1, t2)/∂t1 = ∂²R_XX(t1, t2)/(∂t1 ∂t2), by (1).
Example 4
Two random processes {X(t)} and {Y(t)} are given by
X(t) = A cos(ωt + θ), Y(t) = A sin(ωt + θ), where A and ω are constants and θ is a uniform random variable over 0 to 2π. Find the cross correlation function.
Solution
By definition, we have R_XY(τ) = R_XY(t, t + τ)   ... (1)
Now, R_XY(t, t + τ) = E[X(t)·Y(t + τ)]
= E[A cos(ωt + θ)·A sin(ω(t + τ) + θ)]
= A² E[sin(ω(t + τ) + θ) cos(ωt + θ)]
= (A²/2) E[sin(2ωt + ωτ + 2θ) + sin ωτ]   [2 sin B cos C = sin(B + C) + sin(B - C)]
= (A²/2) {(1/2π) ∫_0^{2π} sin(2ωt + ωτ + 2θ) dθ + sin ωτ}
= (A²/2) {(1/2π) [-cos(2ωt + ωτ + 2θ)/2]_0^{2π} + sin ωτ}
= (A²/2) {0 + sin ωτ}   [cos(2ωt + ωτ + 4π) = cos(2ωt + ωτ)]
= (A²/2) sin ωτ   ... (3)
Substituting (3) in (1), we get
R_XY(t, t + τ) = (A²/2) sin ωτ
UNIT – 5
LINEAR SYSTEMS WITH RANDOM INPUTS
[Figure: (a) shows a general single input - output linear system; (b) shows a linear time invariant (LTI) system with impulse response h(t), input X(t) and output Y(t).]
For an LTI system the output is the convolution of the input with the impulse response:
Y(t) = ∫_{-∞}^{∞} h(t - u) X(u) du
UNIT IMPULSE RESPONSE OF THE SYSTEM
If the input of the system is the unit impulse function, then the output or response is the system weighting function:
Y(t) = h(t)
which is the system weight function.
5.4.1 PROPERTIES OF LINEAR SYSTEMS WITH RANDOM INPUT
Property 1:
If the input X(t) and its output Y(t) are related by Y(t) = ∫_{-∞}^{∞} h(u) X(t - u) du, then the system is a linear time invariant system.
Property 3:
R_XY(τ) = R_XX(τ) * h(-τ)
Property 4:
R_YY(τ) = R_XY(τ) * h(τ)
Property 5:
R_YY(τ) = R_XX(τ) * h(-τ) * h(τ)
Property 7:
S_YY(ω) = S_XX(ω) |H(ω)|²
Note:
Instead of taking R_XY(τ) = E[X(t) Y(t + τ)] in properties (3), (4) & (5), if we start with R_XY(τ) = E[X(t + τ) Y(t)], then the above properties can also be stated as
a) R_XY(τ) = R_XX(τ) * h(τ)
b) R_YY(τ) = R_XY(τ) * h(-τ)
c) R_YY(τ) = R_XX(τ) * h(τ) * h(-τ)
REMARK:
(i) We have written |H(ω)|² = H(ω) H*(ω) because
H(ω) = F[h(t)] and H*(ω) = F[h(-t)], so H(ω) F[h(-t)] = H(ω) H*(ω) = |H(ω)|².
(ii) Equation (c) gives a relationship between the spectral densities of the input and output processes in the system.
(iii) System transfer function:
We call H(ω) = F[h(t)] the power transfer function or system transfer function.
SOLVED PROBLEMS ON AUTO AND CROSS CORRELATION FUNCTIONS OF INPUT AND OUTPUT
Example 5.4.1
Find the power spectral density of the random telegraph signal.
Solution
We know that the auto correlation of the telegraph signal process X(t) is
R_XX(τ) = e^{-2λ|τ|}
S_XX(ω) = ∫_{-∞}^{∞} R_XX(τ) e^{-iωτ} dτ
= ∫_{-∞}^{0} e^{2λτ} e^{-iωτ} dτ + ∫_{0}^{∞} e^{-2λτ} e^{-iωτ} dτ   [|τ| = -τ when τ < 0, |τ| = τ when τ > 0]
= [e^{(2λ - iω)τ}/(2λ - iω)]_{-∞}^{0} + [e^{-(2λ + iω)τ}/(-(2λ + iω))]_{0}^{∞}
= (1 - 0)/(2λ - iω) + (0 - 1)/(-(2λ + iω))
= 1/(2λ - iω) + 1/(2λ + iω)
= (2λ + iω + 2λ - iω)/((2λ - iω)(2λ + iω))
S_XX(ω) = 4λ/(4λ² + ω²)
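As a numerical cross-check, the Fourier integral of R_XX(τ) = e^{-2λ|τ|} can be approximated by a Riemann sum and compared with 4λ/(4λ² + ω²). A minimal sketch assuming NumPy is available (λ = 1 is an arbitrary choice; R is even, so the transform is real):

```python
import numpy as np

lam = 1.0
dt = 0.001
tau = np.arange(-40, 40, dt)
R = np.exp(-2 * lam * np.abs(tau))   # R_XX(tau) = e^(-2*lam*|tau|)

for w in (0.0, 1.0, 2.0, 5.0):
    S = np.sum(R * np.cos(w * tau)) * dt          # numeric Fourier integral
    exact = 4 * lam / (4 * lam**2 + w**2)
    print(w, round(S, 4), round(exact, 4))        # nearly equal
```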
Example 5.4.2
A linear time invariant system has impulse response h(t) = e^{-βt} U(t). Find the power spectral density of the output Y(t) corresponding to the input X(t).
Solution:
Given X(t) = input, Y(t) = output, so
S_YY(ω) = |H(ω)|² S_XX(ω)
Now H(ω) = ∫_{-∞}^{∞} h(t) e^{-iωt} dt
= ∫_{0}^{∞} e^{-βt} e^{-iωt} dt
= ∫_{0}^{∞} e^{-(β + iω)t} dt
= [e^{-(β + iω)t}/(-(β + iω))]_{0}^{∞}
= (0 - 1)/(-(β + iω))
= 1/(β + iω)
|H(ω)| = 1/√(β² + ω²), so |H(ω)|² = 1/(β² + ω²)
S_YY(ω) = S_XX(ω)/(β² + ω²)
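The transfer function magnitude can also be checked numerically by approximating the Fourier integral of h(t) = e^{-βt}U(t). A sketch assuming NumPy is available (β = 2 is an arbitrary illustrative value):

```python
import numpy as np

beta = 2.0
dt = 0.0005
t = np.arange(0, 20, dt)
h = np.exp(-beta * t)                          # h(t) = e^(-beta t) U(t)

for w in (0.0, 1.0, 3.0):
    H = np.sum(h * np.exp(-1j * w * t)) * dt   # H(w) = integral of h(t) e^(-iwt) dt
    print(w, round(abs(H) ** 2, 4), round(1 / (beta**2 + w**2), 4))  # |H|^2 vs 1/(beta^2+w^2)
```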
TUTORIAL QUESTIONS
is a real positive constant. Find the ACF of the network’s response Y(t). Find the cross
correlation .
WORKEDOUT EXAMPLES
Example 1
Find the power spectral density of the random telegraph signal.
Solution
We know that the auto correlation of the telegraph signal process X(t) is
R_XX(τ) = e^{-2λ|τ|}
S_XX(ω) = ∫_{-∞}^{∞} R_XX(τ) e^{-iωτ} dτ
= ∫_{-∞}^{0} e^{2λτ} e^{-iωτ} dτ + ∫_{0}^{∞} e^{-2λτ} e^{-iωτ} dτ   [|τ| = -τ when τ < 0, |τ| = τ when τ > 0]
= [e^{(2λ - iω)τ}/(2λ - iω)]_{-∞}^{0} + [e^{-(2λ + iω)τ}/(-(2λ + iω))]_{0}^{∞}
= 1/(2λ - iω) + 1/(2λ + iω)
= (2λ + iω + 2λ - iω)/((2λ - iω)(2λ + iω))
S_XX(ω) = 4λ/(4λ² + ω²)