
Module 2
Random Processes

Version 2, ECE IIT, Kharagpur


Lesson 8
Stochastic Processes



After reading this lesson, you will learn about

• Stochastic processes
• Statistical averages
• Wide-sense stationary processes
• Complex-valued stochastic processes
• Power density spectrum of a stochastic process

• Many natural and man-made phenomena are random functions of time, e.g.,
i) fluctuations of air temperature or pressure, ii) thermal noise generated by the
Brownian motion of carriers in conductors and semiconductors, and iii) speech
signals.

• All these are examples of Stochastic Processes.

• An instantaneous sample value of a stochastic process is a random variable.

• A stochastic process, sometimes denoted by X(t), may be defined as an ensemble
of (time) sample functions.

Ex.: Consider several identical sources, each of which generates thermal noise; the output of each source is one sample function of the ensemble.

• Let us define Xti = X(t = ti), i = 1, 2, …, n (n is arbitrarily chosen) as the random
variables obtained by sampling the stochastic process X(t) at any set of time
instants t1 < t2 < … < tn. These 'n' random variables can then be characterized by
their joint pdf, i.e., p(xt1, xt2, …, xtn).

• Now, let us consider another set of random variables X(ti + τ), generated from the
same stochastic process X(t) with an arbitrary time shift τ:
Xt1+τ = X(t = t1 + τ), …, Xtn+τ = X(t = tn + τ),
with joint pdf p(xt1+τ, xt2+τ, …, xtn+τ).

In general, these two joint pdfs may not be the same. However, if they are the same
for all τ and any 'n', then the stochastic process X(t) is said to be stationary in the
strict sense, i.e.,
p(xt1, xt2, …, xtn) = p(xt1+τ, xt2+τ, …, xtn+τ) for all τ and all 'n'.

Strict-sense stationarity means that the statistics of a stochastic process are invariant
to any translation of the time axis. Finding a quick example of a physical random
process that is stationary in the strict sense may not be an easy task.

• If the joint pdfs differ for any τ or 'n', the stochastic process is non-stationary.



Statistical Averages
• As with random variables, we can define statistical averages, or ensemble averages,
for a stochastic process.

Let X(t) be a random process and Xti = X(t = ti). Then, the n-th moment of Xti is:

E[Xti^n] = ∫_{-∞}^{+∞} xti^n · p(xti) dxti        2.8.1

• For a stationary process, p ( xti +τ ) = p ( xti ) for all τ and hence the n-th moment is
independent of time.
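As an illustration of eq. 2.8.1, the n-th moment can be estimated numerically as an ensemble average over many sample functions. The sketch below (all parameter values are illustrative, not from the lesson) draws an ensemble of a zero-mean Gaussian process with variance 4 and checks that the estimated second moment is (nearly) the same at every time instant, as expected for a stationary process:

```python
import numpy as np

rng = np.random.default_rng(0)

# Ensemble of 10000 sample functions of a (hypothetical) stationary
# Gaussian process, sampled at 100 time instants.
n_ensemble, n_time = 10000, 100
X = rng.normal(loc=0.0, scale=2.0, size=(n_ensemble, n_time))

# 2nd moment E[Xti^2] estimated as an ensemble average (eq. 2.8.1
# with n = 2, the integral replaced by an empirical mean).
second_moment = (X**2).mean(axis=0)   # one estimate per time instant ti

# For a stationary process the moment does not depend on ti.
print(second_moment.min(), second_moment.max())  # both near sigma^2 = 4
```

The spread between the smallest and largest estimate is only sampling error; it shrinks as the ensemble size grows.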

• Let us consider two random variables Xti = X(ti), i = 1, 2. The correlation between
Xt1 and Xt2 is measured by their joint moment:

E[Xt1·Xt2] = ∫_{-∞}^{+∞} ∫_{-∞}^{+∞} xt1·xt2 · p(xt1, xt2) dxt1 dxt2        2.8.2

This joint moment is also known as the autocorrelation function, Φ(t1,t2) of the
stochastic process X(t). In general, Φ(t1,t2) is dependent on t1 and t2.

• However, for a strict-sense stationary process,

p(xt1, xt2) = p(xt1+T, xt2+T)

for any T, which means that the autocorrelation function Φ(t1, t2) of a stationary
process depends on (t2 − t1) rather than on t1 and t2 individually,
i.e.,
E[Xt1·Xt2] = Φ(t1, t2) = Φ(t2 − t1) = Φ(τ), say, where τ = t2 − t1        2.8.3
It may be seen that, Φ (−τ ) = Φ (τ ) and hence, Φ (τ ) is an even function.

• Note that hardly any physical stochastic process can be described as stationary in
the strict sense, while many such processes obey the following conditions:
a) the mean value E[X(t)] of the process is independent of time, and
b) Φ(t1, t2) = Φ(t1 − t2) over the time domain of interest.

Such stochastic processes are known as wide sense stationary. This is a less
stringent condition on stationarity.
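A standard concrete example of a wide-sense stationary process is a sinusoid with random phase. The sketch below (parameter values are illustrative) estimates the ensemble mean and autocorrelation at an arbitrary instant t1 and lag τ, and compares them with the known results E[X(t)] = 0 and Φ(τ) = (A²/2)·cos(2πf0τ):

```python
import numpy as np

rng = np.random.default_rng(1)

# Random-phase sinusoid X(t) = A*cos(2*pi*f0*t + Theta), Theta ~ U(0, 2*pi):
# a textbook wide-sense stationary process (A, f0, t1, tau are made-up values).
A, f0 = 1.0, 5.0
theta = rng.uniform(0, 2*np.pi, size=200000)   # one phase per sample function
t1, tau = 0.3, 0.07                            # arbitrary instant and lag

x_t1 = A*np.cos(2*np.pi*f0*t1 + theta)
x_t2 = A*np.cos(2*np.pi*f0*(t1 + tau) + theta)

mean_est = x_t1.mean()                         # ~0, independent of t1
phi_est = (x_t1*x_t2).mean()                   # ensemble estimate of Phi(tau)
phi_theory = (A**2/2)*np.cos(2*np.pi*f0*tau)   # known Phi(tau) for this process

print(mean_est, phi_est, phi_theory)
```

Repeating the experiment for other values of t1 leaves both estimates essentially unchanged, which is exactly conditions a) and b) above.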

• The auto-covariance function of a stochastic process is closely related to the
autocorrelation function:
μ(t1, t2) = E{[Xt1 − m(t1)]·[Xt2 − m(t2)]}
          = Φ(t1, t2) − m(t1)·m(t2)        2.8.4
Here, m(t1) = E[Xt1]. For a wide-sense stationary process,



μ(t1 − t2) = μ(t2 − t1) = μ(τ) = Φ(τ) − m²        2.8.5
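The relation μ = Φ − m·m in eq. 2.8.4/2.8.5 can be checked numerically. The sketch below (an assumed example: a nonzero-mean Gaussian ensemble with independent samples at two instants) estimates the autocorrelation and the auto-covariance separately and confirms they differ by m²:

```python
import numpy as np

rng = np.random.default_rng(2)

# Nonzero-mean white Gaussian ensemble (illustrative values): Xt = m + Wt.
m, sigma = 3.0, 1.5
X = m + rng.normal(0, sigma, size=(100000, 2))   # samples at instants t1, t2

phi = (X[:, 0]*X[:, 1]).mean()                   # autocorrelation Phi(t1, t2)
mu = ((X[:, 0] - m)*(X[:, 1] - m)).mean()        # auto-covariance mu(t1, t2)

# eq. 2.8.4: mu = Phi - m^2 (the two estimates agree up to sampling error)
print(phi - m**2, mu)
```

Since the two instants were drawn independently here, both sides come out near zero; for correlated samples they would agree at a nonzero value.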

• It is interesting to note that a Gaussian process is completely specified by its mean
and auto-covariance functions; hence, if a Gaussian process is wide-sense
stationary, it is also strict-sense stationary.

• Henceforth, when we discuss a stationary stochastic process, we will mean a
wide-sense stationary process.

Averages for joint stochastic processes

Let X(t) and Y(t) denote two stochastic processes, and let Xti = X(ti), i = 1, 2, …, n and
Yt'j = Y(t'j), j = 1, 2, …, m be the random variables at time instants t1 < t2 < … < tn and
t'1 < t'2 < … < t'm [n and m are arbitrary integers].

The two processes are statistically described by their joint pdf:

p(xt1, xt2, …, xtn, yt'1, yt'2, …, yt'm)

Now, the cross-correlation function of X(t) and Y(t) is defined as their joint moment:

Φxy(t1, t2) = E[Xt1·Yt2] = ∫_{-∞}^{+∞} ∫_{-∞}^{+∞} xt1·yt2 · p(xt1, yt2) dxt1 dyt2        2.8.6

The cross-covariance is defined as:

μxy(t1, t2) = Φxy(t1, t2) − mx(t1)·my(t2)        2.8.7

Here, mx(t1) = E[Xt1] and my(t2) = E[Yt2]. If X and Y are individually and jointly
stationary, we have:
Φxy(t1, t2) = Φxy(t1 − t2)
and        2.8.8
μxy(t1, t2) = μxy(t1 − t2)

In particular, Φxy(−τ) = E[Xt1·Yt1+τ] = E[Xt'1−τ·Yt'1] = Φyx(τ).

Now, two stochastic processes are said to be statistically independent if and only if

p(xt1, xt2, …, xtn, yt'1, yt'2, …, yt'm)
= p(xt1, xt2, …, xtn) · p(yt'1, yt'2, …, yt'm)        2.8.9

for all ti, t'j, n and m.

Two stochastic processes are said to be uncorrelated if

Φxy(t1, t2) = E[Xt1]·E[Yt2]. This implies μxy(t1, t2) = 0,        2.8.10

i.e., the processes have zero cross-covariance.
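To illustrate eqs. 2.8.7 and 2.8.10, the sketch below (means and variances are made-up values) generates two independently drawn ensembles. Independence implies Φxy(t1, t2) = E[Xt1]·E[Yt2], so the estimated cross-covariance comes out near zero:

```python
import numpy as np

rng = np.random.default_rng(3)

# Two independently generated ensembles (illustrative distributions);
# independence implies zero cross-covariance, eq. 2.8.10.
X = rng.normal(1.0, 1.0, size=200000)    # samples of X(t1), mean mx = 1
Y = rng.normal(-2.0, 0.5, size=200000)   # samples of Y(t2), mean my = -2

phi_xy = (X*Y).mean()                    # cross-correlation Phi_xy(t1, t2)
mu_xy = phi_xy - X.mean()*Y.mean()       # cross-covariance, eq. 2.8.7

print(phi_xy, mu_xy)   # phi_xy ~ mx*my = -2, mu_xy ~ 0
```

Note the converse does not hold in general: zero cross-covariance (uncorrelated) does not by itself imply statistical independence.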



Complex valued stochastic process
Let Z(t) = X(t) + jY(t), where X(t) and Y(t) are individual stochastic processes.
One can now define the joint pdf of Zti = Z(t = ti), i = 1, 2, …, n, as
p(xt1, xt2, …, xtn, yt1, yt2, …, ytn)

Now the autocorrelation function Φzz(t1, t2) of the complex process Z(t) is defined as:

Φzz(t1, t2) = (1/2)·E[Zt1·Z*t2] = (1/2)·E[(xt1 + jyt1)(xt2 − jyt2)]
            = (1/2)·[Φxx(t1, t2) + Φyy(t1, t2) + j{Φyx(t1, t2) − Φxy(t1, t2)}]        2.8.11

Here, Φxx(t1, t2) is the autocorrelation of X, Φyy(t1, t2) is the autocorrelation of Y, and
Φyx(t1, t2) and Φxy(t1, t2) are the cross-correlations of X and Y.

For a stationary Z(t), Φzz(t1, t2) = Φzz(τ) and Φ*zz(τ) = Φzz(−τ).        2.8.12
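Eq. 2.8.11 is an algebraic identity, so it can be verified exactly on any ensemble. The sketch below builds a complex process from two illustrative real Gaussian ensembles and checks that (1/2)·E[Zt1·Z*t2] equals the decomposition into Φxx, Φyy, Φyx and Φxy:

```python
import numpy as np

rng = np.random.default_rng(4)

# Complex process Z(t) = X(t) + jY(t) built from two real ensembles
# (an illustrative construction); verifies the expansion in eq. 2.8.11.
X = rng.normal(0, 1, size=(100000, 2))   # samples of X at t1, t2
Y = rng.normal(0, 1, size=(100000, 2))   # samples of Y at t1, t2
Z = X + 1j*Y

# Left-hand side: (1/2) E[Zt1 * conj(Zt2)]
phi_zz = 0.5*(Z[:, 0]*np.conj(Z[:, 1])).mean()

# Right-hand side: (1/2)[Phi_xx + Phi_yy + j(Phi_yx - Phi_xy)]
rhs = 0.5*((X[:, 0]*X[:, 1]).mean() + (Y[:, 0]*Y[:, 1]).mean()
           + 1j*((Y[:, 0]*X[:, 1]).mean() - (X[:, 0]*Y[:, 1]).mean()))

print(abs(phi_zz - rhs))   # ~0: both sides agree to machine precision
```

Because both sides are computed from the same samples, they match to floating-point precision regardless of the distributions chosen.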

Power Density Spectrum of a Stochastic Process

A stationary stochastic process is an infinite-energy signal and hence its Fourier
Transform does not exist. The spectral characteristics of a stochastic process are
instead obtained by computing the Fourier Transform of its autocorrelation function.

That is, the distribution of power with frequency is described as:

Φ(f) = ∫_{-∞}^{+∞} Φ(τ)·e^{−j2πfτ} dτ        2.8.13

The Inverse Fourier Transform relationship is:

Φ(τ) = ∫_{-∞}^{+∞} Φ(f)·e^{j2πfτ} df        2.8.14
Note that Φ(0) = ∫_{-∞}^{+∞} Φ(f) df = E[Xt²] ≥ 0.

∴Φ (0) represents the average power of the stochastic signal, which is the area under the
Φ ( f ) curve.
Hence, Φ ( f ) is called the power density spectrum of the stochastic process.

If X(t) is real, then Φ(τ) is real and even, and hence Φ(f) is also real and even.
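Eqs. 2.8.13 and 2.8.14 can be checked on a transform pair one can verify by hand. For the model autocorrelation Φ(τ) = e^{−|τ|} (an assumed example, not from the lesson), the power density spectrum is Φ(f) = 2/(1 + (2πf)²), and the area under Φ(f) should recover the average power Φ(0) = 1:

```python
import numpy as np

# Known Fourier-transform pair (assumed example, not from the lesson):
# Phi(tau) = exp(-|tau|)  <-->  Phi(f) = 2 / (1 + (2*pi*f)^2)
f = np.linspace(-200.0, 200.0, 400001)   # fine grid over a wide band
df = f[1] - f[0]
phi_f = 2.0 / (1.0 + (2.0*np.pi*f)**2)   # power density spectrum, real and even

# Average power = area under Phi(f), which should equal Phi(tau = 0) = 1
avg_power = phi_f.sum() * df
print(avg_power)                          # close to 1
```

The small residual error comes from truncating the integration band and from the Riemann-sum approximation; widening the band and refining the grid shrinks it further.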



Problems
Q2.8.1) Define and explain with an example a “wide sense stationary stochastic
process”.

Q2.8.2) What is the condition for two stochastic processes to be uncorrelated?

Q2.8.3) How to verify whether two stochastic processes are statistically independent
of each other?

