
RANDOM PROCESSES

In practical problems we deal with time-varying waveforms whose value at any given time is random in nature. For example, a speech waveform, the signal received by a communication receiver, or the daily record of stock-market data all represent random variables that change with time. How do we characterize such data? Such data are characterized as random or stochastic processes. This lecture covers the fundamentals of random processes.
Random processes
Recall that a random variable maps each sample point in the sample space to a point on the real line. A random process maps each sample point to a waveform.
Consider a probability space $\{S, \mathcal{F}, P\}$. A random process can be defined on $\{S, \mathcal{F}, P\}$ as an indexed family of random variables $\{X(s,t),\ s \in S,\ t \in T\}$ where $T$ is an index set, which may be discrete or continuous, usually denoting time. Thus a random process is a function of the sample point $s$ and the index variable $t$ and may be written as $X(t, s)$.
Remark

- For a fixed $t\ (= t_0)$, $X(t_0, s)$ is a random variable.
- For a fixed $s\ (= s_0)$, $X(t, s_0)$ is a single realization of the random process and is a deterministic function of $t$.
- For a fixed $s\ (= s_0)$ and a fixed $t\ (= t_0)$, $X(t_0, s_0)$ is a single number.
- When both $t$ and $s$ vary we have the random process $X(t, s)$.

The random process $\{X(s,t),\ s \in S,\ t \in T\}$ is normally denoted by $\{X(t)\}$. The following figure illustrates a random process.

Figure: A random process. Sample points $s_1, s_2, s_3 \in S$ map to the waveforms $X(t, s_1)$, $X(t, s_2)$ and $X(t, s_3)$.

Example Consider a sinusoidal signal $X(t) = A\cos\omega t$ where $A$ is a binary random variable with probability mass function $p_A(1) = p$ and $p_A(-1) = 1 - p$.
Clearly, $\{X(t),\ t \in T\}$ is a random process with two possible realizations $X_1(t) = \cos\omega t$ and $X_2(t) = -\cos\omega t$. At a particular time $t_0$, $X(t_0)$ is a random variable with the two values $\cos\omega t_0$ and $-\cos\omega t_0$.
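As a quick illustration, the following Python sketch draws the random amplitude $A \in \{+1, -1\}$, forms one realization of $X(t) = A\cos\omega t$, and checks empirically that at a fixed time $t_0$ the random variable $X(t_0)$ takes only the two values $\pm\cos\omega t_0$. The values $p = 0.6$, $\omega = 2\pi$, $t_0 = 0.3$ and the time grid are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

p = 0.6                      # assumed P(A = +1)
omega = 2 * np.pi            # assumed angular frequency
t = np.linspace(0, 2, 500)   # time grid for one realization

# Draw the random amplitude A in {+1, -1} and form one realization of X(t)
A = rng.choice([1, -1], p=[p, 1 - p])
x_realization = A * np.cos(omega * t)
print("first few samples of one realization:", x_realization[:4])

# At a fixed time t0, X(t0) is a random variable taking the two values
# +cos(omega*t0) and -cos(omega*t0) with probabilities p and 1 - p.
t0 = 0.3
samples = rng.choice([1, -1], size=10000, p=[p, 1 - p]) * np.cos(omega * t0)
print("values taken by X(t0):", np.unique(samples))
print("estimated P(X(t0) = cos(omega*t0)):", np.mean(samples == np.cos(omega * t0)))
```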
Continuous-time vs. discrete-time process
If the index set $T$ is continuous, $\{X(t),\ t \in T\}$ is called a continuous-time process.
Example Suppose $X(t) = A\cos(w_0 t + \phi)$ where $A$ and $w_0$ are constants and $\phi$ is uniformly distributed between 0 and $2\pi$. $X(t)$ is an example of a continuous-time process.
Four realizations of the process are illustrated below.
Figure: Four realizations of $X(t)$, corresponding to the phase values 0.8373, 0.9320, 1.6924 and 1.8636.
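Such realizations can also be generated numerically, as in the sketch below. The constants $A = 1$, $w_0 = 2\pi$, the number of realizations and the time grid are assumed for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

A, w0 = 1.0, 2 * np.pi       # assumed constants
t = np.linspace(0, 2, 500)   # time grid

# Each realization corresponds to one draw of the phase, uniform on [0, 2*pi)
phases = rng.uniform(0, 2 * np.pi, size=4)
realizations = [A * np.cos(w0 * t + phi) for phi in phases]

for phi, x in zip(phases, realizations):
    print(f"phase = {phi:.4f}, first few samples: {x[:3]}")
```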

If the index set is a countable set, $\{X(t),\ t \in T\}$ is called a discrete-time process.
Such a random process can be represented as $\{X[n],\ n \in \mathbb{Z}\}$ and is called a random sequence. Sometimes the notation $\{X_n,\ n \geq 0\}$ is used to describe a random sequence indexed by the set of non-negative integers.
We can define a discrete-time random process on discrete points of time. In particular, we can obtain a discrete-time random process $\{X[n],\ n \in \mathbb{Z}\}$ by sampling a continuous-time process $X(t)$ at a uniform interval $T$ such that $X[n] = X(nT)$.
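A minimal sketch of this sampling idea, using the random-phase sinusoid of the previous example; the amplitude, frequency and sampling interval below are assumed values.

```python
import numpy as np

rng = np.random.default_rng(2)

A, w0, T = 1.0, 2 * np.pi, 0.05   # assumed amplitude, frequency, sampling interval
phi = rng.uniform(0, 2 * np.pi)   # one realization of the random phase

def X(t):
    """One realization of the continuous-time process X(t) = A cos(w0*t + phi)."""
    return A * np.cos(w0 * t + phi)

# Uniform sampling X[n] = X(nT) gives a realization of the discrete-time process
n = np.arange(40)
X_n = X(n * T)
print(X_n[:5])
```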
The discrete-time random process is more important in practical implementations; advanced statistical signal processing techniques have been developed to process such signals.
Example Suppose $X_n = 2\cos(\omega_0 n + Y)$ where $\omega_0$ is a constant and $Y$ is a random variable uniformly distributed between $-\pi$ and $\pi$.
$X_n$ is an example of a discrete-time process.

Figure: Realizations of $X_n$ for three values of $Y$ (0.4623, 1.9003, 0.9720).
Continuous-state vs. discrete-state process:
The value of a random process $X(t)$ at any time $t$ can be described from its probabilistic model.

The state is the value taken by $X(t)$ at a time $t$, and the set of all such states is called the state space. A random process is discrete-state if the state space is finite or countable; this also means that the corresponding sample space is finite or countable. Otherwise the random process is called a continuous-state process.
Example Consider the random sequence $\{X_n,\ n \geq 0\}$ generated by repeated tossing of a fair coin where we assign 1 to Head and 0 to Tail.
Clearly $X_n$ can take only two values, 0 and 1. Hence $\{X_n,\ n \geq 0\}$ is a discrete-time two-state process.

How to describe a random process?


As we have observed above, $X(t)$ at a specific time $t$ is a random variable and can be described by its probability distribution function $F_{X(t)}(x) = P(X(t) \leq x)$. This distribution function is called the first-order probability distribution function. We can similarly define the first-order probability density function

$$f_{X(t)}(x) = \frac{dF_{X(t)}(x)}{dx}$$

To describe $\{X(t),\ t \in T\}$ we have to use the joint distribution function of the random variables at all possible values of $t$. For any positive integer $n$, $X(t_1), X(t_2), \ldots, X(t_n)$ represent $n$ jointly distributed random variables. Thus a random process $\{X(t),\ t \in T\}$ can be described by specifying the $n$-th order joint distribution function

$$F_{X(t_1), X(t_2), \ldots, X(t_n)}(x_1, x_2, \ldots, x_n) = P\big(X(t_1) \leq x_1, X(t_2) \leq x_2, \ldots, X(t_n) \leq x_n\big), \quad \forall n \geq 1 \text{ and } \forall t_1, t_2, \ldots, t_n \in T$$

or the $n$-th order joint density function

$$f_{X(t_1), X(t_2), \ldots, X(t_n)}(x_1, x_2, \ldots, x_n) = \frac{\partial^n F_{X(t_1), X(t_2), \ldots, X(t_n)}(x_1, x_2, \ldots, x_n)}{\partial x_1\, \partial x_2 \cdots \partial x_n}$$

If $\{X(t),\ t \in T\}$ is a discrete-state random process, then it can also be specified by the collection of $n$-th order joint probability mass functions

$$p_{X(t_1), X(t_2), \ldots, X(t_n)}(x_1, x_2, \ldots, x_n) = P\big(X(t_1) = x_1, X(t_2) = x_2, \ldots, X(t_n) = x_n\big), \quad \forall n \geq 1 \text{ and } \forall t_1, t_2, \ldots, t_n \in T$$

If the random process is continuous-state, it can be specified by the collection of $n$-th order joint probability density functions.
Moments of a random process
We have defined the moments of a random variable and the joint moments of random variables. We can define all the possible moments and joint moments of a random process $\{X(t),\ t \in T\}$. In particular, the following moments are important.

$\mu_X(t) =$ mean of the random process at $t$ $= E(X(t))$

$R_X(t_1, t_2) =$ autocorrelation function of the process at times $t_1, t_2$ $= E(X(t_1)X(t_2))$

Note that
$R_X(t_1, t_2) = R_X(t_2, t_1)$ and
$R_X(t, t) = E X^2(t) =$ second moment or mean-square value at time $t$.

The autocovariance function $C_X(t_1, t_2)$ of the random process at times $t_1$ and $t_2$ is defined by

$$C_X(t_1, t_2) = E\big(X(t_1) - \mu_X(t_1)\big)\big(X(t_2) - \mu_X(t_2)\big) = R_X(t_1, t_2) - \mu_X(t_1)\mu_X(t_2)$$

$C_X(t, t) = E\big(X(t) - \mu_X(t)\big)^2 =$ variance of the process at time $t$.
These moments give partial information about the process.
The ratio

$$\rho_X(t_1, t_2) = \frac{C_X(t_1, t_2)}{\sqrt{C_X(t_1, t_1)\, C_X(t_2, t_2)}}$$

is called the correlation coefficient.

The autocorrelation function and the autocovariance function are widely used to characterize a class of random processes called wide-sense stationary processes.


We can also define higher-order moments, e.g.,

$$R_X(t_1, t_2, t_3) = E\big(X(t_1)X(t_2)X(t_3)\big) = \text{triple correlation function at } t_1, t_2, t_3,$$

and so on. The above definitions are easily extended to a random sequence $\{X_n,\ n \geq 0\}$.
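The sketch below illustrates how these ensemble moments can be estimated numerically, using the earlier binary-amplitude sinusoid $X(t) = A\cos w_0 t$ as the test process; the parameter values ($p = 0.6$, $w_0 = 2\pi$, 5000 realizations) are assumptions for illustration. For this process $\mu_X(t) = (2p-1)\cos w_0 t$ and $R_X(t_1, t_2) = \cos w_0 t_1 \cos w_0 t_2$, which the estimates should approach.

```python
import numpy as np

rng = np.random.default_rng(3)

p, w0 = 0.6, 2 * np.pi                             # assumed parameters
t = np.linspace(0, 1, 100)
A = rng.choice([1, -1], size=5000, p=[p, 1 - p])   # one amplitude per realization

# Ensemble of realizations of X(t) = A cos(w0 t): rows = realizations, cols = time
X = A[:, None] * np.cos(w0 * t[None, :])

mu_est = X.mean(axis=0)                    # estimate of mu_X(t) = E X(t)
R_est = X.T @ X / X.shape[0]               # estimate of R_X(t1, t2) = E[X(t1) X(t2)]
C_est = R_est - np.outer(mu_est, mu_est)   # estimate of C_X(t1, t2)

# Theory for this process: mu_X(t) = (2p-1) cos(w0 t), R_X(t1,t2) = cos(w0 t1) cos(w0 t2)
print(np.max(np.abs(mu_est - (2 * p - 1) * np.cos(w0 * t))))
print(np.max(np.abs(R_est - np.outer(np.cos(w0 * t), np.cos(w0 * t)))))
```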


Example
(a) Gaussian Random Process
For any positive integer $n$, $X(t_1), X(t_2), \ldots, X(t_n)$ represent $n$ jointly distributed random variables. These random variables define a random vector $\mathbf{X} = [X(t_1), X(t_2), \ldots, X(t_n)]'$. The process $X(t)$ is called Gaussian if the random vector $[X(t_1), X(t_2), \ldots, X(t_n)]'$ is jointly Gaussian with the joint density function given by

$$f_{X(t_1), X(t_2), \ldots, X(t_n)}(x_1, x_2, \ldots, x_n) = \frac{1}{\sqrt{(2\pi)^n \det(\mathbf{C}_X)}}\, e^{-\frac{1}{2}(\mathbf{x} - \boldsymbol{\mu}_X)' \mathbf{C}_X^{-1} (\mathbf{x} - \boldsymbol{\mu}_X)}$$

where $\mathbf{C}_X = E(\mathbf{X} - \boldsymbol{\mu}_X)(\mathbf{X} - \boldsymbol{\mu}_X)'$ and $\boldsymbol{\mu}_X = E(\mathbf{X}) = [E(X_1), E(X_2), \ldots, E(X_n)]'$.


The Gaussian random process is thus completely specified by the mean vector $\boldsymbol{\mu}_X$ and the autocovariance matrix $\mathbf{C}_X$, or equivalently by the mean vector $\boldsymbol{\mu}_X$ and the autocorrelation matrix $\mathbf{R}_X = E(\mathbf{X}\mathbf{X}')$.
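As an illustration, a Gaussian random process can be simulated on a finite set of time points by drawing from the corresponding multivariate normal distribution. The sketch below assumes a zero mean and an exponential autocovariance $C_X(t_1, t_2) = e^{-|t_1 - t_2|}$; these choices are illustrative, not taken from the text.

```python
import numpy as np

rng = np.random.default_rng(4)

# Assumed example: zero-mean Gaussian process with covariance exp(-|t1 - t2|)
t = np.linspace(0, 5, 50)
mu = np.zeros_like(t)                              # mean vector mu_X
C = np.exp(-np.abs(t[:, None] - t[None, :]))       # autocovariance matrix C_X

# Because the process is Gaussian, the vector [X(t1), ..., X(tn)]' is jointly
# Gaussian, so realizations can be drawn directly from the multivariate normal.
realizations = rng.multivariate_normal(mu, C, size=3)
print(realizations.shape)   # (3, 50): three realizations on the 50 time points
```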

(b) Bernoulli Random Process


A Bernoulli process is a discrete-time random process consisting of a sequence of independent and identically distributed Bernoulli random variables. Thus the discrete-time random process $\{X_n,\ n \geq 0\}$ is a Bernoulli process if

$P\{X_n = 1\} = p$ and
$P\{X_n = 0\} = 1 - p$
Example
Consider the random sequence $\{X_n,\ n \geq 0\}$ generated by repeated tossing of a fair coin where we assign 1 to Head and 0 to Tail. Here $\{X_n,\ n \geq 0\}$ is a Bernoulli process where each random variable $X_n$ is a Bernoulli random variable with

$p_X(1) = P\{X_n = 1\} = \dfrac{1}{2}$ and
$p_X(0) = P\{X_n = 0\} = \dfrac{1}{2}$
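A minimal simulation of this Bernoulli (coin-tossing) process, assuming $N = 20$ tosses:

```python
import numpy as np

rng = np.random.default_rng(5)

p, N = 0.5, 20                    # fair coin: P(X_n = 1) = p; N tosses
X = rng.binomial(1, p, size=N)    # i.i.d. Bernoulli(p) sequence, one realization

print(X)                          # e.g. [1 0 0 1 ...]: 1 = Head, 0 = Tail
print("empirical P(X_n = 1):", X.mean())
```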

(c) A sinusoid with a random phase


$X(t) = A\cos(w_0 t + \phi)$ where $A$ and $w_0$ are constants and $\phi$ is uniformly distributed between 0 and $2\pi$. Thus

$$f_\Phi(\phi) = \frac{1}{2\pi}, \quad 0 \leq \phi \leq 2\pi$$

$X(t)$ at a particular $t$ is a random variable and it can be shown that

$$f_{X(t)}(x) = \begin{cases} \dfrac{1}{\pi\sqrt{A^2 - x^2}} & |x| < A \\ 0 & \text{otherwise} \end{cases}$$
The pdf is sketched in the Fig. below:
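A quick numerical check of this density (a sketch assuming $A = 1$, $w_0 = 2\pi$ and a fixed time $t_0 = 0.3$): sample the phase, form $X(t_0)$, and compare a histogram estimate with $1/(\pi\sqrt{A^2 - x^2})$.

```python
import numpy as np

rng = np.random.default_rng(6)

A, w0, t0 = 1.0, 2 * np.pi, 0.3           # assumed constants and a fixed time t0
phi = rng.uniform(0, 2 * np.pi, 200000)   # samples of the random phase
x = A * np.cos(w0 * t0 + phi)             # samples of the random variable X(t0)

# Histogram estimate of the density on (-0.95A, 0.95A), avoiding the edges where
# the density blows up; normalize by the total sample size and the bin width.
counts, edges = np.histogram(x, bins=50, range=(-0.95 * A, 0.95 * A))
width = edges[1] - edges[0]
pdf_est = counts / (x.size * width)
centers = 0.5 * (edges[:-1] + edges[1:])
theory = 1.0 / (np.pi * np.sqrt(A**2 - centers**2))
print("max relative error:", np.max(np.abs(pdf_est - theory) / theory))
```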


The mean and autocorrelation of $X(t)$:

$$\mu_X(t) = E\,X(t) = E\,A\cos(w_0 t + \phi) = A \int_0^{2\pi} \cos(w_0 t + \phi)\, \frac{1}{2\pi}\, d\phi = 0$$

$$\begin{aligned}
R_X(t_1, t_2) &= E\,[A\cos(w_0 t_1 + \phi)\, A\cos(w_0 t_2 + \phi)] \\
&= A^2\, E\,[\cos(w_0 t_1 + \phi)\cos(w_0 t_2 + \phi)] \\
&= \frac{A^2}{2}\, E\,[\cos(w_0(t_1 - t_2)) + \cos(w_0(t_1 + t_2) + 2\phi)] \\
&= \frac{A^2}{2}\cos(w_0(t_1 - t_2)) + \frac{A^2}{2}\int_0^{2\pi} \cos(w_0(t_1 + t_2) + 2\phi)\, \frac{1}{2\pi}\, d\phi \\
&= \frac{A^2}{2}\cos(w_0(t_1 - t_2))
\end{aligned}$$
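These results can be verified by Monte Carlo simulation; the sketch below assumes $A = 2$, $w_0 = 2\pi$ and two arbitrary time instants.

```python
import numpy as np

rng = np.random.default_rng(7)

A, w0 = 2.0, 2 * np.pi                      # assumed constants
t1, t2 = 0.2, 0.7                           # two fixed time instants
phi = rng.uniform(0, 2 * np.pi, 500000)     # samples of the random phase

x1 = A * np.cos(w0 * t1 + phi)              # samples of X(t1)
x2 = A * np.cos(w0 * t2 + phi)              # samples of X(t2)

print("E X(t1)              ~", x1.mean(), "(theory: 0)")
print("R_X(t1, t2)          ~", (x1 * x2).mean())
print("A^2/2 cos(w0(t1-t2)) =", A**2 / 2 * np.cos(w0 * (t1 - t2)))
```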

Two or More Random Processes


In practical situations we deal with two or more random processes; for example, the input and output processes of a system. To describe two or more random processes we have to use joint distribution functions and joint moments.

Consider two random processes $\{X(t),\ t \in T\}$ and $\{Y(t),\ t \in T\}$. For any positive integers $n$ and $m$, $X(t_1), X(t_2), \ldots, X(t_n), Y(t'_1), Y(t'_2), \ldots, Y(t'_m)$ represent $n + m$ jointly distributed random variables. Thus these two random processes can be described by the $(n+m)$-th order joint distribution function

$$F_{X(t_1), \ldots, X(t_n), Y(t'_1), \ldots, Y(t'_m)}(x_1, \ldots, x_n, y_1, \ldots, y_m) = P\big(X(t_1) \leq x_1, \ldots, X(t_n) \leq x_n,\ Y(t'_1) \leq y_1, \ldots, Y(t'_m) \leq y_m\big)$$


or the corresponding $(n+m)$-th order joint density function

$$f_{X(t_1), \ldots, X(t_n), Y(t'_1), \ldots, Y(t'_m)}(x_1, \ldots, x_n, y_1, \ldots, y_m) = \frac{\partial^{n+m} F_{X(t_1), \ldots, X(t_n), Y(t'_1), \ldots, Y(t'_m)}(x_1, \ldots, x_n, y_1, \ldots, y_m)}{\partial x_1 \cdots \partial x_n\, \partial y_1 \cdots \partial y_m}$$

Two random processes can be partially described by the joint moments:

Cross-correlation function of the processes at times $t_1, t_2$:
$$R_{XY}(t_1, t_2) = E\big(X(t_1)Y(t_2)\big)$$

Similarly,
$$R_{YX}(t_1, t_2) = E\big(Y(t_1)X(t_2)\big) = E\big(X(t_2)Y(t_1)\big) = R_{XY}(t_2, t_1)$$

Cross-covariance function of the processes at times $t_1, t_2$:
$$C_{XY}(t_1, t_2) = E\big(X(t_1) - \mu_X(t_1)\big)\big(Y(t_2) - \mu_Y(t_2)\big) = R_{XY}(t_1, t_2) - \mu_X(t_1)\mu_Y(t_2)$$
Cross-correlation coefficient:

$$\rho_{XY}(t_1, t_2) = \frac{C_{XY}(t_1, t_2)}{\sqrt{C_X(t_1, t_1)\, C_Y(t_2, t_2)}}$$
On the basis of the above definitions, we can study the degree of dependence between two random processes.

Independent processes: Two random processes $\{X(t),\ t \in T\}$ and $\{Y(t),\ t \in T\}$ are called independent if for each $t_1$ and $t_2$, the random variables $X(t_1)$ and $Y(t_2)$ are independent.

Uncorrelated processes: Two random processes $\{X(t),\ t \in T\}$ and $\{Y(t),\ t \in T\}$ are called uncorrelated if

$$C_{XY}(t_1, t_2) = 0 \quad \forall t_1, t_2$$

This also implies that for two such processes

$$R_{XY}(t_1, t_2) = \mu_X(t_1)\,\mu_Y(t_2)$$

Orthogonal processes: Two random processes $\{X(t),\ t \in T\}$ and $\{Y(t),\ t \in T\}$ are called orthogonal if

$$R_{XY}(t_1, t_2) = 0 \quad \forall t_1, t_2$$
Example Suppose $X(t) = A\cos(w_0 t + \phi_1)$ and $Y(t) = A\sin(w_0 t + \phi_2)$ where $A$ and $w_0$ are constants and $\phi_1$ and $\phi_2$ are independent random variables, each uniformly distributed between 0 and $2\pi$. Then

$$\begin{aligned}
R_{XY}(t_1, t_2) &= E\,X(t_1)Y(t_2) \\
&= E\,[A\cos(w_0 t_1 + \phi_1)\, A\sin(w_0 t_2 + \phi_2)] \\
&= E\,[A\cos(w_0 t_1 + \phi_1)]\; E\,[A\sin(w_0 t_2 + \phi_2)] \qquad (\because\ \phi_1 \text{ and } \phi_2 \text{ are independent}) \\
&= 0 \times 0 = 0
\end{aligned}$$

Therefore, the random processes $\{X(t),\ t \in T\}$ and $\{Y(t),\ t \in T\}$ are orthogonal.
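This conclusion can also be checked numerically; the sketch below assumes $A = 1$, $w_0 = 2\pi$ and two arbitrary time instants, and estimates $R_{XY}(t_1, t_2)$ by averaging over independent draws of $\phi_1$ and $\phi_2$.

```python
import numpy as np

rng = np.random.default_rng(8)

A, w0 = 1.0, 2 * np.pi                      # assumed constants
t1, t2 = 0.4, 1.1                           # two fixed time instants
N = 500000
phi1 = rng.uniform(0, 2 * np.pi, N)         # phase of X, independent of
phi2 = rng.uniform(0, 2 * np.pi, N)         # the phase of Y

x1 = A * np.cos(w0 * t1 + phi1)             # samples of X(t1)
y2 = A * np.sin(w0 * t2 + phi2)             # samples of Y(t2)

print("R_XY(t1, t2) ~", (x1 * y2).mean())   # close to 0: the processes are orthogonal
```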
