In practical problems we deal with time-varying waveforms whose value at any instant is random in nature. For example, a speech waveform, the signal received by a communication receiver, or the daily record of stock-market data represent random quantities that change with time. How do we characterize such data? Such data are modelled as random or stochastic processes. This lecture covers the fundamentals of random processes.
Random processes
Recall that a random variable maps each sample point in the sample space to a point on the real line. A random process maps each sample point to a waveform.
Consider a probability space $\{S, \mathcal{F}, P\}$. A random process defined on $\{S, \mathcal{F}, P\}$ is an indexed family of random variables $\{X(s,t),\ s \in S,\ t \in \Gamma\}$, where $\Gamma$ is an index set, discrete or continuous, usually denoting time. Thus a random process is a function of the sample point $s$ and the index variable $t$, and may be written as $X(t,s)$.
Remark
[Figure: the sample points $s_1, s_2, s_3 \in S$ are mapped to the waveforms $X(t, s_1)$, $X(t, s_2)$, $X(t, s_3)$.]
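To make this mapping concrete, here is a minimal simulation sketch in Python/NumPy. The process, parameter values, and function names are illustrative choices made for this note, not part of the lecture: each call to `sample_path` corresponds to picking a sample point $s$, and the returned array is the waveform $X(t, s)$.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 101)            # index set: a discretized time axis

def sample_path(rng, t):
    """Pick one sample point s and return the waveform X(t, s).
    Illustrative process: a sinusoid with random amplitude and phase plus noise."""
    a = rng.normal(1.0, 0.2)              # random amplitude for this outcome
    phi = rng.uniform(0.0, 2 * np.pi)     # random phase for this outcome
    noise = rng.normal(0.0, 0.1, size=t.shape)
    return a * np.cos(2 * np.pi * 5 * t + phi) + noise

# Three sample points s1, s2, s3 give three different waveforms X(t,s1), X(t,s2), X(t,s3)
paths = [sample_path(rng, t) for _ in range(3)]
```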
Continuous-state vs. discrete-state process:
The value of a random process $X(t)$ at any time $t$ can be described from its probabilistic model.
The state is the value taken by $X(t)$ at a time $t$, and the set of all such states is called the state space. A random process is discrete-state if the state space is finite or countable; this also means that the corresponding sample space is finite or countable. Otherwise the random process is called continuous-state.
Example: Consider the random sequence $\{X_n, n \ge 0\}$ generated by repeated tossing of a fair coin, where we assign 1 to Head and 0 to Tail.
Clearly $X_n$ can take only two values, 0 and 1. Hence $\{X_n, n \ge 0\}$ is a discrete-time, two-state process.
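A minimal sketch of this coin-tossing sequence (NumPy assumed; the variable names and seed are illustrative): each realization is a string of 0s and 1s, so the state space is $\{0, 1\}$.

```python
import numpy as np

rng = np.random.default_rng(1)

# X_n, n >= 0: assign 1 to Head and 0 to Tail; each toss is fair and independent
n_tosses = 20
x = rng.integers(0, 2, size=n_tosses)     # each X_n is drawn from the state space {0, 1}
print(x)                                  # e.g. [0 1 1 0 ...]
```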
A random process can be described through its distribution functions. The first-order distribution function of $X(t)$ is $F_{X(t)}(x) = P(X(t) \le x)$, with the corresponding first-order density function
$$f_{X(t)}(x) = \frac{dF_{X(t)}(x)}{dx}$$
at all possible values of $t$. For any positive integer $n$, the random process $\{X(t), t \in \Gamma\}$ can thus be described by specifying the $n$-th order joint distribution function
$$F_{X(t_1), X(t_2), \ldots, X(t_n)}(x_1, x_2, \ldots, x_n) = P(X(t_1) \le x_1, X(t_2) \le x_2, \ldots, X(t_n) \le x_n)$$
or the corresponding $n$-th order joint density function
$$f_{X(t_1), X(t_2), \ldots, X(t_n)}(x_1, x_2, \ldots, x_n) = \frac{\partial^{\,n} F_{X(t_1), X(t_2), \ldots, X(t_n)}(x_1, x_2, \ldots, x_n)}{\partial x_1\, \partial x_2 \cdots \partial x_n}.$$
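As an illustration of the first-order description, the following sketch estimates $F_{X(t)}(x)$ at a fixed time from many independent realizations. The process, time instant, and sample size are assumptions chosen only for this example.

```python
import numpy as np

rng = np.random.default_rng(2)
t0 = 0.3                                   # a fixed time instant
n_realizations = 10_000

# Illustrative process: X(t) = A cos(w0 t + Phi) with Phi uniform on [0, 2*pi)
A, w0 = 1.0, 2 * np.pi
phi = rng.uniform(0.0, 2 * np.pi, size=n_realizations)
x_t0 = A * np.cos(w0 * t0 + phi)           # one value of X(t0) per realization

# Empirical first-order distribution function F_{X(t0)}(x) = P(X(t0) <= x)
x_grid = np.linspace(-A, A, 21)
F_hat = np.array([(x_t0 <= v).mean() for v in x_grid])
```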
$R_X(t_1, t_2)$ = autocorrelation function of the process at times $t_1, t_2$ $= E\big(X(t_1) X(t_2)\big)$.
Note that
$R_X(t_1, t_2) = R_X(t_2, t_1)$ and
$R_X(t, t) = E X^2(t)$ = second moment or mean-square value at time $t$.
$C_X(t_1, t_2)$ = autocovariance function of the process at times $t_1, t_2$ $= E\big[(X(t_1) - \mu_X(t_1))(X(t_2) - \mu_X(t_2))\big]$, where $\mu_X(t) = E\,X(t)$ is the mean of the process. The normalized autocovariance (correlation coefficient) is
$$\rho_X(t_1, t_2) = \frac{C_X(t_1, t_2)}{\sqrt{C_X(t_1, t_1)\, C_X(t_2, t_2)}}.$$
The autocorrelation and autocovariance functions are widely used to characterize the second-order behaviour of a random process.
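For instance, $R_X(t_1,t_2)$, $C_X(t_1,t_2)$, and $\rho_X(t_1,t_2)$ can be estimated by averaging over independent realizations. The sketch below uses an illustrative process $X(t) = A t + B$ with $A, B$ independent standard normal (an assumption of this note), for which the mean is zero and $R_X(t_1,t_2) = C_X(t_1,t_2) = t_1 t_2 + 1$.

```python
import numpy as np

rng = np.random.default_rng(3)
n_realizations = 100_000
t1, t2 = 0.5, 2.0

# Illustrative process X(t) = A*t + B with A, B independent standard normal;
# its mean is zero, so R_X(t1, t2) = C_X(t1, t2) = t1*t2 + 1.
A = rng.normal(size=n_realizations)
B = rng.normal(size=n_realizations)
x1 = A * t1 + B                            # X(t1) across realizations
x2 = A * t2 + B                            # X(t2) across realizations

R_hat = np.mean(x1 * x2)                              # autocorrelation estimate
C_hat = np.mean((x1 - x1.mean()) * (x2 - x2.mean()))  # autocovariance estimate
rho_hat = C_hat / np.sqrt(x1.var() * x2.var())        # correlation coefficient estimate
```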
At any $n$ time instants $t_1, t_2, \ldots, t_n$, the random variables $X(t_1), X(t_2), \ldots, X(t_n)$ are jointly distributed and define the random vector $\mathbf{X} = [X(t_1), X(t_2), \ldots, X(t_n)]'$. The process $X(t)$ is called Gaussian if the random vector $[X(t_1), X(t_2), \ldots, X(t_n)]'$ is jointly Gaussian, with the joint density function given by
$$f_{\mathbf{X}}(\mathbf{x}) = \frac{1}{(2\pi)^{n/2}\sqrt{\det(C_{\mathbf{X}})}}\; e^{-\frac{1}{2}(\mathbf{x} - \boldsymbol{\mu}_{\mathbf{X}})'\, C_{\mathbf{X}}^{-1}\, (\mathbf{x} - \boldsymbol{\mu}_{\mathbf{X}})},$$
where $\boldsymbol{\mu}_{\mathbf{X}} = E\,\mathbf{X}$ is the mean vector and $C_{\mathbf{X}}$ is the covariance matrix of $\mathbf{X}$.
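A minimal sketch of drawing a sample from this joint Gaussian description at $n$ chosen time instants, assuming a zero mean vector and an illustrative squared-exponential covariance (both are assumptions of this note, not part of the lecture):

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.array([0.0, 0.5, 1.0, 1.5])         # n = 4 chosen time instants

# Assumed for illustration: zero mean vector and a squared-exponential covariance matrix
mu = np.zeros(t.size)
C = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2)

# [X(t1), ..., X(tn)]' is jointly Gaussian with mean mu and covariance C
x = rng.multivariate_normal(mu, C)         # one draw of the Gaussian random vector
```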
Example: Consider the random process $X(t) = A\cos(\omega_0 t + \Phi)$, where $A$ and $\omega_0$ are constants and the phase $\Phi$ is uniformly distributed in $[0, 2\pi]$. The first-order density of $X(t)$ is
$$f_{X(t)}(x) = \begin{cases} \dfrac{1}{\pi\sqrt{A^2 - x^2}} & |x| < A \\[2mm] 0 & \text{otherwise.} \end{cases}$$
The mean of the process is
$$\mu_X(t) = E\,X(t) = E\,A\cos(\omega_0 t + \Phi) = A\int_0^{2\pi} \cos(\omega_0 t + \phi)\,\frac{1}{2\pi}\,d\phi = 0,$$
and the autocorrelation function is
$$
\begin{aligned}
R_X(t_1, t_2) &= E\big[A\cos(\omega_0 t_1 + \Phi)\, A\cos(\omega_0 t_2 + \Phi)\big] \\
&= A^2\, E\big[\cos(\omega_0 t_1 + \Phi)\cos(\omega_0 t_2 + \Phi)\big] \\
&= \frac{A^2}{2}\, E\big[\cos(\omega_0 (t_1 - t_2)) + \cos(\omega_0 (t_1 + t_2) + 2\Phi)\big] \\
&= \frac{A^2}{2}\cos(\omega_0 (t_1 - t_2)) + \frac{A^2}{2}\int_0^{2\pi} \cos(\omega_0 (t_1 + t_2) + 2\phi)\,\frac{1}{2\pi}\,d\phi \\
&= \frac{A^2}{2}\cos(\omega_0 (t_1 - t_2)).
\end{aligned}
$$
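The closed-form results above can be checked numerically. The sketch below (parameter values are illustrative) estimates $\mu_X(t)$ and $R_X(t_1, t_2)$ by averaging over many draws of the random phase:

```python
import numpy as np

rng = np.random.default_rng(5)
A, w0 = 2.0, 2 * np.pi                     # illustrative amplitude and angular frequency
t1, t2 = 0.2, 0.7
n_realizations = 200_000

phi = rng.uniform(0.0, 2 * np.pi, size=n_realizations)   # Phi uniform on [0, 2*pi)
x1 = A * np.cos(w0 * t1 + phi)             # X(t1) across realizations
x2 = A * np.cos(w0 * t2 + phi)             # X(t2) across realizations

mean_hat = x1.mean()                       # should be close to mu_X(t1) = 0
R_hat = np.mean(x1 * x2)                   # should be close to (A**2 / 2) * cos(w0*(t1 - t2))
R_theory = (A ** 2 / 2) * np.cos(w0 * (t1 - t2))
```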
Two random processes $X(t)$ and $Y(t)$ evaluated at the time instants $t_1, t_2, \ldots, t_n$ and $t_1', t_2', \ldots, t_m'$ respectively give $n + m$ jointly distributed random variables. Thus these two random processes can be described by the $(n+m)$-th order joint distribution function
$$F_{X(t_1), X(t_2), \ldots, X(t_n),\, Y(t_1'), Y(t_2'), \ldots, Y(t_m')}(x_1, x_2, \ldots, x_n, y_1, y_2, \ldots, y_m)$$
or the corresponding $(n+m)$-th order joint density function
$$f_{X(t_1), \ldots, X(t_n),\, Y(t_1'), \ldots, Y(t_m')}(x_1, \ldots, x_n, y_1, \ldots, y_m) = \frac{\partial^{\,n+m} F_{X(t_1), \ldots, X(t_n),\, Y(t_1'), \ldots, Y(t_m')}(x_1, \ldots, x_n, y_1, \ldots, y_m)}{\partial x_1 \cdots \partial x_n\, \partial y_1 \cdots \partial y_m}.$$
The cross-covariance function of the two processes is $C_{XY}(t_1, t_2) = E\big[(X(t_1) - \mu_X(t_1))(Y(t_2) - \mu_Y(t_2))\big]$, and the normalized cross-covariance (cross-correlation coefficient) is
$$\rho_{XY}(t_1, t_2) = \frac{C_{XY}(t_1, t_2)}{\sqrt{C_X(t_1, t_1)\, C_Y(t_2, t_2)}}.$$
On the basis of the above definitions, we can study the degree of dependence between two random processes.
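As a concrete sketch, the cross-covariance and its normalized version can be estimated from paired realizations of two processes. Here $Y(t)$ is taken as a delayed, noisy copy of $X(t)$, an assumption made only for this example:

```python
import numpy as np

rng = np.random.default_rng(6)
n_realizations = 100_000
t1, t2 = 0.3, 0.5
A, w0, delay = 1.0, 2 * np.pi, 0.2          # illustrative parameters

phi = rng.uniform(0.0, 2 * np.pi, size=n_realizations)
x = A * np.cos(w0 * t1 + phi)                           # X(t1) across realizations
y = (A * np.cos(w0 * (t2 - delay) + phi)
     + rng.normal(0.0, 0.1, size=n_realizations))       # Y(t2): delayed, noisy copy of X

C_xy = np.mean((x - x.mean()) * (y - y.mean()))         # cross-covariance estimate
rho_xy = C_xy / np.sqrt(x.var() * y.var())              # cross-correlation coefficient
```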