Chapter 5
Random Processes
By Abinet E.
Chapter Outline
➢ Random Process and Ensembles
➢ Autocorrelation Function and Its Properties
➢ Cross-correlation Function and Its Properties
➢ Stationary Random Processes
➢ Ergodic Random Processes
➢ Power Spectral Density Function
Determinism vs randomness
There are two main kinds of processes in Nature, distinguished by their time
evolution.
➢ 1. In a deterministic process, the future state of the system is completely
determined by the present state.
Physical systems whose time evolution is described by differential
equations are deterministic, e.g.:
• classical mechanics (Newton’s equation)
• quantum mechanics (Schrödinger’s equation)
• the weather (chaotic but deterministic!)
➢ 2. Stochastic (or random) processes are non-deterministic: the time
evolution is subject to a probability distribution.
Examples of stochastic processes are
• Random walks
• Markov chains
• Birth-death processes
• Queues
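As a concrete illustration of the first stochastic example above, here is a minimal random-walk sketch (assuming Python with NumPy; not part of the original notes). Running it twice from the same initial state gives different trajectories, which is exactly what "non-deterministic" means in practice:

```python
import numpy as np

rng = np.random.default_rng()

def random_walk(n_steps: int) -> np.ndarray:
    """Symmetric random walk: each step is +1 or -1 with equal probability."""
    steps = rng.choice([-1, 1], size=n_steps)
    return np.cumsum(steps)

# Two runs from the same initial state (position 0) give different futures.
print(random_walk(10))
print(random_walk(10))
```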
Random or Stochastic Process
✓ A RANDOM VARIABLE X is a rule for assigning to every outcome A of an
experiment a number X(A).
• Note: X denotes the random variable and X(A) denotes a particular value.
✓ A RANDOM PROCESS X(t) is a rule for assigning to every outcome A a function
X(t, A).
• Note: for notational simplicity we often omit the dependence on A.
• A general random or stochastic process can be described as:
o a collection of time functions (signals) corresponding to the various outcomes of a random experiment, or
o a collection of random variables observed at different times.
• Examples of random processes in communications:
o Channel noise,
o Information generated by a source,
o Interference.
• A random process is a process (i.e., a variation in time or one-dimensional space)
whose behavior is not completely predictable and can be characterized by statistical
laws.
• Examples of random processes
o Daily stream flow
o Hourly rainfall of storm events
o Stock index
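The "collection of sample functions" view above can be made concrete with a short sketch (Python/NumPy, illustrative values only): each row of `ensemble` is one sample function x(t, A), and a fixed column, i.e. a fixed time t0, is the random variable X(t0) observed across the ensemble.

```python
import numpy as np

rng = np.random.default_rng(0)

t = np.linspace(0.0, 1.0, 200)     # common time axis for all sample functions
n_outcomes = 500                   # number of outcomes A of the experiment

# Ensemble of sample functions x(t, A): noisy sinusoids with a random phase,
# standing in for e.g. a carrier observed in channel noise.
phases = rng.uniform(-np.pi, np.pi, size=n_outcomes)
ensemble = (np.cos(2 * np.pi * 5 * t + phases[:, None])
            + 0.1 * rng.standard_normal((n_outcomes, t.size)))

# A fixed time t0 picks out one random variable X(t0) across the ensemble.
x_at_t0 = ensemble[:, 50]
print(x_at_t0.mean(), x_at_t0.var())
```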
Averages of a Random Process
A random process X = {X(t), t ≥ 0} = {X(t, A), t ≥ 0}
is a function of two variables: the time t and the random event A.
• Ensemble average
Expectation operator E{·}
The ensemble average of X(t): E{X(t)}
The ensemble average of the autocorrelation:
R_X(t, t + τ) = E{X(t) X(t + τ)}
• Time average
Operator: lim_{T→∞} (1/2T) ∫_{−T}^{+T} {·} dt
The time average of X(t):
⟨X(t)⟩ = lim_{T→∞} (1/2T) ∫_{−T}^{+T} x(t) dt
where x(t) is any specific sample function.
The time average of the autocorrelation:
⟨X(t) X(t + τ)⟩ = lim_{T→∞} (1/2T) ∫_{−T}^{+T} x(t) x(t + τ) dt
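A sketch (Python/NumPy, with arbitrary example values) of how the two kinds of average are estimated in practice: the ensemble average E{X(t)} is a mean across outcomes at each fixed time, while the time average ⟨X(t)⟩ is a mean along a single sample function over a long window, approximating the limit T → ∞.

```python
import numpy as np

rng = np.random.default_rng(1)

t = np.arange(0.0, 100.0, 0.05)                  # long (finite) observation window
phases = rng.uniform(-np.pi, np.pi, size=1000)   # one outcome A per row
ensemble = np.cos(t + phases[:, None])           # X(t, A) = cos(t + Theta(A))

# Ensemble average E{X(t)}: mean across outcomes at each time instant.
ensemble_mean = ensemble.mean(axis=0)

# Time average <X(t)>: mean along one sample function, approximating
# lim (1/2T) * integral over [-T, T] with a finite window.
time_mean = ensemble[0].mean()

print(ensemble_mean[:3])   # all close to 0
print(time_mean)           # also close to 0
```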
Example:
Consider the process X(t) = a cos(ωt + Θ), where a and ω are constants and the
random variable Θ is uniformly distributed in the interval (−π, π).
(1) Find the ensemble average of X(t) and its autocorrelation function.
(2) Find the time average of X(t) and its autocorrelation function.
Solution:
(1) The ensemble average and autocorrelation function:
E{X(t)} = (1/2π) ∫_{−π}^{+π} a cos(ωt + θ) dθ = 0
R_X(t, t + τ) = E{a cos(ωt + Θ) · a cos(ωt + ωτ + Θ)}
= (a²/2) E{cos(ωτ) + cos(2ωt + ωτ + 2Θ)} = (a²/2) cos(ωτ)
(2) The time average of X(t):
⟨X(t)⟩ = lim_{T→∞} (1/2T) ∫_{−T}^{+T} x(t) dt
= lim_{T→∞} (1/2T) ∫_{−T}^{+T} a cos(ωt + θ) dt
= lim_{T→∞} (1/2T) · (a/ω) [sin(ωT + θ) − sin(−ωT + θ)]
= 0
(since −2 ≤ sin(ωT + θ) − sin(−ωT + θ) ≤ 2, the bracketed term stays bounded while 1/2T → 0)
• The time average of the autocorrelation:
⟨X(t) X(t + τ)⟩ = lim_{T→∞} (1/2T) ∫_{−T}^{+T} x(t) x(t + τ) dt
= lim_{T→∞} (1/2T) ∫_{−T}^{+T} a cos(ωt + θ) · a cos(ωt + ωτ + θ) dt
= lim_{T→∞} (1/2T) { (a²/4ω) [sin(2ωT + ωτ + 2θ) − sin(−2ωT + ωτ + 2θ)] + a²T cos(ωτ) }
(since −2 ≤ sin(2ωT + ωτ + 2θ) − sin(−2ωT + ωτ + 2θ) ≤ 2, the sine terms vanish after dividing by 2T)
= (a²/2) cos(ωτ)
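A quick numerical check of these results (Python/NumPy sketch; the amplitude, frequency, and lag are illustrative, and T is finite, so the agreement is approximate): for one sample function of X(t) = a cos(ωt + θ), the time average is near 0 and the time-averaged autocorrelation is near (a²/2)cos(ωτ).

```python
import numpy as np

a, w = 2.0, 3.0                                   # illustrative amplitude and frequency
theta = np.random.default_rng(2).uniform(-np.pi, np.pi)

dt = 0.001
t = np.arange(-500.0, 500.0, dt)                  # finite stand-in for T -> infinity
x = a * np.cos(w * t + theta)                     # one sample function x(t)

T_span = t[-1] - t[0]
time_mean = np.trapz(x, t) / T_span
print(time_mean)                                  # close to 0

tau = 0.4
x_shift = a * np.cos(w * (t + tau) + theta)       # x(t + tau)
time_acf = np.trapz(x * x_shift, t) / T_span
print(time_acf, 0.5 * a**2 * np.cos(w * tau))     # the two values agree
```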
Ergodic Random Processes
• A random process {X(t), t ∈ T} is ergodic
if it satisfies the following conditions:
E{X(t)} = ⟨X(t)⟩
R_X(t, t + τ) = E{X(t) X(t + τ)} = ⟨X(t) X(t + τ)⟩
that is, the ensemble averages equal the corresponding time averages.
Example:
Consider the process X(t) = a cos(ωt + Θ), where a and ω are constants and Θ is
uniformly distributed in the interval (−π, π). Is X(t) ergodic?
E{X(t)} = 0 = ⟨X(t)⟩
R_X(t, t + τ) = E{X(t) X(t + τ)} = (a²/2) cos(ωτ) = ⟨X(t) X(t + τ)⟩
Therefore X(t) is ergodic.
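The ergodicity claim can also be checked empirically (sketch, Python/NumPy; constants and lag are arbitrary): the ensemble average over many independent draws of Θ matches the time average along a single realization.

```python
import numpy as np

rng = np.random.default_rng(3)
a, w, tau = 1.0, 2.0, 0.5                          # illustrative constants and lag

# Ensemble estimate of R_X(t, t+tau) at a fixed t, over many outcomes of Theta.
thetas = rng.uniform(-np.pi, np.pi, size=100_000)
t0 = 1.0
r_ensemble = np.mean(a * np.cos(w * t0 + thetas)
                     * a * np.cos(w * (t0 + tau) + thetas))

# Time estimate along one single realization (one fixed theta).
theta = thetas[0]
t = np.arange(0.0, 2000.0, 0.01)
r_time = np.mean(a * np.cos(w * t + theta)
                 * a * np.cos(w * (t + tau) + theta))

print(r_ensemble, r_time, 0.5 * a**2 * np.cos(w * tau))  # all three are close
```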
Stationary Processes
• Strict-sense stationary (SSS) process: all of its finite-dimensional (joint)
distributions are invariant to a shift of the time origin.
• Wide-sense stationary (WSS) process: the mean E{X(t)} is a constant and the
autocorrelation depends only on the lag, R_X(t, t + τ) = R_X(τ).
Power Spectral Density for Random Processes
• Let {X(t), t ∈ T} be a wide-sense stationary (WSS) process, and define
P_Xav = E{ lim_{T→∞} (1/2T) ∫_{−T}^{+T} |X(t)|² dt } = lim_{T→∞} (1/2T) ∫_{−T}^{+T} E{|X(t)|²} dt
P_Xav is the total average power of X(t).
P_Xav can be written as
P_Xav = (1/2π) ∫_{−∞}^{+∞} S_X(ω) dω
where S_X(ω) is the power spectral density of X(t).
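A discrete-time, finite-record analogue of this relation can be checked directly (sketch, Python/NumPy; the test signal is arbitrary): the average power computed in the time domain equals the power obtained by summing the normalized spectral coefficients, via Parseval's theorem for the DFT.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 4096
# One sample path: a sinusoid plus white noise (illustrative signal).
x = np.cos(0.3 * np.arange(n)) + 0.5 * rng.standard_normal(n)

# Time-domain average power: (1/N) * sum |x[n]|^2.
p_time = np.mean(np.abs(x) ** 2)

# Frequency domain: Parseval's theorem for the DFT gives the same number,
# with |X[k]|^2 / N^2 playing the role of the (normalized) spectral density.
X = np.fft.fft(x)
p_freq = np.sum(np.abs(X) ** 2) / n**2

print(p_time, p_freq)   # identical up to round-off
```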
Spectral Properties
If X(t) is WSS:
(1) P_Xav = (1/2π) ∫_{−∞}^{+∞} S_X(ω) dω = R_X(τ = 0)
(2) R_X(τ) and S_X(ω) are a Fourier transform pair.
If X(t) is a continuous-time WSS process:
S_X(ω) = ∫_{−∞}^{+∞} R_X(τ) e^{−jωτ} dτ
R_X(τ) = (1/2π) ∫_{−∞}^{+∞} S_X(ω) e^{jωτ} dω
If X(t) is a discrete-time WSS process:
S_X(ω) = Σ_{k=−∞}^{+∞} R_X(k) e^{−jkω}
R_X(k) = (1/2π) ∫_{−π}^{+π} S_X(ω) e^{jkω} dω
(3) S_X(ω) is an even, nonnegative function:
S_X(ω) = S_X(−ω);  S_X(ω) ≥ 0
Proof of (1): P_Xav = (1/2π) ∫_{−∞}^{+∞} S_X(ω) dω = R_X(τ = 0)
X(t) ⎯ Fourier transform → G(ω)
According to Parseval's theorem, the total energy of X(t) is
Q_X = ∫_{−∞}^{+∞} |X(t)|² dt = (1/2π) ∫_{−∞}^{+∞} |G(ω)|² dω
Let X_T(t) be defined as:
X_T(t) = X(t) for |t| ≤ T,  X_T(t) = 0 for |t| > T
X_T(t) ⎯ Fourier transform → G_T(ω)
G_T(ω) = ∫_{−∞}^{+∞} X_T(t) e^{−jωt} dt = ∫_{−T}^{+T} X(t) e^{−jωt} dt
The total energy of X_T(t):
Q_{X_T} = ∫_{−∞}^{+∞} |X_T(t)|² dt = ∫_{−T}^{+T} |X(t)|² dt = (1/2π) ∫_{−∞}^{+∞} |G_T(ω)|² dω
lim_{T→∞} Q_{X_T} = Q_X
The time average of Q_{X_T}:
Q_{X_T}/2T = (1/2T) ∫_{−T}^{+T} |X(t)|² dt = (1/2T)(1/2π) ∫_{−∞}^{+∞} |G_T(ω)|² dω
Then the ensemble average of Q_{X_T}/2T:
E{Q_{X_T}/2T} = E{ (1/2T) ∫_{−T}^{+T} |X(t)|² dt } = E{ (1/2T)(1/2π) ∫_{−∞}^{+∞} |G_T(ω)|² dω }
• Continuing, the total average power of X(t):
P_Xav = lim_{T→∞} E{Q_{X_T}/2T} = lim_{T→∞} E{ (1/2T) ∫_{−T}^{+T} |X(t)|² dt } = lim_{T→∞} (1/2T) ∫_{−T}^{+T} E{|X(t)|²} dt
= lim_{T→∞} (1/2T) ∫_{−T}^{+T} R_X(t, t) dt = lim_{T→∞} (1/2T) ∫_{−T}^{+T} R_X(τ = 0) dt = R_X(τ = 0)
so P_Xav = R_X(τ = 0).
On the other hand,
P_Xav = lim_{T→∞} E{Q_{X_T}/2T} = lim_{T→∞} E{ (1/2T)(1/2π) ∫_{−∞}^{+∞} |G_T(ω)|² dω }
= (1/2π) ∫_{−∞}^{+∞} lim_{T→∞} (1/2T) E{|G_T(ω)|²} dω
Let S_X(ω) = lim_{T→∞} (1/2T) E{|G_T(ω)|²}; then
P_Xav = (1/2π) ∫_{−∞}^{+∞} S_X(ω) dω
Proof of (2): R_X(τ) ⎯ Fourier transform → S_X(ω)
S_X(ω) = lim_{T→∞} (1/2T) E{|G_T(ω)|²},  with  G_T(ω) = ∫_{−T}^{+T} X(t) e^{−jωt} dt
(1/2T) E{|G_T(ω)|²} = (1/2T) E{G_T(ω) G_T*(ω)}
= (1/2T) E{ ∫_{−T}^{+T} X(t) e^{−jωt} dt · ∫_{−T}^{+T} X(s) e^{jωs} ds }
= (1/2T) E{ ∫_{−T}^{+T} ∫_{−T}^{+T} X(t) X(s) e^{−jω(t−s)} dt ds }
= (1/2T) ∫_{−T}^{+T} ∫_{−T}^{+T} E{X(t) X(s)} e^{−jω(t−s)} dt ds
= (1/2T) ∫_{−T}^{+T} ∫_{−T}^{+T} R_X(t − s) e^{−jω(t−s)} dt ds
• Continuing, change variables: u = t + s, τ = t − s, so t = (u + τ)/2 and s = (u − τ)/2.
The Jacobian is
J = ∂(t, s)/∂(u, τ) = det[ 1/2  1/2 ; 1/2  −1/2 ] = −1/2,  so |J| = 1/2
The region −T ≤ t ≤ T, −T ≤ s ≤ T becomes −T ≤ (u + τ)/2 ≤ T, −T ≤ (u − τ)/2 ≤ T, i.e.
for each τ ∈ [−2T, 2T] the variable u runs over [−(2T − |τ|), 2T − |τ|]. Hence
(1/2T) ∫_{−T}^{+T} ∫_{−T}^{+T} R_X(t − s) e^{−jω(t−s)} dt ds
= (1/2T) ∫_{−2T}^{+2T} [ ∫_{−(2T−|τ|)}^{+(2T−|τ|)} |J| du ] R_X(τ) e^{−jωτ} dτ
= ∫_{−2T}^{+2T} (1 − |τ|/2T) R_X(τ) e^{−jωτ} dτ
Therefore
S_X(ω) = lim_{T→∞} (1/2T) ∫_{−T}^{+T} ∫_{−T}^{+T} R_X(t − s) e^{−jω(t−s)} dt ds = ∫_{−∞}^{+∞} R_X(τ) e^{−jωτ} dτ
Proof of (3): S_X(ω) = S_X(−ω)
Since R_X(τ) = R_X(−τ) and
S_X(ω) = ∫_{−∞}^{+∞} R_X(τ) e^{−jωτ} dτ,
S_X(−ω) = ∫_{−∞}^{+∞} R_X(τ) e^{jωτ} dτ    (let τ₁ = −τ)
= ∫_{+∞}^{−∞} R_X(−τ₁) e^{−jωτ₁} d(−τ₁)
= ∫_{−∞}^{+∞} R_X(τ₁) e^{−jωτ₁} dτ₁
= S_X(ω)
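The transform-pair property just proved can be checked numerically on a finite record (sketch, Python/NumPy; discrete-time, finite-N analogue): the DTFT of the biased sample autocorrelation r̂(k) = (1/N) Σ x[n]x[n+k] coincides exactly with the periodogram |G_N(ω)|²/N, the finite-data estimate of S_X(ω) used in the derivation above.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1024
x = rng.standard_normal(n)            # one realization of discrete white noise

# Biased autocorrelation estimate r[k] = (1/N) * sum_n x[n] x[n+k], k = -(N-1)..N-1.
r = np.correlate(x, x, mode="full") / n
k = np.arange(-(n - 1), n)            # lags matching np.correlate's "full" output

# DTFT of r on a frequency grid; r is even in k, so the transform is a cosine sum.
nfft = 4096
w = 2 * np.pi * np.fft.rfftfreq(nfft)
S_from_r = np.array([np.sum(r * np.cos(wi * k)) for wi in w])

# Periodogram |G_N(w)|^2 / N on the same frequency grid.
G = np.fft.rfft(x, nfft)
periodogram = np.abs(G) ** 2 / n

print(np.max(np.abs(S_from_r - periodogram)))   # ~ 0 (round-off only)
```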
Example: X(t) is a WSS process with mean m_X(t) = 0 and autocorrelation
R_X(τ) = 4 e^{−|τ|} cos(ω₀τ),   where ω₀ is a constant.
(1) Find the power spectral density S_X(ω);
(2) Find the total average power P_Xav.
Solution: (1)
R_X(τ) ⎯ Fourier transform → S_X(ω)
S_X(ω) = ∫_{−∞}^{+∞} R_X(τ) e^{−jωτ} dτ
= 4 [ 1/(1 + (ω + ω₀)²) + 1/(1 + (ω − ω₀)²) ]
(2) P_Xav = (1/2π) ∫_{−∞}^{+∞} S_X(ω) dω
= R_X(τ = 0)
= 4
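A numerical cross-check of this result (sketch, Python/NumPy/SciPy; the values of ω₀ and the test frequency are arbitrary): evaluating the Fourier integral of R_X(τ) directly reproduces the closed-form S_X(ω), and R_X(0) gives the average power 4.

```python
import numpy as np
from scipy.integrate import quad

w0 = 2.0      # the constant inside cos(w0*tau); illustrative value
w = 1.3       # frequency at which S_X is evaluated

def r(tau):
    """Autocorrelation R_X(tau) = 4 exp(-|tau|) cos(w0*tau)."""
    return 4.0 * np.exp(-abs(tau)) * np.cos(w0 * tau)

# S_X(w) = integral of R_X(tau) e^{-j w tau} d tau.  R_X is even, so this is a
# cosine transform; the exponential decay makes truncation at tau = 60 harmless.
s_numeric, _ = quad(lambda tau: 2.0 * r(tau) * np.cos(w * tau), 0.0, 60.0, limit=400)

s_formula = 4.0 * (1.0 / (1.0 + (w + w0) ** 2) + 1.0 / (1.0 + (w - w0) ** 2))
print(s_numeric, s_formula)   # the two agree

print(r(0.0))                 # total average power P_Xav = R_X(0) = 4
```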
Example:
X(t) is a WSS process with mean m_X(t) = 0 and power spectral density
S_X(ω) = (2ω² + 1) / (ω⁴ + 5ω² + 4).
(1) Find the autocorrelation R_X(τ);
(2) Find the total average power P_Xav.
Solution: (1)
R_X(τ) ⎯ Fourier transform → S_X(ω)
R_X(τ) = (1/2π) ∫_{−∞}^{+∞} S_X(ω) e^{jωτ} dω = (1/2π) ∫_{−∞}^{+∞} [ (2ω² + 1) / (ω⁴ + 5ω² + 4) ] e^{jωτ} dω
= (1/2π) ∫_{−∞}^{+∞} [ 2k₁/(ω² + 1) + 4k₂/(ω² + 4) ] e^{jωτ} dω
= k₁ e^{−|τ|} + k₂ e^{−2|τ|}
with k₁ = −1/6 and k₂ = 7/12.
(2) P_Xav = (1/2π) ∫_{−∞}^{+∞} S_X(ω) dω
= R_X(τ = 0)
= k₁ + k₂ = 5/12
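The same kind of numerical cross-check for this example (sketch, Python/SciPy; the test frequency is arbitrary): integrating S_X(ω) recovers P_Xav = 5/12, and the partial-fraction decomposition used above can be verified pointwise.

```python
import numpy as np
from scipy.integrate import quad

def s(w):
    """Power spectral density S_X(w) = (2w^2 + 1) / (w^4 + 5w^2 + 4)."""
    return (2 * w**2 + 1) / (w**4 + 5 * w**2 + 4)

# Total average power: (1/2*pi) * integral of S_X(w) dw, which should equal R_X(0).
p, _ = quad(s, -np.inf, np.inf)
print(p / (2 * np.pi), 5 / 12)                            # both ~0.4167

# Partial-fraction check: S_X(w) = 2*k1/(w^2+1) + 4*k2/(w^2+4) with k1=-1/6, k2=7/12.
k1, k2 = -1 / 6, 7 / 12
w = 0.7
print(s(w), 2 * k1 / (w**2 + 1) + 4 * k2 / (w**2 + 4))    # identical
```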
Q.N. A random process has sample functions of the form X(t) = A cos(ωt + Θ),
where ω is a constant, A is a random variable that takes the values +1 and −1
with equal probability, and Θ is a random variable that is uniformly distributed
between 0 and 2π. Assume that the random variables A and Θ are independent.
1. Is X(t) a wide-sense stationary process?
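This is left as an exercise, but a simulation sketch (Python/NumPy; the value of ω and the test times and lag are arbitrary) is a useful sanity check before doing the algebra: estimate the mean at different times and the autocorrelation at different (t, t + τ) pairs with the same lag, and see whether they behave like those of a WSS process.

```python
import numpy as np

rng = np.random.default_rng(6)
w = 2.0                                       # constant angular frequency (illustrative)
n_outcomes = 200_000

A = rng.choice([-1.0, 1.0], size=n_outcomes)            # A = +/-1, equally likely
Theta = rng.uniform(0.0, 2 * np.pi, size=n_outcomes)    # Theta uniform on (0, 2*pi)

def X(t):
    """Ensemble of sample-function values X(t) = A cos(w*t + Theta)."""
    return A * np.cos(w * t + Theta)

# The estimated mean should be the same (zero) at every t ...
print(X(0.0).mean(), X(1.7).mean())

# ... and the estimated autocorrelation should depend only on the lag tau.
tau = 0.4
for t in (0.0, 1.0, 5.0):
    print(np.mean(X(t) * X(t + tau)))         # all close to 0.5 * cos(w * tau)
```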
QUESTION
For more solved problems, see:
https://www.probabilitycourse.com/chapter5/5_2_5_solved_prob.php