MIT OCW: Random Process
1. Random Processes
A random variable, x(ζ), can be defined from a random event, ζ, by assigning a value xi to each possible outcome, Ai, of the event. Next define a random process, x(ζ, t), a function of both the event and time, by assigning to each outcome of the random event, ζ, a function of time, xi(t), chosen from a set of functions:
A1 → p1 → x1(t)
A2 → p2 → x2(t)
 ⋮     ⋮      ⋮
An → pn → xn(t)        (6)
This “menu” of functions, xi (t) , is called the ensemble (set) of the random process and
may contain infinitely many xi (t) , which can be functions of many independent variables.
EXAMPLE: Roll a die: the outcome is Ai, where i = 1:6 is the number on the face of the die, and to each outcome assign the function

xi(t) = t^i        (7)
Since a random process is a function of time we can find averages over some period of time, T, or over a series of events. The calculation of the time average and time variance differs from the calculation of the statistics, or expectations, discussed previously. For a single realization xi(t), the time mean, variance, and correlation are

M^t{xi(t)} = lim_{T→∞} (1/T) ∫0^T xi(t) dt        (8)

V^t{xi(t)} = lim_{T→∞} (1/T) ∫0^T [xi(t) − M^t{xi(t)}]² dt        (9)

Ri^t(τ) = lim_{T→∞} (1/T) ∫0^T [xi(t) − M^t{xi(t)}][xi(t+τ) − M^t{xi(t+τ)}] dt        (10)
• If Ri^t is large (i.e. the normalized Ri^t(τ) → 1) then xi(t) and xi(t+τ) are “similar”. For example, a sinusoidal function is similar to itself delayed by one or more periods.
• If Ri^t is small then xi(t) and xi(t+τ) are not similar; for example, white noise gives Ri^t(τ) = 0 for all τ ≠ 0.
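As a quick numerical sketch of the time variance (9) and time correlation (10), the limits can be approximated by discrete averages over a long sampled record. The sampling step, window length, and the cosine realization below are illustrative assumptions, not part of the notes:

```python
import numpy as np

# Approximate lim (1/T) ∫ ... dt by an average over N samples of one
# realization x_i(t). The signal (a = 2, omega = 3 cosine) is an example.
dt = 0.001
t = np.arange(0.0, 200.0, dt)        # long window stands in for T -> infinity
x = 2.0 * np.cos(3.0 * t)            # example realization x_i(t)

m_t = x.mean()                        # time mean
v_t = np.mean((x - m_t) ** 2)         # time variance, eq. (9); expect a^2/2 = 2

def r_t(x, lag):
    """Time correlation R_i^t(tau), eq. (10), at a lag of tau/dt samples."""
    xc = x - x.mean()
    return np.mean(xc[:-lag] * xc[lag:]) if lag else np.mean(xc * xc)
```

Evaluating `r_t` at a lag of one period of the cosine returns (approximately) the same value as at lag zero, illustrating the first bullet above; at a quarter period it is near zero.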
EXPECTED VALUE:

µx(t1) = E{x(t1)} = ∫_{−∞}^{∞} x f(x, t1) dx        (11)
STATISTICAL VARIANCE:

σx²(t1) = E{[x(t1) − µx(t1)]²} = ∫_{−∞}^{∞} (x − µx)² f(x, t1) dx        (12)
AUTO-CORRELATION:

Rxx(t1, t2) = E{x(t1, ζ) x(t2, ζ)}        (13)

(For a zero-mean process this equals the auto-covariance E{[x(t1, ζ) − E{x(t1, ζ)}][x(t2, ζ) − E{x(t2, ζ)}]}.)
EXAMPLE: Roll a die: k = 1:6. Assign to the event Ak a random process function:

xk(t) = a cos(kω0 t)        (14)

TIME VARIANCE: V^t{xk(t)} = lim_{T→∞} (1/T) ∫0^T a² cos²(kω0 t) dt = a²/2

TIME CORRELATION: R^t(τ) = lim_{T→∞} (1/T) ∫0^T a² cos(kω0 t) cos(kω0(t + τ)) dt = (a²/2) cos(kω0 τ)
Looking at the correlation function we see that if kω0τ = π/2 then the correlation is zero. For this example it is the same as taking the correlation of a sine with a cosine: cosine is simply the sine function phase-shifted by π/2, and sine and cosine are uncorrelated.
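The time variance and time correlation above can be checked numerically. The values of a, ω0, and the face k below are arbitrary assumptions for illustration:

```python
import numpy as np

# Check V^t = a^2/2 and R^t(tau) = (a^2/2) cos(k w0 tau) for one realization
# of the dice process, including the decorrelation point k*w0*tau = pi/2.
a, omega0, k = 1.0, 2.0, 3            # example parameter values
dt = 0.001
t = np.arange(0.0, 500.0, dt)
xk = a * np.cos(k * omega0 * t)       # one realization x_k(t)

v_t = np.mean(xk ** 2)                # time variance (the time mean is ~0)

def r_t(tau):
    """Time correlation R^t(tau) estimated from the sampled signal."""
    n = int(round(tau / dt))
    return np.mean(xk[:-n] * xk[n:]) if n else np.mean(xk * xk)

tau_zero = (np.pi / 2) / (k * omega0)  # k*w0*tau = pi/2 -> correlation ~ 0
```

`r_t(tau_zero)` comes out near zero, matching the sine-versus-cosine argument above, while `v_t` comes out near a²/2.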
Now look at the STATISTICS of the random process at some fixed time t = t0:

xk(ζ, t0) = a cos(kω0 t0) = yk(ζ)        (15)

EXPECTED VALUE: E{y(ζ)} = Σ pk xk = Σ_{k=1}^{6} (1/6) a cos(kω0 t0)

VARIANCE: V{y(ζ)} = Σ_{k=1}^{6} (1/6) a² cos²(kω0 t0)        (16)
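These ensemble sums over the six equally likely faces (pk = 1/6) are easy to evaluate directly. The values of a, ω0, and t0 below are arbitrary assumptions for illustration:

```python
import numpy as np

# Ensemble statistics of y_k = a cos(k w0 t0) over the faces k = 1..6,
# each with probability 1/6, per eqs. (15)-(16). Parameters are examples.
a, omega0 = 1.0, 2.0

def ensemble_mean(t0):
    """E{y} = sum_k (1/6) a cos(k w0 t0) -- depends on the chosen time t0."""
    k = np.arange(1, 7)
    return np.mean(a * np.cos(k * omega0 * t0))

def ensemble_second_moment(t0):
    """sum_k (1/6) a^2 cos^2(k w0 t0)."""
    k = np.arange(1, 7)
    return np.mean((a * np.cos(k * omega0 * t0)) ** 2)
```

Evaluating `ensemble_mean` at two different times gives different values, illustrating the point made next: the statistics are time dependent, unlike the time averages.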
In general the expected value does not match the time-averaged value of a function: the statistics are time dependent,

µx(t1) = E{x(t1, ζ)} = f(t1),

whereas the time averages are time independent. Even for a stationary random process, the statistics, or expectations, are NOT necessarily equal to the time averages. A stationary random process whose statistics ARE equal to its time averages is said to be ERGODIC.
EXAMPLE: Take some random process defined by y(t, ζ):

yi(t) = a cos(ω0 t + θi)        (17)

where θ(ζ) is a random variable which lies within the interval 0 to 2π, with a constant, uniform PDF such that

fθ(θ) = 1/(2π) for 0 ≤ θ ≤ 2π;  0 else        (18)

E{y(t0, ζ)} = ∫0^{2π} (1/2π) a cos(ω0 t0 + θ) dθ = 0        (19)

V(t0) = R(τ = 0) = a²/2        (20)
TIME MEAN:

M^t{y(t, ζi)} = lim_{T→∞} (1/T) ∫0^T a cos(ω0 t + θi) dt = lim_{T→∞} (1/T) (a/ω0) [sin(ω0 T + θi) − sin θi] = 0        (22)

TIME VARIANCE:

V^t = R^t(0) = a²/2        (23)
CORRELATION:

R^t(τ) = lim_{T→∞} (1/T) ∫0^T a² cos(ω0 t + θi) cos(ω0[t + τ] + θi) dt = (a²/2) cos(ω0 τ)        (24)
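A short Monte Carlo sketch confirms that for this random-phase process the ensemble statistics (19)-(20) match the time averages (22)-(23), i.e. the process is ergodic. The values of a and ω0, the fixed time t0, and the sample sizes are illustrative assumptions:

```python
import numpy as np

# Ergodicity check for y(t) = a cos(w0 t + theta), theta ~ Uniform[0, 2pi).
rng = np.random.default_rng(0)
a, w0 = 1.0, 2.0                      # example parameter values

# Ensemble statistics at a fixed time t0 over many independent phases
theta = rng.uniform(0.0, 2.0 * np.pi, 200_000)
y0 = a * np.cos(w0 * 0.5 + theta)     # t0 = 0.5 chosen arbitrarily
ens_mean, ens_var = y0.mean(), y0.var()        # expect 0 and a^2/2

# Time averages along a single realization (one fixed phase)
dt = 0.001
t = np.arange(0.0, 500.0, dt)
y = a * np.cos(w0 * t + theta[0])
time_mean = y.mean()                           # expect 0, eq. (22)
time_var = np.mean((y - time_mean) ** 2)       # expect a^2/2, eq. (23)
```

Both pairs come out at approximately 0 and a²/2, independent of the chosen t0 or phase, as ergodicity requires.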
Given a random process y(t, ζ), it is simplest to assume that its expected value is zero. If instead the expected value equals some constant, E{x(t, ζ)} = x0 with x0 ≠ 0, then we can simply shift the random process so that its expected value is indeed zero: y(ζ, t) = x(t, ζ) − x0.
The autocorrelation function R(τ) is then

R(τ) = E{y(t, ζ) y(t + τ, ζ)} = R^t(τ) = lim_{T→∞} (1/T) ∫0^T yi(t) yi(t + τ) dt        (25)
Consider, for example, the random process

y(ζ, t) = Σ_{n=1}^{N} an cos(ωn t + ψn(ζ))        (26)
where ψ n (ζ ) are all independent random variables in [0, 2π ] with a uniform pdf. This
random process is stationary and ergodic with an expected value of zero.
The autocorrelation R(τ ) is
R(τ) = Σ_{n=1}^{N} (an²/2) cos(ωn τ)        (27)
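Equation (27) can be verified by comparing a time-average estimate of R(τ) for one realization against the closed-form sum. The amplitudes, frequencies, and record length below are arbitrary example values:

```python
import numpy as np

# Compare the time-average autocorrelation of y(t) = sum a_n cos(w_n t + psi_n),
# with independent uniform phases, against eq. (27): sum (a_n^2/2) cos(w_n tau).
rng = np.random.default_rng(1)
a_n = np.array([1.0, 0.5, 0.25])      # example amplitudes
w_n = np.array([1.0, 2.3, 4.1])       # example (distinct) frequencies
psi = rng.uniform(0.0, 2.0 * np.pi, a_n.size)

dt = 0.001
t = np.arange(0.0, 1000.0, dt)
y = np.sum(a_n[:, None] * np.cos(w_n[:, None] * t + psi[:, None]), axis=0)

def R_est(tau):
    """Time-average estimate of R(tau) from the sampled realization."""
    n = int(round(tau / dt))
    return np.mean(y[:-n] * y[n:]) if n else np.mean(y * y)

def R_exact(tau):
    """Closed form from eq. (27)."""
    return np.sum(a_n ** 2 / 2.0 * np.cos(w_n * tau))
```

The estimate agrees with eq. (27) at any lag, and notably does not depend on the particular phases drawn, consistent with the process being stationary and ergodic.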