3. DISCRETE-TIME RANDOM PROCESSES


Outline
Random variables
Random processes
Filtering random processes
Spectral factorization
Special types of random processes
Autoregressive moving average processes
Autoregressive processes
Moving average processes
Random variables
Definitions

A random variable x is a function that assigns a number to each outcome of a random experiment.

Probability distribution function:

    F_x(\alpha) = \Pr\{x \le \alpha\}

Probability density function:

    f_x(\alpha) = \frac{d}{d\alpha} F_x(\alpha)

Mean or expected value:

    m_x = E\{x\} = \int_{-\infty}^{\infty} \alpha\, f_x(\alpha)\, d\alpha

Variance:

    \sigma_x^2 = \operatorname{Var}\{x\} = E\{(x - m_x)^2\} = \int_{-\infty}^{\infty} (\alpha - m_x)^2 f_x(\alpha)\, d\alpha = E\{x^2\} - m_x^2
Random variables
Definitions

Joint probability distribution function:

    F_{x,y}(\alpha, \beta) = \Pr\{x \le \alpha,\ y \le \beta\}

Joint probability density function:

    f_{x,y}(\alpha, \beta) = \frac{\partial^2}{\partial\alpha\, \partial\beta} F_{x,y}(\alpha, \beta)

Correlation:

    r_{xy} = E\{x y^*\}

Covariance:

    c_{xy} = \operatorname{Cov}(x, y) = E\{(x - m_x)(y - m_y)^*\} = r_{xy} - m_x m_y^*

Correlation coefficient:

    \rho_{xy} = \frac{c_{xy}}{\sigma_x \sigma_y} = \frac{r_{xy} - m_x m_y^*}{\sigma_x \sigma_y}, \qquad |\rho_{xy}| \le 1
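A quick numerical sketch of these definitions (added here, not in the original slides); the model y = x + 0.1n is an assumed example chosen to mirror the scatter-plot figure on the next slide:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed example: y = x + n with small noise n, so x and y
# should be strongly correlated (rho close to 1).
N = 100_000
x = rng.standard_normal(N)
y = x + 0.1 * rng.standard_normal(N)

m_x, m_y = x.mean(), y.mean()
r_xy = np.mean(x * np.conj(y))        # correlation  r_xy = E{x y*}
c_xy = r_xy - m_x * np.conj(m_y)      # covariance   c_xy = r_xy - m_x m_y*
rho = c_xy / (x.std() * y.std())      # correlation coefficient

print(rho)  # close to 1 here; |rho| <= 1 always
```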
Random variables
[Figure: scatter plots of (x, y) sample pairs. Left: x and y uncorrelated. Right: x and y strongly correlated, y = x + n with small noise n, i.e. nearly linearly dependent.]
Random variables
Definitions

Two random variables x and y are independent if

    f_{x,y}(\alpha, \beta) = f_x(\alpha)\, f_y(\beta)

Two random variables x and y are uncorrelated if

    E\{x y^*\} = E\{x\} E\{y^*\}, \quad \text{i.e. } r_{xy} = m_x m_y^* \quad \text{or} \quad c_{xy} = 0

Two random variables x and y are orthogonal if

    r_{xy} = 0

Orthogonal random variables are not necessarily uncorrelated

Zero-mean uncorrelated random variables are orthogonal
Random processes
Definitions

A random process x(n) is an indexed sequence of random variables (a signal)

Mean and variance:

    m_x(n) = E\{x(n)\}
    \sigma_x^2(n) = E\{|x(n) - m_x(n)|^2\}

Autocorrelation and autocovariance:

    r_x(k, l) = E\{x(k)\, x^*(l)\}
    c_x(k, l) = E\{[x(k) - m_x(k)][x(l) - m_x(l)]^*\} = r_x(k, l) - m_x(k)\, m_x^*(l)

Cross-correlation and cross-covariance:

    r_{xy}(k, l) = E\{x(k)\, y^*(l)\}
    c_{xy}(k, l) = E\{[x(k) - m_x(k)][y(l) - m_y(l)]^*\} = r_{xy}(k, l) - m_x(k)\, m_y^*(l)

Uncorrelated and orthogonal processes are defined as for random variables, but now the conditions must hold for all indices k and l
Random processes
Stationarity
First-order stationarity if f_{x(n)}(\alpha) = f_{x(n+k)}(\alpha) for all k. This implies m_x(n) = m_x(0) := m_x

Second-order stationarity if f_{x(n_1), x(n_2)}(\alpha_1, \alpha_2) = f_{x(n_1+k), x(n_2+k)}(\alpha_1, \alpha_2). This implies r_x(k, l) = r_x(k - l, 0) := r_x(k - l)

Stationarity in the strict sense, if the process is stationary for all orders L > 0

Wide-sense stationarity (WSS), if i) m_x(n) = m_x; ii) r_x(k, l) = r_x(k - l); and iii) c_x(0) < \infty

Two processes x(n) and y(n) are jointly wide-sense stationary if i) both x(n) and y(n) are wide-sense stationary and ii) r_{xy}(k, l) = r_{xy}(k - l, 0) := r_{xy}(k - l)

Properties of WSS processes:

    symmetry: r_x(k) = r_x^*(-k)
    mean-square value: r_x(0) = E\{|x(n)|^2\} \ge 0
    maximum value: r_x(0) \ge |r_x(k)|
    mean-square periodicity: r_x(k_0) = r_x(0) \implies r_x(k) is periodic with period k_0
Random processes
Autocorrelation and autocovariance matrices
We consider a WSS process x(n) and collect p + 1 samples in a vector

    \mathbf{x} = [x(0), x(1), \ldots, x(p)]^T

Autocorrelation matrix:

    \mathbf{R}_x = E\{\mathbf{x}\mathbf{x}^H\} =
    \begin{bmatrix}
        r_x(0)  & r_x^*(1)  & \cdots & r_x^*(p)   \\
        r_x(1)  & r_x(0)    & \cdots & r_x^*(p-1) \\
        \vdots  & \vdots    & \ddots & \vdots     \\
        r_x(p)  & r_x(p-1)  & \cdots & r_x(0)
    \end{bmatrix}

Autocovariance matrix:

    \mathbf{C}_x = E\{(\mathbf{x} - \mathbf{m}_x)(\mathbf{x} - \mathbf{m}_x)^H\} = \mathbf{R}_x - \mathbf{m}_x \mathbf{m}_x^H,
    \quad \text{where } \mathbf{m}_x = [m_x, m_x, \ldots, m_x]^T

The autocorrelation matrix of a WSS process x(n) is Toeplitz, Hermitian, and nonnegative definite; hence the eigenvalues of \mathbf{R}_x are nonnegative
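A minimal sketch (added), assuming the AR(1)-type autocorrelation r_x(k) = 0.8^{|k|} as an example of a valid sequence:

```python
import numpy as np
from scipy.linalg import toeplitz

# Assumed valid autocorrelation sequence r_x(k) = 0.8^|k|, k = 0..p
p = 3
r = 0.8 ** np.arange(p + 1)

# Toeplitz (and, for a real process, symmetric) autocorrelation matrix R_x
R = toeplitz(r)

print(np.linalg.eigvalsh(R))  # all eigenvalues >= 0 (nonnegative definite)
```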
Random processes
Sample mean (over one realization):

    \hat{x} = \frac{1}{N} \sum_{n=1}^{N} x(n)

[Figure: five realizations of the process x(n); each has its own sample mean, shown against the ensemble mean E\{x(n)\}]

When is the sample mean equal to the ensemble mean (expectation)?
Random processes
Ergodicity
Sample mean:

    \hat{m}_x(N) = \frac{1}{N} \sum_{n=0}^{N-1} x(n)

A WSS process is ergodic in the mean if

    \lim_{N \to \infty} E\{|\hat{m}_x(N) - m_x|^2\} = 0, \quad \text{i.e.} \quad \lim_{N \to \infty} \hat{m}_x(N) = m_x \ \text{(in the mean-square sense)}

Necessary and sufficient condition:

    \lim_{N \to \infty} \frac{1}{N} \sum_{k=0}^{N-1} c_x(k) = 0

Sufficient condition:

    \lim_{k \to \infty} c_x(k) = 0

Similar derivations exist for higher-order averages
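A small simulation (added) illustrating ergodicity in the mean; the AR(1) model and its coefficient are assumed for illustration:

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(1)

# Assumed model: AR(1) process x(n) = 0.9 x(n-1) + v(n). It has m_x = 0
# and c_x(k) -> 0 as k -> infinity, so it is ergodic in the mean.
N = 200_000
x = lfilter([1.0], [1.0, -0.9], rng.standard_normal(N))

m_hat = x.mean()   # time average over a single long realization
print(m_hat)       # close to the ensemble mean m_x = 0 for large N
```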
Random processes
White noise
White noise is a discrete-time random process v(n) with autocovariance

    c_v(k) = \sigma_v^2\, \delta(k)

i.e. c_v(k) = 0 for k \ne 0.

All variables are uncorrelated, with variance \sigma_v^2 (the probability density is not important)

The power spectrum of zero-mean white noise is constant:

    P_v(e^{j\omega}) = \sum_{k=-\infty}^{\infty} r_v(k)\, e^{-j\omega k} = \sigma_v^2
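A short check (added) that the density indeed does not matter; zero-mean uniform samples are an assumed, arbitrary choice:

```python
import numpy as np

rng = np.random.default_rng(2)

# Uniform noise on [-1, 1]: zero mean, variance 1/3, still white
N = 100_000
v = rng.uniform(-1.0, 1.0, N)

for k in range(4):                         # sample autocovariance at lags 0..3
    print(k, np.mean(v[: N - k] * v[k:]))  # approx [1/3, 0, 0, 0]
```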
Random processes
Power spectrum
The power spectrum of a WSS process is the DTFT of the autocorrelation:

    P_x(e^{j\omega}) = \sum_{k=-\infty}^{\infty} r_x(k)\, e^{-j\omega k},
    \quad \text{also:} \quad
    P_x(z) = \sum_{k=-\infty}^{\infty} r_x(k)\, z^{-k}

Since the autocorrelation is conjugate symmetric, the power spectrum is real:

    P_x(z) = P_x^*(1/z^*) \implies P_x(e^{j\omega}) = P_x^*(e^{j\omega})

If the stochastic process is real, the power spectrum is even:

    P_x(z) = P_x^*(z^*) \implies P_x(e^{j\omega}) = P_x^*(e^{-j\omega}) = P_x(e^{-j\omega})

The power spectrum is nonnegative:

    P_x(e^{j\omega}) \ge 0

The total power is proportional to the area under the power spectrum:

    E\{|x(n)|^2\} = r_x(0) = \frac{1}{2\pi} \int_{-\pi}^{\pi} P_x(e^{j\omega})\, d\omega
    \quad \text{(use the inverse DTFT and take } k = 0\text{)}
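A numerical sketch (added) of the last two properties, using an assumed MA(1) autocorrelation with r_x(0) = 1.25 and r_x(\pm 1) = 0.5 (from x(n) = v(n) + 0.5 v(n-1) with unit-variance white noise):

```python
import numpy as np

# P_x(e^{jw}) = sum_k r_x(k) e^{-jwk} for r_x = {0.5, 1.25, 0.5} at k = -1, 0, 1
w = np.linspace(-np.pi, np.pi, 4096, endpoint=False)
P = 1.25 + 2 * 0.5 * np.cos(w)

print(P.min())    # nonnegative (here the minimum is 0.25)
print(np.mean(P)) # ~ (1/2pi) * integral of P over [-pi, pi] = r_x(0) = 1.25
```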
Random processes
Power spectrum
The eigenvalues \lambda_i of the n \times n autocorrelation matrix are upper and lower bounded by the maximum and minimum value, respectively, of the power spectrum:

    \min_{\omega} P_x(e^{j\omega}) \le \lambda_i \le \max_{\omega} P_x(e^{j\omega})

The power spectrum is related to the mean of |X(e^{j\omega})|^2 as

    P_x(e^{j\omega}) = \lim_{N \to \infty} \frac{1}{2N+1}\, E\left\{ \left| \sum_{n=-N}^{N} x(n)\, e^{-j\omega n} \right|^2 \right\}

If x(n) has a nonzero mean or a periodicity, the power spectrum contains impulses
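A sketch (added) checking the eigenvalue bounds for an assumed AR(1)-type autocorrelation r_x(k) = a^{|k|}, whose power spectrum is (1 - a^2)/|1 - a e^{-j\omega}|^2:

```python
import numpy as np
from scipy.linalg import toeplitz

a, n = 0.5, 8
eig = np.linalg.eigvalsh(toeplitz(a ** np.arange(n)))  # n x n R_x eigenvalues

w = np.linspace(-np.pi, np.pi, 4096)
P = (1 - a**2) / np.abs(1 - a * np.exp(-1j * w)) ** 2  # power spectrum

print(P.min() <= eig.min(), eig.max() <= P.max())       # True True
```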
Filtering random processes
Suppose x(n) is a WSS process with mean m_x and correlation r_x(k) that is filtered by a stable LSI filter with unit sample response h(n); then the output y(n) is also WSS with

    m_y = m_x\, H(e^{j0})
    r_y(k) = r_x(k) * h(k) * h^*(-k) = r_x(k) * r_h(k)

where r_h(k) is the (deterministic) autocorrelation of h(n):

    r_h(k) = h(k) * h^*(-k) = \sum_{n=-\infty}^{\infty} h(n)\, h^*(n+k)

The power of y(n) is given by

    E\{|y(n)|^2\} = r_y(0) = \sum_{l=-\infty}^{\infty} \sum_{m=-\infty}^{\infty} h(l)\, r_x(m-l)\, h^*(m) = \mathbf{h}^H \mathbf{R}_x \mathbf{h}

where we assume h(n) is zero outside [0, N-1] and \mathbf{h} = [h(0), h(1), \ldots, h(N-1)]^T
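A simulation sketch (added); the FIR coefficients are assumed values, and the input is unit-variance white noise so that R_x = I:

```python
import numpy as np
from scipy.linalg import toeplitz
from scipy.signal import lfilter

rng = np.random.default_rng(3)

h = np.array([1.0, 0.5, 0.25])     # assumed FIR filter h(n)
R = toeplitz([1.0, 0.0, 0.0])      # R_x = I for unit-variance white input

y = lfilter(h, [1.0], rng.standard_normal(500_000))
print(np.mean(y**2))               # simulated output power E{|y(n)|^2}
print(h @ R @ h)                   # h^H R_x h = 1.3125, matches
```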
Filtering random processes
In terms of the power spectrum, this means that

    P_y(e^{j\omega}) = P_x(e^{j\omega})\, |H(e^{j\omega})|^2
    P_y(z) = P_x(z)\, H(z)\, H^*(1/z^*)

So, assuming no pole/zero cancellations between P_x(z) and H(z), if H(z) has a pole (zero) at z = z_0, then P_y(z) also has a pole (zero) at z = z_0 and another at the conjugate reciprocal location z = 1/z_0^*

If H(e^{j\omega}) is a narrowband bandpass filter with center frequency \omega_0, bandwidth \Delta\omega, and magnitude 1, then the output power is

    E\{|y(n)|^2\} = r_y(0) = \frac{1}{2\pi} \int_{-\pi}^{\pi} |H(e^{j\omega})|^2\, P_x(e^{j\omega})\, d\omega \approx \frac{\Delta\omega}{2\pi}\, P_x(e^{j\omega_0})

so the power spectrum describes how the power is distributed over frequency
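A sketch (added) verifying the magnitude-squared relation with scipy's welch and freqz; the filter coefficients are assumed values and the input is unit-variance white noise, so P_x = 1:

```python
import numpy as np
from scipy.signal import freqz, lfilter, welch

rng = np.random.default_rng(4)

b, a = [1.0, -0.4], [1.0, -0.8]    # assumed stable filter H(z)
y = lfilter(b, a, rng.standard_normal(1_000_000))

f, Pyy = welch(y, nperseg=4096, return_onesided=False)  # estimate of P_y
w, H = freqz(b, a, worN=2 * np.pi * f)                  # H at the same freqs

print(np.median(Pyy / np.abs(H) ** 2))  # ~1.0: P_y = |H|^2 P_x with P_x = 1
```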
Spectral factorization
If the power spectrum P_x(e^{j\omega}) of a WSS process is a continuous function of \omega, then P_x(z) may be factored as

    P_x(z) = \sum_{k=-\infty}^{\infty} r_x(k)\, z^{-k} = \sigma_0^2\, Q(z)\, Q^*(1/z^*)

Proof:

If \ln[P_x(z)] is analytic in \rho < |z| < 1/\rho, then we can write

    \ln[P_x(z)] = \sum_{k=-\infty}^{\infty} c(k)\, z^{-k}
    \quad \text{and} \quad
    \ln[P_x(e^{j\omega})] = \sum_{k=-\infty}^{\infty} c(k)\, e^{-j\omega k}

so c(k) is the IDTFT of \ln[P_x(e^{j\omega})], and since \ln[P_x(e^{j\omega})] is real, c(k) = c^*(-k)
Spectral factorization
Proof (continued):

Now we can write

    P_x(z) = \exp\{c(0)\}\, \exp\left\{\sum_{k=1}^{\infty} c(k)\, z^{-k}\right\} \exp\left\{\sum_{k=-\infty}^{-1} c(k)\, z^{-k}\right\}

If we now define the second exponential as

    Q(z) = \exp\left\{\sum_{k=1}^{\infty} c(k)\, z^{-k}\right\}, \quad |z| > \rho

then we can express the third exponential as

    \exp\left\{\sum_{k=-\infty}^{-1} c(k)\, z^{-k}\right\} = \left[\exp\left\{\sum_{k=1}^{\infty} c(k)\, (1/z^*)^{-k}\right\}\right]^* = Q^*(1/z^*), \quad |z| < 1/\rho

and so we obtain

    P_x(z) = \sigma_0^2\, Q(z)\, Q^*(1/z^*) \quad \text{with} \quad \sigma_0^2 = \exp\{c(0)\}
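The construction in this proof can be carried out numerically (a sketch added here, not in the slides): sample ln P_x on a dense grid, take the inverse FFT to obtain c(k), and exponentiate the causal part. The example spectrum is an assumed one with known factorization:

```python
import numpy as np

# Assumed spectrum: P_x(e^{jw}) = 2 |1 + 0.5 e^{-jw}|^2,
# so Q(z) = 1 + 0.5 z^{-1} and sigma_0^2 = 2.
M = 4096
w = 2 * np.pi * np.arange(M) / M
P = 2.0 * np.abs(1 + 0.5 * np.exp(-1j * w)) ** 2

c = np.fft.ifft(np.log(P)).real            # c(k): IDTFT of ln P_x (real here)
sigma0_sq = np.exp(c[0])                   # sigma_0^2 = exp{c(0)}

c_causal = np.zeros(M)
c_causal[1 : M // 2] = c[1 : M // 2]       # keep only lags k >= 1
q = np.fft.ifft(np.exp(np.fft.fft(c_causal))).real  # Q(z) = exp{sum_k c(k) z^-k}

print(sigma0_sq)  # ~2.0
print(q[:3])      # ~[1.0, 0.5, 0.0]: monic, minimum-phase factor
```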
Spectral factorization
The filter Q(z) is causal, stable, and minimum phase; moreover, it is monic:

    Q(z) = 1 + q(1)\, z^{-1} + q(2)\, z^{-2} + \cdots

A process that can be factorized as described above is called a regular process

Properties of a regular process:

    A regular process can be realized as the output of a filter H(z) driven by white noise with variance \sigma_0^2

    If the process is filtered by the inverse filter 1/H(z), then the output is white noise with variance \sigma_0^2 (whitening)

    The process and the white noise contain the same information (compression)
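A whitening sketch (added), with an assumed minimum-phase H(z) = B(z)/A(z):

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(5)

b, a = [1.0, 0.5], [1.0, -0.9]   # assumed H(z): zero and pole inside unit circle
v = rng.standard_normal(200_000)
x = lfilter(b, a, v)              # regular process: H(z) driven by white noise
u = lfilter(a, b, x)              # inverse filter 1/H(z) = A(z)/B(z): whitening

for k in range(3):
    print(k, np.mean(u[: len(u) - k] * u[k:]))  # ~[1, 0, 0]: white, sigma_0^2 = 1
```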
Spectral factorization
Suppose the power spectrum is a rational function

    P_x(z) = \frac{N(z)}{D(z)}

then the spectral factorization tells us we can factor this as

    P_x(z) = \sigma_0^2\, Q(z)\, Q^*(1/z^*) = \sigma_0^2 \left[\frac{B(z)}{A(z)}\right] \left[\frac{B^*(1/z^*)}{A^*(1/z^*)}\right]

where

    B(z) = 1 + b(1)\, z^{-1} + \cdots + b(q)\, z^{-q}
    A(z) = 1 + a(1)\, z^{-1} + \cdots + a(p)\, z^{-p}

whose roots are all inside the unit circle

Since P_x(e^{j\omega}) is real, we have P_x(z) = P_x^*(1/z^*); so the poles and zeros occur in conjugate reciprocal pairs, and we simply relate the zeros inside the unit circle to the zeros of B(z) and the poles inside the unit circle to the zeros of A(z)
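A sketch (added) of this factorization for an assumed pure-MA spectrum, keeping from each reciprocal pair the zero inside the unit circle:

```python
import numpy as np

# Assumed example: P_x(z) = z + 2.5 + z^{-1} = 2 (1 + 0.5 z^{-1})(1 + 0.5 z),
# i.e. sigma_0^2 = 2 and B(z) = 1 + 0.5 z^{-1}.
p = np.array([1.0, 2.5, 1.0])          # z * P_x(z) as a polynomial in z
roots = np.roots(p)                     # zeros of P_x: -0.5 and -2 (reciprocal pair)
b = np.poly(roots[np.abs(roots) < 1])   # keep the zero inside the unit circle
sigma0_sq = p[1] / np.sum(b**2)         # r_x(0) = sigma_0^2 * sum_k b(k)^2

print(b, sigma0_sq)                     # [1.0, 0.5], 2.0
```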
Special types of random processes
Autoregressive moving average processes
Suppose we filter white noise v(n) of variance \sigma_v^2 with the filter

    H(z) = \frac{B_q(z)}{A_p(z)} = \frac{\sum_{k=0}^{q} b_q(k)\, z^{-k}}{1 + \sum_{k=1}^{p} a_p(k)\, z^{-k}}

The power spectrum of the output x(n) can then be written as

    P_x(z) = \sigma_v^2\, \frac{B_q(z)\, B_q^*(1/z^*)}{A_p(z)\, A_p^*(1/z^*)}
    \qquad
    P_x(e^{j\omega}) = \sigma_v^2\, \frac{|B_q(e^{j\omega})|^2}{|A_p(e^{j\omega})|^2}

Such a process is known as an autoregressive moving average process of order (p, q), or ARMA(p, q)

The power spectrum of an ARMA(p, q) process has 2p poles and 2q zeros with conjugate reciprocal symmetry
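A small check (added) of the conjugate reciprocal pole/zero pattern for an assumed ARMA(1, 1) model; with real coefficients, 1/z^* reduces to 1/z at the roots:

```python
import numpy as np

# Assumed B(z) = 1 + 0.6 z^{-1} and A(z) = 1 - 0.7 z^{-1}
zeros = np.roots([1.0, 0.6])    # zero of B(z) at z = -0.6
poles = np.roots([1.0, -0.7])   # pole of 1/A(z) at z = 0.7

# P_x(z) has each root together with its reciprocal: 2q zeros, 2p poles
print(np.concatenate([zeros, 1 / zeros]))  # -0.6 and -1/0.6
print(np.concatenate([poles, 1 / poles]))  # 0.7 and 1/0.7
```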
Special types of random processes
Autoregressive moving average processes
From the LCCDE between v(n) and x(n):

    x(n) + \sum_{l=1}^{p} a_p(l)\, x(n-l) = \sum_{l=0}^{q} b_q(l)\, v(n-l)

we can multiply both sides with x^*(n-k) and take the expectation:

    r_x(k) + \sum_{l=1}^{p} a_p(l)\, r_x(k-l) = \sum_{l=0}^{q} b_q(l)\, E\{v(n-l)\, x^*(n-k)\}

The cross-correlation between v(n) and x(n) can further be expressed as

    r_{vx}(k, l) = E\{v(k)\, x^*(l)\} = \sum_{m=-\infty}^{\infty} E\{v(k)\, v^*(l-m)\}\, h^*(m) = \sigma_v^2\, h^*(l-k)

so that E\{v(n-l)\, x^*(n-k)\} = \sigma_v^2\, h^*(l-k). For k \ge 0, this leads (using the causality of h(n)) to the Yule-Walker equations

    r_x(k) + \sum_{l=1}^{p} a_p(l)\, r_x(k-l) =
    \begin{cases}
        \sigma_v^2 \sum_{l=0}^{q} b_q(l)\, h^*(l-k) = \sigma_v^2\, c_q(k), & 0 \le k \le q \\
        0, & k > q
    \end{cases}
Special types of random processes
Autoregressive moving average processes
The Yule-Walker equations can be stacked for k = 0, 1, \ldots, p + q:

    \begin{bmatrix}
        r_x(0)   & r_x(-1)    & \cdots & r_x(-p)    \\
        r_x(1)   & r_x(0)     & \cdots & r_x(-p+1)  \\
        \vdots   & \vdots     &        & \vdots     \\
        r_x(q)   & r_x(q-1)   & \cdots & r_x(q-p)   \\
        r_x(q+1) & r_x(q)     & \cdots & r_x(q-p+1) \\
        \vdots   & \vdots     &        & \vdots     \\
        r_x(q+p) & r_x(q+p-1) & \cdots & r_x(q)
    \end{bmatrix}
    \begin{bmatrix}
        1 \\ a_p(1) \\ \vdots \\ a_p(p)
    \end{bmatrix}
    =
    \sigma_v^2
    \begin{bmatrix}
        c_q(0) \\ c_q(1) \\ \vdots \\ c_q(q) \\ 0 \\ \vdots \\ 0
    \end{bmatrix}

Given the filter coefficients a_p(k) and b_q(k), this gives a recursion for the autocorrelation

Given the autocorrelation, we may compute the filter coefficients a_p(k) and b_q(k)
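A numerical sketch (added) checking these equations for an assumed ARMA(1, 1) model, with the autocorrelation estimated from a long realization:

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(7)

# Assumed model: x(n) - 0.7 x(n-1) = v(n) + 0.6 v(n-1), sigma_v^2 = 1
b, a = np.array([1.0, 0.6]), np.array([1.0, -0.7])
x = lfilter(b, a, rng.standard_normal(2_000_000))

def r_hat(k):                                   # sample autocorrelation
    return np.mean(x[: len(x) - k] * x[k:])

h = lfilter(b, a, np.r_[1.0, np.zeros(63)])     # impulse response h(n)

for k in range(4):                              # rows k = 0, 1, 2, 3 (p = q = 1)
    lhs = r_hat(k) + a[1] * r_hat(abs(k - 1))   # r_x(k) + a_p(1) r_x(k-1)
    # right-hand side: sigma_v^2 c_q(k) for k <= q, zero for k > q
    rhs = np.sum(b[k:] * h[: len(b) - k]) if k <= 1 else 0.0
    print(k, lhs, rhs)                          # lhs ~ rhs (estimation noise)
```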
Special types of random processes
Autoregressive processes
An ARMA(p, 0) process is an autoregressive process, or AR(p):

    P_x(z) = \frac{\sigma_v^2\, |b(0)|^2}{A_p(z)\, A_p^*(1/z^*)}
    \qquad
    P_x(e^{j\omega}) = \frac{\sigma_v^2\, |b(0)|^2}{|A_p(e^{j\omega})|^2}

The Yule-Walker equations are given by

    r_x(k) + \sum_{l=1}^{p} a_p(l)\, r_x(k-l) = \sigma_v^2\, |b(0)|^2\, \delta(k), \quad k \ge 0

Stacking the Yule-Walker equations for k = 0, 1, \ldots, p (recall r_x(-k) = r_x^*(k)):

    \begin{bmatrix}
        r_x(0)  & r_x(-1)  & \cdots & r_x(-p)   \\
        r_x(1)  & r_x(0)   & \cdots & r_x(-p+1) \\
        \vdots  & \vdots   &        & \vdots    \\
        r_x(p)  & r_x(p-1) & \cdots & r_x(0)
    \end{bmatrix}
    \begin{bmatrix}
        1 \\ a_p(1) \\ \vdots \\ a_p(p)
    \end{bmatrix}
    =
    \sigma_v^2\, |b(0)|^2
    \begin{bmatrix}
        1 \\ 0 \\ \vdots \\ 0
    \end{bmatrix}

Estimating a_p(k) from the Yule-Walker equations is easy (linear)
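A sketch (added) of this linear estimate for an assumed AR(2) model:

```python
import numpy as np
from scipy.linalg import solve, toeplitz
from scipy.signal import lfilter

rng = np.random.default_rng(8)

# Assumed true model: A_p(z) = 1 - 0.75 z^{-1} + 0.5 z^{-2} (stable)
a_true = [1.0, -0.75, 0.5]
x = lfilter([1.0], a_true, rng.standard_normal(1_000_000))

# Sample autocorrelation r_x(0), r_x(1), r_x(2)
r = np.array([np.mean(x[: len(x) - k] * x[k:]) for k in range(3)])

# Rows k = 1, ..., p:  r_x(k) + sum_l a_p(l) r_x(k - l) = 0  (linear in a_p)
a_hat = solve(toeplitz(r[:2]), -r[1:])
print(a_hat)   # ~[-0.75, 0.5]
```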
Special types of random processes
Moving average processes
An ARMA(0, q) process is a moving average process, or MA(q):

    P_x(z) = \sigma_v^2\, B_q(z)\, B_q^*(1/z^*)
    \qquad
    P_x(e^{j\omega}) = \sigma_v^2\, |B_q(e^{j\omega})|^2

The Yule-Walker equations are given by

    r_x(k) = \sigma_v^2 \sum_{l=0}^{q} b_q(l)\, b_q^*(l-k) = \sigma_v^2\, b_q(k) * b_q^*(-k)

The autocorrelation function is zero outside [-q, q]

Estimating b_q(k) from the Yule-Walker equations is not easy (nonlinear)
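A short sketch (added) of the MA autocorrelation; the coefficients are assumed values and the noise variance is 1:

```python
import numpy as np

# Assumed MA(2) coefficients b_q(k); r_x(k) is the deterministic
# autocorrelation of b_q, so it vanishes outside [-q, q].
b = np.array([1.0, 0.4, -0.3])
r = np.correlate(b, b, mode="full")   # r_x(k) for k = -q, ..., q

print(r)   # only 2q + 1 nonzero values: [-0.3, 0.28, 1.25, 0.28, -0.3]
```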