
Response of Linear Systems to Random Signals

Stationary and Weakly Stationary Systems


Introduction: One of the most important applications of probability theory is the analysis of random signals and systems. Here we will introduce the basic concept of analyzing the response of an LTI system when the input is a random process.
Stationary processes: A process is stationary iff its statistical properties remain unchanged under any time shift (time-invariant).
The requirement for a process to be stationary is rather stringent, which often renders the study of such processes impractical. To relax the requirement so that the results can be applied to a wider range of physical systems and processes, we define a process called a weakly stationary or second-order stationary process.
Weakly stationary processes: A process X(t) is weakly stationary if its first two moments remain unchanged under any time shift (time-invariant) and the covariance between X(t) and X(s) depends only on |t - s|.
Ergodic Process
For an ergodic process, all statistical averages are equal to their time-average equivalents.
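As a quick numerical illustration of ergodicity (not part of the original notes; it assumes NumPy and uses a made-up amplitude and frequency), the sketch below compares the ensemble average of a random-phase sine process with the time average of a single realization:

```python
import numpy as np

# Random-phase sine process: X(t) = A*sin(2*pi*f*t + theta), theta ~ U[0, 2*pi).
# The process is ergodic in the mean, so the time average of one realization
# should agree with the ensemble (statistical) average across realizations.
rng = np.random.default_rng(0)
A, f = 1.0, 5.0                                  # assumed amplitude and frequency (Hz)
t = np.arange(0.0, 10.0, 1e-3)                   # 10 s record, 1 ms sampling

thetas = rng.uniform(0.0, 2.0 * np.pi, 5000)
ensemble_mean = np.mean(A * np.sin(2 * np.pi * f * 0.3 + thetas))   # E[X(0.3)]

x = A * np.sin(2 * np.pi * f * t + thetas[0])    # one realization
time_mean = x.mean()                             # temporal mean of that realization

print(ensemble_mean, time_mean)                  # both are close to the true mean, 0
```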
Properties of Random Signals
Introduction: Since random signals are defined in terms of probability, one way to characterize a random signal is to analyze a long-term observation of the process. The following analysis assumes the (weakly) stationary/ergodic property.
The four basic properties of random signals:
o Autocorrelation function - A measure of the degree of randomness of a signal.
o Cross-correlation function - A measure of correlation between two signals.
o Autospectral density function - A measure of the frequency distribution of a signal.
o Cross-spectral density function - A measure of the frequency distribution of one signal (output) due to another signal (input).
Correlation functions:
o Autocorrelation function: The autocorrelation function R_xx(τ) of a random signal X(t) is a measure of how well the future values of X(t) can be predicted based on past measurements. It contains no phase information of the signal.

R_xx(τ) = E[X(t) X(t + τ)] = ∫∫ x₁ x₂ P(x₁, x₂) dx₁ dx₂

where x₁ = X(t), x₂ = X(t + τ), and P(x₁, x₂) is the joint PDF of x₁ and x₂.
o Cross-correlation function: The cross-correlation function R_xy(τ) is a measure of how well the future values of one signal can be predicted based on past measurements of another signal.

R_xy(τ) = E[X(t) Y(t + τ)]
o Properties of correlation functions:
R_xx(-τ) = R_xx(τ)   (even function)
R_xy(-τ) = R_yx(τ)
R_xx(0) = ψ_x² = μ_x² + σ_x², the mean square value of x(t)
R_xx(∞) = μ_x², the square of the mean value
The cross-correlation inequality: |R_xy(τ)|² ≤ R_xx(0) R_yy(0)
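A brief numerical check of two of these properties (a sketch assuming NumPy; the white sequence and the lags are made-up):

```python
import numpy as np

# Check R_xx(0) = mean square value, and that a zero-mean white sequence has
# R_xx(tau) ~ 0 away from zero lag (i.e., it is maximally unpredictable).
rng = np.random.default_rng(1)
x = rng.standard_normal(20000)

def autocorr(x, max_lag):
    """Biased sample autocorrelation R_xx(tau) for tau = 0 .. max_lag."""
    n = len(x)
    return np.array([np.dot(x[:n - k], x[k:]) / n for k in range(max_lag + 1)])

R = autocorr(x, 20)
print(np.isclose(R[0], np.mean(x**2)))   # R_xx(0) equals the mean square value
print(np.max(np.abs(R[1:])) < 0.05)      # nearly zero for tau != 0 (white noise)
```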
o Example:
R_xx(τ) of a sine wave process:
Let X(t) = A sin(2πft + θ), where P(θ) = 1/2π for 0 ≤ θ ≤ 2π and 0 elsewhere.
Let x₁ = A sin(2πft + θ) and x₂ = A sin(2πf(t + τ) + θ). Then

R_xx(τ) = E[x₁ x₂] = (1/2π) ∫₀^2π A² sin(2πft + θ) sin(2πf(t + τ) + θ) dθ = (A²/2) cos(2πfτ)

[Plot: R_xx(τ) for the sine wave process - a non-decaying cosine of amplitude A²/2]

Since the envelope of the autocorrelation function is constant as τ → ∞, we can conclude that any future values of a sine wave process can be accurately predicted based on past measurements.
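A small sketch (NumPy assumed; the amplitude, frequency, and record length are illustrative) that estimates R_xx(τ) from one realization by time averaging and compares it with (A²/2) cos(2πfτ):

```python
import numpy as np

# Time-average estimate of R_xx(tau) for a random-phase sine process,
# compared with the theoretical result (A**2 / 2) * cos(2*pi*f*tau).
rng = np.random.default_rng(2)
A, f, dt = 2.0, 100.0, 1e-4                      # amplitude, frequency (Hz), step (s)
t = np.arange(0.0, 5.0, dt)
x = A * np.sin(2 * np.pi * f * t + rng.uniform(0.0, 2.0 * np.pi))

lags = np.arange(0, 200)                         # lags 0 .. 0.02 s
R_est = np.array([np.mean(x[:len(x) - k] * x[k:]) for k in lags])
R_theory = (A**2 / 2) * np.cos(2 * np.pi * f * lags * dt)

print(np.max(np.abs(R_est - R_theory)))          # small; the envelope never decays
```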
A time-delay system:
y(t) = x(t - τ₀) + n(t), where the noise n(t) is zero-mean and uncorrelated with x(t).

R_xy(τ) = E[x(t) y(t + τ)] = E[x(t) x(t + τ - τ₀)] + E[x(t) n(t + τ)] = R_xx(τ - τ₀)
R_yy(τ) = E[y(t) y(t + τ)] = R_xx(τ) + R_nn(τ)

The cross-correlation therefore peaks at τ = τ₀, which is how the time delay can be identified.
Spectral density functions:
o Autospectral density function (autospectrum): The autospectral density function S_xx(f) of a random signal X(t) is a measure of its frequency distribution. It is the Fourier transform of the autocorrelation function.

S_xx(f) = ∫_{-∞}^{∞} R_xx(τ) e^{-j2πfτ} dτ
R_xx(τ) = ∫_{-∞}^{∞} S_xx(f) e^{j2πfτ} df = 2 ∫₀^{∞} S_xx(f) cos(2πfτ) df
o Cross-spectral density function (cross-spectrum): The cross-spectral density function S_xy(f) is a measure of the portion of the frequency distribution of one signal that is a result of another signal. It is the Fourier transform of the cross-correlation function.

S_xy(f) = ∫_{-∞}^{∞} R_xy(τ) e^{-j2πfτ} dτ,   R_xy(τ) = ∫_{-∞}^{∞} S_xy(f) e^{j2πfτ} df
o Symmetry: S_xx(-f) = S_xx(f)   (even function);   S_xy(-f) = S_xy*(f) = S_yx(f)
o Example:
White noise: The autocorrelation function of a white noise process is an impulse function. The implication here is that white noise is totally unpredictable, since its autocorrelation function drops to zero for τ ≠ 0. Of course, this is what we expect to see in a white noise process, a random process with infinite energy.
Low-pass white noise: Assume the low-pass white noise has a magnitude of 1 while the bandwidth varies. The autocorrelation function is computed using the IFFT.
[Mathcad calculation and plots: S_nn(f) := 1 for 0 ≤ f ≤ 200 Hz, 100 Hz, and 50 Hz (0 otherwise), with R_nn := ifft(S_nn); the plots show S_nn(f) versus f and R_nn versus τ. As the bandwidth is reduced, the main lobe of R_nn widens.]
The autocorrelation function of a low-pass white noise process is a sinc [sin(x)/x] function. The implication here is that the predictability of a low-pass white noise process is inversely proportional to the bandwidth of the noise. Since the noise energy is proportional to the bandwidth, this conclusion is consistent with common sense.
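A NumPy sketch in the same spirit as the Mathcad ifft calculation (the 1 Hz frequency grid is an assumption; the cutoff values follow the notes):

```python
import numpy as np

# Autocorrelation of low-pass white noise as the inverse FFT of its spectrum.
# A narrower bandwidth gives a wider (sinc-shaped) main lobe, i.e. the process
# becomes more predictable.
n_bins = 1024
freqs = np.fft.fftfreq(n_bins, d=1.0 / n_bins)           # 1 Hz bins (assumed)

for cutoff in (200.0, 100.0, 50.0):                       # bandwidths from the notes
    S_nn = np.where(np.abs(freqs) <= cutoff, 1.0, 0.0)    # two-sided flat spectrum
    R_nn = np.real(np.fft.ifft(S_nn))                     # sinc-shaped autocorrelation
    first_zero = np.argmax(R_nn < 0)                      # first zero crossing (in lags)
    print(cutoff, first_zero)                             # lobe widens as cutoff drops
```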
Band-pass white noise: Assume the band-pass white noise has a magnitude of 1 while the bandwidth varies. The autocorrelation function is computed using the IFFT.
[Mathcad calculation and plots: S_nn(f) := 1 for 100 Hz ≤ f ≤ 200 Hz, 100-110 Hz, and 100-101 Hz (0 otherwise), with R_nn := ifft(S_nn); the plots show S_nn(f) versus f and R_nn versus τ. As the band narrows, R_nn approaches a pure cosine at the band frequency.]
The autocorrelation function of a band-pass white noise process has a similar property - the predictability is inversely proportional to the bandwidth of the noise. In addition, the autocorrelation function approaches that of a sine wave process as the bandwidth decreases, a conclusion that can easily be drawn using LTI system theory for deterministic systems.
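The same kind of IFFT sketch for the band-pass case (band edges follow the notes; the frequency grid and lag window are assumptions), showing R_nn approaching a cosine at the band center as the band narrows:

```python
import numpy as np

# Autocorrelation of band-pass white noise: as [f_lo, f_hi] narrows, R_nn(tau)
# approaches a cosine at the center frequency, i.e. a sine-wave process.
n_bins = 4096
freqs = np.fft.fftfreq(n_bins, d=1.0 / n_bins)             # 1 Hz bins (assumed)
m = np.arange(200)                                          # look at small lags only

for f_lo, f_hi in ((100.0, 200.0), (100.0, 110.0), (100.0, 101.0)):
    S_nn = np.where((np.abs(freqs) >= f_lo) & (np.abs(freqs) <= f_hi), 1.0, 0.0)
    R_nn = np.real(np.fft.ifft(S_nn))
    fc = 0.5 * (f_lo + f_hi)                                # band-center frequency
    ref = np.cos(2 * np.pi * fc * m / n_bins)               # pure cosine at fc
    print(f_lo, f_hi, np.corrcoef(R_nn[:200], ref)[0, 1])   # -> 1 as the band narrows
```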
LTI Systems
Introduction: LTI systems are analyzed using the correlation/spectral technique. The inputs are assumed to be stationary/ergodic random signals with mean = 0.
Ideal system:

y(t) = ∫₀^∞ h(τ) x(t - τ) dτ
Y(f) = H(f) X(f)
Ideal model:
o Correlation and spectral relationships:

R_xy(τ) = ∫₀^∞ h(α) R_xx(τ - α) dα
R_yy(τ) = ∫₀^∞ ∫₀^∞ h(α) h(β) R_xx(τ + α - β) dα dβ
S_xy(f) = H(f) S_xx(f)
S_yy(f) = |H(f)|² S_xx(f)
Total output noise energy:

ψ_y² = ∫_{-∞}^{∞} S_yy(f) df
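A sketch (assuming NumPy/SciPy; the filter, sampling rate, and record length are illustrative) that checks S_yy(f) = |H(f)|² S_xx(f) with Welch spectral estimates, and ψ_y² as the integral of S_yy:

```python
import numpy as np
from scipy import signal

# Pass white noise through an LTI filter and check S_yy(f) = |H(f)|**2 * S_xx(f),
# and that the integral of S_yy equals the output mean square value (psi_y**2).
rng = np.random.default_rng(4)
fs = 1000.0                                      # sampling rate in Hz (assumed)
x = rng.standard_normal(200000)                  # white input: S_xx(f) ~ constant

b, a = signal.butter(2, 50.0, fs=fs)             # an arbitrary low-pass filter (50 Hz)
y = signal.lfilter(b, a, x)

f, S_xx = signal.welch(x, fs=fs, nperseg=2048)   # one-sided spectral estimates
_, S_yy = signal.welch(y, fs=fs, nperseg=2048)
_, H = signal.freqz(b, a, worN=f, fs=fs)

ratio = S_yy[1:200] / (np.abs(H[1:200])**2 * S_xx[1:200])
print(ratio.mean())                              # ~1, i.e. S_yy = |H|^2 * S_xx
print(np.sum(S_yy) * (f[1] - f[0]), y.var())     # integral of S_yy ~ psi_y^2
```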
o Example: LPF to white noise

H(f) = 1 / (1 + j2πfK)   (LPF), with K = RC
|H(f)| = 1 / [1 + (2πfK)²]^(1/2)  and  φ(f) = -tan⁻¹(2πfK)
h(t) = (1/K) e^{-t/K} u(t)
For white noise: S_xx(f) = A
S_yy(f) = |H(f)|² S_xx(f) = A / [1 + (2πfK)²]
R_yy(τ) = ∫_{-∞}^{∞} S_yy(f) e^{j2πfτ} df = (A/2K) e^{-|τ|/K}
ψ_y² = ∫_{-∞}^{∞} S_yy(f) df = ∫_{-∞}^{∞} A / [1 + (2πfK)²] df = A/2K
or, equivalently, ψ_y² = A ∫₀^∞ h²(t) dt = (A/K²) ∫₀^∞ e^{-2t/K} dt = A/2K

One-sided spectrum: G(f) = 2 S(f) for f ≥ 0

[Block diagram: x(t) → H(f), h(t) → y(t)]
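A quick simulation check of ψ_y² = A/(2K) (a sketch; the RC value, sample step, and noise level are made-up, and the filter is a simple forward-Euler model of h(t)):

```python
import numpy as np
from scipy import signal

# Check psi_y^2 = A / (2K) for the RC low-pass filter driven by white noise.
rng = np.random.default_rng(5)
dt, K = 1e-4, 0.01                               # sample step (s) and K = RC (s)
x = rng.standard_normal(1_000_000)               # unit-variance white samples
A = 1.0 * dt                                     # equivalent two-sided PSD level of x

b, a = [dt / K], [1.0, -(1.0 - dt / K)]          # forward-Euler model of h(t)
y = signal.lfilter(b, a, x)

print(y.var(), A / (2 * K))                      # both ~ 5e-3
```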
o Example: LPF to a sine process

H(f) = 1 / (1 + j2πfK)   (LPF), with K = RC
|H(f)| = 1 / [1 + (2πfK)²]^(1/2)  and  φ(f) = -tan⁻¹(2πfK)
h(t) = (1/K) e^{-t/K} u(t)
For a sine wave: S_xx(f) = (A²/4) [δ(f - f₀) + δ(f + f₀)]
S_yy(f) = |H(f)|² S_xx(f) = (A²/4) [δ(f - f₀) + δ(f + f₀)] / [1 + (2πf₀K)²]
R_yy(τ) = ∫_{-∞}^{∞} S_yy(f) e^{j2πfτ} df = (A²/2) cos(2πf₀τ) / [1 + (2πf₀K)²]
ψ_y² = ∫_{-∞}^{∞} S_yy(f) df = A² / {2 [1 + (2πf₀K)²]}
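A corresponding simulation check (assumed amplitude, frequency, and RC value) of ψ_y² = A²/(2[1 + (2πf₀K)²]):

```python
import numpy as np
from scipy import signal

# Check psi_y^2 = A**2 / (2 * (1 + (2*pi*f0*K)**2)) for a sine through the RC filter.
dt, K = 1e-5, 0.01                               # sample step (s) and K = RC (s)
A, f0 = 2.0, 50.0                                # sine amplitude and frequency (Hz)
t = np.arange(0.0, 2.0, dt)
x = A * np.sin(2 * np.pi * f0 * t)

b, a = [dt / K], [1.0, -(1.0 - dt / K)]          # forward-Euler model of h(t)
y = signal.lfilter(b, a, x)

measured = np.mean(y[len(y) // 2:] ** 2)         # mean square after the transient
theory = A**2 / (2 * (1 + (2 * np.pi * f0 * K)**2))
print(measured, theory)                          # both ~ 0.18
```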
Model with uncorrelated input and output noise:

x(t) = u(t) + m(t),   y(t) = v(t) + n(t)
S_mn(f) = S_um(f) = S_vn(f) = 0
S_xx(f) = S_uu(f) + S_mm(f),   S_yy(f) = S_vv(f) + S_nn(f)
S_xy(f) = S_uv(f) = H(f) S_uu(f),   S_vv(f) = |H(f)|² S_uu(f)
H(f) = S_xy(f) / S_uu(f) = S_xy(f) / [S_xx(f) - S_mm(f)]
|H(f)|² = S_vv(f) / S_uu(f) = [S_yy(f) - S_nn(f)] / [S_xx(f) - S_mm(f)]
[Block diagram: the true input u(t) plus input noise m(t) gives the measured input x(t); u(t) passes through H(f), h(t) to give the true output v(t), which plus output noise n(t) gives the measured output y(t)]
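A sketch of the corresponding estimator (NumPy/SciPy assumed; the system, noise level, and record length are illustrative). With no input noise (m = 0, so S_xx = S_uu), H(f) = S_xy/S_xx is unaffected by the output noise because n(t) drops out of the cross-spectrum:

```python
import numpy as np
from scipy import signal

# Estimate H(f) = S_xy(f) / S_xx(f) when only output noise n(t) is present
# (m(t) = 0, so S_xx = S_uu); the cross-spectrum rejects the uncorrelated noise.
rng = np.random.default_rng(6)
fs = 1000.0
u = rng.standard_normal(400000)                  # measured input (no input noise)
b, a = signal.butter(2, 100.0, fs=fs)            # the "unknown" system to identify
v = signal.lfilter(b, a, u)
y = v + 0.5 * rng.standard_normal(len(v))        # measured output with noise n(t)

f, S_xy = signal.csd(u, y, fs=fs, nperseg=4096)
_, S_xx = signal.welch(u, fs=fs, nperseg=4096)
H_est = S_xy / S_xx
_, H_true = signal.freqz(b, a, worN=f, fs=fs)

print(np.max(np.abs(H_est[5:300] - H_true[5:300])))   # small estimation error
```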
