
**Overview of Random Processes**

GOAL

This fundamental course is concerned with the statistical characterization of random signals.

EE 301, Signals & Systems: deterministic signals (no randomness)
EE 306: random signals

• Deterministic processes: the physical process is represented by an explicit mathematical relation.

• Random processes: the result of a large number of separate causes, described in probabilistic terms and by properties which are averages.

Random (stochastic) Processes

Let ξ denote the random outcome of an experiment. To every such outcome, suppose a waveform X(t, ξ) is assigned. The collection of such waveforms forms a stochastic process.

For fixed ξ_i ∈ S (the set of all experimental outcomes), X(t, ξ_i) is a specific time function.

For fixed t = t₁, X₁ = X(t₁, ξ) is a random variable.

The ensemble of all such realizations X(t, ξ₁), X(t, ξ₂), …, X(t, ξ_n), … over time represents the stochastic process X(t).

For example,

X(t) = a cos(ω₀t + φ),

where φ is a uniformly distributed random variable in (0, 2π), represents a stochastic process.

Stochastic processes are everywhere: noise, detection and classification problems, pattern recognition, stock market fluctuations, and various queuing systems all represent stochastic phenomena.

If X(t) is a stochastic process, then for fixed t, X(t) represents a random variable. Its distribution function (cdf) is given by

F_X(x, t) = P{X(t) ≤ x}.

Notice that F_X(x, t) depends on t, since for a different t we obtain a different random variable. Further,

f_X(x, t) ≜ dF_X(x, t)/dx

represents the first-order probability density function (pdf) of the process X(t).
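As a quick numerical sketch (NumPy assumed; amplitude, frequency and time instant chosen arbitrarily), the first-order cdf of the random-phase cosine from the example above can be estimated by the empirical fraction of realizations with X(t, ξ) ≤ x. For that process one can show the exact cdf at any fixed t is 1 − arccos(x/a)/π for |x| ≤ a, which the estimate should approach:

```python
import numpy as np

rng = np.random.default_rng(0)
a, w0, t = 1.0, 2 * np.pi, 0.3            # illustrative amplitude, frequency, fixed t
phi = rng.uniform(0, 2 * np.pi, 100_000)  # one random phase per realization
samples = a * np.cos(w0 * t + phi)        # X(t, xi): a random variable at fixed t

def F_hat(x):
    # Empirical first-order cdf: fraction of realizations with X(t) <= x
    return np.mean(samples <= x)

x = 0.5
exact = 1 - np.arccos(x / a) / np.pi      # exact cdf of a random-phase cosine
print(F_hat(x), exact)                    # the two values should be close
```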

For t = t₁ and t = t₂, X(t) represents two different random variables X₁ = X(t₁) and X₂ = X(t₂), respectively. Their joint distribution is given by

F_X(x₁, x₂, t₁, t₂) = P{X(t₁) ≤ x₁, X(t₂) ≤ x₂},

and

f_X(x₁, x₂, t₁, t₂) ≜ ∂²F_X(x₁, x₂, t₁, t₂) / ∂x₁ ∂x₂

represents the second-order density function of the process X(t).

Similarly, f_X(x₁, x₂, …, x_n, t₁, t₂, …, t_n) represents the n-th order density function of the process X(t). Complete specification of the stochastic process X(t) requires the knowledge of f_X(x₁, x₂, …, x_n, t₁, t₂, …, t_n) for all t_i, i = 1, 2, …, n, and for all n (an almost impossible task in reality).

Mean of a stochastic process:

μ_X(t) ≜ E{X(t)} = ∫_{−∞}^{+∞} x f_X(x, t) dx

represents the mean value of the process X(t). In general, the mean of a process can depend on the time index t.

The autocorrelation function of a process X(t) is defined as

R_XX(t₁, t₂) ≜ E{X(t₁) X*(t₂)} = ∫∫ x₁ x₂* f_X(x₁, x₂, t₁, t₂) dx₁ dx₂,

and it represents the interrelationship between the random variables X₁ = X(t₁) and X₂ = X(t₂) generated from the process X(t).

Properties:

1. R_XX(t₁, t₂) = R*_XX(t₂, t₁)

2. R_XX(t, t) = E{|X(t)|²} > 0   (average instantaneous power)

The function

C_XX(t₁, t₂) = R_XX(t₁, t₂) − μ_X(t₁) μ*_X(t₂)

represents the autocovariance function of the process X(t).

Example: X(t) = a cos(ω₀t + φ), φ ~ U(0, 2π).

μ_X(t) = E{X(t)} = a E{cos(ω₀t + φ)}
       = (a/2π) ∫₀^{2π} cos(ω₀t + φ) dφ = 0.

Similarly,

R_XX(t₁, t₂) = a² E{cos(ω₀t₁ + φ) cos(ω₀t₂ + φ)}
             = (a²/2) E{cos ω₀(t₁ − t₂) + cos(ω₀(t₁ + t₂) + 2φ)}
             = (a²/2) cos ω₀(t₁ − t₂).
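The zero mean and the autocorrelation (a²/2) cos ω₀(t₁ − t₂) derived above can be checked by Monte Carlo over the ensemble of phases; a minimal sketch assuming NumPy, with arbitrarily chosen a, ω₀, t₁, t₂:

```python
import numpy as np

rng = np.random.default_rng(1)
a, w0 = 2.0, 2 * np.pi                    # illustrative parameters
phi = rng.uniform(0, 2 * np.pi, 200_000)  # ensemble of random phases

def X(t):
    return a * np.cos(w0 * t + phi)       # one sample of X(t) per realization

t1, t2 = 0.1, 0.3
mean_t1 = X(t1).mean()                    # ensemble mean, should be near 0
R12 = np.mean(X(t1) * X(t2))              # ensemble autocorrelation E{X(t1) X(t2)}
theory = (a**2 / 2) * np.cos(w0 * (t1 - t2))
print(mean_t1, R12, theory)
```

Note that R12 tracks the theoretical value for any pair (t₁, t₂) with the same difference t₁ − t₂, which is exactly the wide-sense stationarity discussed on the following slides.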

Stationary Random Processes

Stationary processes exhibit statistical properties that are invariant to shifts in the time index.

Thus, for example, second-order stationarity implies that the statistical properties of the pairs {X(t₁), X(t₂)} and {X(t₁ + c), X(t₂ + c)} are the same for any c.

Similarly, first-order stationarity implies that the statistical properties of X(t_i) and X(t_i + c) are the same for any c.

In strict terms, the statistical properties are governed by the joint probability density function. Hence a process is n-th order Strict-Sense Stationary (S.S.S.) if

f_X(x₁, x₂, …, x_n, t₁, t₂, …, t_n) ≡ f_X(x₁, x₂, …, x_n, t₁ + c, t₂ + c, …, t_n + c)

for any c, where the left side represents the joint density function of the random variables X₁ = X(t₁), X₂ = X(t₂), …, X_n = X(t_n) and the right side corresponds to the joint density function of the random variables X′₁ = X(t₁ + c), X′₂ = X(t₂ + c), …, X′_n = X(t_n + c).

A process X(t) is said to be strict-sense stationary if the above equation is true for all t_i, i = 1, 2, …, n; for n = 1, 2, …; and for any c.

For a first-order strict-sense stationary process,

f_X(x, t) ≡ f_X(x, t + c)

for any c. In particular, c = −t gives

f_X(x, t) = f_X(x),

i.e., the first-order density of X(t) is independent of t. In that case

E[X(t)] = ∫_{−∞}^{+∞} x f_X(x) dx = μ, a constant.

Similarly, for a second-order strict-sense stationary process,

f_X(x₁, x₂, t₁, t₂) ≡ f_X(x₁, x₂, t₁ + c, t₂ + c)

for any c. For c = −t₁ we get

f_X(x₁, x₂, t₁, t₂) ≡ f_X(x₁, x₂, t₂ − t₁),

i.e., the second-order density function of an S.S.S. process depends only on the difference of the time indices, τ = t₁ − t₂.

In that case, the autocorrelation function is given by

R_XX(t₁, t₂) ≜ E{X(t₁) X*(t₂)}
            = ∫∫ x₁ x₂* f_X(x₁, x₂, τ = t₁ − t₂) dx₁ dx₂
            = R_XX(t₁ − t₂) ≜ R_XX(τ) = R*_XX(−τ),

i.e., it depends only on the difference of the time indices.

The basic conditions for first- and second-order stationarity are usually difficult to verify. In that case, we often resort to a looser definition of stationarity, known as Wide-Sense Stationarity (W.S.S.).

A process X(t) is said to be wide-sense stationary if

(i) E{X(t)} = μ, and

(ii) E{X(t₁) X*(t₂)} = R_XX(t₁ − t₂),

i.e., the mean is a constant and the autocorrelation function depends only on the difference between the time indices.

Remarks:

1. Notice that these conditions do not say anything about the nature of the probability density functions; instead they deal with the average behavior of the process.

2. Strict-sense stationarity always implies wide-sense stationarity:

   SSS ⇒ WSS

   However, the converse is not true in general, the only exception being the Gaussian process: if X(t) is a Gaussian process, then wide-sense stationarity (w.s.s.) implies strict-sense stationarity (s.s.s.).

Ergodic Random Processes

If almost every member of the ensemble shows the same statistical behavior as the whole ensemble, then it is possible to determine the statistical behavior by examining only one typical sample function ⇒ ergodic process.

For an ergodic process, the mean value and the autocorrelation function can be determined by time averages as well as by ensemble averages, that is,

Ergodic in the mean:

E{X(t)} = lim_{T→∞} (1/2T) ∫_{−T}^{T} X(t) dt

Ergodic in the autocorrelation:

R_XX(τ) = E{X(t + τ) X*(t)} = lim_{T→∞} (1/2T) ∫_{−T}^{T} X(t + τ) X*(t) dt

These conditions can hold only if the process is stationary:

ergodic ⇒ stationary (not vice versa)
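For the random-phase cosine, both time averages can be computed from a single realization and compared with the ensemble values (0 and (a²/2) cos ω₀τ). A sketch assuming NumPy, with a long finite window standing in for T → ∞:

```python
import numpy as np

rng = np.random.default_rng(2)
a, w0 = 1.0, 2 * np.pi                   # illustrative parameters
phi = rng.uniform(0, 2 * np.pi)          # ONE outcome: a single sample function
dt = 0.001
t = np.arange(-500, 500, dt)             # finite stand-in for the window (-T, T)
x = a * np.cos(w0 * t + phi)

T2 = t[-1] - t[0]                        # window length 2T
time_mean = np.sum(x) * dt / T2          # (1/2T) ∫ x(t) dt; ensemble mean is 0

tau = 0.2
lag = int(tau / dt)
time_R = np.mean(x[lag:] * x[:-lag])     # time-average autocorrelation at lag tau
theory = (a**2 / 2) * np.cos(w0 * tau)   # ensemble autocorrelation
print(time_mean, time_R, theory)
```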

Power Spectral Density

• Power spectrum of X(t):

  S_x(f) = F[R_x(τ)] = ∫_{−∞}^{∞} R_x(τ) e^{−j2πfτ} dτ

• Autocorrelation of X(t):

  R_x(τ) = F⁻¹[S_x(f)] = ∫_{−∞}^{∞} S_x(f) e^{j2πfτ} df

• The power spectrum and the autocorrelation function are a Fourier transform pair.

• Total average power = R_x(0) = ∫_{−∞}^{∞} S_x(f) df
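As a numerical check of the total-power relation (NumPy assumed), take the illustrative autocorrelation R_x(τ) = e^{−|τ|}, a pair not from the slides, whose power spectrum is S_x(f) = 2/(1 + (2πf)²); integrating S_x(f) should recover the total average power R_x(0) = 1:

```python
import numpy as np

# Illustrative transform pair: R_x(tau) = exp(-|tau|)  <->
# S_x(f) = 2 / (1 + (2*pi*f)^2).  Total average power = R_x(0) = ∫ S_x(f) df.
df = 1e-4
f = np.arange(-50, 50, df)                # finite grid standing in for (-inf, inf)
S = 2.0 / (1.0 + (2 * np.pi * f) ** 2)
total_power = np.sum(S) * df              # Riemann approximation of ∫ S_x(f) df
print(total_power)                        # should be close to R_x(0) = 1
```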

Random Processes as Inputs/Outputs to LTI Systems

A deterministic system transforms each input waveform X(t, ξ_i) into an output waveform Y(t, ξ_i) = T[X(t, ξ_i)] by operating only on the time variable t.

Thus a set of realizations {X(t, ξ)} at the input corresponding to a process X(t) generates a new set of realizations {Y(t, ξ)} at the output associated with a new process Y(t):

X(t) → T[·] → Y(t)

Our goal is to study the output process statistics in terms of the input process statistics and the system function.

Linear systems: L[·] represents a linear system if

L{a₁X₁(t) + a₂X₂(t)} = a₁L{X₁(t)} + a₂L{X₂(t)}.

Let Y(t) = L{X(t)} represent the output of a linear system.

Time-invariant system: L[·] represents a time-invariant system if

Y(t) = L{X(t)} ⇒ L{X(t − t₀)} = Y(t − t₀),

i.e., a shift in the input results in the same shift in the output.

If L[·] satisfies both, then it corresponds to a linear time-invariant (LTI) system.
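These two defining properties can be verified numerically for a candidate discrete-time system. The sketch below (NumPy assumed, with an arbitrary FIR impulse response) checks both conditions on random inputs:

```python
import numpy as np

rng = np.random.default_rng(3)

def L_sys(x):
    # Candidate system: causal discrete convolution with a fixed impulse
    # response, truncated to the input length (zero initial conditions).
    h = np.array([0.5, 0.3, 0.2])
    return np.convolve(x, h)[:len(x)]

x1, x2 = rng.normal(size=64), rng.normal(size=64)
a1, a2 = 2.0, -1.5

# Linearity: L{a1 x1 + a2 x2} = a1 L{x1} + a2 L{x2}
linear = np.allclose(L_sys(a1 * x1 + a2 * x2), a1 * L_sys(x1) + a2 * L_sys(x2))

# Time invariance: delaying the input by d samples delays the output by d
d = 5
x1_delayed = np.concatenate([np.zeros(d), x1[:-d]])
time_invariant = np.allclose(L_sys(x1_delayed)[d:], L_sys(x1)[:-d])
print(linear, time_invariant)
```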

LTI systems can be uniquely represented in terms of their output to a delta function:

δ(t) → LTI → h(t)   (impulse response of the system)

Then, for an arbitrary input X(t), the output is

Y(t) = ∫_{−∞}^{+∞} h(t − τ) X(τ) dτ = ∫_{−∞}^{+∞} h(τ) X(t − τ) dτ.

Output statistics: The mean of the output process is given by

μ_Y(t) = E{Y(t)} = E{ ∫ X(t − τ) h(τ) dτ }
       = ∫_{−∞}^{+∞} μ_X(t − τ) h(τ) dτ = μ_X(t) ∗ h(t).

In particular, if X(t) is wide-sense stationary, then μ_X(t) = μ_X, so that

μ_Y(t) = μ_X ∫_{−∞}^{+∞} h(τ) dτ = c μ_X, a constant.
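The constant output mean μ_Y = μ_X ∫ h(τ) dτ has a direct discrete-time analogue, μ_Y = μ_X Σ_k h[k], sketched below (NumPy assumed; the impulse response and input mean are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(4)
h = np.array([0.4, 0.3, 0.2, 0.1])       # illustrative impulse response
mu_x = 2.0
x = mu_x + rng.normal(size=500_000)      # w.s.s. input with mean mu_x
y = np.convolve(x, h, mode="valid")      # output samples in steady state

mu_y_theory = mu_x * h.sum()             # discrete analogue of mu_X * ∫ h dtau
print(y.mean(), mu_y_theory)             # the two values should be close
```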

Output statistics: The autocorrelation function of the output is given by

R_YY(τ) = R_XX(τ) ∗ h(τ) ∗ h*(−τ),

which can be viewed as the cascade

R_XX(τ) → h*(−τ) → R_XY(τ) → h(τ) → R_YY(τ)

or, equivalently,

R_XX(τ) → h(τ) → R_YX(τ) → h*(−τ) → R_YY(τ).

Thus Y(t) is a w.s.s. process, and X(t) and Y(t) are jointly w.s.s.:

wide-sense stationary X(t) → LTI system h(t) → wide-sense stationary Y(t).

Output statistics: The power spectral density function of the output is

S_YY(f) = S_XX(f) |H(f)|² = S_XX(f) H(f) H*(f).

Viewed as cascades in the frequency domain:

S_XX(f) → H*(f) → S_XY(f) → H(f) → S_YY(f)
S_XX(f) → H(f) → S_YX(f) → H*(f) → S_YY(f)

where the cross-power spectra are

S_XY(f) = F{R_XY(τ)} = F{E[X(t + τ) Y*(t)]}
S_YX(f) = F{R_YX(τ)} = F{E[Y(t + τ) X*(t)]}.
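In discrete time the same relation gives the output power R_YY(0) = σ² Σ_k h[k]² for a white input with flat spectrum level σ² (by Parseval, Σ_k h[k]² equals the integral of |H(f)|² over one frequency period). A sketch assuming NumPy and an arbitrary FIR filter:

```python
import numpy as np

rng = np.random.default_rng(5)
sigma2 = 2.0                              # flat input spectrum level
x = rng.normal(scale=np.sqrt(sigma2), size=1_000_000)  # white input
h = np.array([1.0, -0.8, 0.4])            # illustrative impulse response
y = np.convolve(x, h, mode="valid")

# R_YY(0) = ∫ S_YY(f) df = sigma2 * ∫ |H(f)|^2 df = sigma2 * sum(h^2)
out_power = np.mean(y**2)
theory = sigma2 * np.sum(h**2)
print(out_power, theory)
```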

White Noise

• A random process X(t) is called a white process if it has a flat power spectrum, i.e., S_x(f) is constant for all f:

  S_n(f) = N₀/2

• It closely represents thermal noise.

• The area under S_x(f) is infinite ⇒ infinite power!

• Autocorrelation:

  R_x(τ) = ∫_{−∞}^{∞} S_x(f) e^{j2πfτ} df = ∫_{−∞}^{∞} (N₀/2) e^{j2πfτ} df = (N₀/2) δ(τ)

• R_x(τ) = 0 if τ = t₂ − t₁ ≠ 0, so X(t₁) and X(t₂) are uncorrelated if t₁ ≠ t₂.
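A discrete-time stand-in for white noise (i.i.d. samples with variance N₀/2; NumPy assumed, parameters illustrative) shows the delta-like autocorrelation: full power at lag 0 and near-zero correlation at any nonzero lag:

```python
import numpy as np

rng = np.random.default_rng(6)
N0_over_2 = 1.5
x = rng.normal(scale=np.sqrt(N0_over_2), size=1_000_000)  # i.i.d. "white" samples

r0 = np.mean(x * x)              # sample autocorrelation at lag 0: ~ N0/2
r3 = np.mean(x[3:] * x[:-3])     # at lag 3: ~ 0 (samples uncorrelated)
print(r0, r3)
```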

White Gaussian Noise

• For white Gaussian noise, the sampled random variables are statistically independent Gaussian random variables.

Poisson Random Process

Poisson random variable:

P{"k arrivals occur in an interval of duration Δ"} = e^{−λΔ} (λΔ)^k / k!,   k = 0, 1, 2, …

The mean number of arrivals in the interval is μ = np = λΔ, where λ is the arrival rate (the arrivals in (0, T) viewed as the limit of n independent trials, each with small success probability p).

Definition: X(t) = n(0, t) represents a Poisson process if the number of arrivals n(t₁, t₂) in an interval (t₁, t₂) of length t = t₂ − t₁ is a Poisson random variable with parameter λt. Thus

P{n(t₁, t₂) = k} = e^{−λt} (λt)^k / k!,   k = 0, 1, 2, …,   t = t₂ − t₁

E[X(t)] = E[n(0, t)] = λt

R_XX(t₁, t₂) = λ² t₁ t₂ + λ min(t₁, t₂)

X(t) is therefore not WSS.
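Both moments can be checked by simulation (NumPy assumed; λ, t₁, t₂ are arbitrary illustrative choices), using independent increments to build X(t₂) = X(t₁) + n(t₁, t₂):

```python
import numpy as np

rng = np.random.default_rng(7)
lam, n_real = 3.0, 200_000
t1, t2 = 1.0, 2.0

X1 = rng.poisson(lam * t1, size=n_real)          # n(0, t1)
inc = rng.poisson(lam * (t2 - t1), size=n_real)  # n(t1, t2): independent increment
X2 = X1 + inc                                    # n(0, t2)

mean_X2 = X2.mean()                              # E[X(t2)] = lam * t2
R12 = np.mean(X1 * X2)                           # E{X(t1) X(t2)}
theory = lam**2 * t1 * t2 + lam * min(t1, t2)    # lam^2 t1 t2 + lam min(t1, t2)
print(mean_X2, R12, theory)
```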

Poisson Impulse Process

Although X(t) does not represent a wide-sense stationary process, its derivative X′(t) does represent a wide-sense stationary process (called the Poisson impulse process):

X(t) → d(·)/dt → X′(t)

μ_X′(t) = dμ_X(t)/dt = d(λt)/dt = λ, a constant

R_X′X′(t₁, t₂) = λ² + λ δ(t₁ − t₂).

Gaussian Random Process

• A random process X(t) is a Gaussian process if for all n and for all (t₁, t₂, …, t_n), the random variables {X(t₁), X(t₂), …, X(t_n)} have a jointly Gaussian density function, which may be expressed as

  f(x) = 1 / ((2π)^{n/2} [det(C)]^{1/2}) · exp[−(1/2) (x − m)ᵀ C⁻¹ (x − m)]

  where

  x = [X(t₁), X(t₂), …, X(t_n)]ᵀ : n random variables
  m = E(x) : mean value vector
  C = {c_ij} = E((x_i − m_i)(x_j − m_j)) : n×n covariance matrix
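The jointly Gaussian description (m, C) can be exercised directly with NumPy's multivariate normal sampler; the mean vector and covariance matrix below are arbitrary illustrative choices, and their sample estimates should converge back to them:

```python
import numpy as np

rng = np.random.default_rng(8)

m = np.array([0.0, 1.0, -1.0])            # illustrative mean vector (n = 3)
C = np.array([[2.0, 0.8, 0.2],            # illustrative symmetric
              [0.8, 1.0, 0.4],            # positive-definite
              [0.2, 0.4, 1.5]])           # covariance matrix

samples = rng.multivariate_normal(m, C, size=500_000)
m_hat = samples.mean(axis=0)              # sample mean vector, ~ m
C_hat = np.cov(samples.T)                 # sample covariance matrix, ~ C
print(m_hat)
print(C_hat)
```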

• Property 1
  – For a Gaussian process, knowledge of the mean (m) and covariance (C) provides a complete statistical description of the process.

• Property 2
  – If a Gaussian process X(t) is passed through an LTI system, the output of the system is also a Gaussian process. The effect of the system on X(t) is simply reflected by the change in the mean (m) and covariance (C) of X(t).
