Outline

• Random Variable
• Random Process
• Ensemble averages, correlation, covariance
• Linear mean square estimation
• Autocorrelation, stationarity, ergodicity

• Sample space – the set of all possible experimental outcomes

• Events – subsets of the sample space

• Random variable – a mapping that assigns a number to each elementary event in the sample space
• Complex random variable – assigns a complex number z = m + jn to each elementary event in the sample space

• The characterization of a random variable x is given statistically in terms of a probability assignment, or probability law, defined on events in the sample space Ω

• Once a probability assignment is defined on events in the sample space, it is possible to develop a probabilistic description for the random variable x.

• For a real-valued random variable x, one such probabilistic description is the probability distribution function.

Probability Distribution Function

Probability Density Function

Continuous probability distribution and density
functions

• It is important to note that the probability distribution function is itself a probability (i.e., its value is a number between 0 and 1), and it is defined for both discrete and continuous random variables.

• The probability density function contains information about probability, but it is not itself a probability: it can take any nonnegative value, even values larger than 1.

https://www.researchgate.net/post/What_is_the_difference_between_probability_distribution_function_and_probability_density_function
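A quick numerical illustration of this distinction, as a minimal sketch (assuming a Gaussian random variable with a small standard deviation, for which the density peak exceeds 1 while the distribution function Fx(α) = Pr{x ≤ α} stays in [0, 1]):

```python
import numpy as np

# Gaussian random variable with mean 0 and a small standard deviation.
m, sigma = 0.0, 0.1

# Density at the mean: 1/sqrt(2*pi*sigma^2) is about 3.99, clearly > 1.
pdf_peak = 1.0 / np.sqrt(2.0 * np.pi * sigma**2)

# Distribution function estimated from samples: Pr{x <= alpha} is always in [0, 1].
rng = np.random.default_rng(0)
x = rng.normal(m, sigma, size=100_000)
alpha = 0.05
cdf_at_alpha = np.mean(x <= alpha)

print(f"density peak = {pdf_peak:.3f}  (values > 1 are allowed)")
print(f"F_x({alpha})    = {cdf_at_alpha:.3f}  (always between 0 and 1)")
```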

Ensemble averages

• Complete statistical characterization of a random variable requires that it be possible to determine the probability of any event that may be defined on the sample space

• However, complete characterization of a random variable may not be necessary if only its average behaviour is of interest

A simple example: quantities such as air temperature or air pressure, where the average value (e.g. the mean daily temperature) is often all that is needed

Ensemble averages

• In this course we primarily focus on statistical averages rather than probabilistic descriptions of the random variable, i.e. we use statistical measures such as the mean and variance to characterize a random variable rather than its probability distribution function

• Expected value of a random variable

Expectation is a linear operator
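The defining formulas were given graphically on the slides; as a reminder, E[x] = Σ x p(x) for a discrete random variable, E[x] = ∫ x f(x) dx for a continuous one, and linearity means E[aX + bY] = a E[X] + b E[Y] for any constants a, b. A small Monte Carlo check of linearity (a sketch; the distributions and constants below are arbitrary choices, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000

# Two random variables (linearity holds whether or not they are independent).
X = rng.exponential(scale=2.0, size=N)        # E[X] = 2
Y = rng.normal(loc=-1.0, scale=3.0, size=N)   # E[Y] = -1

a, b = 3.0, 5.0
lhs = np.mean(a * X + b * Y)            # E[aX + bY] estimated from samples
rhs = a * np.mean(X) + b * np.mean(Y)   # a E[X] + b E[Y]

print(f"E[aX + bY] ≈ {lhs:.3f},  a E[X] + b E[Y] ≈ {rhs:.3f}")
# Both are near 3*2 + 5*(-1) = 1.
```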

Jointly distributed random variables
• It is often necessary to consider statistical dependencies between random variables

• Suppose, for example, we flip two fair coins, and the outcome of the first coin does not affect the outcome of the second

• The sample space is {(0,0), (0,1), (1,0), (1,1)}

• In another experiment, three coins are tossed: one fair coin and two unfair coins

• If the fair coin shows a head, we toss the unfair coin that is biased towards heads
• If the fair coin shows a tail, we toss the unfair coin that is biased towards tails

• In this case, knowing the outcome of one coin increases the probability of correctly predicting the outcome of the other
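A minimal simulation of this experiment; the bias value 0.9 below is an assumption for illustration, since the slides do not specify the unfair coins' probabilities:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 100_000
BIAS = 0.9  # assumed probability that the unfair coin repeats the fair coin's outcome

fair = rng.random(N) < 0.5                          # True = head on the fair coin
# Toss the unfair coin biased towards the fair coin's outcome:
unfair = np.where(rng.random(N) < BIAS, fair, ~fair)

# Knowing the unfair coin's outcome helps predict the fair coin's outcome:
p_match = np.mean(fair == unfair)
print(f"Pr(outcomes agree) ≈ {p_match:.3f}  (0.5 would mean no information)")
```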

The joint probability distribution of two random variables characterizes their statistical relationship

Joint Moments

E(XY) = ∫∫ x y fX,Y(x, y) dx dy          if X and Y are continuous

E(XY) = Σ(x,y)∈RX,Y x y pX,Y(x, y)       if X and Y are discrete

Independent, uncorrelated and orthogonal random variables
• Independent random variables are those in which the value of one random variable does not depend on the value of the other

• For example, if x(1) is a random variable defined as the number of sunspots that occur on the last day of the year, and x(2) is a random variable defined as the amount of US national debt on the last day of the year, then these two random variables are statistically independent

• Uncorrelatedness is a weaker form of independence: independent random variables are always uncorrelated, but uncorrelated random variables need not be independent
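Recall the standard definitions (given graphically on the slides): X and Y are uncorrelated if E[XY] = E[X]E[Y], and orthogonal if E[XY] = 0. The sketch below shows a classic pair that is uncorrelated yet completely dependent:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(0.0, 1.0, size=200_000)
Y = X**2   # Y is completely determined by X: strong statistical dependence

# Uncorrelated: E[XY] = E[X^3] = 0 = E[X] E[Y] for zero-mean Gaussian X.
print(f"E[XY] - E[X]E[Y] ≈ {np.mean(X * Y) - np.mean(X) * np.mean(Y):.4f}")
```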

• The correlation between random variables provides an important characterization of the statistical dependence that exists between them

• It will play an important role in signal modeling, Wiener filtering, and spectrum estimation

• In the following we will look at the relationship between correlation and linear predictability through the linear mean square estimation problem

Linear Mean Square Estimation
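The derivation on the slides was graphical; as a reminder of the standard result, estimating y by ŷ = a x + b and minimizing E[(y − ŷ)²] gives a = Cov(x, y)/Var(x) and b = E[y] − a E[x], so the achievable error is governed by the correlation between x and y. A numerical sketch (the model generating x and y is an arbitrary example):

```python
import numpy as np

rng = np.random.default_rng(4)
N = 100_000

# A correlated pair: y is partly predictable from x.
x = rng.normal(0.0, 1.0, size=N)
y = 2.0 * x + rng.normal(0.0, 0.5, size=N)

# Linear minimum mean square estimator y_hat = a*x + b.
a = np.cov(x, y)[0, 1] / np.var(x)
b = np.mean(y) - a * np.mean(x)
y_hat = a * x + b

print(f"a ≈ {a:.3f}, b ≈ {b:.3f}")              # close to 2 and 0
print(f"MSE ≈ {np.mean((y - y_hat)**2):.3f}")   # close to 0.25 = 0.5^2
```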

Gaussian Random Variables
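The density and its properties were shown graphically on the slides; the standard Gaussian density with mean m and variance σ² is f(x) = (1/√(2πσ²)) exp(−(x − m)²/(2σ²)). A quick numerical sanity check that it integrates to 1 (a minimal sketch):

```python
import numpy as np

def gaussian_pdf(x, m=0.0, sigma=1.0):
    """Gaussian density with mean m and standard deviation sigma."""
    return np.exp(-(x - m)**2 / (2.0 * sigma**2)) / np.sqrt(2.0 * np.pi * sigma**2)

# Riemann-sum check that the density integrates to 1 over a wide interval.
x = np.linspace(-10.0, 10.0, 100_001)
dx = x[1] - x[0]
print(f"integral ≈ {np.sum(gaussian_pdf(x)) * dx:.6f}")
```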

Bias and Consistency
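The formal definitions were on the slide graphics: an estimator θ̂N of θ is unbiased if E[θ̂N] = θ, asymptotically unbiased if E[θ̂N] → θ as N → ∞, and consistent if θ̂N converges to θ as N → ∞. A sketch contrasting the biased (1/N) and unbiased (1/(N−1)) sample variance estimators (both are consistent; the values below are arbitrary example choices):

```python
import numpy as np

rng = np.random.default_rng(5)
true_var = 4.0          # variance of the underlying Gaussian samples
N, TRIALS = 5, 200_000

samples = rng.normal(0.0, np.sqrt(true_var), size=(TRIALS, N))
biased = samples.var(axis=1, ddof=0)     # divides by N
unbiased = samples.var(axis=1, ddof=1)   # divides by N - 1

print(f"E[biased]   ≈ {biased.mean():.3f}   (expect (N-1)/N * 4 = 3.2)")
print(f"E[unbiased] ≈ {unbiased.mean():.3f} (expect 4.0)")
```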

Random Process
(Discrete time)

• A random process is a mapping from the sample space Ω to an ensemble of discrete-time signals

• In the coin-toss experiment, for each time n assign the value 1 when a head appears and −1 when a tail appears, and call this x(n). If the coin flips are independent, then x(n) is a Bernoulli process
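A minimal sketch generating one realization of such a process:

```python
import numpy as np

rng = np.random.default_rng(6)
N = 10_000

# Bernoulli process: independent fair coin flips mapped to +1 (head) / -1 (tail).
x = np.where(rng.random(N) < 0.5, 1, -1)

print(f"mean ≈ {x.mean():.3f}                    (expect 0)")
print(f"E[x(n) x(n-1)] ≈ {np.mean(x[1:] * x[:-1]):.3f}  (expect 0: independent flips)")
```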
Ensemble Averages
Autocorrelation

http://qingkaikong.blogspot.com/2017/01/signal-processing-how-autocorrelation.html
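The definition on the slides was graphical; for a discrete-time process the autocorrelation is rx(k, l) = E[x(k) x*(l)]. A sketch estimating it across an ensemble of realizations (the white process used here is an arbitrary example):

```python
import numpy as np

rng = np.random.default_rng(7)
REALIZATIONS, N = 50_000, 16

# Ensemble of realizations of a zero-mean white process with variance 2.
X = rng.normal(0.0, np.sqrt(2.0), size=(REALIZATIONS, N))

def autocorr(X, k, l):
    """Ensemble estimate of r_x(k, l) = E[x(k) x(l)] for a real process."""
    return np.mean(X[:, k] * X[:, l])

print(f"r_x(3, 3) ≈ {autocorr(X, 3, 3):.3f}  (expect 2.0)")
print(f"r_x(3, 5) ≈ {autocorr(X, 3, 5):.3f}  (expect 0.0)")
```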
• Throughout this course the random processes will be assumed to have zero mean, so that the autocovariance and the autocorrelation can be used interchangeably

Two Different Random Processes
• Find the autocorrelation of y(n)
Summary

Gaussian Process

• Many of the processes found in applications are Gaussian, or approximately Gaussian, as a result of the central limit theorem
Stationary Process

• Stationarity means statistical time invariance

• If the first-order density function is independent of time, then all first-order statistics (e.g. the mean) are independent of time

• Such a process is said to be first-order stationary


Stationary Process

• A process is second-order stationary if the second-order density function depends only on the time difference n2 − n1, and not on the absolute times

• If a process is second-order stationary, then it is also first-order stationary

• The time difference k − l between two samples is called the lag
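A quick ensemble check of this property for a random-phase sinusoid, whose autocorrelation depends only on the lag (a sketch; the amplitude and frequency are arbitrary example values):

```python
import numpy as np

rng = np.random.default_rng(8)
REALIZATIONS, N = 100_000, 32
A, w0 = 1.0, 0.4 * np.pi   # arbitrary example amplitude and frequency

# Random-phase sinusoid: x(n) = A sin(n*w0 + phi), phi uniform on [-pi, pi).
phi = rng.uniform(-np.pi, np.pi, size=(REALIZATIONS, 1))
n = np.arange(N)
X = A * np.sin(n * w0 + phi)

def r(k, l):
    """Ensemble estimate of E[x(k) x(l)]."""
    return np.mean(X[:, k] * X[:, l])

# Two different time pairs with the same lag k - l = 2 give the same value:
print(f"r(5, 3)   ≈ {r(5, 3):.3f}")
print(f"r(12, 10) ≈ {r(12, 10):.3f}")
print(f"theory: (A²/2) cos(2 w0) = {0.5 * A**2 * np.cos(2 * w0):.3f}")
```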

Autocovariance and Autocorrelation Matrices
Problem

• Construct a 2×2 autocorrelation matrix from the autocorrelation sequence of a random phase sinusoid

• Solution
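The worked solution was on the slide graphics. The standard result for x(n) = A sin(nω0 + φ), with φ uniformly distributed on [−π, π), is rx(k) = (A²/2) cos(kω0), so the 2×2 autocorrelation matrix is Rx = (A²/2) [[1, cos ω0], [cos ω0, 1]]. A sketch constructing it numerically (A and ω0 are arbitrary example values):

```python
import numpy as np

A, w0 = 2.0, 0.25 * np.pi   # example amplitude and frequency

def r_x(k):
    """Autocorrelation sequence of the random-phase sinusoid."""
    return 0.5 * A**2 * np.cos(k * w0)

# 2x2 autocorrelation matrix: Toeplitz in the lag |i - j|.
R = np.array([[r_x(0), r_x(1)],
              [r_x(1), r_x(0)]])
print(R)
```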

Problem

• Construct a 2×2 autocorrelation matrix for a signal composed of two sinusoidal components, where A, ω1 and ω2 are constants and φ1 and φ2 are uncorrelated random variables, given the autocorrelation of the random phase sinusoid from the previous problem
Solution
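The signal and the worked solution were slide graphics. Assuming the standard form of this exercise, x(n) = A sin(nω1 + φ1) + A sin(nω2 + φ2) with φ1 and φ2 uncorrelated, the cross terms vanish and the component autocorrelations add: rx(k) = (A²/2)(cos kω1 + cos kω2). A sketch under that assumption:

```python
import numpy as np

A, w1, w2 = 1.0, 0.2 * np.pi, 0.6 * np.pi   # example values, not from the slides

def r_x(k):
    """Autocorrelations of uncorrelated components add."""
    return 0.5 * A**2 * (np.cos(k * w1) + np.cos(k * w2))

R = np.array([[r_x(0), r_x(1)],
              [r_x(1), r_x(0)]])
print(R)
```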

Ergodicity

• Ensemble averages describe statistical averages of the process over the ensemble of all possible discrete-time signals

• Sometimes we may not have many realisations of the process available a priori

• Hence, we consider estimating the mean and autocorrelation from time averages
Ergodicity

• Consider the problem of estimating the mean mx(n) of a random process

• If a large number of sample realizations of the process were available, e.g. xi(n), i = 1, 2, …, L, then the mean could be estimated by averaging xi(n) over the ensemble

• When such realizations are not available, it is necessary to consider methods for estimating ensemble averages from a single realization
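A sketch of the time-average (sample mean) estimator from a single realization, m̂x = (1/N) Σ x(n), applied to a process that is ergodic in the mean (the model below is an arbitrary example):

```python
import numpy as np

rng = np.random.default_rng(9)

# Single realization of a stationary process: constant mean 1.5 plus white noise.
true_mean = 1.5
x = true_mean + rng.normal(0.0, 1.0, size=100_000)

# The sample mean over one realization converges to the ensemble mean.
for N in (10, 1_000, 100_000):
    print(f"N = {N:>6}: sample mean ≈ {x[:N].mean():.4f}  (ensemble mean = {true_mean})")
```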
Convergence of sample mean to ensemble average
White Noise

• A zero-mean stochastic process is said to be white if its power spectral density is constant over all frequencies:

S(ω) = σ²   for −π < ω < π

• White noise has the property that any two of its samples are uncorrelated, as seen from its autocorrelation function:

E[v(n) v*(n−k)] = σv²   for k = 0
                = 0     for k ≠ 0
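A sketch generating white Gaussian noise and checking the impulse-like autocorrelation numerically:

```python
import numpy as np

rng = np.random.default_rng(10)
sigma_v, N = 1.0, 200_000

v = rng.normal(0.0, sigma_v, size=N)   # zero-mean white Gaussian noise

# Sample autocorrelation at a few lags: sigma_v^2 at k = 0, near 0 elsewhere.
for k in range(4):
    r_k = np.mean(v[k:] * v[:N - k])
    print(f"r_v({k}) ≈ {r_k:+.4f}")
```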
White Noise

• If the white noise is Gaussian, then any two samples of the process are also statistically independent

• A sample realization of zero-mean, unit-variance white Gaussian noise was shown as a figure on the slide
Power Spectrum Estimation

• Fourier analysis is an important tool in the analysis and description of deterministic discrete-time signals

• We cannot compute the Fourier transform of a random process itself: each realization is different, and a given realization generally does not decay, so its transform need not exist

• So we develop a frequency-domain representation in terms of statistical averages
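One standard route is the Wiener–Khinchin relation, S(ω) = Σk rx(k) e^(−jωk), estimated from data with the periodogram |X(ω)|²/N. A minimal sketch applied to white noise, whose spectrum should be flat (the signal and length are arbitrary example choices):

```python
import numpy as np

rng = np.random.default_rng(11)
N = 4096
v = rng.normal(0.0, 1.0, size=N)   # unit-variance white noise

# Periodogram estimate of the power spectrum: |DFT|^2 / N.
P = np.abs(np.fft.rfft(v))**2 / N

print(f"mean of periodogram ≈ {P.mean():.3f}  (expect sigma^2 = 1 for white noise)")
```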


Homework (read)
