
Chapter 1: Review of Stochastic Processes
Random Variables
• Any random situation can be studied via the axiomatic definition of probability by defining
– Ω: the universal set of unpredictable outcomes
– F: a collection of subsets of Ω, whose elements are called events
– P: a probability measure representing the unpredictability of these events
• It is difficult to work with this probability space directly.
• A random variable x(ζ) is a mapping that assigns a real number x to every outcome ζ from an abstract probability space:
x(ζ): Ω → R
• A complex-valued random variable is represented as
z(ζ) = x(ζ) + j y(ζ), where x(ζ) and y(ζ) are real-valued random variables.
Representation of Random Variables
• Cumulative distribution function (CDF): F_x(x) = Pr[x(ζ) ≤ x]
• Probability density function (pdf): f_x(x) = dF_x(x)/dx
• Expectation of a random variable: E[x(ζ)] = ∫ x f_x(x) dx
• Expectation of a function of a random variable: E[g(x(ζ))] = ∫ g(x) f_x(x) dx
• Moments: r_m = E[x^m(ζ)]
• Central moments: γ_m = E[(x(ζ) − μ_x)^m]
• Second central moment, or variance: σ_x² = E[(x(ζ) − μ_x)²]
• Skewness: the normalized third central moment, γ_3 / σ_x³
• Kurtosis: the normalized fourth central moment, γ_4 / σ_x⁴
• Characteristic function: Φ_x(ξ) = E[e^{jξ x(ζ)}]
• Moment generating function: M_x(s) = E[e^{s x(ζ)}]
• Cumulant generating function: the natural logarithm of the moment generating function, ln M_x(s)
• Cumulants: the derivatives of the cumulant generating function evaluated at s = 0

Assignment 1.1
• Show that the mth derivative of the moment generating function with respect to s, evaluated at s = 0, gives the mth moment.
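As a numerical companion to Assignment 1.1, the sketch below (assuming SymPy is available; the fair-die distribution is only an illustrative choice, not from the slides) differentiates a moment generating function symbolically and checks the result against directly computed moments:

```python
import sympy as sp

s = sp.symbols('s')
values = range(1, 7)  # illustrative discrete RV: a fair six-sided die

# Moment generating function M(s) = E[e^{s x}]
M = sum(sp.exp(s * v) for v in values) / 6

# The mth derivative of M at s = 0 should equal the mth moment E[x^m]
for m in (1, 2, 3):
    deriv_at_0 = sp.simplify(sp.diff(M, s, m).subs(s, 0))
    moment = sum(v**m for v in values) / sp.Integer(6)  # E[x^m] computed directly
    print(m, deriv_at_0, moment)
```

The same pattern works for any RV whose MGF exists in closed form, which is exactly what the assignment asks you to prove in general.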
Useful Random Variables
• Uniformly distributed RV: f_x(x) = 1/(b − a) for a ≤ x ≤ b, and 0 otherwise
• Normal RV: f_x(x) = (1/√(2πσ²)) exp(−(x − μ)²/(2σ²))
• Cauchy RV: f_x(x) = (β/π) / ((x − α)² + β²)

Assignment 1.2
• Find the mean, variance, moments and moment generating functions of the Uniform, Normal and Cauchy RVs.
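A small sampling sketch of the three distributions (using NumPy; the parameters and sample size are illustrative). It is worth noting before attempting Assignment 1.2 that the Cauchy RV's mean, variance and MGF are undefined, which the sample statistics hint at:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000  # sample size (illustrative)

u = rng.uniform(-0.5, 0.5, n)  # Uniform on [-0.5, 0.5]: mean 0, variance 1/12
g = rng.normal(2.0, 3.0, n)    # Normal with mean 2 and standard deviation 3
c = rng.standard_cauchy(n)     # Cauchy: mean, variance and MGF are undefined

print(u.mean(), u.var())  # close to 0 and 1/12
print(g.mean(), g.var())  # close to 2 and 9
print(c.mean())           # does not settle down, no matter how large n is
```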
Random Vectors
• A real-valued random vector containing M RVs is represented as
x(ζ) = [x_1(ζ), x_2(ζ), …, x_M(ζ)]^T
• A random vector is completely characterized by its joint CDF
F_x(x_1, …, x_M) = Pr[x_1(ζ) ≤ x_1, …, x_M(ζ) ≤ x_M]
• Often written as F_x(x) = Pr[x(ζ) ≤ x]
• Two random variables x(ζ) and y(ζ) are independent if the events {x(ζ) ≤ x} and {y(ζ) ≤ y} are jointly independent. That is,
F_{x,y}(x, y) = F_x(x) F_y(y)
Statistical Description of a Random Vector
• Mean vector: m_x = E[x(ζ)]
• Autocorrelation matrix: R_x = E[x(ζ) x^H(ζ)]
• Autocovariance matrix: C_x = E[(x(ζ) − m_x)(x(ζ) − m_x)^H] = R_x − m_x m_x^H
• Correlation coefficient: ρ_ij = c_ij / (σ_i σ_j)
• Uncorrelatedness: x_i(ζ) and x_j(ζ) are uncorrelated if c_ij = 0
• Orthogonality: x_i(ζ) and x_j(ζ) are orthogonal if E[x_i(ζ) x_j*(ζ)] = 0
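These descriptors can be estimated directly from realizations of the vector. A minimal sketch, assuming NumPy and an illustrative correlated 2-vector (the matrix B and the mean are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 50_000  # number of realizations of the random 2-vector (illustrative)

# Correlated Gaussian samples: each row of x is one realization x(ζ)
B = np.array([[1.0, 0.5],
              [0.0, 1.0]])
x = rng.normal(size=(N, 2)) @ B + np.array([1.0, -2.0])

m = x.mean(axis=0)                          # mean vector m_x = E[x]
R = (x.T @ x) / N                           # autocorrelation matrix R_x = E[x x^T]
C = R - np.outer(m, m)                      # autocovariance matrix C_x = R_x - m_x m_x^T
rho = C[0, 1] / np.sqrt(C[0, 0] * C[1, 1])  # correlation coefficient of the two components

print(m)    # close to [1, -2]
print(C)    # close to B^T B = [[1, 0.5], [0.5, 1.25]]
print(rho)  # close to 0.5 / sqrt(1.25)
```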
Statistical Description of Two Random Vectors
• Cross-correlation matrix: R_xy = E[x(ζ) y^H(ζ)]
• Cross-covariance matrix: C_xy = R_xy − m_x m_y^H
• Uncorrelated: C_xy = 0
• Orthogonal: R_xy = 0
Linear Transformations of Random Vectors
• Linear transformations are relatively simple mappings given by the matrix operation
y(ζ) = A x(ζ)
• Assuming L = M (A is square and invertible) and both vectors are real-valued,
f_y(y) = f_x(A⁻¹ y) / |det A|
• For complex-valued RVs,
f_y(y) = f_x(A⁻¹ y) / |det A|²
• The determination of f_y(y) is tedious and in practice not necessary.
Statistical Description of a Linear Transformation of a Random Vector
• Mean vector: m_y = A m_x
• Autocorrelation matrix: R_y = A R_x A^H
• Autocovariance matrix: C_y = A C_x A^H
• Cross-correlation: R_xy = E[x(ζ) y^H(ζ)] = R_x A^H
• Cross-covariance: C_xy = C_x A^H
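The mean and covariance relations can be checked numerically without deriving f_y(y) at all. A sketch assuming NumPy, with a hypothetical 2×2 transformation matrix A:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 50_000
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])  # hypothetical square transformation matrix (L = M = 2)

x = rng.normal(size=(N, 2)) + np.array([1.0, -1.0])  # m_x = [1, -1], C_x close to I
y = x @ A.T                                          # y(ζ) = A x(ζ), one row per realization

m_x, m_y = x.mean(axis=0), y.mean(axis=0)
C_x = np.cov(x, rowvar=False, bias=True)
C_y = np.cov(y, rowvar=False, bias=True)

print(np.allclose(m_y, A @ m_x))        # True: m_y = A m_x
print(np.allclose(C_y, A @ C_x @ A.T))  # True: C_y = A C_x A^T (A C_x A^H if complex)
```

These identities hold exactly for the sample statistics, not just in expectation, because the sample mean and covariance are themselves linear/quadratic in the data.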
Normal Random Vectors
• If the components of the random vector x(ζ) are jointly normal, then x(ζ) is a normal random M-vector. For a real-valued normal random vector,
f_x(x) = (2π)^(−M/2) |C_x|^(−1/2) exp(−(1/2)(x − m_x)^T C_x⁻¹ (x − m_x))
• Its characteristic function is
Φ_x(ξ) = exp(j ξ^T m_x − (1/2) ξ^T C_x ξ)
Properties of a normal random vector
• The pdf and all higher-order moments are completely specified by the mean vector and covariance matrix.
• If the components of x(ζ) are mutually uncorrelated, they are also independent.
Assignment 1.4
• Show that a linear transformation of a normal random vector is also normal.
Sum of Independent Random Variables
• If a random variable y(ζ) = Σ c_k x_k(ζ) is a linear combination of M statistically independent random variables, its pdf and statistical descriptors are easy to obtain:
– Mean: m_y = Σ c_k m_k
– Variance: σ_y² = Σ c_k² σ_k²
– Probability density function (for a plain sum, all c_k = 1): the convolution of the component pdfs, f_y = f_{x1} * f_{x2} * … * f_{xM}
• Example: What is the pdf of y(ζ) if it is the sum of four independent identically distributed random variables, each uniformly distributed over [−0.5, 0.5]?
• Solution: convolve repeatedly:
U[−0.5, 0.5] * U[−0.5, 0.5] = f_{x12}
f_{x12} * U[−0.5, 0.5] = f_{x123}
f_{x123} * U[−0.5, 0.5] = f_{x1234}
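The repeated convolutions in the example can be carried out numerically on a grid. A sketch assuming NumPy (the grid step dx is an arbitrary discretization choice); the resulting density is piecewise cubic on [−2, 2] with peak 2/3 at y = 0:

```python
import numpy as np

dx = 0.001
x1 = np.arange(-0.5, 0.5, dx)  # grid for one U[-0.5, 0.5] density
f = np.ones_like(x1)           # f_x(x) = 1 on [-0.5, 0.5]

# Repeated discrete convolution approximates the pdf of the sum
# (each convolution is scaled by dx to remain a density)
f12 = np.convolve(f, f) * dx       # triangle on [-1, 1]
f123 = np.convolve(f12, f) * dx    # piecewise quadratic on [-1.5, 1.5]
f1234 = np.convolve(f123, f) * dx  # piecewise cubic on [-2, 2]

print(f1234.max())       # peak value at y = 0, close to 2/3
print(f1234.sum() * dx)  # total probability, close to 1
```

Plotting the four densities also shows the sum rapidly approaching a bell shape, an illustration of the central limit theorem.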
Conditional Density
• Provides a measure of the degree of dependence of the variables on each other.
• From Bayes' rule, the joint pdf is given as
f_{x,y}(x, y) = f_{x|y}(x|y) f_y(y) = f_{y|x}(y|x) f_x(x)
• If they are independent,
f_{x|y}(x|y) = f_x(x) and f_{x,y}(x, y) = f_x(x) f_y(y)
Ensemble Averages
• In most cases, the statistical descriptors are obtained from actual observed data instead of from the pdf.
• If the expected value of the ensemble (sample) statistic equals the actual statistic, it is called an unbiased estimator.
• If, in addition, the variance of the estimator becomes very small as the number of observations grows, it is called a consistent estimator.
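Both properties can be seen empirically for the sample mean. A sketch assuming NumPy; the true mean and variance are arbitrary illustrative values, and each row of the experiment is one independent trial:

```python
import numpy as np

rng = np.random.default_rng(3)
true_mean, true_var = 5.0, 4.0  # illustrative ensemble statistics
trials = 500                    # independent repetitions of the experiment

var_of_estimator = {}
for N in (10, 1_000, 10_000):
    data = true_mean + np.sqrt(true_var) * rng.normal(size=(trials, N))
    means = data.mean(axis=1)  # one sample-mean estimate per trial
    var_of_estimator[N] = means.var()
    # The average of the estimates stays near 5 (unbiased), while their
    # spread shrinks like true_var / N (consistent)
    print(N, means.mean(), var_of_estimator[N])
```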
Assignment 1
1. Show that the mth derivative of the moment generating function with respect to s, evaluated at s = 0, gives the mth moment.
2. Find the mean, variance, moments and moment generating functions of the Uniform, Normal and Cauchy RVs.
3. Show that a linear transformation of a normal random vector is also normal.
