$X = [X_1, X_2, X_3, \ldots, X_N]$
The last property describes what is known as the marginal probability distribution functions.
$$f_{X,Y}(x,y) = \frac{\partial^2 F_{X,Y}(x,y)}{\partial x\, \partial y}$$

(iii) $F_{X,Y}(x,y) = \displaystyle\int_{-\infty}^{y} \int_{-\infty}^{x} f_{X,Y}(x,y)\,dx\,dy$

(iv) $F_X(x) = \displaystyle\int_{-\infty}^{x} \int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dy\,dx, \qquad F_Y(y) = \displaystyle\int_{-\infty}^{y} \int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dx\,dy$

(v) $f_X(x) = \displaystyle\int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dy, \qquad f_Y(y) = \displaystyle\int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dx$
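As a quick numerical check of property (v), the sketch below (not from the notes; it assumes a simple joint density $f_{X,Y}(x,y) = e^{-(x+y)}$ for $x, y \ge 0$) integrates out one variable at a time on a grid and recovers the expected exponential marginals.

```python
import numpy as np

# Hypothetical joint density f_{X,Y}(x, y) = exp(-(x + y)) for x, y >= 0
# (two independent unit exponentials); chosen only to illustrate property (v).
dx = dy = 0.01
x = np.arange(0.0, 20.0, dx)
y = np.arange(0.0, 20.0, dy)
f_xy = np.exp(-(x[:, None] + y[None, :]))   # joint density on a grid: axis 0 is x, axis 1 is y

# Marginal densities: integrate (here, sum) the joint density over the other variable.
f_x = f_xy.sum(axis=1) * dy                 # f_X(x) ≈ ∫ f_{X,Y}(x, y) dy
f_y = f_xy.sum(axis=0) * dx                 # f_Y(y) ≈ ∫ f_{X,Y}(x, y) dx

print(np.max(np.abs(f_x - np.exp(-x))))     # small: the marginal of X is e^{-x}
print(np.max(np.abs(f_y - np.exp(-y))))     # small: the marginal of Y is e^{-y}
print(f_x.sum() * dx)                       # ≈ 1, as a valid density must integrate to 1
```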
If X and Y are statistically independent, the joint density factors into the product of the marginals:

$$f_{X,Y}(x,y) = f_X(x)\, f_Y(y)$$
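A minimal Monte Carlo sketch of this factorization, assuming an independent pair (X uniform on (0,1), Y standard normal): for independent variables the joint distribution function also factors, $F_{X,Y}(x,y) = F_X(x)F_Y(y)$, which is easier to estimate from samples than the density itself.

```python
import numpy as np

# Assumed setup: X ~ Uniform(0, 1) and Y ~ N(0, 1) drawn independently,
# so the factorization of the joint distribution should hold.
rng = np.random.default_rng(0)
n = 1_000_000
X = rng.uniform(0.0, 1.0, n)
Y = rng.normal(0.0, 1.0, n)

x0, y0 = 0.3, 0.5                                  # an arbitrary evaluation point
F_joint = np.mean((X <= x0) & (Y <= y0))           # empirical F_{X,Y}(x0, y0)
F_product = np.mean(X <= x0) * np.mean(Y <= y0)    # empirical F_X(x0) * F_Y(y0)

print(F_joint, F_product)   # the two estimates agree to Monte Carlo accuracy
```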
For a function $g(X, Y)$ of a two-dimensional random vector, the expectation (mean) is given by

$$E[g(X,Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x,y)\, f_{X,Y}(x,y)\,dx\,dy$$
Assume $g(X_1, X_2, X_3, \ldots, X_N) = a_1 X_1 + a_2 X_2 + \cdots + a_N X_N$. Then

$$E[g(X_1, X_2, \ldots, X_N)] = E\!\left[\sum_{i=1}^{N} a_i X_i\right] = \int_{-\infty}^{\infty} \!\cdots\! \int_{-\infty}^{\infty} (a_1 x_1 + a_2 x_2 + \cdots + a_N x_N)\, f_{X_1, X_2, \ldots, X_N}(x_1, x_2, \ldots, x_N)\,dx_1\,dx_2 \cdots dx_N$$

Carrying out the integration term by term,

$$E\!\left[\sum_{i=1}^{N} a_i X_i\right] = \sum_{i=1}^{N} a_i\, E[X_i]$$
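A minimal Monte Carlo sketch of this linearity property, with assumed (and deliberately dependent) distributions for the $X_i$; the two sides of the identity agree regardless of any dependence between the variables.

```python
import numpy as np

# Monte Carlo check (assumed distributions) of E[sum a_i X_i] = sum a_i E[X_i].
rng = np.random.default_rng(1)
n = 1_000_000
X1 = rng.exponential(2.0, n)            # E[X1] = 2
X2 = rng.normal(5.0, 1.0, n)            # E[X2] = 5
X3 = X1 + rng.uniform(0.0, 1.0, n)      # dependent on X1, E[X3] = 2.5
a = np.array([0.5, -1.0, 3.0])

lhs = np.mean(a[0] * X1 + a[1] * X2 + a[2] * X3)              # E[a1 X1 + a2 X2 + a3 X3]
rhs = a[0] * X1.mean() + a[1] * X2.mean() + a[2] * X3.mean()  # a1 E[X1] + a2 E[X2] + a3 E[X3]

print(lhs, rhs)   # both ≈ 0.5*2 - 1.0*5 + 3.0*2.5 = 3.5
```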
Let $g(X,Y) = X^n Y^k$. Then the joint moments of the two random variables about the origin are given by

$$E[g(X,Y)] = E[X^n Y^k] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x^n y^k\, f_{X,Y}(x,y)\,dx\,dy$$
There are two first-order joint moments about the origin, $m_{10}$ and $m_{01}$, given below:
$$m_{10} = E[X^1 Y^0] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x\, f_{X,Y}(x,y)\,dx\,dy = E[X]$$

$$m_{01} = E[X^0 Y^1] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} y\, f_{X,Y}(x,y)\,dx\,dy = E[Y]$$
There are three second-order joint moments, $m_{20}$, $m_{02}$, and $m_{11}$, given below:
$$m_{20} = E[X^2 Y^0] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x^2 f_{X,Y}(x,y)\,dx\,dy = E[X^2]$$

$$m_{02} = E[X^0 Y^2] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} y^2 f_{X,Y}(x,y)\,dx\,dy = E[Y^2]$$

$$m_{11} = E[X^1 Y^1] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x y\, f_{X,Y}(x,y)\,dx\,dy = E[XY]$$
$m_{20}$ denotes the mean-square value of X, $m_{02}$ gives the mean-square value of Y, and $m_{11}$ is called the correlation of X and Y, denoted $R_{XY}$.
The correlation between two random variables X and Y is defined as
$$R_{XY} = E[XY] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x y\, f_{X,Y}(x,y)\,dx\,dy$$
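The sketch below estimates these moments, including the correlation $R_{XY} = m_{11}$, from samples of an assumed pair $X \sim N(1,1)$ and $Y = 2X + N(0,1)$, for which the theoretical values are easy to work out.

```python
import numpy as np

# Sketch estimating joint moments about the origin for an assumed pair (X, Y):
# X ~ N(1, 1) and Y = 2X + N(0, 1), so every moment below has a known value.
rng = np.random.default_rng(2)
n = 1_000_000
X = rng.normal(1.0, 1.0, n)
Y = 2.0 * X + rng.normal(0.0, 1.0, n)

m10 = np.mean(X)          # E[X]                 -> 1
m01 = np.mean(Y)          # E[Y]                 -> 2
m20 = np.mean(X**2)       # E[X^2]               -> Var(X) + E[X]^2 = 2
m02 = np.mean(Y**2)       # E[Y^2]               -> 4*2 + 1 + 0 ... = 9
m11 = np.mean(X * Y)      # E[XY] = R_XY         -> 2*E[X^2] = 4

print(m10, m01, m20, m02, m11)
```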
If $R_{XY} = E[X]\,E[Y]$, then the two random variables X and Y are called uncorrelated.
Statistical independence is a sufficient condition for two random variables to be uncorrelated. However, the converse is not true, i.e., if two random variables are uncorrelated they may or may not be independent.
However, if two random variables are jointly Gaussian and uncorrelated, then they are statistically independent as well.
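A standard illustration of this point (an assumed example, not from the notes) is $X \sim N(0,1)$ with $Y = X^2$: the pair is uncorrelated but clearly dependent, and it is not jointly Gaussian, so there is no contradiction with the remark above.

```python
import numpy as np

# Uncorrelated does not imply independent: X ~ N(0, 1) and Y = X^2.
# E[XY] = E[X^3] = 0 = E[X]E[Y], so the covariance is zero,
# yet Y is a deterministic function of X.
rng = np.random.default_rng(3)
n = 1_000_000
X = rng.normal(0.0, 1.0, n)
Y = X**2

cov = np.mean(X * Y) - X.mean() * Y.mean()    # C_XY ≈ 0  -> uncorrelated
print(cov)

# Dependence is obvious from conditional behaviour: Y > 1 exactly when |X| > 1,
# so P(Y > 1 | |X| > 1) = 1 while the unconditional P(Y > 1) ≈ 0.32.
print(np.mean(Y > 1.0), np.mean(Y[np.abs(X) > 1.0] > 1.0))
```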
Joint Central Moments of Two Random Variables
The joint central moments of two random variables X and Y are defined as
$$\mu_{nk} = E[(X - m_X)^n (Y - m_Y)^k] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (x - m_X)^n (y - m_Y)^k f_{X,Y}(x,y)\,dx\,dy$$

In particular,

$$\mu_{20} = E[(X - m_X)^2] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (x - m_X)^2 f_{X,Y}(x,y)\,dx\,dy = \sigma_X^2 = \mathrm{Var}(X)$$
Similarly,

$$\mu_{02} = E[(Y - m_Y)^2] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (y - m_Y)^2 f_{X,Y}(x,y)\,dx\,dy = \sigma_Y^2 = \mathrm{Var}(Y)$$
And

$$\mu_{11} = E[(X - m_X)(Y - m_Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (x - m_X)(y - m_Y)\, f_{X,Y}(x,y)\,dx\,dy$$
This is known as the covariance between X and Y and is denoted $C_{XY}$. Thus the covariance $C_{XY}$ is defined as
$$C_{XY} = E[(X - m_X)(Y - m_Y)] = E[XY] - m_X m_Y$$

$$C_{XY} = R_{XY} - m_X m_Y$$
The normalized covariance, called the correlation coefficient, is

$$\rho = \frac{C_{XY}}{\sigma_X \sigma_Y} = \frac{E[(X - m_X)(Y - m_Y)]}{\sqrt{\sigma_X^2\, \sigma_Y^2}}$$
$\rho$ can take values from $-1$ to $+1$; $\rho = 0$ implies that the two random variables are uncorrelated.
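The following sketch, assuming a correlated pair $X \sim N(2, 3^2)$ and $Y = 0.5X + N(-1, 2^2)$, verifies the identity $C_{XY} = R_{XY} - m_X m_Y$ from samples and computes $\rho$, comparing it with NumPy's built-in estimate (theoretical value $\rho = 0.6$ here).

```python
import numpy as np

# Sketch verifying C_XY = R_XY - m_X m_Y and rho = C_XY / (sigma_X sigma_Y)
# on an assumed correlated pair (X, Y).
rng = np.random.default_rng(4)
n = 1_000_000
X = rng.normal(2.0, 3.0, n)
Y = 0.5 * X + rng.normal(-1.0, 2.0, n)

R_xy = np.mean(X * Y)                              # correlation (joint moment m11)
C_xy = np.mean((X - X.mean()) * (Y - Y.mean()))    # covariance from the definition
print(C_xy, R_xy - X.mean() * Y.mean())            # the two agree: C_XY = R_XY - m_X m_Y

rho = C_xy / (X.std() * Y.std())                   # correlation coefficient, -1 <= rho <= +1
print(rho, np.corrcoef(X, Y)[0, 1])                # ≈ 0.6, matches NumPy's built-in estimate
```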