
Random Vectors

Random Signals in Communications

Outline

1 Introduction
2 Basic Quantities
    Definitions
3 Transformation
    Transformation of Random Vectors
4 Covariance Matrices
    Covariance Matrices
    Transformation/Diagonalization
    Examples
5 Gaussian Random Vectors
    pdf
    Transformed pdf

Introduction

A random vector is just a collection of random variables. Consider random vectors X and Y:

    X = (X_1, ..., X_n)^T
    Y = (Y_1, ..., Y_m)^T

Each entry is itself a random variable. Each dimension could have an identical or a different distribution, and the dimensions could be independent or correlated.

Dr. Adam Panagos

Basic Quantities

Definitions

Definition: The Joint Distribution Function of the random variables X and Y is defined as

    F_XY(x, y) = P[X ≤ x, Y ≤ y]

Properties include:

    F_XY(∞, ∞) = 1
    F_XY(−∞, y) = F_XY(x, −∞) = 0
    F_XY(∞, y) = F_Y(y)
    F_XY(x, ∞) = F_X(x)
    ∂²F_XY(x, y)/∂x∂y = f_XY(x, y)

Can generalize to n random variables.

Definition: The Probability Distribution Function (PDF) of the random vector X = (X_1, X_2, ..., X_n)^T is defined as

    F_X(x) = P(X ≤ x)
           = P(X_1 ≤ x_1, X_2 ≤ x_2, ..., X_n ≤ x_n)
           = P(∩_{k=1}^n {X_k ≤ x_k})

It has the same properties: F_X(∞) = 1 and F_X(−∞) = 0.

Definition: The Probability Density Function (pdf) of the random vector X = (X_1, X_2, ..., X_n)^T can be obtained from F_X(x) as

    f_X(x) = ∂^n F_X(x) / (∂x_1 ⋯ ∂x_n)

(the derivatives must exist).

Definition: The Joint Distribution Function of the random vectors X = (X_1, X_2, ..., X_n)^T and Y = (Y_1, Y_2, ..., Y_m)^T is defined as

    F_X,Y(x, y) = P(X ≤ x, Y ≤ y)

Definition: The Joint Density Function of the random vectors X = (X_1, X_2, ..., X_n)^T and Y = (Y_1, Y_2, ..., Y_m)^T is defined as

    f_X,Y(x, y) = ∂^{n+m} F_X,Y(x, y) / (∂x_1 ⋯ ∂x_n ∂y_1 ⋯ ∂y_m)

(the derivatives must exist).

Definition: The Marginal Density Function of the random vector X = (X_1, X_2, ..., X_n)^T can be obtained from the joint density function as

    f_X(x) = ∫ ⋯ ∫ f_XY(x, y) dy_1 ⋯ dy_m
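The marginalization integral can be checked numerically. Below is a minimal NumPy sketch (not from the slides; the grid bounds and the example joint pdf, two independent standard Gaussians, are illustrative choices): integrating the joint density over y should recover the standard Gaussian marginal of X.

```python
import numpy as np

# Joint pdf of two independent standard Gaussians on a grid
x = np.linspace(-6, 6, 401)
y = np.linspace(-6, 6, 401)
X, Y = np.meshgrid(x, y, indexing="ij")
f_xy = np.exp(-0.5 * (X**2 + Y**2)) / (2 * np.pi)

# f_X(x) = integral of f_XY(x, y) dy, via the trapezoidal rule over y
f_x = np.sum(0.5 * (f_xy[:, :-1] + f_xy[:, 1:]) * np.diff(y), axis=1)

# The marginal should match the 1-D standard Gaussian density
expected = np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)
err = np.max(np.abs(f_x - expected))
```

The tail mass beyond |y| = 6 is negligible, so the numerical marginal agrees with the analytic one to high accuracy.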

Mean Vector

Definition: The Mean Vector μ of the random vector X = (X_1, X_2, ..., X_n)^T is a vector whose elements are given by

    μ_i = ∫ ⋯ ∫ x_i f_X(x_1, ..., x_n) dx_1 ⋯ dx_n

Definition: The random vector X = (X_1, X_2, ..., X_n)^T is Jointly Gaussian iff the joint density function has the form

    f_X(x) = 1/((2π)^{n/2} |det(K)|^{1/2}) exp( −(1/2) (x − μ)^T K^{−1} (x − μ) )

with K_{i,j} = E{(X_i − μ_i)(X_j − μ_j)}, μ_i the mean of X_i, x = (x_1, ..., x_n)^T, and μ = (μ_1, ..., μ_n)^T.

Uncorrelated

Definition: Let X and Y be real n-dimensional random vectors with mean vectors μ_X and μ_Y respectively. The random vectors are uncorrelated if

    E{X Y^T} = μ_X μ_Y^T

A natural extension of the uncorrelated definition for random variables.

Orthogonal

Definition: Let X and Y be real n-dimensional random vectors. The random vectors are orthogonal if

    E{X Y^T} = 0

This implies that the expected value of the inner product is zero, i.e. E{X^T Y} = 0 (the inner product yields a scalar).

Independent

Definition: Let X and Y be real n-dimensional random vectors with joint pdf f_XY(x, y). The random vectors are independent if

    f_XY(x, y) = f_X(x) f_Y(y)

A natural extension of the independence definition for random variables.

    Independence ⇒ Uncorrelated
    Uncorrelated + Gaussian ⇒ Independence
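The converse fails without Gaussianity, and a quick numerical illustration makes this concrete. The counterexample below is a standard one, not from the slides: X uniform on {−1, 0, 1} and Y = X² are uncorrelated yet clearly dependent.

```python
import numpy as np

# X uniform on {-1, 0, 1}, Y = X**2: uncorrelated but dependent
support = np.array([-1, 0, 1])
p = np.array([1/3, 1/3, 1/3])

EX = np.sum(support * p)        # E[X]   = 0
EY = np.sum(support**2 * p)     # E[Y]   = 2/3
EXY = np.sum(support**3 * p)    # E[XY]  = E[X^3] = 0
cov = EXY - EX * EY             # covariance is exactly 0 -> uncorrelated

# But the joint pmf does not factor:
# P(X=0, Y=0) = 1/3 while P(X=0)P(Y=0) = 1/9
p_joint = 1/3
p_prod = (1/3) * (1/3)
```

Since Y is a deterministic function of X, the vectors could hardly be more dependent, yet every covariance test comes back zero.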

Transformation

Problem Statement

Given a random vector X with pdf f_X(x), create a new vector of random variables Y using the transformation

    Y_1 = g_1(X_1, X_2, ..., X_n)
    Y_2 = g_2(X_1, X_2, ..., X_n)
        ⋮
    Y_n = g_n(X_1, X_2, ..., X_n)

Problem Solution

Solve the transformation equations for the x_k in terms of the y_k:

    x_1 = g_1^{−1}(y_1, y_2, ..., y_n)
    x_2 = g_2^{−1}(y_1, y_2, ..., y_n)
        ⋮
    x_n = g_n^{−1}(y_1, y_2, ..., y_n)

In general, there could be multiple solutions, denoted as x^{(i)} for i = 1, ..., r.

Then

    f_Y(y) = Σ_{i=1}^{r} f_X(x^{(i)}) / |J_i|

where |·| denotes the determinant operation and J_i is the Jacobian evaluated at x^{(i)}. The Jacobian is defined as

    J = [ ∂g_1/∂x_1  ⋯  ∂g_1/∂x_n ]
        [     ⋮      ⋱      ⋮     ]
        [ ∂g_n/∂x_1  ⋯  ∂g_n/∂x_n ]
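For an invertible linear map Y = AX there is a single solution branch x = A^{−1}y and the Jacobian is simply A, so f_Y(y) = f_X(A^{−1}y)/|det A|. A minimal NumPy check of this special case (the matrix A, the test point, and the choice of a standard-Gaussian f_X are all illustrative): the formula must agree with the N(0, AA^T) density evaluated directly.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 1.5]])   # invertible, so r = 1 solution branch
y = np.array([0.7, -0.3])    # arbitrary test point

def f_x(x):
    # standard 2-D Gaussian density
    return np.exp(-0.5 * x @ x) / (2 * np.pi)

# Change-of-variables formula: f_Y(y) = f_X(A^{-1} y) / |det A|
x_sol = np.linalg.solve(A, y)
f_y_formula = f_x(x_sol) / abs(np.linalg.det(A))

# Direct evaluation of the N(0, K) density with K = A A^T
K = A @ A.T
quad = y @ np.linalg.solve(K, y)
f_y_direct = np.exp(-0.5 * quad) / (2 * np.pi * np.sqrt(np.linalg.det(K)))
```

The two numbers match to machine precision, which is exactly the "transformed pdf" result derived for Gaussian vectors later in the deck.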

An Example

(See notes: Random Vector Transformation.)

Covariance Matrices

Definition: Let X be a real-valued random vector with associated mean vector μ. The covariance matrix K is

    K = E[(X − μ)(X − μ)^T]

(See notes: Covariance Form.)
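The definition translates directly into a sample estimate: average the outer products (x − μ̂)(x − μ̂)^T over the data. A NumPy sketch with synthetic data (an illustrative choice, not from the slides), cross-checked against np.cov:

```python
import numpy as np

# Synthetic correlated data: standard normals mixed by a lower-
# triangular matrix, so the true covariance is B B^T
rng = np.random.default_rng(0)
B = np.array([[1.0, 0.0, 0.0],
              [0.5, 1.0, 0.0],
              [0.2, 0.3, 1.0]])
X = rng.standard_normal((5000, 3)) @ B.T

# K = E[(X - mu)(X - mu)^T], estimated by a sample average
mu = X.mean(axis=0)
D = X - mu
K_hat = (D.T @ D) / len(X)

# np.cov with bias=True uses the same 1/N estimator
K_np = np.cov(X, rowvar=False, bias=True)
```

The hand-rolled estimator and np.cov agree exactly; with bias=False (the default), np.cov would instead divide by N − 1.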

A matrix A is positive semidefinite if

    z^T A z ≥ 0 for all z

and positive definite if

    z^T A z > 0 for all z ≠ 0

Covariance matrices are positive semidefinite. (See notes: A Proof.)

An eigenvalue λ and eigenvector φ of the matrix A satisfy the characteristic equation

    Aφ = λφ

We typically normalize eigenvectors to unit length, i.e.

    φ^T φ = ||φ||² = 1

Comments

Eigen-analysis of covariance matrices is used throughout signal processing, STAP, information theory, pattern recognition, classification, etc.

    Can show that the eigenvalues of a real, symmetric covariance matrix are always ≥ 0.
    Can show that the eigenvectors of a real, symmetric (r.s.) matrix are mutually orthogonal.
    Eigenvalues satisfy det(A − λI) = 0.

(See notes/Matlab: Eigenvalue and Eigenvector Computations.)
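The slides demonstrate these computations in Matlab; an equivalent NumPy sketch is below. The matrix is an illustrative example built as BB^T so it is a valid covariance matrix. It checks the three stated properties: eigenvalues ≥ 0, mutually orthogonal unit eigenvectors, and det(A − λI) = 0 at each eigenvalue.

```python
import numpy as np

# A valid covariance matrix: A = B B^T is real, symmetric, PSD
B = np.array([[2.0, 0.0],
              [1.0, 1.0],
              [0.5, 0.3]])
A = B @ B.T                      # 3x3, rank 2 (one zero eigenvalue)

# eigh is the right routine for real symmetric matrices
lam, Phi = np.linalg.eigh(A)

# Property 1: eigenvalues >= 0 (up to roundoff)
nonneg = bool(np.all(lam >= -1e-12))

# Property 2: unit eigenvectors, mutually orthogonal (Phi orthogonal)
ortho = bool(np.allclose(Phi.T @ Phi, np.eye(3)))

# Property 3: each eigenvalue is a root of det(A - lam*I)
char = [abs(np.linalg.det(A - l * np.eye(3))) for l in lam]
```

Note np.linalg.eigh exploits symmetry and returns real, sorted eigenvalues; the general-purpose np.linalg.eig would also work but may return tiny imaginary parts.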

Transformation/Diagonalization

Introduction

Two techniques for covariance matrices:

    Diagonalization of a covariance matrix
    Joint diagonalization of two covariance matrices

These techniques are useful in signal processing, classification/discrimination, etc., and are often seen in journal papers, textbooks, etc. We're essentially going to rotate the covariance matrix to get uncoupled random variables. First we need some basic definitions/theorems.

Definitions

Definition: Two n × n matrices A and B are similar if there exists an n × n matrix T with det(T) ≠ 0 such that

    T^{−1} A T = B

T is a transformation matrix.

Theorems

Theorem: Let M be a real symmetric (r.s.) matrix with eigenvalues λ_1, ..., λ_n. Then M has n mutually orthogonal unit eigenvectors φ_1, ..., φ_n.

Theorem: An n × n matrix M is similar to a diagonal matrix if and only if M has n linearly independent eigenvectors.

Diagonalization

Form the matrix U whose columns are the unit eigenvectors of the covariance matrix M, i.e.

    U = (φ_1, ..., φ_n)

Matrix M is transformed as

    U^{−1} M U = Λ = diag(λ_1, ..., λ_n)

(See Matlab notes: Covariance Diagonalization.)
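The slides demo this in Matlab; here is an equivalent NumPy sketch with an illustrative 2 × 2 covariance matrix. Since M is real symmetric, U is orthogonal, so U^{−1} = U^T and the transform is just a rotation.

```python
import numpy as np

# Illustrative covariance matrix (symmetric, positive definite)
M = np.array([[3.0, 1.0],
              [1.0, 2.0]])

# Columns of U are the unit eigenvectors; eigh also returns
# the eigenvalues in ascending order
lam, U = np.linalg.eigh(M)

# U^{-1} M U = U^T M U should be diag(lam): the rotated
# coordinates are uncoupled (zero off-diagonal covariance)
Lambda = U.T @ M @ U
off_diag = abs(Lambda[0, 1]) + abs(Lambda[1, 0])
```

Equivalently, if X has covariance M, the transformed vector Y = U^T X has the diagonal covariance Λ, i.e. uncorrelated components.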

Joint Diagonalization

Let P and Q be n × n symmetric matrices with P positive definite. There exists an n × n matrix V such that

    V^T P V = I
    V^T Q V = diag(λ_1, ..., λ_n)

where λ_1, ..., λ_n are generalized eigenvalues satisfying

    Q v_i = λ_i P v_i

Procedure:

1. Calculate the generalized eigenvalues λ_i for i = 1, ..., n as the eigenvalues of P^{−1} Q.
2. Calculate the unnormalized eigenvectors v′_i for i = 1, ..., n by solving (P^{−1} Q − λ_i I) v′_i = 0.
3. Find normalization constants K_i for i = 1, ..., n such that v_i = K_i v′_i satisfies v_i^T P v_i = 1.
4. The vectors v_i form the columns of V.
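The four-step procedure above can be sketched directly in NumPy (the slides reference a Matlab demo; the P and Q below are illustrative positive-definite matrices):

```python
import numpy as np

P = np.array([[2.0, 0.5],
              [0.5, 1.0]])
Q = np.array([[1.0, 0.3],
              [0.3, 3.0]])

# Step 1: generalized eigenvalues = eigenvalues of P^{-1} Q
lam, Vraw = np.linalg.eig(np.linalg.inv(P) @ Q)
lam = np.real(lam)

# Steps 2-3: eig already gives the (unnormalized) eigenvectors;
# rescale each so that v^T P v = 1
V = np.zeros((2, 2))
for i in range(2):
    v = np.real(Vraw[:, i])
    V[:, i] = v / np.sqrt(v @ P @ v)

# Step 4: the columns of V jointly diagonalize P and Q
I_check = V.T @ P @ V            # should be the identity
D_check = V.T @ Q @ V            # should be diag(lam)
```

In practice one would call a generalized symmetric eigensolver (e.g. scipy.linalg.eigh(Q, P)) rather than form P^{−1} explicitly, but the explicit steps mirror the slide's procedure.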

Examples

Example (see Matlab notes: Joint Diagonalization).

Classification Example

The random vector X is our observation or measurement vector. Assuming X comes from one of two classes ω_1 or ω_2, we define

    μ_i = E[X | ω_i]
    K_i = E[(X − μ_i)(X − μ_i)^T | ω_i]

Given the n-dimensional observation X, we reduce it to the scalar feature Y via

    Y = a^T X

This operation projects X along the direction a, where ||a|| = 1.

The projected class means and variances are

    m_i = E[Y | ω_i] = a^T μ_i
    σ_i² = a^T K_i a

(Figure: class-conditional pdfs of the projected feature Y for the different classes.)


The choice of a is important: we want to maximize the distance between the projected means m_i and minimize the variances σ_i². That is, we want to maximize the cost function

    J(a) = (m_1 − m_2)² / (σ_1² + σ_2²)

Expanding,

    J(a) = (m_1 − m_2)² / (σ_1² + σ_2²)
         = (a^T μ_1 − a^T μ_2)² / (a^T K_1 a + a^T K_2 a)
         = (a^T (μ_1 − μ_2))² / (a^T (K_1 + K_2) a)
         = a^T (μ_1 − μ_2)(μ_1 − μ_2)^T a / (a^T (K_1 + K_2) a)

Define

    Q = (μ_1 − μ_2)(μ_1 − μ_2)^T
    P = K_1 + K_2

so J(a) can be written as

    J(a) = a^T Q a / (a^T P a)

Let a = Vb, where V jointly diagonalizes P and Q. Then J(a) can be written as

    J(a) = b^T V^T Q V b / (b^T V^T P V b)
         = b^T Λ b / ||b||²

which we maximize via the theorem to follow.

Theorem: Let M be a real symmetric (r.s.) matrix with largest eigenvalue λ_1. Then

    λ_1 = max_x (x^T M x / ||x||²)

and the maximum is achieved for x = Kφ_1, where φ_1 is the unit eigenvector associated with λ_1 and K is any real-valued constant.

Here M = Λ is diagonal, so φ_1 is the standard basis vector selecting the largest λ_i; taking b = φ_1 gives

    a = V b = V φ_1 = v_1
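This Rayleigh-quotient theorem is easy to verify numerically. A sketch with an illustrative symmetric M: the quotient evaluated at the top unit eigenvector equals λ_1, and no random direction exceeds it.

```python
import numpy as np

rng = np.random.default_rng(1)
M = np.array([[4.0, 1.0],
              [1.0, 2.0]])       # illustrative real symmetric matrix

lam, Phi = np.linalg.eigh(M)     # eigenvalues in ascending order
lam1, phi1 = lam[-1], Phi[:, -1] # largest eigenvalue and its eigenvector

def rayleigh(x):
    return (x @ M @ x) / (x @ x)

at_eigvec = rayleigh(phi1)       # attains lam1 exactly
others = [rayleigh(rng.standard_normal(2)) for _ in range(1000)]
```

Scaling x by any constant K leaves the quotient unchanged, which is why the theorem allows x = Kφ_1.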

Since a is just the eigenvector associated with λ_1, it satisfies

    P^{−1} Q a = λ_1 a

Substituting for Q, we have that a satisfies

    P^{−1} (μ_1 − μ_2)(μ_1 − μ_2)^T a = λ_1 a

But (μ_1 − μ_2)^T a is just a scalar; let's denote it k. Then

    a = (k/λ_1) P^{−1} (μ_1 − μ_2)

a is called the Fisher Linear Discriminant. We usually normalize such that ||a|| = 1.
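A NumPy sketch of the result (the class means and covariances are illustrative, not from the slides): compute a ∝ P^{−1}(μ_1 − μ_2), normalize it, and confirm it maximizes J(a) compared to random directions.

```python
import numpy as np

rng = np.random.default_rng(2)
mu1 = np.array([1.0, 0.0])
mu2 = np.array([-1.0, 0.5])
K1 = np.array([[1.0, 0.2], [0.2, 1.0]])
K2 = np.array([[0.5, 0.0], [0.0, 2.0]])

P = K1 + K2
d = mu1 - mu2

# Fisher linear discriminant: a proportional to P^{-1}(mu1 - mu2)
a = np.linalg.solve(P, d)
a = a / np.linalg.norm(a)        # normalize so ||a|| = 1

def J(v):
    # (v^T (mu1 - mu2))^2 / (v^T P v); scale-invariant in v
    return (v @ d) ** 2 / (v @ P @ v)

J_fisher = J(a)
J_others = [J(rng.standard_normal(2)) for _ in range(1000)]
```

Because J is invariant to scaling of its argument, the normalization step changes nothing about optimality, only the length of a.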

Using this vector gives us the best chance of making the correct decision. This example and others similar would be covered in a detection and estimation theory course.

Gaussian Random Vectors

Introduction

Recall the scalar Gaussian pdf

    f_X(x) = (1/(√(2π) σ)) exp( −(1/2) ((x − μ)/σ)² )

For a random vector X with independent Gaussian components, we can write the pdf as

    f_X(x) = (1/((2π)^{n/2} σ_1 ⋯ σ_n)) exp( −(1/2) Σ_{i=1}^{n} ((x_i − μ_i)/σ_i)² )

This can be written more compactly using matrices as

    f_X(x) = (1/((2π)^{n/2} det(K)^{1/2})) exp( −(1/2) (x − μ)^T K^{−1} (x − μ) )

where

    K = diag(σ_1², ..., σ_n²)

What is det(K)? What is K^{−1}? It is easy to see the matrix form holds for independent variables, but it also holds for arbitrary K.
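For diagonal K the two questions have closed-form answers: det(K) is the product of the σ_i², and K^{−1} = diag(1/σ_1², ..., 1/σ_n²). A NumPy sketch with illustrative parameters confirms both, and shows the matrix form collapsing to the product of 1-D Gaussians:

```python
import numpy as np

sigma = np.array([1.0, 2.0, 0.5])   # illustrative standard deviations
mu = np.array([0.0, 1.0, -1.0])     # illustrative means
K = np.diag(sigma**2)
x = np.array([0.3, 0.5, -0.8])      # arbitrary evaluation point
n = len(x)

# Matrix form of the Gaussian pdf
d = x - mu
quad = d @ np.linalg.inv(K) @ d
f_matrix = np.exp(-0.5 * quad) / ((2*np.pi)**(n/2) * np.sqrt(np.linalg.det(K)))

# Product of 1-D Gaussian pdfs (independent components)
f_product = np.prod(np.exp(-0.5*((x - mu)/sigma)**2) / (np.sqrt(2*np.pi)*sigma))

# det of a diagonal matrix = product of its diagonal entries
det_check = np.isclose(np.linalg.det(K), np.prod(sigma**2))
```

The two pdf evaluations agree to machine precision, since the quadratic form reduces to Σ((x_i − μ_i)/σ_i)² when K is diagonal.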

Transformed pdf

Consider now transforming X using the nonsingular n × n transformation matrix A to yield

    Y = AX

What is the distribution of Y? Using transformation of random vectors, one can show that Y is also a Gaussian random vector with

    E[Y] = E[AX] = A E[X] = Aμ = μ_Y

    K_Y = E[A(X − μ)(X − μ)^T A^T]
        = A E[(X − μ)(X − μ)^T] A^T
        = A K A^T
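A quick Monte Carlo sanity check of K_Y = AKA^T (A, K, and the fixed seed are illustrative choices): transform Gaussian samples with covariance K and compare the sample covariance of Y to the theory.

```python
import numpy as np

rng = np.random.default_rng(3)
K = np.array([[2.0, 0.6],
              [0.6, 1.0]])
A = np.array([[1.0, -1.0],
              [0.5, 2.0]])

# Draw zero-mean Gaussian samples with covariance K, then apply A
X = rng.multivariate_normal(mean=[0, 0], cov=K, size=200_000)
Y = X @ A.T                      # each row is y = A x

K_Y_theory = A @ K @ A.T
K_Y_sample = np.cov(Y, rowvar=False)
max_err = np.max(np.abs(K_Y_sample - K_Y_theory))
```

The sample covariance converges to AKA^T at the usual 1/√N Monte Carlo rate, so with 200,000 samples the entrywise error is small relative to the matrix entries.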

Thus the transformed pdf is

    f_Y(y) = (1/((2π)^{n/2} det(K_Y)^{1/2})) exp( −(1/2) (y − μ_Y)^T K_Y^{−1} (y − μ_Y) )
