
Random Vectors

Random Signals in Communications

Outline

1 Introduction
2 Basic Quantities
  - Definitions
3 Transformation
  - Transformation of Random Vectors
4 Covariance Matrices
  - Covariance Matrices
  - Transformation/Diagonalization
  - Examples
5 Gaussian Random Vectors
  - pdf
  - Transformed pdf


Introduction

A random vector is just a collection of random variables. Consider random vectors X and Y:

X = (X1, . . . , Xn)^T
Y = (Y1, . . . , Ym)^T

Each entry is itself a random variable. Each dimension could have an identical or a different distribution, and the dimensions could be independent or correlated.
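The deck's examples are in Matlab; as a small NumPy sketch of the same idea (the mean vector and covariance matrix below are made-up illustration values), we can draw samples of a 2-D random vector whose dimensions are correlated and check the sample statistics:

```python
import numpy as np

# Draw samples of a 2-D random vector X = (X1, X2)^T with correlated
# dimensions.  mu and K are illustration values, not from the slides.
rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0])
K = np.array([[1.0, 0.8],
              [0.8, 1.0]])          # correlation 0.8 between dimensions
samples = rng.multivariate_normal(mu, K, size=200_000)   # shape (N, 2)

# Each entry of the vector is itself a random variable; sample statistics
# approximate the true mean vector and correlation.
emp_mean = samples.mean(axis=0)
emp_corr = np.corrcoef(samples.T)[0, 1]
print(emp_mean, emp_corr)
```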

Dr. Adam Panagos


Definitions

Definition
The Joint Distribution Function of the random variables X and Y is defined as

F_XY(x, y) = P[X ≤ x, Y ≤ y]

Properties include:

F_XY(∞, ∞) = 1
F_XY(−∞, y) = F_XY(x, −∞) = 0
F_XY(∞, y) = F_Y(y)
F_XY(x, ∞) = F_X(x)
∂²F_XY(x, y)/∂x∂y = f_XY(x, y)

Can generalize to N random variables.
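A quick Monte Carlo sketch of these properties (X and Y here are hypothetical jointly normal random variables with correlation 0.5, not values from the slides):

```python
import numpy as np

# Empirical joint distribution function F_XY(x, y) = P[X <= x, Y <= y]
# and a check of the marginal property F_XY(x, inf) = F_X(x).
rng = np.random.default_rng(1)
K = [[1.0, 0.5], [0.5, 1.0]]
X, Y = rng.multivariate_normal([0, 0], K, size=500_000).T

def F_XY(x, y):
    # empirical joint CDF estimated from the samples
    return np.mean((X <= x) & (Y <= y))

big = 1e9                       # stands in for +infinity
print(F_XY(big, big))           # F_XY(inf, inf) = 1
print(F_XY(0.3, big))           # equals the marginal CDF of X at 0.3
print(np.mean(X <= 0.3))        # direct marginal estimate, same value
```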


Definition
The Probability Distribution Function (PDF) of the random vector X = (X1, X2, . . . , Xn)^T is defined as

F_X(x) = P(X ≤ x)
       = P(X1 ≤ x1, X2 ≤ x2, . . . , Xn ≤ xn)
       = P(∩_{k=1}^{n} {Xk ≤ xk})

Has the same properties: F_X(∞, . . . , ∞) = 1 and F_X(−∞, . . . , −∞) = 0.


Definition
The Probability Density Function (pdf) of the random vector X = (X1, X2, . . . , Xn)^T can be obtained from F_X(x) as

f_X(x) = ∂ⁿF_X(x) / (∂x1 · · · ∂xn)

(the derivatives must exist).


Definition
The Joint Distribution Function of the random vectors X = (X1, X2, . . . , Xn)^T and Y = (Y1, Y2, . . . , Ym)^T is defined as

F_X,Y(x, y) = P(X ≤ x, Y ≤ y)


Definition
The Joint Density Function of the random vectors X = (X1, X2, . . . , Xn)^T and Y = (Y1, Y2, . . . , Ym)^T is defined as

f_X,Y(x, y) = ∂^(n+m) F_X,Y(x, y) / (∂x1 · · · ∂xn ∂y1 · · · ∂ym)

(the derivatives must exist).


Definition
The Marginal Density Function of the random vector X = (X1, X2, . . . , Xn)^T can be obtained from the joint density function as

f_X(x) = ∫ · · · ∫ f_X,Y(x, y) dy1 · · · dym

with each integral taken over (−∞, ∞).
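A small numerical sketch of marginalization: integrate a joint density over y on a grid. The joint density used here is a hypothetical independent bivariate standard normal (my choice for illustration), so the computed marginal should match the 1-D standard normal pdf.

```python
import numpy as np

# Numerically marginalize a joint density:  f_X(x) = ∫ f_XY(x, y) dy.
def f_XY(x, y):
    # independent bivariate standard normal (illustration)
    return np.exp(-0.5 * (x**2 + y**2)) / (2 * np.pi)

x_grid = np.linspace(-3, 3, 61)
y_grid = np.linspace(-8, 8, 4001)        # wide enough to capture the mass in y
dy = y_grid[1] - y_grid[0]
joint = f_XY(x_grid[:, None], y_grid[None, :])
f_X = joint.sum(axis=1) * dy             # Riemann-sum approximation of the integral

true_marginal = np.exp(-0.5 * x_grid**2) / np.sqrt(2 * np.pi)
err = np.max(np.abs(f_X - true_marginal))
print(err)                                # tiny integration error
```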


Mean Vector

Definition
The Mean Vector μ = (μ1, . . . , μn)^T of the random vector X = (X1, X2, . . . , Xn)^T is a vector whose elements are given by

μ_i = ∫ · · · ∫ x_i f_X(x1, . . . , xn) dx1 · · · dxn


Definition
The random vector X = (X1, X2, . . . , Xn)^T is Jointly Gaussian iff the joint density function has the form

f_X(x) = (1 / ((2π)^(n/2) |det(K)|^(1/2))) exp(−(1/2)(x − μ)^T K^(−1) (x − μ))

with K_ij = E{(X_i − μ_i)(X_j − μ_j)}, μ_i the mean of X_i, x = (x1, . . . , xn)^T, and μ = (μ1, . . . , μn)^T.
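The density formula can be implemented directly. As a sanity check (with illustration values for μ and K), a diagonal K makes the components independent, so the joint pdf should equal the product of 1-D normal pdfs:

```python
import numpy as np

# Direct implementation of the jointly Gaussian density
#   f_X(x) = (2π)^(-n/2) |det K|^(-1/2) exp(-1/2 (x-μ)^T K^(-1) (x-μ))
def gaussian_pdf(x, mu, K):
    n = len(mu)
    d = x - mu
    norm = (2 * np.pi) ** (n / 2) * np.sqrt(abs(np.linalg.det(K)))
    return np.exp(-0.5 * d @ np.linalg.solve(K, d)) / norm

mu = np.array([0.0, 0.0])
K = np.diag([1.0, 4.0])          # diagonal K => independent components
x = np.array([0.5, -1.0])

def norm_pdf(v, m, s2):
    # 1-D normal pdf with mean m and variance s2
    return np.exp(-0.5 * (v - m) ** 2 / s2) / np.sqrt(2 * np.pi * s2)

print(gaussian_pdf(x, mu, K))
print(norm_pdf(0.5, 0, 1) * norm_pdf(-1.0, 0, 4))   # factors into 1-D pdfs
```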


Uncorrelated

Definition
Let X and Y be real n-dimensional random vectors with mean vectors μ_X and μ_Y respectively. The random vectors are uncorrelated if

E{X Y^T} = μ_X μ_Y^T

A natural extension of the uncorrelated definition for random variables.


Orthogonal

Definition
Let X and Y be real n-dimensional random vectors. The random vectors are orthogonal if

E{X Y^T} = 0

This implies that the expected value of the inner product is zero, i.e. E{X^T Y} = 0, since E{X^T Y} = tr(E{X Y^T}). The inner product yields a scalar.


Independent

Definition
Let X and Y be real n-dimensional random vectors with joint pdf f_XY(x, y). The random vectors are independent if

f_XY(x, y) = f_X(x) f_Y(y)

A natural extension of the independence definition for random variables.

Independence ⇒ Uncorrelated
Uncorrelated + Gaussian ⇒ Independence
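Without Gaussianity the converse fails. A classic counterexample (my illustration, not from the slides): X standard normal and Y = X² are uncorrelated, since E[XY] = E[X³] = 0, yet clearly dependent:

```python
import numpy as np

# Uncorrelated but dependent: X ~ N(0,1), Y = X^2.
rng = np.random.default_rng(2)
X = rng.standard_normal(1_000_000)
Y = X ** 2

corr = np.corrcoef(X, Y)[0, 1]             # approximately zero
# Dependence: knowing |X| is large tells us Y is large.
E_Y = Y.mean()                             # approximates E[X^2] = 1
E_Y_given_big_X = Y[np.abs(X) > 1].mean()  # noticeably larger than E[Y]
print(corr, E_Y, E_Y_given_big_X)
```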


Problem Statement

Given a random vector X = (X1, . . . , Xn)^T with pdf f_X(x), create a new vector of random variables Y using the transformation

Y1 = g1(X1, X2, . . . , Xn)
Y2 = g2(X1, X2, . . . , Xn)
⋮
Yn = gn(X1, X2, . . . , Xn)


Problem Solution

For a given y, solve the system y_k = g_k(x1, . . . , xn) for the x's:

x1 = φ1(y1, y2, . . . , yn)
x2 = φ2(y1, y2, . . . , yn)
⋮
xn = φn(y1, y2, . . . , yn)

In general, there could be multiple solutions, denoted as x^(i) for i = 1, . . . , r.


The pdf of Y is then

f_Y(y) = Σ_{i=1}^{r} f_X(x^(i)) / |J_i|

where | · | denotes the determinant operation. The Jacobian is defined as

J = [ ∂g1/∂x1  · · ·  ∂g1/∂xn ]
    [    ⋮      ⋱       ⋮    ]
    [ ∂gn/∂x1  · · ·  ∂gn/∂xn ]

with J_i denoting J evaluated at x = x^(i).
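A Monte Carlo sanity check of this formula for a simple linear map (my illustration): Y1 = X1 + X2, Y2 = X1 − X2 with X uniform on the unit square. There is a single solution (r = 1), J = [[1, 1], [1, −1]], |det J| = 2, so f_Y = 1/2 on its support:

```python
import numpy as np

# Check f_Y(y) = f_X(x)/|J| = 1/2 by estimating the probability of a
# small box in y-space: it should be (1/2) * (box area).
rng = np.random.default_rng(3)
X1, X2 = rng.random((2, 1_000_000))     # X uniform on the unit square
Y1, Y2 = X1 + X2, X1 - X2               # the transformation g

# Box of area 0.2 * 0.2 centered at (1, 0), inside the support of Y.
in_box = (np.abs(Y1 - 1.0) < 0.1) & (np.abs(Y2) < 0.1)
p_hat = in_box.mean()
print(p_hat)                            # close to 0.5 * 0.04 = 0.02
```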


An Example

Notes: Random Vector Transformation


Covariance Matrices

Definition
Let X be a real-valued random vector with associated mean vector μ. The covariance matrix K is

K = E[(X − μ)(X − μ)^T]

Notes: Covariance Form
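A NumPy sketch of estimating K directly from the outer-product definition (the true μ and K below are illustration values):

```python
import numpy as np

# Sample estimate of the covariance matrix K = E[(X - μ)(X - μ)^T].
rng = np.random.default_rng(4)
K_true = np.array([[2.0, 0.6],
                   [0.6, 1.0]])
mu_true = np.array([1.0, 3.0])
X = rng.multivariate_normal(mu_true, K_true, size=300_000)   # (N, 2)

mu_hat = X.mean(axis=0)
D = X - mu_hat
K_hat = (D.T @ D) / len(X)      # average of the outer products (x-μ)(x-μ)^T

print(K_hat)                    # close to K_true; symmetric by construction
```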


A matrix A is positive semidefinite if

z^T A z ≥ 0 for all z

A matrix A is positive definite if

z^T A z > 0 for all z ≠ 0

Covariance matrices are positive semidefinite.

Notes: A Proof


An eigenvector φ and eigenvalue λ of the matrix A satisfy

A φ = λ φ

equivalently, λ is a root of the characteristic equation. We typically normalize eigenvectors to unit length, i.e.

φ^T φ = ||φ||² = 1


Comments

Eigen-analysis appears throughout signal processing, STAP, information theory, pattern recognition, classification, etc.
Can show that the eigenvalues of a real, symmetric, positive semidefinite matrix (such as a covariance matrix) are always ≥ 0.
Can show that the eigenvectors of a real, symmetric (r.s.) matrix are mutually orthogonal.
Eigenvalues satisfy det(A − λI) = 0.

Notes/Matlab: Eigenvalue and Eigenvector Computations
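The slides point to a Matlab example; the same computation in NumPy, using the illustration covariance matrix from earlier, shows these facts numerically:

```python
import numpy as np

# Eigen-analysis of a real symmetric covariance matrix: eigenvalues are
# real and >= 0, and the eigenvectors are orthonormal.
K = np.array([[2.0, 0.6],
              [0.6, 1.0]])
lam, U = np.linalg.eigh(K)      # eigh is the routine for symmetric matrices

print(lam)                      # nonnegative eigenvalues, ascending order
print(U.T @ U)                  # identity: orthonormal eigenvectors
print(K @ U[:, 0] - lam[0] * U[:, 0])   # each pair satisfies K φ = λ φ
```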


Transformation/Diagonalization

Introduction

Two techniques for diagonalizing covariance matrices:

- Diagonalization of a Covariance Matrix
- Joint Diagonalization of Two Covariance Matrices

These techniques are useful in signal processing, classification/discrimination, etc., and these approaches are often seen in journal papers, textbooks, etc.

We're essentially going to "rotate" the covariance matrix to get uncoupled random variables. First we need some basic definitions/theorems.


Definitions

Definition
Two n × n matrices A and B are similar if there exists an n × n matrix T with det(T) ≠ 0 such that

T^(−1) A T = B

T is a transformation matrix.


Theorems

Theorem
Let M be a real symmetric (r.s.) matrix with eigenvalues λ1, . . . , λn. Then M has n mutually orthogonal unit eigenvectors φ1, . . . , φn.

Theorem
An n × n matrix M is similar to a diagonal matrix if and only if M has n linearly independent eigenvectors.


Diagonalization

Form the matrix U whose columns are the unit eigenvectors of the covariance matrix M, i.e.

U = (φ1, . . . , φn)

The matrix M is transformed as

U^(−1) M U = Λ

where Λ = diag(λ1, . . . , λn).

Matlab: Covariance Diagonalization
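The Matlab demo the slide refers to is not reproduced here; an equivalent NumPy sketch with an illustration matrix:

```python
import numpy as np

# Diagonalization: with U = (φ1, ..., φn) built from the unit
# eigenvectors of a symmetric covariance matrix M, U^{-1} M U = Λ.
M = np.array([[3.0, 1.0, 0.0],
              [1.0, 2.0, 0.5],
              [0.0, 0.5, 1.0]])
lam, U = np.linalg.eigh(M)              # columns of U are unit eigenvectors
Lam = np.linalg.inv(U) @ M @ U          # equals U^T M U since U is orthogonal

print(np.round(Lam, 10))                # diagonal matrix of the eigenvalues
```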


Joint Diagonalization

Let P and Q be n × n covariance matrices with P positive definite. There exists an n × n matrix V such that

V^T P V = I
V^T Q V = diag(λ1, . . . , λn)

where λ1, . . . , λn are generalized eigenvalues satisfying

Q v_i = λ_i P v_i


The matrix V can be found as follows:

1 Calculate the eigenvalues λ_i of P^(−1) Q for i = 1, . . . , n
2 Calculate the unnormalized eigenvectors v′_i for i = 1, . . . , n by solving (P^(−1) Q − λ_i I) v′_i = 0
3 Find normalization constants K_i for i = 1, . . . , n such that v_i = K_i v′_i satisfies v_i^T P v_i = 1
4 The vectors v_i form the columns of V
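The four steps above can be sketched in NumPy (P and Q are random illustration matrices, with P made positive definite):

```python
import numpy as np

# Joint diagonalization following the steps on the slide.
rng = np.random.default_rng(5)
A = rng.standard_normal((3, 3)); P = A @ A.T + 3 * np.eye(3)   # positive definite
B = rng.standard_normal((3, 3)); Q = B @ B.T                    # covariance-like

# Steps 1-2: eigenvalues/eigenvectors of P^{-1} Q (real for this problem).
lam, Vp = np.linalg.eig(np.linalg.inv(P) @ Q)
lam, Vp = lam.real, Vp.real

# Step 3: normalize each eigenvector so that v^T P v = 1.
norms = np.sqrt(np.array([Vp[:, i] @ P @ Vp[:, i] for i in range(3)]))
V = Vp / norms                                  # step 4: columns of V

print(np.round(V.T @ P @ V, 8))                 # identity
print(np.round(V.T @ Q @ V, 8))                 # diag(λ1, ..., λn)
```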


Examples

Example

Matlab: Joint Diagonalization


Classification Example

The random vector X is our observation or measurement vector, drawn from one of two classes ω1 and ω2. We define the class-conditional means and covariances

μ_i = E[X | ω_i]
K_i = E[(X − μ_i)(X − μ_i)^T | ω_i]


Given the n-dimensional observation X, we reduce it to the scalar feature Y via

Y = a^T X

This operation projects X along the direction a, where ||a|| = 1.


The class-conditional feature means and variances are

m_i = E[Y | ω_i] = a^T μ_i
σ_i² = a^T K_i a

[Figure: projected feature distributions for the different classes]


The choice of a is important: we want to maximize the distance between the means m_i and minimize the variances σ_i². That is, we want to maximize the cost function

J(a) = (m_1 − m_2)² / (σ_1² + σ_2²)


J(a) = (m_1 − m_2)² / (σ_1² + σ_2²)
     = (a^T μ_1 − a^T μ_2)² / (a^T K_1 a + a^T K_2 a)
     = (a^T (μ_1 − μ_2))² / (a^T (K_1 + K_2) a)
     = (a^T (μ_1 − μ_2)(μ_1 − μ_2)^T a) / (a^T (K_1 + K_2) a)


Define

Q = (μ_1 − μ_2)(μ_1 − μ_2)^T
P = K_1 + K_2

so J(a) can be written as

J(a) = (a^T Q a) / (a^T P a)


Let a = Vb, where V jointly diagonalizes P and Q. Then J(a) can be written as

J(a) = (b^T V^T Q V b) / (b^T V^T P V b)
     = (b^T Λ b) / ||b||²

Maximize via the theorem to follow.


Theorem
Let M be a real symmetric (r.s.) matrix with largest eigenvalue λ_1. Then

λ_1 = max_x (x^T M x) / ||x||²

and the maximum is achieved for x = K φ_1, where φ_1 is the unit eigenvector associated with λ_1 and K is any nonzero real-valued constant.

Here M = Λ is diagonal, so the unit eigenvector for its largest eigenvalue λ_1 is the standard basis vector e_1, giving

a = Vb = V e_1 = v_1


Since a is just the eigenvector associated with λ_1, it satisfies

P^(−1) Q a = λ_1 a

Substituting for Q, we have that a satisfies

P^(−1) (μ_1 − μ_2)(μ_1 − μ_2)^T a = λ_1 a

But (μ_1 − μ_2)^T a is just a scalar; let's denote it as k. Then

a = (k / λ_1) P^(−1) (μ_1 − μ_2)

a is called the Fisher Linear Discriminant. We usually normalize such that ||a|| = 1.
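A NumPy sketch of the Fisher direction a ∝ P^(−1)(μ_1 − μ_2) with made-up class parameters; since J is scale-invariant, the normalized Fisher direction should beat any other direction:

```python
import numpy as np

# Fisher linear discriminant for two hypothetical classes.
mu1 = np.array([2.0, 0.0])
mu2 = np.array([0.0, 1.0])
K1 = np.array([[1.0, 0.3], [0.3, 1.0]])
K2 = np.array([[1.5, -0.2], [-0.2, 0.5]])
P = K1 + K2

def J(a):
    # cost J(a) = (a^T(μ1-μ2))^2 / (a^T P a)
    return float((a @ (mu1 - mu2)) ** 2 / (a @ P @ a))

a = np.linalg.solve(P, mu1 - mu2)   # a ∝ P^{-1}(μ1 - μ2)
a /= np.linalg.norm(a)              # normalize so ||a|| = 1

# Compare against random projection directions: a maximizes J.
rng = np.random.default_rng(6)
others = rng.standard_normal((100, 2))
print(J(a), max(J(b) for b in others))
```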


Using this vector gives us the best chance of making the correct decision. This example, and others like it, would be covered in a detection and estimation theory course.


Gaussian Random Vectors

Recall the pdf of a scalar Gaussian random variable:

f_X(x) = (1 / (√(2π) σ)) exp(−(1/2)((x − μ)/σ)²)

For a random vector X with n independent Gaussian components, we can write the pdf as the product of the component pdfs:

f_X(x) = (1 / ((2π)^(n/2) σ_1 · · · σ_n)) exp(−(1/2) Σ_{i=1}^{n} ((x_i − μ_i)/σ_i)²)


This can be written more compactly using matrices as

f_X(x) = (1 / ((2π)^(n/2) det(K)^(1/2))) exp(−(1/2)(x − μ)^T K^(−1) (x − μ))

where

K = diag(σ_1², . . . , σ_n²)

What is det(K)? What is K^(−1)? (For this diagonal K, det(K) = σ_1² · · · σ_n² and K^(−1) = diag(1/σ_1², . . . , 1/σ_n²).)

It is easy to see that the matrix form holds for independent variables, but it also holds for arbitrary K.


Transformed pdf

Consider now transforming X using the nonsingular n × n transformation matrix A to yield

Y = AX

What is the distribution of Y? Using transformation of random vectors, one can show that Y is also a Gaussian random vector with

E[Y] = E[AX] = A E[X] = A μ = μ_Y

K_Y = E[(Y − μ_Y)(Y − μ_Y)^T]
    = E[A(X − μ)(X − μ)^T A^T]
    = A E[(X − μ)(X − μ)^T] A^T
    = A K A^T
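A quick Monte Carlo check of K_Y = A K A^T (A and K are illustration values):

```python
import numpy as np

# Sample covariance of Y = AX should approach A K A^T.
rng = np.random.default_rng(7)
K = np.array([[1.0, 0.4],
              [0.4, 2.0]])
A = np.array([[2.0, 1.0],
              [0.0, 1.0]])            # nonsingular transformation matrix
X = rng.multivariate_normal([0, 0], K, size=400_000)
Y = X @ A.T                           # applies Y = AX to every sample row

K_Y_hat = np.cov(Y.T)
print(K_Y_hat)
print(A @ K @ A.T)                    # theoretical K_Y, nearly the same
```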


The transformed pdf is therefore

f_Y(y) = (1 / ((2π)^(n/2) det(K_Y)^(1/2))) exp(−(1/2)(y − μ_Y)^T K_Y^(−1) (y − μ_Y))
