
Principal Component Analysis

Principal Component Analysis (PCA) is a mathematical approach that uses an orthogonal transformation to convert a set of correlated variables into a set of linearly uncorrelated variables called principal components. The number of principal components is less than or equal to the number of original correlated variables. PCA is a widely used approach for face recognition and other eigenvector-based multivariate analyses.
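As a concrete illustration, the sketch below (the synthetic three-variable data set is an assumption made only for this example) eigendecomposes the covariance matrix of correlated variables and projects the data onto the resulting principal components; the covariance matrix of the projected data comes out diagonal, i.e. the new variables are linearly uncorrelated.

    import numpy as np

    # A minimal PCA sketch on synthetic data; the variable names and the
    # three-variable example are illustrative assumptions, not from the text.
    rng = np.random.default_rng(0)
    latent = rng.normal(size=(200, 1))
    # Three correlated variables: each is the same latent signal plus small noise.
    X = np.hstack([latent + 0.1 * rng.normal(size=(200, 1)) for _ in range(3)])

    # Centre the data and compute the covariance matrix of the correlated variables.
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)

    # The orthogonal transformation is given by the eigenvectors of the covariance matrix.
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]      # order components by decreasing variance
    components = eigvecs[:, order]

    # Projecting onto the principal components yields linearly uncorrelated variables:
    # their covariance matrix is (numerically) diagonal.
    scores = Xc @ components
    print(np.round(np.cov(scores, rowvar=False), 6))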

1.1 Mathematical Approach: Eigenvalues and Eigenvectors


In linear algebra, an eigenvector of a linear operator is a non-zero vector which, when the operator is applied to it, yields a scalar multiple of itself. The scalar is called the eigenvalue (λ) associated with the eigenvector (X). An eigenvector is thus a vector that is only scaled by the linear transformation; it is a property of the matrix: when the matrix acts on it, only the vector's magnitude changes, not its direction.

AX = λX, where A is a square matrix, X is the eigenvector and λ is the eigenvalue.

(A − λI)X = 0, where I is the identity matrix.

This is a homogeneous system of equations, and from fundamental linear algebra we know that a non-trivial solution exists if and only if det(A − λI) = 0, where det denotes the determinant.

When evaluated, this determinant becomes a polynomial of degree n in λ, called the characteristic polynomial of A. If A is n by n, the characteristic polynomial has n roots, so there are n eigenvalues of A satisfying AXi = λiXi, where i = 1, 2, ..., n.
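The following NumPy snippet gives a quick numerical check of these relations on an arbitrary 2 by 2 matrix chosen purely for illustration: each eigenpair satisfies AXi = λiXi, and det(A − λiI) vanishes.

    import numpy as np

    # A small numerical check of the relations above, using an arbitrary
    # 2 x 2 matrix chosen only for illustration.
    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])

    # n eigenvalues and their associated eigenvectors (columns of X).
    eigvals, X = np.linalg.eig(A)

    for i in range(len(eigvals)):
        lam, x = eigvals[i], X[:, i]
        # A Xi = lambda_i Xi
        assert np.allclose(A @ x, lam * x)
        # det(A - lambda_i I) = 0, up to floating-point error
        assert abs(np.linalg.det(A - lam * np.eye(2))) < 1e-9

    print("eigenvalues:", eigvals)   # roots of the characteristic polynomial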

If the eigenvalues are all distinct, the associated eigenvectors are linearly independent, their directions are unique, and they span an n-dimensional Euclidean space.
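A small NumPy check of this property, using an arbitrary 3 by 3 upper-triangular matrix as an assumed example: its eigenvalues are distinct, and the matrix of eigenvectors has full rank, so the eigenvectors span the three-dimensional space.

    import numpy as np

    # Illustration that distinct eigenvalues give linearly independent eigenvectors;
    # the 3 x 3 upper-triangular matrix below is an arbitrary example.
    A = np.array([[1.0, 1.0, 1.0],
                  [0.0, 2.0, 1.0],
                  [0.0, 0.0, 3.0]])

    eigvals, X = np.linalg.eig(A)                      # eigenvalues are the diagonal: 1, 2, 3
    assert len(np.unique(np.round(eigvals, 6))) == 3   # all eigenvalues distinct
    assert np.linalg.matrix_rank(X) == 3               # eigenvectors span 3-dimensional space

    print("eigenvalues:", eigvals)
    print("rank of eigenvector matrix:", np.linalg.matrix_rank(X))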

Face Recognition using Eigenfaces
