EIGENVALUES, EIGENVECTORS AND THEIR REAL LIFE APPLICATIONS
DONE BY: BINIT GIRISH SAHANI
REG NO: 22BCE3625
WHAT IS AN EIGENVALUE?
EIGENVALUES AND EIGENVECTORS
• Eigenvalues are a special set of scalars associated with a system of linear
equations, most often encountered in matrix equations. ‘Eigen’ is a German word
meaning ‘proper’ or ‘characteristic’; an eigenvalue is therefore also called a
characteristic value, characteristic root, proper value, or latent root.
• Eigenvectors are non-zero vectors whose direction does not change when a
linear transformation is applied; they are scaled only by a scalar factor (the
eigenvalue). An eigenvector is also called a characteristic vector.
• The eigenvalues λ of a square matrix A are the roots of the characteristic equation
det(A − λI) = 0
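The definitions above can be sketched numerically. The matrix `A` below is a hypothetical example chosen only for illustration; `np.linalg.eig` solves the characteristic equation and returns eigenvalue/eigenvector pairs that satisfy A·v = λ·v.

```python
import numpy as np

# Hypothetical 2x2 matrix, used only for illustration.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig finds the roots of det(A - lambda*I) = 0 and
# the corresponding eigenvectors (one per column).
eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify the defining property A v = lambda v for each pair:
# the eigenvector's direction is unchanged, only scaled by lambda.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)

print(sorted(eigenvalues.real))  # → [2.0, 5.0]
```

For this matrix the characteristic polynomial is (4 − λ)(3 − λ) − 2 = λ² − 7λ + 10, giving the roots λ = 2 and λ = 5.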
• In face recognition, eigenvalues represent how much the images in the training set vary
from the mean image. The higher the eigenvalue, the more important the eigenvector is.
• Eigenfaces is a computer-vision method for face recognition. It is an appearance-based approach
that uses the variation in a collection of face images to encode and compare individual faces.
• The eigenface approach uses eigenvalues and eigenvectors to reduce dimensionality and project a
training sample onto a small feature space. The number of eigenfaces to use is chosen
heuristically based on the eigenvalues.
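The eigenface pipeline described above can be sketched on toy data. The random 8×8 "images" stand in for a real training set, and `k = 4` is an arbitrary heuristic cutoff; both are assumptions for illustration, not part of the original method's data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a training set: 10 "face images" of 8x8 pixels,
# each flattened into a row of the data matrix.
images = rng.normal(size=(10, 64))

# 1. Centre the data: eigenvalues then measure variation about the mean image.
mean_image = images.mean(axis=0)
centered = images - mean_image

# 2. Eigendecomposition of the covariance matrix yields the eigenfaces.
cov = centered.T @ centered / (len(images) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order

# 3. Keep the eigenfaces with the largest eigenvalues
#    (the cutoff k is chosen heuristically).
k = 4
eigenfaces = eigvecs[:, -k:][:, ::-1]    # top-k, most important first

# 4. Project a face onto the small feature space: k weights describe it.
weights = centered[0] @ eigenfaces
print(weights.shape)  # → (4,)
```

A new face is then recognised by projecting it the same way and comparing its weight vector against those of the known faces.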
PRINCIPAL COMPONENT ANALYSIS (PCA) AND
SINGULAR VALUE DECOMPOSITION (SVD)
• PCA is a dimensionality reduction technique widely used in image and signal processing.
The basic idea is to transform the original data into a new coordinate system where the data
variance is maximized along the principal components (eigenvectors). The eigenvalues
associated with these eigenvectors indicate the amount of variance explained by each
principal component.
• The eigenvectors with the highest associated eigenvalues capture the most significant
variability in the data. By selecting a subset of these eigenvectors, you can create a new,
lower-dimensional representation of the data.
• This reduced representation retains the essential features of the original data while discarding
less significant variations. In the context of images, this can be especially useful for tasks
like facial recognition, where the key features can be preserved in a lower-dimensional
space.
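A minimal PCA sketch of the steps above, assuming synthetic data (the rank-2 matrix `X` is invented for illustration): centre the data, eigendecompose the covariance matrix, read the explained variance off the eigenvalues, and project onto the top components.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: 200 samples of 3 correlated features
# (constructed with rank 2, so two components suffice).
X = rng.normal(size=(200, 2)) @ rng.normal(size=(2, 3))

# Centre, then eigendecompose the covariance matrix.
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (len(X) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)      # ascending eigenvalues

# Sort descending: the largest eigenvalue marks the direction
# of maximum variance (the first principal component).
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Fraction of total variance explained by each principal component.
explained = eigvals / eigvals.sum()

# Project onto the top 2 components: a lower-dimensional representation
# that retains the essential variability of the data.
X_reduced = Xc @ eigvecs[:, :2]
print(X_reduced.shape)  # → (200, 2)
```

Because the toy data has rank 2, the first two components explain essentially all of the variance; on real images one would pick the cutoff from the eigenvalue spectrum.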
• SVD is another matrix factorization technique, closely related to PCA and widely used in
image and signal processing: any matrix X can be written as X = UΣVᵀ, where the singular
values in Σ are the square roots of the eigenvalues of XᵀX.
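The PCA–SVD connection can be checked directly. In this sketch (random data, assumed only for illustration), the SVD of the centred matrix reproduces the covariance eigenvalues, and truncating the singular values gives the low-rank approximation used in applications such as image compression.

```python
import numpy as np

rng = np.random.default_rng(2)

# Centre a hypothetical data matrix, as in PCA.
X = rng.normal(size=(50, 5))
Xc = X - X.mean(axis=0)

# X = U S V^T: the columns of V are the principal directions,
# and S**2 / (n - 1) equal the covariance eigenvalues.
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

cov_eigvals = np.linalg.eigvalsh(Xc.T @ Xc / (len(Xc) - 1))[::-1]
assert np.allclose(S**2 / (len(Xc) - 1), cov_eigvals)

# Rank-k approximation: keep only the k largest singular values.
k = 2
X_approx = U[:, :k] * S[:k] @ Vt[:k]
print(X_approx.shape)  # → (50, 5)
```

Computing the SVD of the data matrix directly is usually preferred over forming XᵀX, since it avoids squaring the condition number.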
IDENTIFYING FACES USING EIGENFACES