LINK FOR THE VIDEOS

• https://drive.google.com/file/d/1IjgXMUiqIR0abEcPavWEHPPGkne4A9CX/view?usp=sharing
EIGENVALUES, EIGENVECTORS AND THEIR REAL LIFE APPLICATIONS
DONE BY: BINIT GIRISH SAHANI
REG NO: 22BCE3625
WHAT IS AN EIGENVALUE?
EIGENVALUE AND EIGENVECTOR

• For a non-zero vector u, the scalar λ is called an eigenvalue of the matrix
A, and u is called an eigenvector belonging (or corresponding) to λ, if they
satisfy Au = λu.
• If u = O, then AO = λO = O holds for every λ; this is called the trivial
solution u = O. In this chapter we consider only the non-trivial solutions,
u ≠ O (not zero).
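The defining relation Au = λu can be checked numerically. Below is a minimal NumPy sketch; the 2×2 matrix is purely illustrative:

```python
import numpy as np

# An illustrative 2x2 matrix; its characteristic polynomial is
# lambda^2 - 7*lambda + 10, so its eigenvalues are 2 and 5.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and the eigenvectors as columns.
eigenvalues, eigenvectors = np.linalg.eig(A)

for i in range(len(eigenvalues)):
    lam = eigenvalues[i]
    u = eigenvectors[:, i]
    # Verify the defining relation A u = lambda u for each pair.
    assert np.allclose(A @ u, lam * u)
```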
DEFINITION

• Eigenvalues are the special set of scalars associated with a system of linear
equations, most often with matrix equations. ‘Eigen’ is a German word that
means ‘proper’ or ‘characteristic’; eigenvalues are therefore also called
characteristic values, characteristic roots, proper values or latent roots.

• Eigenvectors are the non-zero vectors whose direction does not change when a
linear transformation is applied; they change only by a scalar factor. An
eigenvector is also called a characteristic vector.

Note: 1) There can be infinitely many eigenvectors corresponding to
one eigenvalue: any non-zero scalar multiple of an eigenvector is again an
eigenvector.
• 2) Eigenvectors corresponding to distinct eigenvalues are linearly independent.
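Both notes can be verified directly. A small NumPy sketch with an illustrative diagonal matrix (so the eigenvectors are known in advance):

```python
import numpy as np

# Diagonal matrix: its eigenvalues are the diagonal entries 2 and 3.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

u = np.array([1.0, 0.0])  # an eigenvector for eigenvalue 2

# Note 1: any non-zero scalar multiple of u is also an eigenvector for 2.
for c in (0.5, -4.0, 100.0):
    assert np.allclose(A @ (c * u), 2.0 * (c * u))

# Note 2: eigenvectors for the distinct eigenvalues 2 and 3 are linearly
# independent -- the matrix with them as columns has non-zero determinant.
v = np.array([0.0, 1.0])  # an eigenvector for eigenvalue 3
assert abs(np.linalg.det(np.column_stack([u, v]))) > 1e-12
```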
PROPERTIES OF EIGEN VALUES

• Let A be a square matrix of order n. Then,


1) If λ1, λ2, …, λn are the eigenvalues of A, then:
(i) λ1 + λ2 + … + λn = trace of A
(ii) λ1 · λ2 · … · λn = |A|
(iii) λ1, λ2, …, λn are also the eigenvalues of A^T
(iv) For any positive integer k, λ1^k, λ2^k, …, λn^k are the eigenvalues of A^k
2) If A is non-singular, then all its eigenvalues are non-zero.
3) Inverse matrix: if A is non-singular and λ is an eigenvalue of A, then 1/λ is an eigenvalue of A^(-1).
4) Scalar multiple of a matrix: if λ is an eigenvalue of the square matrix A, then a·λ
is an eigenvalue of a·A for any scalar a.
5) If A is a diagonal or triangular matrix, then the principal diagonal elements are the eigenvalues of
the matrix A.
6) The eigenvalues of an orthogonal matrix have unit modulus (|λ| = 1).
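Several of these properties can be checked numerically. A NumPy sketch, using an illustrative symmetric 2×2 matrix:

```python
import numpy as np

# An illustrative matrix with trace 5 and determinant 5.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

eigs = np.linalg.eigvals(A)

# (i) the sum of the eigenvalues equals the trace of A.
assert np.isclose(eigs.sum(), np.trace(A))
# (ii) the product of the eigenvalues equals |A|.
assert np.isclose(eigs.prod(), np.linalg.det(A))
# (iii) A and A^T have the same eigenvalues.
assert np.allclose(np.sort(eigs), np.sort(np.linalg.eigvals(A.T)))
# (iv) the eigenvalues of A^k are lambda^k (here k = 3).
A3 = np.linalg.matrix_power(A, 3)
assert np.allclose(np.sort(eigs**3), np.sort(np.linalg.eigvals(A3)))
# 3) the eigenvalues of A^(-1) are 1/lambda.
assert np.allclose(np.sort(1 / eigs), np.sort(np.linalg.eigvals(np.linalg.inv(A))))
```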
CHARACTERISTIC EQUATION

• We know that Au = λu.

• Since Iu = u, we can write λu = λIu, where I is the identity matrix. So we can
rewrite the relation as
• Au − λIu = O
• (A − λI)u = O
• A homogeneous system Mx = O has non-trivial (non-zero) solutions ⇔ det(M) = 0.
• Applying this result to (A − λI)u = O: a non-zero eigenvector u exists exactly when

det(A − λI) = 0

This is the Characteristic Equation.

• The roots of the characteristic equation of A are the eigenvalues of A.
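The characteristic-equation route can be sketched numerically. For a 2×2 matrix, det(A − λI) expands to λ² − trace(A)·λ + |A|; the matrix below is illustrative:

```python
import numpy as np

# Illustrative matrix; det(A - lambda*I) = lambda^2 - 7*lambda + 10.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Coefficients of the characteristic polynomial of a 2x2 matrix:
# lambda^2 - trace(A)*lambda + det(A).
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
roots = np.roots(coeffs)

# The roots of the characteristic equation are exactly the eigenvalues.
assert np.allclose(np.sort(roots), np.sort(np.linalg.eigvals(A)))
```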


EIGEN VECTOR

• For each eigenvalue λ, determine a corresponding eigenvector u by solving the

system (A − λI)u = O.
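Solving (A − λI)u = O amounts to finding the null space of A − λI. One way to sketch this in NumPy is via the SVD, whose last right singular vector spans the null space when the smallest singular value is zero; the matrix and eigenvalue below are illustrative:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
lam = 5.0  # a root of the characteristic equation of A

# The eigenvector spans the null space of (A - lam*I). The right singular
# vector belonging to the (near-)zero singular value gives that direction.
M = A - lam * np.eye(2)
_, s, Vt = np.linalg.svd(M)
u = Vt[-1]            # unit-length null-space direction

assert s[-1] < 1e-10              # confirms lam really is an eigenvalue
assert np.allclose(A @ u, lam * u)  # u solves (A - lam*I)u = O
```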
REAL LIFE APPLICATIONS

• There are many applications of eigenvalues in real life.

A few are:
1. Mechanical Engineering
2. Oil Exploration (eigenvalue analysis is commonly used by oil firms to
explore land for oil)
3. Bridge Construction
4. Automobile Stereo System Design
5. Chemistry
6. Image and Signal Processing
EIGEN VALUE IN FACIAL
RECOGNITION

• In face recognition, eigenvalues represent how much the images in the training set vary
from the mean image; the higher the eigenvalue, the more important the corresponding eigenvector.
• Eigenfaces is a computer-vision method for face recognition. It is an appearance-based approach
that uses the variation in a collection of face images to encode and compare individual faces.
• The eigenface approach uses eigenvalues and eigenvectors to reduce dimensionality and project
training samples onto a small feature space. The number of eigenfaces to use is chosen
heuristically based on the eigenvalues.
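The eigenface computation described above can be sketched in NumPy. The random array below is a synthetic stand-in for a real training set of aligned face photographs, and the 90% variance cutoff is one possible heuristic, not a fixed rule:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a training set: 20 "face images" of 8x8 pixels,
# each flattened to a 64-element vector.
faces = rng.normal(size=(20, 64))

mean_face = faces.mean(axis=0)
centered = faces - mean_face   # variation of each image from the mean image

# Eigenfaces are the eigenvectors of the covariance of the centered images;
# the SVD of the centered data yields them directly and stably.
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
eigenvalues = S**2 / (len(faces) - 1)  # variance along each eigenface
eigenfaces = Vt                        # rows are eigenfaces

# Keep the eigenfaces with the largest eigenvalues -- here, enough to
# explain 90% of the variance (a heuristic choice).
explained = np.cumsum(eigenvalues) / eigenvalues.sum()
k = int(np.searchsorted(explained, 0.90)) + 1
top_eigenfaces = eigenfaces[:k]
```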
PRINCIPAL COMPONENT ANALYSIS (PCA) AND
SINGULAR VALUE DECOMPOSITION (SVD)

• PCA is a dimensionality reduction technique widely used in image and signal processing.
The basic idea is to transform the original data into a new coordinate system where the data
variance is maximized along the principal components (eigenvectors). The eigenvalues
associated with these eigenvectors indicate the amount of variance explained by each
principal component.
• The eigenvectors with the highest associated eigenvalues capture the most significant
variability in the data. By selecting a subset of these eigenvectors, you can create a new,
lower-dimensional representation of the data.
• This reduced representation retains the essential features of the original data while discarding
less significant variations. In the context of images, this can be especially useful for tasks
like facial recognition, where the key features can be preserved in a lower-dimensional
space.
• SVD is another matrix factorization technique that is closely related to PCA and is widely used in
image and signal processing.
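The PCA procedure described above can be sketched with NumPy on synthetic 2-D data where most of the variance lies along one direction (the data here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# Correlated 2-D data: the second coordinate mostly follows the first,
# so nearly all variance lies along a single direction.
x = rng.normal(size=500)
data = np.column_stack([x, 0.5 * x + 0.1 * rng.normal(size=500)])
data -= data.mean(axis=0)

# PCA: eigen-decompose the covariance matrix. The eigenvalues give the
# variance explained along each principal component (eigenvector).
cov = np.cov(data, rowvar=False)
eigenvalues, eigenvectors = np.linalg.eigh(cov)  # ascending order

# Project onto the top principal component -> 1-D representation.
top_pc = eigenvectors[:, -1]
reduced = data @ top_pc

# Fraction of total variance explained by the top component.
ratio = eigenvalues[-1] / eigenvalues.sum()
```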
IDENTIFYING FACE USING EIGENFACE

• Query Face = 36% of Eigenface 1 + (−8%) of Eigenface 2 + … + 21% of Eigenface N.

• To perform the actual face identification, Sirovich and Kirby proposed taking the
Euclidean distance between projected eigenface representations.
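The identification step can be sketched end-to-end in NumPy. The gallery below is hypothetical random data standing in for enrolled face images; each face is encoded by its projection weights (the "36%, −8%, …" coefficients), and the query is matched to the gallery face with the smallest Euclidean distance in weight space:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical gallery: 5 enrolled "faces" as 16-pixel vectors.
gallery = rng.normal(size=(5, 16))
mean_face = gallery.mean(axis=0)
centered = gallery - mean_face

# Eigenfaces from the gallery (top 4 right singular vectors).
_, _, Vt = np.linalg.svd(centered, full_matrices=False)
eigenfaces = Vt[:4]

# Each face is encoded by its projection weights onto the eigenfaces.
gallery_weights = centered @ eigenfaces.T

# Query: a slightly noisy copy of gallery face 2. Identify it by the
# smallest Euclidean distance between weight vectors.
query = gallery[2] + 0.01 * rng.normal(size=16)
query_weights = (query - mean_face) @ eigenfaces.T
distances = np.linalg.norm(gallery_weights - query_weights, axis=1)
match = int(np.argmin(distances))
assert match == 2  # the noisy query is matched back to face 2
```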
THANK YOU
