
ORDINARY DIFFERENTIAL EQUATIONS

ASSIGNMENT

Name: Muhammad Shoaib Khan
Roll #: 1713-MSCMATH-17
Topic: Eigenvalues and Eigenvectors
Prof. Name: Dr. Azhar Ali Zafar

Eigenvalues and Eigenvectors
Eigenvalues and eigenvectors feature prominently in the analysis of
linear transformations. The prefix eigen- is adopted from the German
word eigen, meaning "proper" or "characteristic". Originally used to
study the principal axes of the rotational motion of rigid bodies [4],
eigenvalues and eigenvectors have a wide range of applications, for
example in stability analysis, vibration analysis, atomic orbitals,
facial recognition, and matrix diagonalization.

Each eigenvector is like a skewer that helps hold the linear
transformation in place. Very (very, very) roughly then, the
eigenvalues of a linear mapping measure the distortion induced by the
transformation, and the eigenvectors tell you how that distortion is
oriented.

Eigenvalues are a special set of scalars associated with a linear
system of equations (i.e., a matrix equation) that are sometimes also
known as characteristic roots, characteristic values (Hoffman and Kunze
1971), proper values, or latent roots (Marcus and Minc 1988, p. 144).

The determination of the eigenvalues and eigenvectors of a system is
extremely important in physics and engineering, where it is equivalent
to matrix diagonalization and arises in such common applications as
stability analysis, the physics of rotating bodies, and small oscillations of
vibrating systems, to name only a few. Each eigenvalue is paired with a
corresponding so-called eigenvector (or, in general, a corresponding right
eigenvector and a corresponding left eigenvector; there is no analogous
distinction between left and right for eigenvalues).

The decomposition of a square matrix into eigenvalues and
eigenvectors is known as eigendecomposition, and the fact that this
decomposition is always possible as long as the matrix formed from the
eigenvectors is square and invertible is known as the "eigendecomposition
theorem".
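The theorem can be checked numerically. The sketch below uses NumPy and a hypothetical 2×2 matrix (the matrix, variable names, and library choice are illustrative assumptions, not part of the assignment):

```python
import numpy as np

# A hypothetical 2x2 matrix chosen for illustration.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])

# np.linalg.eig returns the eigenvalues w and a matrix V whose
# columns are the corresponding eigenvectors.
w, V = np.linalg.eig(A)

# Eigendecomposition: A = V diag(w) V^{-1}. It works here because V
# (the matrix formed from the eigenvectors) is square and invertible.
A_reconstructed = V @ np.diag(w) @ np.linalg.inv(V)
print(np.allclose(A, A_reconstructed))  # True
```

Note that the decomposition fails only when the eigenvectors do not span the space, i.e. when V is not invertible.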
Eigenvalues
1. Each of a set of values of a parameter for which a differential equation has a non-
zero solution (an eigenfunction) under given conditions.
2. Any number λ such that the given matrix minus λ times the identity matrix has
determinant zero.
3. If applying a linear transformation T to a non-zero vector v only scales v by a
scalar value λ, then λ is called an eigenvalue of T.
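Definition 2 can be verified directly. The following sketch uses NumPy and a hypothetical symmetric matrix whose eigenvalues happen to be 1 and 3 (all names and values here are illustrative assumptions):

```python
import numpy as np

# A hypothetical symmetric matrix with known eigenvalues 1 and 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Definition 2: lambda is an eigenvalue exactly when det(A - lambda*I) = 0.
dets = [np.linalg.det(A - lam * np.eye(2)) for lam in (1.0, 3.0)]
print(dets)  # both values are (numerically) zero

# A number that is not an eigenvalue gives a non-zero determinant:
print(np.linalg.det(A - 2.0 * np.eye(2)))
```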

Eigenvectors
1. A vector which, when operated on by a given operator, gives a scalar multiple of itself is
known as an eigenvector.
2. In linear algebra, an eigenvector or characteristic vector of a linear transformation is a
non-zero vector that only changes by an overall scale when that linear transformation is
applied to it. More formally, if T is a linear transformation from a vector space V over a field
F into itself and v is a vector in V that is not the zero vector, then v is an eigenvector of T if
T(v) is a scalar multiple of v.
3. An eigenvector v of a linear transformation T is a non-zero vector that, when T is
applied to it, does not change direction.
Definition (in Matrices):
First, let us think about what a square matrix does to a vector. Consider a
matrix A ∈ R^(n×n) and let us see what the matrix A, acting on a vector x,
does to this vector. By action, we mean multiplication, i.e. we obtain a new
vector y = Ax.

The matrix acting on a vector ‘x’ does two things to the vector ‘x’.
1. It scales the vector.
2. It rotates the vector.
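Both effects can be seen on a generic (non-eigen) vector. This sketch uses NumPy and a hypothetical matrix, chosen only for illustration:

```python
import numpy as np

# A hypothetical matrix and a generic (non-eigen) vector x.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
x = np.array([1.0, 0.0])

y = A @ x  # the "action" of A on x
print(y)   # [2. 1.]: the direction has changed (a rotation) ...
print(np.linalg.norm(y) / np.linalg.norm(x))  # ... and so has the length (a scaling)
```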

However, for any matrix A, there are some favored vectors/directions.
When the matrix acts on these favored vectors, the action essentially
results in just scaling the vector. There is no rotation. These favored
vectors are precisely the eigenvectors, and the amount by which each of
these favored vectors stretches or compresses is the eigenvalue.
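A favored direction of the hypothetical matrix used above is (1, 1): it is an eigenvector with eigenvalue 3, so the action is pure scaling. As a sketch (the matrix and names are assumptions for illustration):

```python
import numpy as np

# v is an eigenvector of this hypothetical matrix A, with eigenvalue 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
v = np.array([1.0, 1.0])

print(A @ v)    # [3. 3.]: purely scaled by 3, no rotation
print(3.0 * v)  # the same vector
```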
So why are these eigenvectors and eigenvalues important? Consider the
eigenvector corresponding to the maximum (absolute) eigenvalue. If we
take a vector along this eigenvector, then the action of the matrix is
maximal: no other vector, when acted on by this matrix, is stretched
as much as this eigenvector.
Hence, if a vector lies "close" to this eigen direction, then the
"effect" of the action of this matrix will be "large", i.e. the matrix
produces a "large" response for this vector. The effect of the action is
high for large (absolute) eigenvalues and low for small (absolute)
eigenvalues. Hence, the directions/vectors along which this action is high
are called the principal directions or principal eigenvectors. The
corresponding eigenvalues are called the principal values.
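This maximal-stretch property can be probed numerically by measuring how much the matrix stretches unit vectors in every direction. The sketch below uses NumPy; the matrix and the sampling scheme are illustrative assumptions:

```python
import numpy as np

# A hypothetical symmetric matrix; its largest eigenvalue is 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
w, V = np.linalg.eig(A)
principal = V[:, np.argmax(np.abs(w))]  # the principal eigenvector

# Unit vectors at 0.1-degree steps over a half turn.
angles = np.linspace(0.0, np.pi, 1801)
units = np.stack([np.cos(angles), np.sin(angles)], axis=1)

# |A u| for each unit vector u; each row of units @ A.T is A @ u.
stretch = np.linalg.norm(units @ A.T, axis=1)
best = units[np.argmax(stretch)]

print(np.max(stretch))  # approaches the largest |eigenvalue|
print(best, principal)  # same direction, up to sign
```

The maximum stretch occurs along the principal eigen direction, and its value approaches the largest absolute eigenvalue, as the text describes.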
