
Date: 22-1-21

MATRICES

Q1. What does diagonalization of a matrix mean? Also explain by constructing an example.

A. Diagonalization is the process of transforming a matrix into diagonal form.


It is the process of taking a square matrix and converting it into a special
type of matrix, called a diagonal matrix, that shares the same fundamental
properties as the underlying matrix. A diagonal matrix is a matrix in which
non-zero values appear only on its main diagonal; in other words, every
entry off the diagonal is 0. Diagonal matrices display the eigenvalues
of a matrix in a particularly clear way. Matrix diagonalization is equivalent to
transforming the underlying system of equations into a special set of
coordinate axes in which the matrix takes this canonical form. Diagonalizing
a matrix is also equivalent to finding the matrix's eigenvalues, which turn
out to be precisely the entries of the diagonalized matrix. Similarly,
the eigenvectors make up the new set of axes corresponding to the diagonal
matrix.
The remarkable relationship between a diagonalized matrix, its eigenvalues,
and its eigenvectors follows from the identity that a diagonalizable square
matrix A can be decomposed into the special form
A = PDP⁻¹
An n-square matrix A is similar to a diagonal matrix D if and only if A has
n linearly independent eigenvectors. In this case, the diagonal elements of D
are the corresponding eigenvalues and D = P⁻¹AP, where P is the matrix
whose columns are the eigenvectors. By this diagonal factorization we have
A = PDP⁻¹, where P is a matrix composed of the eigenvectors of A, D is
the diagonal matrix constructed from the corresponding eigenvalues, and
P⁻¹ is the matrix inverse of P.
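As a constructed example, here is a short NumPy sketch that checks the factorization numerically; the 2 × 2 matrix is a hypothetical choice, and any diagonalizable matrix would work:

import numpy as np

# Hypothetical diagonalizable matrix for illustration.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Columns of P are the eigenvectors; the entries of D are the eigenvalues.
eigenvalues, P = np.linalg.eig(A)
D = np.diag(eigenvalues)

# Verify the diagonal factorization A = PDP⁻¹.
print(np.allclose(A, P @ D @ np.linalg.inv(P)))   # True

# Equivalently, D = P⁻¹AP recovers the diagonal form.
print(np.round(np.linalg.inv(P) @ A @ P, 10))     # diagonal matrix of eigenvalues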
Q2. Answer the following questions.

i. What are Eigenvalues and Eigenvectors?

A. In linear algebra, an eigenvector or characteristic vector of a linear
transformation is a nonzero vector that changes by a scalar factor when that
linear transformation is applied to it. The corresponding eigenvalue, often
denoted by λ, is the factor by which the eigenvector is scaled.
If A is an n × n matrix over ℝ, then a scalar λ ∈ ℝ is called an eigenvalue of
A if there exists a nonzero column vector v ∈ ℝⁿ such that Av = λv. In this
case, v is called an eigenvector of A corresponding to the eigenvalue λ.
If T is a linear transformation from a vector space V over a field F into
itself and v is a nonzero vector in V, then v is an eigenvector of T if T(v) is a
scalar multiple of v. This can be written as
T(v) = λv
where λ is a scalar in F, known as the eigenvalue, characteristic value,
or characteristic root associated with v.

There is a direct correspondence between n-by-n square matrices and linear


transformations from an n-dimensional vector space into itself, given
any basis of the vector space. Hence, in a finite-dimensional vector space, it
is equivalent to define eigenvalues and eigenvectors using either the
language of matrices, or the language of linear transformations. An
eigenvector does not change direction in a transformation.
If V is finite-dimensional, the above equation is equivalent to
Au = u
where A is the matrix representation of T and u is the coordinate vector of v.
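As a quick numerical sketch of this defining relation (the matrix below is again a hypothetical choice):

import numpy as np

# Hypothetical matrix; np.linalg.eig returns the eigenvectors as columns.
A = np.array([[2.0, 0.0],
              [1.0, 3.0]])
eigenvalues, eigenvectors = np.linalg.eig(A)

# For each pair (λ, v), Av should equal λv.
for lam, v in zip(eigenvalues, eigenvectors.T):
    print(lam, np.allclose(A @ v, lam * v))   # True for every pair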
ii. What are their Properties?

A. Some of the Properties are:

Left Eigenvectors: -
The first property concerns the eigenvalues of the transpose of a matrix. If
A is an n × n square matrix, then a scalar λ is an eigenvalue of A if and only
if it is an eigenvalue of Aᵀ.

Eigenvalues of a Triangular Matrix: -


The diagonal elements of a triangular matrix are equal to its eigenvalues. If
A is an n × n triangular matrix, then each of the diagonal entries of A is an
eigenvalue of A.

Zero Eigenvalues and Invertibility: -


Eigenvalues allow us to tell whether a matrix is invertible. If A is an n × n
matrix, then A is invertible if and only if it has no zero eigenvalues.

Eigenvalues and Eigenvectors of the Inverse Matrix: -


The eigenvalues of the inverse are easy to compute. If A is an n × n
invertible matrix, then λ is an eigenvalue of A corresponding to an
eigenvector X if and only if λ⁻¹ is an eigenvalue of A⁻¹ corresponding to the
same eigenvector X.

Conjugate Pairs: -
An interesting fact is that complex eigenvalues of real matrices always
come in conjugate pairs. Let A be an n × n matrix having real entries. A
complex number λ is an eigenvalue of A corresponding to the eigenvector X if
and only if its complex conjugate λ̄ is an eigenvalue corresponding to
the conjugate vector X̄.

Scalar Multiples: -
If we multiply a matrix by a scalar, then all its eigenvalues are multiplied
by the same scalar. If A is an n × n matrix, a ≠ 0 is a scalar, and λ is an
eigenvalue of A corresponding to the eigenvector X, then aλ is an
eigenvalue of aA corresponding to the same eigenvector X.

Matrix Powers: -
Let n be a natural number. The nth power of a square matrix A, written Aⁿ,
is the product of n copies of A; in other words, it is obtained by repeated
matrix multiplication of A with itself. It is easy to derive the eigenvalues
of Aⁿ from those of A. Let A be an n × n matrix. If λ is an eigenvalue of A
corresponding to the eigenvector X, then λⁿ is an eigenvalue of Aⁿ
corresponding to the same eigenvector.
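A single NumPy sketch (with a hypothetical invertible matrix) can confirm several of these properties at once:

import numpy as np

# Hypothetical invertible matrix used only to illustrate the properties.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
eigs = np.sort(np.linalg.eigvals(A))

# Left eigenvectors: A and Aᵀ share the same eigenvalues.
print(np.allclose(eigs, np.sort(np.linalg.eigvals(A.T))))   # True

# Inverse matrix: the eigenvalues of A⁻¹ are the reciprocals 1/λ.
print(np.allclose(np.sort(1 / eigs),
                  np.sort(np.linalg.eigvals(np.linalg.inv(A)))))   # True

# Scalar multiples: the eigenvalues of aA are aλ.
a = 5.0
print(np.allclose(a * eigs, np.sort(np.linalg.eigvals(a * A))))   # True

# Matrix powers: the eigenvalues of Aⁿ are λⁿ (here n = 3).
print(np.allclose(eigs ** 3,
                  np.sort(np.linalg.eigvals(np.linalg.matrix_power(A, 3)))))   # True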

iii. How are they useful in linear algebra?

A. There are multiple uses of eigenvalues and eigenvectors: -

1. Eigenvalues and eigenvectors are important in linear differential
equations, where you want to find a rate of change or to maintain
relationships between two variables.

2. We can represent a large set of information in a matrix, and performing
computations on a large matrix is a very slow process. One of the key
methodologies for improving efficiency in computationally intensive
tasks is to reduce the dimensions after ensuring that most of the key
information is maintained. Hence, eigenvalues and eigenvectors are
used to capture the key information stored in a large matrix. This
technique can also be used to improve the performance of data-churning
components.

3. Principal component analysis (PCA) is one of the key strategies used
to reduce the dimension space without losing valuable information. The
core of PCA is built on the concept of eigenvalues and eigenvectors:
it revolves around computing the eigenvectors and eigenvalues of the
covariance matrix of the features (see the sketch after this list).

4. Additionally, eigenvectors and eigenvalues are used in facial
recognition techniques such as Eigenfaces.

5. They are used to reduce the dimension space. The technique of
eigenvectors and eigenvalues is used to compress data. As mentioned
above, many algorithms such as PCA rely on eigenvalues and
eigenvectors to reduce the dimensions.

6. Eigenvalues are also used in regularization, where they can help
prevent overfitting.

7. Occasionally we gather data that contains a large amount of noise,
and finding important or meaningful patterns within it can be
extremely difficult. Eigenvectors and eigenvalues can be used to
construct spectral clustering. They are also used in singular value
decomposition.

8. We can also use eigenvectors to rank items in a dataset. They are
heavily used in search engines and calculus.

9. Lastly, in non-linear motion dynamics, eigenvalues and eigenvectors
can be used to help us understand the data better, as they can
transform and represent data as manageable sets.
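As a minimal sketch of the PCA idea from point 3, using randomly generated data purely for illustration:

import numpy as np

rng = np.random.default_rng(0)
# Hypothetical data set: 200 samples with 5 features each.
X = rng.normal(size=(200, 5))

# Center the data, then form the covariance matrix of the features.
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)

# Eigendecomposition of the (symmetric) covariance matrix.
eigenvalues, eigenvectors = np.linalg.eigh(cov)

# Keep the eigenvectors with the two largest eigenvalues (the principal components).
top2 = eigenvectors[:, np.argsort(eigenvalues)[::-1][:2]]

# Project the data onto the reduced 2-dimensional space.
X_reduced = Xc @ top2
print(X_reduced.shape)   # (200, 2)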

iv. What is the Cayley–Hamilton Theorem?

A. The Cayley–Hamilton theorem states that substituting the matrix A for x in
its characteristic polynomial, p(x) = det(xIₙ − A), results in the zero matrix:
p(A) = 0
It states that an n × n matrix A is annihilated by its characteristic polynomial
det(tI − A), which is a monic polynomial of degree n. The powers of A, found by
substituting powers of x, are defined by repeated matrix multiplication; the
constant term of p(x) supplies a multiple of the power A⁰, which is defined
as the identity matrix.
The theorem allows Aⁿ to be expressed as a linear combination of the lower
matrix powers of A. When the ring is a field, the Cayley–Hamilton theorem is
equivalent to the statement that the minimal polynomial of a square matrix
divides its characteristic polynomial.
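As a numerical sketch with a hypothetical 2 × 2 matrix, for which the characteristic polynomial is p(x) = x² − tr(A)x + det(A):

import numpy as np

# Hypothetical matrix; for the 2 × 2 case, p(x) = x² − tr(A)x + det(A).
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Substitute A into its own characteristic polynomial.
p_of_A = A @ A - np.trace(A) * A + np.linalg.det(A) * np.eye(2)

# Cayley–Hamilton: the result is the zero matrix.
print(np.allclose(p_of_A, np.zeros((2, 2))))   # True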
v. Give an example of finding eigenvalues and eigenvectors for a matrix
of any order.
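A. As one possible worked example (any matrix would do; this 2 × 2 choice
keeps the arithmetic short), take

A = [4 1]
    [2 3]

The characteristic polynomial is det(A − λI) = (4 − λ)(3 − λ) − 2
= λ² − 7λ + 10 = (λ − 5)(λ − 2), so the eigenvalues are λ₁ = 5 and λ₂ = 2.
For λ₁ = 5, solving (A − 5I)X = 0 gives −x₁ + x₂ = 0, so X₁ = (1, 1)ᵀ is an
eigenvector.
For λ₂ = 2, solving (A − 2I)X = 0 gives 2x₁ + x₂ = 0, so X₂ = (1, −2)ᵀ is an
eigenvector.
Check: AX₁ = (5, 5)ᵀ = 5X₁ and AX₂ = (2, −4)ᵀ = 2X₂, confirming Av = λv.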
