
# Revision: Eigenvalues and Eigenvectors

Let A be a square matrix of order n.

- A nonzero column vector u in R^n is called an eigenvector of A if Au = λu for some scalar λ.
- The scalar λ is called an eigenvalue of A, and u is said to be an eigenvector of A associated with the eigenvalue λ.
- λ is an eigenvalue of A if and only if det(λI − A) = 0.
- The polynomial det(λI − A) is called the characteristic polynomial of A.
- The solution space E_λ of (λI − A)u = 0 is called the eigenspace of A associated with the eigenvalue λ.
- Every nonzero vector in E_λ is an eigenvector of A associated with the eigenvalue λ.
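As a sanity check on these definitions, the following sketch (using NumPy; the matrix is an illustrative choice, not from the notes) computes eigenpairs of a small matrix and verifies both Au = λu and det(λI − A) = 0:

```python
import numpy as np

# Hypothetical 2x2 example (not from the notes): a symmetric matrix
# whose eigenvalues turn out to be 3 and 1.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose columns are
# the associated eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

for lam, u in zip(eigenvalues, eigenvectors.T):
    # Au = lambda * u for each eigenpair.
    assert np.allclose(A @ u, lam * u)
    # lambda is a root of the characteristic polynomial det(lambda*I - A).
    assert np.isclose(np.linalg.det(lam * np.eye(2) - A), 0.0)
```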
# Revision: Diagonalization
A square matrix A is called diagonalizable if there exists an invertible matrix P such that P⁻¹AP is a diagonal matrix. The matrix P is said to diagonalize A.

An n × n square matrix A is diagonalizable if and only if A has n linearly independent eigenvectors.
If u_1, u_2, ..., u_n are linearly independent eigenvectors of A, then P = ( u_1 u_2 ⋯ u_n ) diagonalizes A. Suppose Au_i = λ_i u_i for each i. Then

$$P^{-1}AP = \begin{pmatrix} \lambda_1 & & 0 \\ & \ddots & \\ 0 & & \lambda_n \end{pmatrix}.$$
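This can be checked numerically: assemble P from eigenvector columns and confirm that P⁻¹AP is the diagonal matrix of eigenvalues (the matrix A below is an arbitrary illustrative choice):

```python
import numpy as np

# Illustrative matrix (not from the notes) with distinct eigenvalues 5 and 2,
# so its two eigenvectors are linearly independent.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigenvalues, P = np.linalg.eig(A)   # columns of P are u_1, ..., u_n

D = np.linalg.inv(P) @ A @ P        # P^{-1} A P

# D is diagonal, with the eigenvalues lambda_i on the diagonal.
assert np.allclose(D, np.diag(eigenvalues))
```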
# Matrix Multiplication in Blocks
Suppose A is an r × m matrix, B is an r × n matrix, C is an s × m matrix, D is an s × n matrix, E is an m × t matrix, F is an m × u matrix, G is an n × t matrix, H is an n × u matrix. Then

$$\begin{pmatrix} A & B \\ C & D \end{pmatrix}\begin{pmatrix} E & F \\ G & H \end{pmatrix} = \begin{pmatrix} AE + BG & AF + BH \\ CE + DG & CF + DH \end{pmatrix}.$$

Warning: To do such a matrix multiplication, you need to make sure that the sub-matrices can be multiplied with each other.
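The identity can be verified numerically; the sizes r, m, n, s, t, u below are arbitrary compatible choices, and np.block assembles the partitioned matrices:

```python
import numpy as np

# Arbitrary compatible sizes (r x m, r x n, etc., as in the statement above).
rng = np.random.default_rng(0)
r, m, n, s, t, u = 2, 3, 4, 2, 3, 2

A = rng.random((r, m)); B = rng.random((r, n))
C = rng.random((s, m)); D = rng.random((s, n))
E = rng.random((m, t)); F = rng.random((m, u))
G = rng.random((n, t)); H = rng.random((n, u))

# Multiply the full partitioned matrices ...
full = np.block([[A, B], [C, D]]) @ np.block([[E, F], [G, H]])
# ... and compare with the block-by-block formula.
blockwise = np.block([[A @ E + B @ G, A @ F + B @ H],
                      [C @ E + D @ G, C @ F + D @ H]])
assert np.allclose(full, blockwise)
```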
# A Theorem (to replace Theorem 9.3.10)

Let T be a linear operator on a finite dimensional space V and let C be an ordered basis for V. Then a square matrix D is similar to [T]_C if and only if D = [T]_B for some ordered basis B for V.

Proof

(⇐) It follows from Discussion 9.3.8.
(⇒) (The proof is the same as the last paragraph of the proof of Theorem 11.2.2 on p. 109.)

Let C = { u_1, u_2, ..., u_n } where dim(V) = n. Suppose D = P⁻¹[T]_C P where P = (p_ij) is an n × n invertible matrix. Define B = { v_1, v_2, ..., v_n } such that

$$v_j = p_{1j}u_1 + p_{2j}u_2 + \cdots + p_{nj}u_n \quad \text{for } j = 1, 2, \ldots, n.$$

Since P is invertible, B is indeed a basis for V. Using B as an ordered basis for V, we have [I_V]_{C,B} = P. Then by Discussion 9.3.8, [T]_B = P⁻¹[T]_C P = D.
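A small numerical illustration of the (⇒) direction (the matrices here are hypothetical, not from the notes): any D = P⁻¹ T_C P is the matrix of the same operator with respect to the basis built from the columns of P, so in particular it shares the characteristic polynomial and eigenvalues of T_C:

```python
import numpy as np

# Hypothetical data: T_C stands for [T]_C and P for the change-of-basis
# matrix [I_V]_{C,B}; neither comes from the notes.
T_C = np.array([[1.0, 2.0],
                [0.0, 3.0]])
P = np.array([[1.0, 1.0],
              [1.0, 2.0]])              # invertible (det = 1), so B is a basis

D = np.linalg.inv(P) @ T_C @ P          # D = [T]_B by the theorem

# Similar matrices share the characteristic polynomial, hence the eigenvalues.
assert np.allclose(np.sort(np.linalg.eigvals(D)),
                   np.sort(np.linalg.eigvals(T_C)))
```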

# Triangular Forms

Let T be a linear operator on a finite dimensional space V over F. Suppose the characteristic polynomial of T can be factorized into linear factors over F.

Take any basis C for V. Since the characteristic polynomial of the matrix [T]_C can be factorized into linear factors over F, we can find an invertible matrix P such that P⁻¹[T]_C P is an upper triangular matrix.

By the result we have just proved, there exists an ordered basis B for V such that [T]_B is an upper triangular matrix.
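A concrete instance of this triangular form over R (the matrix and the basis extension are illustrative choices): take the first column of P to be an eigenvector of A and the second to be any vector completing it to a basis; P⁻¹AP then comes out upper triangular.

```python
import numpy as np

# Illustrative matrix with characteristic polynomial (x - 6)(x - 1),
# which factorizes into linear factors over R.
A = np.array([[5.0, 4.0],
              [1.0, 2.0]])

# First column: an eigenvector of A for eigenvalue 6; second column: any
# vector extending it to a basis of R^2 (so P is invertible).
P = np.array([[4.0, 0.0],
              [1.0, 1.0]])

T = np.linalg.inv(P) @ A @ P        # an upper triangular form of A

assert np.allclose(T, np.triu(T))           # entries below the diagonal vanish
assert np.allclose(np.diag(T), [6.0, 1.0])  # eigenvalues on the diagonal
```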