1. Introduction
If A is a $3 \times 3$ matrix and x is a vector in $\mathbb{R}^3$, then there is usually no general geometric relationship between the vector x and the vector Ax, see figure (a). However, there are often certain nonzero vectors x such that x and Ax are scalar multiples of one another, that is, $Ax = \lambda x$ for some real number $\lambda$, see figure (b). Such vectors arise naturally in the study of vibrations, electrical systems, genetics, chemical reactions, quantum mechanics, mechanical stress, economics, and geometry.

Let A be an $n \times n$ square matrix and consider the equation $Ax = \lambda x$. A value of $\lambda$ for which $Ax = \lambda x$ has a non-trivial solution $x \ne 0$ is called an eigenvalue (or characteristic value) of A, and the corresponding solution $x \ne 0$ is an eigenvector (or characteristic vector). The set of all eigenvalues of A is known as the spectrum of A.
For example, the currents $i_1, i_2, i_3$ in a coupled LC circuit satisfy
\[
\begin{pmatrix} 1 & -1 & 0 \\ -1 & 2 & -1 \\ 0 & -1 & 1 \end{pmatrix}
\begin{pmatrix} i_1 \\ i_2 \\ i_3 \end{pmatrix}
= -LC \begin{pmatrix} i_1'' \\ i_2'' \\ i_3'' \end{pmatrix}.
\]
Look for solutions which are sinusoidal and all have the same frequency, that is,
\[
i_1 = I_1 \sin(wt + \phi_1), \quad i_2 = I_2 \sin(wt + \phi_2), \quad i_3 = I_3 \sin(wt + \phi_3).
\]
Then $i_1'' = -w^2 i_1$, $i_2'' = -w^2 i_2$, $i_3'' = -w^2 i_3$, and the problem becomes
\[
\begin{pmatrix} 1 & -1 & 0 \\ -1 & 2 & -1 \\ 0 & -1 & 1 \end{pmatrix}
\begin{pmatrix} i_1 \\ i_2 \\ i_3 \end{pmatrix}
= w^2 LC \begin{pmatrix} i_1 \\ i_2 \\ i_3 \end{pmatrix},
\]
which is an eigenvalue problem with $\lambda = w^2 LC$.
\[
Ax = \lambda x \iff Ax = \lambda I x \iff Ax - \lambda x = 0 \iff (A - \lambda I)x = 0.
\]
Example
Find the eigenvalues of $A = \begin{pmatrix} 1 & 4 \\ 2 & 3 \end{pmatrix}$ and the corresponding eigenvectors.
Solution:
\[
A - \lambda I = \begin{pmatrix} 1 & 4 \\ 2 & 3 \end{pmatrix} - \lambda \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}
= \begin{pmatrix} 1-\lambda & 4 \\ 2 & 3-\lambda \end{pmatrix}.
\]
\[
\det(A - \lambda I) = \begin{vmatrix} 1-\lambda & 4 \\ 2 & 3-\lambda \end{vmatrix}
= (1-\lambda)(3-\lambda) - 8 = \lambda^2 - 4\lambda - 5 = (\lambda - 5)(\lambda + 1).
\]
Hence the eigenvalues are $\lambda = 5$ and $\lambda = -1$.
To find the corresponding eigenvectors we solve:
When $\lambda = 5$:
\[
\begin{pmatrix} -4 & 4 \\ 2 & -2 \end{pmatrix}\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}, \quad
\begin{pmatrix} -4 & 4 & 0 \\ 2 & -2 & 0 \end{pmatrix} \sim \begin{pmatrix} 1 & -1 & 0 \\ 0 & 0 & 0 \end{pmatrix}
\implies x_1 = x_2.
\]
Let $x_2 = k \ne 0$; then the corresponding eigenvectors of A for $\lambda = 5$ are
\[
\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = k \begin{pmatrix} 1 \\ 1 \end{pmatrix}, \quad k \ne 0,
\]
and one of the eigenvectors, for $k = 1$, is $\begin{pmatrix} 1 \\ 1 \end{pmatrix}$.
When $\lambda = -1$:
\[
\begin{pmatrix} 2 & 4 \\ 2 & 4 \end{pmatrix}\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}, \quad
\begin{pmatrix} 2 & 4 & 0 \\ 2 & 4 & 0 \end{pmatrix} \sim \begin{pmatrix} 1 & 2 & 0 \\ 0 & 0 & 0 \end{pmatrix}
\implies x_1 = -2x_2, \quad
\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = k \begin{pmatrix} -2 \\ 1 \end{pmatrix}, \quad k \ne 0.
\]
Let $k = 1$; then $\begin{pmatrix} -2 \\ 1 \end{pmatrix}$ is one of the corresponding eigenvectors of A for $\lambda = -1$.
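The hand computation above can be checked numerically. This is an illustrative NumPy sketch, not part of the notes; note that `numpy.linalg.eig` returns unit-length eigenvector columns, which may differ from the hand-chosen $k = 1$ vectors by a scalar factor.

```python
import numpy as np

# The 2x2 matrix from the example above.
A = np.array([[1.0, 4.0],
              [2.0, 3.0]])
eigenvalues, eigenvectors = np.linalg.eig(A)
print(np.sort(eigenvalues))   # approximately [-1.  5.]
# Each returned pair must satisfy A v = lambda v.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)
```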
Example
Find the eigenvalues of $A = \begin{pmatrix} 1 & -1 & 0 \\ -1 & 2 & -1 \\ 0 & -1 & 1 \end{pmatrix}$ and the corresponding eigenvectors.
Solution:
\[
\det(A - \lambda I) = \begin{vmatrix} 1-\lambda & -1 & 0 \\ -1 & 2-\lambda & -1 \\ 0 & -1 & 1-\lambda \end{vmatrix}
= (1-\lambda)\big[(2-\lambda)(1-\lambda) - 1\big] - (1-\lambda)
= (1-\lambda)(\lambda^2 - 3\lambda) = -\lambda(\lambda - 1)(\lambda - 3).
\]
Hence the eigenvalues are $\lambda = 0$, $\lambda = 1$ and $\lambda = 3$.
When $\lambda = 0$:
\[
\begin{pmatrix} 1 & -1 & 0 \\ -1 & 2 & -1 \\ 0 & -1 & 1 \end{pmatrix}\begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}, \quad
\begin{pmatrix} 1 & -1 & 0 & 0 \\ -1 & 2 & -1 & 0 \\ 0 & -1 & 1 & 0 \end{pmatrix}
\sim \begin{pmatrix} 1 & -1 & 0 & 0 \\ 0 & 1 & -1 & 0 \\ 0 & -1 & 1 & 0 \end{pmatrix}
\sim \begin{pmatrix} 1 & -1 & 0 & 0 \\ 0 & 1 & -1 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix}.
\]
So $x_2 = x_3$ and $x_1 = x_2$, that is, $x_1 = x_2 = x_3$, and
\[
\begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = k \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}, \quad k \ne 0.
\]
Let $k = 1$; then one of the eigenvectors for $\lambda = 0$ is $(1, 1, 1)^t$.
When $\lambda = 1$:
\[
\begin{pmatrix} 0 & -1 & 0 \\ -1 & 1 & -1 \\ 0 & -1 & 0 \end{pmatrix}\begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}, \quad
\begin{pmatrix} 0 & -1 & 0 & 0 \\ -1 & 1 & -1 & 0 \\ 0 & -1 & 0 & 0 \end{pmatrix}
\sim \begin{pmatrix} 1 & -1 & 1 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & -1 & 0 & 0 \end{pmatrix}
\sim \begin{pmatrix} 1 & -1 & 1 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix}.
\]
So $x_2 = 0$ and $x_1 - x_2 + x_3 = 0 \implies x_3 = -x_1$, and
\[
\begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = k \begin{pmatrix} 1 \\ 0 \\ -1 \end{pmatrix}, \quad k \ne 0.
\]
One of the eigenvectors for $\lambda = 1$ is $(1, 0, -1)^t$.
When $\lambda = 3$:
\[
\begin{pmatrix} -2 & -1 & 0 \\ -1 & -1 & -1 \\ 0 & -1 & -2 \end{pmatrix}\begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}, \quad
\begin{pmatrix} -2 & -1 & 0 & 0 \\ -1 & -1 & -1 & 0 \\ 0 & -1 & -2 & 0 \end{pmatrix}
\sim \begin{pmatrix} 1 & 1 & 1 & 0 \\ 2 & 1 & 0 & 0 \\ 0 & 1 & 2 & 0 \end{pmatrix}
\sim \begin{pmatrix} 1 & 1 & 1 & 0 \\ 0 & 1 & 2 & 0 \\ 0 & 1 & 2 & 0 \end{pmatrix}
\sim \begin{pmatrix} 1 & 1 & 1 & 0 \\ 0 & 1 & 2 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix}.
\]
So $x_2 = -2x_3$ and $x_1 + x_2 + x_3 = 0 \implies x_1 = x_3$, and
\[
\begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = k \begin{pmatrix} 1 \\ -2 \\ 1 \end{pmatrix}, \quad k \ne 0.
\]
One of the eigenvectors for $\lambda = 3$ is $(1, -2, 1)^t$.
Example
Find the eigenvalues of $A = \begin{pmatrix} 1 & -1 \\ 4 & 1 \end{pmatrix}$ and the corresponding eigenvectors.
Solution:
\[
\det(A - \lambda I) = \begin{vmatrix} 1-\lambda & -1 \\ 4 & 1-\lambda \end{vmatrix}
= (1-\lambda)^2 + 4 = \lambda^2 - 2\lambda + 5 = 0 \implies \lambda = 1 \pm 2i.
\]
For $\lambda = 1 + 2i$, we have
\[
\begin{pmatrix} -2i & -1 \\ 4 & -2i \end{pmatrix}\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}, \quad
\begin{pmatrix} -2i & -1 & 0 \\ 4 & -2i & 0 \end{pmatrix}
\xrightarrow{-\frac{1}{2i} r_1}
\begin{pmatrix} 1 & -\frac{i}{2} & 0 \\ 4 & -2i & 0 \end{pmatrix}
\xrightarrow{r_2 - 4 r_1}
\begin{pmatrix} 1 & -\frac{i}{2} & 0 \\ 0 & 0 & 0 \end{pmatrix},
\]
so $x_1 = \frac{i}{2} x_2$ and
\[
\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = k \begin{pmatrix} \frac{i}{2} \\ 1 \end{pmatrix}, \quad k \ne 0.
\]
Notice that $\dfrac{1}{2i} = \dfrac{1}{2i}\cdot\dfrac{i}{i} = \dfrac{i}{2i^2} = -\dfrac{i}{2}$.

An eigenvector for $\lambda = 1 + 2i$ is $\begin{pmatrix} \frac{i}{2} \\ 1 \end{pmatrix}$.
For $\lambda = 1 - 2i$, we have
\[
\begin{pmatrix} 2i & -1 \\ 4 & 2i \end{pmatrix}\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}, \quad
\begin{pmatrix} 2i & -1 & 0 \\ 4 & 2i & 0 \end{pmatrix}
\xrightarrow{\frac{1}{2i} r_1}
\begin{pmatrix} 1 & \frac{i}{2} & 0 \\ 4 & 2i & 0 \end{pmatrix}
\xrightarrow{r_2 - 4 r_1}
\begin{pmatrix} 1 & \frac{i}{2} & 0 \\ 0 & 0 & 0 \end{pmatrix},
\]
so $x_1 = -\frac{i}{2} x_2$ and
\[
\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = k \begin{pmatrix} -\frac{i}{2} \\ 1 \end{pmatrix}, \quad k \ne 0.
\]
Notice that $-\dfrac{1}{2i} = \dfrac{i}{2}$.

An eigenvector for $\lambda = 1 - 2i$ is $\begin{pmatrix} -\frac{i}{2} \\ 1 \end{pmatrix}$.
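The complex eigenpairs can also be verified numerically. This NumPy check is an illustration added here, not part of the notes; the eigenvector NumPy returns is a scalar multiple of the hand-computed one.

```python
import numpy as np

# The real matrix above has no real eigenvalues; NumPy returns the
# complex-conjugate pair 1 + 2i and 1 - 2i.
A = np.array([[1.0, -1.0],
              [4.0,  1.0]])
eigenvalues, eigenvectors = np.linalg.eig(A)
# The hand-computed eigenvector (i/2, 1) for lambda = 1 + 2i:
v = np.array([0.5j, 1.0])
assert np.allclose(A @ v, (1 + 2j) * v)
# And its conjugate (-i/2, 1) for lambda = 1 - 2i:
assert np.allclose(A @ np.conj(v), (1 - 2j) * np.conj(v))
```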
Example
Find the eigenvalues of $A = \begin{pmatrix} 5 & 3 \\ -3 & -1 \end{pmatrix}$ and the corresponding eigenvectors.
Solution:
\[
\det(A - \lambda I) = \begin{vmatrix} 5-\lambda & 3 \\ -3 & -1-\lambda \end{vmatrix}
= (5-\lambda)(-1-\lambda) + 9 = \lambda^2 - 4\lambda + 4 = (\lambda - 2)^2,
\]
so $\lambda = 2$ is the only eigenvalue.
\[
\begin{pmatrix} 3 & 3 \\ -3 & -3 \end{pmatrix}\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}
\implies x_1 = -x_2, \quad
\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = k \begin{pmatrix} 1 \\ -1 \end{pmatrix}, \quad k \ne 0.
\]
3. Eigenvalue Results
Theorem
Eigenvectors corresponding to distinct eigenvalues are linearly independent.
Proof:
The proof is by induction on the number of distinct eigenvalues.
Any single eigenvector by itself, being nonzero, is linearly independent.
Assume any $k-1$ eigenvectors associated with $k-1$ distinct eigenvalues are linearly independent.
Suppose $v_1, \dots, v_k$ are eigenvectors associated, respectively, with $k$ distinct eigenvalues $\lambda_1, \dots, \lambda_k$.
Consider $c_1 v_1 + \cdots + c_k v_k = 0$; we claim $c_1 = \cdots = c_k = 0$.
Suppose not; then without loss of generality, say $c_1 \ne 0$. Apply $A - \lambda_1 I$ to both sides:
\[
(A - \lambda_1 I)(c_1 v_1 + \cdots + c_k v_k) = (A - \lambda_1 I)(c_1 v_1) + \cdots + (A - \lambda_1 I)(c_k v_k) = (A - \lambda_1 I)\,0 = 0.
\]
Using $A v_j = \lambda_j v_j$, this becomes
\[
c_1(\lambda_1 - \lambda_1) v_1 + c_2(\lambda_2 - \lambda_1) v_2 + \cdots + c_k(\lambda_k - \lambda_1) v_k = 0,
\]
that is,
\[
c_2(\lambda_2 - \lambda_1) v_2 + \cdots + c_k(\lambda_k - \lambda_1) v_k = 0.
\]
By the induction hypothesis $v_2, \dots, v_k$ are linearly independent, so $c_2(\lambda_2 - \lambda_1) = \cdots = c_k(\lambda_k - \lambda_1) = 0$.
Since $\lambda_2 - \lambda_1 \ne 0, \dots, \lambda_k - \lambda_1 \ne 0$, it follows that $c_2 = \cdots = c_k = 0$. Then $c_1 v_1 = 0$, and since $v_1 \ne 0$, we get $c_1 = 0$, contradicting $c_1 \ne 0$.
It is also possible for a repeated eigenvalue to give rise to more than one linearly independent eigenvector.
Example
Let $J_n$, $n = 2, 3, \dots$, be the $n \times n$ matrix with all its entries equal to 1, that is,
\[
J_n = \begin{pmatrix} 1 & \cdots & 1 \\ \vdots & & \vdots \\ 1 & \cdots & 1 \end{pmatrix}.
\]
(a) Find all eigenvalues of $J_n$.
(b) Find $n$ linearly independent eigenvectors.
Solution:
Observe that
\[
J_n \begin{pmatrix} 1 \\ \vdots \\ 1 \end{pmatrix}
= \begin{pmatrix} 1 + \cdots + 1 \\ \vdots \\ 1 + \cdots + 1 \end{pmatrix}
= n \begin{pmatrix} 1 \\ \vdots \\ 1 \end{pmatrix},
\]
so $n$ is an eigenvalue of $J_n$ and one of its eigenvectors is $(1, 1, \dots, 1)^t$.
In addition, see that
\[
J_n \begin{pmatrix} 1 \\ -1 \\ 0 \\ \vdots \\ 0 \end{pmatrix}
= \begin{pmatrix} 1 - 1 \\ \vdots \\ 1 - 1 \end{pmatrix}
= \begin{pmatrix} 0 \\ \vdots \\ 0 \end{pmatrix}
= 0 \begin{pmatrix} 1 \\ -1 \\ 0 \\ \vdots \\ 0 \end{pmatrix}.
\]
So $0$ is also an eigenvalue of $J_n$, $n = 2, 3, \dots$, and one of its eigenvectors is $(1, -1, 0, \dots, 0)^t$.
For $\lambda = 0$, $J_n - 0 I_n = J_n$. Solve
\[
\begin{pmatrix} 1 & 1 & \cdots & 1 & 0 \\ \vdots & & & \vdots & \vdots \\ 1 & 1 & \cdots & 1 & 0 \end{pmatrix}
\sim
\begin{pmatrix} 1 & 1 & \cdots & 1 & 0 \\ 0 & 0 & \cdots & 0 & 0 \\ \vdots & & & \vdots & \vdots \\ 0 & 0 & \cdots & 0 & 0 \end{pmatrix}.
\]
So $x_1 + x_2 + \cdots + x_n = 0$, and then $x_n = -x_1 - \cdots - x_{n-1}$.
Let $x_1 = a_1, x_2 = a_2, \dots, x_{n-1} = a_{n-1}$, where $a_1, a_2, \dots, a_{n-1}$ are arbitrary real numbers. Then
\[
\begin{pmatrix} x_1 \\ \vdots \\ x_{n-1} \\ x_n \end{pmatrix}
= \begin{pmatrix} a_1 \\ \vdots \\ a_{n-1} \\ -a_1 - \cdots - a_{n-1} \end{pmatrix}
= a_1 \begin{pmatrix} 1 \\ 0 \\ \vdots \\ 0 \\ -1 \end{pmatrix}
+ a_2 \begin{pmatrix} 0 \\ 1 \\ \vdots \\ 0 \\ -1 \end{pmatrix}
+ \cdots
+ a_{n-1} \begin{pmatrix} 0 \\ \vdots \\ 0 \\ 1 \\ -1 \end{pmatrix}.
\]
Observe that if
\[
a_1 \begin{pmatrix} 1 \\ 0 \\ \vdots \\ 0 \\ -1 \end{pmatrix}
+ a_2 \begin{pmatrix} 0 \\ 1 \\ \vdots \\ 0 \\ -1 \end{pmatrix}
+ \cdots
+ a_{n-1} \begin{pmatrix} 0 \\ \vdots \\ 0 \\ 1 \\ -1 \end{pmatrix}
= \begin{pmatrix} 0 \\ 0 \\ \vdots \\ 0 \\ 0 \end{pmatrix},
\]
then, comparing the first $n-1$ components, $a_1 = 0, a_2 = 0, \dots, a_{n-1} = 0$. It follows that
\[
\begin{pmatrix} 1 \\ 0 \\ \vdots \\ 0 \\ -1 \end{pmatrix},
\begin{pmatrix} 0 \\ 1 \\ \vdots \\ 0 \\ -1 \end{pmatrix},
\dots,
\begin{pmatrix} 0 \\ \vdots \\ 0 \\ 1 \\ -1 \end{pmatrix}
\]
are linearly independent.
Claim that
\[
\begin{pmatrix} 1 \\ 0 \\ \vdots \\ 0 \\ -1 \end{pmatrix},
\begin{pmatrix} 0 \\ 1 \\ \vdots \\ 0 \\ -1 \end{pmatrix},
\dots,
\begin{pmatrix} 0 \\ \vdots \\ 0 \\ 1 \\ -1 \end{pmatrix},
\begin{pmatrix} 1 \\ 1 \\ \vdots \\ 1 \\ 1 \end{pmatrix}
\]
are linearly independent. Consider
\[
x_1 \begin{pmatrix} 1 \\ 0 \\ \vdots \\ 0 \\ -1 \end{pmatrix}
+ x_2 \begin{pmatrix} 0 \\ 1 \\ \vdots \\ 0 \\ -1 \end{pmatrix}
+ \cdots
+ x_{n-1} \begin{pmatrix} 0 \\ \vdots \\ 0 \\ 1 \\ -1 \end{pmatrix}
+ x_n \begin{pmatrix} 1 \\ 1 \\ \vdots \\ 1 \\ 1 \end{pmatrix}
= 0. \tag{A}
\]
Multiply both sides of (A) by $J_n$. The first $n-1$ vectors are eigenvectors of $J_n$ for the eigenvalue $0$, and $(1, 1, \dots, 1)^t$ is an eigenvector for the eigenvalue $n$, so this gives
\[
x_1 \cdot 0 + x_2 \cdot 0 + \cdots + x_{n-1} \cdot 0 + x_n \, n \begin{pmatrix} 1 \\ 1 \\ \vdots \\ 1 \end{pmatrix} = 0,
\]
whence $x_n n = 0$ and then $x_n = 0$. Thus
\[
x_1 \begin{pmatrix} 1 \\ 0 \\ \vdots \\ 0 \\ -1 \end{pmatrix}
+ x_2 \begin{pmatrix} 0 \\ 1 \\ \vdots \\ 0 \\ -1 \end{pmatrix}
+ \cdots
+ x_{n-1} \begin{pmatrix} 0 \\ \vdots \\ 0 \\ 1 \\ -1 \end{pmatrix}
= 0.
\]
As these $n-1$ vectors are linearly independent, $x_1 = x_2 = \cdots = x_{n-1} = 0$.
It follows that $x_1 = x_2 = \cdots = x_{n-1} = x_n = 0$, and the $n$ vectors above are linearly independent.
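The spectrum derived above, eigenvalue $n$ once and eigenvalue $0$ with multiplicity $n-1$, can be checked numerically. This NumPy sketch is an illustration added here; the choice $n = 5$ is arbitrary.

```python
import numpy as np

n = 5
J = np.ones((n, n))                        # the all-ones matrix J_n
eigenvalues = np.sort(np.linalg.eigvals(J).real)
# Eigenvalue 0 appears n - 1 times, eigenvalue n once.
assert np.allclose(eigenvalues[:-1], 0.0)
assert np.isclose(eigenvalues[-1], n)

# The n eigenvectors found above: (1,...,1) for n, and e_i - e_n for 0.
ones = np.ones(n)
assert np.allclose(J @ ones, n * ones)
for i in range(n - 1):
    v = np.zeros(n)
    v[i], v[-1] = 1.0, -1.0
    assert np.allclose(J @ v, 0.0)
```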
Theorem
If A is a real symmetric matrix, then
(a) the eigenvalues are all real (but not necessarily distinct).
(b) eigenvectors corresponding to distinct eigenvalues are orthogonal.
Proof:
(a)
Suppose $Av = \lambda v$ where $v \ne 0$, and write $v = (x_1, \dots, x_n)^t$, where the entries may be complex. Let $\bar v$ denote the entrywise complex conjugate of $v$. Consider
\[
\bar v^{\,t} A v = \bar v^{\,t} (\lambda v) = \lambda \, \bar v^{\,t} v = \lambda (\bar x_1 x_1 + \cdots + \bar x_n x_n).
\]
Taking complex conjugates, and using that $A$ is real,
\[
\overline{\bar v^{\,t} A v} = v^{t} A \bar v.
\]
Observe that $v^{t} A \bar v$ is a $1 \times 1$ matrix, so it equals its own transpose:
\[
v^{t} A \bar v = (v^{t} A \bar v)^{t} = \bar v^{\,t} A^{t} v = \bar v^{\,t} A v,
\]
since $A^t = A$. As a result, $\overline{\bar v^{\,t} A v} = \bar v^{\,t} A v$, so $\bar v^{\,t} A v$ is real. We conclude that
\[
\lambda = \frac{\bar v^{\,t} A v}{\bar x_1 x_1 + \cdots + \bar x_n x_n}
\]
is real (note that $\bar x_1 x_1 + \cdots + \bar x_n x_n = |x_1|^2 + \cdots + |x_n|^2 > 0$ is real).
(b)
Suppose $A v_1 = \lambda_1 v_1$ and $A v_2 = \lambda_2 v_2$, where $\lambda_1 \ne \lambda_2$ and $v_1, v_2 \ne 0$. Then
\[
\lambda_1 v_1^{t} v_2 = (\lambda_1 v_1)^{t} v_2 = (A v_1)^{t} v_2 = v_1^{t} A^{t} v_2 = v_1^{t} A v_2 = v_1^{t} (\lambda_2 v_2) = \lambda_2 v_1^{t} v_2,
\]
so
\[
(\lambda_1 - \lambda_2)\, v_1^{t} v_2 = 0 \quad\text{and}\quad \lambda_1 - \lambda_2 \ne 0 \implies v_1^{t} v_2 = 0.
\]
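Both parts of the theorem can be observed numerically. This NumPy sketch is an added illustration; the symmetric matrix used here is an arbitrary example, not one from the notes, and `numpy.linalg.eigh` is the routine specialised for symmetric input.

```python
import numpy as np

# An arbitrary real symmetric matrix (illustrative choice).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
assert np.allclose(A, A.T)

eigenvalues, V = np.linalg.eigh(A)
# (a) all eigenvalues are real (eigh returns a real array).
assert np.all(np.isreal(eigenvalues))
# (b) the eigenvector columns are mutually orthogonal: V^t V = I.
assert np.allclose(V.T @ V, np.eye(3))
```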
Theorem
If A is an $n \times n$ real symmetric matrix, then it is possible to find $n$ linearly independent eigenvectors (even though the eigenvalues may not be distinct).
Proof: Omitted.
Example
Consider $J_3 = \begin{pmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{pmatrix}$, which is real symmetric.
It has the eigenvalue $3$ of multiplicity 1 and the eigenvalue $0$ of multiplicity 2; all of them are real.
For the eigenvalue $3$, one of its corresponding eigenvectors is $(1, 1, 1)^t$.
For the eigenvalue $0$, independent eigenvectors are $(1, 0, -1)^t$ and $(0, 1, -1)^t$.
Observe that
\[
\begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix} \cdot \begin{pmatrix} 1 \\ 0 \\ -1 \end{pmatrix} = 0, \qquad
\begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix} \cdot \begin{pmatrix} 0 \\ 1 \\ -1 \end{pmatrix} = 0,
\]
and $(1, 1, 1)^t$, $(1, 0, -1)^t$, $(0, 1, -1)^t$ are linearly independent.
We observe that if $P^{-1} A P = D$, then
\[
\det(D - \lambda I) = \det(P^{-1} A P - \lambda I) = \det\!\big(P^{-1} A P - P^{-1} (\lambda I) P\big) = \det\!\big(P^{-1} (A - \lambda I) P\big)
= \det(P^{-1}) \det(A - \lambda I) \det(P) = \det(A - \lambda I).
\]
Thus $A$ and $P^{-1} A P = D$ have the same characteristic polynomial, hence the same eigenvalues. We also notice that the eigenvalues of $D$ are just its main diagonal entries.
Theorem
If an $n \times n$ matrix A has $n$ linearly independent eigenvectors, then it is diagonalizable.
Remark:
Let $A$ be a $p \times m$ matrix and $B$ an $m \times n$ matrix. Recall that the product $AB$ can be evaluated column by column: writing
\[
A = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1m} \\ a_{21} & a_{22} & \cdots & a_{2m} \\ \vdots & & & \vdots \\ a_{p1} & a_{p2} & \cdots & a_{pm} \end{pmatrix}
= \begin{pmatrix} a_1 & a_2 & \cdots & a_m \end{pmatrix},
\quad\text{where } a_j = \begin{pmatrix} a_{1j} \\ a_{2j} \\ \vdots \\ a_{pj} \end{pmatrix},
\]
we have
\[
AB = \begin{pmatrix} b_{11} a_1 + b_{21} a_2 + \cdots + b_{m1} a_m & \;\; b_{12} a_1 + b_{22} a_2 + \cdots + b_{m2} a_m & \;\; \cdots & \;\; b_{1n} a_1 + b_{2n} a_2 + \cdots + b_{mn} a_m \end{pmatrix}.
\]
Proof:
Suppose $x_1, \dots, x_n$ are linearly independent eigenvectors with corresponding eigenvalues $\lambda_1, \dots, \lambda_n$, so that $A x_1 = \lambda_1 x_1, \dots, A x_n = \lambda_n x_n$.
Construct the $n \times n$ matrix $P = \begin{pmatrix} x_1 & x_2 & \cdots & x_n \end{pmatrix}$ with $x_1, \dots, x_n$ as its columns. Then
\[
AP = \begin{pmatrix} A x_1 & A x_2 & \cdots & A x_n \end{pmatrix}
= \begin{pmatrix} \lambda_1 x_1 & \lambda_2 x_2 & \cdots & \lambda_n x_n \end{pmatrix}
\]
\[
= \begin{pmatrix} \lambda_1 x_1 + 0 x_2 + \cdots + 0 x_n & \;\; 0 x_1 + \lambda_2 x_2 + \cdots + 0 x_n & \;\; \cdots & \;\; 0 x_1 + \cdots + \lambda_n x_n \end{pmatrix}
= \begin{pmatrix} x_1 & x_2 & \cdots & x_n \end{pmatrix}
\begin{pmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & & \vdots \\ \vdots & & \ddots & 0 \\ 0 & \cdots & 0 & \lambda_n \end{pmatrix}
= PD,
\]
where $D = \operatorname{diag}(\lambda_1, \dots, \lambda_n)$.
As $x_1, \dots, x_n$ are linearly independent, $P^{-1}$ exists. Thus we have $P^{-1} A P = P^{-1} P D = D$.
Example
Show that $A = \begin{pmatrix} 1 & 4 \\ 2 & 3 \end{pmatrix}$ is diagonalizable.
Proof:
For $A = \begin{pmatrix} 1 & 4 \\ 2 & 3 \end{pmatrix}$, we have the eigenvalues $\lambda_1 = 5$ and $\lambda_2 = -1$ (they are distinct).
For $\lambda_1 = 5$, one of the corresponding eigenvectors is $\begin{pmatrix} 1 \\ 1 \end{pmatrix}$.
For $\lambda_2 = -1$, one of the corresponding eigenvectors is $\begin{pmatrix} -2 \\ 1 \end{pmatrix}$.
Let $P = \begin{pmatrix} 1 & -2 \\ 1 & 1 \end{pmatrix}$. As the eigenvectors of $A$ are linearly independent, $P^{-1}$ exists, and
\[
P^{-1} = \begin{pmatrix} \frac{1}{3} & \frac{2}{3} \\ -\frac{1}{3} & \frac{1}{3} \end{pmatrix}.
\quad\text{Also}\quad
P^{-1} A P = \begin{pmatrix} \frac{1}{3} & \frac{2}{3} \\ -\frac{1}{3} & \frac{1}{3} \end{pmatrix}
\begin{pmatrix} 1 & 4 \\ 2 & 3 \end{pmatrix}
\begin{pmatrix} 1 & -2 \\ 1 & 1 \end{pmatrix}
= \begin{pmatrix} 5 & 0 \\ 0 & -1 \end{pmatrix}.
\]
It follows that $A = \begin{pmatrix} 1 & 4 \\ 2 & 3 \end{pmatrix}$ is diagonalizable.
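The diagonalization $P^{-1}AP = \operatorname{diag}(5, -1)$ can be rebuilt numerically. This NumPy sketch is an added illustration using the eigenvectors chosen in the example.

```python
import numpy as np

A = np.array([[1.0, 4.0],
              [2.0, 3.0]])
# Columns of P are the eigenvectors (1, 1) and (-2, 1).
P = np.array([[1.0, -2.0],
              [1.0,  1.0]])
D = np.linalg.inv(P) @ A @ P
# P^{-1} A P should come out as diag(5, -1).
assert np.allclose(D, np.diag([5.0, -1.0]))
```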
Example
Is $A = \begin{pmatrix} 5 & 3 \\ -3 & -1 \end{pmatrix}$ diagonalizable?
Solution:
\[
\det(A - \lambda I) = \begin{vmatrix} 5-\lambda & 3 \\ -3 & -1-\lambda \end{vmatrix} = (\lambda - 2)^2,
\]
and $\lambda = 2$ is the only eigenvalue.
\[
\begin{pmatrix} 3 & 3 \\ -3 & -3 \end{pmatrix}\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}
\implies
\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = k \begin{pmatrix} 1 \\ -1 \end{pmatrix}, \quad k \ne 0.
\]
As there is only one linearly independent eigenvector, $A = \begin{pmatrix} 5 & 3 \\ -3 & -1 \end{pmatrix}$ is not diagonalizable.
Example
Diagonalize $A = \begin{pmatrix} 1 & 3 & 3 \\ -3 & -5 & -3 \\ 3 & 3 & 1 \end{pmatrix}$, if possible.
Solution:
\[
\det(A - \lambda I) = \begin{vmatrix} 1-\lambda & 3 & 3 \\ -3 & -5-\lambda & -3 \\ 3 & 3 & 1-\lambda \end{vmatrix}
= -(\lambda - 1)(\lambda + 2)^2.
\]
Hence the eigenvalues are $\lambda = 1$ and $\lambda = -2$ (repeated).
When $\lambda = 1$:
\[
\begin{pmatrix} 0 & 3 & 3 \\ -3 & -6 & -3 \\ 3 & 3 & 0 \end{pmatrix}\begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}, \quad
\begin{pmatrix} 0 & 3 & 3 & 0 \\ -3 & -6 & -3 & 0 \\ 3 & 3 & 0 & 0 \end{pmatrix}
\sim \begin{pmatrix} 1 & 1 & 0 & 0 \\ 0 & 1 & 1 & 0 \\ 0 & 3 & 3 & 0 \end{pmatrix}
\sim \begin{pmatrix} 1 & 1 & 0 & 0 \\ 0 & 1 & 1 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix}.
\]
So $x_2 = -x_3$ and $x_1 = -x_2 = x_3$, and
\[
\begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = k \begin{pmatrix} 1 \\ -1 \\ 1 \end{pmatrix}, \quad k \ne 0.
\]
When $\lambda = -2$:
\[
\begin{pmatrix} 3 & 3 & 3 \\ -3 & -3 & -3 \\ 3 & 3 & 3 \end{pmatrix}\begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}, \quad
\begin{pmatrix} 3 & 3 & 3 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix}
\sim \begin{pmatrix} 1 & 1 & 1 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix},
\]
so with $x_3 = t$ and $x_2 = s$, we get $x_1 = -s - t$ and
\[
\begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix}
= \begin{pmatrix} -s - t \\ s \\ t \end{pmatrix}
= s \begin{pmatrix} -1 \\ 1 \\ 0 \end{pmatrix} + t \begin{pmatrix} -1 \\ 0 \\ 1 \end{pmatrix}, \quad s^2 + t^2 \ne 0.
\]
Thus $A$ has three linearly independent eigenvectors, therefore $A = \begin{pmatrix} 1 & 3 & 3 \\ -3 & -5 & -3 \\ 3 & 3 & 1 \end{pmatrix}$ is diagonalizable, and with $P = \begin{pmatrix} -1 & -1 & 1 \\ 0 & 1 & -1 \\ 1 & 0 & 1 \end{pmatrix}$,
\[
P^{-1} A P = \begin{pmatrix} -2 & 0 & 0 \\ 0 & -2 & 0 \\ 0 & 0 & 1 \end{pmatrix}.
\]
Remark
Diagonalisation is useful if it is required to raise a matrix $A$ to a high power $m$, that is, to compute $A^m$. If $P^{-1} A P = D$, then $A = P D P^{-1}$ and
\[
A^m = (P D P^{-1})(P D P^{-1}) \cdots (P D P^{-1}) = P D (P^{-1} P) D (P^{-1} P) \cdots (P^{-1} P) D P^{-1} = P D^m P^{-1},
\]
and if
\[
D = \begin{pmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & & \vdots \\ \vdots & & \ddots & 0 \\ 0 & \cdots & 0 & \lambda_n \end{pmatrix}
= \operatorname{diag}(\lambda_1, \dots, \lambda_n),
\quad\text{then}\quad
D^m = \begin{pmatrix} \lambda_1^m & 0 & \cdots & 0 \\ 0 & \lambda_2^m & & \vdots \\ \vdots & & \ddots & 0 \\ 0 & \cdots & 0 & \lambda_n^m \end{pmatrix}
= \operatorname{diag}(\lambda_1^m, \dots, \lambda_n^m).
\]
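The identity $A^m = P D^m P^{-1}$ can be sketched numerically, using the $2 \times 2$ matrix from the earlier examples; the power $m = 8$ is an arbitrary illustrative choice.

```python
import numpy as np

A = np.array([[1.0, 4.0],
              [2.0, 3.0]])
eigenvalues, P = np.linalg.eig(A)      # columns of P are eigenvectors

m = 8
Dm = np.diag(eigenvalues ** m)          # D^m = diag(lambda_i^m)
Am = P @ Dm @ np.linalg.inv(P)          # A^m = P D^m P^{-1}

# Compare against repeated matrix multiplication.
assert np.allclose(Am, np.linalg.matrix_power(A, m))
```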
An important application of diagonalisation occurs in the solution of systems of differential equations.
Example
Consider a system of $n$ first-order linear differential equations for the dependent variables $x_1, x_2, \dots, x_n$ as follows:
\[
\frac{dx_1}{dt} = a_{11} x_1 + a_{12} x_2 + \cdots + a_{1n} x_n
\]
\[
\frac{dx_2}{dt} = a_{21} x_1 + a_{22} x_2 + \cdots + a_{2n} x_n
\]
\[
\vdots
\]
\[
\frac{dx_n}{dt} = a_{n1} x_1 + a_{n2} x_2 + \cdots + a_{nn} x_n
\tag{E}
\]
where the $a$'s are given constants. The system (E) can be written in matrix form as $\dfrac{d\mathbf{x}}{dt} = A\mathbf{x}$.
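A minimal sketch of how diagonalisation solves such a system, assuming $A$ is diagonalizable: with $A = PDP^{-1}$, the substitution $\mathbf{x} = P\mathbf{y}$ decouples the equations into $y_i' = \lambda_i y_i$, so $y_i = c_i e^{\lambda_i t}$ and $\mathbf{x}(t) = P\,\operatorname{diag}(e^{\lambda_i t})\,P^{-1}\mathbf{x}(0)$. The matrix below is an arbitrary illustrative example, not one from the notes.

```python
import numpy as np

# x' = A x with an arbitrary diagonalizable A (eigenvalues -1 and -2).
A = np.array([[ 0.0,  1.0],
              [-2.0, -3.0]])
x0 = np.array([1.0, 0.0])
eigenvalues, P = np.linalg.eig(A)

def x(t):
    # x(t) = P exp(D t) P^{-1} x(0), with exp(D t) = diag(e^{lambda_i t})
    return (P @ np.diag(np.exp(eigenvalues * t)) @ np.linalg.inv(P) @ x0).real

# Sanity checks: x(0) = x0, and a finite difference approximates x'(0) = A x0.
assert np.allclose(x(0.0), x0)
h = 1e-6
assert np.allclose((x(h) - x(0.0)) / h, A @ x0, atol=1e-4)
```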
Example
(a) Use Gauss-Jordan elimination to compute the inverse of $\begin{pmatrix} 1 & 1 & 1 \\ 1 & 2 & 1 \\ 0 & 0 & 1 \end{pmatrix}$.
(b) Let $A$ be a $3 \times 3$ matrix which has eigenvalues $1, -1, 0$ with corresponding eigenvectors
\[
\begin{pmatrix} 1 \\ 1 \\ 0 \end{pmatrix}, \quad
\begin{pmatrix} 1 \\ 2 \\ 0 \end{pmatrix}, \quad
\begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}.
\]
With the help of (a), find $A^{10}$.
Solution:
(a)
\[
\left(\begin{array}{ccc|ccc} 1 & 1 & 1 & 1 & 0 & 0 \\ 1 & 2 & 1 & 0 & 1 & 0 \\ 0 & 0 & 1 & 0 & 0 & 1 \end{array}\right)
\xrightarrow{r_2 - r_1}
\left(\begin{array}{ccc|ccc} 1 & 1 & 1 & 1 & 0 & 0 \\ 0 & 1 & 0 & -1 & 1 & 0 \\ 0 & 0 & 1 & 0 & 0 & 1 \end{array}\right)
\xrightarrow{r_1 - r_3}
\left(\begin{array}{ccc|ccc} 1 & 1 & 0 & 1 & 0 & -1 \\ 0 & 1 & 0 & -1 & 1 & 0 \\ 0 & 0 & 1 & 0 & 0 & 1 \end{array}\right)
\]
\[
\xrightarrow{r_1 - r_2}
\left(\begin{array}{ccc|ccc} 1 & 0 & 0 & 2 & -1 & -1 \\ 0 & 1 & 0 & -1 & 1 & 0 \\ 0 & 0 & 1 & 0 & 0 & 1 \end{array}\right).
\]
So
\[
\begin{pmatrix} 1 & 1 & 1 \\ 1 & 2 & 1 \\ 0 & 0 & 1 \end{pmatrix}^{-1}
= \begin{pmatrix} 2 & -1 & -1 \\ -1 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}.
\]
(b)
We have
\[
A \begin{pmatrix} 1 & 1 & 1 \\ 1 & 2 & 1 \\ 0 & 0 & 1 \end{pmatrix}
= \begin{pmatrix} 1 & 1 & 1 \\ 1 & 2 & 1 \\ 0 & 0 & 1 \end{pmatrix}
\begin{pmatrix} 1 & 0 & 0 \\ 0 & -1 & 0 \\ 0 & 0 & 0 \end{pmatrix},
\]
so
\[
A = \begin{pmatrix} 1 & 1 & 1 \\ 1 & 2 & 1 \\ 0 & 0 & 1 \end{pmatrix}
\begin{pmatrix} 1 & 0 & 0 \\ 0 & -1 & 0 \\ 0 & 0 & 0 \end{pmatrix}
\begin{pmatrix} 1 & 1 & 1 \\ 1 & 2 & 1 \\ 0 & 0 & 1 \end{pmatrix}^{-1}
= \begin{pmatrix} 1 & 1 & 1 \\ 1 & 2 & 1 \\ 0 & 0 & 1 \end{pmatrix}
\begin{pmatrix} 1 & 0 & 0 \\ 0 & -1 & 0 \\ 0 & 0 & 0 \end{pmatrix}
\begin{pmatrix} 2 & -1 & -1 \\ -1 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}.
\]
Then
\[
A^{10} = \begin{pmatrix} 1 & 1 & 1 \\ 1 & 2 & 1 \\ 0 & 0 & 1 \end{pmatrix}
\begin{pmatrix} 1^{10} & 0 & 0 \\ 0 & (-1)^{10} & 0 \\ 0 & 0 & 0^{10} \end{pmatrix}
\begin{pmatrix} 2 & -1 & -1 \\ -1 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}
= \begin{pmatrix} 1 & 1 & 1 \\ 1 & 2 & 1 \\ 0 & 0 & 1 \end{pmatrix}
\begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{pmatrix}
\begin{pmatrix} 2 & -1 & -1 \\ -1 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}
= \begin{pmatrix} 1 & 0 & -1 \\ 0 & 1 & -1 \\ 0 & 0 & 0 \end{pmatrix}.
\]
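This worked answer can be verified numerically by building $A = P\,\operatorname{diag}(1,-1,0)\,P^{-1}$ with the given eigenvector matrix and raising it to the 10th power directly.

```python
import numpy as np

# P has the given eigenvectors as columns; D holds the eigenvalues 1, -1, 0.
P = np.array([[1.0, 1.0, 1.0],
              [1.0, 2.0, 1.0],
              [0.0, 0.0, 1.0]])
D = np.diag([1.0, -1.0, 0.0])
A = P @ D @ np.linalg.inv(P)

A10 = np.linalg.matrix_power(A, 10)
expected = np.array([[1.0, 0.0, -1.0],
                     [0.0, 1.0, -1.0],
                     [0.0, 0.0,  0.0]])
assert np.allclose(A10, expected)
```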
Example
(a) Let $B$ be a $3 \times 3$ matrix and let $\lambda_1, \lambda_2, \lambda_3$ be the eigenvalues of $B$.
Write down the characteristic equation of $B$ and show that $\lambda_1 \lambda_2 \lambda_3 = \det B$.
(b) Let $A_{3 \times 3}$ be a real symmetric matrix. Suppose $\det A = 1$,
\[
A \begin{pmatrix} 1 \\ 2 \\ 1 \end{pmatrix} = \begin{pmatrix} 1 \\ 2 \\ 1 \end{pmatrix}, \qquad
A \begin{pmatrix} 1 \\ 0 \\ -1 \end{pmatrix} = \begin{pmatrix} 2 \\ 0 \\ -2 \end{pmatrix}.
\]
(i) Use part (a), or otherwise, to find the eigenvalues of $A$;
(ii) find a matrix $P$ having the unit eigenvectors of $A$ as its columns, and show that $P^T P = I$;
(iii) show that $A$ is diagonalizable and find the matrix $A$.
Solution:
(a)
The characteristic equation of $B$ is
\[
\det(\lambda I - B) = \det \begin{pmatrix} \lambda - b_{11} & -b_{12} & -b_{13} \\ -b_{21} & \lambda - b_{22} & -b_{23} \\ -b_{31} & -b_{32} & \lambda - b_{33} \end{pmatrix}
= (\lambda - \lambda_1)(\lambda - \lambda_2)(\lambda - \lambda_3) = 0.
\]
Setting $\lambda = 0$ gives $\det(-B) = (-1)^3 \det B = (-\lambda_1)(-\lambda_2)(-\lambda_3) = -\lambda_1 \lambda_2 \lambda_3$, so $\det B = \lambda_1 \lambda_2 \lambda_3$.
(b)
(i)
Since
\[
A \begin{pmatrix} 1 \\ 2 \\ 1 \end{pmatrix} = 1 \cdot \begin{pmatrix} 1 \\ 2 \\ 1 \end{pmatrix}
\quad\text{and}\quad
A \begin{pmatrix} 1 \\ 0 \\ -1 \end{pmatrix} = \begin{pmatrix} 2 \\ 0 \\ -2 \end{pmatrix} = 2 \begin{pmatrix} 1 \\ 0 \\ -1 \end{pmatrix},
\]
we have, say, $\lambda_1 = 1$ and $\lambda_2 = 2$. By part (a), $\det A = 1 = \lambda_1 \lambda_2 \lambda_3 = 1 \cdot 2 \cdot \lambda_3$, so $\lambda_3 = \frac{1}{2}$. As a result, $\lambda_1 = 1$, $\lambda_2 = 2$, $\lambda_3 = \frac{1}{2}$.
(Observe that the real symmetric $A$ has distinct simple (multiplicity 1) eigenvalues; therefore $A$ has orthonormal (orthogonal unit) eigenvectors, which are linearly independent.)
(ii)
For $\lambda_1 = 1$, $A(1, 2, 1)^t = 1 \cdot (1, 2, 1)^t$ and $\|(1, 2, 1)^t\| = \sqrt{6}$, so a unit eigenvector is
\[
\frac{1}{\sqrt{6}} \begin{pmatrix} 1 \\ 2 \\ 1 \end{pmatrix}.
\]
For $\lambda_2 = 2$, $A(1, 0, -1)^t = 2(1, 0, -1)^t$ and $\|(1, 0, -1)^t\| = \sqrt{2}$, so a unit eigenvector is
\[
\frac{1}{\sqrt{2}} \begin{pmatrix} 1 \\ 0 \\ -1 \end{pmatrix}.
\]
Suppose $A(x, y, z)^t = \frac{1}{2}(x, y, z)^t$. Since $A$ is real symmetric, eigenvectors corresponding to distinct eigenvalues are orthogonal, so
\[
\begin{pmatrix} x \\ y \\ z \end{pmatrix} \cdot \begin{pmatrix} 1 \\ 2 \\ 1 \end{pmatrix} = 0
\quad\text{and}\quad
\begin{pmatrix} x \\ y \\ z \end{pmatrix} \cdot \begin{pmatrix} 1 \\ 0 \\ -1 \end{pmatrix} = 0
\implies
\begin{cases} x + 2y + z = 0 \\ x - z = 0 \end{cases}
\implies x = z, \; y = -z.
\]
Let $z = 1$; then $(1, -1, 1)^t$ is an eigenvector of $A$ for $\lambda_3 = \frac{1}{2}$, and the corresponding unit eigenvector is
\[
\frac{1}{\sqrt{3}} \begin{pmatrix} 1 \\ -1 \\ 1 \end{pmatrix}.
\]
Let
\[
P = \begin{pmatrix} \frac{1}{\sqrt{6}} & \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{3}} \\ \frac{2}{\sqrt{6}} & 0 & -\frac{1}{\sqrt{3}} \\ \frac{1}{\sqrt{6}} & -\frac{1}{\sqrt{2}} & \frac{1}{\sqrt{3}} \end{pmatrix}.
\quad\text{Observe that}\quad
P^T = \begin{pmatrix} \frac{1}{\sqrt{6}} & \frac{2}{\sqrt{6}} & \frac{1}{\sqrt{6}} \\ \frac{1}{\sqrt{2}} & 0 & -\frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{3}} & -\frac{1}{\sqrt{3}} & \frac{1}{\sqrt{3}} \end{pmatrix},
\]
and
\[
P^T P = \begin{pmatrix} \frac{1}{\sqrt{6}} & \frac{2}{\sqrt{6}} & \frac{1}{\sqrt{6}} \\ \frac{1}{\sqrt{2}} & 0 & -\frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{3}} & -\frac{1}{\sqrt{3}} & \frac{1}{\sqrt{3}} \end{pmatrix}
\begin{pmatrix} \frac{1}{\sqrt{6}} & \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{3}} \\ \frac{2}{\sqrt{6}} & 0 & -\frac{1}{\sqrt{3}} \\ \frac{1}{\sqrt{6}} & -\frac{1}{\sqrt{2}} & \frac{1}{\sqrt{3}} \end{pmatrix}
= \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}.
\]
(Observe that the matrix $P$, which is formed from the orthonormal eigenvectors of $A$, has its inverse equal to its transpose.)
(iii)
Since $P^{-1} = P^T$ exists, $A$ is diagonalizable and $P^T A P = P^{-1} A P = D = \operatorname{diag}\!\left(1, 2, \frac{1}{2}\right)$. Hence
\[
A = P D P^T
= \begin{pmatrix} \frac{1}{\sqrt{6}} & \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{3}} \\ \frac{2}{\sqrt{6}} & 0 & -\frac{1}{\sqrt{3}} \\ \frac{1}{\sqrt{6}} & -\frac{1}{\sqrt{2}} & \frac{1}{\sqrt{3}} \end{pmatrix}
\begin{pmatrix} 1 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & \frac{1}{2} \end{pmatrix}
\begin{pmatrix} \frac{1}{\sqrt{6}} & \frac{2}{\sqrt{6}} & \frac{1}{\sqrt{6}} \\ \frac{1}{\sqrt{2}} & 0 & -\frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{3}} & -\frac{1}{\sqrt{3}} & \frac{1}{\sqrt{3}} \end{pmatrix}
\]
\[
= \begin{pmatrix} \frac{1}{\sqrt{6}} & \frac{2}{\sqrt{2}} & \frac{1}{2\sqrt{3}} \\ \frac{2}{\sqrt{6}} & 0 & -\frac{1}{2\sqrt{3}} \\ \frac{1}{\sqrt{6}} & -\frac{2}{\sqrt{2}} & \frac{1}{2\sqrt{3}} \end{pmatrix}
\begin{pmatrix} \frac{1}{\sqrt{6}} & \frac{2}{\sqrt{6}} & \frac{1}{\sqrt{6}} \\ \frac{1}{\sqrt{2}} & 0 & -\frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{3}} & -\frac{1}{\sqrt{3}} & \frac{1}{\sqrt{3}} \end{pmatrix}
= \begin{pmatrix} \frac{4}{3} & \frac{1}{6} & -\frac{2}{3} \\ \frac{1}{6} & \frac{5}{6} & \frac{1}{6} \\ -\frac{2}{3} & \frac{1}{6} & \frac{4}{3} \end{pmatrix}
= \frac{1}{6}\begin{pmatrix} 8 & 1 & -4 \\ 1 & 5 & 1 \\ -4 & 1 & 8 \end{pmatrix}.
\]
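The recovered matrix $\frac{1}{6}\begin{pmatrix} 8 & 1 & -4 \\ 1 & 5 & 1 \\ -4 & 1 & 8 \end{pmatrix}$ can be checked against all of the given conditions numerically.

```python
import numpy as np

A = np.array([[ 8.0, 1.0, -4.0],
              [ 1.0, 5.0,  1.0],
              [-4.0, 1.0,  8.0]]) / 6.0

assert np.allclose(A, A.T)                          # real symmetric
assert np.isclose(np.linalg.det(A), 1.0)            # det A = 1
assert np.allclose(A @ [1, 2, 1], [1, 2, 1])        # eigenvalue 1
assert np.allclose(A @ [1, 0, -1], [2, 0, -2])      # eigenvalue 2
# Full spectrum: 1/2, 1, 2 as found in part (b)(i).
assert np.allclose(np.sort(np.linalg.eigvalsh(A)), [0.5, 1.0, 2.0])
```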
-End-