1. Similar matrices
Let A be an n × n matrix. Recall that A defines a linear transformation A : C^n → C^n. Suppose that {v_1, ..., v_n} ⊂ C^n is another basis of C^n. Thus the n × n matrix with these vectors as columns,
\[
S = \begin{pmatrix} v_1 & \cdots & v_n \end{pmatrix},
\]
is invertible. The matrix B = (b_{ij}) of this linear transformation in terms of the basis {v_1, ..., v_n} is defined by
\[
A v_j = \sum_{i=1}^{n} b_{ij} v_i, \qquad j = 1, \ldots, n.
\]
But we have seen that if the eigenvalues of a matrix are all distinct then it is
diagonalizable. In the next section we will see other cases in which a
matrix is diagonalizable.
Proposition 1.2. Similar matrices have the same characteristic poly-
nomials. That is, if B = S −1 AS, then PA (z) = PB (z).
In particular, similar matrices have the same eigenvalues.
This is not surprising, as we have seen that similar matrices can
be considered as matrices of the same linear transformation, but for
different bases.
Proof.
\[
\begin{aligned}
P_B(z) &= \det(B - zI) = \det(S^{-1}AS - zI) \\
&= \det(S^{-1}(A - zI)S) \\
&= \det(S^{-1})\det(A - zI)\det(S) \\
&= \det(A - zI) = P_A(z),
\end{aligned}
\]
since det(S^{-1}) det(S) = det(S^{-1}S) = 1.
One can easily check that if λ is an eigenvalue of A with eigenvector
v, then S −1 v is an eigenvector of B with eigenvalue λ.
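As a sanity check, both Proposition 1.2 and this remark about S^{-1}v can be verified numerically; the matrices below are arbitrary illustrative choices, not taken from the text.

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 4.0]])
S = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])  # any invertible matrix works
B = np.linalg.inv(S) @ A @ S     # B is similar to A

# Similar matrices have the same eigenvalues.
eigA = np.sort_complex(np.linalg.eigvals(A))
eigB = np.sort_complex(np.linalg.eigvals(B))
assert np.allclose(eigA, eigB)

# If Av = λv, then B(S^{-1}v) = λ(S^{-1}v).
lam, V = np.linalg.eig(A)
v = V[:, 0]
w = np.linalg.inv(S) @ v
assert np.allclose(B @ w, lam[0] * w)
```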
2. Spectral theorem
In this section we will consider matrices up to unitary similarity,
B = U^*AU where U is a unitary matrix. Under a restriction on A, that it be normal, we will prove that A is unitarily similar to a diagonal matrix.
Lemma 2.1 (Schur’s lemma). Let A be an n × n matrix with complex
entries. There exists a unitary matrix U and an upper triangular
matrix T , with complex entries, so that
U ∗ AU = T.
4 CRAIG VAN COEVERING
Then
\[
U_2 = \begin{pmatrix} 1 & 0 \\ 0 & \tilde{U}_2 \end{pmatrix}
\]
is a unitary matrix. And
\[
U_2^* U_1^* A U_1 U_2 =
\begin{pmatrix}
\lambda_1 & * & * & \cdots & * \\
0 & \lambda_2 & * & \cdots & * \\
0 & 0 & & & \\
\vdots & \vdots & & A_2 & \\
0 & 0 & & &
\end{pmatrix}.
\]
Continue by finding an eigenvalue λ_3 and eigenvector of the (n − 2) × (n − 2) matrix A_2. In the end we get unitary matrices U_1, U_2, ..., U_{n−1}, and if we set U = U_1 U_2 ⋯ U_{n−1}, then U^*AU = T is upper triangular.
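The inductive construction in this proof can be imitated numerically. The sketch below (assuming NumPy; `schur_triangularize` is a name chosen here, not a library routine) peels off one eigenvalue per step, exactly as in the argument above.

```python
import numpy as np

def schur_triangularize(A):
    """Follow the proof of Schur's lemma: peel off one eigenvalue at a time.

    Returns (U, T) with U unitary and T = U* A U upper triangular.
    A sketch of the inductive argument, not a production routine.
    """
    n = A.shape[0]
    A = A.astype(complex)
    if n == 1:
        return np.eye(1, dtype=complex), A
    # Find one eigenvector v and extend it to an orthonormal basis via QR,
    # giving a unitary U1 whose first column is a unit multiple of v.
    _, V = np.linalg.eig(A)
    v = V[:, 0]
    U1, _ = np.linalg.qr(np.column_stack([v, np.eye(n)]).astype(complex))
    B = U1.conj().T @ A @ U1          # first column is (λ1, 0, ..., 0)^T
    # Recurse on the (n-1) x (n-1) block A1 = B[1:, 1:].
    U_sub, _ = schur_triangularize(B[1:, 1:])
    U2 = np.eye(n, dtype=complex)
    U2[1:, 1:] = U_sub
    U = U1 @ U2
    return U, U.conj().T @ A @ U

A = np.array([[1.0, 2.0, 3.0],
              [0.5, 1.0, 0.0],
              [2.0, 0.0, 1.0]])
U, T = schur_triangularize(A)
assert np.allclose(U @ U.conj().T, np.eye(3))      # U is unitary
assert np.allclose(np.tril(T, -1), 0, atol=1e-8)   # T is upper triangular
assert np.allclose(U @ T @ U.conj().T, A)          # A = U T U*
```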
The Spectral theorem will apply to a class of complex matrices more
general than Hermitian matrices.
Normal matrix: A complex n × n matrix A is normal if A∗ A =
AA∗ .
In other words, A and A∗ commute.
Many of the classes of matrices we have seen are normal.
(1) Hermitian matrices: If A∗ = A, then A∗ A = AA∗ clearly
holds.
(2) Skew-Hermitian matrices: This means A∗ = −A, so A∗A = −A² = AA∗.
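Both verifications are easy to confirm numerically; in the sketch below, `is_normal` and the sample matrices are illustrative choices, not taken from the text.

```python
import numpy as np

def is_normal(A, tol=1e-12):
    """Check the defining condition A* A = A A*."""
    return np.allclose(A.conj().T @ A, A @ A.conj().T, atol=tol)

# Hermitian example (A* = A).
H = np.array([[2.0, 1 + 1j],
              [1 - 1j, 3.0]])
# Skew-Hermitian example (A* = -A).
K = np.array([[1j, 2.0],
              [-2.0, -1j]])
# A non-normal matrix for contrast.
N = np.array([[0.0, 1.0],
              [0.0, 0.0]])

assert is_normal(H)
assert is_normal(K)
assert not is_normal(N)
```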
MATH 201 COMPLEX MATRICES 5
The following follows from the Spectral theorem but we can give a
direct proof also.
Proposition 2.5. Let A be a normal matrix. If v, w ∈ C^n are eigenvectors,
\[
A v = \lambda v, \qquad A w = \mu w, \qquad \lambda \neq \mu,
\]
then v ⊥ w.
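A quick numerical illustration of Proposition 2.5; the 2 × 2 rotation-like matrix below is an arbitrary choice of a normal matrix with distinct eigenvalues.

```python
import numpy as np

# Real skew-symmetric, hence normal, with distinct eigenvalues i and -i.
A = np.array([[0.0, -1.0],
              [1.0, 0.0]])
assert np.allclose(A.conj().T @ A, A @ A.conj().T)  # A is normal

lam, V = np.linalg.eig(A)
v, w = V[:, 0], V[:, 1]
assert abs(lam[0] - lam[1]) > 1e-8                  # distinct eigenvalues
# Eigenvectors for distinct eigenvalues of a normal matrix are orthogonal.
assert abs(np.vdot(v, w)) < 1e-10
```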
Lemma 2.6. If A is normal, then kAxk = kA∗ xk for all x ∈ Cn .
In particular, Null(A) = Null(A∗ ).
Proof.
\[
\begin{aligned}
\|Ax\|^2 &= (Ax, Ax) \\
&= (A^* A x, x) \\
&= (A A^* x, x) \\
&= (A^* x, A^* x) = \|A^* x\|^2.
\end{aligned}
\]
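Numerically, the identity of Lemma 2.6 is easy to test, and it fails for a non-normal matrix; the sample matrices and vector below are illustrative choices.

```python
import numpy as np

A = np.array([[0.0, -1.0],
              [1.0, 0.0]])        # normal (skew-symmetric)
N = np.array([[0.0, 1.0],
              [0.0, 0.0]])        # not normal
x = np.array([1.0 + 1j, 2.0 - 1j])

# For the normal matrix, ||Ax|| = ||A*x|| ...
assert np.isclose(np.linalg.norm(A @ x), np.linalg.norm(A.conj().T @ x))
# ... while the non-normal matrix breaks the identity for this x.
assert not np.isclose(np.linalg.norm(N @ x), np.linalg.norm(N.conj().T @ x))
```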
\[
A = \begin{pmatrix}
1 & 1 & & & \\
& 1 & & & \\
& & 2 & 1 & \\
& & & 2 & 1 \\
& & & & 2
\end{pmatrix}
\]
This has a 1-dimensional eigenspace for λ = 1 and for λ = 2. ♦
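Computations of this kind can be checked via dim Null(A − λI) = n − rank(A − λI); the 5 × 5 block matrix below is an assumed example of this type (one block with eigenvalue 1, one with eigenvalue 2).

```python
import numpy as np

A = np.array([[1, 1, 0, 0, 0],
              [0, 1, 0, 0, 0],
              [0, 0, 2, 1, 0],
              [0, 0, 0, 2, 1],
              [0, 0, 0, 0, 2]], dtype=float)

def eigenspace_dim(A, lam):
    """dim Null(A - λI) = n - rank(A - λI)."""
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - lam * np.eye(n))

assert eigenspace_dim(A, 1.0) == 1
assert eigenspace_dim(A, 2.0) == 1
```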
Let A be an n × n matrix.
Generalized eigenspace: A vector v is a generalized eigenvector of A with corresponding eigenvalue λ if
\[
(A - \lambda I)^N v = 0
\]
for some N ≥ 1. If (A − λI)^{N−1} v ≠ 0, then the generalized eigenvector v has rank N.
Note that if v is a generalized eigenvector then (A − λI)^N v = 0 for some 1 ≤ N ≤ n, and v is an ordinary eigenvector if and only if this is satisfied for N = 1.
The importance of this concept comes from the fact that A has n linearly independent generalized eigenvectors, in contrast to ordinary eigenvectors. Let G_λ denote the generalized eigenspace of λ ∈ C:
\[
G_\lambda = \{ v \in \mathbb{C}^n \mid (A - \lambda I)^N v = 0 \text{ for some } N \geq 1 \}.
\]
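For a single Jordan-type block (an illustrative choice), the null spaces of (A − λI)^N grow with N until they exhaust C^n, so the generalized eigenspace is strictly larger than the ordinary one.

```python
import numpy as np

# Jordan block with eigenvalue 2: only one ordinary eigenvector, but the
# null spaces of (A - 2I)^N grow until they fill C^3.
A = np.array([[2, 1, 0],
              [0, 2, 1],
              [0, 0, 2]], dtype=float)

def null_dim(M):
    return M.shape[0] - np.linalg.matrix_rank(M)

B = A - 2 * np.eye(3)
dims = [null_dim(np.linalg.matrix_power(B, N)) for N in (1, 2, 3)]
assert dims == [1, 2, 3]   # generalized eigenspace G_2 is all of C^3
```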
4. Applications
The Jordan canonical form has applications to problems where we
used diagonalization. Since not every matrix is diagonalizable, it may be necessary to use the Jordan form instead.
One application of matrix algebra is to differential equations. Suppose we have an n × n linear system of ordinary differential equations
\[
\dot{x}(t) = A x(t),
\]
where A is an n × n matrix and
\[
x(t) = \begin{pmatrix} x_1(t) \\ \vdots \\ x_n(t) \end{pmatrix}.
\]
We have seen that if u is an eigenvector of eigenvalue λ then
x(t) = ceλt u
is a solution to the system. Thus we get an independent solution for
each independent eigenvector. This gives a complete set of solutions
if A is diagonalizable. But in general, we must look for other types of
solutions to get a complete set of n solutions.
The matrix exponential will give a complete set of solutions.
\[
e^{tA} = \sum_{k=0}^{\infty} \frac{t^k A^k}{k!} = I + tA + t^2 \frac{A^2}{2} + \cdots
\]
can be shown to converge and give a differentiable function of t. Then
\[
x(t) = e^{tA} u_0
\]
satisfies ẋ(t) = A e^{tA} u_0 = A x(t) with x(0) = u_0.
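As a sketch, one can check both properties numerically by truncating the series for e^{tA} (adequate for small ‖tA‖); the matrix and initial vector below are arbitrary illustrative choices.

```python
import numpy as np

def expm_series(A, t, terms=30):
    """Truncation of e^{tA} = sum_k t^k A^k / k!."""
    n = A.shape[0]
    result = np.zeros((n, n))
    term = np.eye(n)
    for k in range(terms):
        result = result + term
        term = term @ (t * A) / (k + 1)   # next term: t^{k+1} A^{k+1} / (k+1)!
    return result

A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])
u0 = np.array([1.0, 0.0])

x = lambda t: expm_series(A, t) @ u0
assert np.allclose(x(0.0), u0)            # initial condition x(0) = u0
h = 1e-6                                  # finite-difference check of x' = Ax
assert np.allclose((x(1.0 + h) - x(1.0)) / h, A @ x(1.0), atol=1e-4)
```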
Example 4.1. Consider J_{λ,4}, the 4 × 4 Jordan block. The computation above gives
\[
e^{t J_{\lambda,4}} = e^{\lambda t}
\begin{pmatrix}
1 & t & \frac{t^2}{2} & \frac{t^3}{6} \\
0 & 1 & t & \frac{t^2}{2} \\
0 & 0 & 1 & t \\
0 & 0 & 0 & 1
\end{pmatrix}.
\]
Note that the columns give independent solutions to the correspond-
ing differential equation.
♦
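The closed form in Example 4.1 can be checked against a truncated series for e^{tJ}; the values of λ and t below are arbitrary test values.

```python
import numpy as np

lam, t = 0.5, 0.7

# J = λI + N with N the nilpotent superdiagonal shift, so
# e^{tJ} = e^{λt}(I + tN + t²N²/2 + t³N³/6), as in Example 4.1.
J = lam * np.eye(4) + np.diag(np.ones(3), k=1)

def expm_series(A, terms=40):
    """Truncated power series for e^A (fine for small ||A||)."""
    out, term = np.zeros_like(A), np.eye(A.shape[0])
    for k in range(terms):
        out = out + term
        term = term @ A / (k + 1)
    return out

closed_form = np.exp(lam * t) * np.array(
    [[1, t, t**2 / 2, t**3 / 6],
     [0, 1, t, t**2 / 2],
     [0, 0, 1, t],
     [0, 0, 0, 1]])
assert np.allclose(expm_series(t * J), closed_form)
```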