
# Notes on Linear Algebra August 24, 2016

## Jordan Canonical Form

Rakesh Jana IIT Guwahati

Notations:
- $F$ : scalar field, $\mathbb{R}$ or $\mathbb{C}$.
- $M_{n\times m}$ : set of all matrices of order $n$-by-$m$ over the field $F$.
- $J_k(\lambda)$ : Jordan block for $\lambda$ of size $k$.
- $P_A(t)$ : characteristic polynomial of $A$ in $t$.
- $Q_A(t)$ : minimal polynomial of $A$ in $t$.

## 1 Matrix form of linear map

We all know that the coordinate expression of a vector in a vector space (or linear space) depends on the choice of an ordered basis. Hence, the matrix representation of a linear transformation also depends on the choice of bases.

Let $X$ be a linear space over $F$ and $B = \{v_1, v_2, \dots, v_n\}$ be an ordered basis for $X$. Then any $x \in X$ can be expressed uniquely as
$$x = \sum_{i=1}^{n} x_i v_i, \quad \text{where } x_i \in F,$$
$$= \begin{pmatrix} v_1 & v_2 & \cdots & v_n \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix} = B[x]_B,$$
where $B = \begin{pmatrix} v_1 & v_2 & \cdots & v_n \end{pmatrix}$ is called the basis matrix and $[x]_B = \begin{pmatrix} x_1 & x_2 & \cdots & x_n \end{pmatrix}^t \in F^n$ is called the basis representation (or coordinate expression) of $x$ with respect to the basis $B$.

Observation 1.1. For any $x, y \in X$ and any $\alpha, \beta \in F$:
- $[x]_B = B^{-1} x$;
- $[\alpha x + \beta y]_B = \alpha [x]_B + \beta [y]_B$.
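The identity $[x]_B = B^{-1}x$ is easy to check numerically. The following sketch (using NumPy, with a hypothetical basis of $\mathbb{R}^2$) computes a coordinate vector by solving $B[x]_B = x$ rather than forming $B^{-1}$ explicitly:

```python
import numpy as np

# Basis matrix whose columns are a (hypothetical) ordered basis of R^2.
B = np.array([[1.0, 1.0],
              [0.0, 2.0]])
x = np.array([3.0, 4.0])

# [x]_B = B^{-1} x; solving B u = x is the stable way to apply B^{-1}.
xB = np.linalg.solve(B, x)

# Reconstruction x = B [x]_B recovers the original vector.
assert np.allclose(B @ xB, x)
```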

Let $T : X \to Y$ be a linear transformation and let $B = \{f_1, f_2, \dots, f_n\}$ and $C = \{g_1, g_2, \dots, g_m\}$ be bases of $X$ and $Y$, respectively. Then from Observation 1.1 we have, for any $x \in X$,
$$[Tx]_C = [x_1 T f_1 + x_2 T f_2 + \cdots + x_n T f_n]_C, \quad \text{where } x = \sum_{i=1}^{n} x_i f_i = B[x]_B \in X,$$
$$= x_1 [T f_1]_C + x_2 [T f_2]_C + \cdots + x_n [T f_n]_C$$
$$= \begin{bmatrix} [T f_1]_C & [T f_2]_C & \cdots & [T f_n]_C \end{bmatrix} [x]_B.$$
The $m$-by-$n$ matrix $\begin{bmatrix} [T f_1]_C & [T f_2]_C & \cdots & [T f_n]_C \end{bmatrix}$ is called the $B$-$C$ basis representation (or simply, matrix representation) of $T$, and it is denoted by $_B[T]_C$, that is,
$$_B[T]_C = \begin{bmatrix} [T f_1]_C & [T f_2]_C & \cdots & [T f_n]_C \end{bmatrix}. \tag{1.1.1}$$

For $B = C$ we denote $_B[T]_B$ by $[T]_B$. Thus, given a linear transformation $T : X \to Y$, the matrix representation of $T$ is not unique; it depends on the choice of bases for $X$ and $Y$. Consider the following example.

Example 1.2. Consider the linear map $T : \mathbb{R}^2 \to \mathbb{R}^2$ defined by
$$T\begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} x_1 + x_2 \\ x_1 - x_2 \end{pmatrix}, \quad \text{for any } \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} \in \mathbb{R}^2.$$
Also consider the two bases $B = \left\{ \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \begin{pmatrix} 0 \\ 1 \end{pmatrix} \right\}$ and $C = \left\{ \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \begin{pmatrix} 1 \\ 1 \end{pmatrix} \right\}$ of $\mathbb{R}^2$. Now from (1.1.1) we have
$$_B[T]_C = \begin{bmatrix} \left[ T\begin{pmatrix} 1 \\ 0 \end{pmatrix} \right]_C & \left[ T\begin{pmatrix} 0 \\ 1 \end{pmatrix} \right]_C \end{bmatrix} = \begin{bmatrix} \left[ \begin{pmatrix} 1 \\ 1 \end{pmatrix} \right]_C & \left[ \begin{pmatrix} 1 \\ -1 \end{pmatrix} \right]_C \end{bmatrix} = \begin{pmatrix} 0 & 2 \\ 1 & -1 \end{pmatrix}.$$
Similarly, we have
$$_C[T]_B = \begin{pmatrix} 1 & 2 \\ 1 & 0 \end{pmatrix}.$$
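The computation in Example 1.2 can be reproduced numerically. This is a sketch: each column of (1.1.1) is obtained as $[Tf_i]_C = C^{-1}\,Tf_i$, computed with a linear solve.

```python
import numpy as np

def T(v):
    # The linear map of Example 1.2: (x1, x2) -> (x1 + x2, x1 - x2).
    return np.array([v[0] + v[1], v[0] - v[1]])

Bmat = np.eye(2)                      # standard basis vectors as columns
Cmat = np.array([[1.0, 1.0],
                 [0.0, 1.0]])         # columns (1,0)^t and (1,1)^t

# Columns of B[T]_C are [T f_i]_C = C^{-1} T(f_i), per (1.1.1).
BTC = np.column_stack([np.linalg.solve(Cmat, T(Bmat[:, i])) for i in range(2)])
```

Running this reproduces the matrix $\begin{pmatrix} 0 & 2 \\ 1 & -1 \end{pmatrix}$ computed by hand above.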
From the above example we observe that a linear map can have many matrix representations. Also, the same matrix can be the matrix representation of $T$ for two different pairs of bases. For example, consider the following.

Example 1.3. Let $X$ and $Y$ be linear spaces over $F$ with bases $B = \{f_1, f_2, \dots, f_n\}$ and $C = \{g_1, g_2, \dots, g_m\}$ of $X$ and $Y$, respectively, and let $T : X \to Y$ be a linear transformation. Then $B' = \{af_1, af_2, \dots, af_n\}$ and $C' = \{ag_1, ag_2, \dots, ag_m\}$ are also bases of $X$ and


$Y$, respectively, where $a \in F$ is any fixed nonzero scalar. Then we have
$$_{B'}[T]_{C'} = \begin{bmatrix} [T(af_1)]_{C'} & [T(af_2)]_{C'} & \cdots & [T(af_n)]_{C'} \end{bmatrix} = \begin{bmatrix} [T(f_1)]_C & [T(f_2)]_C & \cdots & [T(f_n)]_C \end{bmatrix} = {}_B[T]_C.$$
Till now we have discussed how to represent a linear map in matrix form. Now suppose that $A = {}_B[T]_C$ is given. What can we say about the corresponding $T : X \to Y$? We know that $[Tv]_C = {}_B[T]_C\, [v]_B$, for any $v \in X$. Thus we have, for any $v \in X$,
$$Tv = C[Tv]_C = C\, {}_B[T]_C\, [v]_B = C\, {}_B[T]_C\, B^{-1} v. \tag{1.3.1}$$
Example 1.4. Consider $B$, $C$ and $A = {}_B[T]_C$ as given in Example 1.2. Then from (1.3.1) we have, for any $x = (x_1 \; x_2)^t \in \mathbb{R}^2$,
$$Tx = CAB^{-1}x = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} \begin{pmatrix} 0 & 2 \\ 1 & -1 \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} x_1 + x_2 \\ x_1 - x_2 \end{pmatrix},$$
as expected.
Now consider $X = Y = F^n$, let $B = \{f_1, f_2, \dots, f_n\}$ and $C = \{g_1, g_2, \dots, g_n\}$ be two ordered bases of $F^n$, and let $T : F^n \to F^n$ be a linear map. Then we have
$$[T]_C = C^{-1} \begin{bmatrix} Tg_1 & Tg_2 & \cdots & Tg_n \end{bmatrix} = C^{-1} B [T]_B B^{-1} C = (B^{-1}C)^{-1} [T]_B (B^{-1}C). \tag{1.4.1}$$
Notice that $[T]_C$ and $[T]_B$ are both matrix representations of the same linear transformation, and they are related by a nonsingular matrix $S$ such that
$$[T]_C = S^{-1} [T]_B S. \tag{1.4.2}$$
Thus if one knows the matrix representation of a linear operator in some ordered basis $B$ and wishes to find the matrix representation in another ordered basis $C$, then it is easy to find using (1.4.1).
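Formula (1.4.1) can be sanity-checked with random bases. A minimal sketch (random square matrices are invertible with probability 1, which we assume here):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
TB = rng.standard_normal((n, n))   # [T]_B in some ordered basis B
Bm = rng.standard_normal((n, n))   # basis matrix B (assumed invertible)
Cm = rng.standard_normal((n, n))   # basis matrix C (assumed invertible)

S = np.linalg.solve(Bm, Cm)        # S = B^{-1} C
TC = np.linalg.solve(S, TB @ S)    # [T]_C = S^{-1} [T]_B S, as in (1.4.1)

# Both representations describe the same operator T v = B [T]_B B^{-1} v.
T1 = Bm @ TB @ np.linalg.inv(Bm)
T2 = Cm @ TC @ np.linalg.inv(Cm)
assert np.allclose(T1, T2)
```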
Lemma 1.5. Let $T : X \to X$ be a linear transformation from a vector space $X$ to itself and $B$ be an ordered basis of $X$ with $|B| = n$. Then the set of all possible matrix representations (over all choices of ordered basis) is
$$\left\{ S^{-1} [T]_B S \;\middle|\; S \in M_{n\times n} \text{ is invertible} \right\}.$$

The relation (1.4.1) between $[T]_B$ and $[T]_C$ is called similarity. In general, we have the following definition.


Definition 1.6. Let $A, B \in M_{n\times n}$ be two square matrices. $A$ is said to be similar to $B$ if there exists a nonsingular matrix $P$ such that $B = P^{-1}AP$.

Note that if $A$ is similar to $B$, then $B$ is similar to $A$. Thus we can simply say that $A$ and $B$ are similar. Also note that the only matrix similar to the identity matrix $I$ is $I$ itself, and the only matrix similar to the zero matrix is the zero matrix itself. We have already observed in Lemma 1.5 that if $A$ and $B$ are $n \times n$ matrices representing the same linear transformation $T$ on a vector space $V$, then $A$ and $B$ are similar.
Exercise 1.7. Suppose that $A$ and $B$ are similar $n \times n$ matrices. Show that
1. $\det(A) = \det(B)$,
2. $\operatorname{trace}(A) = \operatorname{trace}(B)$,
3. $\operatorname{rank}(A) = \operatorname{rank}(B)$.
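The three invariants in Exercise 1.7 are easy to confirm experimentally. A small sketch with a random similarity transform:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n))
P = rng.standard_normal((n, n))       # invertible with probability 1
B = np.linalg.solve(P, A @ P)         # B = P^{-1} A P

assert np.isclose(np.linalg.det(A), np.linalg.det(B))
assert np.isclose(np.trace(A), np.trace(B))
assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(B)
```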
Note that similarity is an equivalence relation on $M_{n\times n}$, the set of $n \times n$ matrices over the field $F$. Thus similarity partitions the set $M_{n\times n}$ into disjoint equivalence classes. Each equivalence class is the set of all matrices in $M_{n\times n}$ which are similar to each other, and two matrices belonging to different equivalence classes are not similar. That means each equivalence class represents a unique linear transformation. Matrices belonging to the same class share many important properties (e.g. rank, trace, determinant, etc.).
The following lemma gives another necessary condition for similarity.

Lemma 1.8. Let $A, B \in M_{n\times n}$ and let $A$ be similar to $B$. Then $A$ and $B$ have the same characteristic polynomial.

Proof. Given that $A$ is similar to $B$, there exists a nonsingular matrix $P$ such that $B = P^{-1}AP$. Then
$$P_B(t) = \det(tI - B) = \det(tI - P^{-1}AP) = \det[P^{-1}(tI - A)P] = \det(P^{-1})\det(tI - A)\det(P) = \det(tI - A) = P_A(t).$$

The above lemma shows that, given any linear transformation $T$, one can sensibly define the characteristic polynomial of the operator $T$ as the characteristic polynomial of any matrix representation of $T$.
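Lemma 1.8 can be checked exactly (no floating point) with SymPy. A small sketch:

```python
import sympy as sp

t = sp.symbols('t')
A = sp.Matrix([[1, 2], [3, 4]])
P = sp.Matrix([[1, 1], [0, 1]])       # any invertible matrix
B = P.inv() * A * P

# Similar matrices share the characteristic polynomial.
assert sp.expand(A.charpoly(t).as_expr() - B.charpoly(t).as_expr()) == 0
```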
Example 1.9. The converse of the above lemma is not true; that is, having the same characteristic polynomial (hence the same eigenvalues) is a necessary but not a sufficient condition for similarity. Consider
$$\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} \quad \text{and} \quad \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix},$$
which have the same eigenvalue (i.e. $0$) but are not similar, as the ranks of the two matrices are not equal.


We have already noticed from Lemma 1.8 that similar matrices have the same eigenvalues, and if a matrix $A$ is similar to some diagonal matrix, then the diagonal elements of that diagonal matrix are the eigenvalues of $A$. The class of matrices which are similar to a diagonal matrix is called diagonalizable.

Definition 1.10 (Diagonalizable). A matrix $A \in M_{n\times n}$ is said to be diagonalizable if it is similar to a diagonal matrix.
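Numerically, diagonalizability can be tested via the criterion of Exercise 1.18 below: $A$ is diagonalizable if and only if it has $n$ linearly independent eigenvectors. A sketch (the helper name and the rank tolerance are ours):

```python
import numpy as np

def is_diagonalizable(A, tol=1e-8):
    """A is diagonalizable iff the matrix of its eigenvectors has full
    rank, i.e. iff A has n linearly independent eigenvectors."""
    _, V = np.linalg.eig(A)
    return np.linalg.matrix_rank(V, tol=tol) == A.shape[0]

# A matrix with distinct eigenvalues: yes; a nonzero nilpotent block: no.
assert is_diagonalizable(np.array([[1.0, 0.0], [0.0, 2.0]]))
assert not is_diagonalizable(np.array([[0.0, 1.0], [0.0, 0.0]]))
```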

## 1.1 Exercise
Exercise 1.11. Let $T$ be a linear operator on $F^n$, let $A$ be the matrix of $T$ in the standard ordered basis for $F^n$, and let $W$ be the subspace of $F^n$ spanned by the column vectors of $A$. What does $W$ have to do with $T$?
Exercise 1.12. If $A, B \in M_{n\times n}(\mathbb{C})$ are similar, then for any polynomial $f(x)$ in $x$, $f(A)$ and $f(B)$ are similar.

Exercise 1.13. If $\lambda$ is an eigenvalue of $A$, then $f(\lambda)$ is an eigenvalue of $f(A)$. In particular, $\lambda^k$ is an eigenvalue of $A^k$.

Exercise 1.14. If $AP = QA$ for diagonal $P$ and $Q$, then $A f(P) = f(Q) A$.

Exercise 1.15. Show that there are no matrices $A$ and $B$ in $M_{n\times n}$ such that $AB - BA = I_n$.

Exercise 1.16. Suppose that $A$ and $B$ are $n \times n$ matrices. Show that if $A$ is similar to $B$, then $A^n$ is similar to $B^n$ for all $n \geq 2$.
Exercise 1.17. Let $\theta$ be a real number. Prove that the following matrices are similar over the field of complex numbers:
$$\begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}, \quad \begin{pmatrix} e^{i\theta} & 0 \\ 0 & e^{-i\theta} \end{pmatrix}.$$
Exercise 1.18. Show that an $n \times n$ square matrix is similar to a diagonal matrix if and only if the matrix has $n$ linearly independent eigenvectors. Does the matrix have to have $n$ distinct eigenvalues?

Exercise 1.19. Let $A \in M_{n\times n}$. If $A$ has $n$ eigenvalues distinct from each other, then $A$ is diagonalizable.

Exercise 1.20. Let $A \in M_{n\times n}$. If the matrix $A$ commutes with a matrix with all distinct eigenvalues, then $A$ is diagonalizable.
Exercise 1.21. Find the eigenvalues and corresponding eigenvectors of the matrix
$$A = \begin{pmatrix} 1 & 2 & 2 \\ 2 & 1 & 2 \\ 2 & 2 & 1 \end{pmatrix}.$$
Then find an invertible matrix $P$ such that $P^{-1}AP$ is diagonal.


Exercise 1.22. Show that the following matrix is not similar to a diagonal matrix:
$$A = \begin{pmatrix} 2 & 2 & 2 \\ 1 & 4 & 2 \\ 1 & 3 & 1 \end{pmatrix}.$$

Exercise 1.23. Show that if two real square matrices are similar over $\mathbb{C}$, then they must be similar over $\mathbb{R}$. What if "real" is changed to "rational"? Find a rational matrix $M$ such that
$$M^{-1}\begin{pmatrix} 1 & 2 \\ 2 & -1 \end{pmatrix} M = \begin{pmatrix} 2 & 1 \\ 1 & -2 \end{pmatrix}.$$

Exercise 1.24. For any $n \times n$ complex matrix $A$, show that $A$ and $A^t$ are similar. Are $A$ and $A^*$ necessarily similar? Can $A$ be similar to $A + I$?

Exercise 1.25. Show that every matrix $A$ satisfying $A^2 = A$ is similar to a diagonal matrix.

## 2 Jordan Canonical Forms

Most problems in mathematics related to a matrix can be solved easily if the matrix is similar to a diagonal matrix, that is, if the matrix is diagonalizable. For example, this is true in computing powers $A^n$, or in solving a linear differential equation $x'(t) = Ax(t)$. So a sufficient condition for similarity is very much needed.

We have already noticed that equality of characteristic polynomial, determinant, rank, and trace are necessary conditions, not sufficient conditions, for similarity. Notice also that combining all of the above listed properties still does not give a sufficient condition for similarity. For example, consider

$$A = \begin{pmatrix} 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 0 \end{pmatrix} \quad \text{and} \quad B = \begin{pmatrix} 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix}.$$
Both of the above matrices have the same eigenvalues, and hence they have the same characteristic polynomial, trace, and determinant. They also have the same rank, but they are not similar, as $A^2 = 0$ and $B^2 \neq 0$.
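The claims about this pair can be checked directly. A sketch, with $A = J_2(0) \oplus J_2(0)$ and $B = J_3(0) \oplus J_1(0)$:

```python
import numpy as np

# A = J2(0) ⊕ J2(0) and B = J3(0) ⊕ J1(0), as above.
A = np.zeros((4, 4)); A[0, 1] = A[2, 3] = 1.0
B = np.zeros((4, 4)); B[0, 1] = B[1, 2] = 1.0

# Equal rank (and both have characteristic polynomial t^4) ...
assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(B) == 2
# ... yet A^2 = 0 while B^2 != 0, so A and B cannot be similar.
assert np.allclose(A @ A, 0)
assert not np.allclose(B @ B, 0)
```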
Question: How do we get a sufficient condition for similarity? That is, how can one tell whether two given matrices are similar or not?

One approach to determining whether two given square matrices $A$ and $B$ are similar is to set up a special class of matrices: if both given matrices can be reduced by similarity to the same special matrix, then we can tell that $A$ and $B$ must be similar, as similarity is an equivalence relation. If not, then we would like to be able to conclude that $A$ and $B$ are not similar. What sets of special matrices would be suitable for this purpose? The following theorem gives some idea about special matrices.


Theorem 2.1 (Schur triangularization). Let $A \in M_{n\times n}(\mathbb{C})$ have eigenvalues $\lambda_1, \lambda_2, \dots, \lambda_n$. Then there exists a unitary matrix $U \in M_{n\times n}(\mathbb{C})$ such that $U^* A U = T = [t_{ij}]$ is upper triangular with diagonal entries $t_{ii} = \lambda_i$, $i = 1, 2, \dots, n$.
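SciPy's `scipy.linalg.schur` computes this factorization. A sketch, using `output='complex'` so that $T$ is triangular even when a real input has complex eigenvalues:

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))

T, U = schur(A, output='complex')      # A = U T U* with U unitary

assert np.allclose(np.triu(T), T)                 # T is upper triangular
assert np.allclose(U @ U.conj().T, np.eye(4))     # U is unitary
assert np.allclose(U @ T @ U.conj().T, A)         # A = U T U*
```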

The above theorem says that every square matrix is similar to an upper triangular matrix. Thus the set of upper triangular matrices could serve the purpose, but two upper triangular matrices with the same main diagonal and some different off-diagonal entries can still be similar, so we have a uniqueness problem. For example, consider
$$\begin{pmatrix} 1 & 3 & 0 \\ 0 & 2 & 4 \\ 0 & 0 & 3 \end{pmatrix} \quad \text{and} \quad \begin{pmatrix} 1 & 0 & 0 \\ 0 & 2 & 5 \\ 0 & 0 & 3 \end{pmatrix}.$$
Thus if we reduce $A$ and $B$ to two unequal upper triangular matrices with the same main diagonal, we cannot conclude from this fact alone that $A$ and $B$ are not similar.

So the class of upper triangular matrices does not serve our purpose; it is too large. But what about the smaller class of diagonal matrices? In that case uniqueness is no longer an issue, but now we have an existence problem: some similarity equivalence classes contain no diagonal matrices.

In view of the above facts we will find a suitable class of upper triangular matrices. This class of matrices is called the Jordan matrices.

Definition 2.2 (Jordan block). A Jordan block $J_k(\lambda)$ is a $k \times k$ upper triangular matrix of the form
$$J_k(\lambda) = \begin{pmatrix} \lambda & 1 & 0 & \cdots & 0 \\ 0 & \lambda & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \ddots & \vdots \\ 0 & 0 & 0 & \lambda & 1 \\ 0 & 0 & 0 & 0 & \lambda \end{pmatrix}.$$

Definition (Jordan matrix). A Jordan matrix $J \in M_{n\times n}$ is a direct sum of Jordan blocks $J_{n_1}(\lambda_1), J_{n_2}(\lambda_2), \dots, J_{n_k}(\lambda_k)$, where $n_1 + n_2 + \cdots + n_k = n$, that is,
$$J = \begin{pmatrix} J_{n_1}(\lambda_1) & 0 & \cdots & 0 \\ 0 & J_{n_2}(\lambda_2) & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & J_{n_k}(\lambda_k) \end{pmatrix}. \tag{2.2.1}$$


The following theorem says that any square matrix $A$ is similar to a Jordan matrix, unique up to permutations of the Jordan blocks. The French mathematician Camille Jordan (1838-1922) first published a proof of Theorem 2.3.

Theorem 2.3 (Jordan canonical form). Let $A \in M_{n\times n}$. Then there exists a unique (up to permutations of the Jordan blocks) Jordan matrix $J \in M_{n\times n}$ such that $A$ is similar to $J$. This $J$ is called the Jordan canonical form of $A$.

The proof of Theorem 2.3 may be beyond a beginning linear algebra course. The interested reader can find the proof in *Matrix Analysis*, R. Horn and C. Johnson, Cambridge University Press, 2013, p. 165. Here we are only concerned with how to find the Jordan canonical form $J$ of $A$.

If $J$ is the Jordan canonical form of a matrix $A$, then they have the same eigenvalues and the same number of linearly independent eigenvectors, but not the same set of them in general. (Note that $x$ is an eigenvector of $J = P^{-1}AP$ if and only if $Px$ is an eigenvector of $A$.)
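SymPy can compute the Jordan canonical form exactly. A sketch using `Matrix.jordan_form`, which returns a pair $(P, J)$ with $A = P J P^{-1}$:

```python
import sympy as sp

A = sp.Matrix([[3, 1],
               [-1, 1]])     # characteristic polynomial (t - 2)^2

P, J = A.jordan_form()

# A is not diagonalizable: its Jordan canonical form is one block J_2(2).
assert J == sp.Matrix([[2, 1], [0, 2]])
assert sp.simplify(P * J * P.inv() - A) == sp.zeros(2, 2)
```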

Definition 2.4 (Generalized eigenvector). A nonzero vector $x \in F^n$ is said to be a generalized eigenvector of $A \in M_{n\times n}$ of rank $k$ belonging to an eigenvalue $\lambda$ if
$$(A - \lambda I)^k x = 0 \quad \text{and} \quad (A - \lambda I)^{k-1} x \neq 0.$$

Exercise 2.5. Let $x$ be a generalized eigenvector of $A$ of rank $k$ belonging to an eigenvalue $\lambda$. Then $\{x_i = (A - \lambda I)^i x : i = 0, 1, \dots, k-1\}$ is linearly independent.

Solution. Consider $c_i \in F$, $i = 0, \dots, k-1$, such that
$$\sum_{i=0}^{k-1} c_i x_i = 0. \tag{2.5.1}$$
Multiplying both sides of the above equation by $(A - \lambda I)^{k-1}$ gives $c_0 = 0$, since every term with $i \geq 1$ is killed by $(A - \lambda I)^k x = 0$ while $(A - \lambda I)^{k-1} x \neq 0$. Doing the same in (2.5.1) with $(A - \lambda I)^{k-2}$ gives $c_1 = 0$. Proceeding successively, one can show that $c_i = 0$ for all $i = 0, \dots, k-1$. $\square$
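The chain in Exercise 2.5 can be seen concretely for a single Jordan block. A sketch:

```python
import numpy as np

# For A = J_3(2), x = e_3 is a generalized eigenvector of rank 3 for the
# eigenvalue 2: (A - 2I)^3 x = 0 but (A - 2I)^2 x != 0.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 2.0]])
x = np.array([0.0, 0.0, 1.0])
N = A - 2.0 * np.eye(3)

assert np.allclose(np.linalg.matrix_power(N, 3) @ x, 0)
assert not np.allclose(np.linalg.matrix_power(N, 2) @ x, 0)

# The chain {x, Nx, N^2 x} is linearly independent (full-rank matrix).
chain = np.column_stack([np.linalg.matrix_power(N, i) @ x for i in range(3)])
assert np.linalg.matrix_rank(chain) == 3
```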
Properties: Let $k \geq 2$ be given, let $e_i \in \mathbb{C}^k$ denote the $i$-th element of the standard basis of $\mathbb{C}^k$, and let $x \in \mathbb{C}^k$. Then the following hold:
1. $J_k(0)^t J_k(0) = \begin{pmatrix} 0 & 0 \\ 0 & I_{k-1} \end{pmatrix}$;
2. $J_k(0)^p = 0$ if $p \geq k$;
3. $J_k(0)\, e_{i+1} = e_i$, $i = 1, 2, \dots, k-1$;
4. $\left( I - J_k^t(0) J_k(0) \right) x = \begin{pmatrix} x_1 \\ 0 \end{pmatrix} = \langle x, e_1 \rangle\, e_1$.
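All four properties are easy to verify mechanically. A sketch for $k = 4$:

```python
import numpy as np

k = 4
Jk0 = np.diag(np.ones(k - 1), 1)   # J_k(0): ones on the first superdiagonal
e = np.eye(k)                      # columns are the standard basis e_1..e_k
x = np.arange(1.0, k + 1)          # an arbitrary test vector

# 1. J_k(0)^t J_k(0) = 0 ⊕ I_{k-1}
assert np.allclose(Jk0.T @ Jk0, np.diag([0.0] + [1.0] * (k - 1)))
# 2. J_k(0)^p = 0 for p >= k
assert np.allclose(np.linalg.matrix_power(Jk0, k), 0)
# 3. J_k(0) e_{i+1} = e_i
assert all(np.allclose(Jk0 @ e[:, i + 1], e[:, i]) for i in range(k - 1))
# 4. (I - J_k(0)^t J_k(0)) x = <x, e_1> e_1
assert np.allclose((np.eye(k) - Jk0.T @ Jk0) @ x, x[0] * e[:, 0])
```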

## 2.1 Number of Jordan Blocks

Before starting anything, let us see an example which illustrates which matrices $A$ have a given Jordan canonical form $J$, and how the number of linearly independent eigenvectors of $A$ (or $J$) corresponds to the Jordan blocks in $J$.

Example 2.6. Let $J$ be the Jordan canonical matrix
$$J = \begin{pmatrix} 6 & 1 & 0 & 0 & 0 \\ 0 & 6 & 0 & 0 & 0 \\ 0 & 0 & 2 & 1 & 0 \\ 0 & 0 & 0 & 2 & 0 \\ 0 & 0 & 0 & 0 & 2 \end{pmatrix} = \begin{pmatrix} J_2(6) & 0 & 0 \\ 0 & J_2(2) & 0 \\ 0 & 0 & J_1(2) \end{pmatrix}.$$
Find the number of linearly independent eigenvectors of $J$ and determine all matrices whose Jordan canonical form is $J$.

Solution. Clearly the eigenvalues of $J$ are $6$ and $2$ with multiplicities $2$ and $3$, respectively. Since the nullity of $(J - 6I)$ is $5 - \operatorname{rank}(J - 6I) = 1$, there is a single linearly independent eigenvector corresponding to the eigenvalue $6$. Similarly, the nullity of $(J - 2I)$ is $5 - 3 = 2$, so there are two linearly independent eigenvectors corresponding to the eigenvalue $2$. Notice that in $J$ we have a single block for eigenvalue $6$ and two blocks for eigenvalue $2$.

Hence, one can conclude that if a matrix $A$ is similar to $J$, then $A$ is a $5 \times 5$ matrix whose eigenvalues are $6$ and $2$ with algebraic multiplicities $2$ and $3$, and geometric multiplicities $1$ and $2$, respectively. Moreover, the converse is also true by Theorem 2.3. $\square$
In general, one can say that if a matrix $A$ is similar to a Jordan canonical matrix $J$, then both matrices have the same eigenvalues with the same algebraic and geometric multiplicities.

Exercise 2.7. Let $J$ be the Jordan canonical form of $A$. Then show that $\operatorname{rank}(A - \lambda I)^k = \operatorname{rank}(J - \lambda I)^k$ for any $k \in \mathbb{N}$, and that the numbers of linearly independent eigenvectors of $A$ and $J$ are equal.

Furthermore, if $J \in M_{n\times n}$ is the Jordan canonical form of $A \in M_{n\times n}$ and $\lambda$ is an eigenvalue of $A$, then in the sequence $J - \lambda I, (J - \lambda I)^2, (J - \lambda I)^3, \dots$, all Jordan blocks in $J$ belonging to the eigenvalue $\lambda$ eventually become zero matrices, while all other blocks (belonging to an eigenvalue different from $\lambda$) remain upper triangular matrices with nonzero diagonal entries. Hence the sequence $\operatorname{rank}(J - \lambda I)^k$ must stop decreasing at $n - a_\lambda$, where $a_\lambda$ is the algebraic multiplicity of $\lambda$.


Exercise 2.8. Let $J$ be the Jordan canonical form of $A$ and $\lambda$ be an eigenvalue of $A$. Then the size of the largest Jordan block for $\lambda$ (called the index of $\lambda$) is the least positive integer $k$ such that $\operatorname{rank}(A - \lambda I)^k = \operatorname{rank}(A - \lambda I)^{k+1}$.

Let $m_i$ be the number of Jordan blocks for $\lambda$ of size at least $i$. Notice that $J_k(\lambda) - \lambda I = J_k(0)$ has rank $k - 1$, and for $k \geq 2$, $(J_k(\lambda) - \lambda I)^2 = J_k(0)^2$ has rank $k - 2$. Thus the number of Jordan blocks for $\lambda$ in $J$ can be calculated from the following equations:
$$m_1 = n - \operatorname{rank}(A - \lambda I), \tag{2.8.1}$$
$$m_i = \operatorname{rank}(A - \lambda I)^{i-1} - \operatorname{rank}(A - \lambda I)^i, \quad \text{for } i \in \mathbb{N},\ i \geq 2. \tag{2.8.2}$$
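Formulas (2.8.1)-(2.8.2) translate directly into code. The following sketch (the helper name and rank tolerance are ours) returns, for each size $s \geq 1$, how many Jordan blocks of size exactly $s$ the eigenvalue $\lambda$ has, using $m_s - m_{s+1}$:

```python
import numpy as np

def jordan_block_sizes(A, lam, tol=1e-8):
    """Counts of Jordan blocks of size exactly 1, 2, 3, ... for eigenvalue
    lam, from the rank sequence of (A - lam I)^i via (2.8.1)-(2.8.2)."""
    n = A.shape[0]
    M = A - lam * np.eye(n)
    ranks = [n]                       # rank of (A - lam I)^0 is n
    P = np.eye(n)
    while True:
        P = P @ M
        ranks.append(np.linalg.matrix_rank(P, tol=tol))
        if ranks[-1] == ranks[-2]:    # rank sequence stopped decreasing
            break
    m = [ranks[i] - ranks[i + 1] for i in range(len(ranks) - 1)]  # m_1, m_2, ...
    return [m[i] - (m[i + 1] if i + 1 < len(m) else 0) for i in range(len(m))]

# The matrix J of Example 2.6: one J_2(6), one J_2(2), one J_1(2).
J = np.diag([6.0, 6, 2, 2, 2]); J[0, 1] = J[2, 3] = 1.0
```

On this $J$ the function reports one block of size 2 for $\lambda = 6$, and one block of size 1 plus one of size 2 for $\lambda = 2$, matching Example 2.6.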

The following observations about Jordan blocks will help us to find the Jordan canonical form of a matrix.

Observations:

1. For the matrix $J_k(\lambda)$, the algebraic multiplicity of $\lambda$ is $k$ but the geometric multiplicity of $\lambda$ is $1$.
2. If $J$, the Jordan canonical form of $A$, has $k$ Jordan blocks for $\lambda$, then $\operatorname{rank}(J - \lambda I) = n - k = \operatorname{rank}(A - \lambda I)$, i.e. $A$ has $k$ linearly independent eigenvectors for $\lambda$.
3. The number of Jordan blocks for $\lambda$ in the Jordan canonical form of $A$ is the geometric multiplicity of $\lambda$.
4. The sum of the sizes of all Jordan blocks for $\lambda$ in the Jordan canonical form of $A$ is the algebraic multiplicity of $\lambda$.

## 2.2 Minimal Polynomial

We know that a polynomial $P(t)$ is said to be an annihilating polynomial of $A$ if $P(A) = 0$. In that case we say that $P$ annihilates $A$. The Cayley-Hamilton theorem guarantees that for each $A \in M_{n\times n}$ there is a monic polynomial $P_A(t)$ of degree $n$ (the characteristic polynomial) such that $P_A(A) = 0$. Of course, there may be a monic polynomial of degree $n - 1$ that annihilates $A$, or one of degree $n - 2$ or less. So one can try to find a monic polynomial of minimum degree that annihilates $A$. It is clear that such a polynomial exists; the following theorem says that it is unique.

Theorem 2.9. Let $A \in M_{n\times n}$. Then there is a unique monic nonzero polynomial $Q(t)$ of minimum degree such that $Q(A) = 0$, and $Q(t)$ divides $P(t)$ for any annihilating polynomial $P(t)$ of $A$.

In view of the above theorem, one can define the minimal polynomial of a matrix $A$.


Definition 2.10. The minimal polynomial of a matrix $A$ is the unique monic polynomial $Q_A(t)$ of smallest degree such that $Q_A(A) = 0$.

Clearly, the minimal polynomial $Q_A(t)$ divides any polynomial $P(t)$ satisfying $P(A) = 0$ (why?). In particular, the minimal polynomial divides the characteristic polynomial.

Lemma 2.11. Similar matrices have the same minimal polynomial.

Example 2.12. Notice that the converse of the above lemma is not true: having the same minimal polynomial does not imply that two matrices are similar. For example, consider, for any $a \in F$,
$$A = \begin{pmatrix} a & 1 & 0 & 0 \\ 0 & a & 0 & 0 \\ 0 & 0 & a & 0 \\ 0 & 0 & 0 & a \end{pmatrix} \quad \text{and} \quad B = \begin{pmatrix} a & 1 & 0 & 0 \\ 0 & a & 0 & 0 \\ 0 & 0 & a & 1 \\ 0 & 0 & 0 & a \end{pmatrix}.$$
In the above example both matrices have the same characteristic polynomial $(t-a)^4$ as well as the same minimal polynomial $(t-a)^2$, but they are not similar, since $\operatorname{rank}(A - aI) = 1 \neq 2 = \operatorname{rank}(B - aI)$. Hence having the same minimal polynomial together with the same characteristic polynomial is still not a sufficient condition for similarity.

Theorem 2.13. Let $A \in M_{n\times n}$ be a given matrix whose distinct eigenvalues are $\lambda_1, \dots, \lambda_k$. The minimal polynomial of $A$ is
$$Q_A(t) = \prod_{i=1}^{k} (t - \lambda_i)^{r_i},$$
where $r_i$ is the size of the largest Jordan block of $A$ corresponding to the eigenvalue $\lambda_i$.

The converse of the above theorem is also true: if we know the minimal polynomial, then we can read off the size of the largest Jordan block of $A$ corresponding to each eigenvalue.

In practice, this result is not very helpful in computing the minimal polynomial, since it is usually harder to determine the Jordan canonical form of a matrix than it is to determine its minimal polynomial.

The following algorithm finds the minimal polynomial of a given matrix $A$:

Step 1: Compute the eigenvalues of $A$, together with their algebraic multiplicities; write the characteristic polynomial and factor it completely.

Step 2: Starting with the product of the distinct linear factors of the characteristic polynomial, check which products of factors (of increasing degree) annihilate $A$; the monic one of least degree that does is the minimal polynomial.
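The two-step procedure can be implemented with SymPy. A sketch (the function name is ours; it brute-forces exponent patterns, which is fine for small matrices):

```python
import sympy as sp
from itertools import product

def minimal_poly(A, t):
    """Step 1: eigenvalues with algebraic multiplicities (the factored
    characteristic polynomial). Step 2: test monic products of factors,
    keeping the lowest-degree one that annihilates A."""
    n = A.shape[0]
    eigs = A.eigenvals()              # {eigenvalue: algebraic multiplicity}
    best = None
    for exps in product(*[range(1, m + 1) for m in eigs.values()]):
        q = sp.Poly(sp.prod([(t - lam) ** e
                             for lam, e in zip(eigs.keys(), exps)]), t)
        qA = sp.zeros(n, n)
        for (e,), c in q.terms():     # evaluate q(A) term by term
            qA += c * A ** e
        if qA == sp.zeros(n, n) and (best is None or q.degree() < best.degree()):
            best = q
    return best.as_expr()

# Example 2.12 with a = 2: the minimal polynomial is (t - 2)^2.
t = sp.symbols('t')
A = sp.diag(sp.Matrix([[2, 1], [0, 2]]), 2, 2)
```

On this $A$ the function returns $t^2 - 4t + 4 = (t-2)^2$, as Theorem 2.13 predicts.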


## 2.3 The real Jordan canonical form

Suppose that $A \in M_{n\times n}(\mathbb{R})$; then any non-real eigenvalues of $A$ must occur in complex conjugate pairs. We have
$$\operatorname{rank}(A - \lambda I)^k = \operatorname{rank}\,\overline{(A - \lambda I)^k} = \operatorname{rank}(A - \bar\lambda I)^k$$
for any $\lambda \in \mathbb{C}$ and all $k = 1, 2, \dots$. Hence the Jordan structure of $A$ corresponding to any eigenvalue $\lambda$ is the same as the Jordan structure of $A$ corresponding to the eigenvalue $\bar\lambda$; thus all the Jordan blocks of $A$ with non-real eigenvalues occur in conjugate pairs of equal size. Also notice that the matrix
$$\begin{pmatrix} \lambda & 1 & 0 & 0 \\ 0 & \lambda & 0 & 0 \\ 0 & 0 & \bar\lambda & 1 \\ 0 & 0 & 0 & \bar\lambda \end{pmatrix} = \begin{pmatrix} J_2(\lambda) & 0 \\ 0 & J_2(\bar\lambda) \end{pmatrix}$$
is permutation similar to the block upper triangular matrix
$$\begin{pmatrix} \lambda & 0 & 1 & 0 \\ 0 & \bar\lambda & 0 & 1 \\ 0 & 0 & \lambda & 0 \\ 0 & 0 & 0 & \bar\lambda \end{pmatrix} = \begin{pmatrix} D(\lambda) & I_2 \\ 0 & D(\lambda) \end{pmatrix},$$
where $D(\lambda) = \begin{pmatrix} \lambda & 0 \\ 0 & \bar\lambda \end{pmatrix}$. In general, any Jordan matrix of the form
 
$$\begin{pmatrix} J_k(\lambda) & 0 \\ 0 & J_k(\bar\lambda) \end{pmatrix} \in M_{2k\times 2k}$$
is permutation similar to the block upper triangular matrix
$$\begin{pmatrix} D(\lambda) & I_2 & & \\ & D(\lambda) & \ddots & \\ & & \ddots & I_2 \\ & & & D(\lambda) \end{pmatrix} \in M_{2k\times 2k}, \tag{2.13.1}$$
which has $k$ blocks $D(\lambda)$ of size $2 \times 2$ on the main diagonal and $k - 1$ blocks $I_2$ on the block superdiagonal.
Now let $\lambda = a + ib$, $a, b \in \mathbb{R}$. Then $D(\lambda)$ is similar to the real matrix
$$C(a, b) = \begin{pmatrix} a & b \\ -b & a \end{pmatrix}.$$

Also, every block matrix of the form (2.13.1) with non-real $\lambda$ is similar to a real block matrix


of the form
$$C_k(a, b) = \begin{pmatrix} C(a,b) & I_2 & & \\ & C(a,b) & \ddots & \\ & & \ddots & I_2 \\ & & & C(a,b) \end{pmatrix} \in M_{2k\times 2k}. \tag{2.13.2}$$
These observations lead us to the real Jordan canonical form theorem.

Theorem 2.14 (Real Jordan canonical form). Let $A \in M_{n\times n}(\mathbb{R})$ have non-real eigenvalues $\lambda_i = a_i + ib_i$ with $a_i, b_i \in \mathbb{R}$, $i = 1, \dots, p$ (taking one eigenvalue from each conjugate pair), and real eigenvalues $\lambda_q, \dots, \lambda_r$. Then $A$ is similar to a real block diagonal matrix of the form
$$\operatorname{diag}\big( C_{n_1}(a_1, b_1), \dots, C_{n_p}(a_p, b_p),\ J_{n_q}(\lambda_q), \dots, J_{n_r}(\lambda_r) \big), \tag{2.14.1}$$
where each real block triangular matrix $C_{n_k}(a_k, b_k) \in M_{2n_k\times 2n_k}$ is of the form (2.13.2) and corresponds to a pair of conjugate Jordan blocks $J_{n_k}(\lambda_k), J_{n_k}(\bar\lambda_k) \in M_{n_k\times n_k}$ with non-real $\lambda_k$ in the Jordan canonical form of $A$ given by Theorem 2.3.

Exercise 2.15. Let $\lambda = a + ib$, $a, b \in \mathbb{R}$. Then show that $D(\lambda)$ is similar to the real matrix $C(a, b)$.
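Exercise 2.15 has a concrete solution: assuming the convention $C(a,b) = \begin{pmatrix} a & b \\ -b & a \end{pmatrix}$, the columns of $S = \begin{pmatrix} 1 & 1 \\ i & -i \end{pmatrix}$ are eigenvectors of $C(a,b)$, and $S^{-1} C(a,b)\, S = D(\lambda)$. A numerical sketch:

```python
import numpy as np

a, b = 1.5, 2.0
lam = a + 1j * b
D = np.diag([lam, np.conj(lam)])            # D(lambda)
C = np.array([[a, b],
              [-b, a]])                     # C(a, b)

S = np.array([[1.0, 1.0],
              [1j, -1j]])                   # eigenvectors of C(a, b)

# S^{-1} C S = D(lambda), so D(lambda) and C(a, b) are similar.
assert np.allclose(np.linalg.solve(S, C @ S), D)
```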

## 2.4 Exercise
Exercise 2.16. Suppose that $A \in M_{n\times n}(\mathbb{R})$ and $A^2 = -I_n$. Show that $n$ must be even and that there is a nonsingular $S \in M_{n\times n}(\mathbb{R})$ such that
$$S^{-1} A S = \begin{pmatrix} 0 & I_{n/2} \\ -I_{n/2} & 0 \end{pmatrix}.$$
Exercise 2.17. Let $N_1$ and $N_2$ be $3 \times 3$ nilpotent matrices over the field $F$. Prove that $N_1$ and $N_2$ are similar if and only if they have the same minimal polynomial.
Exercise 2.18. Use the result of Exercise 2.17 and the Jordan form to prove the following: let $A$ and $B$ be $n \times n$ matrices over the field $F$ which have the same characteristic polynomial
$$f = (x - c_1)^{d_1} \cdots (x - c_k)^{d_k}$$
and the same minimal polynomial. If no $d_i$ is greater than $3$, then $A$ and $B$ are similar.


Exercise 2.19. Let $A$ and $B$ be $n \times n$ matrices over the field $F$ which have the same characteristic polynomial and the same minimal polynomial. Can you conclude that $A$ and $B$ are similar?

Solution. No. The minimal polynomial just tells us the size of the biggest Jordan block for each eigenvalue. For a counterexample, see the pair of matrices in Example 2.12.

Exercise 2.20. If $n \leq 3$, show that two $n \times n$ matrices are similar if and only if they have the same characteristic polynomial and the same minimal polynomial.
Exercise 2.21. If $A$ is a complex $5 \times 5$ matrix with characteristic polynomial
$$f = (x - 2)^3 (x + 7)^2$$
and minimal polynomial $p = (x - 2)^2 (x + 7)$, what is the Jordan form for $A$?
Exercise 2.22. How many possible Jordan forms are there for a $6 \times 6$ complex matrix with characteristic polynomial $(x + 2)^4 (x - 1)^2$?
Exercise 2.23. Find the Jordan form for $A$, where
$$A = \begin{pmatrix} 2 & 0 & 0 & 0 & 0 & 0 \\ 1 & 2 & 0 & 0 & 0 & 0 \\ 1 & 0 & 2 & 0 & 0 & 0 \\ 0 & 1 & 0 & 2 & 0 & 0 \\ 1 & 1 & 1 & 1 & 2 & 0 \\ 0 & 0 & 0 & 0 & 1 & 1 \end{pmatrix}.$$
Exercise 2.24. Show that every projection matrix $A$ (i.e., idempotent matrix, $A^2 = A$) is diagonalizable. What is the minimal polynomial of $A$? What can you say if $A$ is tripotent ($A^3 = A$)? What if $A^k = A$?
Exercise 2.25. Classify up to similarity all $3 \times 3$ complex matrices $A$ such that $A^3 = I$.

Exercise 2.26. Classify up to similarity all $n \times n$ complex matrices $A$ such that $A^n = I$.

Exercise 2.27. Show that there is no real $3$-by-$3$ matrix whose minimal polynomial is $x^2 + 1$, but that there is a real $2$-by-$2$ matrix as well as a complex $3$-by-$3$ matrix with this property.
Exercise 2.28. If $N$ is a $k \times k$ elementary nilpotent matrix, i.e., $N^k = 0$ but $N^{k-1} \neq 0$, show that $N^t$ is similar to $N$. Now use the Jordan form to prove that every matrix $A \in M_{n\times n}(\mathbb{C})$ is similar to its transpose $A^t$.
Exercise 2.29. Let $N_1$ and $N_2$ be $6 \times 6$ nilpotent matrices over the field $F$. Suppose that $N_1$ and $N_2$ have the same minimal polynomial and the same nullity. Prove that $N_1$ and $N_2$ are similar. Show that this is not true for $7 \times 7$ nilpotent matrices.
