
Adjugate matrices

All proofs below use the notion of the adjugate matrix adj(M) of an n × n matrix M, the transpose of
its cofactor matrix. This is a matrix whose coefficients are given by polynomial expressions in the
coefficients of M (in fact, by certain (n − 1) × (n − 1) determinants), in such a way that the following
fundamental relations hold:

$$\operatorname{adj}(M)\,M = \det(M)\,I_n = M\,\operatorname{adj}(M).$$

These relations are a direct consequence of the basic properties of determinants: evaluation of
the (i, j) entry of the matrix product on the left gives the expansion by column j of the
determinant of the matrix obtained from M by replacing column i by a copy of column j, which
is det(M) if i = j and zero otherwise; the matrix product on the right is similar, but for expansions
by rows.
Being a consequence of just algebraic expression manipulation, these relations are valid for
matrices with entries in any commutative ring (commutativity must be assumed for determinants
to be defined in the first place). This is important to note here, because these relations will be
applied below for matrices with non-numeric entries such as polynomials.
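As a concrete sanity check (a sketch added here, not part of the original text), the fundamental relations can be verified for a small integer matrix; the helper functions det, adjugate, and matmul, and the example matrix M, are names introduced just for this illustration.

```python
from itertools import permutations

def det(M):
    """Determinant via the Leibniz formula (fine for small n, exact over the integers)."""
    n = len(M)
    total = 0
    for perm in permutations(range(n)):
        sign = 1
        for i in range(n):               # count inversions to get the permutation's sign
            for j in range(i + 1, n):
                if perm[i] > perm[j]:
                    sign = -sign
        prod = 1
        for i, p in enumerate(perm):
            prod *= M[i][p]
        total += sign * prod
    return total

def adjugate(M):
    """Transpose of the cofactor matrix: adj(M)[j][i] = (-1)**(i+j) * det(minor of (i, j))."""
    n = len(M)
    def minor(i, j):
        return [[M[r][c] for c in range(n) if c != j] for r in range(n) if r != i]
    return [[(-1) ** (i + j) * det(minor(i, j)) for i in range(n)] for j in range(n)]

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

M = [[2, 0, 1], [1, 3, 2], [1, 1, 2]]    # arbitrary 3x3 integer matrix, det(M) = 6
d = det(M)
scaled_identity = [[d if i == j else 0 for j in range(3)] for i in range(3)]
assert matmul(adjugate(M), M) == scaled_identity   # adj(M) M = det(M) I_n
assert matmul(M, adjugate(M)) == scaled_identity   # M adj(M) = det(M) I_n
```

Because every step uses only ring operations (no division), the same check works verbatim over any commutative ring of entries.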

A direct algebraic proof


This proof uses just the kind of objects needed to formulate the Cayley–Hamilton theorem:
matrices with polynomials as entries. The matrix t In − A whose determinant is the characteristic
polynomial of A is such a matrix, and since polynomials form a commutative ring, it has
an adjugate

$$B := \operatorname{adj}(t I_n - A).$$

Then, according to the right-hand fundamental relation of the adjugate, one has

$$(t I_n - A)\,B = \det(t I_n - A)\,I_n = p(t)\,I_n,$$

where p(t) denotes the characteristic polynomial of A.

Since B is also a matrix with polynomials in t as entries, one can, for each i, collect the coefficients of t^i in each entry to form a matrix B_i of numbers, such that one has

$$B = \sum_{i=0}^{n-1} t^i B_i.$$

(The way the entries of B are defined makes clear that no powers higher than t^{n-1} occur.) While this looks like a polynomial with matrices as coefficients, we shall not consider such a notion; it is just a way to write a matrix with polynomial entries as a linear combination of n constant matrices, and the coefficient t^i has been written to the left of the matrix to stress this point of view.
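As an aside (not part of the original proof), the decomposition of B into the constant matrices B_i can be computed explicitly with sympy for a small example; the 2×2 matrix A below is an arbitrary assumption of this sketch, and Matrix.adjugate is sympy's built-in adjugate.

```python
import sympy as sp

t = sp.symbols('t')
n = 2
A = sp.Matrix([[1, 2], [3, 4]])          # arbitrary example matrix
M = t * sp.eye(n) - A                    # t I_n - A, entries are polynomials in t
B = M.adjugate()                         # B = adj(t I_n - A)

# Collect the coefficient matrices B_i of t**i, entry by entry.
Bs = [B.applyfunc(lambda e, i=i: e.coeff(t, i)) for i in range(n)]

# B is recovered as the linear combination sum_i t**i * B_i ...
recombined = sum((t**i * Bs[i] for i in range(n)), sp.zeros(n))
assert sp.expand(recombined - B) == sp.zeros(n)
# ... and the right-hand fundamental relation (t I_n - A) B = p(t) I_n holds.
p = M.det()
assert (M * B - p * sp.eye(n)).applyfunc(sp.expand) == sp.zeros(n)
```

For this A one finds B_1 = I_2 and B_0 constant, matching the claim that no power above t^{n-1} occurs.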
Now, one can expand the matrix product in our equation by bilinearity:

$$p(t)\,I_n = (t I_n - A)\sum_{i=0}^{n-1} t^i B_i = \sum_{i=0}^{n-1} t^{i+1} B_i - \sum_{i=0}^{n-1} t^i A B_i = t^n B_{n-1} + \sum_{i=1}^{n-1} t^i \bigl(B_{i-1} - A B_i\bigr) - A B_0.$$

Writing

$$p(t)\,I_n = t^n I_n + t^{n-1} c_{n-1} I_n + \cdots + t\,c_1 I_n + c_0 I_n,$$

one obtains an equality of two matrices with polynomial entries, written as linear combinations of constant matrices with powers of t as coefficients. Such an equality can hold only if in any matrix position the entry that is multiplied by a given power t^i is the same on both sides; it follows that the constant matrices with coefficient t^i in both expressions must be equal. Writing these equations for i from n down to 0, one finds

$$B_{n-1} = I_n, \qquad B_{i-1} - A\,B_i = c_i\,I_n \quad (0 < i < n), \qquad -A\,B_0 = c_0\,I_n.$$

Finally, multiply the equation of the coefficients of t^i from the left by A^i, and sum up:

$$A^n B_{n-1} + \sum_{i=1}^{n-1} \bigl(A^i B_{i-1} - A^{i+1} B_i\bigr) - A\,B_0 = A^n + c_{n-1} A^{n-1} + \cdots + c_1 A + c_0 I_n.$$

The left-hand sides form a telescoping sum and cancel completely; the right-hand sides add up to p(A):

$$0 = p(A).$$
This completes the proof.
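The conclusion p(A) = 0 can be sanity-checked on a concrete example; the 2×2 matrix below and its hand-computed characteristic polynomial t^2 - 5t - 2 are illustrative assumptions of this sketch.

```python
# Check p(A) = 0 for a concrete 2x2 example.
# For A = [[1, 2], [3, 4]]:
#   trace = 5, det = -2, so p(t) = t**2 - 5*t - 2.
A = [[1, 2], [3, 4]]

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

A2 = matmul(A, A)                        # A**2 = [[7, 10], [15, 22]]
I = [[1, 0], [0, 1]]
# Evaluate p(A) = A**2 - 5*A - 2*I entrywise.
pA = [[A2[i][j] - 5 * A[i][j] - 2 * I[i][j] for j in range(2)] for i in range(2)]
assert pA == [[0, 0], [0, 0]]            # p(A) is the zero matrix
```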
