
DETERMINANT

With each square matrix we associate exactly one number, called the determinant of the matrix. The determinant of a matrix A is denoted det(A) or |A|. We now define this correspondence.

Determinant of a 1 x 1 matrix

The determinant of a 1 x 1 matrix is its single element.


Ex: det([-7]) = -7

Permutation of n ordered elements

Say S is an ordered set of n elements. A one-to-one transformation t of the set S onto itself is called a permutation of S.
Example: S = (1, 2, 3, 4, 5). A permutation t is defined by
t(1, 2, 3, 4, 5) = (2, 5, 4, 1, 3).
The permutation which transforms (2, 5, 4, 1, 3) back to (1, 2, 3, 4, 5) is called the
inverse permutation of t.

Transposition

A permutation that interchanges two elements and fixes all others is called a
transposition.
Example: S = (1, 2, 3, 4, 5). The permutation defined by
t(1, 2, 3, 4, 5) = (1, 4, 3, 2, 5) is a transposition.

Theorem

Every permutation of n ordered elements can be expressed as a sequence of transpositions. If a permutation can be written as a sequence of an even number of transpositions, it is impossible to write it as a sequence of an odd number of transpositions (and vice versa).

Even and odd permutations

If a permutation of n ordered elements can be expressed as an even number of transpositions, then it is called an even permutation. If a permutation of n ordered elements can be expressed as an odd number of transpositions, then it is called an odd permutation.
The inverse permutation of t can be expressed with exactly the same number of transpositions as t.
So, if t is even, its inverse is even too.
Example: S = (1, 2, 3, 4, 5)
t(1, 2, 3, 4, 5) = (1, 3, 4, 2, 5) is an even permutation.
t(1, 2, 3, 4, 5) = (1, 3, 4, 5, 2) is an odd permutation.

Sign of a permutation

The sign of an even permutation t is +1. We write sgn(t) = +1.
The sign of an odd permutation t is -1. We write sgn(t) = -1.
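To make the parity concrete, here is a minimal Python sketch (my own addition, not part of the original text) that computes sgn(t) by counting the pairs that the permutation puts out of order; an even number of such inversions means an even permutation.

```python
def sgn(perm):
    """Sign of a permutation given as a tuple of 1-based images,
    e.g. (2, 5, 4, 1, 3) means t(1)=2, t(2)=5, ...
    The sign is (-1) raised to the number of inversions (pairs out of order)."""
    inversions = sum(1 for i in range(len(perm))
                       for j in range(i + 1, len(perm))
                       if perm[i] > perm[j])
    return 1 if inversions % 2 == 0 else -1

# The examples from the text:
print(sgn((1, 3, 4, 2, 5)))   # +1, an even permutation
print(sgn((1, 3, 4, 5, 2)))   # -1, an odd permutation
print(sgn((1, 4, 3, 2, 5)))   # -1, a single transposition
```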

The determinant of an n x n matrix

Let S = (1, 2, 3, ... , n).
t is a permutation of S, so t(1, 2, 3, ... , n) = (t(1), t(2), ... , t(n)).
There are n! permutations of S.
A is an n x n matrix with elements ai, j.

Now, with each permutation t of S, create the product
sgn(t) . a1, t(1) . a2, t(2) . a3, t(3) . ... . an, t(n). There are n! such products.
|A| is defined as the sum of all those products.
Note that each term of |A| involves each row and each column only once.
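This definition can be turned directly into a (very inefficient, O(n!)) computation. The following Python sketch, added here only as an illustration, sums sgn(t) . a1, t(1) . ... . an, t(n) over all permutations; the function names det and sgn are my own.

```python
from itertools import permutations

def sgn(perm):
    """Sign of a permutation of (0, 1, ..., n-1): (-1)**(number of inversions)."""
    inv = sum(1 for i in range(len(perm))
                for j in range(i + 1, len(perm))
                if perm[i] > perm[j])
    return -1 if inv % 2 else 1

def det(A):
    """Determinant of a square matrix A (list of rows) via the permutation definition."""
    n = len(A)
    total = 0
    for t in permutations(range(n)):          # all n! permutations
        term = sgn(t)
        for i in range(n):
            term *= A[i][t[i]]                # a_{i, t(i)}
        total += term
    return total

print(det([[-7]]))                 # -7, the 1 x 1 case
print(det([[1, 2], [3, 4]]))       # 1*4 - 2*3 = -2
```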

Example 1: We want to calculate the determinant of a 2 x 2 matrix A.
Now n = 2 and there are only two permutations of S = (1, 2):
t(1, 2) = (1, 2) with sgn(t) = +1
t'(1, 2) = (2, 1) with sgn(t') = -1
We have only two terms: +1 . a1, 1 . a2, 2 and -1 . a1, 2 . a2, 1.
Thus the determinant of A is a1, 1 . a2, 2 - a1, 2 . a2, 1.
Remember the rule:

|a b|
|c d|

= ad - cb

Example 2: We want to calculate the determinant of a 3 x 3 matrix A.
Now n = 3 and there are only 6 permutations of S = (1, 2, 3).
These 6 permutations transform (1, 2, 3) into:
(1, 2, 3) (2, 3, 1) (3, 1, 2) (even permutations)
(3, 2, 1) (1, 3, 2) (2, 1, 3) (odd permutations)
Now we have six terms to add:
a1, 1 . a2, 2 . a3, 3 + a1, 2 . a2, 3 . a3, 1 + a1, 3 . a2, 1 . a3, 2
- a1, 3 . a2, 2 . a3, 1 - a1, 1 . a2, 3 . a3, 2 - a1, 2 . a2, 1 . a3, 3
Remember the rule:
|a b c|
|d e f|
|g h i|

= aei + bfg + cdh - ceg - afh - bdi


The last rule is known as the Sarrus rule for 3 x 3 determinants.
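As a quick numerical sanity check (my own addition, not part of the original text), the sketch below compares the Sarrus rule with numpy.linalg.det for one concrete 3 x 3 matrix.

```python
import numpy as np

A = np.array([[2.0, 1.0, 3.0],
              [0.0, 4.0, 5.0],
              [1.0, 0.0, 6.0]])

(a, b, c), (d, e, f), (g, h, i) = A   # unpack the three rows

sarrus = a*e*i + b*f*g + c*d*h - c*e*g - a*f*h - b*d*i
print(sarrus)                  # 41.0
print(np.linalg.det(A))        # 41.0, up to floating-point rounding
```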

To calculate larger determinants, there are many other methods involving various properties of determinants.

Rows and columns of the determinant

By the ith row of a determinant we mean the ith row of the matrix corresponding to this determinant, and by the ith column of a determinant we mean the ith column of that matrix.

Cofactor of an element ai, j

Now choose a fixed row index i.
Since each row appears once and only once in each term of |A|, each term of |A| contains exactly one of the factors ai, 1, ai, 2, ai, 3, ... , ai, n.
Thus, we can write |A| as a linear polynomial in ai, 1, ai, 2, ai, 3, ... , ai, n.
We denote the coefficients respectively Ai, 1, Ai, 2, Ai, 3, ... , Ai, n.
These coefficients are called the cofactors; Ai, j is the cofactor of ai, j.
|A| = Ai, 1 . ai, 1 + Ai, 2 . ai, 2 + Ai, 3 . ai, 3 + ... + Ai, n . ai, n
Since each term of |A| involves each row and each column only once, the cofactor Ai, j is independent of the elements of the ith row and the elements of the jth column.
It contains only elements from the matrix obtained from A by crossing out the ith row and the jth column.
Remark: If we write |A| = Ai, 1 . ai, 1 + Ai, 2 . ai, 2 + Ai, 3 . ai, 3 + ... + Ai, n . ai, n, we say that the determinant is expanded along the ith row.
Example:
|a b c|
|d e f|
|g h i|

= aei + bfg + cdh - ceg - afh - bdi


Choose for instance row 2.
Each term of |A| contains exactly one of the factors d, e, f.
Thus, we can write |A| as a linear polynomial in d, e, f.
|A| = (ch-bi)d + (ai-cg)e + (bg-ah)f
ch-bi is the cofactor of d.
ai-cg is the cofactor of e.
bg-ah is the cofactor of f.
No cofactor contains an element of the chosen row 2.
The cofactor of d contains neither an element of row 2 nor an element of column 1.

If we write |A| = (ch-bi)d + (ai-cg)e + (bg-ah)f, we say that the determinant is expanded along the second row.
Similarly, we can start with a fixed column j and then write |A| as a linear polynomial in a1, j, a2, j, a3, j, ... , an, j. One then finds the same cofactors, so ai, j has a unique cofactor Ai, j.
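The row-2 example above can be checked symbolically. The following sketch (my own illustration, assuming the SymPy library is available) expands the 3 x 3 determinant and reads off the coefficients of d, e and f, which should equal the cofactors found above.

```python
import sympy as sp

a, b, c, d, e, f, g, h, i = sp.symbols('a b c d e f g h i')
A = sp.Matrix([[a, b, c],
               [d, e, f],
               [g, h, i]])

detA = sp.expand(A.det())
print(detA)               # a*e*i + b*f*g + c*d*h - c*e*g - a*f*h - b*d*i

# The cofactors of row 2 are the coefficients of d, e, f in this linear polynomial.
print(detA.coeff(d))      # c*h - b*i   (up to term order)
print(detA.coeff(e))      # a*i - c*g
print(detA.coeff(f))      # b*g - a*h
```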
A matrix A and its transpose have the same determinant.

We call A' the transpose of A, so ai, j' = aj, i.
We know that |A'|
= sum of all products sgn(t) . a1, t(1)' . a2, t(2)' . a3, t(3)' ... an, t(n)'
= sum of all products sgn(t) . at(1), 1 . at(2), 2 . at(3), 3 ... at(n), n.
Since t(1), t(2), ... , t(n) is a permutation of 1, 2, 3, ... , n, we can reorder the factors of each term according to the first index. This can be done using the inverse permutation of t.
The permutation t transforms (1, 2, 3, ... , n) to (t(1), t(2), ... , t(n)), so the inverse permutation t' brings (t(1), t(2), ... , t(n)) back to (1, 2, 3, ... , n), and this inverse permutation can be expressed with exactly the same number of transpositions as t. So sgn(t) = sgn(t'). Then |A'|
= sum of all products sgn(t') . a1, t'(1) . a2, t'(2) . a3, t'(3) ... an, t'(n)
Because the set of all permutations is the same as the set of all inverse permutations, |A| = |A'|.
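A quick numerical illustration of this property (my own addition), using numpy on a random integer matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(-5, 6, size=(4, 4)).astype(float)

print(np.linalg.det(A))        # determinant of A
print(np.linalg.det(A.T))      # the same value, up to rounding
print(np.isclose(np.linalg.det(A), np.linalg.det(A.T)))   # True
```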

Important result

By the previous property, it is immediate that each property we find for the rows of a matrix also holds for the columns, and each property for the columns also holds for the rows.

Interchanging two columns of A

First, denote by t' the permutation transposing only i and j.
Thus t'(1, ... , i, ... , j, ... , n) = (1, ... , j, ... , i, ... , n). Then sgn(t') = -1, and for each permutation t we have sgn(t't) = -sgn(t).
Say A' is obtained by interchanging the columns i and j of A.
For each k we have ak, i' = ak, j and ak, j' = ak, i; more generally, for each k and each l we have ak, l' = ak, t'(l).
We investigate |A'|.
We know that |A'|
= sum of all products sgn(t) . a1, t(1)' . a2, t(2)' . a3, t(3)' ... an, t(n)'
= sum of all products sgn(t) . a1, t't(1) . a2, t't(2) . a3, t't(3) ... an, t't(n)
= sum of all products -sgn(t't) . a1, t't(1) . a2, t't(2) . a3, t't(3) ... an, t't(n)
Since the set of permutations of (1, ... , n) is a group, the set of all permutations t and the set of all permutations t" = t't is the same set.
Therefore |A'|
= sum of all products -sgn(t't) . a1, t't(1) . a2, t't(2) . a3, t't(3) ... an, t't(n)
= sum of all products -sgn(t") . a1, t"(1) . a2, t"(2) . a3, t"(3) ... an, t"(n)
= -|A|
Conclusion:
When we interchange two columns of A, |A| changes sign.
When we interchange two rows of A, |A| changes sign.
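A numerical illustration (my own addition): swapping two columns or two rows of a random matrix flips the sign of its numpy determinant.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.integers(-5, 6, size=(4, 4)).astype(float)

B = A[:, [1, 0, 2, 3]]          # the first two columns interchanged
C = A[[2, 1, 0, 3], :]          # the first and third rows interchanged

print(np.linalg.det(A))
print(np.linalg.det(B))         # same magnitude, opposite sign
print(np.linalg.det(C))         # same magnitude, opposite sign
```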

Multiplying a row of A with a real number

Say A' is obtained by multiplying the ith row of A by a real number r.
Then ai, k' = r . ai, k for each k (i fixed), and if j is not i then aj, k' = aj, k.
We know that |A'|
= sum of all products sgn(t) . a1, t(1)' . a2, t(2)' . a3, t(3)' ... ai, t(i)' ... an, t(n)'
= sum of all products sgn(t) . a1, t(1) . a2, t(2) . a3, t(3) ... r . ai, t(i) ... an, t(n)
= sum of all products r . sgn(t) . a1, t(1) . a2, t(2) . a3, t(3) ... ai, t(i) ... an, t(n)
= r . (sum of all products sgn(t) . a1, t(1) . a2, t(2) . a3, t(3) ... ai, t(i) ... an, t(n))
= r . |A|
Conclusion:
When we multiply a row of A by a real number r, |A| changes into r.|A|.
When we multiply a column of A by a real number r, |A| changes into r.|A|.
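Again a small numerical check (my own addition): multiplying one row of a random matrix by r multiplies the numpy determinant by r.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.integers(-5, 6, size=(3, 3)).astype(float)

B = A.copy()
B[1, :] *= 2.5                  # multiply the second row by r = 2.5

print(np.linalg.det(B))
print(2.5 * np.linalg.det(A))   # the same value, up to rounding
```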

If A has two equal rows, |A| = 0

If we interchange these two equal rows, the matrix, and hence the determinant, does not change. But by the previous property the determinant changes into its opposite. This is only possible when the determinant is 0.

Addition of two determinants which differ only in the ith row

Say A and B are matrices which differ only in the ith row.
For all j different from i and all k we have aj, k = bj, k.
|A|
= sum of all products sgn(t) . a1, t(1) . a2, t(2) . a3, t(3) ... ai, t(i) ... an, t(n)
|B|
= sum of all products sgn(t) . b1, t(1) . b2, t(2) . b3, t(3) ... bi, t(i) ... bn, t(n)
= sum of all products sgn(t) . a1, t(1) . a2, t(2) . a3, t(3) ... bi, t(i) ... an, t(n)
So |A| + |B|
= sum of all products sgn(t) . a1, t(1) . a2, t(2) . a3, t(3) ... (ai, t(i) + bi, t(i)) ... an, t(n)
= the determinant of the matrix whose ith row is the sum of the ith rows of A and B, and whose other rows are the common rows of A and B.
The same rule holds for columns.
Ex.
|a b c| |a b' c| |a b+b' c|
|d e f|+|d e' f| = |d e+e' f|
|g h i| |g h' i| |g h+h' i|
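The following numpy sketch (my own addition) illustrates this additivity in a single row: the sum of the two determinants equals the determinant of the matrix whose second row is the sum of the second rows.

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 10.0]])
B = A.copy()
B[1, :] = [0.0, -1.0, 2.0]      # B differs from A only in the second row

C = A.copy()
C[1, :] = A[1, :] + B[1, :]     # second row is the sum of the second rows

print(np.linalg.det(A) + np.linalg.det(B))
print(np.linalg.det(C))         # the same value, up to rounding
```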

Equal determinants

Let A be any square matrix.
Let B be the matrix formed by replacing the ith row of A with the jth row, leaving the jth row unchanged.
Since B has two equal rows, |B| = 0.
Let C be the matrix formed by multiplying the ith row of B by r, leaving the other elements unchanged.
Then |C| = r.|B| = 0.
A and C differ only in the ith row, so we can use the previous property, and we have:
|A| + 0 = |A| + |C| = the determinant of the matrix whose ith row is the sum of the ith rows of A and C, and whose other rows are the common rows of A and C. The ith row of this matrix is the ith row of A plus r times the jth row of A.
Therefore, a determinant does not change if we add a multiple of a row to another row.
The same rule holds for columns.
Ex.
|a b c| |a+rd b+re c+rf|
|d e f| = | d e f |
|g h i| | g h i |
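A short numpy illustration of this invariance (my own addition): adding a multiple of one row to another row leaves the determinant unchanged.

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.integers(-5, 6, size=(4, 4)).astype(float)

B = A.copy()
B[0, :] += 2.0 * A[2, :]        # add 2 times the third row to the first row

print(np.linalg.det(A))
print(np.linalg.det(B))         # the same value, up to rounding
```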

The cofactor A1, 1

Let A be any square matrix. We know that |A|
= sum of all products sgn(t) . a1, t(1) . a2, t(2) . a3, t(3) ... an, t(n).
A1, 1 is the coefficient of a1, 1 in this sum.
The terms containing a1, 1 are the terms with t(1) = 1.
Recall the sum of all products sgn(t) . a1, t(1) . a2, t(2) . a3, t(3) ... an, t(n).
Instead of taking this sum over all permutations of (1, ... , n), we only take the sum over the permutations t' with t'(1) = 1.
This sum then gives A1, 1 . a1, 1.
These special permutations t' are exactly the permutations of (2, ... , n).
Thus, A1, 1 . a1, 1 = sum of all products sgn(t') . a1, 1 . a2, t'(2) . a3, t'(3) ... an, t'(n).
Then A1, 1 = sum of all products sgn(t') . a2, t'(2) . a3, t'(3) ... an, t'(n).
This is, by the definition of a determinant, the determinant of the sub-matrix of A obtained from A by crossing out the first row and the first column.
Conclusion:
A1, 1 = the determinant of the sub-matrix of A obtained from A by crossing out the first row and the first column.

The cofactor Ai, j

Let A be any square matrix. Focus on the element e = ai, j. Interchange in succession row i and row i-1, row i-1 and row i-2, ... until e is in the first row. This requires i-1 steps and leaves the relative order of the other rows unchanged. Then interchange in succession column j and column j-1, column j-1 and column j-2, ... until e is in the first column and in the first row. This requires j-1 steps. During this process the determinant of the matrix changes sign i+j-2 times. Now the cofactor of e is the determinant of the sub-matrix obtained from this new matrix by crossing out the first row and the first column, and this sub-matrix is exactly the matrix obtained from A by crossing out the ith row and the jth column. Now return to the original matrix.
The value of Ai, j = (-1)^(i+j-2) . (the determinant of the sub-matrix of A obtained from A by crossing out the ith row and the jth column).
Or, stated more simply: the value of Ai, j = (-1)^(i+j) . (the determinant of the sub-matrix of A obtained from A by crossing out the ith row and the jth column).
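Putting the last two sections together gives the familiar cofactor (Laplace) expansion along a row. The recursive Python sketch below (my own illustration, not part of the original text) computes each cofactor as (-1)^(i+j) times the determinant of the corresponding sub-matrix and expands |A| along the first row.

```python
def minor(A, i, j):
    """Sub-matrix of A with row i and column j crossed out (0-based indices)."""
    return [row[:j] + row[j+1:] for k, row in enumerate(A) if k != i]

def det(A):
    """Determinant by cofactor expansion along the first row."""
    n = len(A)
    if n == 1:
        return A[0][0]
    # |A| = sum over j of a[0][j] * cofactor(0, j),
    # where cofactor(0, j) = (-1)**(0 + j) * det(minor(A, 0, j)).
    return sum((-1) ** j * A[0][j] * det(minor(A, 0, j)) for j in range(n))

A = [[2, 1, 3],
     [0, 4, 5],
     [1, 0, 6]]
print(det(A))     # 41, the same value the Sarrus rule gives
```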

|I| = 1

We can prove this property by complete induction. It is easy to see that the property holds for the 2 x 2 identity matrix. Assume that the property holds for the k x k identity matrix; we'll prove it holds for the (k+1) x (k+1) identity matrix. Let A be the (k+1) x (k+1) identity matrix and expand its determinant along the first row:
|A| = A1, 1 . a1, 1 + A1, 2 . a1, 2 + A1, 3 . a1, 3 + ... + A1, n . a1, n
|A| = A1, 1 . 1 + A1, 2 . 0 + A1, 3 . 0 + ... + A1, n . 0
|A| = A1, 1
Now, the cofactor A1, 1 is the determinant of the k x k identity matrix, and this determinant is 1 by the induction hypothesis. So |A| = 1.

Determinant of a diagonal matrix

It is also easy to prove, as above by complete induction, that the determinant of a diagonal matrix is the product of the diagonal elements.
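A quick numpy check of these last two properties (my own addition): the identity matrix has determinant 1, and a diagonal matrix has the product of its diagonal entries as determinant.

```python
import numpy as np

print(np.linalg.det(np.eye(4)))            # 1.0

D = np.diag([2.0, -3.0, 5.0, 0.5])
print(np.linalg.det(D))                    # 2 * -3 * 5 * 0.5 = -15.0
print(np.prod(np.diag(D)))                 # -15.0
```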

Determinant of a product of two square matrices

It can be proved that |A|.|B| = |A.B|.
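A numerical illustration of this product rule (my own addition), using two random matrices:

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.integers(-3, 4, size=(3, 3)).astype(float)
B = rng.integers(-3, 4, size=(3, 3)).astype(float)

print(np.linalg.det(A) * np.linalg.det(B))
print(np.linalg.det(A @ B))                # the same value, up to rounding
```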

Quick Reference
The determinant definition

Let S = (1, 2, 3, ... , n).
t is a permutation of S, so t(1, 2, 3, ... , n) = (t(1), t(2), ... , t(n)).
A is an n x n matrix with elements ai, j.
Now, with each permutation t, create the product
sgn(t) . a1, t(1) . a2, t(2) . a3, t(3) ... an, t(n). |A| is defined as the sum of all those products.

Determinant of a 2 x 2 matrix

|a b|
|c d|

= ad - cb

Determinant of a 3 x 3 matrix

The Sarrus rule:

|a b c|
|d e f|
|g h i|

= aei + bfg + cdh - ceg - afh - bdi

Cofactor of ai, j

Choose a fixed row index i.
The determinant is expanded along the ith row:
|A| = Ai, 1 . ai, 1 + Ai, 2 . ai, 2 + Ai, 3 . ai, 3 + ... + Ai, n . ai, n
Ai, j is called the cofactor of ai, j.
The cofactor Ai, j is independent of the elements of the ith row and the elements of the jth column.
The value of Ai, j = (-1)^(i+j) . (the determinant of the sub-matrix of A obtained from A by crossing out the ith row and the jth column).

Determinant of an n x n matrix

Choose a fixed row index i.
The determinant can be expanded along the ith row:
|A| = Ai, 1 . ai, 1 + Ai, 2 . ai, 2 + Ai, 3 . ai, 3 + ... + Ai, n . ai, n
Ai, j is called the cofactor of ai, j.
The value of Ai, j = (-1)^(i+j) . (the determinant of the sub-matrix of A obtained from A by crossing out the ith row and the jth column).

Properties

- A matrix A and its transpose have the same determinant.
- When we interchange two columns of A, |A| changes sign.
- When we interchange two rows of A, |A| changes sign.
- When we multiply a row of A by a real number r, |A| changes into r.|A|.
- When we multiply a column of A by a real number r, |A| changes into r.|A|.
- Determinants which differ only in one row may be added:

  |a b c| |a b' c| |a b+b' c|
  |d e f|+|d e' f| = |d e+e' f|
  |g h i| |g h' i| |g h+h' i|

- A determinant does not change if we add a multiple of a row to another row.
- The same rule holds for columns.
- The determinant of the identity matrix is 1.
- The determinant of a diagonal matrix is the product of the diagonal elements.
- |A|.|B| = |A.B|
