
Revision Notes on Matrices & Determinants

 Two matrices are said to be equal if they have the same order and each element of
one is equal to the corresponding element of the other.
 An m × n matrix A is said to be a square matrix if m = n, i.e. number of rows = number of columns.
 In a square matrix the diagonal from left hand side upper corner to right hand side
lower corner is known as leading diagonal or principal diagonal.
 The sum of the elements of a square matrix A lying along the principal diagonal is called the trace of A, i.e. tr(A). Thus if A = [a_ij] of order n × n, then tr(A) = ∑_{i=1}^{n} a_ii = a_11 + a_22 + ... + a_nn.
 For a square matrix A = [a ij ] n×n , if all the elements other than in the leading diagonal
are zero i.e. a ij = 0, whenever i ≠ j then A is said to be a diagonal matrix.
 A matrix A = [a_ij] of order n × n is said to be a scalar matrix if a_ij = 0 for i ≠ j and a_ij = m for i = j, where m ≠ 0.
Properties of various types of matrices:
Given a square matrix A = [a ij ] n×n ,
 For upper triangular matrix, a ij = 0, ∀ i > j
 For lower triangular matrix, a ij = 0, ∀ i < j
 Diagonal matrix is both upper and lower triangular.
 A triangular matrix A = [a_ij] of order n × n is called strictly triangular if a_ii = 0 for all 1 ≤ i ≤ n.
Transpose of a matrix and its properties:
If A = [a_ij] is of order m × n and A' denotes the transpose of A, i.e. A' = [b_ij] of order n × m, then b_ij = a_ji for all i, j.
 (A')' = A
 (A + B)' = A' + B', A and B being conformable matrices
 (αA)' = αA', α being scalar
 (AB)' = B'A', A and B being conformable for multiplication
Properties of the conjugate of A, denoted Ā (the matrix obtained by replacing each element of A by its complex conjugate):
 (Ā)ˉ = A, i.e. the conjugate of the conjugate is the original matrix
 (A + B)ˉ = Ā + B̄, A and B being conformable matrices
 (αA)ˉ = ᾱĀ, where α is any number, real or complex
 (AB)ˉ = ĀB̄, where A and B are conformable for multiplication
Properties of Transpose conjugate:
The transpose conjugate of A is denoted by Aθ.
If A = [a_ij] of order m × n, then Aθ = [b_ji] of order n × m, where b_ji is the conjugate of a_ij, i.e. the (j, i)th element of Aθ is the conjugate of the (i, j)th element of A.
1) (Aθ)θ = A
2) (A + B)θ = Aθ + Bθ
3) (kA)θ = k̄Aθ, k being any number and k̄ its conjugate
4) (AB)θ = BθAθ
Addition of matrices:
1) Only matrices of the same order can be added or subtracted.
2) Addition of matrices is commutative as well as associative.
3) Cancellation laws hold well in case of addition.
4) The equation A + X = 0 has a unique solution in the set of all m × n matrices.
5) All the laws of ordinary algebra hold for the addition or subtraction of matrices and
their multiplication by scalar.
Matrix Multiplication:
1) Matrix multiplication need not be commutative, i.e., AB may or may not be equal to BA.
2) If AB = BA, then matrices A and B are called Commutative Matrices.
3) If AB = -BA, then matrices A and B are called Anti-Commutative Matrices.
4) Matrix multiplication is Associative
5) Matrix multiplication is Distributive over Matrix Addition.
6) Cancellation Laws need not hold good in case of matrix multiplication i.e., if AB = AC
then B may or may not be equal to C even if A ≠ 0.
7) AB = 0 i.e., Null Matrix, does not necessarily imply that either A or B is a null matrix.
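Properties 1) and 7) above can be seen in a tiny numeric sketch (plain Python with 2×2 nested lists; the helper name matmul is introduced here for illustration, not part of the notes):

```python
# Matrix multiplication need not commute, and AB can be the null matrix
# even though neither A nor B is a null matrix.

def matmul(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 0], [0, 0]]
B = [[0, 0], [1, 0]]

print(matmul(A, B))  # [[0, 0], [0, 0]] -> AB is the null matrix
print(matmul(B, A))  # [[0, 0], [1, 0]] -> BA differs from AB
```

Here neither A nor B is zero, yet AB = 0 while BA ≠ 0, so both the non-commutativity and the zero-product caveat appear in one example.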
Special Matrices
 A square matrix A = [a ij ] is said to be symmetric when a ij = a ji for all i and j.
 If a ij = -aji for all i and j and all the leading diagonal elements are zero, then the matrix
is called a skew symmetric matrix.
 A square matrix A = [a ij ] is said to be Hermitian matrix if A θ = A.
1) Every diagonal element of a Hermitian Matrix is real.
2) A Hermitian matrix over the set of real numbers is actually a real symmetric
matrix.
 A square matrix, A = [a ij ] is said to be a skew-Hermitian matrix if A θ = -A.
1) If A is a skew-Hermitian matrix then the diagonal elements must be either
purely imaginary or zero.
2) A skew-Hermitian matrix over the set of real numbers is actually a real skew-symmetric matrix.
 Any square matrix A of order n is said to be orthogonal if AA' = A'A = I_n.
 A matrix such that A² = I is called an involutory matrix.
 Let A be a square matrix of order n. Then A(adj A) = |A| I n = (adj A)A.
 The adjoint of a square matrix of order 2 can be easily obtained by interchanging the diagonal elements and changing the signs of the off-diagonal elements (those on the diagonal running from the lower left corner to the upper right corner).
 A square matrix A of order n is invertible if there exists a square matrix B of the same order such that AB = I_n = BA; an invertible matrix is necessarily non-singular.
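The 2×2 adjoint rule above can be checked numerically (a plain-Python sketch; adj2, det2 and matmul are illustrative helper names):

```python
# Verify A(adj A) = |A| I_2 for a 2x2 matrix, building adj A by the rule
# from the notes: swap the diagonal entries, negate the off-diagonal ones.

def adj2(A):
    a, b = A[0]
    c, d = A[1]
    return [[d, -b], [-c, a]]

def det2(A):
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[3, 1], [4, 2]]
print(det2(A))            # 2
print(matmul(A, adj2(A)))  # [[2, 0], [0, 2]] = |A| I_2
```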
Elementary row/column operations:
 The following three operations can be applied on rows or columns of a matrix:
1) Interchange of any two rows (columns)
2) Multiplying all elements of a row (column) of a matrix by a non-zero scalar. If the elements of the ith row (column) are multiplied by a non-zero scalar k, it is denoted by R_i → R_i(k) [C_i → C_i(k)] or R_i → kR_i [C_i → kC_i].
3) Adding to the elements of a row (column), the corresponding elements of any other
row (column) multiplied by any scalar k.
Rank of a matrix:
 A number ‘r’ is called the rank of a matrix if:
1) Every square submatrix of order (r + 1) or more is singular, and
2) There exists at least one square submatrix of order r which is non-singular.
 It also equals the number of non-zero rows in the row echelon form of the matrix.
 The rank of the null matrix is not defined and the rank of every non null matrix is
greater than or equal to 1.
 Elementary transformations do not alter the rank of a matrix.
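The row-echelon characterisation of rank can be sketched in code (plain Python; the rank helper below is an illustrative implementation via Gaussian elimination, with a tolerance to absorb float round-off):

```python
# Rank = number of non-zero rows left after reducing to row echelon form.

def rank(M, tol=1e-9):
    M = [row[:] for row in M]          # work on a copy
    rows, cols = len(M), len(M[0])
    r = 0                              # current pivot row
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if abs(M[i][c]) > tol), None)
        if pivot is None:
            continue                   # no pivot in this column
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(r + 1, rows):
            f = M[i][c] / M[r][c]
            M[i] = [x - f * y for x, y in zip(M[i], M[r])]
        r += 1
    return r

# Third row = first row + second row, so the rank is 2, not 3:
print(rank([[1, 2, 3], [2, 4, 7], [3, 6, 10]]))  # 2
```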
Minors, Cofactors and Determinant:
 The minor of the element in the ith row and jth column is the determinant obtained by deleting the ith row and the jth column.
 The cofactor of this element is (-1) i+j (minor).

For the third-order determinant Δ with rows (a_1, b_1, c_1), (a_2, b_2, c_2), (a_3, b_3, c_3), expansion along the first row gives Δ = a_1A_1 + b_1B_1 + c_1C_1, where A_1, B_1 and C_1 are the cofactors of a_1, b_1 and c_1 respectively.


 The determinant can be expanded along any row or column, i.e.
Δ = a_2A_2 + b_2B_2 + c_2C_2 or Δ = a_1A_1 + a_2A_2 + a_3A_3, etc.
 The following result holds true for determinants of any order:
a_iA_j + b_iB_j + c_iC_j = Δ if i = j,
= 0 if i ≠ j.
Adjoint and Inverse of a matrix:
Let A = [a_ij] be a square matrix of order n and let C_ij be the cofactor of a_ij in A. Then the transpose of the matrix of cofactors of elements of A is called the adjoint of A and is denoted by adj A.
The inverse of A is given by A^(-1) = (1/|A|) adj A.
1) Every invertible matrix possesses a unique inverse.
2) If A and B are invertible matrices of the same order, then AB is invertible and (AB)^(-1) = B^(-1)A^(-1). This is also termed the reversal law.
3) In general, if A, B, C, ... are invertible matrices, then (ABC...)^(-1) = ... C^(-1)B^(-1)A^(-1).
4) If A is an invertible square matrix, then A^T is also invertible and (A^T)^(-1) = (A^(-1))^T.
5) If A is a non-singular square matrix of order n, then |adj A| = |A|^(n-1).
6) If A and B are non-singular square matrices of the same order, then adj(AB) = (adj B)(adj A).
7) If A is an invertible square matrix, then adj(A^T) = (adj A)^T.
8) If A is a non-singular square matrix of order n, then adj(adj A) = |A|^(n-2) A.
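The reversal law in property 2) can be checked numerically (plain Python with the explicit 2×2 inverse formula; inv2 and matmul are helper names introduced here for illustration):

```python
# Check (AB)^(-1) = B^(-1) A^(-1) on a concrete 2x2 pair.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(A):
    a, b = A[0]
    c, d = A[1]
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[2, 1], [1, 1]]
B = [[1, 2], [0, 1]]

lhs = inv2(matmul(A, B))            # (AB)^(-1)
rhs = matmul(inv2(B), inv2(A))      # B^(-1) A^(-1)
print(lhs == rhs)  # True
```

Both determinants here equal 1, so the float arithmetic is exact and the two sides agree entry for entry.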
Important Properties:
 If rows be changed into columns and columns into rows, then the value of the determinant remains unaltered.
 If any two rows (or columns) of a determinant are interchanged, the resulting
determinant is the negative of the original determinant.
 If two rows (or two columns) in a determinant have corresponding elements that are
equal, the value of determinant is equal to zero.
 If each element in a row (or column) of a determinant is written as the sum of two or
more terms then the determinant can be written as the sum of two or more
determinants.
 If to each element of a line (row or column) of a determinant, some multiples of
corresponding elements of one or more parallel lines are added, then the
determinant remains unaltered.
 If each element in any row (or any column) of determinant is zero, then the value of
determinant is equal to zero.
 If a determinant D vanishes for x = a, then (x - a) is a factor of D, in other words, if
two rows (or two columns) become identical for x = a, then (x-a) is a factor of D.
 In general, if r rows (or r columns) become identical when a is substituted for x, then (x - a)^(r-1) is a factor of D.
 The minor of an element of a determinant is again a determinant (of lesser order)
formed by excluding the row and column of the element.
 The determinant can be evaluated by multiplying the elements of a single row or a single column with their respective cofactors and then adding them, i.e.
Δ = ∑_{i=1}^{m} a_ij C_ij (for any fixed column j)
= ∑_{j=1}^{m} a_ij C_ij (for any fixed row i).
 Sarrus Rule: This is the rule for evaluating the determinant of order 3. The process is
as given below:
1. Write down the three rows of a determinant.
2. Rewrite the first two rows.
3. The three diagonals sloping down to the right give the three positive terms and the three diagonals sloping down to the left give the three negative terms.
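The Sarrus expansion above can be written out directly in code (plain Python; sarrus is an illustrative helper name):

```python
# Sarrus rule for a 3x3 determinant: add the three "down-right" diagonal
# products, subtract the three "down-left" diagonal products.

def sarrus(m):
    return (  m[0][0] * m[1][1] * m[2][2]
            + m[0][1] * m[1][2] * m[2][0]
            + m[0][2] * m[1][0] * m[2][1]
            - m[0][2] * m[1][1] * m[2][0]
            - m[0][0] * m[1][2] * m[2][1]
            - m[0][1] * m[1][0] * m[2][2])

print(sarrus([[1, 2, 3], [4, 5, 6], [7, 8, 10]]))  # -3
```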

Multiplication of two Determinants:


1. Two determinants can be multiplied together only if they are of the same order.
2. Take the first row of one determinant and multiply it successively with the 1st, 2nd and 3rd rows of the other determinant.
3. The three expressions thus obtained will be the elements of the 1st row of the resultant determinant. In a similar manner, the elements of the 2nd and 3rd rows of the resultant determinant are obtained.
 Symmetric determinant:
The elements situated at equal distances from the principal diagonal are equal both in magnitude and in sign.

 Skew- Symmetric determinant:


 In a skew symmetric determinant, all the diagonal elements are zero and the
elements situated at equal distance from the diagonal are equal in magnitude but
opposite in sign. The value of a skew symmetric determinant of odd order is zero.
 If the elements of a determinant are in cyclic arrangement, such a determinant is
termed as a Circulant determinant.
 A system of equations AX = D is called a homogeneous system if D = O. Otherwise it is
called a non-homogeneous system of equations.
 If the system of equations has one or more solutions, then it is said to be a consistent
system of equations, otherwise it is an inconsistent system of equations.
 Let A be the co-efficient matrix of the linear system:
ax + by = e &
cx + dy = f.
If det A ≠ 0, then the system has exactly one solution, given by Cramer's rule:
x = (ed - bf)/(ad - bc), y = (af - ec)/(ad - bc).

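Cramer's rule for this 2×2 system can be coded directly (a plain-Python sketch; solve2 is an illustrative helper name, and it raises an error when det A = 0 since no unique solution exists):

```python
def solve2(a, b, c, d, e, f):
    """Solve ax + by = e, cx + dy = f by Cramer's rule when ad - bc != 0."""
    det = a * d - b * c
    if det == 0:
        raise ValueError("det A = 0: no unique solution")
    x = (e * d - b * f) / det
    y = (a * f - e * c) / det
    return x, y

# 2x + y = 5 and x + 3y = 10  ->  x = 1, y = 3
print(solve2(2, 1, 1, 3, 5, 10))  # (1.0, 3.0)
```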
 Let A be the co-efficient matrix of the linear system:


ax + by + cz = j,
dx + ey + fz = k, and
gx + hy + iz = l.
If det A ≠ 0, then the system has exactly one solution. Write Δ = det A, and let Δ_1, Δ_2, Δ_3 be the determinants obtained from Δ by replacing its first, second and third columns respectively by the column of constants (j, k, l). We have the following two cases:
Case I. When Δ ≠ 0
In this case we have,
x = Δ 1 /Δ, y = Δ 2 /Δ, z = Δ 3 /Δ
Hence unique value of x, y, z will be obtained.
Case II: When Δ = 0
(a) When at least one of Δ_1, Δ_2 and Δ_3 is non-zero, the system is inconsistent.
Suppose Δ_1 ≠ 0. From Case I, Δ_1 = xΔ must hold, but it cannot be satisfied for any value of x because Δ = 0 while Δ_1 ≠ 0, so no value of x is possible. The same argument applies when Δ_2 ≠ 0 (using Δ_2 = yΔ) and when Δ_3 ≠ 0 (using Δ_3 = zΔ).
(b) When Δ = 0 and Δ_1 = Δ_2 = Δ_3 = 0, the relations Δ_1 = xΔ, Δ_2 = yΔ and Δ_3 = zΔ hold for all values of x, y and z. But then only two of x, y, z will be independent and the third will depend on the other two; therefore, if Δ = Δ_1 = Δ_2 = Δ_3 = 0, the system of equations will be consistent and will have infinitely many solutions.
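Case I can be sketched in code: build Δ, Δ_1, Δ_2, Δ_3 by column replacement and divide (plain Python; det3 and replace_col are illustrative helper names, and the example system is my own):

```python
# det3 expands a 3x3 determinant along its first row (cofactor expansion).

def det3(m):
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def replace_col(A, col, rhs):
    """Return A with column `col` replaced by the constants column."""
    return [[rhs[i] if j == col else A[i][j] for j in range(3)] for i in range(3)]

# x + y + z = 6,  2y + 5z = -4,  2x + 5y - z = 27
A = [[1, 1, 1], [0, 2, 5], [2, 5, -1]]
rhs = [6, -4, 27]

D = det3(A)
if D != 0:  # Case I: unique solution
    x, y, z = (det3(replace_col(A, j, rhs)) / D for j in range(3))
    print(x, y, z)  # 5.0 3.0 -2.0
```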
 If A is a non-singular matrix, then the system of equations given by AX = D has a
unique solution given by X = A^(-1)D.
 If A is a singular matrix, and (adj A) D = O, then the system of equations given by AX =
D is consistent, with infinitely many solutions.
 If A is a singular matrix, and (adj A) D ≠ O, then the system of equation given by AX =
D is inconsistent.
 Let AX = O be a homogeneous system of n linear equations in n unknowns. Now if A
is non-singular then the system of equations will have a unique solution i.e. trivial
solution and if A is singular then the system of equations will have infinitely many
solutions.
 Suppose we have the following system:
a_11x_1 + a_12x_2 + ... + a_1nx_n = b_1
a_21x_1 + a_22x_2 + ... + a_2nx_n = b_2
... ... ... ...
a_m1x_1 + a_m2x_2 + ... + a_mnx_n = b_m
Then the system is consistent iff the coefficient matrix A and the augmented matrix
(A|B) have the same rank. We then have the following cases:
Case 1: The system is consistent and m ≥ n
1. If r(A) = r(A|B) = n, then the system has a unique solution.
2. If r(A) = r(A|B) = k < n, then (n - k) unknowns can be assigned arbitrary values.
Case 2: The system is consistent and m < n
1. If r(A) = r(A|B) = m, then (n - m) unknowns can be assigned arbitrary values.
2. If r(A) = r(A|B) = k < m, then (n - k) unknowns can be assigned arbitrary values.
Note: here r denotes the rank.
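The rank test above can be sketched numerically: compare r(A) with r(A|B) (plain Python; the rank helper is an illustrative Gaussian-elimination implementation, and the example system is my own):

```python
# rank = number of non-zero rows after reduction to row echelon form.

def rank(M, tol=1e-9):
    M = [row[:] for row in M]
    r = 0
    for c in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if abs(M[i][c]) > tol), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(r + 1, len(M)):
            f = M[i][c] / M[r][c]
            M[i] = [x - f * y for x, y in zip(M[i], M[r])]
        r += 1
    return r

A = [[1, 2], [2, 4]]   # second row is twice the first...
B = [5, 11]            # ...but the right-hand sides are not proportional

aug = [row + [b] for row, b in zip(A, B)]   # augmented matrix (A|B)
print(rank(A), rank(aug))  # 1 2 -> ranks differ, so the system is inconsistent
```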
