
3.2 Matrix Algebra

Properties of Addition and Scalar Multiplication

Theorem 1 Let A, B and C denote arbitrary m × n matrices; let c, d denote scalars. Then

1. A + B = B + A Commutativity

2. A + (B + C) = (A + B) + C Associativity

3. There is an m × n matrix O such that O + A = A for each A.

4. For each A there is an m × n matrix −A such that A + (−A) = O.

5. c(A + B) = cA + cB Distributivity

6. (c + d)A = cA + dA Distributivity

7. (cd)A = c(dA)

8. 1A = A

Notation: Recall A = [a_ij], an m × n matrix, with (A)_ij = a_ij. The i-th row of A is denoted by A_i or row_i(A); the j-th column of A is denoted by a_j or col_j(A).

Proof that A + B = B + A: A + B = [a_ij + b_ij] = [b_ij + a_ij] = B + A, or equivalently (A + B)_ij = (A)_ij + (B)_ij = (B)_ij + (A)_ij = (B + A)_ij.

Proof that A + (−A) = O: Define −A = [−a_ij]. Then A + (−A) = [a_ij + (−a_ij)] = [0] = O.
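These identities can also be checked numerically for particular matrices. Below is a minimal NumPy sketch; the matrices A, B and the scalars c, d are arbitrary choices for illustration and do not come from the notes.

```python
import numpy as np

# Arbitrary 2 x 3 matrices and scalars, chosen only to illustrate Theorem 1.
A = np.array([[1., 2., 3.], [4., 5., 6.]])
B = np.array([[0., -1., 2.], [3., 1., -2.]])
c, d = 2.0, -3.0

O = np.zeros_like(A)                              # the m x n zero matrix

assert np.allclose(A + B, B + A)                  # commutativity
assert np.allclose(O + A, A)                      # zero matrix
assert np.allclose(A + (-A), O)                   # additive inverse
assert np.allclose(c * (A + B), c * A + c * B)    # distributivity over matrices
assert np.allclose((c + d) * A, c * A + d * A)    # distributivity over scalars
assert np.allclose((c * d) * A, c * (d * A))
print("All Theorem 1 identities hold for this example.")
```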

If A_1, A_2, ..., A_k are matrices of the same size and c_1, c_2, ..., c_k are scalars, we may form the linear combination
$$c_1 A_1 + c_2 A_2 + \cdots + c_k A_k.$$
The scalars c_1, c_2, ..., c_k are the coefficients of the linear combination.

Example (Example 3.16): Let
$$A_1 = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}, \quad A_2 = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}, \quad A_3 = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}.$$
Write B and C as linear combinations of A_1, A_2 and A_3, if possible, where
$$B = \begin{bmatrix} 2 & 3 \\ 4 & 2 \end{bmatrix}, \quad C = \begin{bmatrix} 2 & 5 \\ 0 & 3 \end{bmatrix}.$$
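One way to organize the computation: stacking the entries of each A_i into a column turns c_1 A_1 + c_2 A_2 + c_3 A_3 = B into an ordinary 4 × 3 linear system in (c_1, c_2, c_3). The NumPy sketch below does this; the helper `coefficients` is an ad hoc name introduced here, and the matrices are the ones written above.

```python
import numpy as np

A1 = np.array([[1, 0], [0, 1]])
A2 = np.array([[0, 1], [1, 0]])
A3 = np.array([[1, 1], [0, 1]])

def coefficients(target):
    """Return (c1, c2, c3) if target is a linear combination of A1, A2, A3, else None."""
    # Each matrix becomes a column of a 4 x 3 coefficient matrix.
    M = np.column_stack([X.reshape(-1) for X in (A1, A2, A3)])
    c, residual, *_ = np.linalg.lstsq(M, target.reshape(-1), rcond=None)
    return c if np.allclose(M @ c, target.reshape(-1)) else None

B = np.array([[2, 3], [4, 2]])
C = np.array([[2, 5], [0, 3]])
print(coefficients(B))   # a solution exists for B with these matrices
print(coefficients(C))   # None: C is not a linear combination of A1, A2, A3
```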

The span of a set of matrices is the set of all linear combinations of the matrices.

Example (Example 3.17): Let A_1, A_2 and A_3 be as in the previous example, i.e.
$$A_1 = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}, \quad A_2 = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}, \quad A_3 = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}.$$
Describe the span of the matrices A_1, A_2 and A_3.
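A solution sketch (added here, using the matrices as written above): a general linear combination is
$$c_1 A_1 + c_2 A_2 + c_3 A_3 = \begin{bmatrix} c_1 + c_3 & c_2 + c_3 \\ c_2 & c_1 + c_3 \end{bmatrix},$$
so the span is exactly the set of 2 × 2 matrices whose two diagonal entries agree: for a target with entries a, b, c, a one can take c_2 = c, c_3 = b − c and c_1 = a − b + c.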

Matrices A_1, A_2, ..., A_k of the same size are linearly independent if the only solution of the equation
$$c_1 A_1 + c_2 A_2 + \cdots + c_k A_k = O$$
is the trivial solution c_1 = c_2 = ... = c_k = 0. If there are nontrivial coefficients that satisfy the equation, then A_1, A_2, ..., A_k are called linearly dependent.

Example: Let
$$A_1 = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}, \quad A_2 = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}, \quad A_3 = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}.$$
Determine whether or not A_1, A_2 and A_3 are linearly independent.
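A solution sketch (added here): comparing entries in c_1 A_1 + c_2 A_2 + c_3 A_3 = O gives
$$c_1 + c_3 = 0, \qquad c_2 + c_3 = 0, \qquad c_2 = 0,$$
so c_2 = 0, hence c_3 = 0 and c_1 = 0. Only the trivial solution exists, and therefore A_1, A_2 and A_3 are linearly independent.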

Properties of Matrix Multiplication

Theorem 2 Let A, B and C be matrices (whose sizes are such that the indicated operations can be performed) and let k be a scalar. Then

1. A(BC) = (AB)C Associativity

2. A(B + C) = AB + AC Left distributivity

3. (A + B)C = AC + BC Right distributivity

4. k(AB) = (kA)B = A(kB)

5. I_m A = A = A I_n if A is m × n

Proof that (A + B)C = AC + BC:
$$((A + B)C)_{ij} = \mathrm{row}_i(A + B) \cdot \mathrm{col}_j(C) = [\mathrm{row}_i(A) + \mathrm{row}_i(B)] \cdot \mathrm{col}_j(C) = \mathrm{row}_i(A) \cdot \mathrm{col}_j(C) + \mathrm{row}_i(B) \cdot \mathrm{col}_j(C) = (AC)_{ij} + (BC)_{ij}.$$
Thus the result follows.

Proof that A I_n = A: write I_n in terms of its columns e_1, e_2, ..., e_n. Then
$$A I_n = A \begin{bmatrix} e_1 & e_2 & \cdots & e_n \end{bmatrix} = \begin{bmatrix} A e_1 & A e_2 & \cdots & A e_n \end{bmatrix} = \begin{bmatrix} a_1 & a_2 & \cdots & a_n \end{bmatrix} = A,$$
since A e_j = a_j, the j-th column of A.



Example (Example 3.20): Let
$$A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} \quad \text{and} \quad B = \begin{bmatrix} 4 & 3 \\ 2 & 1 \end{bmatrix}.$$
Confirm property 4 of matrix multiplication listed above for these two matrices if k = 2.
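A numerical confirmation of property 4 for these matrices, written as a short NumPy sketch:

```python
import numpy as np

# A, B and k as in Example 3.20.
A = np.array([[1, 2], [3, 4]])
B = np.array([[4, 3], [2, 1]])
k = 2

lhs = k * (A @ B)   # k(AB)
mid = (k * A) @ B   # (kA)B
rhs = A @ (k * B)   # A(kB)

print(lhs)
print(np.array_equal(lhs, mid) and np.array_equal(mid, rhs))  # True
```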

Some basic rules for matrix multiplication:

1. Matrix multiplication is not commutative, i.e., in general AB ≠ BA. For example,
$$\begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 3 & 0 \end{bmatrix} \neq \begin{bmatrix} 1 & 2 \\ 0 & 0 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}\begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}.$$

2. If AB = O, it does not follow that A = O or B = O (that is, neither factor needs to equal the zero matrix). For example,
$$\begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix}\begin{bmatrix} 1 & -1 \\ -1 & 1 \end{bmatrix} = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}.$$

3. AB = AC does not imply that B = C. For example,
$$\begin{bmatrix} 1 & 2 \\ 2 & 4 \end{bmatrix}\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix} = \begin{bmatrix} 1 & 2 \\ 2 & 4 \end{bmatrix}\begin{bmatrix} 0 & 0 \\ 1/2 & 0 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 2 & 0 \end{bmatrix},$$
even though the two right-hand factors differ. (All three rules are illustrated numerically in the sketch below.)
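A minimal NumPy sketch using the matrices from the examples above:

```python
import numpy as np

# Rule 1: matrix multiplication is not commutative.
A = np.array([[1., 2.], [3., 4.]])
E = np.array([[1., 0.], [0., 0.]])
print(np.array_equal(A @ E, E @ A))        # False

# Rule 2: a product can be the zero matrix although neither factor is.
J = np.ones((2, 2))
K = np.array([[1., -1.], [-1., 1.]])
print(J @ K)                               # the 2 x 2 zero matrix

# Rule 3: AB = AC does not force B = C.
M = np.array([[1., 2.], [2., 4.]])
B = np.array([[1., 0.], [0., 0.]])
C = np.array([[0., 0.], [0.5, 0.]])
print(np.array_equal(M @ B, M @ C))        # True, although B != C
```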


Properties of the Transpose

Theorem 3 Let A and B be matrices (whose sizes are such that the indicated operations can be performed) and let k be a scalar. Then

1. (A^T)^T = A

2. (A + B)^T = A^T + B^T

3. (kA)^T = k(A^T)

4. (AB)^T = B^T A^T

5. (A^r)^T = (A^T)^r for all nonnegative integers r.



Proof that (A^T)^T = A: ((A^T)^T)_ij = (A^T)_ji = (A)_ij.

Let A be n × n. Proof that (A^r)^T = (A^T)^r for r = 0, 1, 2, ... (mathematical induction on r):

Base case r = 0: (A^0)^T = (I_n)^T = I_n = (A^T)^0.

Inductive step: assume the statement holds for r = k, i.e. (A^k)^T = (A^T)^k. Then
$$(A^{k+1})^T = (A^k A)^T = A^T (A^k)^T = A^T (A^T)^k = (A^T)^{k+1},$$
where the second equality uses property 4 and the third uses the inductive hypothesis.

Example: Let
$$A = \begin{bmatrix} 1 & 3 & 2 \\ 2 & 0 & 1 \end{bmatrix} \quad \text{and} \quad B = \begin{bmatrix} 0 & 1 & 1 \\ 1 & 4 & 3 \end{bmatrix}.$$
Confirm the properties of the transpose listed above for these two matrices.
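A numerical check of these properties for the matrices above. As reconstructed, both are 2 × 3, so property 4 is illustrated with the compatible product AB^T, and property 5, which needs a square matrix, is skipped. A NumPy sketch:

```python
import numpy as np

A = np.array([[1, 3, 2], [2, 0, 1]])
B = np.array([[0, 1, 1], [1, 4, 3]])
k = 2

assert np.array_equal(A.T.T, A)              # property 1
assert np.array_equal((A + B).T, A.T + B.T)  # property 2
assert np.array_equal((k * A).T, k * A.T)    # property 3
assert np.array_equal((A @ B.T).T, B @ A.T)  # property 4, using the compatible product A B^T
print("Transpose properties confirmed for these matrices.")
```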


Theorem 4

1. If A is a square matrix, then A + A^T is a symmetric matrix.

2. For any matrix A, AA^T and A^T A are symmetric matrices.

Example (Example 3.21): Let
$$A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}.$$
Show that A + A^T, AA^T and A^T A are symmetric.
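A quick numerical illustration of Example 3.21 (a sketch; symmetry is checked by comparing each matrix with its transpose):

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])

def is_symmetric(M):
    """Return True if M equals its own transpose."""
    return np.array_equal(M, M.T)

print(is_symmetric(A + A.T))   # True
print(is_symmetric(A @ A.T))   # True
print(is_symmetric(A.T @ A))   # True
```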


A square matrix is called upper triangular if all the entries below the main diagonal are zero, i.e. it has the form
$$\begin{bmatrix} * & * & \cdots & * \\ 0 & * & \cdots & * \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & * \end{bmatrix},$$
where the entries marked * are arbitrary. For example, the following are upper triangular:
$$\begin{bmatrix} 1 & 7 & 2 \\ 0 & 2 & 3 \\ 0 & 0 & 1 \end{bmatrix}, \qquad \begin{bmatrix} 1 & 0 & 2 \\ 0 & 0 & 3 \\ 0 & 0 & 2 \end{bmatrix}.$$

Similarly, the following are lower triangular:
$$\begin{bmatrix} 1 & 0 & 0 \\ 5 & 2 & 0 \\ 3 & 4 & 1 \end{bmatrix}, \qquad \begin{bmatrix} 5 & 0 & 0 \\ 3 & 0 & 0 \\ 2 & 3 & 1 \end{bmatrix}.$$


A square matrix A is called skew-symmetric if A^T = −A.

Exercise: Is the following matrix skew-symmetric?

$$\begin{bmatrix} 0 & 1 & 2 \\ -1 & 0 & 5 \\ -2 & -5 & 0 \end{bmatrix}$$

Exercise: Show that if A is square, then A − A^T is skew-symmetric.
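A one-line solution sketch for the last exercise (added here), using properties 1 and 2 of the transpose:
$$(A - A^T)^T = A^T - (A^T)^T = A^T - A = -(A - A^T),$$
so A − A^T is skew-symmetric.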


The trace of an n × n matrix A = [a_ij] is the sum of the entries on its main diagonal and is denoted by tr(A). That is,
$$\operatorname{tr}(A) = a_{11} + a_{22} + \cdots + a_{nn}.$$

Exercise: If A and B are n × n matrices, prove the following properties of the trace (a solution sketch follows the list):

1. tr(A + B) = tr(A) + tr(B)

2. tr(kA) = k tr(A), where k is a scalar.
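A solution sketch (added here); both properties follow directly from the definition:
$$\operatorname{tr}(A + B) = \sum_{i=1}^{n}(a_{ii} + b_{ii}) = \sum_{i=1}^{n} a_{ii} + \sum_{i=1}^{n} b_{ii} = \operatorname{tr}(A) + \operatorname{tr}(B),$$
$$\operatorname{tr}(kA) = \sum_{i=1}^{n} k\,a_{ii} = k \sum_{i=1}^{n} a_{ii} = k\,\operatorname{tr}(A).$$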

