
Chapter 4

Linear Independence and Linear Transformations

4.1 Linear Independence, Subspaces and Dimension

Suppose we are given vectors v1, ..., vm in R^n. We would like to know whether these vectors are linearly independent. This amounts to asking whether

    x1 v1 + ... + xm vm = 0                                    (4.1.1)

has a non-trivial solution (the trivial solution is x1 = ... = xm = 0; anything else is a non-trivial solution). Let A be the n × m matrix whose column vectors are v1, ..., vm. Then the above question is equivalent to asking whether

    Ax = 0,  where x = (x1, ..., xm)^T,                        (4.1.2)

has a non-trivial solution. This is nothing other than (3.5.1). Theorem 8 tells us exactly when there are non-trivial solutions. If the rank r of A is equal to m, then there is only the trivial solution x = 0, and thus the vectors v1, ..., vm are linearly independent. Otherwise, the vectors are linearly dependent.
We state the above observation as a proposition.

Proposition 2. Let v1, ..., vm be vectors in R^n, and let A be the n × m matrix whose column vectors are v1, ..., vm. If the rank r of A is equal to m, then the vectors v1, ..., vm are linearly independent. If not, the vectors are linearly dependent.
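Proposition 2 translates directly into a computation: stack the vectors as the columns of a matrix and compare the rank against the number of vectors. A minimal sketch using NumPy (the helper `are_independent` and the sample vectors are ours, for illustration):

```python
import numpy as np

def are_independent(vectors):
    """Proposition 2 as a computation: stack the vectors as the columns
    of an n x m matrix A and test whether rank(A) equals m."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == A.shape[1]

# Two independent vectors in R^3:
print(are_independent([[1., 0., 0.], [0., 1., 0.]]))                # True
# A dependent triple: the third vector is the sum of the first two.
print(are_independent([[1., 0., 0.], [0., 1., 0.], [1., 1., 0.]]))  # False
```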

As an immediate consequence of the above, we have the following.


Proposition 3. It is impossible to have more than n linearly independent vectors in R^n.

Proof. Suppose we have m > n linearly independent vectors. This implies that the n × m matrix A formed by these vectors must have rank m, by Proposition 2. But the row echelon form of A is an n × m matrix, and each of its n rows contains at most one pivot, so its rank is at most n. Since m > n, this is a contradiction.
Another consequence of Proposition 2 is the following.
Theorem 9. For an n × n matrix A, the following statements are equivalent.
1. A is invertible.
2. The column vectors of A are linearly independent.
3. The row vectors of A are linearly independent.
Proof. Item 2 is equivalent to the statement that

    Ax = 0                                                     (4.1.3)

has only the trivial solution. The equivalence of items 1 and 2 thus follows from Theorem 6. Since the row vectors of A are the column vectors of A^T, item 3 is equivalent to A^T being invertible. So we have only to show that the invertibility of A is equivalent to the invertibility of A^T. Suppose A is invertible. Then there is a matrix B such that

    AB = BA = I.                                               (4.1.4)

Let us now take the transpose of the above, and use (1.4.3):

    B^T A^T = A^T B^T = I^T = I.                               (4.1.5)

We see therefore that B^T is the inverse of A^T, and therefore A^T is invertible. Since the transpose of A^T is A, we can repeat the same argument to show that the invertibility of A^T implies the invertibility of A.
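Theorem 9 can be witnessed numerically: a square matrix is invertible exactly when its rank is n, and the transpose argument of the proof is easy to check. A small sketch (the matrices A and B are made up for illustration):

```python
import numpy as np

A = np.array([[1., 2.], [3., 4.]])   # columns independent: rank 2
B = np.array([[1., 2.], [2., 4.]])   # second column = 2 x first: rank 1

print(np.linalg.matrix_rank(A))      # 2, so A is invertible
print(np.linalg.matrix_rank(B))      # 1, so B is singular

# The transpose step of the proof: (A^T)^(-1) = (A^(-1))^T, as in (4.1.5).
print(np.allclose(np.linalg.inv(A.T), np.linalg.inv(A).T))      # True
```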
Now, suppose the rank r of A in (4.1.2) is smaller than m. Let us look at the situation a little further. As we did in Section 3.4, we label the columns with pivots as i1 < i2 < ... < ir and the columns without pivots as j1 < j2 < ... < j_{m-r}. According to Theorem 8, the solution to (4.1.2) can be written as:

    x = c1 a1 + ... + c_{m-r} a_{m-r}.                         (4.1.6)

Take any a_l, l = 1, ..., m-r. x = a_l is a solution to (4.1.2), and this implies that the vector v_{j_l} can be written as a linear combination of the v_{i_q} with i_q < j_l. On the other hand, the vectors v_{i1}, ..., v_{ir} are linearly independent. Indeed, if v_{i1}, ..., v_{ir} were linearly dependent, there would be a nontrivial solution to (4.1.2) such that x_{j_l} = 0 for all l. But x_{j_l} = 0 for all l implies c_l = 0 for all l in (4.1.6), a contradiction. Let us put this into a proposition.
Proposition 4. Suppose we have an n × m matrix A whose rank is r, and suppose we have reduced A to its row echelon form R. The r column vectors of A corresponding to the pivot columns of R are linearly independent, and the rest of the column vectors of A can be written as linear combinations of these r vectors.
Example 8. Consider the column vectors of the 4 × 5 matrix (3.5.5). The rank of this matrix is 3, as can be seen from (3.5.6). Looking at the row echelon form (3.5.6) of this matrix, we see that the column vectors 1, 2, 5 are linearly independent:

    v1 = (1, 1, 2, 0)^T,  v2 = (2, 3, 2, 3)^T,  v5 = (1, 3, 2, 3)^T.      (4.1.7)

The column vectors v3 and v4 are expressed as:

    v3 = v1 + v2,  v4 = v1 - v2.                               (4.1.8)
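The pivot bookkeeping of Proposition 4 can be reproduced with a computer algebra system. A small sketch using SymPy's `rref` (the matrix here is invented for illustration, not the matrix (3.5.5); its columns 3 and 4 are built as combinations of columns 1 and 2):

```python
from sympy import Matrix

# Columns 3 and 4 are combinations of columns 1 and 2,
# so the pivot columns should be exactly the first two.
A = Matrix([[1, 0, 1,  1],
            [0, 1, 1, -1],
            [2, 3, 5, -1]])

R, pivots = A.rref()      # reduced row echelon form, pivot column indices
print(pivots)             # (0, 1): columns 1 and 2 of A are independent
# Column 3 of A equals col1 + col2, as Proposition 4 predicts:
print(A.col(2) == A.col(0) + A.col(1))   # True
```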

To proceed further, let us introduce the notion of a subspace.

Definition 4 (Subspace of R^n). A subspace V of R^n is a subset of R^n with the following two properties.

1. For a vector v ∈ V and an arbitrary scalar c, cv also belongs to V.
2. For two vectors v, w ∈ V, v + w is also in V.

Given m vectors v1, ..., vm, the set of all vectors

    c1 v1 + ... + cm vm                                        (4.1.9)

forms a subspace. If all vectors in a subspace V can be written as linear combinations of vectors v1, ..., vm, we say that the vectors v1, ..., vm span V.
MATH 2574H    Yoichiro Mori

Proposition 5. Every subspace V of R^n is spanned by a finite number of linearly independent vectors.

Proof. If the subspace consists of just the 0 vector, there is nothing to prove. Suppose otherwise. Pick a non-zero vector v1 in V. Consider the span:

    c1 v1,  c1 ∈ R.                                            (4.1.10)

If this spans all of V, we are done. If not, there must be a vector v2 in V that cannot be expressed in the above form. Therefore, v1 and v2 are linearly independent, and every vector of the form

    c1 v1 + c2 v2,  c1, c2 ∈ R                                 (4.1.11)

belongs to V. If this spans all of V, we are done. If not, we add another vector v3 of V not expressible as above, which is then linearly independent of the vectors chosen so far. This process has to stop before we add the (n+1)st vector, since there are at most n linearly independent vectors in R^n, according to Proposition 3.
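The proof of Proposition 5 is really an algorithm: keep adjoining vectors that are not yet in the span of those already chosen. A sketch of that greedy procedure (the helpers `in_span` and `greedy_basis` are ours; membership in a span is tested by comparing ranks):

```python
import numpy as np

def in_span(vs, w):
    """True if w is a linear combination of the vectors in vs (rank test)."""
    if not vs:
        return not np.any(w)
    A = np.column_stack(vs)
    # w lies in span(vs) iff appending it does not raise the rank.
    return np.linalg.matrix_rank(np.column_stack(vs + [w])) == np.linalg.matrix_rank(A)

def greedy_basis(vectors):
    """The procedure in the proof of Proposition 5."""
    basis = []
    for v in vectors:
        if not in_span(basis, v):
            basis.append(v)
    return basis

# The third vector is the sum of the first two, so the basis has size 2.
vs = [np.array([1., 2., 4.]), np.array([1., 1., 0.]), np.array([2., 3., 4.])]
print(len(greedy_basis(vs)))   # 2
```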
Definition 5 (Basis). Suppose a subspace V of R^n is spanned by linearly independent vectors v1, ..., vm. We say that such vectors are a set of basis vectors of V.

Proposition 5 thus states that every subspace has a basis.
Example 9. Consider the following subset V of R^3:

    c1 (1, 2, 4)^T + c2 (1, 1, 0)^T,                           (4.1.12)

where c1 and c2 are arbitrary constants. This is a subspace of R^3 spanned by (1, 2, 4)^T and (1, 1, 0)^T. The two vectors are linearly independent, and therefore the two vectors form a basis of V and the dimension of V is 2. It is also possible to express the same subspace as:

    c1 (2, 3, 4)^T + c2 (1, 1, 0)^T.                           (4.1.13)

There are thus many different ways of expressing the same subspace. It is also true that V can be expressed as:

    c1 (1, 2, 4)^T + c2 (2, 3, 4)^T + c3 (1, 1, 0)^T,          (4.1.14)


but in this case, the three vectors are not linearly independent.
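That (4.1.12) and (4.1.13) really describe the same subspace can be checked numerically: the two spanning sets span the same subspace exactly when the combined set of vectors has the same rank as each set separately. A quick NumPy check:

```python
import numpy as np

U = np.column_stack([[1., 2., 4.], [1., 1., 0.]])   # spanning set of (4.1.12)
W = np.column_stack([[2., 3., 4.], [1., 1., 0.]])   # spanning set of (4.1.13)

r_U = np.linalg.matrix_rank(U)
r_W = np.linalg.matrix_rank(W)
r_all = np.linalg.matrix_rank(np.hstack([U, W]))

# Equal ranks: adjoining the other set adds no new directions.
print(r_U, r_W, r_all)   # 2 2 2
```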
As we have seen above, there are various choices for basis vectors of a
subspace, but the number of basis vectors is always the same.
Proposition 6. Any two sets of basis vectors of a subspace have the same number of vectors.
Proof. Suppose otherwise. Then there are basis vectors v1, ..., vm and w1, ..., wq with m ≠ q. Suppose m < q. Then each vector wk can be written as:

    wk = a1k v1 + a2k v2 + ... + amk vm,                       (4.1.15)

where the ajk are scalar constants. To examine the linear independence of the wk, we must examine the expression

    Σ_{k=1}^{q} xk wk = Σ_{j=1}^{m} ( Σ_{k=1}^{q} ajk xk ) vj = 0,    (4.1.16)

where the xk are scalars. Since the vj are linearly independent, we have:

    Σ_{k=1}^{q} ajk xk = 0  for j = 1, ..., m.                 (4.1.17)

This is a linear homogeneous system of m equations in q unknowns x1, ..., xq. Since m < q, by Proposition 1 there is a non-trivial solution. This contradicts the assumption that w1, ..., wq were linearly independent. The case q < m can be handled in exactly the same manner.
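The pivotal step in this proof, that a homogeneous system with more unknowns than equations always has a non-trivial solution, is easy to witness numerically. A small sketch (this particular 2 × 3 system is invented for illustration; the kernel direction is read off from the singular value decomposition):

```python
import numpy as np

# 2 equations in 3 unknowns: the rank is at most 2 < 3, so a
# non-trivial solution of Ax = 0 must exist.
A = np.array([[1., 2., 3.],
              [4., 5., 6.]])

# Right singular vectors beyond the rank span the kernel.
_, _, Vt = np.linalg.svd(A)
x = Vt[np.linalg.matrix_rank(A)]      # a unit vector in the kernel

print(np.allclose(A @ x, 0), np.linalg.norm(x) > 0)   # True True
```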
The above proposition allows us to define the dimension of a subspace.
Definition 6. The dimension of a subspace V in Rn is the number of basis
vectors of the subspace.
In particular, this means that subspaces of Rn can be classified by their
dimension. Subspaces of R2 are:
Dimension 0: the origin.
Dimension 1: lines through the origin.
Dimension 2: the whole plane.
Subspaces of R3 are:

Dimension 0: the origin.


Dimension 1: lines through the origin.
Dimension 2: planes through the origin.
Dimension 3: the whole space.
A similar classification is possible for Rn .
Example 10. Consider the vectors:

    v1 = (1, 2, 3)^T,  v2 = (4, 5, 6)^T,  v3 = (7, 8, 9)^T.    (4.1.18)

The span of these three vectors forms a subspace of R^3. To find the dimension of the subspace, form the matrix consisting of these three vectors:

        1 4 7
    A = 2 5 8                                                  (4.1.19)
        3 6 9

The row echelon form is:

        1 0 -1
    R = 0 1  2                                                 (4.1.20)
        0 0  0

This shows that the space spanned by the three vectors is two-dimensional, and is spanned by the vectors v1 and v2, with

    v3 = 2 v2 - v1.                                            (4.1.21)

v1 and v2 are not the only vectors that form a basis of this subspace. Indeed, one can find any number of bases. For example, v1 and v3 also form a basis of the same subspace.
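A quick numerical check of Example 10 with NumPy:

```python
import numpy as np

v1 = np.array([1., 2., 3.])
v2 = np.array([4., 5., 6.])
v3 = np.array([7., 8., 9.])

A = np.column_stack([v1, v2, v3])
print(np.linalg.matrix_rank(A))        # 2: the span is two-dimensional
print(np.allclose(v3, 2 * v2 - v1))    # True: the relation (4.1.21)
# {v1, v3} is also a basis of the same plane:
print(np.linalg.matrix_rank(np.column_stack([v1, v3])))   # 2
```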

4.2 Linear Transformations

An n × n matrix A can be seen as a map from R^n to R^n. This is called a linear transformation. We define two important concepts for a linear transformation.

Definition 7 (Kernel and Image). The kernel or nullspace of a matrix A is the set of vectors v ∈ R^n that satisfy:

    Av = 0.                                                    (4.2.1)

The kernel of A is written as ker A. The image of a matrix A is the set of vectors v ∈ R^n for which

    Ax = v                                                     (4.2.2)

has a solution x. The image of A is written as Im A.
Both the kernel and the image are subspaces of R^n. This can be seen as follows. Suppose v and w are in ker A. Then,

    A(cv) = cA(v) = 0,  A(v + w) = Av + Aw = 0.                (4.2.3)

Therefore, cv and v + w are in the kernel of A. Now take two vectors v and w in the image of A. This means that there are vectors x and y such that

    Ax = v,  Ay = w.                                           (4.2.4)

Therefore, we have

    A(cx) = cAx = cv,  A(x + y) = Ax + Ay = v + w,             (4.2.5)

so cv and v + w are also in the image of A.

Since the kernel and image are both subspaces of R^n, we can consider their dimensions. Let us now consider the dimension of the kernel and of the image.
Proposition 7. Let A be an n × n matrix. The dimension of the kernel is equal to n - r, where r is the rank of the matrix.

Proof. Finding the kernel is the same as solving the equation:

    Ax = 0,  x ∈ R^n.                                          (4.2.6)

We know from item 1 of Theorem 8 that the solution to the above is written as a linear combination of n - r linearly independent vectors. This is nothing other than the statement that the kernel has dimension n - r.
We now turn to the image.

Proposition 8. Let A be an n × n matrix. The dimension of the image is equal to the rank r of the matrix A.

Proof. The image of the matrix A consists of all vectors of the form:

    Ax = x1 v1 + ... + xn vn,                                  (4.2.7)

where v1, ..., vn are the column vectors of A and x = (x1, ..., xn)^T. Therefore, the image is spanned by the column vectors. We know from Proposition 4 that the r column vectors that correspond to the pivots are linearly independent and that the rest can be written as linear combinations of these r vectors. The r pivot columns thus form a basis of the image, and the image has dimension r.
The main result of this section now follows.

Theorem 10. Suppose A is an n × n square matrix. Then,

    rank A + dim ker A = n,                                    (4.2.8)

where dim ker A is the dimension of the kernel of A.
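Rank-nullity is easy to verify numerically. A sketch using NumPy (the kernel dimension is counted as the number of vanishing singular values; the tolerance 1e-10 is our own choice):

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [1., 3., 5.],
              [0., 1., 2.]])
n = A.shape[0]

rank = np.linalg.matrix_rank(A)
s = np.linalg.svd(A, compute_uv=False)
dim_ker = int(np.sum(s < 1e-10))       # number of vanishing singular values

print(rank, dim_ker, rank + dim_ker == n)   # 2 1 True
```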


Example 11. Consider the matrix A and its row-reduced form R:

        1 2 3        1 0 -1
    A = 1 3 5 ,  R = 0 1  2                                    (4.2.9)
        0 1 2        0 0  0

We see from this that the image is two-dimensional, where

    v1 = (1, 1, 0)^T,  v2 = (2, 3, 1)^T                        (4.2.10)

can be taken as a basis. The kernel of A is one-dimensional and is spanned by:

    (1, -2, 1)^T.                                              (4.2.11)
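The kernel and image in Example 11 can be recomputed symbolically with SymPy's `nullspace` and `columnspace` routines, which return basis vectors directly:

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [1, 3, 5],
            [0, 1, 2]])

print(A.rank())              # 2
print(A.nullspace()[0].T)    # the kernel basis vector (1, -2, 1)^T
print([c.T for c in A.columnspace()])   # the first two columns of A
```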
Consider the matrix A and its row-reduced form R:

        1 4 3  2         1 0 -1 -2
    A = 2 7 5  3 ,   R = 0 1  1  1                             (4.2.12)
        3 4 1 -2         0 0  0  0
        1 1 0 -1         0 0  0  0

The image is 2-dimensional and is spanned by:

    (1, 2, 3, 1)^T,  (4, 7, 4, 1)^T.                           (4.2.13)

The kernel is also 2-dimensional and is spanned by:

    (1, -1, 1, 0)^T,  (2, -1, 0, 1)^T.                         (4.2.14)

4.3 Exercises

1. Consider the linear dependence/independence of the following sets of vectors. If linearly dependent, find a set of linearly independent vectors and express the other vectors in terms of them.
(a) (1, 3, 1)^T, (1, 0, 1)^T, (1, 0, 1)^T, (3, 3, 1)^T.
(b) (2, 1, 0)^T, (0, 1, 2)^T, (1, 0, 1)^T, (1, 1, 1)^T.
(c) (3, 0, 0, 3)^T, (1, 0, 1, 0)^T, (0, 1, 0, 0)^T.
2. Consider the xy plane (the plane z = 0) in the three-dimensional space R^3. Find two different sets of basis vectors for the xy plane.
3. Let x = (x, y, z, w)^T ∈ R^4. Consider the set of all vectors in R^4 with w = 0.
(a) Show that this set is a subspace of R^4.
(b) Find a set of basis vectors for this subspace. What is its dimension?
4. Argue why the following subsets of R^2 are not subspaces.
(a) The inside of a circle in R^2 centered at the origin.
(b) A line in R^2 that does not go through the origin.
(c) The first quadrant of R^2.
(d) The first and third quadrants of R^2 combined (including the x and y axes).
5. Consider two n × n matrices A and B. The matrix A has rank n and B has rank r. What is the rank of the matrix AB? What about BA? Can you say anything about the rank of A + B?

6. Find the image and kernel of the following matrices.

    1 2 1     2 1 0     1 0 1
    1 2 1 ,   2 3 4 ,   0 0 0 ,
    1 1 1     0 1 2     1 0 1

    1 2 1  5     0 1 0 1
    1 2 3  2     1 1 1 0
    1 6 1 12 ,   1 2 3 0
    0 4 2  7     2 2 4 1
7. Consider the matrix:

              1 1 1
    A = (1/3) 1 1 1
              1 1 1

(a) Find the image and kernel of A.

(b) Let v be a vector on the line spanned by (1, 1, 1)^T. Where does v get mapped to?
(c) Let w be a vector perpendicular to the vector (1, 1, 1)^T. Where does w get mapped to?
(d) Geometrically describe what kind of linear transformation A is.
(e) Show that A^2 = A, and hence, (I - A)^2 = I - A.
(f) Geometrically describe what kind of linear transformation I - A is.
