
MATH 110: LINEAR ALGEBRA

FALL 2007/08
PROBLEM SET 9 SOLUTIONS

Date: December 9, 2007 (Version 1.0).

In the following, V will denote a finite-dimensional vector space over R that is also an inner product
space, with inner product denoted by ⟨·, ·⟩. The norm induced by ⟨·, ·⟩ will be denoted ‖·‖.

1. Let S be a subset (not necessarily a subspace) of V. We define the orthogonal annihilator of S,
denoted S⊥, to be the set
    S⊥ = {v ∈ V | ⟨v, w⟩ = 0 for all w ∈ S}.
(a) Show that S⊥ is always a subspace of V.
Solution. Let v₁, v₂ ∈ S⊥. Then ⟨v₁, w⟩ = 0 and ⟨v₂, w⟩ = 0 for all w ∈ S. So for any
α, β ∈ R,
    ⟨αv₁ + βv₂, w⟩ = α⟨v₁, w⟩ + β⟨v₂, w⟩ = 0
for all w ∈ S. Hence αv₁ + βv₂ ∈ S⊥.
(b) Show that S ⊆ (S⊥)⊥.
Solution. Let w ∈ S. For any v ∈ S⊥, we have ⟨v, w⟩ = 0 by definition of S⊥. Since
this is true for all v ∈ S⊥ and ⟨w, v⟩ = ⟨v, w⟩, we see that ⟨w, v⟩ = 0 for all v ∈ S⊥, i.e.,
w ∈ (S⊥)⊥.
(c) Show that span(S) ⊆ (S⊥)⊥.
Solution. By (a), (S⊥)⊥ is a subspace (it is the orthogonal annihilator of S⊥), and
S ⊆ (S⊥)⊥ by (b). Since span(S) is the smallest subspace containing S, we get span(S) ⊆ (S⊥)⊥.
(d) Show that if S₁ and S₂ are subsets of V and S₁ ⊆ S₂, then S₂⊥ ⊆ S₁⊥.
Solution. Let v ∈ S₂⊥. Then ⟨v, w⟩ = 0 for all w ∈ S₂, and so for all w ∈ S₁ (since
S₁ ⊆ S₂). Hence v ∈ S₁⊥.
(e) Show that ((S⊥)⊥)⊥ = S⊥.
Solution. Applying (d) to (b), with S₁ = S and S₂ = (S⊥)⊥, we get
    ((S⊥)⊥)⊥ ⊆ S⊥.
Applying (c) to S⊥, we get
    span(S⊥) ⊆ ((S⊥)⊥)⊥,
but by (a), span(S⊥) = S⊥. Hence we get equality.
(f) Show that S ∩ S⊥ must be either the empty set ∅ or the zero subspace {0V}.
Solution. If S ∩ S⊥ ≠ ∅, then let v ∈ S ∩ S⊥. Since v ∈ S⊥, we have ⟨v, w⟩ = 0
for all w ∈ S. Since v ∈ S, in particular ‖v‖² = ⟨v, v⟩ = 0, implying that v = 0V. In
other words, the only vector that can lie in S ∩ S⊥ is 0V, so S ∩ S⊥ = {0V}. On the other hand, if
0V ∉ S, then 0V ∉ S ∩ S⊥, and so S ∩ S⊥ = ∅ (if not, the previous argument gives a
contradiction).
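
As a numerical sanity check of (a)–(c), one can compute S⊥ for a finite S ⊂ R³ as the null
space of the matrix whose rows are the vectors of S. In the NumPy sketch below, the helper
perp, the sample set S, and the rank tolerance are all illustrative choices, not part of the
problem set.

    import numpy as np

    def perp(S):
        # Rows of S are the vectors of the subset; return an orthonormal
        # basis (as rows) of S-perp, i.e. the null space of S, via the SVD.
        S = np.atleast_2d(S)
        _, sing, Vt = np.linalg.svd(S)
        rank = int(np.sum(sing > 1e-12))
        return Vt[rank:]

    # A subset of R^3 that is not a subspace: just two vectors.
    S = np.array([[1.0, 0.0, 0.0],
                  [1.0, 1.0, 0.0]])

    Sp = perp(S)     # S-perp: here the z-axis
    Spp = perp(Sp)   # (S-perp)-perp: the xy-plane = span(S)

    # 1(b): every vector of S is orthogonal to every vector of S-perp.
    assert np.allclose(S @ Sp.T, 0)

    # 1(c): span(S) ⊆ (S-perp)-perp; here the two spans coincide, so
    # stacking S on top of Spp does not increase the rank.
    assert np.linalg.matrix_rank(np.vstack([S, Spp])) == np.linalg.matrix_rank(S)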

2. Let W be a subspace of V. The subspace W⊥ is called the orthogonal complement of W.


(a) Show that V = W ⊕ W⊥.
Solution. Since W and W⊥ are both subspaces (the latter by Problem 1(a)), 0V ∈ W
and 0V ∈ W⊥. So by Problem 1(f), we have W ∩ W⊥ = {0V}. It remains to show that
W + W⊥ = V. Clearly W + W⊥ ⊆ V. For the reverse inclusion, let v ∈ V and let {w₁, . . . , wᵣ} be
an orthonormal basis of W. Consider
    x := ⟨v, w₁⟩w₁ + ⟨v, w₂⟩w₂ + · · · + ⟨v, wᵣ⟩wᵣ
and
    y := v − x.
Clearly x ∈ W. We claim that y ∈ W⊥. For any w ∈ W, we can write
    w = ⟨w, w₁⟩w₁ + ⟨w, w₂⟩w₂ + · · · + ⟨w, wᵣ⟩wᵣ
since {w₁, . . . , wᵣ} is an orthonormal basis of W. So
    ⟨y, w⟩ = ⟨v − x, w⟩
           = ⟨v, w⟩ − ⟨x, w⟩
           = ⟨v, Σᵢ ⟨w, wᵢ⟩wᵢ⟩ − ⟨Σᵢ ⟨v, wᵢ⟩wᵢ, w⟩
           = Σᵢ ⟨v, wᵢ⟩⟨w, wᵢ⟩ − Σᵢ ⟨v, wᵢ⟩⟨wᵢ, w⟩
           = 0,
where the sums run over i = 1, . . . , r and the last step uses ⟨w, wᵢ⟩ = ⟨wᵢ, w⟩. Hence
v = x + y ∈ W + W⊥.
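
The decomposition v = x + y constructed above is the orthogonal projection onto W. A minimal
NumPy sketch of the same computation, with an arbitrary 3-dimensional subspace W of R⁵ (the
random basis and seed are illustrative choices):

    import numpy as np

    rng = np.random.default_rng(0)

    # An orthonormal basis {w1, w2, w3} of a subspace W of R^5, obtained
    # here by QR-factoring a random matrix (an illustrative choice).
    W, _ = np.linalg.qr(rng.standard_normal((5, 3)))  # columns are w1, w2, w3

    v = rng.standard_normal(5)

    # x = <v,w1>w1 + <v,w2>w2 + <v,w3>w3  and  y = v - x, as in the solution.
    x = W @ (W.T @ v)
    y = v - x

    assert np.allclose(W.T @ y, 0)   # y is orthogonal to W, i.e. y lies in W-perp
    assert np.allclose(x + y, v)     # v = x + y exhibits v in W + W-perp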
(b) Show that W = (W⊥)⊥.
Solution. By Problem 1(b), we have W ⊆ (W⊥)⊥. But by (a), we have
    W ⊕ W⊥ = V = W⊥ ⊕ (W⊥)⊥,
and so
    dim(W) + dim(W⊥) = dim(W⊥) + dim((W⊥)⊥),
and so
    dim(W) = dim((W⊥)⊥).
Hence W = (W⊥)⊥, since a subspace of (W⊥)⊥ of the same dimension must equal it.

3. Let A ∈ Rᵐˣⁿ and regard A : Rⁿ → Rᵐ as a linear transformation in the usual way. Show that
    nullsp(Aᵀ) = (colsp(A))⊥
and
    rowsp(A) = (nullsp(A))⊥.
Hence deduce that
    Rᵐ = nullsp(Aᵀ) ⊕ colsp(A)
and
    Rⁿ = rowsp(A) ⊕ nullsp(A).
Solution. Clearly colsp(A) = {Ax ∈ Rᵐ | x ∈ Rⁿ}. So if v ∈ (colsp(A))⊥, then
    ⟨v, Ax⟩ = 0
for all x ∈ Rⁿ. So
    ⟨Aᵀv, x⟩ = (Aᵀv)ᵀx = vᵀAx = ⟨v, Ax⟩ = 0
for all x ∈ Rⁿ. In particular, we may pick x = Aᵀv to get
    ‖Aᵀv‖² = ⟨Aᵀv, Aᵀv⟩ = 0,
giving us
    Aᵀv = 0.
Hence v ∈ nullsp(Aᵀ). Conversely, if v ∈ nullsp(Aᵀ), then
    ⟨v, Ax⟩ = vᵀAx = (Aᵀv)ᵀx = 0ᵀx = 0
for all x ∈ Rⁿ. So v ∈ (colsp(A))⊥. This proves the first equality. For the second equality, just
note that
    rowsp(A) = colsp(Aᵀ)
and apply the first equality (with Aᵀ in place of A) to get
    (colsp(Aᵀ))⊥ = nullsp(A),
i.e., (rowsp(A))⊥ = nullsp(A), and then apply Problem 2(b) to get
    rowsp(A) = ((rowsp(A))⊥)⊥ = (nullsp(A))⊥.
The last two direct sum decompositions are immediate consequences of Problem 2(a).
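
Both equalities, and the two direct sums, are easy to check numerically: the full SVD
A = UΣVᵀ yields orthonormal bases for all four subspaces. A sketch (the rank-2 test matrix
and tolerance are illustrative choices):

    import numpy as np

    rng = np.random.default_rng(1)
    m, n, r0 = 5, 4, 2
    A = rng.standard_normal((m, r0)) @ rng.standard_normal((r0, n))  # rank 2

    # Orthonormal bases for the four subspaces from the full SVD A = U S V^T.
    U, s, Vt = np.linalg.svd(A)
    r = int(np.sum(s > 1e-10))            # numerical rank of A
    col   = U[:, :r]                      # colsp(A),    a subspace of R^m
    lnull = U[:, r:]                      # nullsp(A^T), a subspace of R^m
    row   = Vt[:r].T                      # rowsp(A),    a subspace of R^n
    null  = Vt[r:].T                      # nullsp(A),   a subspace of R^n

    assert np.allclose(A.T @ lnull, 0)    # lnull really lies in nullsp(A^T)
    assert np.allclose(A @ null, 0)       # null really lies in nullsp(A)
    assert np.allclose(col.T @ lnull, 0)  # nullsp(A^T) is orthogonal to colsp(A)
    assert np.allclose(row.T @ null, 0)   # rowsp(A) is orthogonal to nullsp(A)

    # Complementary dimensions: R^m = nullsp(A^T) + colsp(A) and
    # R^n = rowsp(A) + nullsp(A) as direct sums.
    assert col.shape[1] + lnull.shape[1] == m
    assert row.shape[1] + null.shape[1] == n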

4. Let A, B ∈ Rⁿˣⁿ. Let ⟨·, ·⟩ denote the standard inner product on Rⁿ.

(a) Show that if ⟨Ax, y⟩ = 0 for all x, y ∈ Rⁿ, then A = O, the zero matrix.
Solution. Let {e₁, . . . , eₙ} be the standard basis of Rⁿ. Observe that if A = [aᵢⱼ], then
    aᵢⱼ = eᵢᵀAeⱼ = ⟨eᵢ, Aeⱼ⟩ = ⟨Aeⱼ, eᵢ⟩ = 0
by taking x = eⱼ and y = eᵢ in the hypothesis. Since this is true for all i and j, we have
A = O.
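
The observation that every entry of A is a value of the bilinear form, aᵢⱼ = eᵢᵀAeⱼ, can be
verified directly (an illustrative NumPy check):

    import numpy as np

    rng = np.random.default_rng(2)
    A = rng.standard_normal((3, 3))
    e = np.eye(3)                      # standard basis e1, e2, e3 as columns

    # a_ij = e_i^T A e_j = <A e_j, e_i>: every entry of A is a value of the
    # bilinear form, so <Ax, y> = 0 for all x, y forces A = O.
    for i in range(3):
        for j in range(3):
            assert np.isclose(e[:, i] @ A @ e[:, j], A[i, j])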
(b) Is it true that if ⟨Ax, x⟩ = 0 for all x ∈ Rⁿ, then A = O, the zero matrix?
Solution. Recall the notation from Homework 4, Problem 1(d). Let A ∈ ∧²(Rⁿ), i.e., Aᵀ =
−A. Then
    ⟨Ax, x⟩ = xᵀAᵀx = −xᵀAx = −⟨x, Ax⟩ = −⟨Ax, x⟩,
implying that ⟨Ax, x⟩ = 0. So any non-zero skew-symmetric matrix is a counterexample.
An explicit 2 × 2 example is given by
    A = [ 0  −1 ]
        [ 1   0 ].
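
A randomized check of this counterexample (illustrative sketch; the seed and sample count are
arbitrary):

    import numpy as np

    rng = np.random.default_rng(3)

    # The explicit 2x2 skew-symmetric counterexample from the solution:
    A = np.array([[0.0, -1.0],
                  [1.0,  0.0]])

    # <Ax, x> vanishes for every x, yet A is not the zero matrix.
    for _ in range(1000):
        x = rng.standard_normal(2)
        assert np.isclose(x @ A @ x, 0.0)
    assert not np.allclose(A, 0.0)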
(c) Is it true that if ⟨Ax, x⟩ = ⟨Bx, x⟩ for all x ∈ Rⁿ, then A = B?
Solution. No: by (b), any two distinct skew-symmetric matrices provide a counterexample,
since both quadratic forms vanish identically. An explicit 2 × 2 example is given by
    A = [ 0  −1 ],    B = [ 0  0 ]
        [ 1   0 ]         [ 0  0 ].
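
And the same check for (c) (again an illustrative sketch):

    import numpy as np

    rng = np.random.default_rng(4)
    A = np.array([[0.0, -1.0], [1.0, 0.0]])   # skew-symmetric
    B = np.zeros((2, 2))                       # also (trivially) skew-symmetric

    # <Ax, x> = <Bx, x> = 0 for all x, but A != B.
    for _ in range(1000):
        x = rng.standard_normal(2)
        assert np.isclose(x @ A @ x, x @ B @ x)
    assert not np.allclose(A, B)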

5. Let B₁ = {x₁, . . . , xₙ} and B₂ = {y₁, . . . , yₙ} be two bases of Rⁿ. B₁ and B₂ are not necessarily
orthonormal bases.
(a) Show that there exists an orthogonal matrix Q ∈ Rⁿˣⁿ such that
    Qxᵢ = yᵢ,  i = 1, . . . , n,
if and only if
    ⟨xᵢ, xⱼ⟩ = ⟨yᵢ, yⱼ⟩,  i, j = 1, . . . , n.
Solution. By a theorem we have proved in the lectures, an orthogonal matrix Q preserves
inner products and so satisfies
    ⟨yᵢ, yⱼ⟩ = ⟨Qxᵢ, Qxⱼ⟩ = ⟨xᵢ, xⱼ⟩.
It remains to show that if the latter condition is satisfied, then such an orthogonal matrix
may be defined. First, observe that since B₁ and B₂ are bases, the equations
    Qxᵢ = yᵢ,  i = 1, . . . , n,
define the matrix Q uniquely.¹ We only need to verify that Q is orthogonal. Let x ∈ Rⁿ
and let
    x = α₁x₁ + · · · + αₙxₙ
be the representation of x in terms of B₁. Then
    ‖Qx‖² = ⟨Qx, Qx⟩
          = ⟨α₁Qx₁ + · · · + αₙQxₙ, α₁Qx₁ + · · · + αₙQxₙ⟩
          = ⟨α₁y₁ + · · · + αₙyₙ, α₁y₁ + · · · + αₙyₙ⟩
          = Σᵢ,ⱼ αᵢαⱼ⟨yᵢ, yⱼ⟩
          = Σᵢ,ⱼ αᵢαⱼ⟨xᵢ, xⱼ⟩
          = ⟨α₁x₁ + · · · + αₙxₙ, α₁x₁ + · · · + αₙxₙ⟩
          = ⟨x, x⟩
          = ‖x‖²,
where the sums run over i, j = 1, . . . , n. Since this is true for all x ∈ Rⁿ, i.e., Q preserves
norms, the aforementioned theorem from the lectures implies that Q is orthogonal.
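
Using the footnote's explicit formula Q = Y X⁻¹, the construction can be tested numerically. In
the sketch below, B₂ is manufactured from a random basis B₁ by an orthogonal matrix, so the
Gram condition ⟨xᵢ, xⱼ⟩ = ⟨yᵢ, yⱼ⟩ holds by construction; all choices are illustrative:

    import numpy as np

    rng = np.random.default_rng(5)
    n = 4

    # A (generally non-orthonormal) basis B1 of R^n, as columns of X, and
    # its image B2 = {R x1, ..., R xn} under a random orthogonal R, so
    # that <xi, xj> = <yi, yj> holds by construction.
    X = rng.standard_normal((n, n))
    R, _ = np.linalg.qr(rng.standard_normal((n, n)))   # a random orthogonal matrix
    Y = R @ X

    assert np.allclose(X.T @ X, Y.T @ Y)   # matching Gram matrices

    # As in the footnote, Q = Y X^{-1} is the unique matrix with Q xi = yi.
    Q = Y @ np.linalg.inv(X)

    assert np.allclose(Q @ X, Y)            # Q xi = yi for every i
    assert np.allclose(Q.T @ Q, np.eye(n))  # Q is orthogonal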
(b) Show that if B₁ is an orthonormal basis and if Q ∈ Rⁿˣⁿ is an orthogonal matrix, then
    {Qx₁, . . . , Qxₙ}
is also an orthonormal basis.
Solution. Clearly Qx₁, . . . , Qxₙ are linearly independent, since if
    α₁Qx₁ + · · · + αₙQxₙ = 0,
then
    Q(α₁x₁ + · · · + αₙxₙ) = 0,
and so
    α₁x₁ + · · · + αₙxₙ = Qᵀ0 = 0,
and so α₁ = · · · = αₙ = 0 by the linear independence of x₁, . . . , xₙ. Hence {Qx₁, . . . , Qxₙ}
is a basis (since the set has size n = dim(Rⁿ)). The orthonormality follows from the fact
that Q preserves inner products and that x₁, . . . , xₙ form an orthonormal basis:
    ⟨Qxᵢ, Qxⱼ⟩ = ⟨xᵢ, xⱼ⟩ = { 0 if i ≠ j,
                             { 1 if i = j.
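
A one-line numerical confirmation (illustrative; the random orthonormal basis and orthogonal
matrix are arbitrary choices):

    import numpy as np

    rng = np.random.default_rng(6)
    n = 4

    X, _ = np.linalg.qr(rng.standard_normal((n, n)))   # an orthonormal basis (columns)
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))   # a random orthogonal matrix

    # The Gram matrix of {Qx1, ..., Qxn} is the identity, so the images
    # again form an orthonormal basis.
    assert np.allclose((Q @ X).T @ (Q @ X), np.eye(n))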

¹In fact, Q is just the matrix whose coordinate representation with respect to B₁ and B₂ is the identity matrix,
i.e., [Q]B₁,B₂ = I. Explicitly, if we let X and Y be the matrices whose columns are x₁, . . . , xₙ and y₁, . . . , yₙ
respectively, then Q = Y X⁻¹.
