
MTH 102: Linear Algebra

Department of Mathematics and Statistics Indian Institute of Technology - Kanpur

Problem Set 5

Problems marked (T) are for discussion in Tutorial sessions.


1. Let S = {e1 + e4 , −e1 + 3e2 − e3 } ⊂ R4 . Find S ⊥ .

Solution: (e1 + e4)⊥ is the set of all vectors orthogonal to e1 + e4, that is, the set of all
xT = (x1, . . . , x4) such that x1 + x4 = 0. So S⊥ is the solution space of the homogeneous system

    [  1  0   0  1 | 0 ]
    [ −1  3  −1  0 | 0 ].

Apply GJE (Gauss–Jordan elimination) and get it.
Otherwise apply GS (Gram–Schmidt) to {e1 + e4, −e1 + 3e2 − e3, e1, e2, e3, e4}. The linear span
of the last two vectors of the resulting orthonormal basis is S⊥.
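As a numerical cross-check (an illustrative numpy sketch, not part of the solution), an orthonormal basis of S⊥ can be read off from the SVD of the matrix whose rows span S: the right-singular vectors beyond the rank span the null space.

```python
import numpy as np

# Rows of A are the two spanning vectors of S; S-perp is N(A).
A = np.array([[1.0, 0.0, 0.0, 1.0],
              [-1.0, 3.0, -1.0, 0.0]])

# Right-singular vectors belonging to zero singular values span the null space.
_, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
null_basis = Vt[rank:]          # orthonormal basis of S-perp

assert null_basis.shape == (2, 4)          # dim S-perp = 4 - 2 = 2
assert np.allclose(A @ null_basis.T, 0)    # each basis vector is orthogonal to S
```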
2. Show that there are infinitely many orthonormal bases of R2 .
 
Solution: The columns of the rotation matrix

    [ cos θ  −sin θ ]
    [ sin θ   cos θ ],

for 0 ≤ θ < 2π, form orthonormal bases of R2, and different angles give different bases. The
idea is to take {e1, e2} and rotate the set counter-clockwise by the angle θ.
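A short numerical illustration (a numpy sketch, not part of the solution): for any sample of angles, the columns of the rotation matrix pass the orthonormality test QT Q = I.

```python
import numpy as np

# Columns of Q(theta) are {e1, e2} rotated counter-clockwise by theta.
def rotation(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

for theta in np.linspace(0, 2 * np.pi, 8, endpoint=False):
    Q = rotation(theta)
    # Columns are orthonormal exactly when Q^T Q = I.
    assert np.allclose(Q.T @ Q, np.eye(2))
```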
3. What is the projection of v = e1 + 2e2 − 3e3 on H : x1 + 2x2 + 4x4 = 0?
      
 0 2 4 
0 −1  0
 
Solution: Basis for H:  ,  ,   .
 1
 0 0 

0 0 −1
 
0 2 4 
    

0 −1 8
 
1 1
Orthonormalize: w1 =  , w2 = √  ,w = √  0 .
  

 1 5 0 3 105 

0 0 −5
     
0 4 16
 0 20 8 1 32
The projection is hv, w1 iw1 + hv, w2 iw2 + hv, w3 iw3 =   + 0w2 + 105  0 = 21 −63.
 
−3
0 −5 20
 
1
2
Alternately: Let x be the projection. Then v − x is parallel to  , the normal vector of H.
0
4
1 1 1 16
       
2
5    2 5 2 1  32
\
So v − x = hv, v \
− xiv − x = 21 0. So x = −3 − 21 0 = 21 −63.
4 0 4 20

4. Let V be a subspace of Rn . Then show that dim V = n − 1 if and only if V = {x : aT x = 0}
for some a ≠ 0.

Solution: Let V = {x : aT x = 0} = N (A), where A = aT . Since a ≠ 0, we see that
rank(A) = 1 and hence dim V = n − 1 (use dim(N (A)) + dim(col(A)) = n).
Conversely, suppose that dim V = n − 1. Choose an orthonormal basis {u1 , . . . , un−1 } of V and
extend it to an orthonormal basis {u1 , . . . , un } of Rn . Then V = {x : unT x = 0}.

5. (T) Does there exist a real matrix A for which the row space and column space are the same
but the null-space and left null-space are different?

Solution: Not possible. Use the fundamental theorem of linear algebra, which states that
N (A) = (col(AT ))⊥ and N (AT ) = (col(A))⊥ .
Equal row and column spaces force A to be square (the row space sits in Rn for n the number
of columns, the column space in Rm for m the number of rows). Now, the null-space and the
left null-space are the orthogonal complements of the row space and the column space,
respectively, and these two spaces coincide here. Hence the two null-spaces are also the same.

6. (T) Consider two real systems, say Ax = b and Cy = d. If the two systems have the same
nonempty solution set, is it necessary that row(A) = row(C)?

Solution: Yes. Observe that the two systems must have the same number of variables, so the
matrices A and C have the same number of columns. Let S be the solution set and x0 a
particular solution. Then Sh = S − x0 is the solution set of both Ax = 0 and Cy = 0. That
is, N (A) = N (C). So, by the fundamental theorem of linear algebra, col(AT ) = col(C T ), that
is, row(A) = row(C).

7. Show that the system of equations Ax = b given below

x1 + 2x2 + 2x3 = 5
2x1 + 2x2 + 3x3 = 5
3x1 + 4x2 + 5x3 = 9

has no solution by finding y ∈ N (AT ) such that yT b ≠ 0.

Solution: Note that if the system has a solution x0 , then Ax0 = b. Thus, for any
y ∈ N (AT ), we have

    yT b = yT (Ax0 ) = (yT A)x0 = (AT y)T x0 = 0T x0 = 0.        (1)


   
But it is easy to check that y = (−1, −1, 1)T is in N (AT ) and
yT b = (−1)(5) + (−1)(5) + (1)(9) = −1 ≠ 0, a contradiction to Equation (1). Thus, the given
system has no solution.
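Both claims about y can be verified directly (an illustrative numpy sketch, not part of the solution):

```python
import numpy as np

A = np.array([[1.0, 2.0, 2.0],
              [2.0, 2.0, 3.0],
              [3.0, 4.0, 5.0]])
b = np.array([5.0, 5.0, 9.0])
y = np.array([-1.0, -1.0, 1.0])

assert np.allclose(A.T @ y, 0)   # y lies in N(A^T)
assert np.isclose(y @ b, -1.0)   # y^T b != 0, so Ax = b is inconsistent
```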

8. (T) Suppose A is an n by n real invertible matrix. Describe the subspace of the row space
of A which is orthogonal to the first column of A−1 .
Solution: Let A[:, j] denote the j-th column and A[i, :] the i-th row of A. Then AA−1 = In
implies ⟨A[i, :], A−1 [:, 1]⟩ = 0 for 2 ≤ i ≤ n. So the subspace of the row space of A that is
orthogonal to the first column of A−1 equals LS(A[2, :], A[3, :], . . . , A[n, :]).

9. (T) Let An×n be any matrix. Then, the following statements are equivalent.

(i) A is unitary.

(ii) For any orthonormal basis {u1 , . . . , un } of Cn , the set {Au1 , . . . , Aun } is also an or-
thonormal basis.

Solution: (i) ⇒ (ii): Suppose A is unitary. Then ⟨Aui , Auj ⟩ = ⟨ui , A∗ Auj ⟩ = ⟨ui , uj ⟩. It
follows that {Au1 , . . . , Aun } is orthonormal, hence a basis of Cn .
(ii) ⇒ (i): Suppose (ii) holds for A. Consider the standard basis {e1 , . . . , en }. By
hypothesis, {Ae1 , . . . , Aen } is an orthonormal basis. That is, the columns of A form an
orthonormal basis, that is, A∗ A = I.

10. Let V be an inner product space and S be a nonempty subset of V. Show that

(i) S ⊂ (S ⊥ )⊥ .
(ii) If V is finite dimensional and S is a subspace then (S ⊥ )⊥ = S.
(iii) If S ⊂ T ⊂ V, then S ⊥ ⊃ T ⊥ .
(iv) If S is a subspace then S ∩ S ⊥ = {0}.

Solution: (i) x ∈ S ⇒ ⟨w, x⟩ = 0 for all w ∈ S⊥ ⇒ x ⊥ S⊥ ⇒ x ∈ (S⊥ )⊥ .


(ii) If S = {0} or S = V, there is nothing to show. So let S ≠ {0}, V. Take a basis of S and
apply GS to get an orthonormal basis {u1 , . . . , uk } of S. Extend it to an orthonormal basis
{u1 , . . . , uk , w1 , . . . , wm } of V. It is easy to show that wi ∈ S⊥ .
Now let x ∈ (S⊥ )⊥ ⊂ V. Write x = Σi αi ui + Σj βj wj , for some αi , βj ∈ C. As x ∈ (S⊥ )⊥ ,
we have ⟨x, y⟩ = 0 for all y ∈ S⊥ . In particular ⟨x, wj ⟩ = 0 for all j. Thus βj = 0 for all j,
and hence x = Σi αi ui ∈ S.
(iii) Obvious.
(iv) Let x ∈ S ∩ S⊥ . Then x ⊥ S. In particular ⟨x, x⟩ = 0. Thus x = 0.
11. Let A1 , · · · , Ak be k real symmetric matrices of order n such that A1² + · · · + Ak² = 0.
Show that each Ai = 0.

Solution: For each x ∈ Rn we have

    0 = xT (Σi Ai²)x = Σi xT Ai² x = Σi xT AiT Ai x = Σi ‖Ai x‖².

Hence, Ai x = 0 for each i and each x. In particular, Ai e1 = 0, Ai e2 = 0, . . . , Ai en = 0, so
Ai = 0.

12. Let V be a normed linear space and x, y ∈ V. Is it true that ‖x‖ − ‖y‖ ≤ ‖x − y‖?

13. (T) Polar Identity: The following identity holds in an inner product space.

• Complex IPS: 4⟨x, y⟩ = ‖x + y‖² − ‖x − y‖² + i‖x + iy‖² − i‖x − iy‖².
• Real IPS: 4⟨x, y⟩ = ‖x + y‖² − ‖x − y‖².

Solution: We see that ‖x + y‖² = ⟨x, x⟩ + ⟨x, y⟩ + ⟨y, x⟩ + ⟨y, y⟩,
‖x − y‖² = ⟨x, x⟩ − ⟨x, y⟩ − ⟨y, x⟩ + ⟨y, y⟩,
i‖x + iy‖² = i⟨x, x⟩ + i⟨x, iy⟩ + i⟨iy, x⟩ + i⟨iy, iy⟩ and
i‖x − iy‖² = i⟨x, x⟩ − i⟨x, iy⟩ − i⟨iy, x⟩ + i⟨iy, iy⟩. Hence

    ‖x + y‖² − ‖x − y‖² + i‖x + iy‖² − i‖x − iy‖²
        = 2⟨x, y⟩ + 2⟨y, x⟩ + 2i⟨x, iy⟩ + 2i⟨iy, x⟩
        = 2⟨x, y⟩ + 2⟨y, x⟩ − 2i²⟨x, y⟩ + 2i²⟨y, x⟩ = 4⟨x, y⟩.
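The complex polar identity can be checked numerically (an illustrative numpy sketch; the inner product here is taken linear in the first argument, matching the signs in the derivation above):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=3) + 1j * rng.normal(size=3)
y = rng.normal(size=3) + 1j * rng.normal(size=3)

# <u, v> = sum_j u_j conj(v_j): linear in the first argument.
def ip(u, v):
    return np.sum(u * np.conj(v))

def n2(u):
    return np.real(ip(u, u))   # squared norm

lhs = 4 * ip(x, y)
rhs = n2(x + y) - n2(x - y) + 1j * n2(x + 1j * y) - 1j * n2(x - 1j * y)
assert np.allclose(lhs, rhs)
```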

14. (Just for knowledge; will NOT be asked.) Let ‖ · ‖ be a norm on V. Then ‖ · ‖ is induced
by some inner product if and only if ‖ · ‖ satisfies the parallelogram law:

    ‖x + y‖² + ‖x − y‖² = 2‖x‖² + 2‖y‖².

15. Show that an orthonormal set in an inner product space is linearly independent.
Solution: Let S be an orthonormal set and suppose that α1 x1 + · · · + αn xn = 0 for some
xi ∈ S. Taking the inner product with xi gives ⟨xi , Σj αj xj ⟩ = ⟨xi , 0⟩ = 0, which forces
αi = 0 for each i. Thus, S is linearly independent.

16. Let A be unitarily equivalent to B (that is, A = U∗ BU for some unitary matrix U ). Then
Σij |aij |² = Σij |bij |².

Solution: We have

    Σij |aij |² = tr(A∗ A) = tr(U∗ B∗ U U∗ BU ) = tr(U∗ B∗ BU ) = tr(B∗ B U U∗ ) = tr(B∗ B) = Σij |bij |².
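A numerical check of this invariance (an illustrative numpy sketch; the random unitary comes from a QR factorization, a standard trick not mentioned in the solution):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
B = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))

# A random unitary U: the Q factor of a random complex matrix.
U, _ = np.linalg.qr(rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))
A = U.conj().T @ B @ U                   # A = U* B U

# Sum of squared entry moduli equals tr(A* A), invariant under unitary equivalence.
assert np.isclose(np.sum(np.abs(A) ** 2), np.sum(np.abs(B) ** 2))
```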

17. For the following questions, find a projection matrix P that projects b onto the column space
of A, that is, P b ∈ col(A) and b − P b is orthogonal to col(A).
       
1 0 0 1 1 −2 4 1
0 1 0 2  1 −1 1   1 
(i) A = 0 0 1 , b = 3
   (ii) A = 
 1 1 1  , b =  1 .
  

0 0 0 4 1 2 4 0

Solution: Note that an orthonormal basis of col(A) is given by {e1 , e2 , e3 } ⊂ R4 . Hence, the
projection matrix equals
 
1 0 0 0
0 1 0 0
P = e1 eT1 + e2 eT2 + e3 eT3 = 

.
0 0 1 0
0 0 0 0
5

1 T √1 (−2, −1, 1, 2)T


For the second question, we know w1 = 2 (1, 1, 1, 1) , w2 = 10
and w3 =
1 T
2 (1, −1, −1, 1) form an orthonormal basis of col(A). Thus, the projection matrix equals
 
9 2 −2 1
1 
2 6 4 −2
P = w1 w1T + w2 w2T + w3 w3T = .
10 −2 4
 6 2
1 −2 2 9

Alternate:
   −1
1 0 0   1 0 0  
0 1 0  1 0 0 0  1 0 0 0
= A(AT A)−1 AT =   0 1 0 0 0 1 0
 
P 0
 0 1 0 0
0 1  0 0 1
0 0 1 0 0 0 1 0
0 0 0 0 0 0
   
1 0 0  −1   1 0 0 0
0 1 0 1 0 0 1 0 0 0 0 1 0 0
= 
   0 1 0 0 1 0 0 =  .
0 0 1 0 0 1 0
0 0 1 0 0 1 0
0 0 0 0 0 0 0

Similarly, it can be verified that


 
9 2 −2 1
1 2 6 4 −2
P = A(AT A)−1 AT = .
10 −2 4 6 2
1 −2 2 9
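The alternate formula for case (ii) can be evaluated directly (an illustrative numpy sketch, not part of the solution):

```python
import numpy as np

A = np.array([[1.0, -2.0, 4.0],
              [1.0, -1.0, 1.0],
              [1.0,  1.0, 1.0],
              [1.0,  2.0, 4.0]])

# Projection onto col(A) via the normal-equations formula.
P = A @ np.linalg.inv(A.T @ A) @ A.T

expected = np.array([[ 9,  2, -2,  1],
                     [ 2,  6,  4, -2],
                     [-2,  4,  6,  2],
                     [ 1, -2,  2,  9]]) / 10
assert np.allclose(P, expected)
assert np.allclose(P @ P, P)   # a projection matrix is idempotent
```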

18. We are looking for the parabola y = C + Dt + Et2 that gives the least squares fit to these
four measurements:
y = 1 at t = −2, y = 1 at t = −1, y = 1 at t = 1 and y = 0 at t = 2.

(a) Write down the four equations (Ax = b) for the parabola C + Dt + Et2 to go through
the given four points. Prove that Ax = b has no solution.
     
Solution: Verify: A has rows (1, −2, 4), (1, −1, 1), (1, 1, 1), (1, 2, 4), b = (1, 1, 1, 0)T , and
RREF([A b]) = I4 . The last column of [A b] is a pivot column, so b is not in col(A) and
Ax = b has no solution.
 
(b) For finding a least squares fit of Ax = b, that is, of A (C, D, E)T = b, what equations
would you solve?
Solution: Let y = Ax − b be the error vector. Then the sum of squared errors equals

    f (x1 , x2 , x3 ) = yT y = (Ax − b)T (Ax − b) = xT AT Ax − 2xT AT b + bT b.

Differentiating with respect to x1 , x2 and x3 , we get

    (∂f /∂x1 , ∂f /∂x2 , ∂f /∂x3 )T = 2AT Ax − 2AT b.

Equating this to zero gives the normal equations AT Ax = AT b. Thus, we solve
AT A (C, D, E)T = AT b.
(c) Compute AT A. Compute its determinant. Compute its inverse.
 
Solution: AT A has rows (4, 0, 10), (0, 10, 0), (10, 0, 34);
det(AT A) = 4 × 10 × 34 − 10 × 10 × 10 = 360; and (AT A)−1 = (1/det(AT A)) C T , where the
cofactor matrix C (symmetric in this case) has rows (340, 0, −100), (0, 36, 0), (−100, 0, 40).
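These arithmetic claims are easy to confirm numerically (an illustrative numpy sketch, not part of the solution):

```python
import numpy as np

A = np.array([[1.0, -2.0, 4.0],
              [1.0, -1.0, 1.0],
              [1.0,  1.0, 1.0],
              [1.0,  2.0, 4.0]])
G = A.T @ A   # Gram matrix A^T A

assert np.allclose(G, [[4, 0, 10], [0, 10, 0], [10, 0, 34]])
assert np.isclose(np.linalg.det(G), 360)

# Inverse via the (symmetric) cofactor matrix C: G^{-1} = C^T / det(G).
C = np.array([[340, 0, -100], [0, 36, 0], [-100, 0, 40]])
assert np.allclose(np.linalg.inv(G), C.T / 360)
```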

(d) Now, determine the parabola y = C + Dt + Et2 that gives the least squares fit.
Solution: Using the previous two parts, we see that

    (C, D, E)T = (AT A)−1 AT b = (1/30)(35, −6, −5)T .
(e) The first two columns of A are already orthogonal. From column 3, subtract its pro-
jection onto the plane of the first two columns to get the third orthogonal vector v.
Normalize v to find the third orthonormal vector w3 from Gram-Schmidt.
Solution: Since the third and second columns are already orthogonal, it suffices to subtract
from the third column its projection onto the first column:

    vT = (4, 1, 1, 4) − (5/2)(1, 1, 1, 1) = (3/2, −3/2, −3/2, 3/2).

To find w3 , divide v by its length, 3:

    w3 = (1/2, −1/2, −1/2, 1/2).
 
(f) Now compute x = A (C, D, E)T to verify that x is indeed the projection of b onto the
column space of the matrix A.
Solution: Verify that, for the matrix P computed in the previous problem (which
corresponds to w3 in the previous part), we have

    xT = (1/10)(9, 12, 8, 1) = (P b)T .
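The whole least-squares fit, parts (b) through (f), condenses to a few lines numerically (an illustrative numpy sketch using the built-in least-squares solver, not the normal-equations route of the solution):

```python
import numpy as np

t = np.array([-2.0, -1.0, 1.0, 2.0])
b = np.array([1.0, 1.0, 1.0, 0.0])
A = np.column_stack([np.ones_like(t), t, t ** 2])   # columns: 1, t, t^2

# Least-squares coefficients (C, D, E) for y = C + D t + E t^2.
coef, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(30 * coef, [35, -6, -5])         # matches (1/30)(35, -6, -5)

# The fitted values A @ coef equal the projection of b onto col(A).
assert np.allclose(10 * A @ coef, [9, 12, 8, 1])
```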
