PS5 Soln
Problem Set 5
1. Determine S⊥, where S = LS(e1 + e4, −e1 + 3e2 − e3) ⊆ R4.
Solution: (e1 + e4)⊥ is the set of all vectors orthogonal to e1 + e4, that is, the set of all x^T = (x1, . . . , x4) such that x1 + x4 = 0, and similarly for the second vector. So S⊥ is the solution space of
[1 0 0 1; −1 3 −1 0] x = 0.
Apply Gauss–Jordan elimination to solve it.
Alternatively, apply Gram–Schmidt to {e1 + e4, −e1 + 3e2 − e3, e1, e2, e3, e4}. The linear span of the last two vectors of the resulting orthonormal basis is S⊥.
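As a quick numerical sanity check (not part of the original solution), S⊥ can be computed as the null space of the coefficient matrix via an SVD:

```python
import numpy as np

# Rows span S = LS(e1 + e4, -e1 + 3e2 - e3); S_perp is the null space of M.
M = np.array([[1.0, 0.0, 0.0, 1.0],
              [-1.0, 3.0, -1.0, 0.0]])

# Null space via SVD: the right-singular vectors belonging to zero
# singular values. M has rank 2, so the last 4 - 2 = 2 rows of Vt work.
_, s, Vt = np.linalg.svd(M)
null_basis = Vt[2:]

# Every null-space vector is orthogonal to both spanning vectors of S.
residual = M @ null_basis.T  # should be (numerically) zero
```

The two rows of `null_basis` form an orthonormal basis of S⊥, matching the Gram–Schmidt approach described above.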
2. Show that there are infinitely many orthonormal bases of R2 .
Solution: The columns of
[cos θ −sin θ; sin θ cos θ],
for 0 ≤ θ < 2π, form orthonormal bases of R2. The idea is to take {e1, e2} and rotate the set counter-clockwise by an angle θ. Since distinct angles θ give distinct bases, there are infinitely many.
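A short NumPy check (an addition, not part of the original solution) that the rotated pairs are indeed orthonormal for several sample angles:

```python
import numpy as np

# Rotating {e1, e2} counter-clockwise by theta gives the columns of this matrix.
def rotation(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

# Orthonormal columns <=> Q^T Q = I; check a handful of angles in [0, 2*pi).
checks = [np.allclose(rotation(t).T @ rotation(t), np.eye(2))
          for t in np.linspace(0.0, 2 * np.pi, 7, endpoint=False)]
```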
3. What is the projection of v = e1 + 2e2 − 3e3 on H : x1 + 2x2 + 4x4 = 0?
Solution: A basis for H:
{(0, 0, 1, 0)^T, (2, −1, 0, 0)^T, (4, 0, 0, −1)^T}.
Orthonormalize:
w1 = (0, 0, 1, 0)^T, w2 = (1/√5)(2, −1, 0, 0)^T, w3 = (1/√105)(4, 8, 0, −5)^T.
The projection is
⟨v, w1⟩w1 + ⟨v, w2⟩w2 + ⟨v, w3⟩w3 = −3w1 + 0w2 + (20/√105)w3 = (1/21)(16, 32, −63, −20)^T.
Alternately: Let x be the projection. Then v − x is parallel to n = (1, 2, 0, 4)^T, the normal vector of H. So
v − x = ⟨v, n/‖n‖⟩ (n/‖n‖) = (5/21)(1, 2, 0, 4)^T.
So
x = (1, 2, −3, 0)^T − (5/21)(1, 2, 0, 4)^T = (1/21)(16, 32, −63, −20)^T.
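Both computations can be verified numerically (a NumPy addition, not part of the original solution); the two methods must produce the same vector:

```python
import numpy as np

v = np.array([1.0, 2.0, -3.0, 0.0])
n = np.array([1.0, 2.0, 0.0, 4.0])   # normal vector of H: x1 + 2x2 + 4x4 = 0

# Method 1: subtract from v its component along the normal direction.
x = v - (v @ n) / (n @ n) * n

# Method 2: sum of projections onto the orthonormal basis {w1, w2, w3} of H.
w1 = np.array([0.0, 0.0, 1.0, 0.0])
w2 = np.array([2.0, -1.0, 0.0, 0.0]) / np.sqrt(5)
w3 = np.array([4.0, 8.0, 0.0, -5.0]) / np.sqrt(105)
x_gs = (v @ w1) * w1 + (v @ w2) * w2 + (v @ w3) * w3
```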
5. (T) Does there exist a real matrix A for which the row space and column space are the same but the null space and left null space are different?
Solution: Not possible. Use the fundamental theorem of linear algebra, which states that
N(A) = (col(A^T))⊥ and N(A^T) = (col(A))⊥.
Equal row and column spaces force the matrix to be square, which in turn implies that the two null spaces have the same dimension. Now, the null space and the left null space are the orthogonal complements of the row space and the column space, respectively (which are the same in this case). Hence the two null spaces are also the same.
6. (T) Consider two real systems, say Ax = b and Cy = d. If the two systems have the same nonempty solution set, is it necessary that row(A) = row(C)?
Solution: Yes. Observe that the two systems must have the same number of variables, so the matrices A and C have the same number of columns. Let S be the solution set and x0 a particular solution. Then Sh = S − x0 is the solution set of both Ax = 0 and Cy = 0, that is, N(A) = N(C). So, by the fundamental theorem of linear algebra, col(A^T) = col(C^T), that is, row(A) = row(C).
7. Does the following system have a solution?
x1 + 2x2 + 2x3 = 5
2x1 + 2x2 + 3x3 = 5
3x1 + 4x2 + 5x3 = 9
Solution: Note that if the system has a solution x0, then Ax0 = b. Thus, for any y ∈ N(A^T), we have
y^T b = y^T Ax0 = (A^T y)^T x0 = 0.   (1)
Take y = (1, 1, −1)^T; since the third row of A is the sum of the first two, y ∈ N(A^T), yet y^T b = 5 + 5 − 9 = 1 ≠ 0, a contradiction to Equation (1). Thus, the given system has no solution.
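The obstruction argument can be checked numerically (a NumPy addition, not part of the original solution), along with the rank criterion for consistency:

```python
import numpy as np

A = np.array([[1.0, 2.0, 2.0],
              [2.0, 2.0, 3.0],
              [3.0, 4.0, 5.0]])
b = np.array([5.0, 5.0, 9.0])

# y = (1, 1, -1) lies in N(A^T) because row 3 = row 1 + row 2.
y = np.array([1.0, 1.0, -1.0])
lhs = A.T @ y        # should be the zero vector
obstruction = y @ b  # nonzero => Ax = b has no solution

# Equivalent check: the augmented matrix has strictly larger rank than A.
rank_A = np.linalg.matrix_rank(A)
rank_Ab = np.linalg.matrix_rank(np.column_stack([A, b]))
```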
8. (T) Suppose A is an n by n real invertible matrix. Describe the subspace of the row space
of A which is orthogonal to the first column of A−1 .
Solution: Let A[:, j] (respectively, A[i, :]) denote the j-th column (respectively, the i-th row) of A. Then AA−1 = In implies ⟨A[i, :], A−1[:, 1]⟩ = 0 for 2 ≤ i ≤ n. So, the subspace of the row space of A that is orthogonal to the first column of A−1 equals LS(A[2, :], A[3, :], . . . , A[n, :]).
9. (T) Let An×n be any matrix. Then, the following statements are equivalent.
(i) A is unitary.
(ii) For any orthonormal basis {u1 , . . . , un } of Cn , the set {Au1 , . . . , Aun } is also an or-
thonormal basis.
Solution: (i) ⇒ (ii): Suppose A is unitary. Then ⟨Aui, Auj⟩ = ⟨ui, A∗Auj⟩ = ⟨ui, uj⟩. It follows that {Au1, . . . , Aun} is orthonormal, hence a basis of Cn.
(ii) ⇒ (i): Suppose (ii) is satisfied by A. Consider the standard basis {e1, . . . , en}. By hypothesis, {Ae1, . . . , Aen} is an orthonormal basis. That is, the columns of A form an orthonormal basis, that is, A∗A = I.
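The (i) ⇒ (ii) direction can be illustrated numerically (a NumPy addition, not part of the original solution) with a random unitary matrix:

```python
import numpy as np

rng = np.random.default_rng(1)

# A random unitary A: the Q factor of the QR decomposition of a complex
# Gaussian matrix has orthonormal columns, i.e. A* A = I.
Z = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A, _ = np.linalg.qr(Z)

# Images of the standard orthonormal basis {e1, ..., e4} under A.
images = A @ np.eye(4)

# Gram matrix of the images; it equals I iff {Ae1, ..., Ae4} is orthonormal.
gram = images.conj().T @ images
```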
10. Let V be an inner product space and S be a nonempty subset of V. Show that
(i) S ⊂ (S ⊥ )⊥ .
(ii) If V is finite dimensional and S is a subspace then (S ⊥ )⊥ = S.
(iii) If S ⊂ T ⊂ V, then S ⊥ ⊃ T ⊥ .
(iv) If S is a subspace then S ∩ S ⊥ = {0}.
13. (T) Polar Identity: The following identity holds in a real inner product space:
4⟨x, y⟩ = ‖x + y‖² − ‖x − y‖².
(Over C there is an analogous four-term identity involving ‖x ± iy‖².)
14. Just for knowledge, will NOT be asked: Let ‖ · ‖ be a norm on V. Then ‖ · ‖ is induced by some inner product if and only if ‖ · ‖ satisfies the parallelogram law:
‖x + y‖² + ‖x − y‖² = 2‖x‖² + 2‖y‖² for all x, y ∈ V.
15. Show that an orthonormal set in an inner product space is linearly independent.
Solution: Let S be an orthonormal set and suppose that ∑_{i=1}^n αi xi = 0, for some xi ∈ S. Then
αi = ⟨xi, ∑_{j=1}^n αj xj⟩ = ⟨xi, 0⟩ = 0, for each i.
Thus, S is linearly independent.
16. Let A be unitarily equivalent to B (that is, A = U∗BU for some unitary matrix U). Then ∑_{ij} |aij|² = ∑_{ij} |bij|².
Solution: We have
∑_{ij} |aij|² = tr(A∗A) = tr(U∗B∗UU∗BU) = tr(U∗B∗BU) = tr(B∗BUU∗) = tr(B∗B) = ∑_{ij} |bij|².
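The invariance of the squared-entry sum (the Frobenius norm) under unitary equivalence can be checked on a random example (a NumPy addition, not part of the original solution):

```python
import numpy as np

rng = np.random.default_rng(0)

# A random unitary U: the Q factor of the QR decomposition of a complex
# Gaussian matrix.
Z = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
U, _ = np.linalg.qr(Z)

# A random complex B and its unitary equivalent A = U* B U.
B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
A = U.conj().T @ B @ U

# Sums of squared absolute values of the entries, i.e. tr(A*A) and tr(B*B).
sum_A = np.sum(np.abs(A) ** 2)
sum_B = np.sum(np.abs(B) ** 2)
```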
17. For the following questions, find a projection matrix P that projects b onto the column space of A, that is, P b ∈ col(A) and b − P b is orthogonal to col(A).
(i) A = [1 0 0; 0 1 0; 0 0 1; 0 0 0], b = (1, 2, 3, 4)^T;
(ii) A = [1 −2 4; 1 −1 1; 1 1 1; 1 2 4], b = (1, 1, 1, 0)^T.
Solution: (i) Note that an orthonormal basis of col(A) is given by {e1, e2, e3} ⊂ R4. Hence, the projection matrix equals
P = e1 e1^T + e2 e2^T + e3 e3^T = [1 0 0 0; 0 1 0 0; 0 0 1 0; 0 0 0 0].
Alternate: Since A^T A = I3, we get
P = A(A^T A)^{−1} A^T = A A^T = [1 0 0 0; 0 1 0 0; 0 0 1 0; 0 0 0 0].
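The formula P = A(A^T A)^{−1} A^T works for both parts; a NumPy check (an addition, not part of the original solution) for (i) and (ii):

```python
import numpy as np

def projection_matrix(A):
    # P = A (A^T A)^{-1} A^T projects onto col(A); A assumed full column rank.
    return A @ np.linalg.inv(A.T @ A) @ A.T

# Part (i): A^T A = I, so P is diagonal with 1, 1, 1, 0.
A1 = np.array([[1.0, 0, 0], [0, 1, 0], [0, 0, 1], [0, 0, 0]])
P1 = projection_matrix(A1)

# Part (ii): verify the defining properties Pb in col(A), b - Pb _|_ col(A).
A2 = np.array([[1.0, -2, 4], [1, -1, 1], [1, 1, 1], [1, 2, 4]])
b2 = np.array([1.0, 1, 1, 0])
P2 = projection_matrix(A2)
```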
18. We are looking for the parabola y = C + Dt + Et2 that gives the least squares fit to these
four measurements:
y = 1 at t = −2, y = 1 at t = −1, y = 1 at t = 1 and y = 0 at t = 2.
(a) Write down the four equations (Ax = b) for the parabola C + Dt + Et2 to go through
the given four points. Prove that Ax = b has no solution.
Solution: Verify: A = [1 −2 4; 1 −1 1; 1 1 1; 1 2 4], b = (1, 1, 1, 0)^T, and
RREF([A b]) = [1 0 0 0; 0 1 0 0; 0 0 1 0; 0 0 0 1].
The pivot in the augmented column shows that Ax = b has no solution.
(b) For finding a least squares fit of Ax = b, that is, of A (C, D, E)^T = b, what equations would you solve?
Solution: Let y = Ax − b be the error vector. Then the sum of squared errors equals ‖y‖² = ‖Ax − b‖², which is minimized exactly when x satisfies the normal equations A^T Ax = A^T b.
(d) Now, determine the parabola y = C + Dt + Et2 that gives the least squares fit.
Solution: Using the previous two parts, we see that
(C, D, E)^T = (A^T A)^{−1} A^T b = (1/30)(35, −6, −5)^T.
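The normal-equations solution can be confirmed numerically (a NumPy addition, not part of the original solution):

```python
import numpy as np

# Fit y = C + D t + E t^2 to the four measurements (t, y).
t = np.array([-2.0, -1.0, 1.0, 2.0])
y = np.array([1.0, 1.0, 1.0, 0.0])

# Design matrix with columns 1, t, t^2.
A = np.column_stack([np.ones_like(t), t, t ** 2])

# Solve the normal equations A^T A x = A^T y for x = (C, D, E).
x = np.linalg.solve(A.T @ A, A.T @ y)
```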
(e) The first two columns of A are already orthogonal. From column 3, subtract its pro-
jection onto the plane of the first two columns to get the third orthogonal vector v.
Normalize v to find the third orthonormal vector w3 from Gram-Schmidt.
Solution: Since the third and second columns are already orthogonal, it suffices to subtract from the third column its projection onto the first column:
v^T = (4, 1, 1, 4) − (5/2)(1, 1, 1, 1) = (3/2, −3/2, −3/2, 3/2).
To find w3, just divide v by its length, 3. So
w3 = (1/2, −1/2, −1/2, 1/2).
(f) Now compute x̂ = A (C, D, E)^T to verify that x̂ is indeed the projection of b onto the column space of the matrix A.
Solution: Verify that, with the projection matrix P computed in the previous problem for this same A (part (ii)), we have
x̂^T = (1/10)(9, 12, 8, 1) = (P b)^T.
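As a final check (a NumPy addition, not part of the original solution), the fitted values A x̂ coincide with the projection P b:

```python
import numpy as np

A = np.array([[1.0, -2, 4], [1, -1, 1], [1, 1, 1], [1, 2, 4]])
b = np.array([1.0, 1, 1, 0])

# Least squares coefficients (C, D, E) from the normal equations.
coef = np.linalg.solve(A.T @ A, A.T @ b)
x_hat = A @ coef                          # fitted values at t = -2, -1, 1, 2

# Projection matrix onto col(A); P b must equal the fitted values.
P = A @ np.linalg.inv(A.T @ A) @ A.T
```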