HWset 7
Consider the vectors
u1 = (1/2, 1/2, 1/2, 1/2)^T , u2 = (1/2, 1/2, −1/2, −1/2)^T , u3 = (1/2, −1/2, 1/2, −1/2)^T
in R4 . Can you find a vector u4 such that u1 , u2 , u3 , u4 are orthonormal?
If so, how many such vectors are there?
A vector u4 = (a, b, c, d)^T is orthogonal to u1 , u2 , u3 exactly when
(1/2)a + (1/2)b + (1/2)c + (1/2)d = 0
(1/2)a + (1/2)b − (1/2)c − (1/2)d = 0
(1/2)a − (1/2)b + (1/2)c − (1/2)d = 0
Row reducing these equations leads to the system
a+b+c+d=0
b+d=0
c+d=0
which has general solution u4 = (a, b, c, d)T = (t, −t, −t, t)T . Any such
vector has dot product zero with u1 , u2 , u3 . For an orthonormal basis, we want
u4 · u4 = 1, which gives t = ±1/2, so there are 2 choices for u4 , namely
u4 = (1/2, −1/2, −1/2, 1/2)^T or u4 = (−1/2, 1/2, 1/2, −1/2)^T
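As a quick numerical sanity check (a NumPy sketch, not part of the assigned solution), one can verify that the three given vectors together with the choice t = 1/2 form an orthonormal set:

```python
import numpy as np

# The three given orthonormal vectors in R^4, plus u4 with t = 1/2.
u1 = np.array([0.5, 0.5, 0.5, 0.5])
u2 = np.array([0.5, 0.5, -0.5, -0.5])
u3 = np.array([0.5, -0.5, 0.5, -0.5])
u4 = np.array([0.5, -0.5, -0.5, 0.5])  # t = -1/2 gives -u4 instead

U = np.column_stack([u1, u2, u3, u4])
# Orthonormal columns are equivalent to U^T U = I.
print(np.allclose(U.T @ U, np.eye(4)))  # True
```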
5.1.20 (See the book for the full statement of the problem, which involves
finding a least squares estimate for a line y = mx).
5.1.24 Complete the proof of Fact 5.1.4: Orthogonal Projections are linear
transformations.
We must show that proj V (x + y) = proj V (x) + proj V (y) and proj V (cx) =
c proj V (x). Write x = v + w and y = v′ + w′ , where v, v′ are in V and w, w′
are orthogonal to V . Then x + y = (v + v′ ) + (w + w′ ); since V is a subspace,
v + v′ is in V and w + w′ is orthogonal to V , so proj V (x + y) = v + v′ =
proj V (x) + proj V (y). Similarly cx = cv + cw with cv in V and cw orthogonal
to V , so proj V (cx) = cv = c proj V (x).
The two quantities x · proj V (x) and ||proj V (x)||2 are equal. One can see this
by writing x = y + w, where y is the component of x parallel to V and w is
orthogonal to V , so that y · w = 0 and proj V (x) = y. Then x · y = (y + w) · y =
y · y = ||y||2 as desired.
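Both the linearity of projection and the dot-product identity above can be illustrated numerically. In the sketch below (a hypothetical example, not from the book), the projection onto a plane V in R^4 is represented by the matrix QQ^T, where the columns of Q are an orthonormal basis of V:

```python
import numpy as np

rng = np.random.default_rng(0)
# Orthonormal basis of a random 2-dimensional subspace V of R^4.
Q, _ = np.linalg.qr(rng.standard_normal((4, 2)))
P = Q @ Q.T  # matrix of proj_V

x = rng.standard_normal(4)
y = rng.standard_normal(4)
c = 3.0

# Linearity: proj(x + y) = proj(x) + proj(y) and proj(cx) = c proj(x).
print(np.allclose(P @ (x + y), P @ x + P @ y))  # True
print(np.allclose(P @ (c * x), c * (P @ x)))    # True
# The identity x . proj(x) = ||proj(x)||^2.
print(np.isclose(x @ (P @ x), np.linalg.norm(P @ x) ** 2))  # True
```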
5.2.2 Perform Gram-Schmidt on the vectors
v1 = (6, 3, 2)^T , v2 = (2, −6, 3)^T
We have
u1 = v1 /||v1 || = (6/7, 3/7, 2/7)^T
Since v2 · u1 = 0, we get v2⊥ = v2 = (2, −6, 3)^T , and
u2 = v2⊥ /||v2⊥ || = (2/7, −6/7, 3/7)^T
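This Gram-Schmidt computation can be spot-checked with a short NumPy sketch (an illustration, not part of the assigned solution):

```python
import numpy as np

v1 = np.array([6.0, 3.0, 2.0])
v2 = np.array([2.0, -6.0, 3.0])

u1 = v1 / np.linalg.norm(v1)            # ||v1|| = 7
v2_perp = v2 - (v2 @ u1) * u1           # v2 . u1 = 0, so v2_perp = v2
u2 = v2_perp / np.linalg.norm(v2_perp)  # ||v2_perp|| = 7

print(np.allclose(u1, np.array([6, 3, 2]) / 7))   # True
print(np.allclose(u2, np.array([2, -6, 3]) / 7))  # True
```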
5.2.14 Perform Gram-Schmidt on the vectors
v1 = (1, 7, 1, 7)^T , v2 = (0, 7, 2, 7)^T , v3 = (1, 8, 1, 6)^T
We have
u1 = v1 /||v1 || = (1/10, 7/10, 1/10, 7/10)^T
v2⊥ = v2 − (v2 · u1 )u1 = (0, 7, 2, 7)^T − 10(1/10, 7/10, 1/10, 7/10)^T = (−1, 0, 1, 0)^T
u2 = v2⊥ /||v2⊥ || = (−1/√2, 0, 1/√2, 0)^T
v3⊥ = v3 − (v3 · u1 )u1 − (v3 · u2 )u2 = (1, 8, 1, 6)^T − 10(1/10, 7/10, 1/10, 7/10)^T − 0 · u2 = (0, 1, 0, −1)^T
u3 = v3⊥ /||v3⊥ || = (0, 1/√2, 0, −1/√2)^T
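As with the previous problem, the three Gram-Schmidt steps can be verified numerically (a NumPy sketch for illustration only):

```python
import numpy as np

v1 = np.array([1.0, 7.0, 1.0, 7.0])
v2 = np.array([0.0, 7.0, 2.0, 7.0])
v3 = np.array([1.0, 8.0, 1.0, 6.0])

u1 = v1 / np.linalg.norm(v1)               # ||v1|| = 10
w2 = v2 - (v2 @ u1) * u1                   # = (-1, 0, 1, 0)
u2 = w2 / np.linalg.norm(w2)
w3 = v3 - (v3 @ u1) * u1 - (v3 @ u2) * u2  # = (0, 1, 0, -1)
u3 = w3 / np.linalg.norm(w3)

s = np.sqrt(2)
print(np.allclose(u1, np.array([1, 7, 1, 7]) / 10))  # True
print(np.allclose(u2, np.array([-1, 0, 1, 0]) / s))  # True
print(np.allclose(u3, np.array([0, 1, 0, -1]) / s))  # True
```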
5.2.16 Find the QR decomposition of the matrix
6 2
3 −6
2 3
From the work in problem 2, we can take Q to be
6/7 2/7
3/7 −6/7
2/7 3/7
R is calculated by computing various dot products of the column vectors of
Q with the column vectors of the original matrix, so we get R to be
7 0
0 7
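The factorization can be confirmed by computing R = Q^T A and checking that QR reproduces the original matrix; a NumPy sketch (illustration only):

```python
import numpy as np

A = np.array([[6.0, 2.0],
              [3.0, -6.0],
              [2.0, 3.0]])
Q = np.array([[6, 2],
              [3, -6],
              [2, 3]]) / 7.0
# R collects the dot products of the columns of Q with the columns of A.
R = Q.T @ A

print(np.allclose(R, 7 * np.eye(2)))  # True: R = diag(7, 7)
print(np.allclose(Q @ R, A))          # True: A = QR
```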
5.2.28 Find the QR decomposition of the matrix
1 0 1
7 7 8
1 2 1
7 7 6
From the work in problem 5.2.14, we can take Q to be
1/10 −1/√2 0
7/10 0 1/√2
1/10 1/√2 0
7/10 0 −1/√2
R is calculated by computing various dot products of the column vectors of
Q with the column vectors of the original matrix, so we get R to be
10 10 10
0 √2 0
0 0 √2
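Again, R = Q^T A and A = QR can be confirmed numerically (a NumPy sketch for illustration only):

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [7.0, 7.0, 8.0],
              [1.0, 2.0, 1.0],
              [7.0, 7.0, 6.0]])
s = np.sqrt(2)
Q = np.array([[1/10, -1/s, 0],
              [7/10, 0, 1/s],
              [1/10, 1/s, 0],
              [7/10, 0, -1/s]])
R = Q.T @ A

R_expected = np.array([[10, 10, 10],
                       [0, s, 0],
                       [0, 0, s]])
print(np.allclose(R, R_expected))  # True
print(np.allclose(Q @ R, A))       # True: A = QR
```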
5.2.32 Find an orthonormal basis of the plane x1 + x2 + x3 = 0.
First we find a basis for the plane by backsolving the equation. For example,
one such basis is
−1 −1
v1 = 0 v2 = 1
1 0
Next we apply Gram-Schmidt to this basis to make it orthonormal.
u1 = v1 /||v1 || = (−1/√2, 0, 1/√2)^T
v2⊥ = v2 − (v2 · u1 )u1 = (−1, 1, 0)^T − (1/√2)(−1/√2, 0, 1/√2)^T = (−1/2, 1, −1/2)^T
u2 = v2⊥ /||v2⊥ || = (−1/√6, 2/√6, −1/√6)^T
u1 , u2 form an orthonormal basis for the plane x1 + x2 + x3 = 0.
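A NumPy sketch (illustration only) confirms that these two vectors are orthonormal and lie in the plane x1 + x2 + x3 = 0:

```python
import numpy as np

u1 = np.array([-1, 0, 1]) / np.sqrt(2)
u2 = np.array([-1, 2, -1]) / np.sqrt(6)

U = np.column_stack([u1, u2])
# Orthonormality: U^T U = I.
print(np.allclose(U.T @ U, np.eye(2)))     # True
# Membership in the plane: each column's entries sum to 0.
print(np.allclose(U.sum(axis=0), [0, 0]))  # True
```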
5.3.2 Is
−0.8 0.6
0.6 0.8
orthogonal?
Yes. Either one can note that the columns are orthonormal (unit vectors that
are mutually orthogonal), or one can compute A^T A and see that you get the
identity matrix.
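The A^T A check is one line in NumPy (a sketch for illustration only):

```python
import numpy as np

A = np.array([[-0.8, 0.6],
              [0.6, 0.8]])
# A is orthogonal exactly when A^T A is the identity.
print(np.allclose(A.T @ A, np.eye(2)))  # True
```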
Yes, B⁻¹AB is also orthogonal when A and B are. By Fact 5.3.4 a, B⁻¹ is
orthogonal, and then applying Fact 5.3.4 b twice shows that the product B⁻¹AB
of three orthogonal matrices is an orthogonal matrix. (Alternatively, one can
show (B⁻¹AB)^T = (B⁻¹AB)⁻¹ directly.)
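The claim can be spot-checked on randomly generated orthogonal matrices (a NumPy sketch, using QR factorization of random matrices to produce orthogonal A and B; illustration only):

```python
import numpy as np

rng = np.random.default_rng(1)
# Two random orthogonal 4x4 matrices via QR of random matrices.
A, _ = np.linalg.qr(rng.standard_normal((4, 4)))
B, _ = np.linalg.qr(rng.standard_normal((4, 4)))

M = np.linalg.inv(B) @ A @ B
# M is orthogonal exactly when M^T M is the identity.
print(np.allclose(M.T @ M, np.eye(4)))  # True
```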
Let W be the subspace of R4 spanned by v1 = (1, 1, 1, 1)^T and v2 =
(1, 9, −5, 3)^T . Find the matrix of orthogonal projection onto W.
We start by finding an orthonormal basis of W using Gram-Schmidt. We get
u1 = v1 /||v1 || = (1/2, 1/2, 1/2, 1/2)^T
v2⊥ = v2 − (v2 · u1 )u1 = (1, 9, −5, 3)^T − 4(1/2, 1/2, 1/2, 1/2)^T = (−1, 7, −7, 1)^T
u2 = v2⊥ /||v2⊥ || = (−1/10, 7/10, −7/10, 1/10)^T
So setting Q to be the matrix with columns u1 and u2 ,
Q =
1/2 −1/10
1/2 7/10
1/2 −7/10
1/2 1/10
the matrix of orthogonal projection onto W is
QQ^T = (1/100) ×
26 18 32 24
18 74 −24 32
32 −24 74 18
24 32 18 26
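A NumPy sketch (illustration only) confirms the entries of QQ^T, and also that the result behaves like a projection matrix (symmetric and idempotent):

```python
import numpy as np

Q = np.array([[1/2, -1/10],
              [1/2, 7/10],
              [1/2, -7/10],
              [1/2, 1/10]])
P = Q @ Q.T

P_expected = np.array([[26, 18, 32, 24],
                       [18, 74, -24, 32],
                       [32, -24, 74, 18],
                       [24, 32, 18, 26]]) / 100
print(np.allclose(P, P_expected))  # True
# Projection matrices satisfy P = P^T and P^2 = P.
print(np.allclose(P, P.T))         # True
print(np.allclose(P @ P, P))       # True
```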
5.3.54 Find the dimension of the space of all skew symmetric matrices.
The dimension is (n² − n)/2, since such a matrix must have all 0's on the
diagonal, and then is determined by the values in the strictly upper triangular
part of the matrix.
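The count of strictly upper triangular entries can be checked for small n (a short Python sketch, illustration only):

```python
from itertools import combinations

# A skew-symmetric matrix is determined by its entries a_ij with i < j.
# Count those positions for small n and compare with (n^2 - n) / 2.
for n in range(1, 6):
    positions = len(list(combinations(range(n), 2)))  # pairs (i, j), i < j
    print(n, positions, (n * n - n) // 2)  # the two counts agree
```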