416 HWD S
1. For each of the following inner products, write down an orthonormal basis for $P_3(\mathbb{R})$ using
the Gram–Schmidt Algorithm applied to the basis $(1, x, x^2, x^3)$.
(a) $\langle f, g\rangle = \int_0^1 f(t)g(t)\,dt$,
(b) $\langle f, g\rangle = \int_{-1}^{1} f(t)g(t)\,dt$,
(c) $\langle f, g\rangle = \int_{-\infty}^{\infty} e^{-t^2} f(t)g(t)\,dt$.
Solution:
In all of these, we write $w_k = x^k$ for $k = 0, 1, 2, 3$.
(a) Take $v_0 = w_0 = 1$. Then
$$v_1 = w_1 - \frac{\langle w_1, v_0\rangle}{\langle v_0, v_0\rangle}\, v_0 = x - \frac{1/2}{1} = x - 1/2.$$
Next,
$$v_2 = w_2 - \frac{\langle w_2, v_1\rangle}{\langle v_1, v_1\rangle}\, v_1 - \frac{\langle w_2, v_0\rangle}{\langle v_0, v_0\rangle}\, v_0
= x^2 - \frac{\langle x^2, x - 1/2\rangle}{\langle x - 1/2,\, x - 1/2\rangle}\,(x - 1/2) - \frac{1/3}{1}
= x^2 - \frac{1/12}{1/12}\,(x - 1/2) - \frac{1}{3} = x^2 - x + 1/6.$$
Finally, the same computation with $w_3 = x^3$ gives
$$v_3 = x^3 - \tfrac{3}{2}x^2 + \tfrac{3}{5}x - \tfrac{1}{20}.$$
We can rescale all of these to unit vectors, and then the set is orthonormal.
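The computation in (a) can be checked mechanically. The following sketch (the helper `inner` is our own name, not from the text) runs Gram–Schmidt over $[0,1]$ with sympy:

```python
# A quick check of part (a): Gram-Schmidt on (1, x, x^2, x^3) with
# <f, g> = integral of f*g over [0, 1].
import sympy as sp

x = sp.symbols('x')

def inner(f, g):
    # <f, g> = \int_0^1 f(t) g(t) dt
    return sp.integrate(f * g, (x, 0, 1))

basis = [sp.Integer(1), x, x**2, x**3]
ortho = []
for w in basis:
    # subtract the projections onto the vectors found so far
    v = w - sum(inner(w, u) / inner(u, u) * u for u in ortho)
    ortho.append(sp.expand(v))

print(ortho)
# The first three entries should match 1, x - 1/2, x**2 - x + 1/6;
# the last is the monic cubic x**3 - 3*x**2/2 + 3*x/5 - 1/20.
```

These are (up to scaling) the shifted Legendre polynomials on $[0,1]$.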
(b) Here we can definitely use the fact that the domain is symmetric and note that if $f, g$
have opposite parity (i.e. their product is an odd function), then the integral is zero
and they are orthogonal. Therefore $\langle 1, x\rangle = 0$ and we can choose $v_0 = w_0 = 1$ and
$v_1 = w_1 = x$. Then
$$v_2 = w_2 - \frac{\langle w_2, v_1\rangle}{\langle v_1, v_1\rangle}\, v_1 - \frac{\langle w_2, v_0\rangle}{\langle v_0, v_0\rangle}\, v_0,$$
but since $2 + 1$ is odd we can ignore the middle term and obtain
$$v_2 = x^2 - \frac{2/3}{2} = x^2 - 1/3. \tag{1}$$
Note that $v_2$ remains an even polynomial, and so when we write
$$v_3 = w_3 - \frac{\langle w_3, v_2\rangle}{\langle v_2, v_2\rangle}\, v_2 - \frac{\langle w_3, v_1\rangle}{\langle v_1, v_1\rangle}\, v_1 - \frac{\langle w_3, v_0\rangle}{\langle v_0, v_0\rangle}\, v_0,$$
we again only have to consider terms where the powers sum to an even number, and
thus
$$v_3 = w_3 - \frac{\langle w_3, v_1\rangle}{\langle v_1, v_1\rangle}\, v_1 = x^3 - \frac{2/5}{2/3}\, x = x^3 - \frac{3}{5}x. \tag{2}$$
This gives the orthogonal set $(1,\ x,\ x^2 - 1/3,\ x^3 - \tfrac{3}{5}x)$, which we can again rescale to unit vectors.
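The polynomials produced in (1) and (2) can be verified directly. A short sympy check (our own verification, not part of the assigned work) confirms pairwise orthogonality over $[-1, 1]$:

```python
# Check part (b): 1, x, x^2 - 1/3, x^3 - 3x/5 are pairwise orthogonal
# with respect to <f, g> = integral of f*g over [-1, 1].
import itertools
import sympy as sp

x = sp.symbols('x')
polys = [sp.Integer(1), x,
         x**2 - sp.Rational(1, 3),
         x**3 - sp.Rational(3, 5) * x]

for f, g in itertools.combinations(polys, 2):
    val = sp.integrate(f * g, (x, -1, 1))
    print(f, '|', g, '->', val)   # every inner product comes out 0
```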
(c) The weight $e^{-t^2}$ is even, so the same parity arguments apply: we can repeat everything up to (2)
(here $v_0 = 1$, $v_1 = x$, and $v_2 = x^2 - \frac{\sqrt{\pi}/2}{\sqrt{\pi}} = x^2 - 1/2$), and obtain
$$v_3 = w_3 - \frac{\langle w_3, v_1\rangle}{\langle v_1, v_1\rangle}\, v_1
= x^3 - \frac{\int_{-\infty}^{\infty} e^{-t^2} t^4\, dt}{\int_{-\infty}^{\infty} e^{-t^2} t^2\, dt}\, x
= x^3 - \frac{3\sqrt{\pi}/4}{\sqrt{\pi}/2}\, x = x^3 - \frac{3}{2}x.$$
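The Gaussian moments used in part (c) can be confirmed symbolically. This sketch checks that $\int_{-\infty}^{\infty} e^{-t^2} t^2\,dt = \sqrt{\pi}/2$ and $\int_{-\infty}^{\infty} e^{-t^2} t^4\,dt = 3\sqrt{\pi}/4$, so the coefficient of $x$ is $3/2$:

```python
# Gaussian moments for the weight e^{-t^2}, computed with sympy.
import sympy as sp

t = sp.symbols('t')
w = sp.exp(-t**2)

m2 = sp.integrate(w * t**2, (t, -sp.oo, sp.oo))   # sqrt(pi)/2
m4 = sp.integrate(w * t**4, (t, -sp.oo, sp.oo))   # 3*sqrt(pi)/4
print(m2, m4, sp.simplify(m4 / m2))               # the ratio is 3/2
```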
2. The question about difficulties depends on how one approaches this problem. If one steps back
from the problem for a bit before computing, as we did above, then (b) and (c) are basically equally
difficult and much easier than (a). If one just dives in and starts computing, then (b) and
especially (c) will give you a sad.
3. Let $\beta$ be a basis for a subspace $W$ of an inner product space $V$, and let $z \in V$. Show that
$z \in W^{\perp}$ if and only if $\langle z, v\rangle = 0$ for all $v \in \beta$.
Solution: The forward direction is easy. Since $z \in W^{\perp}$ means that $\langle z, w\rangle = 0$ for all $w \in W$,
and since $\beta \subseteq W$, this means that $\langle z, w\rangle = 0$ for all $w \in \beta$.
For the opposite direction, let $I$ be a set indexing the elements of $\beta$, say $\beta = \{v_i : i \in I\}$.
Any $w \in W$ is a finite linear combination $w = \sum_{i \in I} a_i v_i$, so if $\langle z, v_i\rangle = 0$ for all $i \in I$, then
$$\Big\langle z,\ \sum_{i \in I} a_i v_i \Big\rangle = \sum_{i \in I} a_i \langle z, v_i\rangle = 0.$$
From this we can deduce that $z$ is orthogonal to the span of $\beta$, which is $W$.
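The argument can be illustrated numerically: a vector orthogonal to every basis vector is orthogonal to every linear combination. The specific vectors below are made up for the demonstration.

```python
# Numerical sketch of Problem 3 in R^4.
import numpy as np

rng = np.random.default_rng(0)
b1 = np.array([1.0, 0.0, 0.0, 0.0])
b2 = np.array([0.0, 1.0, 1.0, 0.0])
z = np.array([0.0, 1.0, -1.0, 2.0])    # orthogonal to both b1 and b2

assert abs(z @ b1) < 1e-12 and abs(z @ b2) < 1e-12
for _ in range(5):
    a, b = rng.standard_normal(2)
    w = a * b1 + b * b2                 # arbitrary element of W = span(b1, b2)
    print(abs(z @ w) < 1e-9)           # True: z is orthogonal to w
```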
Solution:
(a) Recall that
$$S^{\perp} = \{x \in V : \langle x, y\rangle = 0 \ \forall y \in S\}.$$
Let $y, z \in S^{\perp}$. This means that $\langle x, y\rangle = \langle x, z\rangle = 0$ for all $x \in S$. Then we see that
$$\langle x, ay + bz\rangle = a\langle x, y\rangle + b\langle x, z\rangle = 0$$
for all $x \in S$, so that $ay + bz \in S^{\perp}$. Since $S^{\perp}$ is a subset of $V$ that is closed under
addition and scalar multiplication, it is a subspace of $V$.
(b) Consider the set $S$. Either it is linearly independent or not. If it is, then $S$ is a basis for
$\mathrm{Span}(S)$. If it is not, remove vectors until it is, and then $\beta \subseteq S$ is a basis for $\mathrm{Span}(S)$.
By Problem 3 above, $\mathrm{Span}(S)^{\perp} = \beta^{\perp} = S^{\perp}$. Therefore we only need to prove that
$\mathrm{Span}(S) = (\mathrm{Span}(S)^{\perp})^{\perp}$. But since $\mathrm{Span}(S)$ is a subspace of $V$, this follows from (c)
below.
(c) Let us write $Y = W^{\perp}$, and recall that $x \in Y$ if and only if $\langle x, w\rangle = 0$ for all $w \in W$.
Therefore, if $w \in W$, then $\langle w, y\rangle = 0$ for every $y \in Y$, so $w \in Y^{\perp}$. This shows that $W \subseteq Y^{\perp} = (W^{\perp})^{\perp}$.
Conversely, now assume that $x \notin W$. Let $(v_1, \ldots, v_k)$ be an orthonormal basis for $W$
(this exists since $W$ is finite-dimensional). Then if we write
$$w = \sum_{i=1}^{k} \langle x, v_i\rangle\, v_i,$$
then $w \in W$, and $z = x - w \in W^{\perp}$. Since $x \notin W$, we have $z \neq 0$, so
$\langle x, z\rangle = \langle w + z, z\rangle = \langle z, z\rangle \neq 0$, and hence $x \notin Y^{\perp}$. Together with the first inclusion, this gives $W = (W^{\perp})^{\perp}$.
We can also extend an orthonormal basis $(z_1, \ldots, z_k)$ of $W$ to an orthonormal basis
$$\gamma' = (z_1, z_2, \ldots, z_k, \ldots, z_n)$$
of $V$, where $z_1, \ldots, z_k$ is an orthonormal basis for $W$. Now, given any $v \in V$, let us write the
expansion
$$v = \sum_{i=1}^{n} \langle v, z_i\rangle\, z_i,$$
so that $w = \sum_{i=1}^{k} \langle v, z_i\rangle z_i \in W$ and $w' = v - w \in W^{\perp}$ give a decomposition $v = w + w'$.
This decomposition is unique: if also $v = q + q'$ with $q \in W$ and $q' \in W^{\perp}$, then
$$0 = (w - q) + (w' - q'), \qquad \text{or} \qquad w - q = q' - w'.$$
The left-hand side is in $W$ and the right-hand side is in $W^{\perp}$, and therefore both are $0$.
And thus $w = q$ and $w' = q'$.
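The projection formula in the proof is easy to test numerically. The subspace below is our own example: $W$ is spanned by two orthonormal columns, and $w = \sum_i \langle v, z_i\rangle z_i$ is the projection onto $W$.

```python
# Numeric sketch of the decomposition v = w + w' with w in W, w' in W-perp.
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 2))
Z, _ = np.linalg.qr(A)              # columns z_1, z_2: orthonormal basis of W

v = rng.standard_normal(5)
w = Z @ (Z.T @ v)                   # w = sum_i <v, z_i> z_i, lies in W
w_perp = v - w                      # lies in W-perp

print(np.allclose(Z.T @ w_perp, 0))   # True: w' is orthogonal to W
print(np.allclose(w + w_perp, v))     # True: v = w + w'
```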
5. Let $\beta = (v_1, \ldots, v_n)$ be an orthonormal basis for $V$. Show that for any $x, y \in V$,
$$\langle x, y\rangle = \sum_{i=1}^{n} \langle x, v_i\rangle\, \langle y, v_i\rangle.$$
Solution: Write $x = \sum_{i=1}^{n} \langle x, v_i\rangle v_i$ and $y = \sum_{j=1}^{n} \langle y, v_j\rangle v_j$, so
$$\langle x, y\rangle = \Big\langle \sum_{i=1}^{n} \langle x, v_i\rangle v_i,\ \sum_{j=1}^{n} \langle y, v_j\rangle v_j \Big\rangle
= \sum_{i=1}^{n} \sum_{j=1}^{n} \langle x, v_i\rangle\, \langle y, v_j\rangle\, \langle v_i, v_j\rangle.$$
Since $\langle v_i, v_j\rangle \neq 0$ only if $i = j$, this double sum becomes a single sum, and we get
$$\sum_{i=1}^{n} \langle x, v_i\rangle\, \langle y, v_i\rangle.$$
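Over $\mathbb{R}$ this identity can be spot-checked with random data. The orthonormal basis below comes from a QR factorization (our own test setup):

```python
# Numeric check of Problem 5: <x, y> = sum_i <x, v_i><y, v_i>
# for an orthonormal basis (v_1, ..., v_n) of R^4.
import numpy as np

rng = np.random.default_rng(2)
V, _ = np.linalg.qr(rng.standard_normal((4, 4)))   # columns: orthonormal basis

x = rng.standard_normal(4)
y = rng.standard_normal(4)

lhs = x @ y
rhs = sum((x @ V[:, i]) * (y @ V[:, i]) for i in range(4))
print(np.isclose(lhs, rhs))    # True
```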
6. Fix $y, z \in V$, and define $T : V \to V$ by
$$T(x) = \langle x, y\rangle\, z.$$
Show that $T$ is linear. Then show that $T^*$ exists, and compute an explicit formula for it.
Compute the rank of $T$ and $T^*$.
Solution: Linearity of $T$ follows from linearity of the inner product in its first argument:
$$T(ax_1 + bx_2) = \langle ax_1 + bx_2, y\rangle\, z = a\langle x_1, y\rangle\, z + b\langle x_2, y\rangle\, z = aT(x_1) + bT(x_2).$$
The adjoint $T^*$ is characterized by
$$\langle Tx, w\rangle = \langle x, T^* w\rangle.$$
So we compute
$$\langle Tx, w\rangle = \langle \langle x, y\rangle z, w\rangle = \langle x, y\rangle \langle z, w\rangle = \Big\langle x,\ \overline{\langle z, w\rangle}\, y \Big\rangle = \langle x,\ \langle w, z\rangle\, y\rangle.$$
Therefore $T^* w = \langle w, z\rangle\, y$.
Now, we want to compute the rank of $T$. Notice that if $q \in R(T)$, then $q$ is a scalar multiple
of the vector $z$. This means that $T$ has rank at most one. Moreover, if we choose $x = y$,
then $T(y) = \langle y, y\rangle z$, and as long as $y \neq 0$, the scalar $\langle y, y\rangle$ is nonzero, so $T(y) \neq 0$ provided $z \neq 0$.
So $R(T)$ is exactly one-dimensional, and therefore $T$ has rank one (when $y$ and $z$ are both nonzero).
The rank of $T^*$ is the same by the same argument ($T^* w$ is a scalar multiple of $y$, etc.).
(A theorem in the book that we did not get to in class is that T and T ∗ always have the same
rank, so knowing this we could prove it this way as well.)
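In coordinates over $\mathbb{R}$, $T$ is the outer product $zy^{\mathsf T}$ and its adjoint is the transpose $yz^{\mathsf T}$, both of rank one. The vectors below are arbitrary illustration data:

```python
# Problem 6 in coordinates: T(x) = <x, y> z has matrix z y^T,
# the adjoint is y z^T, and both have rank one for nonzero y, z.
import numpy as np

y = np.array([1.0, 2.0, -1.0])
z = np.array([3.0, 0.0, 1.0])

T = np.outer(z, y)      # T x = <x, y> z
T_adj = T.T             # over R the adjoint is the transpose: y z^T

x = np.array([0.5, -1.0, 2.0])
print(np.allclose(T @ x, (x @ y) * z))        # True: matches T(x) = <x, y> z
print(np.allclose(T_adj @ x, (x @ z) * y))    # True: matches T* w = <w, z> y
print(np.linalg.matrix_rank(T), np.linalg.matrix_rank(T_adj))   # 1 1
```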
Solution: We write
Therefore U1∗ = U1 .
We proceed in a similar manner for U2 . We have
so U2∗ = U2 .
8. Let $V$ be a finite-dimensional inner product space, and $T$ a linear operator on $V$. Show that
if $T$ is invertible, then $T^*$ is invertible, and, moreover, that $(T^*)^{-1} = (T^{-1})^*$.
Solution: Since $T$ is invertible, $T^{-1}$ exists and is linear. As we showed in class, this means
that $(T^{-1})^*$ exists, and it is also linear. Moreover, we know that
$$\langle T^{-1} x, y\rangle = \langle x, (T^{-1})^* y\rangle.$$