
MA 2030 - Linear Algebra and Numerical Analysis

Arindama Singh & A. V. Jayanthan

Orthonormal Sets
Definition 5: Let $V$ be an ips. A set $S \subseteq V$ is called an
orthonormal set if $S$ is orthogonal and $\|x\| = 1$ for each $x \in S$. In
addition, if an orthonormal set $S$ is also a basis for $V$, then $S$ is
called an orthonormal basis.
Example 12:
(a) The standard basis of $\mathbb{R}^n$ is an orthonormal basis of it.
(b) The set of functions $\{\cos mt : m \in \mathbb{N}\}$ in the real ips $C[0, 2\pi]$
with inner product $\langle f, g\rangle = \int_0^{2\pi} f(t)g(t)\,dt$ is an orthogonal set.
But $\int_0^{2\pi} \cos^2 t\,dt = \pi \neq 1$. Hence, it is not an orthonormal set.
(c) However, $\{(\cos mt)/\sqrt{\pi} : m \in \mathbb{N}\}$ is an orthonormal set in $C[0, 2\pi]$.
An orthogonal set of nonzero vectors must be linearly independent.
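A quick numerical check of (b) and (c) above (an illustrative sketch using scipy.integrate.quad, not part of the original notes):

```python
import numpy as np
from scipy.integrate import quad

# <f, g> = integral over [0, 2*pi] of f(t) g(t) dt
def inner(f, g):
    val, _ = quad(lambda t: f(t) * g(t), 0, 2 * np.pi)
    return val

f2 = lambda t: np.cos(2 * t)
f3 = lambda t: np.cos(3 * t)

print(inner(f2, f3))   # ~0: distinct cosines are orthogonal
print(inner(f2, f2))   # ~pi: the squared norm is pi, not 1
print(inner(lambda t: f2(t) / np.sqrt(np.pi),
            lambda t: f2(t) / np.sqrt(np.pi)))  # ~1 after scaling
```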

A Result
Proposition 4: Let $S = \{u_1, u_2, \ldots, u_n\}$ be an orthonormal set in
an ips $V$. Let $x \in \mathrm{span}(S)$. Then
$$x = \sum_{j=1}^n \langle x, u_j\rangle u_j \quad\text{and}\quad \|x\|^2 = \sum_{j=1}^n |\langle x, u_j\rangle|^2.$$
Proof. Write $x = \alpha_1 u_1 + \alpha_2 u_2 + \cdots + \alpha_n u_n$. Then $\langle x, u_j\rangle = \alpha_j$.
Next,
$$\|x\|^2 = \Big\langle \sum_j \alpha_j u_j, \sum_i \alpha_i u_i\Big\rangle
= \sum_j \sum_i \alpha_j \overline{\alpha_i}\, \langle u_j, u_i\rangle
= \sum_j \alpha_j \overline{\alpha_j}
= \sum_{j=1}^n |\langle x, u_j\rangle|^2.$$
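A numerical sanity check of Proposition 4 (a sketch; the orthonormal set here comes from numpy's QR factorization, which is not part of the notes):

```python
import numpy as np

rng = np.random.default_rng(0)
# Columns of Q form an orthonormal set in R^5 (standard dot product).
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))
x = rng.standard_normal(5)

coeffs = Q.T @ x                  # <x, u_j> for each column u_j
x_rebuilt = Q @ coeffs            # sum_j <x, u_j> u_j
print(np.allclose(x, x_rebuilt))          # True: the expansion recovers x
print(np.isclose(x @ x, coeffs @ coeffs)) # True: norms match
```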

Some Corollaries
Proposition 5: (Fourier Expansion and Parseval's Identity)
Let $\{v_1, v_2, \ldots, v_n\}$ be an orthonormal basis for an ips $V$. Let
$x \in V$. Then
$$x = \sum_{j=1}^n \langle x, v_j\rangle v_j \quad\text{and}\quad \|x\|^2 = \sum_{j=1}^n |\langle x, v_j\rangle|^2.$$
Proposition 6: (Bessel's Inequality) Let $\{u_1, u_2, \ldots, u_n\}$ be an
orthonormal set in an ips $V$. Let $x \in V$. Then
$$\sum_{j=1}^n |\langle x, u_j\rangle|^2 \le \|x\|^2.$$
Proof. Let $y = \sum_{j=1}^n \langle x, u_j\rangle u_j$. Then $\langle x, u_i\rangle = \langle y, u_i\rangle$.
That is, $x - y \perp u_i$ for each $i$. So, $x - y \perp y$.
By Pythagoras' theorem, $\|x\|^2 = \|x - y\|^2 + \|y\|^2 \ge \|y\|^2$. As
$y \in \mathrm{span}\{u_1, \ldots, u_n\}$, by Parseval's identity,
$\|y\|^2 = \sum_{j=1}^n |\langle x, u_j\rangle|^2$.
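Bessel's inequality is visible numerically when the orthonormal set spans only part of the space (again an illustrative check, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))
U = Q[:, :3]                 # an orthonormal set that does not span R^5
x = rng.standard_normal(5)

coeffs = U.T @ x             # <x, u_j> for j = 1, 2, 3
print(coeffs @ coeffs <= x @ x)  # True: sum of |<x, u_j>|^2 is at most ||x||^2
```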

Projection
Given two linearly independent vectors $u_1, u_2$ in the plane, how do
we construct two orthogonal vectors?
Keep $v_1 = u_1$.
Take out the projection of $u_2$ on $u_1$ to get $v_2$.
Now, $v_2 \perp u_1$.
What is the projection of $u_2$ on $u_1$? Its length is $\langle u_2, u_1\rangle/\|u_1\|$. Its
direction is that of $u_1$, i.e., $u_1/\|u_1\|$. Thus
$$v_1 = u_1, \qquad v_2 = u_2 - \frac{\langle u_2, v_1\rangle}{\langle v_1, v_1\rangle}\, v_1.$$
We may continue this process of taking out projections in $n$
dimensions.
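For instance, a two-vector version in $\mathbb{R}^2$ (a minimal sketch of the projection step):

```python
import numpy as np

u1 = np.array([2.0, 0.0])
u2 = np.array([1.0, 1.0])

v1 = u1
v2 = u2 - (u2 @ v1) / (v1 @ v1) * v1  # subtract the projection on v1
print(v2, v1 @ v2)                    # [0. 1.] 0.0: v2 is orthogonal to v1
```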

Gram-Schmidt Orthogonalization
Let $V = \mathrm{span}\{u_1, u_2, \ldots, u_n\}$. Define
$$v_1 = u_1,$$
$$v_2 = u_2 - \frac{\langle u_2, v_1\rangle}{\langle v_1, v_1\rangle} v_1,$$
$$\vdots$$
$$v_{n+1} = u_{n+1} - \frac{\langle u_{n+1}, v_1\rangle}{\langle v_1, v_1\rangle} v_1 - \cdots - \frac{\langle u_{n+1}, v_n\rangle}{\langle v_n, v_n\rangle} v_n.$$
Proposition 7: If $\{u_1, u_2, \ldots, u_n\}$ is linearly independent, then
$\{v_1, v_2, \ldots, v_n\}$ is orthogonal and
$\mathrm{span}\{v_1, v_2, \ldots, v_n\} = \mathrm{span}\{u_1, u_2, \ldots, u_n\}$.
Proof.
$$\langle v_2, v_1\rangle = \Big\langle u_2 - \frac{\langle u_2, v_1\rangle}{\langle v_1, v_1\rangle} v_1,\; v_1\Big\rangle
= \langle u_2, v_1\rangle - \frac{\langle u_2, v_1\rangle}{\langle v_1, v_1\rangle}\langle v_1, v_1\rangle = 0.$$
The proof follows by induction.
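A direct implementation for the standard dot product on $\mathbb{R}^n$ (a minimal sketch of classical Gram-Schmidt; the function name is ours). Running it on the basis of Example 13 below reproduces the standard basis:

```python
import numpy as np

def gram_schmidt(us):
    """Orthogonalize linearly independent vectors (classical Gram-Schmidt)."""
    vs = []
    for u in us:
        v = u.astype(float)
        for w in vs:
            v = v - (u @ w) / (w @ w) * w  # subtract the projection of u on w
        vs.append(v)
    return vs

us = [np.array([1, 0, 0]), np.array([1, 1, 0]), np.array([1, 1, 1])]
print(gram_schmidt(us))
# [array([1., 0., 0.]), array([0., 1., 0.]), array([0., 0., 1.])]
```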

Examples

Example 13: The vectors $u_1 = (1, 0, 0)$, $u_2 = (1, 1, 0)$, $u_3 = (1, 1, 1)$
form a basis for $\mathbb{R}^3$. Apply Gram-Schmidt orthogonalization:
$v_1 = (1, 0, 0)$.
$$v_2 = u_2 - \frac{\langle u_2, v_1\rangle}{\langle v_1, v_1\rangle} v_1
= (1, 1, 0) - \frac{(1,1,0)\cdot(1,0,0)}{(1,0,0)\cdot(1,0,0)}\,(1, 0, 0)
= (1, 1, 0) - 1\,(1, 0, 0) = (0, 1, 0).$$
$$v_3 = u_3 - \frac{\langle u_3, v_1\rangle}{\langle v_1, v_1\rangle} v_1 - \frac{\langle u_3, v_2\rangle}{\langle v_2, v_2\rangle} v_2
= (1, 1, 1) - \big((1,1,1)\cdot(1,0,0)\big)(1, 0, 0) - \big((1,1,1)\cdot(0,1,0)\big)(0, 1, 0)$$
$$= (1, 1, 1) - (1, 0, 0) - (0, 1, 0) = (0, 0, 1).$$
The set $\{(1, 0, 0), (0, 1, 0), (0, 0, 1)\}$ is orthogonal.

Example 14:
The vectors $u_1 = (1, 1, 0)$, $u_2 = (0, 1, 1)$, $u_3 = (1, 0, 1)$ form a basis
for $\mathbb{F}^3$. Apply Gram-Schmidt orthogonalization:
$v_1 = (1, 1, 0)$.
$$v_2 = u_2 - \frac{\langle u_2, v_1\rangle}{\langle v_1, v_1\rangle} v_1
= (0, 1, 1) - \frac{(0,1,1)\cdot(1,1,0)}{(1,1,0)\cdot(1,1,0)}\,(1, 1, 0)
= (0, 1, 1) - \tfrac{1}{2}(1, 1, 0) = \left(-\tfrac{1}{2}, \tfrac{1}{2}, 1\right).$$
$$v_3 = u_3 - \frac{\langle u_3, v_1\rangle}{\langle v_1, v_1\rangle} v_1 - \frac{\langle u_3, v_2\rangle}{\langle v_2, v_2\rangle} v_2
= (1, 0, 1) - \tfrac{1}{2}(1, 1, 0) - \tfrac{1}{3}\left(-\tfrac{1}{2}, \tfrac{1}{2}, 1\right)
= \left(\tfrac{2}{3}, -\tfrac{2}{3}, \tfrac{2}{3}\right).$$
The set $\{(1, 1, 0), (-\tfrac{1}{2}, \tfrac{1}{2}, 1), (\tfrac{2}{3}, -\tfrac{2}{3}, \tfrac{2}{3})\}$ is orthogonal.
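Reusing the gram_schmidt sketch from the slide after Proposition 7, these fractions come out numerically:

```python
import numpy as np

us = [np.array([1, 1, 0]), np.array([0, 1, 1]), np.array([1, 0, 1])]
for v in gram_schmidt(us):   # gram_schmidt as defined earlier
    print(v)
# [1. 1. 0.]  [-0.5  0.5  1. ]  [ 0.667 -0.667  0.667] (approximately)
```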

Example 15:
The vectors $u_1 = 1$, $u_2 = t$, $u_3 = t^2$ form a linearly independent set
in the ips of all polynomials considered as functions from $[-1, 1]$ to
$\mathbb{R}$; the inner product is $\langle p(t), q(t)\rangle = \int_{-1}^{1} p(t)q(t)\,dt$. Applying the
Gram-Schmidt process, we see:
$v_1 = u_1 = 1$.
$$v_2 = u_2 - \frac{\langle u_2, v_1\rangle}{\langle v_1, v_1\rangle} v_1
= t - \frac{\int_{-1}^{1} t\,dt}{\int_{-1}^{1} dt}\cdot 1 = t.$$
$$v_3 = u_3 - \frac{\langle u_3, v_1\rangle}{\langle v_1, v_1\rangle} v_1 - \frac{\langle u_3, v_2\rangle}{\langle v_2, v_2\rangle} v_2
= t^2 - \frac{\int_{-1}^{1} t^2\,dt}{\int_{-1}^{1} dt}\cdot 1 - \frac{\int_{-1}^{1} t^3\,dt}{\int_{-1}^{1} t^2\,dt}\, t
= t^2 - \tfrac{1}{3}.$$
The set $\{1, t, t^2 - \tfrac{1}{3}\}$ is an orthogonal set in this ips.


In fact, orthogonalization of $\{1, t, t^2, t^3, t^4, \ldots\}$ gives the Legendre
polynomials.
An orthogonal set can be made orthonormal by dividing each vector
by its norm. Therefore, every finite dimensional inner product space
has an orthonormal basis.
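A symbolic version of this computation (a sketch using sympy; going one degree further than Example 15 shows the Legendre pattern):

```python
import sympy as sp

t = sp.symbols('t')

def inner(p, q):
    # <p, q> = integral over [-1, 1] of p(t) q(t) dt
    return sp.integrate(p * q, (t, -1, 1))

def gram_schmidt_poly(us):
    vs = []
    for u in us:
        v = u - sum(inner(u, w) / inner(w, w) * w for w in vs)
        vs.append(sp.expand(v))
    return vs

print(gram_schmidt_poly([sp.Integer(1), t, t**2, t**3]))
# [1, t, t**2 - 1/3, t**3 - 3*t/5]
```

Up to scaling, these match the Legendre polynomials $P_0, P_1, P_2, P_3$.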

Best Approximation
What is the best point on a line that approximates a point not on
the line? What is the best point on a circle that approximates a
point not on the circle? What is the best point on a surface that
approximates a point not on the surface?
Definition 7: Let $U$ be a subspace of an ips $V$. Let $v \in V$. A
vector $u \in U$ is a best approximation of $v$ if $\|v - u\| \le \|v - x\|$ for
each $x \in U$.
Proposition 11: Let $U$ be a subspace of an ips $V$. A vector $u \in U$
is a best approximation of $v \in V$ iff $v - u \perp U$. Moreover, a best
approximation is unique.
Proof. (a) Suppose $v - u \perp U$. Let $x \in U$. Now,
$$\|v - x\|^2 = \|(v - u) + (u - x)\|^2 = \|v - u\|^2 + \|u - x\|^2 \ge \|v - u\|^2,$$
since $u - x \in U$ and by Pythagoras' theorem.
Hence, $\|v - u\| \le \|v - x\|$ for each $x \in U$.

Proof of Prop 11 (Cont.)
(b) Suppose $\|v - u\| \le \|v - x\|$ for each $x \in U$. (*)
Let $y \in U$. To show: $\langle v - u, y\rangle = 0$.
For $y = 0$, clearly $\langle v - u, y\rangle = 0$.
For $y \neq 0$, let $\alpha = \langle v - u, y\rangle/\|y\|^2$. Then $\langle v - u, \alpha y\rangle = |\alpha|^2\|y\|^2$.
Thus, $\langle \alpha y, v - u\rangle = |\alpha|^2\|y\|^2$ also. From (*),
$$\|v - u\|^2 \le \|v - u - \alpha y\|^2 = \langle v - u - \alpha y, v - u - \alpha y\rangle$$
$$= \|v - u\|^2 - \langle v - u, \alpha y\rangle - \langle \alpha y, v - u\rangle + |\alpha|^2\|y\|^2
= \|v - u\|^2 - |\alpha|^2\|y\|^2.$$
Hence, $|\alpha|^2\|y\|^2 \le 0$, so $\alpha = 0$; that is, $\langle v - u, y\rangle = 0$.
(c) If both $u, w$ are best approximations of $v$, then
$\|v - u\| \le \|v - w\|$ and $\|v - w\| \le \|v - u\|$.
So, $\|v - u\| = \|v - w\|$. Now,
$$\|v - u\|^2 = \|(v - w) + (w - u)\|^2 = \|v - w\|^2 + \|w - u\|^2 \quad (\text{since } v - w \perp w - u)$$
$$= \|v - u\|^2 + \|w - u\|^2.$$
Thus, $\|w - u\|^2 = 0$, i.e., $w = u$.

Best Approximation
Proposition 12: Let $\{u_1, \ldots, u_n\}$ be an orthonormal basis for a
subspace $U$ of an ips $V$. Let $v \in V$. The unique best approximation of $v$
from $U$ is $u = \sum_{i=1}^n \langle v, u_i\rangle u_i$.
Proof. Let $x \in U$. Then $x = \sum_{j=1}^n \langle x, u_j\rangle u_j$. Now,
$$\langle v - u, x\rangle = \Big\langle v - \sum_{i=1}^n \langle v, u_i\rangle u_i,\ \sum_{j=1}^n \langle x, u_j\rangle u_j\Big\rangle$$
$$= \sum_{j=1}^n \overline{\langle x, u_j\rangle}\langle v, u_j\rangle - \sum_{i=1}^n \sum_{j=1}^n \langle v, u_i\rangle\,\overline{\langle x, u_j\rangle}\,\langle u_i, u_j\rangle$$
$$= \sum_{j=1}^n \overline{\langle x, u_j\rangle}\langle v, u_j\rangle - \sum_{i=1}^n \langle v, u_i\rangle\,\overline{\langle x, u_i\rangle} = 0.$$
That is, $v - u \perp U$.


If $\{u_1, \ldots, u_n\}$ is any basis for $U$, write $u = \sum_{j=1}^n \alpha_j u_j$. Our
requirement $v - u \perp u_i$ means determining the $\alpha_j$ from
$\langle v - \sum_{j=1}^n \alpha_j u_j, u_i\rangle = 0$. We solve the linear system
$$\sum_{j=1}^n \langle u_j, u_i\rangle\,\alpha_j = \langle v, u_i\rangle, \qquad i = 1, \ldots, n.$$

Examples
Example 16: What is the best approximation of $v = (1, 0) \in \mathbb{R}^2$
from $U = \{(a, a) : a \in \mathbb{R}\}$?
Find $(\alpha, \alpha)$ so that $(1, 0) - (\alpha, \alpha) \perp (\beta, \beta)$ for all $\beta$.
That is, find $\alpha$ so that $(1 - \alpha, -\alpha)\cdot(1, 1) = 0$.
So, $\alpha = 1/2$. The best approximation here is $(1/2, 1/2)$.
Example 17: In $C[0, 1]$ over $\mathbb{R}$, with $\langle f, g\rangle = \int_0^1 f(t)g(t)\,dt$, what
is the best approximation of $t^2$ from $P_1$?
Determine $\alpha, \beta$ so that $t^2 - (\alpha + \beta t) \perp 1$ and $t^2 - (\alpha + \beta t) \perp t$:
$$\int_0^1 (t^2 - \alpha - \beta t)\,dt = 0 = \int_0^1 (t^3 - \alpha t - \beta t^2)\,dt.$$
That is, $1/3 - \alpha - \beta/2 = 0 = 1/4 - \alpha/2 - \beta/3$. Solving, $\alpha = -1/6$ and $\beta = 1$.
The best approximation is $-1/6 + t$.
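The linear system from the remark after Proposition 12 can be solved directly; a sketch for Example 17 with the basis $\{1, t\}$ of $P_1$ (exact integrals via sympy):

```python
import sympy as sp

t = sp.symbols('t')

def inner(p, q):
    # <p, q> = integral over [0, 1] of p(t) q(t) dt
    return sp.integrate(p * q, (t, 0, 1))

basis = [sp.Integer(1), t]
v = t**2

# Gram system: sum_j <u_j, u_i> alpha_j = <v, u_i>
G = sp.Matrix(2, 2, lambda i, j: inner(basis[j], basis[i]))
b = sp.Matrix(2, 1, lambda i, _: inner(v, basis[i]))
alpha = G.solve(b)
print(alpha.T)   # Matrix([[-1/6, 1]]): best approximation is -1/6 + t
```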

Best Approximate Solution


Definition 8: Let $U$ be a vector space, $V$ an ips, and $A : U \to V$ a
linear transformation. A vector $u \in U$ is a Best Approximate
Solution (also called a Least Squares Solution) of the equation
$Ax = y$ if $\|Au - y\| \le \|Az - y\|$ for all $z \in U$.
Thus, $u$ is a best approximate solution of $Ax = y$ iff $v = Au$ is a
best approximation of $y$ from $R(A)$, the range space of $A$. So,
Proposition 13: Let $U$ be a vector space, $V$ an ips, and $A : U \to V$ a
linear transformation.
(a) A vector $u \in U$ is a best approximate solution
iff $Au - y \perp R(A)$.
(b) If $\dim(R(A)) < \infty$, then $Ax = y$ has a best
approximate solution.
(c) A best approximate solution is unique iff $A$ is one-one.

BAS for Linear Systems


Proposition 14: Let $A \in \mathbb{R}^{m \times n}$. A vector $u \in \mathbb{R}^n$ is a best
approximate solution of $Ax = y$ iff $A^t A u = A^t y$.
Proof. The columns $u_1, \ldots, u_n$ of $A$ span $R(A)$. Thus $u$ is a best
approximate solution of $Ax = y$
iff $\langle Au - y, u_i\rangle = 0$ for $i = 1, \ldots, n$, iff $u_i^t(Au - y) = 0$ for these $i$,
iff $A^t(Au - y) = 0$, iff $A^t A u = A^t y$.


 



Example 18: Let
$$A = \begin{pmatrix} 1 & 1 \\ 0 & 0 \end{pmatrix}, \qquad
y = \begin{pmatrix} 0 \\ 1 \end{pmatrix}, \qquad
u = \begin{pmatrix} 1 \\ -1 \end{pmatrix}.$$
Now, $A^t A u = A^t y$.
Thus $u$ is a best approximate solution of $Ax = y$.
However, $Ax = y$ has no solution.
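A numerical check of Proposition 14 on this example (a sketch; np.linalg.lstsq returns one least squares solution, which need not equal $u$ since $A$ is not one-one):

```python
import numpy as np

A = np.array([[1.0, 1.0], [0.0, 0.0]])
y = np.array([0.0, 1.0])
u = np.array([1.0, -1.0])

print(np.allclose(A.T @ A @ u, A.T @ y))     # True: u satisfies the normal equations
print(np.linalg.lstsq(A, y, rcond=None)[0])  # [0. 0.]: another least squares solution
```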
