
Chapter Content (EAC)

 Inner Products
 Angle and Orthogonality in Inner Product Spaces
 Orthonormal Bases; Gram-Schmidt Process; QR-Decomposition
 Best Approximation; Least Squares

6-3 Orthonormal Basis

 A set of vectors in an inner product space is called an


orthogonal set if all pairs of distinct vectors in the set are
orthogonal.
 An orthogonal set in which each vector has norm 1 is
called orthonormal.

 Example 1
 Let u1 = (0, 1, 0), u2 = (1, 0, 1), u3 = (1, 0, -1) and assume
that R3 has the Euclidean inner product.
 It follows that the set of vectors S = {u1, u2, u3} is
orthogonal since
u1, u2 = u1, u3 = u2, u3 = 0.

6-3 Example 2

 Let u1 = (0, 1, 0), u2 = (1, 0, 1), u3 = (1, 0, -1)


 The Euclidean norms of the vectors are
‖u1‖ = 1, ‖u2‖ = √2, ‖u3‖ = √2
 Normalizing u1, u2, and u3 yields
v1 = u1/‖u1‖ = (0, 1, 0), v2 = u2/‖u2‖ = (1/√2, 0, 1/√2), v3 = u3/‖u3‖ = (1/√2, 0, −1/√2)

 The set S = {v1, v2, v3} is orthonormal since


v1, v2 = v1, v3 = v2, v3 = 0 and ||v1|| = ||v2|| = ||v3|| = 1

6-3 Orthonormal Basis

 Theorem 6.3.1*
 If S = {v1, v2, …, vn} is an orthonormal basis for an inner
product space V, and u is any vector in V, then
u = u, v1 v1 + u, v2 v2 + · · · + u, vn vn
 Remark
 The scalars u, v1, u, v2, … , u, vn are the coordinates of
the vector u relative to the orthonormal basis S = {v1, v2, …,
vn} and
(u)S = (u, v1, u, v2, … , u, vn)
is the coordinate vector of u relative to this basis

6-3 Example 3

 Let v1 = (0, 1, 0), v2 = (-4/5, 0, 3/5), v3 = (3/5, 0, 4/5).


It is easy to check that S = {v1, v2, v3} is an orthonormal basis
for R3 with the Euclidean inner product.
Express the vector u = (1, 1, 1) as a linear combination of the
vectors in S, and find the coordinate vector (u)s.

 Solution:
 u, v1 = 1, u, v2 = -1/5, u, v3 = 7/5
 Therefore, by Theorem 6.3.1 we have u = v1 – 1/5 v2 + 7/5 v3
 That is, (1, 1, 1) = (0, 1, 0) – 1/5 (-4/5, 0, 3/5) + 7/5 (3/5, 0, 4/5)
 The coordinate vector of u relative to S is
(u)s=(u, v1, u, v2, u, v3) = (1, -1/5, 7/5)

Theorem 6.3.2
 If S is an orthonormal basis for an n-dimensional inner product space, and if (u)S = (u1, u2, …, un) and (v)S = (v1, v2, …, vn), then:
‖u‖ = √(u1² + u2² + ⋯ + un²)
d(u, v) = √((u1 − v1)² + (u2 − v2)² + ⋯ + (un − vn)²)
⟨u, v⟩ = u1v1 + u2v2 + ⋯ + unvn
 Example 4 (Calculating Norms Using Orthonormal Bases)
 For u = (1, 1, 1), the coordinate vector relative to the orthonormal basis of Example 3 is (u)S = (1, −1/5, 7/5), so
‖u‖ = √(1² + (−1/5)² + (7/5)²) = √(75/25) = √3
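Theorem 6.3.2 says the general norm can be computed as the Euclidean norm of the coordinate vector. A quick sketch using the coordinates found in Example 3 (an illustrative check, not from the text):

```python
import math

# Coordinates of u = (1, 1, 1) relative to the orthonormal basis of Example 3
coords = (1, -1/5, 7/5)

# Theorem 6.3.2: ||u|| equals the Euclidean norm of its coordinate vector
norm_from_coords = math.sqrt(sum(c * c for c in coords))

# Direct Euclidean norm of u itself
norm_direct = math.sqrt(1**2 + 1**2 + 1**2)   # sqrt(3)

assert abs(norm_from_coords - norm_direct) < 1e-12
```
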

6-3 Coordinates Relative to Orthogonal Bases
 If S = {v1, v2, …, vn} is an orthogonal basis for a vector space V, then normalizing each of these vectors yields the orthonormal basis
S′ = { v1/‖v1‖, v2/‖v2‖, …, vn/‖vn‖ }
 Thus, if u is any vector in V, it follows from Theorem 6.3.1 that
u = ⟨u, v1/‖v1‖⟩ v1/‖v1‖ + ⟨u, v2/‖v2‖⟩ v2/‖v2‖ + ⋯ + ⟨u, vn/‖vn‖⟩ vn/‖vn‖
or
u = (⟨u, v1⟩/‖v1‖²) v1 + (⟨u, v2⟩/‖v2‖²) v2 + ⋯ + (⟨u, vn⟩/‖vn‖²) vn
 The above equation expresses u as a linear combination of the vectors in the orthogonal basis S.
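For an orthogonal (not yet normalized) basis, each coefficient is ⟨u, vi⟩/‖vi‖². A sketch using the orthogonal set from Example 1 as a basis of R³ (the test vector u is an arbitrary illustrative choice):

```python
def dot(u, v):
    # Euclidean inner product on R^3
    return sum(a * b for a, b in zip(u, v))

# Orthogonal (not normalized) basis from Example 1
v1, v2, v3 = (0, 1, 0), (1, 0, 1), (1, 0, -1)
u = (3, 5, -1)   # arbitrary vector chosen for illustration

# Coefficients <u, vi> / ||vi||^2 for an orthogonal basis
coeffs = [dot(u, v) / dot(v, v) for v in (v1, v2, v3)]

# u is recovered as the linear combination of the basis vectors
rebuilt = tuple(sum(c * v[i] for c, v in zip(coeffs, (v1, v2, v3)))
                for i in range(3))
```

Here the coefficients come out to 5, 1, and 2, and the combination 5·v1 + 1·v2 + 2·v3 reproduces u exactly.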
Theorem 6.3.3
 If S = {v1, v2, …, vn} is an orthogonal set of nonzero vectors in an inner product space, then S is linearly independent.

 Remark
 By working with orthonormal bases, the computation of
general norms and inner products can be reduced to the
computation of Euclidean norms and inner products of the
coordinate vectors.

Theorem 6.3.4 (Projection Theorem)
 If W is a finite-dimensional subspace of an inner product space V, then every vector u in V can be expressed in exactly one way as
u = w1 + w2
where w1 is in W and w2 is in W⊥.

Theorem 6.3.5
 Let W be a finite-dimensional subspace of an inner product
space V.
 If {v1, …, vr} is an orthonormal basis for W, and u is any vector in V, then
proj_W u = ⟨u, v1⟩ v1 + ⟨u, v2⟩ v2 + ⋯ + ⟨u, vr⟩ vr
 If {v1, …, vr} is an orthogonal basis for W, and u is any vector in V, then
proj_W u = (⟨u, v1⟩/‖v1‖²) v1 + (⟨u, v2⟩/‖v2‖²) v2 + ⋯ + (⟨u, vr⟩/‖vr‖²) vr
(each term now carries the normalization factor 1/‖vi‖²)

6-3 Example 6

 Let R³ have the Euclidean inner product, and let W be the subspace spanned by the orthonormal vectors v1 = (0, 1, 0) and v2 = (−4/5, 0, 3/5).
 From the above theorem, the orthogonal projection of u = (1, 1, 1) on W is
proj_W u = ⟨u, v1⟩ v1 + ⟨u, v2⟩ v2 = (1)(0, 1, 0) + (−1/5)(−4/5, 0, 3/5) = (4/25, 1, −3/25)
 The component of u orthogonal to W is
proj_{W⊥} u = u − proj_W u = (1, 1, 1) − (4/25, 1, −3/25) = (21/25, 0, 28/25)
 Observe that proj_{W⊥} u is orthogonal to both v1 and v2.

6-3 Finding Orthogonal/Orthonormal Bases

 Theorem 6.3.6
 Every nonzero finite-dimensional inner product space has
an orthonormal basis.

 Remark
 The step-by-step construction for converting an arbitrary
basis into an orthogonal basis is called the Gram-Schmidt
process.

6-3 Example 7(Gram-Schmidt Process)

 Consider the vector space R3 with the Euclidean inner product.


Apply the Gram-Schmidt process to transform the basis vectors
u1 = (1, 1, 1), u2 = (0, 1, 1), u3 = (0, 0, 1)
into an orthogonal basis {v1, v2, v3}; then normalize the orthogonal
basis vectors to obtain an orthonormal basis {q1, q2, q3}.

 Solution:
 Step 1: Let v1 = u1. That is, v1 = u1 = (1, 1, 1)
 Step 2: Let v2 = u2 − proj_{W1} u2. That is,
v2 = u2 − (⟨u2, v1⟩/‖v1‖²) v1 = (0, 1, 1) − (2/3)(1, 1, 1) = (−2/3, 1/3, 1/3)
We now have two vectors in W2.
 Step 3: Let v3 = u3 − proj_{W2} u3. That is,
v3 = u3 − (⟨u3, v1⟩/‖v1‖²) v1 − (⟨u3, v2⟩/‖v2‖²) v2
= (0, 0, 1) − (1/3)(1, 1, 1) − ((1/3)/(2/3))(−2/3, 1/3, 1/3) = (0, −1/2, 1/2)

 Thus, v1 = (1, 1, 1), v2 = (−2/3, 1/3, 1/3), v3 = (0, −1/2, 1/2) form an orthogonal basis for R³. The norms of these vectors are
‖v1‖ = √3, ‖v2‖ = √6/3, ‖v3‖ = 1/√2
so an orthonormal basis for R³ is
q1 = v1/‖v1‖ = (1/√3, 1/√3, 1/√3), q2 = v2/‖v2‖ = (−2/√6, 1/√6, 1/√6), q3 = v3/‖v3‖ = (0, −1/√2, 1/√2)
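The steps of Example 7 generalize to any basis. A minimal sketch of the Gram-Schmidt process in Python (the function name `gram_schmidt` is illustrative):

```python
import math

def dot(u, v):
    # Euclidean inner product
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(basis):
    # Subtract from each u_k its projection onto the span of v_1..v_{k-1}
    vs = []
    for u in basis:
        w = list(u)
        for v in vs:
            c = dot(u, v) / dot(v, v)          # <u, v> / ||v||^2
            w = [wi - c * vi for wi, vi in zip(w, v)]
        vs.append(tuple(w))
    return vs

u1, u2, u3 = (1, 1, 1), (0, 1, 1), (0, 0, 1)
v1, v2, v3 = gram_schmidt([u1, u2, u3])        # orthogonal basis

# Normalize to obtain the orthonormal basis {q1, q2, q3}
qs = [tuple(x / math.sqrt(dot(v, v)) for x in v) for v in (v1, v2, v3)]
```

Running this reproduces v2 = (−2/3, 1/3, 1/3) and v3 = (0, −1/2, 1/2) from the worked example, up to rounding.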

Theorem 6.3.7 (QR-Decomposition)

 If A is an m×n matrix with linearly independent column vectors, then A can be factored as
A = QR
where Q is an m×n matrix with orthonormal column vectors, and R is an n×n invertible upper triangular matrix.

Solution from Example 7

6-3 Example 8
(QR-Decomposition of a 3×3 Matrix)
 Find the QR-decomposition of
A = [1 0 0]
    [1 1 0]
    [1 1 1]
 Solution:
 The column vectors of A are
u1 = (1, 1, 1), u2 = (0, 1, 1), u3 = (0, 0, 1)
 Applying the Gram-Schmidt process with subsequent normalization to these column vectors yields the orthonormal vectors (the columns of Q)
q1 = (1/√3, 1/√3, 1/√3), q2 = (−2/√6, 1/√6, 1/√6), q3 = (0, −1/√2, 1/√2)
6-3 Example 8

 The matrix R is
    [⟨u1, q1⟩  ⟨u2, q1⟩  ⟨u3, q1⟩]   [3/√3  2/√3  1/√3]
R = [   0      ⟨u2, q2⟩  ⟨u3, q2⟩] = [  0   2/√6  1/√6]
    [   0         0      ⟨u3, q3⟩]   [  0     0   1/√2]
 Thus, the QR-decomposition of A is
[1 0 0]   [1/√3  −2/√6    0  ] [3/√3  2/√3  1/√3]
[1 1 0] = [1/√3   1/√6  −1/√2] [  0   2/√6  1/√6]
[1 1 1]   [1/√3   1/√6   1/√2] [  0     0   1/√2]
    A              Q                    R
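The construction of Example 8 can be sketched end to end: Gram-Schmidt with normalization gives the columns of Q, and the entries ⟨u_j, q_i⟩ fill the upper triangle of R. A minimal Python sketch (variable names are illustrative):

```python
import math

def dot(u, v):
    # Euclidean inner product
    return sum(a * b for a, b in zip(u, v))

# Columns of A from Example 8
cols = [(1, 1, 1), (0, 1, 1), (0, 0, 1)]

# Gram-Schmidt with normalization: the orthonormal columns of Q
Q = []
for u in cols:
    w = list(u)
    for q in Q:
        c = dot(u, q)                       # <u, q>, since ||q|| = 1
        w = [wi - c * qi for wi, qi in zip(w, q)]
    n = math.sqrt(dot(w, w))
    Q.append(tuple(x / n for x in w))

# R is upper triangular with entries <u_j, q_i> for i <= j
R = [[dot(cols[j], Q[i]) if i <= j else 0.0 for j in range(3)]
     for i in range(3)]

# Verify A = QR entry by entry (Q[i] holds column i of Q)
for j in range(3):
    for k in range(3):
        entry = sum(Q[i][k] * R[i][j] for i in range(3))
        assert abs(entry - cols[j][k]) < 1e-9
```
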

Assignments (EAC)

 Exercise-6.3: Examples 1–8 (Pages 478-490)

 Exercise-6.3: Problems 1–24 (Pages 490-495)

