Lecture 12
Inner Product Spaces
July 7, 2016
Angles
Consider the triangle shown below.

[Figure not recovered: triangle relating two vectors and the angle between them]
Projection
A useful application of the scalar product is to calculate the projection of one vector onto another. The vector u can be written as the sum of a vector parallel to v, $u_\parallel$ (called the projection of u onto v), and a vector orthogonal to v, $u_\perp$:

$$u = u_\parallel + u_\perp = a\,v + u_\perp$$

Taking the scalar product with v gives

$$u \cdot v = a\,(v \cdot v)$$

Thus

$$u_\parallel = \frac{u \cdot v}{v \cdot v}\,v, \qquad u_\perp = u - u_\parallel$$
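These two formulas can be sketched in NumPy; `project` is a hypothetical helper for illustration, not part of the lecture:

```python
import numpy as np

def project(u, v):
    """Split u into u_parallel (the projection of u onto v) and u_perp."""
    u = np.asarray(u, dtype=float)
    v = np.asarray(v, dtype=float)
    a = (u @ v) / (v @ v)      # scalar a from u.v = a (v.v)
    u_par = a * v
    return u_par, u - u_par

u_par, u_perp = project([3, 4], [1, 0])
# u_par lies along v, u_perp is orthogonal to v, and u_par + u_perp = u
```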
Orthonormal vectors
A set of nonzero vectors $\{v_1, v_2, \ldots, v_n\}$ that are mutually orthogonal must also be linearly independent. If

$$\alpha_1 v_1 + \alpha_2 v_2 + \cdots + \alpha_n v_n = 0$$

then taking the scalar product of this with $v_i$ gives

$$0 = v_i \cdot (\alpha_1 v_1 + \alpha_2 v_2 + \cdots + \alpha_n v_n) = \alpha_i |v_i|^2$$

and so all the $\alpha_i$ are zero.
If the vectors are also unit vectors (length 1), the set is called orthonormal. If we have an orthonormal basis $\{b_1, b_2, \ldots, b_n\}$, a general vector can be expressed in the form

$$u = (u \cdot b_1)\,b_1 + (u \cdot b_2)\,b_2 + \cdots + (u \cdot b_n)\,b_n$$

This is called the Fourier expansion of u.
Linear Algebra and Differential Equations, Lecture 12: Inner Product Spaces, July 7, 2016
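A quick numeric illustration of the Fourier expansion (my own example, not from the slides): the scalar products with an orthonormal basis of $\mathbb{R}^2$ recover the vector exactly.

```python
import numpy as np

# An orthonormal basis of R^2: unit length and mutually orthogonal
b1 = np.array([1.0, 1.0]) / np.sqrt(2.0)
b2 = np.array([1.0, -1.0]) / np.sqrt(2.0)

u = np.array([2.0, 3.0])
# Fourier expansion: u = (u.b1) b1 + (u.b2) b2
u_rebuilt = (u @ b1) * b1 + (u @ b2) * b2
```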
Gram-Schmidt Orthogonalization
Since orthonormal bases have such nice properties, it makes sense to work
with them. We can use the projection idea to construct an orthonormal
basis from a given basis. This procedure is called Gram-Schmidt
orthogonalization.
For example, let us find an orthonormal basis for $\mathrm{span}(\{v_1, v_2, v_3\})$ where

$$v_1 = \begin{pmatrix} 1 \\ 1 \\ 1 \\ 1 \end{pmatrix} \qquad v_2 = \begin{pmatrix} 1 \\ 4 \\ 4 \\ -1 \end{pmatrix} \qquad v_3 = \begin{pmatrix} -4 \\ 2 \\ 2 \\ 0 \end{pmatrix}$$

Our first basis vector is $b_1 = v_1$; then

$$b_2 = v_2 - \frac{v_2 \cdot b_1}{b_1 \cdot b_1}\,b_1$$
$$b_2 = \begin{pmatrix} 1 \\ 4 \\ 4 \\ -1 \end{pmatrix} - \frac{8}{4}\begin{pmatrix} 1 \\ 1 \\ 1 \\ 1 \end{pmatrix} = \begin{pmatrix} -1 \\ 2 \\ 2 \\ -3 \end{pmatrix}$$

Now

$$b_3 = v_3 - \frac{v_3 \cdot b_1}{b_1 \cdot b_1}\,b_1 - \frac{v_3 \cdot b_2}{b_2 \cdot b_2}\,b_2$$

and

$$b_3 = \begin{pmatrix} -4 \\ 2 \\ 2 \\ 0 \end{pmatrix} - \frac{0}{4}\begin{pmatrix} 1 \\ 1 \\ 1 \\ 1 \end{pmatrix} - \frac{12}{18}\begin{pmatrix} -1 \\ 2 \\ 2 \\ -3 \end{pmatrix} = \frac{2}{3}\begin{pmatrix} -5 \\ 1 \\ 1 \\ 3 \end{pmatrix}$$

Finally, dividing each $b_i$ by its length ($|b_1| = 2$, $|b_2| = 3\sqrt{2}$, $|b_3| = 4$) gives an orthonormal basis.
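The procedure can be sketched in NumPy, normalizing as it goes; the input vectors are those of the example as reconstructed above:

```python
import numpy as np

def gram_schmidt(vectors):
    """Turn a list of linearly independent vectors into an orthonormal list."""
    basis = []
    for v in vectors:
        w = np.asarray(v, dtype=float)
        for b in basis:
            w = w - (w @ b) * b        # subtract the projection onto each earlier b
        basis.append(w / np.linalg.norm(w))
    return basis

b1, b2, b3 = gram_schmidt([[1, 1, 1, 1], [1, 4, 4, -1], [-4, 2, 2, 0]])
# b2 is proportional to (-1, 2, 2, -3) and b3 to (-5, 1, 1, 3)
```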
An Example
Consider C[−1,1], the set of continuous functions defined on [−1,1]. Define an inner product by

$$\langle f | g \rangle = \int_{-1}^{1} f(t)\,g(t)\,dt$$

This satisfies the inner product axioms:

$$\langle f | g \rangle = \int_{-1}^{1} f(t)\,g(t)\,dt = \int_{-1}^{1} g(t)\,f(t)\,dt = \langle g | f \rangle$$

$$\langle af | g \rangle = a \langle f | g \rangle \qquad \langle f | g + h \rangle = \langle f | g \rangle + \langle f | h \rangle$$

$$\langle f | f \rangle = \int_{-1}^{1} (f(t))^2\,dt > 0 \quad \text{unless } f \equiv 0$$

so it is an inner product.
Fourier series
Using the inner product on C[−1,1] from the previous example, an orthogonal basis is given by $\{1, \cos \pi t, \sin \pi t, \cos 2\pi t, \sin 2\pi t, \ldots\}$. Using this basis, any continuous function can be expressed as a linear combination of the basis vectors in the form

$$f(t) = \frac{a_0}{2} + \sum_{n=1}^{\infty} \left( a_n \cos n\pi t + b_n \sin n\pi t \right)$$

This is called the Fourier series for f(t). The coefficients can be obtained by taking the inner product of f(t) with the basis vectors:

$$a_0 = \int_{-1}^{1} f(t)\,dt \qquad a_n = \int_{-1}^{1} f(t) \cos n\pi t\,dt \qquad b_n = \int_{-1}^{1} f(t) \sin n\pi t\,dt$$
For example, take $f(t) = t$. Then

$$a_0 = \int_{-1}^{1} t\,dt = 0 \qquad a_n = \int_{-1}^{1} t \cos n\pi t\,dt = 0$$

and, integrating by parts,

$$b_n = \int_{-1}^{1} t \sin n\pi t\,dt = \left[ -t\,\frac{\cos n\pi t}{n\pi} \right]_{-1}^{1} + \int_{-1}^{1} \frac{\cos n\pi t}{n\pi}\,dt = -\frac{2}{n\pi}\cos n\pi = \begin{cases} \dfrac{2}{n\pi} & n \text{ odd} \\[1ex] -\dfrac{2}{n\pi} & n \text{ even} \end{cases}$$
Putting $t = 1/2$:

$$\frac{1}{2} = \frac{2}{\pi}\left( 1 - \frac{1}{3} + \frac{1}{5} - \frac{1}{7} + \cdots \right)$$

so that

$$\frac{\pi}{4} = 1 - \frac{1}{3} + \frac{1}{5} - \frac{1}{7} + \cdots$$
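Both the partial Fourier sums and the resulting series for $\pi$ can be checked numerically; this is a sketch, not part of the lecture:

```python
import numpy as np

def fourier_t(t, terms):
    """Partial Fourier sum for f(t) = t on [-1, 1]."""
    total = 0.0
    for n in range(1, terms + 1):
        b_n = -2.0 * np.cos(n * np.pi) / (n * np.pi)   # +2/(n pi) for odd n, -2/(n pi) for even n
        total += b_n * np.sin(n * np.pi * t)
    return total

approx_half = fourier_t(0.5, 20000)   # slowly approaches f(1/2) = 1/2

# Partial sum of 1 - 1/3 + 1/5 - 1/7 + ..., which approaches pi/4
leibniz = 4 * sum((-1) ** k / (2 * k + 1) for k in range(200000))
```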
Cauchy-Schwarz inequality
In an inner product space

$$|\langle u | v \rangle| \le \|u\|\,\|v\|$$

with equality only if u and v are proportional.
We can show this by considering, for real t,

$$0 \le \langle u + tv \,|\, u + tv \rangle = \|u\|^2 + 2t\langle u | v \rangle + t^2 \|v\|^2$$

Since this quadratic in t can never be negative, it cannot have two distinct real roots, so its discriminant must be non-positive:

$$(2\langle u | v \rangle)^2 - 4\|u\|^2\|v\|^2 \le 0$$

Taking square roots gives the inequality.
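A numeric spot-check of the inequality and its equality case for the dot product on $\mathbb{R}^6$ (an illustration of mine, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(12345)
u = rng.standard_normal(6)
v = rng.standard_normal(6)

lhs = abs(u @ v)                               # |<u|v>|
rhs = np.linalg.norm(u) * np.linalg.norm(v)    # ||u|| ||v||

# Equality case: w proportional to u gives |<w|u>| = ||w|| ||u||
w = -3.0 * u
equality_gap = abs(w @ u) - np.linalg.norm(w) * np.linalg.norm(u)
```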
Orthogonal Complement
Let V be a subspace of an inner product space U. Then the orthogonal complement of V, written $V^\perp$, is the set of vectors in U that are orthogonal to every vector in V. It satisfies:

1. $V \oplus V^\perp = U$
2. $V \cap V^\perp = \{0\}$
3. $(V^\perp)^\perp = V$
An Example
The null space of a matrix consists of the solutions to $A\mathbf{x} = \mathbf{0}$. For example,

$$\begin{pmatrix} 3 & 2 & 5 & 2 & 4 \\ 1 & 1 & 1 & 1 & 1 \\ 1 & 0 & 3 & 4 & 6 \\ 2 & 3 & 0 & 3 & 1 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \\ x_5 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \\ 0 \end{pmatrix}$$

Row reduction gives

$$\begin{pmatrix} 1 & 1 & 1 & 1 & 1 \\ 0 & 1 & 2 & 1 & 1 \\ 0 & 0 & 0 & 1 & 1 \\ 0 & 0 & 0 & 0 & 0 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \\ x_5 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \\ 0 \end{pmatrix}$$
With $x_3$ and $x_5$ free, back-substitution gives

$$\mathbf{x} = x_3 \begin{pmatrix} 1 \\ -2 \\ 1 \\ 0 \\ 0 \end{pmatrix} + x_5 \begin{pmatrix} 0 \\ 0 \\ 0 \\ -1 \\ 1 \end{pmatrix}$$

These solutions are orthogonal to the vectors in the row space, i.e.

Null space of a matrix = (Row space of the matrix)$^\perp$
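This orthogonality can be verified numerically, using the nonzero reduced rows (which span the row space) and the null-space vectors as reconstructed above:

```python
import numpy as np

# Nonzero reduced rows, and the two null-space basis vectors
R = np.array([[1, 1, 1, 1, 1],
              [0, 1, 2, 1, 1],
              [0, 0, 0, 1, 1]], dtype=float)
N = np.array([[1, -2, 1, 0, 0],
              [0, 0, 0, -1, 1]], dtype=float)

dots = R @ N.T   # every entry should be zero: null space perpendicular to row space
```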