
Algebra-III

Ashok Bingi

Department of Mathematics
St. Xavier’s College-Autonomous, Mumbai

Inner Product Space

3rd August 2022



Basics of complex numbers

If z = a + ib where a, b ∈ R then its conjugate is z̄ = a − ib


a = Re(z)=real part of z and b = Im(z)=imaginary part of z

Modulus of z: |z| = √(a² + b²) where z = a + ib.
Also |z̄| = √(a² + b²). So |z̄| = |z|.
Clearly, z z̄ = a² + b² = |z|²

Re(z) ≤ |z|
z + z̄ = 2 Re(z)
\overline{z̄} = z (conjugating twice gives back z)
|z| = 0 if and only if z = 0
For complex numbers z1, z2, we have
\overline{z1 + z2} = z̄1 + z̄2
\overline{z1 − z2} = z̄1 − z̄2
\overline{z1 z2} = z̄1 z̄2
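These identities are easy to sanity-check numerically. The short Python snippet below is an added illustration, not part of the original notes; the sample values z1, z2 are small integers so the floating-point comparisons are exact:

    z1, z2 = 3 + 4j, 1 - 2j
    assert z1.conjugate() == 3 - 4j                                  # z̄ = a − ib
    assert abs(z1) == ((z1 * z1.conjugate()).real) ** 0.5            # z z̄ = |z|²
    assert abs(z1.conjugate()) == abs(z1)                            # |z̄| = |z|
    assert z1 + z1.conjugate() == 2 * z1.real                        # z + z̄ = 2 Re(z)
    assert (z1 + z2).conjugate() == z1.conjugate() + z2.conjugate()  # conjugate of a sum
    assert (z1 * z2).conjugate() == z1.conjugate() * z2.conjugate()  # conjugate of a product
    assert z1.real <= abs(z1)                                        # Re(z) ≤ |z|
    print("all conjugate identities hold for the sample values")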



Definition of Inner Product Space

For the definition of an inner product space, refer to Section 7.2, page 258,
of the uploaded PDF book.
Definition of IPS: Let V be a real/complex vector space.
An inner product on V assigns to each ordered pair of vectors u, v ∈ V
a (real/complex) scalar, written u · v, satisfying the following properties:
(IP-1) u · (v + w) = u · v + u · w ∀ u, v, w ∈ V
(IP-2) (αu) · v = α (u · v) ∀ u ∈ V and for all scalars α
(IP-3) u · v = \overline{v · u} ∀ u, v ∈ V
(IP-4) u · u ⩾ 0, and u · u = 0 if and only if u = 0, ∀ u ∈ V
A vector space V together with an inner product defined on it is called an
inner product space.
A finite dimensional real inner product space is called a Euclidean space.
A finite dimensional complex inner product space is called a unitary space.



Inner Product Space

Recall properties of complex numbers and its conjugate


Example: VnC =set of all ordered n-tuples of complex
numbers={(x1 , · · · , xn )|xi ∈ C}
It is a vector space wrt addition and scalar multiplication defined as
(x1 , · · · , xn ) + (y1 , · · · , yn ) = (x1 + y1 , · · · , xn + yn )
a(x1 , · · · , xn ) = (ax1 , · · · , axn ) where a ∈ C
Define (x1 , · · · , xn ) · (y1 , · · · , yn ) = x1 ȳ1 + · · · + xn ȳn
Prove that VnC is an inner product space.
Ans Let u=(x1 , · · · , xn ), v=(y1 , · · · , yn ), w=(z1 , · · · , zn )
Then v+w=(y1 + z1 , · · · , yn + zn )



Inner Product Space
Consider u · (v + w)
= x1 \overline{(y1 + z1)} + · · · + xn \overline{(yn + zn)}
= x1 (ȳ1 + z̄1) + · · · + xn (ȳn + z̄n)
= x1 ȳ1 + x1 z̄1 + · · · + xn ȳn + xn z̄n
= (x1 ȳ1 + · · · + xn ȳn) + (x1 z̄1 + · · · + xn z̄n)
= u · v + u · w
For a ∈ C, consider (au) · v
= (ax1 , · · · , axn ) · (y1 , · · · , yn )
= ax1 ȳ1 + · · · + axn ȳn
= a(x1 ȳ1 + · · · + xn ȳn)
= a(u · v)
Now v · u = y1 x̄1 + · · · + yn x̄n
So \overline{v · u} = \overline{y1 x̄1 + · · · + yn x̄n}
= ȳ1 x1 + · · · + ȳn xn
= x1 ȳ1 + · · · + xn ȳn = u · v
Thus u · v = \overline{v · u}, so (IP-3) holds.
Inner Product Space

Finally, consider u · u
= x1 x̄1 + · · · + xn x̄n
=|x1 |2 + · · · + |xn |2
≥0 as each |xi | ≥ 0
Now u · u = 0
⇐⇒ |x1 |2 + · · · + |xn |2 = 0
⇐⇒ each |xi | = 0
⇐⇒ each xi = 0
⇐⇒ u = (x1 , · · · , xn ) = (0, · · · , 0) = 0
Thus all four properties of an inner product are satisfied.
Therefore VnC is an inner product space.
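As a numerical illustration added here (not from the original notes), the sketch below implements this inner product on n-tuples of complex numbers with NumPy — the conjugate falls on the second vector, matching x · y = x1 ȳ1 + · · · + xn ȳn — and spot-checks (IP-1)–(IP-4) on random vectors; the helper name ip is my own:

    import numpy as np

    def ip(u, v):
        # u · v = x1*conj(y1) + ... + xn*conj(yn): conjugate on the SECOND argument
        return np.sum(u * np.conj(v))

    rng = np.random.default_rng(0)
    u = rng.standard_normal(4) + 1j * rng.standard_normal(4)
    v = rng.standard_normal(4) + 1j * rng.standard_normal(4)
    w = rng.standard_normal(4) + 1j * rng.standard_normal(4)
    alpha = 2 - 3j

    assert np.isclose(ip(u, v + w), ip(u, v) + ip(u, w))        # IP-1
    assert np.isclose(ip(alpha * u, v), alpha * ip(u, v))        # IP-2
    assert np.isclose(ip(u, v), np.conj(ip(v, u)))               # IP-3
    assert ip(u, u).real >= 0 and np.isclose(ip(u, u).imag, 0)   # IP-4: u·u is real and ≥ 0
    print("IP-1 to IP-4 hold for the sampled vectors")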



Algebra-III

Ashok Bingi

Department of Mathematics
St. Xavier’s College-Autonomous, Mumbai

Inner Product Space

5th August 2022



Theorem
In an inner product space V
(1) (u + v ) · w = u · w + v · w ∀u, v , w ∈ V
(2) u · (αv ) = ᾱ (u · v ) ∀u, v ∈ V and for all scalars α
(3) 0 · u = 0 = u · 0 ∀u ∈ V
Proof: (1) (u + v) · w = \overline{w · (u + v)} by (IP-3)
= \overline{w · u + w · v} by (IP-1)
= \overline{w · u} + \overline{w · v} by \overline{z1 + z2} = z̄1 + z̄2
= u · w + v · w by (IP-3) and \overline{z̄} = z
(2) u · (αv) = \overline{(αv) · u} by (IP-3)
= \overline{α (v · u)} by (IP-2)
= ᾱ \overline{(v · u)} by \overline{z1 z2} = z̄1 z̄2
= ᾱ (u · v) by (IP-3)
(3) 0 · u = (0v ) · u as in vector space V, 0 = 0v ∀v ∈ V
=0 (v · u) by (IP-2)
=0
Similarly, prove u · 0 = 0
Example

In an inner product space V, for u, v ∈ V and a scalar α, find

(v − αu) · (v − αu)
Ans. (v − αu) · (v + (−αu))
=(v − αu) · v + (v − αu) · (−αu) by (IP-1)
=(v + (−αu)) · v + (v + (−αu)) · (−αu)
={(v · v ) + (−αu) · v } + {v · (−αu) + (−αu) · (−αu)}
as (x + y ) · w = x · w + y · w
=(v · v ) − (αu) · v − v · (αu) + (αu) · (αu)
=(v · v ) − α(u · v ) − ᾱ(v · u) + α ᾱ (u · u)
by (IP-2) and x · (αy ) = ᾱ (x · y )



Algebra-III

Ashok Bingi

Department of Mathematics
St. Xavier’s College-Autonomous, Mumbai

Inner Product Space

8th and 10th August 2022



Norm of a Vector in IPS

In an inner product space V, the norm of a vector u ∈ V is ∥u∥ = √(u · u).
It is the distance between u and the zero vector of the space, and is also
called the length of the vector u.
u ∈ V is called a unit vector if and only if ∥u∥ = 1
Definition: The distance between two vectors u and v in an IPS is
∥u − v∥ = √((u − v) · (u − v))
Properties of Norm:
1]. ∥αu∥ = |α| ∥u∥
Proof: ∥αu∥² = (αu) · (αu) = α ᾱ (u · u) = |α|² ∥u∥².
On taking square root, we are done.
2]. ∥u∥ ⩾ 0 and ∥u∥ = 0 iff u = 0
Proof: From (IP-4), u · u ⩾ 0 and u · u = 0 if and only if u = 0
Hence ∥u∥² ⩾ 0 and ∥u∥² = 0 if and only if u = 0
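A minimal sketch, added for illustration (the helper names ip and norm are my own), showing how the norm and distance come straight from the inner product on complex n-tuples:

    import numpy as np

    def ip(u, v):
        # standard inner product on C^n, conjugate on the second argument
        return np.sum(u * np.conj(v))

    def norm(u):
        # ∥u∥ = sqrt(u · u); u · u is real and non-negative by IP-4
        return np.sqrt(ip(u, u).real)

    u = np.array([1 + 2j, 3 - 1j])
    v = np.array([2 + 0j, 1 + 1j])
    alpha = 1 - 1j

    print(norm(u))                     # length of u
    print(norm(u - v))                 # distance between u and v
    assert np.isclose(norm(alpha * u), abs(alpha) * norm(u))   # ∥αu∥ = |α| ∥u∥
    assert np.isclose(norm(u / norm(u)), 1.0)                  # u/∥u∥ is a unit vector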



Cauchy-Schwarz’s Inequality

|u · v | ≤ ∥u∥ ∥v ∥ ∀ u, v ∈ V (inner product space)


Proof. If u = 0 then |u · v | = ∥u∥ ∥v ∥ and we are done.
So let u ≠ 0. Define w = v − au where a = (v · u)/∥u∥².
Consider ∥w∥² = w · w = (v − au) · (v − au)
= (v · v) − a(u · v) − ā(v · u) + a ā (u · u)   (the same expansion as in the last slide of 5th August)
Then ∥w∥² = ∥v∥² − |v · u|²/∥u∥² = (1/∥u∥²) {∥u∥² ∥v∥² − |v · u|²}
As ∥w∥² ≥ 0, we have {∥u∥² ∥v∥² − |v · u|²} ≥ 0,
which implies |v · u|² ≤ ∥u∥² ∥v∥²
Hence |u · v|² ≤ ∥u∥² ∥v∥² as |z̄|² = |z|²
Taking positive square root on both sides, we get |u · v| ≤ ∥u∥ ∥v∥
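The inequality is easy to probe numerically. The following added sketch (standard inner product on complex n-tuples; helper names are my own) checks |u · v| ≤ ∥u∥ ∥v∥ on random vectors and shows the equality case when v is a scalar multiple of u:

    import numpy as np

    def ip(u, v):
        return np.sum(u * np.conj(v))   # conjugate on the second argument

    def norm(u):
        return np.sqrt(ip(u, u).real)

    rng = np.random.default_rng(1)
    for _ in range(1000):
        u = rng.standard_normal(5) + 1j * rng.standard_normal(5)
        v = rng.standard_normal(5) + 1j * rng.standard_normal(5)
        assert abs(ip(u, v)) <= norm(u) * norm(v) + 1e-12   # Cauchy-Schwarz

    u = rng.standard_normal(5) + 1j * rng.standard_normal(5)
    v = (2 + 5j) * u                                        # v is a scalar multiple of u (LD pair)
    print(abs(ip(u, v)), norm(u) * norm(v))                 # the two numbers agree (equality case)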



Equality in Cauchy-Schwarz Ineq. =⇒ u and v are LD

Proof. Given that |u · v | = ∥u∥ ∥v ∥. Also |u · v | = |v · u|


If u = 0 then we are done, because a set with zero vector is LD.
So let u ≠ 0. Define w = v − au where a = (v · u)/∥u∥².
Then, as in the proof of the CS inequality, ∥w∥² = (1/∥u∥²) {∥u∥² ∥v∥² − |v · u|²}
=⇒ ∥w∥² = 0 as |v · u| = ∥u∥ ∥v∥.
=⇒ w =0
=⇒ v − au = 0
=⇒ v = au
=⇒ u and v are scalar multiple of each other
=⇒ u and v are LD
Thus, |u · v | = ∥u∥ ∥v ∥ =⇒ u and v are LD
Remark: Is its converse true? Justify.



Triangle Inequality
∥u + v ∥ ≤ ∥u∥ + ∥v ∥ ∀ u, v ∈ V (inner product space)
Proof. Consider ∥u + v∥² = (u + v) · (u + v)
= u · (u + v) + v · (u + v)
= u · u + u · v + v · u + v · v
= u · u + v · v + u · v + v · u
= ∥u∥² + ∥v∥² + u · v + \overline{u · v}   (since v · u = \overline{u · v})
= ∥u∥² + ∥v∥² + 2 Re(u · v) as z + z̄ = 2 Re(z)
≤ ∥u∥² + ∥v∥² + 2 |u · v| as Re(z) ≤ |z|
≤ ∥u∥² + ∥v∥² + 2 ∥u∥ ∥v∥ by the Cauchy-Schwarz inequality
= (∥u∥ + ∥v∥)²
Thus ∥u + v∥² ≤ (∥u∥ + ∥v∥)²

Taking positive square root on both sides, we get, ∥u + v ∥ ≤ ∥u∥ + ∥v ∥


Remark: ∥u + v∥² = ∥u∥² + ∥v∥² + 2 Re(u · v)
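The remark's identity and the triangle inequality can both be checked the same way; a short added sketch with the standard inner product on complex n-tuples:

    import numpy as np

    def ip(u, v):
        return np.sum(u * np.conj(v))

    def norm(u):
        return np.sqrt(ip(u, u).real)

    rng = np.random.default_rng(2)
    u = rng.standard_normal(4) + 1j * rng.standard_normal(4)
    v = rng.standard_normal(4) + 1j * rng.standard_normal(4)

    lhs = norm(u + v) ** 2
    rhs = norm(u) ** 2 + norm(v) ** 2 + 2 * ip(u, v).real    # ∥u+v∥² = ∥u∥² + ∥v∥² + 2 Re(u·v)
    assert np.isclose(lhs, rhs)
    assert norm(u + v) <= norm(u) + norm(v) + 1e-12          # triangle inequality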
Algebra-III

Ashok Bingi

Department of Mathematics
St. Xavier’s College-Autonomous, Mumbai

Inner Product Space

17th August 2022



If equality holds in Triangle Inequality then u and v are LD
Proof. Given that ∥u + v ∥ = ∥u∥ + ∥v ∥
This on squaring gives ∥u + v ∥2 = ∥u∥2 + ∥v ∥2 + 2 ∥u∥ ∥v ∥ ←− (1)
But, we know, ∥u + v ∥2 = ∥u∥2 + ∥v ∥2 + 2 Re(u · v ) ←− (2)
Equating (1) and (2), we get, Re(u · v ) = ∥u∥ ∥v ∥ ←− (3)
Now, we know, Re(u · v ) ≤ |u · v | as Re(z) ≤ |z|.
=⇒ ∥u∥ ∥v ∥ ≤ |u · v | using (3)
=⇒ |u · v | = ∥u∥ ∥v ∥ as by Cauchy-Schwarz Ineq., |u · v | ≤ ∥u∥ ∥v ∥
=⇒ u and v are LD (Refer Previous Slide)
Thus, ∥u + v ∥ = ∥u∥ + ∥v ∥ =⇒ u and v are LD
Remark: Its converse is not true as shown below.
In R3 consider u = (−1, 0, 1) and v = (2, 0, −2). Then u + v = (1, 0, −1)
Clearly, v = −2u. Hence, u and v are LD.
Now, with respect to standard inner product in R3 , ∥u + v ∥2 = 2
Also, ∥u∥2 = 2 and ∥v ∥2 = 8
From these norms, clearly ∥u + v ∥ ≠ ∥u∥ + ∥v ∥, though u and v are LD
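A two-line numerical check of this counterexample (added here), using NumPy's norm for the standard inner product on R³:

    import numpy as np

    u, v = np.array([-1., 0., 1.]), np.array([2., 0., -2.])
    print(np.linalg.norm(u + v), np.linalg.norm(u) + np.linalg.norm(v))
    # prints about 1.414 and 4.243: u, v are LD (v = -2u) yet ∥u+v∥ ≠ ∥u∥ + ∥v∥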
Pythagoras Theorem

Def: In an inner product space, vectors u and v are said to be orthogonal,


if u · v = 0
For orthogonal vectors u and v in an inner product space,
∥u + v ∥2 = ∥u∥2 + ∥v ∥2
Proof. Given u · v = 0 and v · u = 0.
Consider ∥u + v ∥2
=(u + v ) · (u + v )
=u · (u + v ) + v · (u + v )
=u · u + u · v + v · u + v · v
=u · u + v · v + u · v + v · u
=u · u + v · v + 0 + 0
=∥u∥2 + ∥v ∥2



Algebra-III

Ashok Bingi

Department of Mathematics
St. Xavier’s College-Autonomous, Mumbai

Inner Product Space

22nd August 2022



Study of Orthogonal vectors
Two vectors u and v in an inner product space are said to be orthogonal,
if u · v = 0.
A set of vectors in an inner product space is said to be orthogonal set, if
each pair of distinct vectors in the set is orthogonal.
Theorem: Orthogonal set of non-zero vectors in an inner product space is
Linearly Independent.
Proof: Let A be Orthogonal set of non-zero vectors in an inner product
space. Let B={u1 , · · · , un } be finite subset of A.
Then ui · uj = 0 ∀i ̸= j. Also as each ui ̸= 0, we have ui · ui ̸= 0.
Consider a1 u1 + · · · + an un = 0 for some scalars a1 , · · · , an
We know that for each i = 1, · · · , n, 0 = 0 · ui, so
0 = (a1 u1 + · · · + an un) · ui = a1 (u1 · ui) + · · · + an (un · ui) = ai (ui · ui)
Since ui · ui ≠ 0, this gives ai = 0.
Hence a1 = · · · = an = 0, so B is Linearly Independent.
Hence every finite subset of A is Linearly Independent.
Therefore A is Linearly Independent.
Converse not true

Thus, Orthogonal set =⇒ Linearly Independent.


Its converse is not true as shown through following example
In R2 with standard inner product, consider the set
{(2, 0), (2, 3)} = {u1 , u2 } = A
Clearly u1 · u2 = 2(2) + 0(3) = 4 ̸= 0. i.e. A is not orthogonal
But a1 u1 + a2 u2 = (0, 0) =⇒ 2a1 + 2a2 = 0 and 3a2 = 0
So a2 = 0 and hence a1 = 0
A is Linearly Independent but not orthogonal.
Given a Linearly Independent set, we can arrive at an orthogonal set by a
standard procedure called the Gram-Schmidt process



Algebra-III

Ashok Bingi

Department of Mathematics
St. Xavier’s College-Autonomous, Mumbai

Inner Product Space

24th August 2022



Gram-Schmidt process

Every finite dimensional inner product space V has an orthogonal basis

Proof: Let dim(V) = n. Then there exists a basis B = {u1 , · · · , un } for V.
We construct a new set {v1 , v2 , · · · , vn } by defining it in the following way.
Define v1 = u1
and vi = ui − Σ_{j=1}^{i−1} ((ui · vj)/∥vj∥²) vj ∀ i = 2, 3, · · · , n
Then for i = 2, v2 = u2 − ((u2 · v1)/∥v1∥²) v1
Claim: v2 · v1 = 0
v2 · v1 = (u2 − ((u2 · v1)/∥v1∥²) v1) · v1 = (u2 · v1) − ((u2 · v1)/∥v1∥²) (v1 · v1)
v2 · v1 = (u2 · v1) − (u2 · v1) as v1 · v1 = ∥v1∥²
and hence v2 · v1 = 0
Thus {v1 , v2 } is an orthogonal set



Gram-Schmidt process (continued)
For i = 3, v3 = u3 − Σ_{j=1}^{2} ((u3 · vj)/∥vj∥²) vj = u3 − ((u3 · v1)/∥v1∥²) v1 − ((u3 · v2)/∥v2∥²) v2
Claim: v3 · v1 = 0
v3 · v1 = (u3 − ((u3 · v1)/∥v1∥²) v1 − ((u3 · v2)/∥v2∥²) v2) · v1
v3 · v1 = (u3 · v1) − ((u3 · v1)/∥v1∥²) (v1 · v1) − ((u3 · v2)/∥v2∥²) (v2 · v1)
v3 · v1 = (u3 · v1) − (u3 · v1) − 0 as v2 · v1 = 0 and v1 · v1 = ∥v1∥²
and hence v3 · v1 = 0
Claim: v3 · v2 = 0
v3 · v2 = (u3 − ((u3 · v1)/∥v1∥²) v1 − ((u3 · v2)/∥v2∥²) v2) · v2
v3 · v2 = (u3 · v2) − ((u3 · v1)/∥v1∥²) (v1 · v2) − ((u3 · v2)/∥v2∥²) (v2 · v2)
v3 · v2 = (u3 · v2) − 0 − (u3 · v2) as v1 · v2 = 0 and v2 · v2 = ∥v2∥²
and hence v3 · v2 = 0. Thus {v1 , v2 , v3 } is an orthogonal set
Gram-Schmidt process (continued)
Continuing in this way and using all the remaining vectors u4 , · · · , un , we
construct the orthogonal set {v1 , v2 , · · · , vn }.
Thus, from the given LI set, we get an orthogonal set.
Claim: {v1 , · · · , vn } is an orthogonal basis (as the given LI set B is a basis)
If some vk = 0 then uk is a linear combination of v1 , · · · , vk−1 and hence a
linear combination of u1 , · · · , uk−1 , which is a contradiction.
So each vk ≠ 0.
As any orthogonal set of non-zero vectors in an inner product space is
linearly independent, we have this orthogonal set {v1 , · · · , vn } is Linearly
Independent.
We know, in an n-dimensional vector space V, any set of n linearly
independent vectors forms basis for V.
So this linearly independent orthogonal set {v1 , · · · , vn } of n-vectors in V
(of dimension n) is basis for V.
i.e.{v1 , · · · , vn } is orthogonal basis for inner product space V.
Hence the proof.
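The construction in this proof translates directly into code. Below is a minimal added sketch for the standard inner product on complex n-tuples; the function name gram_schmidt and the sample basis are my own choices, not from the notes:

    import numpy as np

    def ip(u, v):
        return np.sum(u * np.conj(v))      # conjugate on the second argument

    def gram_schmidt(basis):
        """Turn a linearly independent list of vectors into an orthogonal list."""
        vs = []
        for u in basis:
            # v_i = u_i - sum_j ((u_i · v_j)/∥v_j∥²) v_j
            v = u.astype(complex)
            for w in vs:
                v = v - (ip(u, w) / ip(w, w)) * w
            vs.append(v)
        return vs

    B = [np.array([1., 1., 0.]), np.array([1., 0., 1.]), np.array([0., 1., 1.])]
    V = gram_schmidt(B)
    for i in range(len(V)):
        for j in range(i):
            assert np.isclose(ip(V[i], V[j]), 0)   # pairwise orthogonal
    print([np.round(v.real, 3) for v in V])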
Orthonormal sets

Remark: In an n-dimensional inner product space, if a linearly independent
set containing fewer than n vectors is given, then using the Gram-Schmidt process
we get an orthogonal set (not an orthogonal basis).
Definition: {u1 , · · · , un } is said to be an orthonormal set if
ui · uj = 0 if i ≠ j, and ui · uj = 1 if i = j.
Remark: (1) If u is a non-zero vector in V then u/∥u∥ is a unit vector.
(2) If {v1 , · · · , vn } is an orthogonal set (orthogonal basis) in V then the
corresponding orthonormal set (orthonormal basis) is {v1/∥v1∥, · · · , vn/∥vn∥}.
H.W. Prove every orthonormal set in an inner product space is linearly
independent.



Existence of an orthonormal set in V
Every inner product space V (≠ zero space) possesses an orthonormal set.
Proof. Obviously, there exists u ≠ 0 in V.
Then clearly, ∥u∥ ≠ 0. Note that here ∥u∥ is a (positive) real number.
Consider (u/∥u∥) · (u/∥u∥)
= ((1/∥u∥) u) · ((1/∥u∥) u)
= (1/∥u∥) (u · ((1/∥u∥) u)) as (au) · v = a(u · v)
= (1/∥u∥) (1/∥u∥) (u · u) as u · (av) = ā(u · v) and 1/∥u∥ is real
= 1 as u · u = ∥u∥²
Hence {u/∥u∥} is an orthonormal set in V.
Thus, V possesses an orthonormal set.
orthogonal complement of a set

Definition: Let S be a subset of an inner product space V.

Then the orthogonal complement of S is S⊥ = {u ∈ V | u · s = 0 ∀ s ∈ S}
Result: Prove that S ⊥ is a subspace of inner product space V.
Proof. We know 0 · s = 0 ∀s ∈ S and hence 0 ∈ S ⊥ . Thus S ⊥ ̸= ∅
Now let s1 , s2 ∈ S ⊥ and a be any scalar.
Claim: as1 + s2 ∈ S ⊥
∀s ∈ S, consider (as1 + s2 ) · s
= a(s1 · s) + (s2 · s) by linearity property in V
= a(0) + (0)=0 as s1 , s2 ∈ S ⊥
Thus (as1 + s2 ) · s=0 ∀s ∈ S
Therefore (as1 + s2) ∈ S⊥ ∀ s1 , s2 ∈ S⊥ and for any scalar a.
Hence S⊥ is a subspace of V.
Definition: Let S be a subset of an inner product space V. Then the orthogonal
complement of S⊥ is (S⊥)⊥ = S⊥⊥ = {u ∈ V | u · s = 0 ∀ s ∈ S⊥}
H.W. Prove that S ⊆ S ⊥⊥
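For a concrete illustration (added here, not from the notes): in R^n with the standard inner product, S⊥ of a finite set S is the null space of the matrix whose rows are the vectors of S, so it can be computed with scipy.linalg.null_space; the sample set S below is my own choice:

    import numpy as np
    from scipy.linalg import null_space

    # S = {(1, 0, 1), (0, 1, 0)} in R^3; S-perp = null space of the matrix with these rows
    S = np.array([[1., 0., 1.],
                  [0., 1., 0.]])
    basis_of_S_perp = null_space(S)        # columns form an orthonormal basis of S-perp
    print(basis_of_S_perp)                 # spans the line through (1, 0, -1)/sqrt(2)
    # check: every basis vector of S-perp is orthogonal to every vector of S
    assert np.allclose(S @ basis_of_S_perp, 0)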
Algebra-III

Ashok Bingi

Department of Mathematics
St. Xavier’s College-Autonomous, Mumbai

Inner Product Space

30th August 2022



Lemma

Let S = {v1 , · · · , vm } be an orthonormal set in an inner product space V.
If v ∈ V then prove that the vector u = v − Σ_{i=1}^{m} (v · vi) vi is orthogonal to
each of v1 , · · · , vm and consequently, to the subspace spanned by S.
Proof: For j = 1, · · · , m, consider u · vj = (v − Σ_{i=1}^{m} (v · vi) vi) · vj
u · vj = (v − ((v · v1) v1 + · · · + (v · vj) vj + · · · + (v · vm) vm)) · vj
u · vj = v · vj − ((v · v1)(v1 · vj) + · · · + (v · vj)(vj · vj) + · · · + (v · vm)(vm · vj))
u · vj = v · vj − v · vj as S is orthonormal
u · vj = 0 ∀ j = 1, · · · , m
Hence the vector u = v − Σ_{i=1}^{m} (v · vi) vi is orthogonal to each of v1 , · · · , vm



Proof Continued

Let the vector x belong to subspace spanned by S={v1 , · · · , vm }.


Then x=c1 v1 + · · · + cj vj + · · · + cm vm for some scalars ci
Consider u · x = u · (c1 v1 + · · · + cj vj + · · · + cm vm)
u · x = c̄1 (u · v1) + · · · + c̄j (u · vj) + · · · + c̄m (u · vm)
u · x = 0 as u · vj = 0 ∀ j = 1, · · · , m
So u is orthogonal to any vector x in the subspace spanned by S.
Thus u is orthogonal to the subspace spanned by S.
Note: For direct sums in a vector space, refer to the posted PDF book (Krishnamurthy):
the definition and Theorem 3.4.4 on page 80, and Corollary 3.6.15 (the dimension
theorem for direct sums) on page 102.
Recall: Let V1 and V2 be subspaces of V such that dim(V1) = dim(V2).
Then V1 = V2 if and only if either V1 ⊆ V2 or V2 ⊆ V1



Projection Theorem
If W is a subspace of a finite dimensional inner product space V then
V = W ⊕ W⊥.
Proof: Clearly W is also finite dimensional. Let dim(W) = m.
As every finite dimensional inner product space has an orthonormal basis, let
B = {w1 , · · · , wm } be an orthonormal basis for W.
Let v ∈ V.
Define u = v − Σ_{i=1}^{m} (v · wi) wi
Then by the Lemma, u is orthogonal to each of w1 , · · · , wm and consequently,
to the subspace spanned by B.
i.e., u · wk = 0 ∀ wk ∈ B
So u · w = 0 ∀ w ∈ W, as every vector in W is a L.C. of vectors in B.
Thus u ∈ W⊥.
Now clearly v = Σ_{i=1}^{m} (v · wi) wi + u, where Σ_{i=1}^{m} (v · wi) wi ∈ W and
u ∈ W⊥.
Projection Theorem (continued)

Claim: W ∩ W ⊥ = {0}
Consider x ∈ W ∩ W ⊥
Then x ∈ W and x ∈ W ⊥
But x ∈ W ⊥ =⇒ x ·w =0 ∀ w ∈W
So in particular for w = x ∈ W , we have x · x = 0 which implies x = 0.
Thus x ∈ W ∩ W ⊥ =⇒ x = 0

Hence W ∩ W⊥ = {0}
Therefore, by definition of direct sum, we have, V = W ⊕ W ⊥ .



Corollary to Projection Theorem

If W is a subspace of a finite dimensional inner product space V then

W = W⊥⊥.
Proof: We know W ⊆ W⊥⊥, and W⊥ is also a subspace of V.
By the projection theorem applied to the subspace W, we have V = W ⊕ W⊥.
So by the dimension theorem for direct sums, dim(V) = dim(W) + dim(W⊥)
By the projection theorem applied to the subspace W⊥, we have
V = W⊥ ⊕ (W⊥)⊥ = W⊥ ⊕ W⊥⊥.
So by the dimension theorem for direct sums, dim(V) = dim(W⊥) + dim(W⊥⊥)
From these two dimension equations, we have dim(W) = dim(W⊥⊥)
Here W and W⊥⊥ are both subspaces of V such that W ⊆ W⊥⊥.
Hence dim(W) = dim(W⊥⊥) =⇒ W = W⊥⊥,
because for subspaces V1 and V2 of V with dim(V1) = dim(V2),
V1 = V2 if and only if either V1 ⊆ V2 or V2 ⊆ V1



Algebra-III

Ashok Bingi

Department of Mathematics
St. Xavier’s College-Autonomous, Mumbai

Inner Product Space

5th Sept 2022



Recall the proof of Projection Theorem
If W is a subspace of a finite dimensional inner product space V then
V = W ⊕ W⊥.
In its proof, {w1 , · · · , wm } was an orthonormal basis for the subspace W.
Let v ∈ V.
Then v = Σ_{i=1}^{m} (v · wi) wi + u, where Σ_{i=1}^{m} (v · wi) wi ∈ W and u ∈ W⊥.
Here Σ_{i=1}^{m} (v · wi) wi is called the orthogonal projection of v on W and is
denoted by projW v.
Here u = v − Σ_{i=1}^{m} (v · wi) wi = v − projW v is called the component of v
orthogonal to W.
Note: If {w1 , · · · , wm } is an orthogonal basis for the subspace W then
projW v = Σ_{i=1}^{m} ((v · wi)/∥wi∥²) wi = the orthogonal projection of v on W.



Q In R3 , under standard inner product, let W be subspace spanned by
orthogonal vectors w1 = (1, 0, 1) and w2 = (−1, 0, 1). If v = (1, 2, 3) then
find Orthogonal Projection of v on W.
Also find component of v orthogonal to W.
Ans. projW v = Σ_{i=1}^{2} ((v · wi)/∥wi∥²) wi = (1, 0, 3)
component of v orthogonal to W=v − projW v = (0, 2, 0)
Note: v − projW v is orthogonal to w1 and w2 and hence to each vector in
W spanned by w1 and w2
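The same numbers drop out of a few lines of NumPy (an added check, not part of the notes), using the orthogonal-basis formula for projW v:

    import numpy as np

    w1, w2, v = np.array([1., 0., 1.]), np.array([-1., 0., 1.]), np.array([1., 2., 3.])
    proj = sum((v @ w) / (w @ w) * w for w in (w1, w2))   # projW v over the orthogonal basis {w1, w2}
    print(proj, v - proj)                                 # [1. 0. 3.] [0. 2. 0.]
    assert np.isclose((v - proj) @ w1, 0) and np.isclose((v - proj) @ w2, 0)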
Q Prove Parallelogram law: ∥u + v ∥2 + ∥u − v ∥2 = 2∥u∥2 + 2∥v ∥2



Q.3 In R2 , under standard inner product, check whether the set
{(2, 0), (2, 3)} is orthonormal
Ans. (2, 0) · (2, 3) = 2(2) + 0(3) = 4 ̸= 0
So given set is not orthogonal and hence not orthonormal
 
(∗) The vector projection of u along v ≠ 0 is given by ((u · v)/∥v∥²) v
Here the scalar (u · v)/∥v∥² is called the component of u along v
Q.4 In R3 , under standard inner product, if u = (1, −1, 1) and
v = (1, 2, 0); find component of u along v.
Also obtain vector Projection of u along v.
Ans. Here u · v = −1 and ∥v∥² = 5
So the component of u along v = (u · v)/∥v∥² = −1/5
Now the vector projection of u along v = ((u · v)/∥v∥²) v = (−1/5) (1, 2, 0)

Result-1
Let S = {v1 , · · · , vm } be an orthogonal set in an inner product space V
where each vi ≠ 0 ∀ i = 1, · · · , m.
If u ∈ [S] = linear span of S then prove that u = Σ_{k=1}^{m} ((u · vk)/∥vk∥²) vk
k=1
Proof Here u=linear combination of elements in S
So u=c1 v1 + · · · + cj vj + · · · + cm vm ←− (∗)
For j = 1, · · · , m, consider u · vj = (c1 v1 + · · · + cj vj + · · · + cm vm ) · vj
u · vj = c1 (v1 · vj ) + · · · + cj (vj · vj ) + · · · + cm (vm · vj )
u · vj = cj (vj · vj ) as S is orthogonal
u · vj = cj ∥vj ∥2
Now ∥vj∥ ≠ 0 (as vj ≠ 0); dividing by ∥vj∥² gives
cj = (u · vj)/∥vj∥² ∀ j = 1, · · · , m.
So eq. (∗) becomes
u = ((u · v1)/∥v1∥²) v1 + · · · + ((u · vj)/∥vj∥²) vj + · · · + ((u · vm)/∥vm∥²) vm = Σ_{k=1}^{m} ((u · vk)/∥vk∥²) vk
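A small added sanity check of Result-1 in R³ (the orthogonal set and the choice of u below are arbitrary sample data): the coefficients (u · vk)/∥vk∥² recover exactly the coefficients used to build u.

    import numpy as np

    v1, v2, v3 = np.array([1., 1., 0.]), np.array([1., -1., 0.]), np.array([0., 0., 2.])  # orthogonal, non-zero
    u = 2 * v1 - 3 * v2 + 0.5 * v3                       # u lies in [S]
    coeffs = [(u @ v) / (v @ v) for v in (v1, v2, v3)]   # (u · vk)/∥vk∥²
    print(coeffs)                                        # [2.0, -3.0, 0.5]
    assert np.allclose(sum(c * v for c, v in zip(coeffs, (v1, v2, v3))), u)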
Result-2

Let S = {v1 , · · · , vm } be an orthonormal set in an inner product space V.
If u ∈ [S] = linear span of S then prove that u = Σ_{k=1}^{m} (u · vk) vk
Proof Here u=linear combination of elements in S
So u=c1 v1 + · · · + cj vj + · · · + cm vm ←− (∗)
For j = 1, · · · , m, consider u · vj = (c1 v1 + · · · + cj vj + · · · + cm vm ) · vj
u · vj = c1 (v1 · vj ) + · · · + cj (vj · vj ) + · · · + cm (vm · vj )
u · vj = cj (vj · vj ) as S is orthonormal
u · vj = cj as S is orthonormal
cj = u · vj ∀ j = 1, · · · , m.
So eq. (∗) becomes
u = (u · v1) v1 + · · · + (u · vj) vj + · · · + (u · vm) vm = Σ_{k=1}^{m} (u · vk) vk



Algebra-III

Ashok Bingi

Department of Mathematics
St. Xavier’s College-Autonomous, Mumbai

Inner Product Space

6th Sept 2022



Example
If the vector x = w − (a1 v1 + a2 v2), where ai = w · vi and {v1 , v2 } is an
orthonormal set in an inner product space V, then find ∥x∥²
Ans. We know (au) · v = a(u · v); u · (av) = ā(u · v); āi = \overline{w · vi} = vi · w
Consider (a1 v1 + a2 v2) · (a1 v1 + a2 v2)
= (a1 v1) · (a1 v1 + a2 v2) + (a2 v2) · (a1 v1 + a2 v2)
= a1 ā1 (v1 · v1) + a1 ā2 (v1 · v2) + a2 ā1 (v2 · v1) + a2 ā2 (v2 · v2)
= a1 ā1 (1) + a1 ā2 (0) + a2 ā1 (0) + a2 ā2 (1) by orthonormality
Thus (a1 v1 + a2 v2) · (a1 v1 + a2 v2) = a1 ā1 + a2 ā2
Consider ∥x∥² = x · x
= (w − (a1 v1 + a2 v2)) · (w − (a1 v1 + a2 v2))
= w · (w − (a1 v1 + a2 v2)) − (a1 v1 + a2 v2) · (w − (a1 v1 + a2 v2))
= w · w − w · (a1 v1 + a2 v2) − (a1 v1 + a2 v2) · w + (a1 v1 + a2 v2) · (a1 v1 + a2 v2)
= w · w − (ā1 (w · v1) + ā2 (w · v2)) − (a1 (v1 · w) + a2 (v2 · w)) + (a1 ā1 + a2 ā2)
= w · w − (ā1 a1 + ā2 a2) − (a1 ā1 + a2 ā2) + (a1 ā1 + a2 ā2)
Example continued

So ∥x∥² = x · x = w · w − (ā1 a1 + ā2 a2) = ∥w∥² − (|a1|² + |a2|²)

where x = w − (a1 v1 + a2 v2)

i.e., if x = w − ((w · v1) v1 + (w · v2) v2), where {v1 , v2 } is an orthonormal set in V,
then ∥x∥² = ∥w∥² − (|w · v1|² + |w · v2|²)
i.e., if x = w − Σ_{k=1}^{2} (w · vk) vk, where {v1 , v2 } is an orthonormal set in V, then
∥x∥² = ∥w∥² − Σ_{k=1}^{2} |w · vk|²



Bessel’s Inequality

If S = {v1 , · · · , vm } is an orthonormal set in an inner product space V then for all
w ∈ V, prove that Σ_{k=1}^{m} |w · vk|² ≤ ∥w∥²
Further, equality holds if and only if w belongs to the subspace spanned by S.
Proof: Let x = w − Σ_{k=1}^{m} (w · vk) vk ←− (∗∗)
Then ∥x∥² = ∥w∥² − Σ_{k=1}^{m} |w · vk|² ←− (∗)
As ∥x∥² ≥ 0, we have ∥w∥² − Σ_{k=1}^{m} |w · vk|² ≥ 0 and hence
Σ_{k=1}^{m} |w · vk|² ≤ ∥w∥²



Bessel's Inequality (continued)
Recall: x = w − Σ_{k=1}^{m} (w · vk) vk ←− (∗∗) and ∥x∥² = ∥w∥² − Σ_{k=1}^{m} |w · vk|² ←− (∗)
Now let equality hold in Bessel's inequality, i.e., Σ_{k=1}^{m} |w · vk|² = ∥w∥².
Then from eq. (∗), we have ∥x∥² = 0 and hence x = 0, which from eq. (∗∗)
gives w = Σ_{k=1}^{m} (w · vk) vk = a linear combination of v1 , · · · , vm.
Thus, w belongs to the subspace spanned by S.
Conversely, let w belong to the subspace spanned by S.
Then w ∈ [S] where S is an orthonormal set.
So w is a linear combination of v1 , · · · , vm given by w = Σ_{k=1}^{m} (w · vk) vk (by Result-2).
Thus eq. (∗∗) gives x = 0 and hence ∥x∥² = 0.
Hence from eq. (∗), we get Σ_{k=1}^{m} |w · vk|² = ∥w∥², and the equality follows.
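A quick added numerical check of Bessel's inequality, using the orthonormal set {(1,0,0), (0,1,0)} in R³ with the standard inner product:

    import numpy as np

    v1, v2 = np.array([1., 0., 0.]), np.array([0., 1., 0.])   # orthonormal set S in R^3
    rng = np.random.default_rng(3)
    w = rng.standard_normal(3)
    bessel_sum = (w @ v1) ** 2 + (w @ v2) ** 2
    print(bessel_sum, w @ w)                   # sum of |w·vk|² vs ∥w∥²; the first is ≤ the second
    assert bessel_sum <= w @ w + 1e-12
    # equality would hold only if w lay in the span of S, i.e. had zero third coordinate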
Consequences of Bessel’s Inequality

Corollary-1: Let S = {v1 , · · · , vm } be an orthonormal set in a finite dimensional
inner product space V. If equality holds in Bessel's inequality (for every w ∈ V)
then prove that S is a basis for V.
Proof: Consider w ∈ V.
Let x = w − Σ_{k=1}^{m} (w · vk) vk ←− (∗∗)
Then ∥x∥² = ∥w∥² − Σ_{k=1}^{m} |w · vk|² ←− (∗)
By assumption, equality holds in Bessel's inequality, i.e., Σ_{k=1}^{m} |w · vk|² = ∥w∥².
Then from eq. (∗), we have ∥x∥² = 0 and hence x = 0, which from eq. (∗∗)
gives w = Σ_{k=1}^{m} (w · vk) vk = a linear combination of v1 , · · · , vm.
Thus, w belongs to the subspace spanned by S and hence w ∈ [S].
Consequences of Bessel’s Inequality
So V is spanned by S (S spans V).
Now as every orthonormal set in an inner product space is linearly
independent, we have, the orthonormal set S is linearly independent in V.
Thus S is basis for V.
Corollary-2: If S = {v1 , · · · , vm } is an orthogonal set of non-zero vectors in an
inner product space V then for all w ∈ V, prove that Σ_{k=1}^{m} |w · vk|²/∥vk∥² ≤ ∥w∥²
Proof: Consider w ∈ V.
Define uk = vk/∥vk∥ ∀ k = 1, · · · , m. Then {u1 , · · · , um } is an orthonormal set.
Then by Bessel's Inequality for w ∈ V, we have Σ_{k=1}^{m} |w · uk|² ≤ ∥w∥²
But w · uk = w · (vk/∥vk∥) = (1/∥vk∥)(w · vk) and hence |w · uk|² = (1/∥vk∥²) |w · vk|²
So the above Bessel's inequality becomes Σ_{k=1}^{m} |w · vk|²/∥vk∥² ≤ ∥w∥²
Consequences of Bessel’s Inequality

Note that Corollary-2 is Bessel's Inequality for an orthogonal set of non-zero
vectors in V.
(∗) Equality in Corollary-2 holds if and only if w = Σ_{k=1}^{m} ((w · vk)/∥vk∥²) vk

(∗) Let S = {v1 , · · · , vm } be an orthogonal set of non-zero vectors in a finite
dimensional inner product space V. If equality holds in its Bessel's
inequality (for every w ∈ V) then prove that S is a basis for V.

