Suppose we are given two vectors $u$ and $v$ in a plane. If we want to get vectors $w_1$ and $w_2$ such that $w_1$ is a unit vector in the direction of $u$ and $w_2$ is a unit vector perpendicular to $w_1$, then they can be obtained in the following way:

Take the first vector $w_1 = \frac{u}{\|u\|}$. Let $\theta$ be the angle between the vectors $u$ and $v$. Then the vector
$$v - \langle v, w_1 \rangle w_1$$
is perpendicular to $w_1$, as we have removed the component of $w_1$ from $v$. So, the vectors that we are interested in are $w_1$ and
$$w_2 = \frac{v - \langle v, w_1 \rangle w_1}{\|v - \langle v, w_1 \rangle w_1\|}.$$
This idea is used to give the Gram-Schmidt Orthogonalisation process, which we now describe.
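The two-step construction above is easy to check numerically. The sketch below uses NumPy with the standard inner product; the vectors `u` and `v` are our own illustrative choices, not taken from the text:

```python
import numpy as np

# Two arbitrary linearly independent vectors in the plane (illustrative choice).
u = np.array([3.0, 1.0])
v = np.array([2.0, 2.0])

# w1: unit vector in the direction of u.
w1 = u / np.linalg.norm(u)

# Remove from v its component along w1; the remainder is perpendicular to w1.
v_perp = v - np.dot(v, w1) * w1

# w2: unit vector perpendicular to w1.
w2 = v_perp / np.linalg.norm(v_perp)

print(np.dot(w1, w2))  # ~0: w1 and w2 are orthogonal
```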
THEOREM 5.2.1 (Gram-Schmidt Orthogonalisation Process) Let $V$ be an inner product space. Suppose $\{u_1, u_2, \ldots, u_n\}$ is a set of linearly independent vectors of $V$. Then there exists a set $\{v_1, v_2, \ldots, v_n\}$ of vectors of $V$ satisfying the following:
https://nptel.ac.in/content/storage2/courses/122104018/node49.html 1/11
09/01/2021 Gram-Schmidt Orthogonalisation Process
1. $\|v_i\| = 1$ for $1 \le i \le n$,
2. $\langle v_i, v_j \rangle = 0$ for $1 \le i \ne j \le n$, and
3. $L(v_1, v_2, \ldots, v_i) = L(u_1, u_2, \ldots, u_i)$ for $1 \le i \le n$.

Proof. We prove the result by induction on $n$, the number of vectors. For $n = 1$, take
$$w_1 = u_1 \qquad (5.2.1)$$
and define $v_1 = \frac{w_1}{\|w_1\|}$. Then $\|v_1\| = 1$ and $L(v_1) = L(u_1)$.
Let the result hold for all $k \le n - 1$. That is, suppose we are given any set of $k$, $1 \le k \le n-1$, linearly independent vectors $\{u_1, u_2, \ldots, u_k\}$ of $V$. Then by the inductive assumption, there exists a set $\{v_1, v_2, \ldots, v_k\}$ of vectors satisfying
1. $\|v_i\| = 1$ for $1 \le i \le k$,
2. $\langle v_i, v_j \rangle = 0$ for $1 \le i \ne j \le k$, and
3. $L(v_1, v_2, \ldots, v_i) = L(u_1, u_2, \ldots, u_i)$ for $1 \le i \le k$.
Now, let us assume that we are given a set of $n$ linearly independent vectors $\{u_1, u_2, \ldots, u_n\}$ of $V$. Then by the inductive assumption, we already have vectors $v_1, v_2, \ldots, v_{n-1}$ satisfying
1. $\|v_i\| = 1$ for $1 \le i \le n-1$,
2. $\langle v_i, v_j \rangle = 0$ for $1 \le i \ne j \le n-1$, and
3. $L(v_1, v_2, \ldots, v_i) = L(u_1, u_2, \ldots, u_i)$ for $1 \le i \le n-1$.

Using these vectors, define
$$w_n = u_n - \langle u_n, v_1 \rangle v_1 - \langle u_n, v_2 \rangle v_2 - \cdots - \langle u_n, v_{n-1} \rangle v_{n-1}. \qquad (5.2.2)$$
We first show that $w_n \ne 0$. This will also imply that $\|w_n\| \ne 0$ and hence $v_n = \frac{w_n}{\|w_n\|}$ is well defined.
On the contrary, assume that $w_n = 0$. Then by (5.2.2) there exist scalars $\alpha_1, \alpha_2, \ldots, \alpha_{n-1}$ (namely $\alpha_i = \langle u_n, v_i \rangle$) such that
$$u_n = \alpha_1 v_1 + \alpha_2 v_2 + \cdots + \alpha_{n-1} v_{n-1}.$$
So, by (5.2.2) and the fact that $L(v_1, \ldots, v_{n-1}) = L(u_1, \ldots, u_{n-1})$, we get $u_n \in L(u_1, u_2, \ldots, u_{n-1})$. This gives a contradiction to the given assumption that the set of vectors $\{u_1, u_2, \ldots, u_n\}$ is linearly independent.

So $w_n \ne 0$, and we define $v_n = \frac{w_n}{\|w_n\|}$. Then $\|v_n\| = 1$, and for $1 \le i \le n-1$, equation (5.2.2) gives
$$\langle v_n, v_i \rangle = \frac{1}{\|w_n\|}\bigl(\langle u_n, v_i \rangle - \langle u_n, v_i \rangle\bigr) = 0.$$
Also, by construction, $L(v_1, v_2, \ldots, v_n) = L(u_1, u_2, \ldots, u_n)$. Hence, by the principle of mathematical induction, the proof of the theorem is complete. □
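The inductive construction in the proof translates directly into an algorithm: subtract from each new vector its components along the vectors already constructed, then normalise. A sketch in Python with NumPy, using the standard inner product on $\mathbb{R}^n$ (the function name is ours, and the input is assumed linearly independent):

```python
import numpy as np

def gram_schmidt(vectors):
    """Return an orthonormal list v_1,...,v_n spanning the same subspaces
    as the (assumed linearly independent) input u_1,...,u_n."""
    ortho = []
    for u in vectors:
        w = np.asarray(u, dtype=float)
        # Subtract the components along the already-constructed v_j (eq. 5.2.2).
        for v in ortho:
            w = w - np.dot(w, v) * v
        # Linear independence guarantees w != 0, so normalisation is well defined.
        ortho.append(w / np.linalg.norm(w))
    return ortho

vs = gram_schmidt([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0], [0.0, 1.0, 1.0]])
```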
Remark 5.2.3
1. The proof of Theorem 5.2.1 gives an algorithm: take $v_1 = \frac{u_1}{\|u_1\|}$, and for $2 \le i \le n$, set
$$w_i = u_i - \sum_{j=1}^{i-1} \langle u_i, v_j \rangle v_j \quad \text{and} \quad v_i = \frac{w_i}{\|w_i\|}.$$
2. Suppose we are given a set of vectors $\{u_1, u_2, \ldots, u_n\}$ of $V$ that are linearly dependent. Then by Corollary 3.2.5, there exists a smallest $k$, $2 \le k \le n$, such that $u_k \in L(u_1, u_2, \ldots, u_{k-1})$, for the set $\{u_1, u_2, \ldots, u_{k-1}\}$ is linearly independent (use Corollary 3.2.5). So, by Theorem 5.2.1, there exists an orthonormal set $\{v_1, v_2, \ldots, v_{k-1}\}$ with $L(v_1, \ldots, v_{k-1}) = L(u_1, \ldots, u_{k-1})$. As $u_k \in L(v_1, \ldots, v_{k-1})$, by Remark 5.1.15 we have
$$u_k = \sum_{j=1}^{k-1} \langle u_k, v_j \rangle v_j.$$
So, by definition of $w_k$, we get $w_k = 0$. Therefore, in this case, we can continue with the Gram-Schmidt process by replacing $u_k$ by $u_{k+1}$ (that is, by dropping $u_k$).
3. Let $S = \{u_1, u_2, \ldots\}$ be a countably infinite set of linearly independent vectors. Then one can apply the Gram-Schmidt process to get a countably infinite orthonormal set.
4. Let $\{v_1, v_2, \ldots, v_n\}$ be an orthonormal subset of $\mathbb{R}^n$. Let $\mathcal{B} = (e_1, e_2, \ldots, e_n)$ be the standard ordered basis of $\mathbb{R}^n$. Then there exist real numbers $\alpha_{ij}$, $1 \le i, j \le n$, such that
$$v_i = \alpha_{1i} e_1 + \alpha_{2i} e_2 + \cdots + \alpha_{ni} e_n.$$
Let $A = [v_1, v_2, \ldots, v_n]$; then $A = (\alpha_{ij})$ is an $n \times n$ matrix. \qquad (5.2.3)
Note that, since the columns of $A$ are orthonormal,
$$A^T A = I_n = A A^T.$$
The reader may have noticed that the inverse of $A$ is its transpose. Such matrices are called orthogonal matrices and they have a special role to play.

DEFINITION 5.2.4 (Orthogonal Matrix) An $n \times n$ real matrix $A$ is said to be an orthogonal matrix if $A A^T = A^T A = I_n$.
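The defining property $A^T A = I_n = A A^T$, and hence $A^{-1} = A^T$, can be checked numerically. A small sketch; the matrix below (a rotation matrix, whose columns are an orthonormal subset of $\mathbb{R}^2$) is our own illustrative choice:

```python
import numpy as np

# Columns of a rotation matrix form an orthonormal set (illustrative choice).
t = 0.7
A = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])

I2 = np.eye(2)
# For an orthogonal matrix, A^T A = I = A A^T, so the inverse is the transpose.
print(np.allclose(A.T @ A, I2))            # True
print(np.allclose(A @ A.T, I2))            # True
print(np.allclose(np.linalg.inv(A), A.T))  # True
```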
EXERCISE 5.2.5
1. Let $A$ and $B$ be two $n \times n$ orthogonal matrices. Then prove that $AB$ and $BA$ are both orthogonal matrices.
2. Let $A$ be an $n \times n$ orthogonal matrix. Then prove that
  1. the rows of $A$ form an orthonormal basis of $\mathbb{R}^n$,
  2. the columns of $A$ form an orthonormal basis of $\mathbb{R}^n$,
  3. for any two vectors $x, y \in \mathbb{R}^{n \times 1}$, $\langle Ax, Ay \rangle = \langle x, y \rangle$, where $\langle \cdot, \cdot \rangle$ denotes the standard inner product on $\mathbb{R}^n$.
4. Let $A$ be an $n \times n$ upper triangular matrix. If $A$ is also an orthogonal matrix, then prove that $A$ is a diagonal matrix with diagonal entries $\pm 1$.
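Before proving these statements, it can be instructive to verify them numerically. A sketch using rotation matrices as a convenient source of orthogonal matrices (the matrices and vectors are our own illustrative choices):

```python
import numpy as np

def rotation(t):
    """A 2x2 rotation matrix -- orthogonal for every angle t."""
    return np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])

A, B = rotation(0.3), rotation(1.1)

# Exercise 1: products of orthogonal matrices are orthogonal.
for M in (A @ B, B @ A):
    assert np.allclose(M.T @ M, np.eye(2))

# Exercise 2.3: orthogonal matrices preserve the standard inner product.
x, y = np.array([1.0, 2.0]), np.array([-3.0, 0.5])
assert np.isclose(np.dot(A @ x, A @ y), np.dot(x, y))
```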
THEOREM 5.2.6 (QR Decomposition) Let $A$ be a square matrix of order $n$. Then there exist matrices $Q$ and $R$ such that $A = QR$, where $Q$ is orthogonal and $R$ is upper triangular.
In case $A$ is non-singular, the diagonal entries of $R$ can be chosen to be positive. Also, in this case, the decomposition is unique.
Proof. We prove the theorem when $A$ is non-singular. The proof for the singular case is left as an exercise.

Let the columns of $A$ be $x_1, x_2, \ldots, x_n$. The Gram-Schmidt orthogonalisation process applied to the vectors $x_1, x_2, \ldots, x_n$ gives the vectors $u_1, u_2, \ldots, u_n$ satisfying
$$\|u_i\| = 1, \quad \langle u_i, u_j \rangle = 0 \text{ for } i \ne j, \quad L(u_1, \ldots, u_i) = L(x_1, \ldots, x_i), \quad 1 \le i \le n. \qquad (5.2.4)$$
Hence each $x_i \in L(u_1, \ldots, u_i)$, and so
$$x_i = \sum_{j=1}^{i} \langle x_i, u_j \rangle u_j, \quad 1 \le i \le n. \qquad (5.2.5)$$
Thus, we see that $A = QR$, where $Q = [u_1, u_2, \ldots, u_n]$ is an orthogonal matrix (see Remark 5.2.3.4) and
$$R = \begin{pmatrix} \langle x_1, u_1 \rangle & \langle x_2, u_1 \rangle & \cdots & \langle x_n, u_1 \rangle \\ 0 & \langle x_2, u_2 \rangle & \cdots & \langle x_n, u_2 \rangle \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & \langle x_n, u_n \rangle \end{pmatrix}$$
is an upper triangular matrix.
The proof doesn't guarantee that the diagonal entries $\langle x_i, u_i \rangle$ of $R$, $1 \le i \le n$, are positive. But this can be achieved by replacing the vector $u_i$ by $-u_i$ whenever $\langle x_i, u_i \rangle$ is negative.
For the uniqueness, suppose $A = Q_1 R_1 = Q_2 R_2$ are two such decompositions, with the diagonal entries of $R_1$ and $R_2$ positive. Then $Q_2^{-1} Q_1 = R_2 R_1^{-1}$. Observe that
1. the inverse of an upper triangular matrix is also an upper triangular matrix, and
2. the product of upper triangular matrices is also upper triangular.
Thus the matrix $R_2 R_1^{-1}$ is an upper triangular matrix. Also, by Exercise 5.2.5.1, the matrix $Q_2^{-1} Q_1 = Q_2^T Q_1$ is an orthogonal matrix. Hence, by Exercise 5.2.5.4, $R_2 R_1^{-1}$ is a diagonal matrix with diagonal entries $\pm 1$; since the diagonal entries of $R_1$ and $R_2$ are positive, this forces $R_2 R_1^{-1} = I_n$, that is, $R_1 = R_2$ and $Q_1 = Q_2$. □
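The constructive part of the proof is exactly QR by Gram-Schmidt: $Q$ collects the orthonormal vectors and $r_{ij} = \langle x_j, u_i \rangle$. A sketch for a non-singular square matrix, including the sign fix that makes the diagonal of $R$ positive (the function name and test matrix are our own):

```python
import numpy as np

def qr_gram_schmidt(A):
    """QR decomposition of a non-singular square matrix via Gram-Schmidt:
    A = Q R with Q orthogonal and R upper triangular, diag(R) > 0."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    Q = np.zeros((n, n))
    R = np.zeros((n, n))
    for j in range(n):
        w = A[:, j].copy()
        # r_ij = <x_j, u_i>: components along the earlier orthonormal columns.
        for i in range(j):
            R[i, j] = np.dot(A[:, j], Q[:, i])
            w -= R[i, j] * Q[:, i]
        # Non-singularity guarantees w != 0, so r_jj = ||w|| > 0.
        R[j, j] = np.linalg.norm(w)
        Q[:, j] = w / R[j, j]
    return Q, R

A = np.array([[2.0, 1.0], [1.0, 3.0]])
Q, R = qr_gram_schmidt(A)
```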
Hence, proceeding along the lines of the above theorem, we have the following result.

THEOREM 5.2.7 (Generalised QR Decomposition) Let $A$ be an $m \times n$ matrix of rank $r$. Then $A = QR$, where
1. $Q$ is an $m \times r$ matrix with $Q^T Q = I_r$,
2. if $Q = [u_1, u_2, \ldots, u_r]$, then $L(u_1, u_2, \ldots, u_r) = L(\text{columns of } A)$, and
3. $R$ is an $r \times n$ matrix with rank $r$.
EXAMPLE 5.2.8
1. Let $A$ be the given matrix. Find an orthogonal matrix $Q$ and an upper triangular matrix $R$ such that $A = QR$.
(5.2.6)
(5.2.7)
and
So, Hence,
2. with , and
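Computations like the one in this example can be reproduced with a library routine. A sketch using `numpy.linalg.qr` (the matrix below is our own illustrative choice, not the one from the example); note that NumPy does not promise positive diagonal entries in $R$, so the signs may need fixing to obtain the unique decomposition of Theorem 5.2.6:

```python
import numpy as np

A = np.array([[1.0, 1.0], [0.0, 2.0]])  # illustrative non-singular matrix
Q, R = np.linalg.qr(A)

# Make diag(R) positive: flip signs of matching columns of Q and rows of R.
# This leaves the product Q R unchanged.
signs = np.sign(np.diag(R))
Q, R = Q * signs, (R.T * signs).T

assert np.allclose(Q @ R, A)
```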
EXERCISE 5.2.9
2. Prove that the given polynomials form an orthogonal set of functions in the inner product space of continuous functions, with the inner product defined in the above exercise.
3. Consider the vector space with the standard inner product defined in the above exercise. Find an orthonormal set of vectors orthogonal to the given vector in it.
5. Determine an orthogonal basis of the vector subspace spanned by the given set of vectors.
8. Let with Prove that there exists an orthogonal matrix such that
A K Lal 2007-09-12