
2.5 Generation of Orthonormal Bases

We have seen that orthonormal bases simplify the calculation of the coefficients $\hat{x}_n$, and the calculation of distances and energies from those coefficients. This section outlines two classical ways to generate an orthonormal basis from an arbitrary set of basis vectors. (Note that the arbitrary basis and the orthonormal basis span the same space.) We'll denote the projection of some $x(t)$ onto the space spanned by, say, $v_1(t)$ and $v_2(t)$ by

$$P_{v_1,v_2}\,x(t) = \hat{x}(t) = \sum_{n=1}^{2} \hat{x}_n\, v_n(t)$$

where

$$\hat{\mathbf{x}} = \mathbf{G}^{-1}\mathbf{p}, \qquad [\mathbf{G}]_{k,n} = \int v_k^*(t)\, v_n(t)\, dt, \qquad p_n = \int x(t)\, v_n^*(t)\, dt .$$
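As a numerical illustration of these formulas (a sketch only, not from the notes: the basis pair, the signal, and the sampling grid are arbitrary choices), sampled signals let the integrals become dt-weighted sums:

```python
import numpy as np

# Projection onto the span of a NON-orthonormal basis {v1, v2}.
# Integrals are approximated by dt-weighted sums over a sampling grid.
dt = 0.01
t = np.arange(0.0, 1.0, dt)
v1 = np.ones_like(t)              # constant waveform
v2 = t                            # ramp -- correlated with v1, so G != I
V = np.column_stack([v1, v2])     # basis waveforms as columns
x = np.sin(2 * np.pi * t)         # signal to project

# [G]_{k,n} = integral v_k^*(t) v_n(t) dt,   p_n = integral x(t) v_n^*(t) dt
G = V.conj().T @ V * dt
p = V.conj().T @ x * dt

coeffs = np.linalg.solve(G, p)    # x_hat = G^{-1} p
x_hat = V @ coeffs                # projection P_{v1,v2} x(t)

# The projection error is orthogonal to the space spanned by v1 and v2.
err = x - x_hat
print(np.abs(V.conj().T @ err * dt))   # both inner products are ~0
```

When the basis is orthonormal, $\mathbf{G} = \mathbf{I}$ and the coefficients reduce to the correlations $p_n$ directly.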

2.5.1 Gram-Schmidt Orthogonalization [P 4.2.3]

Consider a set of waveforms $\{s_i(t),\ i = 1,\ldots,M\}$, not necessarily orthonormal. The GS algorithm creates from them an orthonormal set $\{u_k(t),\ k = 1,\ldots,N\}$ that spans the same space: an alternative basis set.

GS rests on the fact that the error in a projection is orthogonal to the space onto which the vector is projected; each error waveform is then normalized.
$$
e_1(t) = s_1(t), \qquad
e_2(t) = s_2(t) - P_{s_1} s_2(t), \qquad
e_3(t) = s_3(t) - P_{s_1,s_2}\, s_3(t), \qquad \ldots
$$

$$
u_1(t) = \frac{e_1(t)}{\sqrt{E_{e_1}}}, \qquad
u_2(t) = \frac{e_2(t)}{\sqrt{E_{e_2}}}, \qquad
u_3(t) = \frac{e_3(t)}{\sqrt{E_{e_3}}}, \qquad \ldots
$$

Note that $u_1(t),\ldots,u_n(t)$ span the same space as $s_1(t),\ldots,s_n(t)$. So do this:

1. $u_1(t) = s_1(t)\big/\sqrt{E_{s_1}}$
2. $c_{21} = \int s_2(t)\, u_1^*(t)\, dt$, $\qquad e_2(t) = s_2(t) - c_{21}\, u_1(t)$, $\qquad u_2(t) = e_2(t)\big/\sqrt{E_{e_2}}$

$\quad\vdots$

n. $c_{ni} = \int s_n(t)\, u_i^*(t)\, dt$, $\qquad e_n(t) = s_n(t) - \sum_{i=1}^{n-1} c_{ni}\, u_i(t)$, $\qquad u_n(t) = e_n(t)\big/\sqrt{E_{e_n}}$

etc.

In the process, if any $e_k(t) = 0$, then $s_k(t)$ is linearly dependent on its predecessors, so discard it.
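The numbered steps above, including the rule for discarding dependent waveforms, can be sketched for sampled waveforms as follows (a minimal illustration, assuming dt-weighted sums for the integrals; the test waveforms are made up):

```python
import numpy as np

def gram_schmidt(S, dt, tol=1e-10):
    """Classical GS on sampled waveforms (columns of S), dropping any
    waveform whose residual energy E_{e_k} is numerically zero."""
    basis = []
    for k in range(S.shape[1]):
        e = S[:, k].astype(complex)         # astype returns a fresh copy
        for u in basis:
            # c_{ki} = integral s_k(t) u_i^*(t) dt  (the u_i are orthonormal,
            # so projecting the running residual gives the same coefficient)
            c = np.sum(e * u.conj()) * dt
            e = e - c * u
        energy = np.sum(np.abs(e) ** 2) * dt        # E_{e_k}
        if energy > tol:                            # e_k = 0 => discard s_k
            basis.append(e / np.sqrt(energy))       # u_k = e_k / sqrt(E_{e_k})
    return np.column_stack(basis)

dt = 0.01
t = np.arange(0.0, 1.0, dt)
# Third waveform is 1*s1 + 2*s2, hence linearly dependent on its predecessors.
S = np.column_stack([np.ones_like(t), t, 1.0 + 2.0 * t])
U = gram_schmidt(S, dt)
print(U.shape[1])                         # 2: the dependent waveform was dropped
print(np.round(U.conj().T @ U * dt, 6))   # 2x2 identity: orthonormal set
```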

See Proakis, Example 4.2-2.

GS can also be used to decorrelate sets of random variables [ENSC 810, Sect. 1.3], and it is closely related to the Cholesky decomposition [ENSC 810, Sect. 1.3].


When GS is applied to tuples $\mathbf{s}_1, \mathbf{s}_2, \ldots, \mathbf{s}_M$ instead of waveforms, the GS equations above can be written

$$
\begin{aligned}
\mathbf{s}_1 &= c_{11}\,\mathbf{u}_1 \\
\mathbf{s}_2 &= c_{21}\,\mathbf{u}_1 + c_{22}\,\mathbf{u}_2 \\
\mathbf{s}_3 &= c_{31}\,\mathbf{u}_1 + c_{32}\,\mathbf{u}_2 + c_{33}\,\mathbf{u}_3
\end{aligned}
$$

or

$$
[\,\mathbf{s}_1 \,|\, \mathbf{s}_2 \,|\, \cdots \,|\, \mathbf{s}_M\,]
= [\,\mathbf{u}_1 \,|\, \mathbf{u}_2 \,|\, \cdots \,|\, \mathbf{u}_N\,]
\begin{bmatrix}
c_{11} & c_{21} & c_{31} & \cdots & c_{M1} \\
0 & c_{22} & c_{32} & \cdots & c_{M2} \\
0 & 0 & c_{33} & \cdots & c_{M3} \\
\vdots & & & \ddots & \vdots \\
0 & 0 & 0 & \cdots & c_{MN}
\end{bmatrix}
$$

or

$$\mathbf{S} = \mathbf{U}\,\mathbf{C},$$

where $\mathbf{U}$ is unitary (its columns are orthonormal) and $\mathbf{C}$ is upper triangular. It's shown for $M = N$, but more generally $M > N$.

This is the QR decomposition of $\mathbf{S}$.

$\mathbf{C}$ is not necessarily square, and $\mathbf{U}$ has fewer columns than $\mathbf{S}$ if the columns of $\mathbf{S}$ are linearly dependent.
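Numerically, the GS factorization of a set of tuples is exactly what `numpy.linalg.qr` computes, and since $\mathbf{G} = \mathbf{S}^{\dagger}\mathbf{S} = \mathbf{C}^{\dagger}\mathbf{C}$, the triangular factor is also a Cholesky factor of the Gram matrix, which ties in the Cholesky remark earlier. A sketch with made-up tuples:

```python
import numpy as np

# Four-dimensional tuples with M = 3 independent columns (illustration data).
S = np.array([[1.0, 2.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 1.0]])

# Reduced QR: U has orthonormal columns, C is upper triangular, S = U C.
U, C = np.linalg.qr(S)
print(np.allclose(U @ C, S))              # True
print(np.allclose(U.T @ U, np.eye(3)))    # True

# G = S^H S = C^H (U^H U) C = C^H C, so C is a Cholesky factor of G
# (unique up to the signs of its rows, for this real example).
G = S.T @ S
L = np.linalg.cholesky(G)                 # G = L L^T, L lower triangular
print(np.allclose(np.abs(L.T), np.abs(C)))  # True
```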


We have been working with waveforms:

$$
\begin{aligned}
s_1(t) &= c_{11}\,u_1(t) \\
s_2(t) &= c_{21}\,u_1(t) + c_{22}\,u_2(t) \\
s_3(t) &= c_{31}\,u_1(t) + c_{32}\,u_2(t) + c_{33}\,u_3(t)
\end{aligned}
$$

so we can write these equations as (shown for $M = N$)

$$
[\,s_1(t),\, s_2(t),\, \ldots,\, s_M(t)\,]
= [\,u_1(t),\, u_2(t),\, \ldots,\, u_N(t)\,]
\begin{bmatrix}
c_{11} & c_{21} & c_{31} & \cdots & c_{M1} \\
0 & c_{22} & c_{32} & \cdots & c_{M2} \\
0 & 0 & c_{33} & \cdots & c_{M3} \\
\vdots & & & \ddots & \vdots \\
0 & 0 & 0 & \cdots & c_{MN}
\end{bmatrix}
$$

or, more concisely, as

$$\mathbf{s}(t) = \mathbf{u}(t)\,\mathbf{C}$$

where the row vector $\mathbf{u}(t)$ satisfies

$$\int \mathbf{u}^{\dagger}(t)\,\mathbf{u}(t)\,dt = \mathbf{I}.$$


2.5.2 Orthogonalization by Eigendecomposition

The waveforms $s_i(t),\ i = 1,\ldots,N$ can also be orthogonalized by eigendecomposition. This method does the orthogonalization in a batch, rather than sequentially as in GS. Assume the original basis functions $s_i(t),\ i = 1,\ldots,N$ are linearly independent. We form the orthonormal basis functions $u_i(t),\ i = 1,\ldots,N$ as linear combinations of the $s_i(t)$ by

$$u_n(t) = \sum_{i=1}^{N} w_{in}\, s_i(t), \qquad n = 1,\ldots,N.$$

Rewrite this with time vector notation:

$$u_n(t) = [\,s_1(t)\ \cdots\ s_N(t)\,] \begin{bmatrix} w_{1n} \\ \vdots \\ w_{Nn} \end{bmatrix} = \mathbf{s}(t)\,\mathbf{w}_n$$

So the complete new basis is

$$\mathbf{u}(t) = [\,u_1(t)\ \cdots\ u_N(t)\,] = \mathbf{s}(t)\,\mathbf{W}$$

where $\mathbf{W} = [\,\mathbf{w}_1\ \mathbf{w}_2\ \cdots\ \mathbf{w}_N\,]$ has the coefficient vectors as its columns.


We want to choose the coefficients so that the $u_i(t)$ are orthonormal:

$$\int u_m^*(t)\, u_n(t)\, dt = \sum_{i=1}^{N} \sum_{k=1}^{N} w_{km}^*\, w_{in} \underbrace{\int s_k^*(t)\, s_i(t)\, dt}_{[\mathbf{G}]_{k,i}} = \delta_{mn}$$

This condition can be written as $\mathbf{w}_m^{\dagger}\,\mathbf{G}\,\mathbf{w}_n = \delta_{mn}$, or $\mathbf{W}^{\dagger}\,\mathbf{G}\,\mathbf{W} = \mathbf{I}$.

It's more easily seen with time vectors. We want

$$\int \mathbf{u}^{\dagger}(t)\,\mathbf{u}(t)\,dt = \mathbf{W}^{\dagger} \left( \int \mathbf{s}^{\dagger}(t)\,\mathbf{s}(t)\,dt \right) \mathbf{W} = \mathbf{W}^{\dagger}\,\mathbf{G}\,\mathbf{W} = \mathbf{I}$$

If we can find a $\mathbf{W}$ that satisfies this equation, then we have the desired orthonormal basis as $\mathbf{u}(t) = \mathbf{s}(t)\,\mathbf{W}$.

As a first step, diagonalize $\mathbf{G}$ by a similarity transform. The Gram matrix is Hermitian (i.e., $\mathbf{G} = \mathbf{G}^{\dagger}$), so it has special properties:

o Its eigenvectors $\mathbf{q}_i,\ i = 1,\ldots,N$ are orthogonal. Assume they are normalized to unit norm, and use them as the columns of a matrix $\mathbf{Q}$. Then $\mathbf{Q}^{\dagger}\mathbf{Q} = \mathbf{I}$; that is, $\mathbf{Q}$ is unitary. (Eigenvectors can be complex.) Now we can diagonalize and undiagonalize $\mathbf{G}$ by

$$\mathbf{Q}^{\dagger}\,\mathbf{G}\,\mathbf{Q} = \mathbf{\Lambda} \quad \text{and} \quad \mathbf{Q}\,\mathbf{\Lambda}\,\mathbf{Q}^{\dagger} = \mathbf{G}, \qquad \text{where } \mathbf{\Lambda} = \mathrm{diag}[\lambda_1,\ldots,\lambda_N]$$

because $\mathbf{Q}^{\dagger}$ acts as $\mathbf{Q}^{-1}$. This is the unitary similarity transform.

o And the eigenvalues $\lambda_i,\ i = 1,\ldots,N$ are real.

A Gram matrix $\mathbf{G}$ is also positive semidefinite, so $\lambda_i \ge 0,\ i = 1,\ldots,N$. We assumed the original basis set was linearly independent, so this makes $\mathbf{G}$ positive definite: $\lambda_i > 0,\ i = 1,\ldots,N$.

Choose $\mathbf{W} = \mathbf{Q}\,\mathbf{\Lambda}^{-1/2}\,\mathbf{Q}^{\dagger}$ (where $\mathbf{\Lambda}^{-1/2} = \mathrm{diag}\!\left[\lambda_1^{-1/2},\ldots,\lambda_N^{-1/2}\right]$), so that

$$\mathbf{W}^{\dagger}\,\mathbf{G}\,\mathbf{W} = \mathbf{Q}\,\mathbf{\Lambda}^{-1/2}\,\mathbf{Q}^{\dagger}\,\mathbf{G}\,\mathbf{Q}\,\mathbf{\Lambda}^{-1/2}\,\mathbf{Q}^{\dagger} = \mathbf{Q}\,\mathbf{\Lambda}^{-1/2}\,\mathbf{\Lambda}\,\mathbf{\Lambda}^{-1/2}\,\mathbf{Q}^{\dagger} = \mathbf{Q}\,\mathbf{Q}^{\dagger} = \mathbf{I}$$

which is the condition we needed. So we obtain an orthonormal set $u_i(t),\ i = 1,\ldots,N$ just by

$$\mathbf{u}(t) = \mathbf{s}(t)\,\mathbf{Q}\,\mathbf{\Lambda}^{-1/2}\,\mathbf{Q}^{\dagger}$$

But we can also think of $\mathbf{Q}\,\mathbf{\Lambda}^{-1/2}\,\mathbf{Q}^{\dagger}$ as $\mathbf{G}^{-1/2}$, since

$$\left( \mathbf{Q}\,\mathbf{\Lambda}^{-1/2}\,\mathbf{Q}^{\dagger} \right) \left( \mathbf{Q}\,\mathbf{\Lambda}^{-1/2}\,\mathbf{Q}^{\dagger} \right) = \mathbf{Q}\,\mathbf{\Lambda}^{-1}\,\mathbf{Q}^{\dagger} = \mathbf{G}^{-1}$$

So $\mathbf{W} = \mathbf{G}^{-1/2}$, and we can also write the orthonormal basis as

$$\mathbf{u}(t) = \mathbf{s}(t)\,\mathbf{G}^{-1/2}$$

And that's orthogonalization by eigendecomposition. This eigenmethod of creating an orthonormal basis is useful, since we work with a diagonal form rather than the triangular form of the GS method. However, it is not as easily extended with additional signal waveforms.
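The whole eigendecomposition route can be checked numerically. In this sketch (sampled waveforms and arbitrary illustration data), `numpy.linalg.eigh` supplies the unitary $\mathbf{Q}$ and real eigenvalues of the Hermitian Gram matrix:

```python
import numpy as np

dt = 0.01
t = np.arange(0.0, 1.0, dt)
# Three linearly independent waveforms (illustration data).
S = np.column_stack([np.ones_like(t), t, t ** 2])

G = S.conj().T @ S * dt                    # Gram matrix: Hermitian, pos. def.
lam, Q = np.linalg.eigh(G)                 # real eigenvalues, unitary Q

W = Q @ np.diag(lam ** -0.5) @ Q.conj().T  # W = Q Lambda^{-1/2} Q^H = G^{-1/2}
U = S @ W                                  # u(t) = s(t) W

print(np.round(U.conj().T @ U * dt, 6))    # identity: the set is orthonormal
print(np.allclose(W @ W, np.linalg.inv(G)))  # True: W is indeed G^{-1/2}
```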

For completeness, deal with the possibility that there are only $M < N$ linearly independent functions in the original basis set, so $\mathbf{G}$ is rank-deficient (rank $M$). Since some eigenvalues are zero, how can we form $\mathbf{\Lambda}^{-1/2}$? Here's how:

o Collect the non-zero eigenvalues and their eigenvectors together, so that

$$\mathbf{\Lambda} = \begin{bmatrix} \mathbf{\Lambda}_M & \mathbf{0} \\ \mathbf{0} & \mathbf{0} \end{bmatrix} \quad \text{and} \quad \mathbf{Q} = [\,\mathbf{Q}_M\ \ \mathbf{Z}\,]$$

where $\mathbf{Z}$ is an arbitrary basis of the null space of $\mathbf{G}$.

o Check these properties:

$$\mathbf{Q}_M^{\dagger}\,\mathbf{G}\,\mathbf{Q}_M = \mathbf{\Lambda}_M \quad \text{and} \quad \mathbf{G} = \mathbf{Q}_M\,\mathbf{\Lambda}_M\,\mathbf{Q}_M^{\dagger}$$

o Choose $\mathbf{W} = \mathbf{Q}_M\,\mathbf{\Lambda}_M^{-1/2}$. Then

$$\mathbf{W}^{\dagger}\,\mathbf{G}\,\mathbf{W} = \mathbf{\Lambda}_M^{-1/2}\,\mathbf{Q}_M^{\dagger}\,\mathbf{G}\,\mathbf{Q}_M\,\mathbf{\Lambda}_M^{-1/2} = \mathbf{\Lambda}_M^{-1/2}\,\mathbf{\Lambda}_M\,\mathbf{\Lambda}_M^{-1/2} = \mathbf{I}$$

So the method still works, as long as we confine it to the non-zero eigenvalues; $\mathbf{u}(t) = \mathbf{s}(t)\,\mathbf{W}$ then contains $M$ orthonormal functions.
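The rank-deficient recipe can be sketched the same way (illustration data; the eigenvalue threshold for "zero" is an arbitrary numerical choice):

```python
import numpy as np

dt = 0.01
t = np.arange(0.0, 1.0, dt)
# N = 3 waveforms, but the third is 1*s1 + 2*s2, so the rank is M = 2.
S = np.column_stack([np.ones_like(t), t, 1.0 + 2.0 * t])

G = S.conj().T @ S * dt
lam, Q = np.linalg.eigh(G)
keep = lam > 1e-10                     # indices of the non-zero eigenvalues
QM, lamM = Q[:, keep], lam[keep]       # Q_M and Lambda_M

W = QM @ np.diag(lamM ** -0.5)         # N x M matrix Q_M Lambda_M^{-1/2}
U = S @ W                              # M orthonormal functions

print(U.shape[1])                      # 2
print(np.round(U.conj().T @ U * dt, 6))  # 2 x 2 identity
```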
