
CHAPTER-3

A = B^-1 + C D^-1 C^H

By the matrix inversion lemma,

A^-1 = (B^-1 + C D^-1 C^H)^-1
A^-1 = (B^-1)^-1 - (B^-1)^-1 C [(D^-1)^-1 + C^H (B^-1)^-1 C]^-1 C^H (B^-1)^-1
A^-1 = B - B C [D + C^H B C]^-1 C^H B        ... ①
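As a quick numerical check (not part of the derivation itself), identity ① can be verified on random matrices. The sketch below is illustrative only; the matrix sizes and the use of NumPy are assumptions.

import numpy as np

rng = np.random.default_rng(0)
n, m = 4, 2                                   # assumed sizes for the check

# Random Hermitian positive-definite B and D, and a random C.
X = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = X @ X.conj().T + n * np.eye(n)
Y = rng.standard_normal((m, m)) + 1j * rng.standard_normal((m, m))
D = Y @ Y.conj().T + m * np.eye(m)
C = rng.standard_normal((n, m)) + 1j * rng.standard_normal((n, m))

# A = B^-1 + C D^-1 C^H
A = np.linalg.inv(B) + C @ np.linalg.inv(D) @ C.conj().T

# Equation (1): A^-1 = B - B C [D + C^H B C]^-1 C^H B
A_inv_lemma = B - B @ C @ np.linalg.inv(D + C.conj().T @ B @ C) @ C.conj().T @ B

print(np.allclose(np.linalg.inv(A), A_inv_lemma))   # expected: True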
From the recursion for updating the correlation matrix of the tap inputs,

Φ(n) = λ Φ(n-1) + u(n) u^H(n)        ... ②

Comparing A = B^-1 + C D^-1 C^H with equation ②,

we get A = Φ(n), B^-1 = λ Φ(n-1), C = u(n), D = 1.

Now substitute the values of A, B, C, D into equation ①:

Φ^-1(n) = λ^-1 Φ^-1(n-1) - [λ^-2 Φ^-1(n-1) u(n) u^H(n) Φ^-1(n-1)] / [1 + λ^-1 u^H(n) Φ^-1(n-1) u(n)]

Let us define P(n) = Φ^-1(n); then

P(n) = λ^-1 P(n-1) - [λ^-1 P(n-1) u(n) λ^-1 u^H(n) P(n-1)] / [1 + λ^-1 u^H(n) P(n-1) u(n)]

Let

k(n) = [λ^-1 P(n-1) u(n)] / [1 + λ^-1 u^H(n) P(n-1) u(n)]

so that

P(n) = λ^-1 P(n-1) - λ^-1 k(n) u^H(n) P(n-1)
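To see that this Riccati-type update really produces Φ^-1(n), one step can be checked against a direct inverse. This is only a sketch; the filter length, the forgetting factor value, and the NumPy implementation (real-valued data) are assumptions for illustration.

import numpy as np

rng = np.random.default_rng(1)
M, lam = 4, 0.98                              # assumed filter length and forgetting factor

# Previous inverse correlation matrix P(n-1) = Phi^-1(n-1), from an assumed positive-definite Phi(n-1).
X = rng.standard_normal((M, M))
Phi_prev = X @ X.T + M * np.eye(M)
P_prev = np.linalg.inv(Phi_prev)
u = rng.standard_normal((M, 1))               # current tap-input vector u(n)

# Gain vector k(n) = lambda^-1 P(n-1) u(n) / (1 + lambda^-1 u^H(n) P(n-1) u(n))
k = (P_prev @ u) / (lam + u.T @ P_prev @ u)

# Riccati update: P(n) = lambda^-1 P(n-1) - lambda^-1 k(n) u^H(n) P(n-1)
P = (P_prev - k @ (u.T @ P_prev)) / lam

# Compare with the direct inverse of Phi(n) = lambda Phi(n-1) + u(n) u^H(n)
print(np.allclose(P, np.linalg.inv(lam * Phi_prev + u @ u.T)))   # expected: True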
Now rearrange the equation for k(n):

k(n) = [λ^-1 P(n-1) u(n)] / [1 + λ^-1 u^H(n) P(n-1) u(n)]
k(n) = λ^-1 P(n-1) u(n) - λ^-1 k(n) u^H(n) P(n-1) u(n)
k(n) = [λ^-1 P(n-1) - λ^-1 k(n) u^H(n) P(n-1)] u(n)

The bracketed factor is exactly P(n) from the update above, so

k(n) = P(n) u(n)
k(n) = Φ^-1(n) u(n)

The gain vector k(n) is therefore the tap input vector u(n) transformed by the inverse of the correlation matrix Φ(n).
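This interpretation can be confirmed numerically: once P(n) has been updated, multiplying it by u(n) reproduces the gain vector obtained from the quotient formula. A minimal sketch, with an assumed filter length and forgetting factor:

import numpy as np

rng = np.random.default_rng(2)
M, lam = 4, 0.98                                 # assumed filter length and forgetting factor

X = rng.standard_normal((M, M))
P_prev = np.linalg.inv(X @ X.T + M * np.eye(M))  # P(n-1) = Phi^-1(n-1)
u = rng.standard_normal((M, 1))                  # tap-input vector u(n)

k = (P_prev @ u) / (lam + u.T @ P_prev @ u)      # gain vector from the quotient formula
P = (P_prev - k @ (u.T @ P_prev)) / lam          # Riccati update for P(n)

print(np.allclose(k, P @ u))                     # k(n) = P(n) u(n) = Phi^-1(n) u(n): True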
The normal equation for recursive least squares gives the tap-weight vector at iteration n:

ŵ(n) = Φ^-1(n) z(n)
ŵ(n) = P(n) z(n)

where z(n) is the cross-correlation vector between the tap inputs and the desired response:

z(n) = λ z(n-1) + u(n) d*(n)

Substituting this recursion,

ŵ(n) = P(n) z(n)
ŵ(n) = P(n) [λ z(n-1) + u(n) d*(n)]
ŵ(n) = λ P(n) z(n-1) + P(n) u(n) d*(n)

Now substitute the expression for P(n) into the first term:

ŵ(n) = λ [λ^-1 P(n-1) - λ^-1 k(n) u^H(n) P(n-1)] z(n-1) + P(n) u(n) d*(n)
ŵ(n) = P(n-1) z(n-1) - k(n) u^H(n) P(n-1) z(n-1) + P(n) u(n) d*(n)

We know that ŵ(n) = P(n) z(n), so ŵ(n-1) = P(n-1) z(n-1). Hence,

ŵ(n) = ŵ(n-1) - k(n) u^H(n) ŵ(n-1) + P(n) u(n) d*(n)
ŵ(n) = ŵ(n-1) - k(n) u^H(n) ŵ(n-1) + k(n) d*(n)        (since P(n) u(n) = k(n))
ŵ(n) = ŵ(n-1) + k(n) [d*(n) - u^H(n) ŵ(n-1)]
ŵ(n) = ŵ(n-1) + k(n) e*(n)

where e(n) = d(n) - ŵ^H(n-1) u(n) is the a priori estimation error.
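Putting the pieces together, each RLS iteration computes the gain k(n), the a priori error e(n), the weight update ŵ(n), and the Riccati update of P(n). The sketch below runs these steps on a synthetic system-identification example; the filter length, forgetting factor, regularized initialization P(0) = δ^-1 I, and the test signal are all assumptions made for illustration, not part of the notes above.

import numpy as np

rng = np.random.default_rng(3)
M, lam, delta = 4, 0.99, 1e-2                     # assumed filter length, forgetting factor, init constant
w_true = rng.standard_normal(M)                   # unknown system to identify (assumed test setup)

w = np.zeros(M)                                   # w_hat(0) = 0
P = np.eye(M) / delta                             # P(0) = delta^-1 I
x = rng.standard_normal(2000)                     # input sequence

for n in range(M, len(x)):
    u = x[n - M + 1:n + 1][::-1]                  # tap-input vector u(n) = [x(n), ..., x(n-M+1)]
    d = w_true @ u + 0.01 * rng.standard_normal() # desired response (assumed noisy linear model)

    k = (P @ u) / (lam + u @ P @ u)               # gain vector k(n)
    e = d - w @ u                                 # a priori estimation error e(n)
    w = w + k * e                                 # w_hat(n) = w_hat(n-1) + k(n) e*(n)  (real-valued data)
    P = (P - np.outer(k, u @ P)) / lam            # Riccati update for P(n)

print(np.round(w - w_true, 3))                    # expected: close to zero after convergence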
