
An Introduction to

Turbo Equalization
Ian Marsland
Dept. of Systems & Computer Engineering
Carleton University


BCWS Seminar Series
March 12, 2001
Outline

System Model
Intersymbol Interference (ISI) Channels
Some Traditional Equalizers
    Linear Equalizer (LE)
    Decision Feedback Equalizer (DFE)
    Maximum Likelihood (ML) Receivers
Turbo Equalizers
    Interference Cancellers
    Soft Decision Interference Cancellers
    ML-Based Turbo Equalizers
System Model

First consider uncoded transmission over an ISI channel.

Block diagram: d_n -> [Symbol Mapper] -> v_n -> [ISI Channel] -> r_n
-> [Equalizer] -> z_n -> [Decision Device] -> d̂_n

The Symbol Mapper converts the digital data to a symbol in an
M-ary signal constellation.
Intersymbol Interference (ISI) Channel

    r_n = \sum_{l=0}^{L} f_l v_{n-l} + w_n,     with  \sum_{l=0}^{L} |f_l|^2 = 1

where
    r_n = received sample
    v_n = transmitted symbol
    w_n = additive white Gaussian noise sample
    f_l = channel tap, assumed to be known
    L   = channel ISI length
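The channel model above is easy to simulate directly; a minimal sketch (the tap values are taken from the Proakis Channel B example used later in the talk, and the helper name is just illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Proakis' Channel B taps (used later in the talk); note sum |f_l|^2 ~= 1
f = np.array([0.407, 0.815, 0.407])
L = len(f) - 1  # channel ISI length

def isi_channel(v, f, N0, rng):
    """r_n = sum_{l=0}^{L} f_l v_{n-l} + w_n with complex AWGN of variance N0."""
    r = np.convolve(v, f)[:len(v)]  # causal ISI; keep one sample per symbol
    w = np.sqrt(N0 / 2) * (rng.standard_normal(len(v))
                           + 1j * rng.standard_normal(len(v)))
    return r + w

# unit-energy QPSK symbols
bits = rng.integers(0, 2, size=(1000, 2))
v = ((1 - 2 * bits[:, 0]) + 1j * (1 - 2 * bits[:, 1])) / np.sqrt(2)
r = isi_channel(v, f, N0=0.1, rng=rng)
```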
Equalizer

Attempts to reduce the effects of the ISI:
    concentrates the energy transmitted for v_n
    reduces the energy from the other transmitted symbols
Produces a decision variable, z_n. Ideally,

    z_n = v_n + w'_n

where w'_n is additive white Gaussian noise with the same variance
as w_n. If equalization is effective, the decision device can
determine v_n with the same reliability as if the channel did not
introduce any ISI.
Linear Equalizer (LE)

Use a tapped delay line to estimate the transmitted symbols:

    z_n = \sum_{k=-K_1}^{K_2} p_k r_{n-k}

    filter length K = K_1 + K_2 + 1 taps
    { p_k } = tap weights, chosen to minimize the mean-square error (MSE)

        \epsilon_n = E[ | v_n - z_n |^2 ]

    so that z_n ~ v_n + noise.
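The MSE-minimizing tap weights can be computed by solving the Wiener equations built from the channel convolution matrix; a sketch under simplifying assumptions (real taps, unit-energy symbols; the function name and the closed-form MMSE expression are illustrative, not from the talk):

```python
import numpy as np

def mmse_le(f, N0, K, delay):
    """Tap weights of a K-tap linear MMSE equalizer for channel taps f.
    Row i of H expresses r_{n-i} in terms of the symbols v_{n-i-l}, so
    column j corresponds to v_{n-j}; 'delay' picks which symbol to estimate."""
    L = len(f) - 1
    H = np.zeros((K, K + L))
    for i in range(K):
        H[i, i:i + L + 1] = f
    R = H @ H.T + N0 * np.eye(K)      # E[r r^T] for unit-energy symbols
    p = np.linalg.solve(R, H[:, delay])
    mmse = 1.0 - H[:, delay] @ p      # resulting minimum mean-square error
    return p, mmse

f = np.array([0.407, 0.815, 0.407])   # Proakis' Channel B
p_hi, mmse_hi = mmse_le(f, N0=0.1, K=11, delay=5)
p_lo, mmse_lo = mmse_le(f, N0=0.001, K=11, delay=5)
```

As expected, lowering the noise level reduces the achievable MSE, though a linear equalizer still struggles on channels with near-spectral nulls like Channel B.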
Decision Feedback Equalizer (DFE)

    z_n = \sum_{k=0}^{K} p_k r_{n-k} - \sum_{k=1}^{L} q_k \hat{v}_{n-k}

Block diagram: r_n -> [Feedforward Filter {p_k}] -> (+) -> z_n
-> [Decision Device] -> v̂_n, with v̂_n fed back through the
[Feedback Filter {q_k}] and subtracted at (+).

Taps chosen to minimize the MSE.
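The decision-feedback idea can be sketched with a deliberately trivial feedforward section (just a scale by 1/f_0), so that only the feedback cancellation of past symbols is at work. The tap values here are illustrative (a causal channel with a dominant first tap), not the talk's channel:

```python
import numpy as np

rng = np.random.default_rng(1)
f = np.array([0.815, 0.407, 0.407])  # illustrative causal taps, f_0 dominant
L = len(f) - 1

# BPSK through the ISI channel with a little noise
v = rng.choice([-1.0, 1.0], size=2000)
r = np.convolve(v, f)[:len(v)] + 0.05 * rng.standard_normal(len(v))

# Minimal DFE: rebuild the postcursor ISI from past hard decisions,
# subtract it, scale by 1/f_0, then decide.  Here q_k = f_k.
v_hat = np.zeros(len(v))
for n in range(len(v)):
    isi = sum(f[k] * v_hat[n - k] for k in range(1, L + 1) if n - k >= 0)
    z = (r[n] - isi) / f[0]
    v_hat[n] = 1.0 if z >= 0 else -1.0

ber = np.mean(v_hat != v)
```

When the past decisions are correct the ISI is cancelled exactly; a wrong decision, however, is fed back and can trigger further errors (the error propagation problem revisited later in the talk).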
Maximum Likelihood (ML) Receiver

Exploit the finite state machine nature of the channel.

State definition:

    s_n = ( v_{n-L}, v_{n-L+1}, ..., v_{n-1} )

Branch metrics:

    f( r_n | \tilde{v}_{n-L}, \tilde{v}_{n-L+1}, ..., \tilde{v}_{n-1}, \tilde{v}_n )
        = \frac{1}{\pi N_0} \exp\!\left( -\frac{1}{N_0}
          \Big| r_n - \sum_{l=0}^{L} f_l \tilde{v}_{n-l} \Big|^2 \right)

Can use the Viterbi algorithm for MLSE equalization; requires M^L states.
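For a real-valued BPSK channel the Gaussian branch metric reduces to a squared Euclidean distance, and the Viterbi search over the M^L states can be sketched as follows (a didactic dictionary-based trellis with a known starting state, not an efficient implementation):

```python
import numpy as np
from itertools import product

def mlse_viterbi(r, f, s0, symbols=(-1.0, 1.0)):
    """MLSE over an ISI channel via the Viterbi algorithm.
    State s = (v_{n-1}, ..., v_{n-L}); branch metric is
    |r_n - sum_{l=0}^{L} f_l v_{n-l}|^2, accumulated and minimized."""
    L = len(f) - 1
    states = list(product(symbols, repeat=L))   # M^L states
    cost = {s: (0.0 if s == s0 else np.inf) for s in states}
    path = {s: [] for s in states}
    for rn in r:
        new_cost, new_path = {}, {}
        for s in states:
            for vn in symbols:
                m = (rn - f[0] * vn
                     - sum(f[l] * s[l - 1] for l in range(1, L + 1))) ** 2
                ns = (vn,) + s[:-1]             # shift v_n into the state
                c = cost[s] + m
                if ns not in new_cost or c < new_cost[ns]:
                    new_cost[ns], new_path[ns] = c, path[s] + [vn]
        cost, path = new_cost, new_path
    return path[min(cost, key=cost.get)]

rng = np.random.default_rng(2)
f = np.array([0.407, 0.815, 0.407])        # Proakis' Channel B, BPSK here
L = len(f) - 1
v = rng.choice([-1.0, 1.0], size=60)
r = np.convolve(v, f, mode='valid')        # r_n for n = L, ..., N-1
s0 = tuple(v[L - 1::-1])                   # known starting state (v_1, v_0)
v_det = mlse_viterbi(r, f, s0)
```

In the noiseless case the true sequence is the unique zero-cost path through the trellis, so the detector recovers it exactly.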
Example: Proakis' Channel B

Transmitted symbols are uncoded QPSK.
Channel taps (from figure): f_0 = 0.407, f_1 = 0.815, f_2 = 0.407; L = 2.
Example: Proakis' Channel B
Equalization with Error Control Coding

Transmitter: a_n -> [Convolutional Encoder] -> c_n -> [Interleaver] -> d_n
-> [Symbol Mapper] -> v_n

    d_n = c_{IL[n]}        v_n = SM[ d_n ]

The random interleaver rearranges the order of the code symbols, { c_n }.
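The transmitter chain can be sketched end to end; to keep the sketch short it uses the tiny (7,5)_8 code for illustration rather than the 16-state (23,35)_8 code from the talk's example:

```python
import numpy as np

rng = np.random.default_rng(3)

def conv_encode(bits, g=(0b111, 0b101)):
    """Rate-1/2 feedforward convolutional encoder ((7,5)_8 here)."""
    K = max(g).bit_length()          # constraint length
    state, out = 0, []
    for b in bits:
        state = ((state << 1) | b) & ((1 << K) - 1)
        for gi in g:                 # one output bit per generator
            out.append(bin(state & gi).count("1") % 2)
    return out

a = list(rng.integers(0, 2, size=500))   # data bits a_n
c = np.array(conv_encode(a))             # code bits c_n
perm = rng.permutation(len(c))           # random interleaver IL
d = c[perm]                              # d_n = c_{IL[n]}
# QPSK symbol mapper SM: two code bits per unit-energy symbol
v = ((1 - 2 * d[0::2]) + 1j * (1 - 2 * d[1::2])) / np.sqrt(2)
```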
Equalization with Error Control Coding
an optimal receiver requires that the ISI and the error
correcting code be decoded jointly.
in the absence of interleaving it is possible to
implement an optimal ML receiver.
decoding is based on a "super-trellis" structure
large number of states.
when interleaving is used, the optimal receiver
becomes far too complex to implement.
instead, we must use sub-optimal decoding methods
e.g., perform the equalization and the decoding
separately.
Equalization with Error Control Coding

Suboptimal Receiver: r_n -> [Equalizer] -> z_n -> [Deinterleaver] -> y_n
-> [Decoder] -> â_n

    z_n ~ v_n + w'_n = SM[ d_n ] + w'_n

    y_n = z_{IL^{-1}[n]} ~ SM[ c_n ] + w'_{IL^{-1}[n]}

Deinterleave the equalizer output and use it as the received samples
for the decoder. The LE and DFE are appropriate equalizers.
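The deinterleaving step is just the inverse permutation; a small sanity check:

```python
import numpy as np

rng = np.random.default_rng(4)
N = 8
perm = rng.permutation(N)   # interleaver IL: d_n = c_{IL[n]}
c = np.arange(N)            # stand-in code symbols
d = c[perm]                 # interleave
inv = np.argsort(perm)      # IL^{-1}
y = d[inv]                  # deinterleave: y_n = z_{IL^{-1}[n]}
```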
Equalization with Error Control Coding
if MLSE equalization is used, hard decisions are
produced by the equalizer, so .




hard-input (hard-decision) decoding must be done for
the convolutional code.
this leads to a 2 dB performance penalty over soft-
input (soft-decision) decoding.
MLSE
Equalizer
Deinterleaver Decoder
n
a

n
r
n
d

n
c

Equalization with Error Control Coding


better approach:
    use a soft-output equivalent of ML equalization
        e.g., use the SOVA or APP algorithms
    produce a posteriori probabilities (APP's)

        Pr{ d_n = d | r }

    for each n, calculate a group of M APP's
    deinterleave, keeping each group together
    use the deinterleaved probabilities as branch metrics for the decoder
Equalization with Error Control Coding
Example
    rate 1/2, 16-state convolutional code, generators (23, 35) in octal
    1000-symbol random interleaver
    QPSK modulation
    Proakis' Channel B
Example: Proakis' Channel B
Turbo Equalization (Finally)
separating the equalization and the decoding is suboptimal
    the structure imposed on the transmitted symbols by the error
    correcting code is not exploited by the equalizer.
turbo equalization is an iterative equalization/decoding
process.
involves using output from the decoder to re-
equalize the received data.
normally based on:
interference canceller, or
APP equalizer.
Interference Cancellation (IC)

Attempt to perfectly cancel the interference. Recall that

    r_n = \sum_{l=0}^{L} f_l v_{n-l} + w_n

so the energy for v_n is received in the L+1 samples r_n, ..., r_{n+L}.
E.g., for L = 1:

    r_n     = f_0 v_n     + f_1 v_{n-1} + w_n
    r_{n+1} = f_0 v_{n+1} + f_1 v_n     + w_{n+1}

Collect all the energy for v_n into a single sample:

    x_n = \sum_{l=0}^{L} f_l^* r_{n+l}

E.g., for L = 1:

    x_n = f_0^* r_n + f_1^* r_{n+1}
        = ( |f_0|^2 + |f_1|^2 ) v_n + f_0^* f_1 v_{n-1} + f_1^* f_0 v_{n+1}
          + f_0^* w_n + f_1^* w_{n+1}
Interference Cancellation (IC)

In general,

    x_n = v_n + \sum_{k=1}^{L} ( q_k v_{n+k} + q_k^* v_{n-k} ) + w''_n

where

    q_k = \sum_{l=0}^{L-k} f_{l+k}^* f_l

and

    w''_n = \sum_{l=0}^{L} f_l^* w_{n+l}

(using \sum_{l=0}^{L} |f_l|^2 = 1). So x_n is interfered by
{ v_{n-L}, ..., v_{n-1}, v_{n+1}, ..., v_{n+L} }.
Interference Cancellation (IC)
if we know all then we can remove the
ISI from x
n
let



is coloured Gaussian noise with variance N
0


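The matched-filter combining and the cancellation above can be checked numerically (noiseless, real BPSK symbols, so q_k^* = q_k and the interfering symbols are known exactly):

```python
import numpy as np

rng = np.random.default_rng(5)
f = np.array([0.407, 0.815, 0.407])   # Proakis' Channel B
L = len(f) - 1

# q_k = sum_l f_{l+k}^* f_l  (real taps here)
q = np.array([np.dot(f[k:], f[:len(f) - k]) for k in range(1, L + 1)])

v = rng.choice([-1.0, 1.0], size=200)
r = np.convolve(v, f)                  # noiseless received samples

# x_n = sum_l f_l^* r_{n+l} collects all of the energy for v_n
x = np.array([np.dot(f, r[n:n + L + 1]) for n in range(len(v))])

# cancel the ISI using the (here, known) interfering symbols
z = np.array([
    x[n]
    - sum(q[k - 1] * v[n + k] for k in range(1, L + 1) if n + k < len(v))
    - sum(q[k - 1] * v[n - k] for k in range(1, L + 1) if n - k >= 0)
    for n in range(len(v))
])
```

Away from the block edges, z_n collapses to v_n scaled by the channel energy, confirming the IC equations.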
Interference Cancellation (IC)
requires knowledge of {v
n-L
, , v
n-1
} and
{v
n+1
,,v
n+L
}
could use decision feedback for {v
n-L
, , v
n-1
}
but {v
n+1
,,v
n+L
} correspond to future decisions.
turbo equalization helps solve this problem.
IC-based Turbo Equalization
1. first equalize the received samples using a DFE or LE
2. then decode the convolutional code, giving
3. re-encode , then interleave and symbol map the
result, giving
4. then, cancel the interference in r
n
with an IC using
in place of v
n
.




5. use z
n
to re-decode the convolutional code.
6. repeat from step 3.

=
=

=
+
=
+
L
k
k n k
L
k
k n k
L
k
k n k n
v q v q r f z
1 1
*
0
*

Example: IC - Proakis B
Soft-Decision Feedback IC
a problem with this IC-based turbo equalizer is with
error propagation limiting performance.
an alternative is to use soft-decisions for interference
cancellation.
use a soft-output decoder for the convolutional code,
such as the APP algorithm
decoder computes code symbol APP's

generate soft symbols from APP's
} | Pr{ ] SM[
] IL[
1
0
r c c c v
n
M
c
n
=

=

=
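For BPSK the soft symbol collapses to the mean of the mapped constellation points under the APP's; a toy sketch (the APP values are made up for illustration, not the output of a real decoder):

```python
import numpy as np

# APP's Pr{ c_{IL[n]} = c | r } for the two BPSK code symbols c in {0, 1};
# illustrative numbers only.
p1 = np.array([0.99, 0.5, 0.02])       # Pr{ c = 1 | r }
SM = {0: +1.0, 1: -1.0}                # symbol mapper

# soft symbols: v̄_n = sum_c SM[c] Pr{ c | r }
v_bar = SM[0] * (1 - p1) + SM[1] * p1  # = 1 - 2*p1 for this mapping
```

A confident APP gives a soft symbol near a constellation point, while an uncertain one (Pr = 0.5) gives a soft symbol near zero, so unreliable decisions contribute little to the cancellation.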
Example: IC - Proakis B
ML-based Turbo Equalization
A better-performing turbo-equalization scheme can be found by using
the same principles as used for iteratively decoding serially
concatenated codes:
    use a soft-output APP equalizer and a soft-output APP decoder for
    the convolutional code
    the decoder computes the code symbol APP's
    pass extrinsic information between the equalizer and the decoder

ML-based Turbo Equalization

Extrinsic information:
    the output of the equalizer cannot depend directly on the input
    from the decoder
    the output of the decoder cannot depend directly on the input
    from the equalizer

Receiver: the APP Equalizer computes f( r | d_n = d ) for each symbol,
which is deinterleaved (IL^{-1}) into f( r | c_n = c ) for the APP
Decoder; the decoder's extrinsic probabilities Pr{ c_n = c } are
interleaved (IL) back into priors Pr{ d_n = d } for the equalizer,
and the decoder also delivers the data decisions â_n from
Pr{ d_n = d | r }.
Example: ML - Proakis B
Comparison of Schemes
Another Example: Proakis' Channel C

Channel taps (from figure): f_0 = 0.227, f_1 = 0.460, f_2 = 0.688,
f_3 = 0.460, f_4 = 0.227; L = 4.
Comparison of Schemes (Channel C)
Discussion
Turbo-equalization can significantly reduce the SNR
penalty caused by ISI
ML-based turbo-equalizer gives best performance
almost completely eliminates ISI after 5 iterations
Hard decision feedback IC-based TE has lowest
complexity
no soft-decision decoding
IC is very simple, with 3L+1 filter taps.
digital interleaver in receiver
Discussion
ML-based equalizer complexity grows exponentially with the ISI length
and the signal constellation size
    not suitable for spectrally efficient modulation
IC complexity does not depend on constellation size.
Performance of soft decision feedback IC could be
improved by adjusting tap weights used for each
symbol based on APP's from decoder.
Ongoing Work
Adaptive algorithms
Frequency selective fading
Higher-order modulation schemes
