R. Sunder,
Assistant Professor,
Department of Electronics and
Communication Engineering
INTRODUCTION
Increasing demand for information exchange is a characteristic
of modern civilisation.
Information must be transferred from source to destination such that the quality of the received information is as close as possible to the quality of the transmitted information.
Information may be machine generated (e.g., images, computer data) or human generated (e.g., speech).
Regardless of its source, information must be translated into a set of signals optimised for the channel over which we want to send it.
The source encoder eliminates redundancy in order to maximise the information transmission rate.
INTRODUCTION
To ensure secrecy, encryption is used.
To protect the data against perturbations introduced by the channel, which would lead to misinterpretation of the transmitted message at the receiving end, error control strategies are used: FEC (forward error correction), which uses error-correcting codes to correct errors at the receiving end, and ARQ (automatic repeat request).
The modulator generates a signal suitable for the transmission
channel.
INTRODUCTION
Coding theory tells us that greater protection, or coding gain, can be achieved by increasing the codeword length or the encoder memory.
However, the complexity of a typical decoding algorithm such as the Maximum Likelihood Decoding Algorithm (MLDA) increases exponentially with the encoder memory, and such algorithms become difficult to implement.
The increased error correction capability of long codes requires
a very high computational effort at the decoder.
NET RESULT: FIND NEW CODING SCHEMES WHICH COULD
REPLACE THE MLDA WITH SIMPLER DECODING STRATEGIES.
INTRODUCTION
The new strategies combine simpler codes in such a way that each code can be decoded separately with a less complex decoder.
By using soft-input, soft-output (SISO) decoders, information can be passed from one decoder to the next in an iterative fashion.
This is a “divide-and-conquer” strategy which, in an iterative process, can approach the performance of the MLDA.
Encoding techniques used in conjunction with iterative decoding combine different codes in such a way that each of them can be decoded independently.
The common feature is that decoding can be done sequentially, using one decoder at a time: after one code is decoded, another decoder is used for the next code, and so on (see the sketch below).
TOPICS
Historical evolution of the concept on which turbo codes are
built.
Summary of the differences between turbo codes and
convolutional codes
Encoding process
Component codes for implementing turbo codes
Different turbo encoder structures based on convolutional
codes.
Block turbo codes
Turbo trellis coded modulation
Interleaver design
Introduction to decoding
TOPICS
A review of likelihoods
Principles of iterative decoding
Two-dimensional single parity code example
Turbo decoding
Performance of turbo codes
Decoder delay
A Need for Better Codes
Designing a channel code is always a tradeoff between energy
efficiency and bandwidth efficiency.
Lower-rate codes correct more errors, so the communication system can operate with a lower transmit power, transmit over longer distances, tolerate more interference, use smaller antennas, and transmit at a higher data rate.
However, lower-rate codes have a large overhead and are hence heavier on bandwidth consumption.
Moreover, decoding complexity grows exponentially with code length, so long (low-rate) codes place high computational requirements on conventional decoders.
Features of turbo codes:
Parallel code concatenation
Nonuniform interleaving
If the output of the outer decoder were reapplied to the inner decoder, the inner decoder would detect that some errors remained, since the columns would not be codewords of the inner code.
Iterative decoder: reapply the decoded word not just to the inner code but also to the outer, and repeat as many times as necessary.
However, it is clear that this would be in danger of simply generating further errors. One further ingredient is required for the iterative decoder.
Soft-In, Soft-Out (SISO) Decoding
The performance of a decoder is significantly enhanced if, in addition
to the ‘hard decision’ made by the demodulator on the current symbol,
some additional ‘soft information’ on the reliability of that decision is
passed to the decoder.
Hypotheses:
$$H_1 : d_k = +1, \qquad H_2 : d_k = -1$$
MAP likelihood ratio test:
$$p(x_k \mid d_k = +1)\, P(d_k = +1) \;\underset{H_2}{\overset{H_1}{\gtrless}}\; p(x_k \mid d_k = -1)\, P(d_k = -1)$$
Equivalently,
$$\frac{p(x_k \mid d_k = +1)}{p(x_k \mid d_k = -1)} \;\underset{H_2}{\overset{H_1}{\gtrless}}\; \frac{P(d_k = -1)}{P(d_k = +1)}
\qquad\text{and}\qquad
\frac{p(x_k \mid d_k = +1)\, P(d_k = +1)}{p(x_k \mid d_k = -1)\, P(d_k = -1)} \;\underset{H_2}{\overset{H_1}{\gtrless}}\; 1$$
Log-Likelihood Ratio (LLR):
$$L(d \mid x) = \log\frac{P(d = +1 \mid x)}{P(d = -1 \mid x)}
= \log\frac{p(x \mid d = +1)\, P(d = +1)}{p(x \mid d = -1)\, P(d = -1)}$$
$$= \log\frac{p(x \mid d = +1)}{p(x \mid d = -1)} + \log\frac{P(d = +1)}{P(d = -1)}
= L(x \mid d) + L(d)$$
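As a concrete instance of the channel term: for bipolar ($\pm 1$) signalling over an AWGN channel with noise variance $\sigma^2$, the Gaussian likelihoods reduce $L(x \mid d)$ to $2x/\sigma^2$, so the decomposition above can be evaluated directly. A minimal Python sketch:

```python
import math

def channel_llr(x, sigma2=1.0):
    """Channel LLR L(x|d) for bipolar (+1/-1) signalling in AWGN.

    log[p(x | d=+1) / p(x | d=-1)] with Gaussian densities of
    variance sigma2 reduces to 2*x/sigma2.
    """
    return 2.0 * x / sigma2

def posterior_llr(x, prior_llr=0.0, sigma2=1.0):
    # L(d | x) = L(x | d) + L(d), as derived above.
    return channel_llr(x, sigma2) + prior_llr

# Example: a received value of +0.5 with equiprobable bits (L(d) = 0)
# gives L(d | x) = 1.0, a weak decision in favour of d = +1.
print(posterior_llr(0.5))
```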
Log-Likelihood Ratio (LLR):
$$L(d \mid x) = L(x \mid d) + L(d)$$
$$L'(\hat d) = L_c(x) + L(d)$$
Soft LLR output for a systematic code:
$$L(\hat d) = L'(\hat d) + L_e(\hat d)$$
where $L'(\hat d)$ is the LLR of the data at the demodulator output and $L_e(\hat d)$ is the extrinsic LLR, the knowledge gained from the decoding process. Hence
$$L(\hat d) = L_c(x) + L(d) + L_e(\hat d)$$
with $L(d)$ the a priori value.
[Figure: recursive systematic convolutional (RSC) encoder with shift-register stages $a_k$, $a_{k-1}$, $a_{k-2}$, modulo-2 adders, and parity output $\{v_k\}$.]
$$a_k = d_k + \sum_{i=1}^{L-1} g_i'\, a_{k-i} \pmod 2, \qquad
g_i' = \begin{cases} g_{1i} & \text{if } u_k = d_k \\ g_{2i} & \text{if } v_k = d_k \end{cases}$$
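A minimal Python sketch of this recursion, assuming a memory of $L-1 = 2$ as in the figure; the tap vectors g1 and g2 are illustrative choices, since the slide does not fix them:

```python
def rsc_encode(d_bits, g1=(1, 1, 1), g2=(1, 0, 1)):
    """Sketch of a rate-1/2 recursive systematic convolutional (RSC) encoder.

    Implements a_k = d_k + sum_i g1[i]*a_{k-i} (mod 2) for the feedback
    register, with systematic output u_k = d_k and parity
    v_k = sum_i g2[i]*a_{k-i} (mod 2). Tap vectors g1, g2 are illustrative.
    """
    mem = [0] * (len(g1) - 1)           # register contents a_{k-1}, a_{k-2}, ...
    u, v = [], []
    for d in d_bits:
        # Feedback: a_k depends on the input and the past register contents.
        a = (d + sum(g * m for g, m in zip(g1[1:], mem))) % 2
        u.append(d)                     # systematic output u_k = d_k
        state = [a] + mem
        v.append(sum(g * s for g, s in zip(g2, state)) % 2)  # parity v_k
        mem = state[:-1]                # shift the register
    return u, v

print(rsc_encode([1, 0, 0]))
```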
Trellis for NSC & RSC
[Figure: trellis sections for the NSC (left) and RSC (right) codes, each with states a = 00, b = 01, c = 10, d = 11 and branches labelled with the output pairs 00, 11, 10, 01.]
Concatenation of RSC Codes
[Figure: parallel concatenation of two RSC encoders. The data $\{d_k\}$ pass through as the systematic output $\{u_k\}$ and into the first RSC encoder, which produces the parity $\{v_{1k}\}$; an interleaver permutes $\{d_k\}$ before the second RSC encoder, which produces the parity $\{v_{2k}\}$.]
Input sequences such as { 0 0 …. 0 1 1 1 0 0 ….. 0 0 } and { 0 0 …. 0 0 1 0 0 1 0 … 0 0 } produce low-weight codewords in the component coders.
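A sketch of the parallel concatenation in the figure, reusing rsc_encode from the previous sketch; the pseudorandom interleaver permutation is an illustrative assumption:

```python
import random

def turbo_encode(d_bits, interleaver):
    """Parallel concatenation: the same data feed two RSC encoders,
    the second one through an interleaver (rsc_encode is the sketch above)."""
    u, v1 = rsc_encode(d_bits)                            # systematic + parity 1
    _, v2 = rsc_encode([d_bits[i] for i in interleaver])  # parity 2, permuted data
    return u, v1, v2

# Illustrative pseudorandom interleaver for a block of 8 bits.
pi = list(range(8))
random.Random(0).shuffle(pi)
print(turbo_encode([0, 0, 0, 1, 1, 1, 0, 0], pi))
```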
Feedback Decoder
APP joint probability:
$$\lambda_k^{i,m} = P\{ d_k = i,\, S_k = m \mid R_1^N \}$$
where $S_k$ is the state at time $k$ and $R_1^N$ is the received sequence from time 1 to $N$.
APP:
$$P\{ d_k = i \mid R_1^N \} = \sum_m \lambda_k^{i,m}; \qquad i = 0, 1 \text{ for binary}$$
Likelihood ratio:
$$\Lambda(d_k) = \frac{\sum_m \lambda_k^{1,m}}{\sum_m \lambda_k^{0,m}}$$
Log-likelihood ratio:
$$L(d_k) = \log\frac{\sum_m \lambda_k^{1,m}}{\sum_m \lambda_k^{0,m}}$$
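A small numerical sketch of these definitions for one time index $k$, with illustrative joint probabilities over four trellis states:

```python
import numpy as np

# lam[i, m] holds the joint probability lambda_k^{i,m} = P{d_k=i, S_k=m | R_1^N}
# for one time index k: rows i = 0, 1 and four trellis states m.
lam = np.array([[0.02, 0.10, 0.05, 0.03],    # i = 0 (illustrative numbers)
                [0.30, 0.25, 0.15, 0.10]])   # i = 1

app = lam.sum(axis=1)           # P{d_k = i | R_1^N}, summed over the states
L_dk = np.log(app[1] / app[0])  # LLR as defined above
print(app, L_dk)                # positive LLR -> decide d_k = 1
```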
Feedback Decoder
MAP rule:
$$\hat d_k = 1 \text{ if } L(d_k) > 0; \qquad \hat d_k = 0 \text{ if } L(d_k) < 0$$
$$L(\hat d_k) = L_c(x_k) + L(d_k) + L_e(\hat d_k)$$
$$L_1(d_k) = L_c(x_k) + L_{e1}(d_k)$$
$$L_2(d_k) = f\{ L_1(d_n) \}_{n \neq k} + L_{e2}(d_k)$$
Feedback Decoder
[Figure: iterative feedback decoder. The received parity stream $y_k$ is demultiplexed into $y_{1k}$ and $y_{2k}$. DECODER 1 takes the systematic samples $x_k$ and the parity $y_{1k}$ and produces $L_1(d_k)$; after interleaving, DECODER 2 takes $y_{2k}$ and produces $L_2(d_k)$ together with the extrinsic term $L_{e2}(d_k)$, which is de-interleaved and fed back to DECODER 1; the de-interleaved $L_2(d_k)$ yields the decision $\hat d_k$.]
Modified MAP vs. SOVA
SOVA
Viterbi algorithm acting on soft inputs over the forward path of the trellis for a block of bits.
Add branch metric (BM) to state metric (SM), compare, select → maximum-likelihood path.
Modified MAP
Viterbi-like algorithm acting on soft inputs over both the forward and reverse paths of the trellis for a block of bits.
Multiply BM and SM, sum in both directions → best overall statistic (see the sketch below).
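The structural difference can be made concrete on a toy trellis with illustrative branch metrics; all values below are made up, and neither recursion is a full SOVA or MAP decoder:

```python
import numpy as np

# gamma[k, m, mp]: branch metric from state m at time k to state mp at k+1.
K, S = 4, 2
rng = np.random.default_rng(1)
gamma = rng.uniform(0.1, 1.0, size=(K, S, S))   # illustrative positive metrics

# SOVA-style: forward pass only; ADD the (log) branch metric to the state
# metric, COMPARE the candidates, SELECT the survivor (the ML path).
sm = np.zeros(S)
for k in range(K):
    sm = np.max(sm[:, None] + np.log(gamma[k]), axis=0)

# Modified-MAP-style: MULTIPLY branch and state metrics and SUM over paths,
# in both the forward and the reverse direction through the trellis.
alpha = np.ones(S)
beta = np.ones(S)
for k in range(K):
    alpha = alpha @ gamma[k]            # forward: multiply and sum
for k in reversed(range(K)):
    beta = gamma[k] @ beta              # backward: multiply and sum

print(sm, alpha, beta)
```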
MAP Decoding Example
[Figure: component encoder and trellis section for the example: input $\{u_k\}$, parity $\{v_k\}$; states a = 00, c = 01, d = 11, with branch labels 00, 11, 01, 10.]
MAP Decoding Example
$$d = \{1, 0, 0\}$$
$$u = \{1, 0, 0\}, \qquad x = \{1.0,\; 0.5,\; -0.6\}$$
$$v = \{1, 0, 1\}, \qquad y = \{0.8,\; 0.2,\; 1.2\}$$
A priori probabilities: $\pi^1 = \pi^0 = 0.5$.
Branch metric:
$$\delta_k^{i,m} = P\{ R_k \mid d_k = i,\, S_k = m \} \cdot P\{ S_k = m \mid d_k = i \} \cdot P\{ d_k = i \}$$
$$P\{ S_k = m \mid d_k = i \} = 1/2^L = 1/4; \qquad P\{ d_k = i \} = 1/2$$
$$\delta_k^{i,m} = P\{ x_k \mid d_k = i,\, S_k = m \} \cdot P\{ y_k \mid d_k = i,\, S_k = m \} \cdot \frac{\pi_k^i}{2^L}$$
MAP Decoding Example
k i,m
= P { xk / dk = i, Sk = m } . P { yk / dk = i, Sk = m } . { ki / 2L }
k i,m
= { ki / 2L } (1/2 ) exp { - (xk – uki )2 /(2 2 ) }dxk
k i,m
= { Ak ki } exp { (xk . uki )+ (yk . Vki,m )/ 2 }
Assuming Ak = 1 2 =1 ;
k i,m
= 0.5 exp { xk . uki + yk . vki,m }
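Evaluating this branch metric for the slide's received values gives concrete numbers. The bipolar mapping (bit 1 → +1, bit 0 → -1) and the tabulation of both possible parity symbols per step are assumptions, since the trellis connectivity is not reproduced here:

```python
import math

x = [1.0, 0.5, -0.6]   # received systematic samples from the slide example
y = [0.8, 0.2, 1.2]    # received parity samples from the slide example

def branch_metric(k, i, v):
    """delta_k^{i,m} = 0.5 * exp(x_k*u_k^i + y_k*v_k^{i,m}) with sigma^2 = 1.

    Bipolar mapping (an assumption): bit 1 -> +1, bit 0 -> -1. The parity
    symbol v depends on the trellis state m; both values are tabulated here.
    """
    u = +1 if i == 1 else -1
    return 0.5 * math.exp(x[k] * u + y[k] * v)

for k in range(3):
    for i in (0, 1):
        for v in (-1, +1):
            print(f"k={k}, i={i}, v={v:+d}: delta = {branch_metric(k, i, v):.3f}")
```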
Subsequent steps
Calculate the branch metric:
$$\delta_k^{i,m} = 0.5\, \exp\{ x_k u_k^i + y_k v_k^{i,m} \}$$
LLR:
$$L(\hat d_k) = L(d_k) + \{ 2 x_k \} + \log[\lambda_k^e]$$
(the a priori term, the channel measurement term, and the extrinsic term, respectively).
Iterative decoding
For the second iteration:
$$\delta_k^{i,m} = \lambda_k^{e,i}\, \exp\{ x_k u_k^i + y_k v_k^{i,m} \}$$
The extrinsic probability $\lambda_k^{e,i}$ from the first iteration replaces the fixed a priori value of 0.5.
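Continuing the previous sketch, the only change in the second iteration is that the fixed prior 0.5 is replaced by a per-bit extrinsic probability; lam_e below is a hypothetical table produced by the first iteration:

```python
def branch_metric_iter2(k, i, v, lam_e):
    """Second-iteration metric: lam_e[k][i] is the extrinsic probability
    for bit value i at time k, replacing the fixed a priori 0.5."""
    u = +1 if i == 1 else -1
    return lam_e[k][i] * math.exp(x[k] * u + y[k] * v)  # x, y, math as above
```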