
Error Control Coding

Lecture Notes

Prof. Dr.-Ing. Volker Kühn


University of Rostock
Institute of Communications Engineering
Error Control Coding
Table of Contents

Basics of Digital Communications – A Summary

Two Lessons of Information Theory

Forward Error Correcting Codes – The Classics

Modern Error Correcting Codes → Concatenated Codes

History of Concatenated Codes
• Conventional codes like convolutional and simple block codes
cannot approach Shannon’s capacity limit
• Concatenated codes have potential to reach capacity limit with
moderate decoding complexity
1963 Gallager invented LDPC codes
[R. G. Gallager: “Low-Density Parity-Check Codes”, The M.I.T. Press, Cambridge, MA, 1963]
1966 Forney’s work on concatenated codes
[G. Forney: “Concatenated Codes”, The M.I.T. Press, Cambridge, MA]
• Received limited attention due to a lack of computational power
1993 Breakthrough by turbo codes approaching capacity limit by 0.5 dB
[C. Berrou, G. Glavieux, P. Thitimajshima: “Near Shannon Limit Error-Correcting
Coding and Decoding: Turbo-Codes”, Proc. IEEE ICC, pp. 1064-1070 ]

• Turbo decoding principle suitable for many other applications like coded modulation, turbo equalization, decoding and multi-user detection, and LDPC codes
Potential of Concatenated Codes
Comparison of convolutional, turbo and RA codes for Rc = 1/2

[Figure: BER over 10 log10(Eb/N0) in dB for convolutional codes with Lc = 3, 5, 7, 9, the turbo code (TC), the repeat accumulate (RA) code, and uncoded transmission]

• Turbo code with optimized interleaver of length 65384 bit
• Gain of nearly 3 dB compared to convolutional code with constraint length Lc = 9
• Gap to Shannon’s channel capacity only 0.5 dB
• Former world record: 0.1 dB gap to Shannon capacity by Stephan ten Brink’s repeat accumulate (RA) code
Serial and Parallel Code Concatenation
• Serial Concatenated Code (SCC)

  [Figure: outer encoder C1 followed by inner encoder C2; decoding in reverse order by D2, then D1]

• Parallel Concatenated Code (PCC), Turbo Code

  [Figure: parallel encoders C1, …, CL whose outputs are serialized (S) and punctured (P)]
Serial Code Concatenation
First Attempt: Concatenation of (3,2,2) SPC and (4,3,2) SPC codes

[Encoder chain: u → C1 → c1 → C2 → c2]

u    c1    c2     wH(c2)
00   000   0000   0
01   011   0110   2
10   101   1010   2
11   110   1100   2

• Concatenated code has dmin = 2

• no improvement due to concatenation

Serial Code Concatenation: Second Attempt
Concatenation of (4,3,2) SPC code and (7,4,3) Hamming code
u     c1     c2        wH(c2)   c2,opt    wH(c2,opt)
000   0000   0000000   0        0000000   0
001   0011   0011001   3        0001111   4
010   0101   0101010   3        0110011   4
011   0110   0110011   4        0111100   4
100   1001   1001100   3        1010101   4
101   1010   1010101   4        1011010   4
110   1100   1100110   4        1100110   4
111   1110   1111111   7        1101001   4

• Original code concatenation achieves dmin = 3 → no gain compared to Hamming code
• Optimized concatenation achieves dmin = 4


Product Codes
Serial Concatenation of 2 Linear Block Codes
• Information bits arranged in k| × k− matrix
• Row-wise encoding with code C− of rate Rc− = k−/n−
• Column-wise encoding with code C| of rate Rc| = k|/n|
• Entire code rate:

  Rc = (k− · k|)/(n− · n|) = Rc| · Rc−

• Minimum Hamming distance:

  dmin = dmin− · dmin|

[Figure: k| × k− information block u, horizontal parity block p− of size k| × (n− − k−), vertical parity block p| of size (n| − k|) × k−, and checks on checks p+]
Product Codes
Example 1: (12,6,4) Product Code

• Horizontal: (3,2,2) SPC code
• Vertical: (4,3,2) SPC code
• Code rate: Rc = 6/12 = 1/2
• Minimum Hamming distance: dmin = 2 · 2 = 4
• 1 error correctable
• Coding gain dmin · Rc = 2

[Figure: 4 × 3 code matrix, bits arranged column-wise:
 x0 x4 x8
 x1 x5 x9
 x2 x6 x10
 x3 x7 x11]
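The construction can be made concrete with a few lines of NumPy (a sketch; the information word is our own choice):

import numpy as np

def spc_encode_rows(bits):
    # append one even-parity bit to every row (SPC encoding)
    return np.hstack([bits, bits.sum(axis=1, keepdims=True) % 2])

# (12,6,4) product code: horizontal (3,2,2) SPC, vertical (4,3,2) SPC
u = np.array([[1, 0],
              [0, 1],
              [1, 1]])                        # k| x k- = 3 x 2 information bits
c = spc_encode_rows(spc_encode_rows(u).T).T   # encode rows, then columns
print(c)                                      # 4 x 3 code matrix x0 ... x11

Because both component codes are linear, encoding rows first or columns first yields the same checks-on-checks block.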

Product Codes
Example 2: (28,12,6) Product Code

• Horizontal: (4,3,2) SPC code
• Vertical: (7,4,3) Hamming code
• Code rate Rc = 12/28 = 3/7
• dmin = 2 · 3 = 6
• 2 errors correctable
• Coding gain dmin · Rc = 18/7 ≈ 2.57

[Figure: 7 × 4 code matrix, bits arranged column-wise:
 x0 x7 x14 x21
 x1 x8 x15 x22
 x2 x9 x16 x23
 x3 x10 x17 x24
 x4 x11 x18 x25
 x5 x12 x19 x26
 x6 x13 x20 x27]
Parallel Concatenation of 2 Linear Block Codes
• Information bits arranged in k| × k− matrix
• Row-wise encoding with code C− of rate Rc− = k−/n−
• Column-wise encoding with code C| of rate Rc| = k|/n|
• Parity check bits of both component codes, but no checks on checks
• Entire code rate:

  Rc = (k− · k|)/(n− · n| − (n− − k−)(n| − k|)) = 1/(1/Rc− + 1/Rc| − 1)

• Minimum Hamming distance:

  dmin = dmin− + dmin| − 1

[Figure: k| × k− information block u with horizontal parity block p− and vertical parity block p|; the checks-on-checks corner remains empty]
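A quick sanity check of the rate formula (a small sketch) reproduces the rates of the two examples on the following slides:

from fractions import Fraction

def pcc_rate(k_h, n_h, k_v, n_v):
    # code rate of the parallel concatenation without checks on checks
    return Fraction(k_h * k_v, n_h * n_v - (n_h - k_h) * (n_v - k_v))

print(pcc_rate(2, 3, 3, 4))   # (3,2,2) and (4,3,2) SPC codes -> 6/11
print(pcc_rate(3, 4, 4, 7))   # (4,3,2) SPC and (7,4,3) Hamming -> 12/25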

Parallel Concatenation of 2 Linear Block Codes
Example with two SPC Codes

• Horizontal: (3,2,2) SPC code
• Vertical: (4,3,2) SPC code
• Code rate: Rc = 6/11
• Minimum Hamming distance: dmin = 2 + 2 − 1 = 3
• 1 error correctable
• Coding gain dmin · Rc = 18/11 ≈ 1.64

[Figure: code matrix without checks on checks:
 x0 x4 x8
 x1 x5 x9
 x2 x6 x10
 x3 x7]

Parallel Concatenation of 2 Linear Block Codes
Example with Hamming and SPC Code

• Horizontal: (4,3,2) SPC code
• Vertical: (7,4,3) Hamming code
• Code rate: Rc = 12/25
• Minimum Hamming distance: dmin = 2 + 3 − 1 = 4
• 1 error correctable
• Coding gain dmin · Rc = 48/25 ≈ 1.92

[Figure: code matrix without checks on checks:
 x0 x7 x14 x21
 x1 x8 x15 x22
 x2 x9 x16 x23
 x3 x10 x17 x24
 x4 x11 x18
 x5 x12 x19
 x6 x13 x20]

Interleaving is crucial: Simple block interleaver
[Figure: 3 × 5 block interleaver — written column-wise, read row-wise; interleaving depth 5:
 x0 x3 x6 x9 x12
 x1 x4 x7 x10 x13
 x2 x5 x8 x11 x14]

• input sequence:
  x0, x1, x2, x3, x4, x5, x6, x7, x8, x9, x10, x11, x12, x13, x14
• output sequence:
  x0, x3, x6, x9, x12, x1, x4, x7, x10, x13, x2, x5, x8, x11, x14
• Turbo codes require (pseudo) random interleaving
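The block interleaver is a one-liner in NumPy (a small sketch; a pseudo-random interleaver would replace the reshape by a random permutation):

import numpy as np

x = np.arange(15)                   # input sequence x0, x1, ..., x14
rows, depth = 3, 5
matrix = x.reshape(depth, rows).T   # write column-wise into a 3 x 5 matrix
print(matrix.reshape(-1))           # read row-wise:
# [ 0  3  6  9 12  1  4  7 10 13  2  5  8 11 14]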
Concatenation of 2 Constituent Codes
Parallel Concatenated Code (PCC), Turbo Code
[Figure: PCC encoder — u = u1 feeds encoder C1 (parity c1); the interleaved sequence u2 = Π(u) feeds encoder C2 (parity c2); systematic bits u and parities c1, c2 are multiplexed and punctured by P into the code word c]

• Interleaver Π permutes information sequence u
• Encoders deliver just parity bits; systematic information bits in upper branch
• Puncturing P adjusts entire code rate Rc and delivers code word c
• Turbo codes generally employ convolutional codes
Concatenation of 2 Constituent Codes
Serial Concatenated Code (SCC)

[Encoder chain: u → C1 → c1 → Π → c̃1 → C2 → c2]

• Complete outer code word c1 is interleaved
• Interleaved version c̃1 serves as input of inner encoder C2, delivering c2
• Both encoders might contain puncturing for rate adaptation
• Overall code rate becomes Rc = Rc,1 · Rc,2
Turbo Codes: Example 1
Recursive convolutional encoders with Lc = 3, code rate Rc = 2/3

[Figure: u = u1 feeds recursive encoder C1 (two memory elements D) producing parity c1; the interleaved sequence u2 = Π(u) feeds the identical recursive encoder C2 producing parity c2; P combines systematic and parity bits into c]

Puncturing matrix:

      [1 1 1 1]
  P = [1 0 0 0]
      [0 0 1 0]
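For illustration, a sketch of a memory-2 recursive systematic encoder and its use in a turbo encoder; the generators (feedback 7₈, feedforward 5₈) and the interleaver are our own assumptions, not fixed by the slide:

import random

def rsc_parity(u):
    # memory-2 recursive systematic encoder: feedback 1 + D + D^2 (7 octal),
    # feedforward 1 + D^2 (5 octal); returns the parity sequence only
    s1 = s2 = 0
    parity = []
    for bit in u:
        a = bit ^ s1 ^ s2        # feedback path
        parity.append(a ^ s2)    # feedforward path
        s1, s2 = a, s1           # shift the register
    return parity

u = [1, 0, 1, 1, 0, 0, 1, 0]
pi = list(range(len(u)))
random.Random(0).shuffle(pi)          # pseudo-random interleaver
c1 = rsc_parity(u)                    # parity of C1
c2 = rsc_parity([u[i] for i in pi])   # parity of C2 on interleaved bits
# without puncturing this yields rate 1/3: u, c1, c2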

Turbo Codes: Example 2
Recursive convolutional encoders with Lc = 5, code rate Rc = 1/2

[Figure: u = u1 feeds recursive encoder C1 (four memory elements D) producing parity c1; the interleaved sequence u2 = Π(u) feeds the identical recursive encoder C2 producing parity c2; P combines systematic and parity bits into c]

Puncturing matrix:

      [1 1]
  P = [1 0]
      [0 1]

[C. Berrou, G. Glavieux, P. Thitimajshima: “Near Shannon Limit Error-Correcting Coding and
Decoding: Turbo-Codes”, IEEE ICC, pp. 1064-1070, Geneva, Switzerland, 1993]

Serial Concatenated Code: Example 1
Repeat Accumulate code by ten Brink with code rate Rc = 1/2

[Figure: u → outer repetition encoder C1 (code rate 1/2) → c1 → Π → c̃1 → inner recursive encoder C2 (g1 = 1, g2 = 17₈/17₈, three memory elements D) → P → c2]

[S. ten Brink: “Rate one-half code for approaching the Shannon limit by 0.1 dB”, Electronics Letters, Vol. 36, No. 15, 20th July 2000]

Serial Concatenated Code: Example 2
Convolutional encoders with Lc = 4, entire code rate Rc = 1/2

[Figure: u → outer non-recursive encoder C1 (g1 = 17₈, g2 = 15₈, three memory elements D) → P1 → Π → c̃1 → inner recursive encoder C2 (g1 = 1, g2 = 15₈/17₈, three memory elements D) → P2 → c2]

[Neubauer, Freudenberger, Kühn: “Coding Theory – Algorithms, Architectures, and Applications”, Wiley, 2007]
Analysis of Code Performance via Distance Properties
• First attempts to understand amazing performance of turbo codes
considered distance properties of concatenated codes

• Upper union bound on bit error probability (assuming ML decoding)

  Pb ≤ Σd Cd · erfc( √(d · Rc · Eb/N0) )

• Simple block interleavers perform well for concatenated block codes, but not for concatenated convolutional codes
• Random interleavers are much better → like random codes (Shannon)
• Pseudo-random interleavers used in practice
Influence of Encoders on Distance Properties
Parallel code concatenation:

  Pb ∼ Lπ^(1−wmin)

Lπ: interleaver size
wmin: minimum input weight of constituent encoders leading to finite output weight

• Block codes and non-recursive convolutional encoders: wmin = 1 → Pb ∼ Lπ^0 → no gain for enlarging interleaver
• Recursive convolutional encoders: wmin = 2 → Pb ∼ Lπ^(−1)
• Turbo codes employ systematic recursive convolutional encoders

Serial concatenation:
• Inner encoder should be recursive convolutional encoder, outer encoder can be non-recursive
Union Bound on Error Rates for Half-Rate SCCs
• Serially concatenated code with
  • Outer non-recursive encoder with g1 = 17₈, g2 = 15₈
  • Inner recursive encoder with g1 = 1, g2 = 15₈/17₈

[Figure: union bound on Pb over 10 log10(Eb/N0) in dB for interleaver lengths n = 120, 1200, 12000, compared with a non-systematic convolutional code (NSC, Lc = 9)]
Preliminary Summary
Analysis via Distance Properties

• Large Hamming distances d and small coefficients Cd desirable; the minimum Hamming distance dominates asymptotic error rates
• Recursive convolutional encoders and large (pseudo) random interleavers crucial
• Union bound approximation of BER assumes optimal Maximum Likelihood decoding → prohibitively high complexity
• Up to now, no efficient code construction presented
• Distance properties provide only half of the story!
• Our goal: powerful concatenated codes with moderate decoding complexity
Error Control Coding
Table of Contents

Basics of Digital Communications – A Summary

Two Lessons of Information Theory

Forward Error Correcting Codes – The Classics

Modern Error Correcting Codes → Turbo Decoding Principle

General Concept of Turbo Decoding

[Figure: decoders D1 and D2 exchanging information; the feedback path from D2 to D1 exists only for parallel concatenation]

• Constituent codes C1 and C2 are decoded by separate decoders D1 and D2
• Decoders D1 and D2 are allowed to exchange “information” in order to improve their performance
• Suboptimal iterative decoding strategy with soft-in soft-out decoding algorithms
• Modern convergence analysis of iterative decoding by EXIT charts provides new insights into code design
Definition of Log-Likelihood Ratio (LLR)
L(u) = log( Pr{u = 0} / Pr{u = 1} )   ←→   L(x) = log( Pr{x = +1} / Pr{x = −1} )

[Figure: L(x) over Pr{X = +1} — the LLR grows from −∞ to +∞ as the probability runs from 0 to 1]
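The mapping between probabilities and LLRs and its inverse are easily checked numerically (a small sketch):

import numpy as np

def llr(p0):
    # L(u) for Pr{u = 0} = p0
    return np.log(p0 / (1 - p0))

def prob0(L):
    # inverse mapping: Pr{u = 0} = e^L / (1 + e^L)
    return 1 / (1 + np.exp(-L))

print(llr(0.9))          # 2.197... -> strongly in favor of u = 0
print(prob0(llr(0.9)))   # 0.9 (round trip)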
Probability of Correct and Wrong Decision
Pr{x̂ is correct} = Pr{x̂ = x} = e^|L(x|y)| / (1 + e^|L(x|y)|)

Pr{x̂ is wrong} = Pr{x̂ ≠ x} = 1 − Pr{x̂ = x} = 1 / (1 + e^|L(x|y)|)

[Figure: Pr{x̂ is wrong} over |L(x|y)| — the error probability of a hard decision decays exponentially with the LLR magnitude]
Log-Likelihood Ratios for AWGN Channels
L(y|x) = log( p(y|x = +1) / p(y|x = −1) ) = 4 · Es/N0 · y = Lch · y

[Figure: L(y|x) over y for 10 log10(Es/N0) = 0, 2, 4, 6, 8 dB — a straight line through the origin whose slope Lch grows with the SNR]
Log-Likelihood Ratios for Further Channel Models
Binary Symmetric Channel (BSC)

L(y|x) = { log((1 − Pe)/Pe)   for y = +1
         { log(Pe/(1 − Pe))   for y = −1
       = y · log((1 − Pe)/Pe)

[Figure: BSC transition diagram — X0 → Y0 and X1 → Y1 with probability 1 − Pe, crossovers with probability Pe; plot of |L(y|x)| over Pe]

Log-Likelihood Ratios for Further Channel Models
Binary Erasure Channel (BEC)


L(y|x) = { log((1 − Pe)/0)     = +∞   for y = +1
         { log(Pe/Pe)          = 0    for y = 0
         { log(0/(1 − Pe))     = −∞   for y = −1

[Figure: BEC transition diagram — X0 → Y0 and X1 → Y1 with probability 1 − Pe; both inputs reach the erasure Y2 with probability Pe]

Soft-Output Decoding of Binary Repetition Code
• Binary repetition code: Γ = {[1 · · · 1], [0 · · · 0]}
• BPSK mapping xi = 1 − 2ci delivers transmit word x
• AWGN channel: y = x + w = [x0 + w0 · · · xn−1 + wn−1]
• Log-likelihood ratio becomes

  L(u|y) = log( Pr{u = 0|y} / Pr{u = 1|y} )
         = log( p(y|c(u = 0)) / p(y|c(u = 1)) ) + log( Pr{u = 0} / Pr{u = 1} )     (Bayes)
         = Σi log( p(yi|ci = 0) / p(yi|ci = 1) ) + La(u)                           (AWGN)
         = Σi L(yi|ci) + La(u)
L-Algebra
LLR of the modulo-2 sum of two bits

Extrinsic LLR for parity bit p = u0 ⊕ u1:

  Le(p) = L(u0) ⊞ L(u1) = 2 artanh( tanh(L(x0)/2) · tanh(L(x1)/2) )
        ≈ sgn[L(x0)] · sgn[L(x1)] · min{|L(x0)|, |L(x1)|}

[Figure: block diagram — L(x0) and L(x1) each pass through tanh(x/2), the outputs are multiplied, and 2 artanh(x) delivers Le(xp)]
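A minimal sketch of the boxplus operation and its min-sum approximation (function names are ours):

import numpy as np

def boxplus(L0, L1):
    # exact LLR of the modulo-2 sum of two bits
    return 2 * np.arctanh(np.tanh(L0 / 2) * np.tanh(L1 / 2))

def boxplus_minsum(L0, L1):
    # min-sum approximation: sign product times minimum magnitude
    return np.sign(L0) * np.sign(L1) * min(abs(L0), abs(L1))

print(boxplus(2.0, -3.0))         # about -1.69
print(boxplus_minsum(2.0, -3.0))  # -2.0: slightly too optimistic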

L-Algebra
Generalization to n − 1 information bits and its approximation

• (n, n − 1) SPC code (n − 1 information bits, 1 parity bit)

  Le(p) = L(u0) ⊞ · · · ⊞ L(un−2) = 2 artanh( Π(i=0..n−2) tanh(L(xi)/2) )

• Approximation

  Le(p) = L(u0 ⊕ · · · ⊕ un−2) ≈ min(i) {|L(xi)|} · Π(i=0..n−2) sgn[L(xi)]

• Complete soft-output LLR consists of systematic and extrinsic LLRs:

  L(p) = L(yn−1|xn−1) + Le(p) = 4 · Es/N0 · yn−1 + Le(p)
Example: Soft-Output Decoding for (4,3,2) SPC Code

10 log10(Es/N0) = 2 dB, i.e. Lch = 4 · Es/N0 ≈ 6.3

u = [1 0 1]  → encoding →  c = [1 0 1 0]  → BPSK →  x = [−1 +1 −1 +1]

AWGN channel output:                  y     = [−0.8 +1.1 +0.3 +0.4]
Channel LLRs:                         Lch·y = [−5.1 +7.0 +1.9 +2.5]
Extrinsic LLRs (approximation):       Le(u) = [+1.9 −1.9 −2.5 −1.9]
Soft output: L(û) = Lch·y + Le(u)           = [−3.2 +5.1 −0.6 +0.6]
Pr{û correct}:                                [0.96  0.99  0.65  0.65]

• Hard decision on y alone yields [−1 +1 +1 +1]: error detected but not corrected
• Hard decision on L(û) yields [−1 +1 −1 +1]: error corrected
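The numbers above can be reproduced in a few lines (a small sketch using the min-sum approximation):

import numpy as np

Lch = 4 * 10 ** (2 / 10)              # Es/N0 = 2 dB
y = np.array([-0.8, 1.1, 0.3, 0.4])
L = Lch * y                           # [-5.1  7.0  1.9  2.5]

Le = np.empty(4)
for i in range(4):
    rest = np.delete(L, i)            # all other LLRs of the SPC word
    Le[i] = np.prod(np.sign(rest)) * np.abs(rest).min()

print(np.round(Le, 1))                # [ 1.9 -1.9 -2.5 -1.9]
print(np.round(L + Le, 1))            # [-3.2  5.1 -0.6  0.6]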

Turbo Decoding of (24,16,3) Product Code (1)

Information word u and BPSK code matrix x (5th column: row parities, 5th row: column parities):

u =  1 0 0 1          x =  −1 +1 +1 −1 +1
     0 1 1 1               +1 −1 −1 −1 −1
     1 0 1 0               −1 +1 −1 +1 +1
     0 0 0 1               +1 +1 +1 −1 −1
                           +1 −1 +1 −1

Channel LLRs after the AWGN channel at SNR = 2 dB:

Lch·y =    0.6    7.6    1.3   −3.2    6.3
           5.1   −4.4    3.8   −0.6   −9.5
          −7.5    3.2   −5.7    7.6    1.3
           1.3   −1.3    8.2   −9.5  −17.2
           1.9   −5.7    7.6   −7.0

1st vertical decoding delivers the extrinsic information, which becomes the horizontal a priori information:

Le,|(1)(u) = La,−(1)(u) =   −1.3  −1.3  −3.8  −0.6
                            −0.6   1.3  −1.3  −3.2
                             0.6  −1.3   1.3   0.6
                            −0.6   3.2  −1.3  −0.6

L|(1)(u) = Lch·y + Le,|(1)(u) =   −0.7   6.3  −2.5   −3.8
                                   4.5  −3.1   2.5   −3.8
                                  −7.0   1.9  −4.4    8.2
                                   0.7   1.9   6.9  −10.1
Turbo Decoding of (24,16,3) Product Code (2)

Input to the 1st horizontal decoding, Lch·y + La,−(1)(u) (last column: channel LLRs of the row parities):

  −0.7   6.3  −2.5   −3.8 |   6.3
   4.5  −3.1   2.5   −3.8 |  −9.5
  −7.0   1.9  −4.4    8.2 |   1.3
   0.7   1.9   6.9  −10.1 | −17.2

1st horizontal decoding delivers

Le,−(1)(u) =   2.5  −0.7   0.7   0.7
              −2.5   2.5  −3.1   2.5
              −1.3   1.3  −1.3   1.3
               1.9   0.7   0.7  −0.7

L−(1)(u) = Lch·y + Le,−(1)(u) + La,−(1)(u) =   1.8   5.6  −1.8   −3.1        û(1) =  0 0 1 1
                                               2.0  −0.6  −0.6   −1.3                0 1 1 1
                                              −8.3   3.2  −5.7    9.5                1 0 1 0
                                               2.6   2.6   7.6  −10.8                0 0 0 1

The horizontal extrinsic information becomes the vertical a priori information, La,|(2)(u) = Le,−(1)(u), giving the input Lch·y + La,|(2)(u):

   3.1   6.9   2.1   −2.5 |   6.3
   2.6  −1.9   0.7    1.9 |  −9.5
  −8.9   4.5  −7.0    8.9 |   1.3
   3.2  −0.6   8.9  −10.2 | −17.2
  (column parities: 1.9  −5.7  7.6  −7.0)
Turbo Decoding of (24,16,3) Product Code (3)

2nd vertical decoding of Lch·y + La,|(2)(u) delivers

Le,|(2)(u) =  −1.9  −0.6  −0.7   1.9
              −1.9   0.6  −2.1  −2.5
               1.9  −0.6   0.7  −1.9
              −1.9   1.9  −0.7   1.9

Input to the 2nd horizontal decoding, Lch·y + La,−(2)(u) with La,−(2)(u) = Le,|(2)(u):

  −1.3   7.0   0.6  −1.3 |   6.3
   3.2  −3.8   1.7  −3.1 |  −9.5
  −5.7   2.6  −5.0   5.7 |   1.3
  −0.6   0.6   7.5  −7.6 | −17.2

2nd horizontal decoding delivers

Le,−(2)(u) =  −0.6   0.6   1.3  −0.6
              −1.7   1.7  −3.1   1.7
              −1.3   1.3  −1.3   1.3
               0.6  −0.6  −0.6   0.6

L−(2)(u) = Lch·y + Le,−(2)(u) + La,−(2)(u) =  −1.9   7.6   2.1  −1.9        û(2) =  1 0 0 1
                                               1.5  −2.1  −1.4  −1.4                0 1 1 1
                                              −7.0   3.9  −6.3   7.0                1 0 1 0
                                               0.0   0.0   6.9  −7.0                x x 0 1
Turbo Decoding of (24,16,3) Product Code (4)

Input to the 3rd vertical decoding, Lch·y + La,|(3)(u) with La,|(3)(u) = Le,−(2)(u):

   0.0   8.2   2.6  −3.8 |   6.3
   3.4  −2.7   0.7   1.1 |  −9.5
  −8.9   4.5  −7.0   8.9 |   1.3
   1.9  −1.9   8.2  −8.9 | −17.2
  (column parities: 1.9  −5.7  7.6  −7.0)

3rd vertical decoding delivers

Le,|(3)(u) = La,−(3)(u) =  −1.9  −1.9  −0.7   1.1
                            0.0   1.9  −2.6  −3.8
                            0.0  −1.9   0.7  −1.1
                            0.0   2.7  −0.7   1.1

Input to the 3rd horizontal decoding, Lch·y + La,−(3)(u):

  −1.3   5.7   0.6  −2.1 |   6.3
   5.1  −2.5   1.2  −4.4 |  −9.5
  −7.6   1.3  −5.0   7.5 |   1.3
   1.3   1.4   7.5  −8.4 | −17.2

3rd horizontal decoding delivers

Le,−(3)(u) =  −0.6   0.6   1.3  −0.6
              −1.2   1.2  −2.5   1.2
              −1.3   1.3  −1.3   1.3
               1.4   1.3   1.3  −1.3

L−(3)(u) = Lch·y + Le,−(3)(u) + La,−(3)(u) =  −1.9   6.3   1.9  −2.7        û(3) =  1 0 0 1
                                               3.9  −1.3  −1.3  −3.2                0 1 1 1
                                              −8.9   2.6  −6.3   8.8                1 0 1 0
                                               2.7   2.7   8.8  −9.7                0 0 0 1

→ After three iterations all transmission errors are corrected.
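The complete iteration of this example can be reproduced with a short script (a sketch with our own variable names; it alternates min-sum column and row decoding and feeds the extrinsic LLRs back as a priori information, matching the tables above up to rounding):

import numpy as np

def spc_extrinsic(L):
    # min-sum extrinsic LLRs for one SPC code word
    Le = np.empty_like(L)
    for i in range(len(L)):
        rest = np.delete(L, i)
        Le[i] = np.prod(np.sign(rest)) * np.abs(rest).min()
    return Le

# channel LLRs: 4x4 information block, 5th column = row parities
Lch = np.array([[ 0.6,  7.6,  1.3,  -3.2,   6.3],
                [ 5.1, -4.4,  3.8,  -0.6,  -9.5],
                [-7.5,  3.2, -5.7,   7.6,   1.3],
                [ 1.3, -1.3,  8.2,  -9.5, -17.2]])
Lpar = np.array([1.9, -5.7, 7.6, -7.0])     # column parities

La = np.zeros((4, 4))
for it in range(1, 4):
    Le_v = np.empty((4, 4))                 # vertical (column) decoding
    for j in range(4):
        col = np.append(Lch[:4, j] + La[:, j], Lpar[j])
        Le_v[:, j] = spc_extrinsic(col)[:4]
    Le_h = np.empty((4, 4))                 # horizontal (row) decoding
    for i in range(4):
        row = np.append(Lch[i, :4] + Le_v[i, :], Lch[i, 4])
        Le_h[i, :] = spc_extrinsic(row)[:4]
    La = Le_h                               # a priori for the next iteration
    L_tot = Lch[:, :4] + Le_v + Le_h
    print(f"iteration {it}:\n{(L_tot < 0).astype(int)}")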

Turbo Decoding for Parallel Concatenated Codes
[Figure: turbo decoder for parallel concatenation — D1 and D2 exchange extrinsic LLRs via interleaver Π and deinterleaver Π−1: La,1(u) = Le,2(u) and La,2(u) = Le,1(u); each decoder delivers L(u) = Lch·ys + La(u) + Le(u)]

• Both decoders estimate the same information word u, and each decoder receives the corresponding channel outputs
• Systematic information bits ys are fed to D2 via D1 and Π
• Each decoder generates extrinsic LLRs serving as a priori LLRs for the other decoder
• A priori LLRs improve the decoders’ performance in each iteration as long as they are statistically independent of the regular inputs
Turbo Decoding for Serial Concatenated Codes
[Figure: encoder chain u → C1 (outer encoder) → c1 → Π → c̃1 → C2 (systematic inner encoder) → c2, and the corresponding turbo decoder — the inner decoder D2 receives Lch·y and delivers extrinsic LLRs Le,2(c̃1); deinterleaved, they act as input L1(c1) of the outer decoder D1, whose extrinsic LLRs Le,1(c1) are re-interleaved and fed back as a priori LLRs La,2(c̃1)]

• Outer decoder receives information only from the inner decoder
• Outer decoder delivers estimates of the information bits u as well as extrinsic LLRs of the code bits c1, which are the information bits of the inner code C2
• Extrinsic LLRs of the code bits c1 serve as a priori LLRs for the inner code C2
General Approach for Soft-Output Decoding
Let Γi(0) and Γi(1) denote the sets of code words with ui = 0 and ui = 1, respectively:

L(ûi) = L(ui|y) = log( Pr{ui = 0|y} / Pr{ui = 1|y} )

      = log( Σ(c ∈ Γi(0)) p(y|c) · Pr{c}  /  Σ(c ∈ Γi(1)) p(y|c) · Pr{c} )

      = log( p(yi|ci = 0) / p(yi|ci = 1) ) + log( Pr{ui = 0} / Pr{ui = 1} )
        + log( Σ(c ∈ Γi(0)) Π(j≠i) p(yj|cj) · Π(j≠i) Pr{uj}  /  Σ(c ∈ Γi(1)) Π(j≠i) p(yj|cj) · Π(j≠i) Pr{uj} )

      = Lch·yi + La(ui) + Le(ui)

• A posteriori LLR consists of 3 independent parts for systematic encoders:
  • Lch·yi: intrinsic (systematic) part
  • La(ui): a priori LLR obtained from extrinsic LLR of the other decoder
  • Le(ui): extrinsic LLR
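For a tiny code the sums can be evaluated directly; a minimal sketch (equal a priori probabilities and our own function names assumed), applied to the (4,3,2) SPC code and the received word from the earlier example:

import numpy as np
from itertools import product

def app_llr(y, Lch, codebook, info_len):
    # a posteriori LLRs L(u_i|y) by summing p(y|c) over the whole codebook;
    # feasible only for tiny codes
    num = np.zeros(info_len)
    den = np.zeros(info_len)
    for u, c in codebook:
        x = 1 - 2 * np.asarray(c)               # BPSK mapping
        p = np.exp(Lch / 2 * np.dot(x, y))      # proportional to p(y|c)
        for i in range(info_len):
            if u[i] == 0:
                num[i] += p
            else:
                den[i] += p
    return np.log(num / den)

# (4,3,2) SPC code: 8 codewords, three info bits plus one parity bit
codebook = [(u, list(u) + [sum(u) % 2]) for u in product((0, 1), repeat=3)]
y = np.array([-0.8, 1.1, 0.3, 0.4])
print(np.round(app_llr(y, 4 * 10 ** 0.2, codebook, 3), 2))
# exact APP values, close to the min-sum numbers of the previous example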
Problem with General Approach
• Computational complexity grows exponentially with k (or n − k)
• Example: (255,247,3) Hamming code (still quite short)
  • 2^247 ≈ 2.3 × 10^74 code words → infeasible!
• General soft-output decoding approach only useful for specific codes!
• Repetition codes: c = [u0 · · · u0] ⇒ x = 1 − 2 · c ⇒ y = x + w

  L(ûi) = Σ(ℓ=0..n−1) L(yℓ|xℓ)          (averaging)

• Single Parity Check codes: c = [u0 · · · uk−1  p = u0 ⊕ · · · ⊕ uk−1]

  L(ui) = Lch·yi + 2 artanh( Π(j≠i) tanh(L(yj|xj)/2) )          (see L-algebra)
BCJR Algorithm for Convolutional Codes
Optimal Symbol-by-Symbol Soft-Output Decoding of Convolutional Codes

• Invented by Bahl, Cocke, Jelinek and Raviv in 1972
• Originally developed for soft-in soft-out processing of ISI signals (ISI: intersymbol interference)
• Adapted to convolutional codes and based on the trellis diagram
• Performs symbol-by-symbol decoding
  → minimizes bit error probability, not word error probability
• Notation:
  • Received sequence: y = [y1 y2 · · · yL]
  • Received word at time ℓ: yℓ = [yℓ,1 · · · yℓ,n]
Symbol-by-Symbol Decoding for Convolutional Codes
• Log-likelihood ratio:

  L(uℓ|y) = log( Pr{uℓ = 0|y} / Pr{uℓ = 1|y} )
          = log( Σ((s′,s): uℓ=0) p(s′, s, y)  /  Σ((s′,s): uℓ=1) p(s′, s, y) )

• ℓ-th trellis segment

[Figure: trellis segment between the states s′ at time ℓ − 1 and s at time ℓ (states 00, 10, 01, 11) — the branches carry p(yℓ|s′, s) · Pr{s|s′}, the left nodes the forward probabilities p(s′, yk<ℓ), the right nodes the backward probabilities p(yk>ℓ|s); the branch styles distinguish uℓ = 0 and uℓ = 1]
BCJR Algorithm for Convolutional Codes
Forward and Backward Recursions

• Probabilities αℓ−1(s′), βℓ(s), and γℓ(s′, s) can be computed recursively
• State transitions

  γℓ(s′, s) = p(yℓ|s′, s) · Pr{uℓ = u(s′, s)}
              (likelihood p(yℓ|x(s′, s)))  (a priori)

• Forward recursion

  αℓ(s) = Σ(s′) γℓ(s′, s) · αℓ−1(s′)

• Backward recursion

  βℓ−1(s′) = Σ(s) γℓ(s′, s) · βℓ(s)
BCJR Algorithm for Convolutional Codes
LLR calculations in trellis diagram

L(uℓ|y) = log( Σ((s′,s): uℓ=0) αℓ−1(s′) · γℓ(s′, s) · βℓ(s)  /  Σ((s′,s): uℓ=1) αℓ−1(s′) · γℓ(s′, s) · βℓ(s) )

[Figure: terminated trellis from ℓ = 0 to ℓ = L with states 00, 10, 01, 11 — forward metrics αℓ(s) starting from α0(0) = 1, α0(s ≠ 0) = 0, backward metrics βℓ(s) ending in βL(s), and branch metrics γℓ(s′, s) attached to the transitions]
BCJR Implementation in Log-Domain
• State transitions

  γ̄ℓ(s′, s) = log γℓ(s′, s) = log p(yℓ|s′, s) + log Pr{uℓ = u(s′, s)}

• Forward recursion

  ᾱℓ(s) = log Σ(s′) γℓ(s′, s) · αℓ−1(s′) = log Σ(s′) e^(γ̄ℓ(s′,s) + ᾱℓ−1(s′))

• Backward recursion

  β̄ℓ−1(s′) = log Σ(s) γℓ(s′, s) · βℓ(s) = log Σ(s) e^(γ̄ℓ(s′,s) + β̄ℓ(s))

• Jacobi logarithm

  log(e^x1 + e^x2) = max(x1, x2) + log(1 + e^(−|x1−x2|)) ≈ max(x1, x2)
                                   (correction term)
Max-Log-MAP (BCJR without Correction Term)
• State transitions

  γ̄ℓ(s′, s) = −‖yℓ − x(s′, s)‖² / (2σ²W) + log Pr{uℓ = u(s′, s)}

• Forward recursion: ᾱℓ(s) = max(s′) ( γ̄ℓ(s′, s) + ᾱℓ−1(s′) )

  • Initialization: ᾱ0(s) = 0 for s = 0, −∞ for s ≠ 0

• Backward recursion: β̄ℓ−1(s′) = max(s) ( γ̄ℓ(s′, s) + β̄ℓ(s) )

  • Initialization for terminated trellis: β̄L(s) = 0 for s = 0, −∞ for s ≠ 0
  • Initialization for truncated trellis: β̄L(s) = 0 ∀s
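The recursions translate directly into code; a generic sketch (our own scaffolding: it assumes the branch metrics γ̄ℓ(s′, s) are precomputed and the trellis is given as a list of (s′, s, u) transitions):

import numpy as np

NEG_INF = -1e9   # stands in for minus infinity

def max_log_map(gamma, transitions, n_states, terminated=True):
    # gamma[l][(s0, s)]: branch metric of segment l; transitions: list of
    # (s0, s, u) tuples describing the trellis; returns the LLRs L(u_l|y)
    L = len(gamma)
    alpha = np.full((L + 1, n_states), NEG_INF)
    alpha[0, 0] = 0.0                          # trellis starts in state 0
    for l in range(L):                         # forward recursion
        for s0, s, u in transitions:
            m = alpha[l, s0] + gamma[l][(s0, s)]
            alpha[l + 1, s] = max(alpha[l + 1, s], m)
    beta = np.full((L + 1, n_states), NEG_INF)
    if terminated:
        beta[L, 0] = 0.0                       # terminated trellis
    else:
        beta[L, :] = 0.0                       # truncated trellis
    for l in range(L - 1, -1, -1):             # backward recursion
        for s0, s, u in transitions:
            m = beta[l + 1, s] + gamma[l][(s0, s)]
            beta[l, s0] = max(beta[l, s0], m)
    llr = np.empty(L)
    for l in range(L):                         # combine to LLRs
        m0 = m1 = NEG_INF
        for s0, s, u in transitions:
            m = alpha[l, s0] + gamma[l][(s0, s)] + beta[l + 1, s]
            m0, m1 = (max(m0, m), m1) if u == 0 else (m0, max(m1, m))
        llr[l] = m0 - m1
    return llr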


Simulation Results for Turbo Codes
Recursive convolutional codes with Lc = 3 and different interleavers

[Figure: BER over 10 log10(Eb/N0) in dB for decoding iterations 1, 2, 3, 4, 6 with a 10 × 10 and a 30 × 30 block interleaver]
Simulation Results for Turbo Codes
Recursive convolutional codes with Lc = 3, Rc = 1/2 and Lπ = 900

[Figure: BER over 10 log10(Eb/N0) in dB for a random interleaver (RIL) and a block interleaver (BIL) of the same size Lπ = 900 over the decoding iterations]
Simulation Results for Turbo Codes
Recursive convolutional codes with Lc = 3 with block and random Π

[Figure: BER over 10 log10(Eb/N0) in dB for block interleavers (BIL) with Lπ = 100, 400, 900, a random interleaver (RIL) with Lπ = 900, a RIL with Lπ = 900 at Rc = 1/3, and a convolutional code (CC) for comparison]
Simulation Results for Turbo Codes (Lc = 4)
Parallel concatenation, code rate Rc = 1/2

• Recursive encoders g1 = 1, g2 = 17₈/15₈
• Random interleaver: Lπ = 6000

[Figure: BER over 10 log10(Eb/N0) in dB; the annotated gap amounts to 1.3 dB]
Repeat Accumulate Code by ten Brink
Outer half-rate repetition and inner rate-one recursive conv. encoder

[Figure: BER over 10 log10(Eb/N0) in dB for decoding iterations 10, 20, …, 100 — approximately 100 decoding iterations are needed]
Comparison of Serial and Parallel Concatenation

[Figure from ten Brink: “Rate one-half code for approaching the Shannon limit by 0.1 dB”, Electronics Letters, Vol. 36, No. 15, 2000]
Error Control Coding
Table of Contents

Basics of Digital Communications – A Summary

Two Lessons of Information Theory

Forward Error Correcting Codes – The Classics

Modern Error Correcting Codes → EXIT Charts

EXIT Chart Analysis of Turbo Decoding
proposed by Stephan ten Brink

• Amazing performance of concatenated codes and turbo decoding
  • No explanation why turbo decoding performs so well
  • No systematic code construction available
  • Union upper bound assumes ML decoding, and determining the input-output weight enumerating function (IOWEF) is challenging
• EXIT charts analyze the turbo decoding behavior
  • Tight prediction of concatenated codes’ performance
  • Allow designing codes such that the constituent decoders match
  • Allow estimating the loss compared to channel capacity
Mutual Information for Turbo Decoder
Parallel Concatenation

[Figure: transmitter C followed by the iterative decoder — channel LLRs Lch·y feed D1; D1 and D2 exchange extrinsic LLRs via Π and Π−1: La,2(u) = Π(Le,1(u)), La,1(u) = Π−1(Le,2(u))]

Ia,1 = I(u; La,1(u))        Ie,1 = I(u; Le,1(u))
Ia,2 = I(u; La,2(u))        Ie,2 = I(u; Le,2(u))
Mutual Information for Single Decoder

[Figure: information word u → encoder C → BPSK → AWGN channel (noise w) → channel LLRs Lch·y → decoder D with a priori input La(u), delivering extrinsic LLRs Le(u) and total LLRs L(û)]

Ia = I(u; La(u))
Ie = I(u; Le(u))
It = I(u; L(û))
EXIT-Chart Analysis of Turbo Decoding
• Exchange of extrinsic information between decoders in the iterative process analyzed by a semi-analytic approach
• Mutual information I(u; La(u)) between information bits and a priori input of a decoder modeled with artificial AWGN:

  La(u) = σa²/2 · (1 − 2u) + wa,   wa ∼ N(0, σa²)

• Mutual information I(u; Le(u)) between information bits and extrinsic output of the decoder numerically determined for a specific I(u; La(u)) at the input
• Draw the curves I(u; Le(u)) = f(I(u; La(u))) of both contributing decoders into one diagram → EXIT chart: EXtrinsic Information Transfer chart
Dependency of Noise Variance and A Priori Information
Ia = 1 + 1/2 · Σ(µ=0,1) ∫ p(La|u = µ) · log2( p(La|u = µ) / (p(La|u = 0) + p(La|u = 1)) ) dLa

[Figure: Ia over σa² — the a priori information grows monotonically towards 1 with increasing σa²]
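The curve Ia(σa²) can be reproduced numerically; for the symmetric Gaussian model it suffices to average over u = 0 (a small Monte Carlo sketch):

import numpy as np

def I_a(sigma2, n=200_000, seed=0):
    # Ia = I(u; La) for La = sigma2/2 * (1 - 2u) + wa, wa ~ N(0, sigma2);
    # by symmetry, simulate u = 0 only: La ~ N(+sigma2/2, sigma2)
    rng = np.random.default_rng(seed)
    La = sigma2 / 2 + np.sqrt(sigma2) * rng.standard_normal(n)
    return 1 - np.mean(np.log2(1 + np.exp(-La)))

for s2 in (1, 4, 16):
    print(s2, round(I_a(s2), 3))   # Ia grows towards 1 with sigma_a^2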
Transfer Characteristic of Decoder
Mutual information between decoder input and output

• Computation using histograms (estimates of the probability density function):

  Îe = 1 + 1/2 · Σ(µ=0,1) ∫ p̂(Le|u = µ) · log2( p̂(Le|u = µ) / (p̂(Le|u = 0) + p̂(Le|u = 1)) ) dLe

• Computation using reliabilities of bits (magnitudes of LLRs):

  Îe = 1/L · Σ(i=1..L) [ 1/(1 + e^|Le,i|) · log2( 2/(1 + e^|Le,i|) ) + 1/(1 + e^−|Le,i|) · log2( 2/(1 + e^−|Le,i|) ) ]
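The second formula needs neither the transmitted bits nor histograms and is easy to transcribe (a small sketch; the clipping merely avoids numerical overflow):

import numpy as np

def I_e(Le):
    # estimate Ie from LLR magnitudes (time average over the block)
    a = np.clip(np.abs(np.asarray(Le, float)), 1e-12, 50)
    pw = 1 / (1 + np.exp(a))      # Pr{hard decision wrong | |Le|}
    pr = 1 / (1 + np.exp(-a))     # Pr{hard decision right | |Le|}
    return np.mean(pw * np.log2(2 * pw) + pr * np.log2(2 * pr))

print(I_e([0.0, 0.0]))   # uninformative LLRs -> 0.0
print(I_e([20, -30]))    # reliable LLRs -> close to 1.0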

Extrinsic and Total Mutual Information
• Only a priori information at the decoder input
• Extrinsic and a priori information do not simply sum up

[Figure: decoder fed only with La, delivering Le and Lt; left plot: Ie over Ia, right plot: It over Ia, each for convolutional codes with Lc = 3, 5, 7 and for uncoded transmission]
Comparison of MAP and Max-Log-MAP
[Figure: Ie over Ia for 10 log10(Eb/N0) = −1, 0, 1, 2, 3 dB; the decoder receives channel LLRs Lch·y and a priori LLRs La; MAP and Max-Log-MAP curves lie almost on top of each other]

• High SNR leads to high extrinsic information
• Large a priori information compensates bad channel conditions
• Max-Log-MAP performs nearly as well as the optimal MAP
EXtrinsic Information Transfer (EXIT) Charts
• Serial concatenation of inner recursive convolutional encoder and outer non-recursive convolutional encoder
• Extrinsic information of the inner decoder is a priori information for the outer decoder and vice versa
• Drawing the transfer curves of both decoders into one diagram and flipping abscissa and ordinate for the outer convolutional code leads to the EXIT chart

[Figure: decoder loop between D2 (fed by Lch·y) and D1, exchanging Ie,1 = Ia,2 and Ie,2 = Ia,1]
EXIT Charts for Serial Concatenation
[Figure: EXIT chart — Ia,1 = Ie,2 (ordinate) over Ia,2 = Ie,1 (abscissa); transfer curves of the inner decoder for 10 log10(Eb/N0) = −1, 0, 1, 1.2, 2, 3 dB together with the flipped curve of the outer decoder D1]

• Outer non-recursive convolutional encoder: g1 = 15₈, g2 = 13₈, Rc = 3/4
• Inner recursive convolutional encoder: g1 = 1₈, g2 = 13₈/15₈, Rc = 2/3
EXIT Charts for Serial Concatenation
[Figure: the same EXIT chart as on the previous slide, shown with the iterative decoding trajectory between the two transfer curves]

• Outer non-recursive convolutional encoder: g1 = 15₈, g2 = 13₈, Rc = 3/4
• Inner recursive convolutional encoder: g1 = 1₈, g2 = 13₈/15₈, Rc = 2/3
Further Results on EXIT Charts

• For Binary Erasure Channels (BECs), it can be proved that the area between the decoder curves represents the loss compared to channel capacity (the AWGN channel behaves similarly to the BEC)
→ Curves should be nearly identical for capacity achieving codes
→ A small gap between the curves requires a high number of decoding iterations
→ Designing capacity achieving concatenated codes essentially becomes a curve fitting problem
Code Design for Half-Rate Repeat-Accumulate Code
Signal-to-Noise ratio 10 log10 (Es /N0 ) = 0.5 dB
[Figure: EXIT chart — Ia,1 = Ie,2 over Ia,2 = Ie,1 with the transfer curves of the outer repetition code (Rc = 1/2) and the inner recursive convolutional encoder (Rc = 1); the two curves are closely matched, leaving only a narrow open tunnel]
