
Error Control Coding

Lecture Notes

Prof. Dr.-Ing. Volker Kühn


University of Rostock
Institute of Communications Engineering
Table of Contents

Basics of Digital Communications – A Summary

Two Lessons of Information Theory

Forward Error Correcting Codes – The Classics


Principles and Preliminaries
Decoding Principles & Error Probabilities
General linear block codes

V. Kühn | Error Control Coding | Forward Error Correcting Codes → General linear block codes 124 / 385
Questions and Examples
Questions

• How do we have to choose code words?

• How can we implement the mapping from information word u to code word c?

Examples

• Repetition code: encoding, decoding, error probability

• Single parity check (SPC) code: encoding, decoding, error probability

• Hamming code: encoding, decoding, error probability


Some Simple Code Examples
Error Probabilities of Repetition Codes
[Figure: bit error rate Pb of repetition codes with n = 2, 4, 6, 8, 10 and uncoded reference; soft decoding (solid) vs. hard decoding (dashed); left: over 10 log10(Es/N0) in dB, right: over 10 log10(Eb/N0) in dB]

• left: hard decoding (dashed) loses up to 3 dB compared to soft decoding (solid)
• right: repetition coding does not offer a coding gain
Some Simple Code Examples
Error Probability for (7,4)-Hamming Code

[Figure: word error rate Pw over 10 log10(Eb/N0) in dB for the (7,4) Hamming code; curves: uncoded reference, hard-decision BMD, soft-decision MLD; the horizontal gap between the uncoded and coded curves marks the coding gain]

Generator Matrix of Linear Block Codes
• Each row contains a valid code word:

        [ g0,0      ...  g0,n−1   ]
    G = [ ...       ...  ...      ]
        [ gk−1,0    ...  gk−1,n−1 ]

• Encoding: c = u · G mod q (in the sequel, all calculations are carried out mod q)
• Code: Γ = {c = u · G mod q | u ∈ GF(q)^k}
• Systematic encoder (u explicitly part of c, i.e. c = [u p]):

    G = [ I_k | P_{k×(n−k)} ]
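The encoding rule c = u · G mod q can be sketched in a few lines of Python. The (5, 2) systematic generator matrix below is a made-up example for illustration, not a code from these notes:

```python
import numpy as np

# Hypothetical systematic (5, 2) generator matrix G = [I_2 | P] over GF(2)
G = np.array([[1, 0, 1, 1, 0],
              [0, 1, 0, 1, 1]])

def encode(u, G, q=2):
    """Encode information word u into the code word c = u * G mod q."""
    return (np.array(u) @ G) % q

c = encode([1, 1], G)
# the systematic part reproduces u, followed by the parity symbols
print(c)  # -> [1 1 1 0 1]
```

Because the encoder is systematic, the first k positions of c always equal u; only the last n − k parity positions are computed.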
Parity Check Matrix of Linear Block Codes

• Parity check matrix (its linearly independent rows span the space orthogonal to the code):

        [ h0,0        ...  h0,n−1     ]
    H = [ ...         ...  ...        ]
        [ hn−k−1,0    ...  hn−k−1,n−1 ]

• For a systematic encoder: H = [ −P^T_{k×(n−k)} | I_{n−k} ]

• Product of generator and parity check matrix:

    G · H^T = [ I_k | P_{k×(n−k)} ] · [ −P_{k×(n−k)} ]  = −P_{k×(n−k)} + P_{k×(n−k)} = 0
                                      [ I_{n−k}      ]

• All code words fulfill c · H^T mod q = 0:

    Γ = {c ∈ GF(q)^n | c · H^T mod q = 0}
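The construction H = [−P^T | I_{n−k}] and the identity G · H^T = 0 can be checked numerically; over GF(2), −P^T equals P^T. The parity part P below is a made-up example, not taken from the notes:

```python
import numpy as np

# Hypothetical systematic (5, 2) code over GF(2): G = [I_k | P], H = [-P^T | I_{n-k}]
k, n = 2, 5
P = np.array([[1, 1, 0],
              [0, 1, 1]])  # made-up parity part for illustration
G = np.hstack([np.eye(k, dtype=int), P])
H = np.hstack([(-P.T) % 2, np.eye(n - k, dtype=int)])  # -P^T == P^T over GF(2)

# every row of G (hence every code word) is orthogonal to every row of H
assert np.all((G @ H.T) % 2 == 0)
print("G * H^T = 0 mod 2 holds")
```

The same check works for any systematic generator matrix: changing P never breaks the identity, since the −P and +P contributions cancel by construction.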
Generator and Parity Check Matrices of Code Examples
• Repetition code: (n, 1, n) code

    G = [ 1 1 ... 1 ]   (one row, n columns)

    H = [ I_{n−1} | 1 ]   (n−1 rows; 1 denotes the all-ones column)

• Single parity check (SPC) code: (n, n−1, 2) code

    G = [ I_{n−1} | 1 ]   (n−1 rows; 1 denotes the all-ones column)

    H = [ 1 1 ... 1 ]   (one row, n columns)

• The repetition code is the dual code of the single parity check code
  → generator and parity check matrices are exchanged
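Both codes are simple enough to encode and hard-decode directly; the sketch below (not from the notes) uses majority decision for the repetition code, which corrects up to ⌊(n−1)/2⌋ errors:

```python
import numpy as np

def rep_encode(bit, n):
    """(n, 1, n) repetition code: G = [1 1 ... 1]."""
    return np.full(n, bit, dtype=int)

def rep_decode(c_tilde):
    """Hard-input majority decision; corrects up to (n-1)//2 errors."""
    return int(np.sum(c_tilde) > len(c_tilde) / 2)

def spc_encode(u):
    """(n, n-1, 2) single parity check code: append one parity bit."""
    u = np.asarray(u)
    return np.append(u, np.sum(u) % 2)

c = rep_encode(1, 5)
c[0] ^= 1
c[3] ^= 1                     # two channel errors, majority still correct
print(rep_decode(c))          # -> 1
print(spc_encode([1, 0, 1]))  # -> [1 0 1 0]
```

The SPC code can only detect a single error (its minimum distance is 2), whereas the (5, 1, 5) repetition code corrects up to two.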
Example of (7, 4, 3)2 Hamming Code
• Generator matrix

        [ 1 0 0 0 0 1 1 ]
    G = [ 0 1 0 0 1 0 1 ]
        [ 0 0 1 0 1 1 0 ]
        [ 0 0 0 1 1 1 1 ]

• Parity check matrix

        [ 0 1 1 1 1 0 0 ]
    H = [ 1 0 1 1 0 1 0 ]
        [ 1 1 0 1 0 0 1 ]

• Perfect code: the columns of H represent all 2^{n−k} − 1 nonzero syndromes
• For an appropriately ordered H, the syndrome s directly indicates the position of the error in y → the encoder is then no longer systematic!
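With G and H from this slide, one can verify that every code word has syndrome zero and that a single error produces the corresponding column of H as its syndrome (a quick numerical check, not part of the notes):

```python
import numpy as np

# G and H of the (7, 4, 3) Hamming code from the slide
G = np.array([[1, 0, 0, 0, 0, 1, 1],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 1, 1, 0],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[0, 1, 1, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [1, 1, 0, 1, 0, 0, 1]])

c = (np.array([1, 0, 1, 1]) @ G) % 2
assert np.all((c @ H.T) % 2 == 0)      # valid code words have syndrome 0

e = np.zeros(7, dtype=int)
e[2] = 1                               # single error at position 2
s = ((c + e) @ H.T) % 2
assert np.array_equal(s, H[:, 2])      # syndrome equals the erroneous column of H
print("syndrome:", s)
```

Since all 7 columns of H are distinct and nonzero, the syndrome identifies the error position uniquely for any single error.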
Hamming code
• An (n, n − r, 3)_q Hamming code of rank r has length n = (q^r − 1)/(q − 1)

• All Hamming codes have a minimum distance of d_H = 3, while the code rate tends to Rc = 1 for n → ∞:

    Rc = k/n = ((q^r − 1)/(q − 1) − r) / ((q^r − 1)/(q − 1)) = ((q^r − 1) − r(q − 1)) / (q^r − 1) → 1 for r → ∞

• For q = 2, the columns of the parity check matrix represent all 2^r − 1 possible nonzero binary vectors of length r

• Examples for q = 2: (3, 1), (7, 4), (15, 11), (31, 26), (63, 57), (127, 120)
• Hamming codes are perfect codes, i.e. every nonzero syndrome corresponds to exactly one correctable error pattern.
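The convergence of the code rate toward 1 is easy to tabulate for the binary case q = 2, where n = 2^r − 1 and k = n − r (a small illustration, not from the notes):

```python
# Code rate Rc = k/n of binary Hamming codes of rank r (n = 2^r - 1, k = n - r)
for r in range(2, 11):
    n = 2**r - 1
    k = n - r
    print(f"({n:4d}, {k:4d})  Rc = {k / n:.4f}")
# the printed rates grow monotonically toward 1, e.g. Rc > 0.99 for r = 10
```

The price for the high rate is the fixed minimum distance d_H = 3: long Hamming codes still correct only a single error per block.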
Simplex and Walsh-Hadamard Code
• Simplex code
• Exchanging G and H of Hamming code (dual code)
• All code words have the same pairwise Hamming distance d_H = 2^{r−1}
• (7, 4, 3)2 Hamming code → (7, 3, 4) Simplex code
• Walsh-Hadamard Code
• Extension of Simplex code by a zero in front of each code word
• BPSK mapping (1 → −1, 0 → +1) yields orthogonal code words
• Used as CDMA spreading codes in UMTS
        [ 0 0 1 1 1 1 0 0 ]
    G = [ 0 1 0 1 1 0 1 0 ]
        [ 0 1 1 0 1 0 0 1 ]

        [ +1 +1 +1 +1 +1 +1 +1 +1 ]
        [ +1 +1 +1 +1 −1 −1 −1 −1 ]
        [ +1 +1 −1 −1 −1 −1 +1 +1 ]
    Γ = [ +1 +1 −1 −1 +1 +1 −1 −1 ]
        [ +1 −1 −1 +1 +1 −1 −1 +1 ]
        [ +1 −1 +1 −1 −1 +1 −1 +1 ]
        [ +1 −1 −1 +1 −1 +1 +1 −1 ]
        [ +1 −1 +1 −1 +1 −1 +1 −1 ]
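The orthogonality of the BPSK-mapped code words can be verified directly: the matrix of pairwise inner products must be n times the identity (a quick check, not part of the notes):

```python
import numpy as np
from itertools import product

# Generator of the (8, 3) Walsh-Hadamard code from the slide
G = np.array([[0, 0, 1, 1, 1, 1, 0, 0],
              [0, 1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 0, 1, 0, 0, 1]])

# all 2^3 code words, BPSK-mapped (0 -> +1, 1 -> -1)
codewords = [(np.array(u) @ G) % 2 for u in product([0, 1], repeat=3)]
X = np.array([1 - 2 * c for c in codewords])

# pairwise inner products: n = 8 on the diagonal, 0 elsewhere -> orthogonal
assert np.array_equal(X @ X.T, 8 * np.eye(8, dtype=int))
print("all 8 BPSK code words are pairwise orthogonal")
```

The zero inner product between distinct rows follows from the constant pairwise Hamming distance 4 of the underlying code words: the inner product equals n − 2·d_H = 8 − 8 = 0.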
System Model with AWGN and Hard-Input Decoding
[Block diagram: digital source → u → channel encoder (Rc = k/n) → c → BPSK mapping → x; AWGN w is added; y → hard decision → c̃ → channel decoder → û → digital sink; the path from encoder output to decoder input acts as a BSC]

• Channel output is fed to a decision unit
• Decoder obtains hard inputs and shall correct wrong decisions
• Equivalent BSC from encoder output to decoder input
• Often used for algebraic decoding of block codes
Error detection with Syndrome
• Hard decision on y = x + w:

    x̂ = Q(y) ⇒ c̃ = (1 − x̂)/2 = c + e

• Calculation of syndrome s = [s0 ... sn−k−1]^T:

    s = c̃ · H^T mod q = (c + e) · H^T mod q = c · H^T mod q + e · H^T mod q = e · H^T mod q

  since c · H^T mod q = 0 for every code word.

→ The syndrome depends only on the error pattern, not on the transmitted code word

• Error detection:
  • s = 0 → no detectable error, i.e. either there is no error (e = 0) or the error pattern is a valid code word (e ∈ Γ \ {0})
  • s ≠ 0 → error detected
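All three detection cases can be reproduced with the (7, 4) Hamming code parity check matrix from the earlier slide (a numerical illustration, not from the notes):

```python
import numpy as np

# Parity check matrix of the (7, 4) Hamming code (from the earlier slide)
H = np.array([[0, 1, 1, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [1, 1, 0, 1, 0, 0, 1]])

def syndrome(c_tilde, H, q=2):
    """s = c_tilde * H^T mod q."""
    return (np.asarray(c_tilde) @ H.T) % q

c = np.array([1, 0, 1, 1, 0, 1, 0])        # a valid code word
assert not syndrome(c, H).any()             # s = 0: no detectable error

e = np.array([0, 0, 0, 0, 1, 0, 0])
assert syndrome((c + e) % 2, H).any()       # s != 0: error detected

# an error pattern that is itself a code word stays undetected
e_cw = np.array([1, 0, 0, 0, 0, 1, 1])      # a valid code word as error pattern
assert not syndrome((c + e_cw) % 2, H).any()
print("detection behaves as predicted by s = e * H^T")
```

The last case shows why s = 0 only means "no *detectable* error": code-word-valued error patterns are invisible to the syndrome.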
Error Correction by Syndrome Decoding
Situation and Initialization

• Not every error is correctable (only up to t = ⌊(dmin − 1)/2⌋ errors)

  → there are q^n − q^k error patterns e with nonzero syndrome, but only Σ_{r=1}^{t} C(n, r) · (q − 1)^r of them are correctable

  → every correctable error must have a unique syndrome (only q^{n−k} distinct syndromes exist)

• Initialization
  • Partition the set of all words GF(q)^n into cosets Mµ
  • Coset Mµ contains all words generating the same syndrome sµ
  • Determine the coset leader for each coset Mµ:

      eµ = argmin_{e ∈ Mµ} wH(e)

  • Store all syndromes sµ and corresponding coset leaders eµ in a table
Error Correction by Syndrome Decoding
Principle of Syndrome Decoding
• Syndrome decoding
  1. Compute the syndrome s = c̃ · H^T mod q
  2. If s ≠ 0, get the corresponding coset leader ê(s) from the table
  3. Correct the error by subtracting ê(s) from the received word:

      ĉML = c̃ − ê(s)

• Syndrome decoding fulfills the maximum likelihood principle and is the optimal hard-input decoding strategy

• Syndrome decoding is of little practical interest because storing syndromes and coset leaders becomes infeasible for long codes (memory constraints)
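The three steps above can be sketched for the (7, 4) Hamming code, where the syndrome table contains exactly the seven single-error coset leaders (an illustration, not from the notes):

```python
import numpy as np

# Syndrome decoding for the (7, 4) Hamming code (H from the earlier slide)
H = np.array([[0, 1, 1, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [1, 1, 0, 1, 0, 0, 1]])
n = 7

# build the syndrome -> coset leader table from all single-error patterns
table = {}
for i in range(n):
    e = np.zeros(n, dtype=int)
    e[i] = 1
    table[tuple((e @ H.T) % 2)] = e

def decode(c_tilde):
    """Steps 1-3 of the slide: syndrome, table lookup, error subtraction."""
    s = tuple((c_tilde @ H.T) % 2)
    if s == (0, 0, 0):
        return c_tilde                    # no detectable error
    return (c_tilde - table[s]) % 2       # subtract the coset leader

c = np.array([1, 0, 1, 1, 0, 1, 0])       # a valid code word
y = c.copy()
y[5] ^= 1                                  # single channel error
assert np.array_equal(decode(y), c)
print("single error corrected")
```

For this perfect code, the 7 single-error patterns plus e = 0 exhaust all 2^3 = 8 syndromes, so the table is complete; for longer codes the table (and the coset leader search) grows exponentially, which is the memory problem noted above.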
Syndrome Decoding for (7, 4) Hamming Code
• There exist only 2^{7−4} = 8 different syndromes, but 2^7 − 2^4 = 128 − 16 = 112 error patterns
• Due to dmin = 3, only t = 1 error is correctable
• Coset leaders are the 7 vectors with wH(eµ) = 1

        [ 0 1 1 1 1 0 0 ]
    H = [ 1 0 1 1 0 1 0 ]
        [ 1 1 0 1 0 0 1 ]

    syndrome   coset leader
    001        0000001
    010        0000010
    011        1000000
    100        0000100
    101        0100000
    110        0010000
    111        0001000
Limitations of Syndrome Decoding
• Storing syndromes / coset leaders infeasible for long codes

• Loss of approx. 2 dB due to hard decision prior to decoding

• Suboptimal but efficient decoding algorithms required


• Algebraic decoding exploits algebraic code structure
• Mathematically demanding approaches

• Very efficient BMD implementation for Reed-Solomon and BCH codes

• Non-algebraic decoding mainly follows intuition (not considered)


• Computational effort increases with number of errors to be corrected

• Graph-based decoding
• Iterative soft-in soft-out decoding algorithms (see LDPC codes)
