
ERROR CONTROL CODING

EL5110 Information Theory & Coding


By
Ir. Emir Mauludi Husni, M.Sc., Ph.D.
School of Electrical Engineering & Informatics,
Institut Teknologi Bandung



Definitions
Coding:
Mapping of information to a set of symbols
Source Coding:
To compress information
Encryption encoding (cryptography):
To protect information from misuse

Error control coding (channel coding):
To make information immune from random distortion.

Improvements obtained
Increase the operational range of a communications system
Reduce the error rate
Reduce the transmitted power requirements
Combinations of these benefits

Error Control Schemes
Forward error correction:
A code capable of detecting and correcting errors is applied
to the messages.
● Maximum use of received information
● FEC may fail
● Failure may increase subjective or objective effects

Retransmission error control:
To detect errors and request retransmission.
● Error detection is reliable
● Channel capacity reduced
● Variable delays restrict applicability

Coding in Communication Systems

Vector = sequence (c)
Vector coordinate = symbol (cj)

Encoder
The information is formed into frames, each frame consisting of a fixed number of symbols (1 symbol = 1 bit or several bits).
The output generally contains more symbols than the input, i.e. redundancy has been added.
Code rate, R = k/n
An (n, k) block code uses only the current frame to produce its output.
An (n0, k0) tree code (a member of the convolutional codes) uses previous frames in its algorithm.

The channel (1)
The transmission medium introduces a number of effects:
attenuation, distortion, interference and noise, which make it
uncertain whether the information will be received correctly.
The way in which the transmitted symbols are corrupted:
Memoryless channel – the probability of error is independent from
one symbol to the next. Gives random errors.
Additive white Gaussian noise (AWGN) channel – a memoryless
channel in which the transmitted signal suffers the addition of
wide-band noise whose amplitude is a normally (Gaussian)
distributed random variable. Gives random errors.
Bursty channel – the errors are characterized by periods of relatively
high symbol error rate separated by periods of relatively low, or zero,
error rate. This usually happens in mobile environments; such
channels have memory. Gives bursty errors.

The Channel (2)
(continued):
Symmetric channel – the probability of a transmitted symbol valued x
being received as a value y is the same as that of a transmitted
symbol valued y being received as x, for all values of x and y.
A commonly encountered example is the binary symmetric channel
(BSC) with a probability p of bit error.
Compound (or diffuse) channel – the errors consist of a mixture of
bursts and random errors. In reality all channels exhibit some form of
compound characteristics.

Decoding
The job is to decide what the transmitted information was.
This is possible because only certain transmitted sequences, the
codewords, are possible (2^k of them), and any errors are likely to
result in reception of a non-code sequence.
On a memoryless channel, the best strategy is to select the
codeword that is closest to the received sequence (as sketched below).
This is called minimum distance decoding; the measure of difference
between sequences is known as the distance.
When a received word is decoded, the output is correct or false,
or no decision can be made.
These outcomes may be defined as correct decoding, false decoding
and decoding failure respectively.
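
A minimal sketch of minimum distance decoding in Python (the length-3 repetition code in the example is just for illustration):

```python
def min_distance_decode(received, codewords):
    """Pick the codeword at minimum Hamming distance from the received word."""
    return min(codewords,
               key=lambda c: sum(x != y for x, y in zip(c, received)))

# Example: length-3 repetition code, one bit error corrected.
print(min_distance_decode("010", ["000", "111"]))  # 000
```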

Modulo Calculation
Defined for the term a = cb + d, such that
a = d mod b,
where: a, c ∈ integers,
b ∈ natural numbers,
d ∈ {0, 1, …, b−1} (the remainder).

Example: 71 = 1 mod 7, -13 = 2 mod 5, 30 = 6 mod 8
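
As a quick check, Python's % operator returns the remainder d with 0 ≤ d < b for positive b, matching this definition:

```python
# Python's % returns the nonnegative remainder for a positive modulus,
# so it agrees with the definition a = c*b + d, 0 <= d < b.
for a, b in [(71, 7), (-13, 5), (30, 8)]:
    print(f"{a} = {a % b} mod {b}")
# 71 = 1 mod 7, -13 = 2 mod 5, 30 = 6 mod 8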

Arithmetic of Modulo-2
Addition modulo-2 (Exclusive OR)
0+0=0
0+1=1
1+0=1
1+1=0
Multiplication (AND)
0•0=0
0•1=0
1•0=0
1•1=1
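
In code these are simply the bitwise XOR and AND operators; a one-screen check in Python:

```python
# Modulo-2 addition is exclusive OR (^); multiplication is AND (&).
for a in (0, 1):
    for b in (0, 1):
        print(f"{a}+{b}={a ^ b}  {a}.{b}={a & b}")
```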

Single parity check code, PC code
A single PC code of length n = k+1 maps the k information bits
i0, i1, …, ik-1
to a codeword
c0, c1, …, cn-1
by equating
c0 = i0, c1 = i1, …, ck-1 = ik-1
The last symbol (coordinate) of the codeword, cn-1, satisfies the equation:
cn-1 = i0 + i1 + … + ik-1 (addition modulo-2)

Example (k = 2):
i0  i1  c2
0   0   0
0   1   1
1   0   1
1   1   0
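
A minimal sketch of the encoder in Python (the name pc_encode is illustrative, not from any library):

```python
def pc_encode(info_bits):
    """Append the modulo-2 sum of the k information bits as the parity symbol."""
    parity = 0
    for bit in info_bits:
        parity ^= bit          # addition modulo-2
    return info_bits + [parity]

print(pc_encode([0, 1]))       # [0, 1, 1] -- matches the k = 2 table above
print(pc_encode([1, 0, 1]))    # [1, 0, 1, 0]
```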

Repetition Code
A binary repetition code of length n consists of
two codewords:

The all-zero word c0 = c1 = … = cn-1 = 0

The all-one word c0 = c1 = … = cn-1 = 1
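
A sketch of the encoder and its majority-vote decoder (function names are illustrative):

```python
def rep_encode(bit, n):
    """Repeat the single information bit n times."""
    return [bit] * n

def rep_decode(received):
    """Majority vote: corrects up to (n-1)//2 bit errors for odd n."""
    return 1 if sum(received) * 2 > len(received) else 0

print(rep_encode(1, 5))             # [1, 1, 1, 1, 1]
print(rep_decode([1, 0, 1, 1, 0]))  # 1 -- two errors corrected
```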

Hamming Metric (1)
Hamming weight
The number of non-zero vector coordinates.
Hamming distance
The distance between two vectors a and c is the number of
coordinates where a and c differ.
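
Both quantities are one-liners in Python; a sketch:

```python
def hamming_weight(v):
    """Number of non-zero coordinates of the vector v."""
    return sum(1 for s in v if s != 0)

def hamming_distance(a, c):
    """Number of coordinates where a and c differ."""
    return sum(1 for x, y in zip(a, c) if x != y)

print(hamming_weight([1, 0, 1, 1]))                  # 3
print(hamming_distance([1, 0, 1, 1], [1, 1, 1, 0]))  # 2
```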

Hamming Metric (2)
Weight enumeration
The j-th coordinate wj of the weight enumeration vector W =
(w0, w1,…,wn) of a code C of length n is defined as the number
of codewords with weight j.

Example:
The weight enumeration vector of the single parity check code of length n = 3 is
W = (1, 0, 3, 0)
The weight enumeration vector of the repetition code of length n is
w0 = 1, wn = 1, wj = 0 for j = 1, …, n-1
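
A brute-force sketch that reproduces the first example, building the length-3 PC code from its definition:

```python
from itertools import product

def weight_enumeration(codewords, n):
    """wj = number of codewords of weight j, for j = 0..n."""
    W = [0] * (n + 1)
    for c in codewords:
        W[sum(c)] += 1
    return W

# Single parity check code of length 3 = all even-weight binary words.
pc3 = [c for c in product((0, 1), repeat=3) if sum(c) % 2 == 0]
print(weight_enumeration(pc3, 3))   # [1, 0, 3, 0]
```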

Decoding Metric (1)
We need to find the maximum value of p(c|r), the probability that
c was the transmitted sequence given that r was received. But
p(c|r) = p(c,r)/p(r) = p(r|c)·p(c)/p(r)
where p(c,r) is the probability that c is transmitted and r is received.
Thus if the transmitted sequences are equiprobable, we find the
transmitted sequence that maximizes the probability p(r|c) of giving
rise to the received sequence.

Decoding Metric (2)
If the received sequence has n symbols and the symbol errors are
independent, then
p(r|c) = p(r0|c0)·p(r1|c1)·…·p(rn-1|cn-1)

Decoding Metric (3)
Now, define a metric for any symbol, transmitted as x and
received as y, as
m(x, y) = A + B·log p(y|x)
The constants A and B do not affect the ranking of metrics for
different paths of the same length, and can be chosen to have
any convenient values.
Over the BSC, for example, we could make the metric be
0 for x = y and 1 otherwise.
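
To see that A and B can force the BSC metric to exactly 0 and 1, solve A + B·log(1−p) = 0 and A + B·log p = 1; a numerical check (the crossover probability p = 0.1 is arbitrary):

```python
import math

p = 0.1                                  # assumed BSC crossover probability
B = 1 / (math.log(p) - math.log(1 - p))  # makes unequal symbols score 1
A = -B * math.log(1 - p)                 # makes equal symbols score 0

def metric(x, y):
    return A + B * math.log(p if x != y else 1 - p)

print(round(metric(0, 0), 6), round(metric(0, 1), 6))  # 0.0 1.0
```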

Soft-Decision Metrics for a Gaussian Channel (1)
To cover soft decision, we need to know the transition
probabilities between the transmitted and received values.
The optimum metric will vary according to the signal-to-noise ratio
at the receiver. A fixed scheme will often provide an adequate
compromise over a wide range of operating conditions.
Due to noise the analog detected signal is difficult to use directly,
but we can quantize the output to several levels. In practice it is
found that eight levels give adequately good performance.
Set the thresholds for the detected signal levels at intervals of
2√Es/7 (a quantizer sketch follows below).
The demodulator then passes a 3-bit value to the decoder.
The question: what measure of distance to attach when comparing
one of these 3-bit values with 0 or 1.
Distances 0, 1, 2, …, 7 from the code symbol 0 or 1 turn out to be
not quite right.
There is a close-to-optimum measurement of distance for
soft-decision metrics.
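
A sketch of such an 8-level quantizer, assuming thresholds at the seven multiples k·2√Es/7, k = −3, …, +3; the mapping of regions to the values 0–7 is one plausible convention, not the only one:

```python
import math

def quantize8(y, Es=1.0):
    """Map a detected value y to a 3-bit level 0..7.
    Thresholds are spaced 2*sqrt(Es)/7 apart, centred on zero."""
    step = 2 * math.sqrt(Es) / 7
    level = int(math.floor(y / step)) + 4   # region index relative to centre
    return min(max(level, 0), 7)            # clamp the two outer regions

for y in (-1.0, -0.2, 0.1, 0.9):
    print(y, "->", quantize8(y))   # 0, 3, 4, 7
```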

Soft-Decision Metrics for a Gaussian Channel (2)

Soft-Decision Metrics for a Gaussian Channel (3)
Log-likelihood ratio:
L(y) = ln[ p(y | x = +√Es) / p(y | x = −√Es) ]
The values of x for bits 1 and 0 are +√Es and −√Es.
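
For Gaussian noise of variance N0/2 the two densities simplify, giving the closed form L(y) = 4√Es·y/N0; a sketch (the operating point Es/No = 2 dB echoes the next slides):

```python
import math

def llr(y, Es, N0):
    """ln[p(y|x=+sqrt(Es)) / p(y|x=-sqrt(Es))] for Gaussian noise of
    variance N0/2; the densities reduce to 4*sqrt(Es)*y/N0."""
    return 4 * math.sqrt(Es) * y / N0

N0 = 1 / 10 ** 0.2                      # Es/No = 2 dB with Es = 1
print(round(llr(0.5, 1.0, N0), 2))      # positive -> bit 1 more likely
```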

Soft-Decision Metrics for a Gaussian Channel (4)
For Es/No = 2 dB

Soft-Decision Metrics for a Gaussian Channel (5)
The logarithmic probability that a sequence c of length n was
transmitted, given the received sequence r, is
log p(c|r) = Σi log p(ri|ci) + constant (for equiprobable code sequences)
We wish to maximize this value over all possible code sequences.
So to decode, we can maximize the correlation
Σi ci·ri
where ci is taken as having values ±1.

Soft-Decision Metrics for a Gaussian Channel (6)
Correlations for the soft-decision sequence r = (+2.5, +0.5, +1.5, +0.5, +3.5):

Codeword          Correlation
-1 -1 -1 -1 -1    -8.5
-1 +1 -1 +1 +1    +0.5
+1 -1 +1 -1 +1    +6.5
+1 +1 +1 +1 -1    +1.5
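
The table can be reproduced directly; a minimal sketch:

```python
r = [+2.5, +0.5, +1.5, +0.5, +3.5]          # received soft-decision sequence
codewords = [
    [-1, -1, -1, -1, -1],
    [-1, +1, -1, +1, +1],
    [+1, -1, +1, -1, +1],
    [+1, +1, +1, +1, -1],
]
for c in codewords:
    corr = sum(ci * ri for ci, ri in zip(c, r))
    print(c, corr)

# The decoder selects the codeword with the largest correlation,
# here [+1, -1, +1, -1, +1] with +6.5.
best = max(codewords, key=lambda c: sum(ci * ri for ci, ri in zip(c, r)))
print(best)
```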
Coding Gain with Hard and Soft Decision Demodulation
Hard decision gives an asymptotic coding gain of
10 log10[R(d+1)/2] dB = 10 log10[R(t+1)] dB, where d is the minimum
distance between code sequences (a worked example follows below)
Ideal soft decision uses the unquantized (analog) demodulator output
Unquantized soft decision gives an asymptotic coding gain of
10 log10[Rd] dB
Typically 2–3 dB better than hard decision
Quantization to 8 levels gives a close approximation to the
analog output
Reduces the coding gain by about 0.25 dB
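
Plugging numbers into the two formulas, e.g. for the (7,4), dmin = 3 block code shown on a later slide (a worked check, not a recommendation):

```python
import math

def acg_hard_dB(R, d):
    """Asymptotic coding gain, hard decision: 10*log10[R(d+1)/2]."""
    return 10 * math.log10(R * (d + 1) / 2)

def acg_soft_dB(R, d):
    """Asymptotic coding gain, unquantized soft decision: 10*log10[Rd]."""
    return 10 * math.log10(R * d)

R, d = 4 / 7, 3   # the (7,4), dmin = 3 block code shown later
print(f"hard: {acg_hard_dB(R, d):.2f} dB")   # 0.58 dB
print(f"soft: {acg_soft_dB(R, d):.2f} dB")   # 2.34 dB
```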

Linear Systems

Linear Codes
Slightly less restrictive than the linear systems definition
Not concerned with the mapping from information to code
Linear systems are used in practice to encode
The sum of 2 codewords gives a codeword
The product of a codeword and a scalar factor gives a codeword
The all-zero sequence is a codeword
The distance structure appears the same viewed from any codeword
Minimum distance is equal to the weight of the minimum-weight
(non-zero) codeword

Example of a Block Code
The mapping of information symbols into a codeword is systematic
if the k information symbols are unchanged within the n codeword
coordinates.
Minimum distance of this code, dmin = 3.

Systematic code (k = 4, n = 7):
Information   Codeword
0000          0000000
1000          1000110
0100          0100101
1100          1100011
0010          0010011
1010          1010101
0110          0110110
1110          1110000
0001          0001111
1001          1001001
0101          0101010
1101          1101100
0011          0011100
1011          1011010
0111          0111001
1111          1111111
Minimum (Hamming) Distance
The minimum (Hamming) distance dmin of a code C is the minimum
distance between two different codewords.
For linear codes, the sum of 2 codewords gives a codeword, so the
minimum distance is equal to the minimum weight.

Single Error Correction
(Transmitted codeword = 1100011)
Codeword   Distance from 1000011   Distance from 1100111
0000000    3                       5
1000110    2                       2
0100101    4                       2
1100011    1                       1
0010011    2                       4
1010101    3                       3
0110110    5                       3
1110000    4                       4
0001111    3                       3
1001001    2                       4
0101010    4                       4
1101100    5                       3
0011100    6                       6
1011010    3                       5
0111001    5                       5
1111111    4                       2
Double-Error Correction?
(Transmitted codeword = 1100011)
Received word = 1101001

Codeword   Distance
0000000    4
1000110    5
0100101    3
1100011    2
0010011    5
1010101    4
0110110    6
1110000    3
0001111    4
1001001    1
0101010    3
1101100    2
0011100    5
1011010    4
0111001    2
1111111    3
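
Both this table and the previous one can be generated, and the decoding outcomes confirmed, with a short script over the 16 codewords:

```python
codewords = [
    "0000000", "1000110", "0100101", "1100011",
    "0010011", "1010101", "0110110", "1110000",
    "0001111", "1001001", "0101010", "1101100",
    "0011100", "1011010", "0111001", "1111111",
]

def dist(a, b):
    return sum(x != y for x, y in zip(a, b))

for received in ("1000011", "1100111", "1101001"):
    best = min(codewords, key=lambda c: dist(c, received))
    print(received, "->", best, f"(distance {dist(best, received)})")

# 1000011 -> 1100011 and 1100111 -> 1100011: single errors corrected.
# 1101001 -> 1001001: two errors cause false decoding, since dmin = 3
# only guarantees t = 1.
```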
Random Error Detection and Correction Capability of Block Codes
dmin > s + t
s is the number of errors detected
t is the number of errors corrected (t ≤ s)
Example, dmin = 5:
● s = 4, t = 0
● s = 3, t = 1
● s = 2, t = 2
t = s gives maximum correction

Coding Gain (1)

Coding Gain (2)
Eb is energy per bit of information
N0 is the single-sided noise power spectral density
Coding increases the required Eb/N0 by 10 log10(1/R) dB
Coding changes the BER
If the same BER is achieved with a lower Eb/N0 after coding,
then a coding gain results
Coding gain is a function of BER
It approaches an asymptotic (maximum) value
Coding gain can be negative

Information Theory (1)
The pioneering work of Claude Shannon in the late 1940s proved
that signaling schemes exist such that error-free transmission can
be achieved at any rate lower than the channel capacity
(error-free capacity, C (bits/s))
For the AWGN channel (a numerical sketch follows below):
C = B log2(1 + (Eb/No)·(R/B))
B is bandwidth, R is data rate (bits/s), Eb is energy per bit of the
received signal and No is the single-sided noise power
spectral density
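
A numerical sketch of the capacity formula (the operating point B = 1 MHz, Eb/No = 2 dB, R = 1 Mbit/s is an arbitrary example):

```python
import math

def capacity(B, EbNo_dB, R):
    """AWGN error-free capacity C = B log2(1 + (Eb/No)(R/B)) in bits/s."""
    EbNo = 10 ** (EbNo_dB / 10)
    return B * math.log2(1 + EbNo * R / B)

print(f"C = {capacity(B=1e6, EbNo_dB=2.0, R=1e6):.3e} bits/s")

# Shannon limit: setting R = C and letting R/B -> 0 gives Eb/No = ln 2.
print(f"{10 * math.log10(math.log(2)):.2f} dB")   # -1.59 dB
```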

Information Theory (2)
C = B log2(1 + (Eb/No)·(R/B))
Setting R = C and letting R/B → 0 (i.e., infinite bandwidth) gives
Eb/No = ln 2 = −1.6 dB
At any Eb/No greater than −1.6 dB, zero probability of making a
transmission error is possible, at the expense of infinite
transmission bandwidth
Reliable communication cannot be achieved in practice in these
conditions
All known codes are worse than average (Shannon's proof averages
over randomly chosen codes, and that average achieves capacity)

Coding Choices
Convolutional codes are good for random errors at moderate required
BER (10⁻⁵ to 10⁻⁷)
Soft decision decoding gives ~5 dB coding gain
Can interleave in bursty conditions
Mainly rate ½
Block codes are good for low BER (< 10⁻⁸)
Usually only hard decision decoding
High asymptotic coding gains
Very low BER is needed to approach the asymptotic gain
Binary BCH codes or the Golay code for random errors
RS codes for bursty conditions

