
CONVOLUTIONAL CODES
EC 627 CODING THEORY
Dr. Y. LEHWIJI
Presented by:
- RIMA HASSAN
- KHAWLA ASHERIF
- RABIA GETH
contents
• Introduction to convolutional codes
• Convolutional encoder
• Convolutional codes representation and explanation with examples
– Generator Representation
– State Diagram Representation
– Tree Diagram Representation
– Trellis Diagram Representation
• Decoding of convolutional codes
– Sequential decoding
– Viterbi algorithm
– Soft decision decoding
• Convolutional codes vs. Block codes
• Advantages of convolutional codes
• Practical examples of convolutional codes
FORWARD ERROR CORRECTION CODES
• There are four important forward error correction codes
that find applications in digital transmission. They are:
Block Parity
Hamming Code
Interleaved Code
Convolutional Codes
Introduction to convolutional codes

• Convolutional codes were introduced by Elias in 1955.
• Convolutional coding is a popular error-correcting coding method used to improve the reliability of communication systems.
• A message is convolved with the code's generator sequences and then transmitted over a noisy channel.
• Convolutional codes are error-correcting codes used to reliably transmit digital data over unreliable communication channels subject to channel noise.
Introduction to convolutional codes
(Cont’d)

 Convolutional codes map information bits to code bits not block by block, but by sequentially convolving the sequence of information bits according to some rule.
 Convolutional coding can be applied to a continuous data stream as well as to blocks of data, whereas block codes can be applied only to blocks of data.
Convolutional Encoder

Fig(a):Convolutional Encoder

• Fig(a) shows a convolutional encoder. Here L = 3 and ν = 2.


• The message bits are applied at the input of the shift register (SR). The coded
digit stream is obtained at the commutator output. The commutator samples the ν
modulo-2 adders in a sequence, once during each input-bit interval.
Convolutional Encoder (Cont’d)
Example: Assume that the input digits are 10101. Find the coded
sequence output for Fig(a).
• Initially, the shift register stages are s1 = s2 = s3 = 0.
• When the first message bit 1 enters the SR, s1 = 1, s2 = s3 = 0. Then ν1 = 1, ν2 = 1 and the coder output is 11.
• When the second message bit 0 enters the SR, s1 = 0, s2 = 1, s3 = 0. Then ν1 = 1, ν2 = 0 and the coder output is 10.
• When the third message bit 1 enters the SR, s1 = 1, s2 = 0, s3 = 1. Then ν1 = 0, ν2 = 0 and the coder output is 00.
• When the fourth message bit 0 enters the SR, s1 = 0, s2 = 1, s3 = 0. Then ν1 = 1, ν2 = 0 and the coder output is 10.
• When the fifth message bit 1 enters the SR, s1 = 1, s2 = 0, s3 = 1. Then ν1 = 0, ν2 = 0 and the coder output is 00.
• The coded output sequence is: 11 10 00 10 00 (a code sketch of this encoder follows below).
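A minimal Python sketch of this shift-register encoder, assuming the first adder (ν1) taps all three stages and the second (ν2) taps the first and third stage, which is the tap assignment consistent with the trace above; the function name is illustrative and not from the slides.

```python
def conv_encode_213(bits):
    """Rate-1/2 shift-register encoder sketch.

    Assumed taps (consistent with the worked trace above):
      v1 = s1 XOR s2 XOR s3,  v2 = s1 XOR s3.
    """
    s1 = s2 = s3 = 0            # shift register starts at all zeros
    out = []
    for b in bits:
        s1, s2, s3 = b, s1, s2  # shift the new message bit in
        v1 = s1 ^ s2 ^ s3       # first modulo-2 adder
        v2 = s1 ^ s3            # second modulo-2 adder
        out.extend([v1, v2])    # commutator samples v1 then v2
    return out

# Encoding the message 10101 from the example:
print(conv_encode_213([1, 0, 1, 0, 1]))
# -> [1, 1, 1, 0, 0, 0, 1, 0, 0, 0], i.e. 11 10 00 10 00
```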
Convolutional encoder (cont’d)
PARAMETERS OF A CONVOLUTIONAL ENCODER :
• Convolutional codes are commonly specified by three parameters: (n,k,m):
n = number of output bits
k = number of input bits
m = number of memory registers
• Code Rate: The quantity k/n is called the code rate. It is a measure of the efficiency of the code.
• Constraint Length: The quantity L (or K) is called the constraint length of the code. It represents the number of bits in the encoder memory that affect the generation of the n output bits. It is defined by:
• Constraint Length, L = k (m - 1)
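For the encoder of Fig(a), for instance, each k = 1 input bit produces n = 2 output bits using m = 3 memory registers, so the code rate is k/n = 1/2.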
ENCODER REPRESENTATIONS
• The encoder can be represented in several different but equivalent ways. They
are:
I. Generator Representation
II. State Diagram Representation
III. Tree Diagram Representation
IV. Trellis Diagram Representation
ENCODER REPRESENTATIONS (cont’d)

• Generator Representation:
• Generator representation shows the hardware connection of the shift register
taps to the modulo-2 adders. A generator vector represents the position of the
taps for an output. A “1” represents a connection and a “0” represents no
connection.
• A convolutional code may be defined by a set of n generating polynomials for
each input bit.
• For the circuit under consideration:
– g1(X) = 1 + X^2, i.e. [101]
– g2(X) = 1 + X + X^2, i.e. [111]
• The set {gi(X)} defines the code completely. The length of the shift register is one more than the highest degree among the generator polynomials (see the sketch below).
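As an illustrative sketch (not from the slides), each output stream can equivalently be obtained by multiplying the message polynomial by the corresponding generator polynomial over GF(2), which is just a binary convolution of the message with the tap vector:

```python
def poly_mul_gf2(u, g):
    """Multiply two GF(2) polynomials given as coefficient lists
    (lowest degree first); this is exactly a binary convolution."""
    out = [0] * (len(u) + len(g) - 1)
    for i, ub in enumerate(u):
        for j, gb in enumerate(g):
            out[i + j] ^= ub & gb
    return out

u  = [1, 0, 1, 0, 1]        # message 10101 from the earlier example
g1 = [1, 0, 1]              # g1(X) = 1 + X^2
g2 = [1, 1, 1]              # g2(X) = 1 + X + X^2
print(poly_mul_gf2(u, g1))  # -> [1, 0, 0, 0, 0, 0, 1]
print(poly_mul_gf2(u, g2))  # -> [1, 1, 0, 1, 0, 1, 1]
```

Interleaving the two streams bit by bit, taking the [111] stream first to match the earlier trace, reproduces the coded sequence 11 10 00 10 00 followed by the tail pairs 10 11 that appear when the register is flushed with zeros.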
Example
ENCODER REPRESENTATIONS (cont’d)

State Diagram Representation


In the state diagram, the state information of the encoder is shown
in the circles. Each new input information bit causes a transition
from one state to another.
Contents of the rightmost (K-1) shift register stages define the
states of the encoder. The transition of an encoder from one state to
another, as caused by input bits, is depicted in the state diagram.
ENCODER REPRESENTATIONS
(cont’d)
 The path information between the states, denoted as x/c, represents input
information bit x and output encoded bits c.
It is customary to begin convolutional encoding from the all zero state.
Example: State diagram representation of convolutional codes.
Fig(b): State diagram (encoder with input u, shift registers S1, S2 and outputs v1, v2). Here k = 1, n = 2, K = 3.
EXAMPLE: STATE DIAGRAM REPRESENTATION
(State diagram figure for the encoder of Fig(b); here k = 1, n = 2, K = 3.)
ENCODER REPRESENTATIONS
(cont’d)
• From the state diagram (a code sketch of these transitions follows the list below):
• Let 00 = State a; 01 = State b; 10 = State c; 11 = State d.
• (1) State a goes to State a when the input is 0 and the output is 00
• (2) State a goes to State b when the input is 1 and the output is 11
• (3) State b goes to State c when the input is 0 and the output is 10
• (4) State b goes to State d when the input is 1 and the output is 01
• (5) State c goes to State a when the input is 0 and the output is 11
• (6) State c goes to State b when the input is 1 and the output is 00
• (7) State d goes to State c when the input is 0 and the output is 01
• (8) State d goes to State d when the input is 1 and the output is 10
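A minimal sketch of the same state diagram as a lookup table in Python, built directly from transitions (1)–(8) above; encoding is then a walk through the table starting from state a (the all-zero state). Names are illustrative, not from the slides.

```python
# (current_state, input_bit) -> (next_state, output_bits)
STATE_DIAGRAM = {
    ("a", 0): ("a", "00"), ("a", 1): ("b", "11"),
    ("b", 0): ("c", "10"), ("b", 1): ("d", "01"),
    ("c", 0): ("a", "11"), ("c", 1): ("b", "00"),
    ("d", 0): ("c", "01"), ("d", 1): ("d", "10"),
}

def encode_via_state_diagram(bits, start="a"):
    """Encode by following the state diagram transition for each input bit."""
    state, out = start, []
    for b in bits:
        state, code = STATE_DIAGRAM[(state, b)]
        out.append(code)
    return " ".join(out)

# Matches the shift-register trace from the earlier example:
print(encode_via_state_diagram([1, 0, 1, 0, 1]))  # -> "11 10 00 10 00"
```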
(Figure: encoder with input u, shift registers S1, S2, S3 and outputs v1, v2, together with its state diagram.)
ENCODER REPRESENTATIONS
(cont’d)
Tree Diagram Representation:
 The tree diagram representation shows all possible information and encoded
sequences for the convolutional encoder.
 In the tree diagram, a solid line represents input information bit 0 and a dashed line
represents input information bit 1.
 The corresponding output encoded bits are shown on the branches of the tree.
 An input information sequence defines a specific path through the tree diagram from
left to right.
ENCODER REPRESENTATIONS
(cont’d)
• Example: Tree Diagram representation of convolutional codes
ENCODER REPRESENTATIONS
TRELLIS DIAGRAM REPRESENTATION:

• The trellis diagram is basically a redrawing of the state diagram in time.


• The X-axis is discrete time and all possible states are shown on the Y-axis.
• We move horizontally through the trellis with the passage of time.
ENCODER REPRESENTATIONS

STEPS TO DRAW TRELLIS DIAGRAM:


• The trellis diagram is drawn by lining up all the possible states (2^(K-1) of them, where K is the constraint length) on the vertical axis. Then we connect each state to the next state by the allowable codewords for that state.
• There are only two choices possible at each state. These are determined by the arrival of
either a 0 or a 1 bit.
• The arrows show the input bit and the output bits are shown in parentheses.
• The arrows going upwards represent a 0 bit and going downwards represent a 1 bit.
• The trellis diagram is unique to each code.
• We always start at the all-zeros state.
ENCODER REPRESENTATIONS
Example: How is the sequence 1011000 encoded using the trellis diagram of the (2,1,4) code? (A code sketch follows below.)
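A sketch of this encoding in Python. The tap vectors (1111 for v1 and 1101 for v2) are an assumption chosen to be consistent with the state table and the received sequence used in the Viterbi example later; the slides define this (2,1,4) encoder only through its diagrams.

```python
def conv_encode_214(bits):
    """(2,1,4) encoder sketch: 3 memory cells plus the current input bit.

    Assumed generators: v1 = u^s1^s2^s3 (1111), v2 = u^s1^s3 (1101).
    """
    s1 = s2 = s3 = 0
    pairs = []
    for u in bits:
        v1 = u ^ s1 ^ s2 ^ s3
        v2 = u ^ s1 ^ s3
        pairs.append(f"{v1}{v2}")
        s1, s2, s3 = u, s1, s2   # shift the input into the register
    return " ".join(pairs)

# 1011000: the last three 0s are the flush (tail) bits
print(conv_encode_214([1, 0, 1, 1, 0, 0, 0]))  # -> "11 11 01 11 01 01 11"
```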
DECODING OF CONVOLUTIONAL CODES

 There are several different approaches to decoding of convolutional


codes.
 These are grouped in two basic categories:
• A. Sequential decoding (Fano algorithm).
• B. Maximum likelihood decoding (Viterbi algorithm).
 Both methods represent different approaches to the same basic idea behind decoding.
DECODING OF CONVOLUTIONAL CODES

THE BASIC IDEA BEHIND DECODING:


• We know from the encoding process that each input bit sequence produces a unique output sequence, but the received sequence may be corrupted by channel errors.
• When the output bit sequence is received, we can decode it in one of two ways:
• We can compare the received sequence to all permissible sequences and pick the one with the smallest Hamming distance (hard decision decoding).
• We can compute a correlation and pick the sequence with the best correlation (soft decision decoding).
• The basic question behind decoding is: how can we decode the sequence without checking each and every one of these 2^L codewords? (A brute-force sketch of the exhaustive approach follows below.)
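A rough sketch of the brute-force version of this idea, reusing the hypothetical conv_encode_214 sketch from the trellis example: enumerate every permissible input sequence, re-encode it, and keep the codeword with the smallest Hamming distance to the received sequence. This is exactly the exhaustive search that sequential and Viterbi decoding are designed to avoid.

```python
from itertools import product

def hamming(a, b):
    """Hamming distance between two equal-length bit sequences."""
    return sum(x != y for x, y in zip(a, b))

def brute_force_decode(received, msg_len, encode):
    """Try all 2^msg_len input sequences and keep the closest codeword."""
    best_msg, best_dist = None, None
    for msg in product([0, 1], repeat=msg_len):
        code = encode(list(msg)).replace(" ", "")
        d = hamming(code, received.replace(" ", ""))
        if best_dist is None or d < best_dist:
            best_msg, best_dist = msg, d
    return best_msg, best_dist

# One bit error in the first pair; the closest codeword is still 1011000.
# Only 2^7 = 128 candidates here, but the count grows exponentially with L.
print(brute_force_decode("01 11 01 11 01 01 11", 7, conv_encode_214))
# -> ((1, 0, 1, 1, 0, 0, 0), 1)
```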
DECODING OF CONVOLUTIONAL CODES

SEQUENTIAL DECODING (FANO ALGORITHM)


 It was first proposed by Wozencraft, and later a better version was proposed by Fano.
 The decoder follows one path at a time. It may give up that path at any time and turn back to follow another path, but the important point is that only one path is followed at any one time.
 The decoder keeps a tally of its decisions. If the tally increases faster than some threshold value, the decoder gives up that path and retraces the path back to the last fork where the tally was below the threshold (a simplified sketch follows below).
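A much-simplified sketch of this search (not the full Fano algorithm): walk the code tree depth-first, trying the better-matching branch first, and back up whenever the accumulated disagreement with the received bits exceeds a threshold. The function name, the fixed threshold, and the transition-table argument are illustrative assumptions; hamming() is the helper from the previous sketch.

```python
def sequential_decode(received_pairs, trans, n_bits, threshold=2, state="000"):
    """Depth-first tree search with a discrepancy threshold.

    trans: dict mapping (state, input_bit) -> (next_state, output_pair).
    Returns the first full-length path whose accumulated disagreement with
    the received pairs never exceeds `threshold`, or None if none is found.
    """
    def search(state, depth, tally, path):
        if tally > threshold:          # tally grew too fast: give up this path
            return None
        if depth == n_bits:            # reached the end of the tree
            return path
        rx = received_pairs[depth]
        # try the branch that disagrees least with the received pair first
        branches = sorted((0, 1), key=lambda b: hamming(trans[(state, b)][1], rx))
        for b in branches:
            nxt, out = trans[(state, b)]
            result = search(nxt, depth + 1, tally + hamming(out, rx), path + [b])
            if result is not None:
                return result
        return None                    # both branches failed: retrace to the last fork

    return search(state, 0, 0, [])

# Usage sketch: supply a transition table built from the state table shown
# later, e.g. trans[("000", 1)] = ("100", "11"), trans[("100", 0)] = ("010", "11"), ...
# sequential_decode(["01","11","01","11","01","01","11"], trans, 7)
# should return [1, 0, 1, 1, 0, 0, 0] despite the single bit error.
```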
DECODING OF CONVOLUTIONAL CODES

• Example: Decode the following received sequence using sequential (Fano) decoding:
• 01 11 01 11 01 01 11
DECODING OF CONVOLUTIONAL CODES

From point 1 onward, all branches encountered match the received codeword bits exactly, and the decoder successfully decodes the message as 1011000.
DECODING OF CONVOLUTIONAL
CODES
• MAXIMUM LIKELIHOOD (M.L.) DECODING (VITERBI ALGORITHM):
• The Viterbi algorithm is named after Andrew Viterbi, who proposed it in 1967 as a
decoding algorithm for convolutional codes over noisy digital communication links.
• Viterbi decoding is the best-known implementation of maximum likelihood decoding.
• Viterbi algorithm is the optimum decoder, when the input sequence messages are
equally likely.
• It finds a path through trellis with the largest metric (maximum correlation or
minimum distance).
• Viterbi algorithm reduces the complexity of this search by finding the optimal path
one stage at a time.
• The complexity of the Viterbi algorithm is proportional to the number of states.
THE BASIC IDEA OF VITERBI ALGORITHM

• The Viterbi decoder examines an entire received sequence of a given length.


The decoder computes a metric for each path and makes a decision based on
this metric.
• All paths are followed until two paths converge on one node. Then the path with the better metric is kept and the one with the worse metric is discarded. The paths selected are called the survivors.
• For an N-bit sequence, the total number of possible received sequences is 2^N. Of these, only 2^(kL) are valid. The Viterbi algorithm applies the maximum-likelihood principle to limit the comparison to 2^(kL) surviving paths instead of checking all paths.
THE BASIC IDEA OF VITERBI
ALGORITHM
• The most common metric used is the Hamming distance between the received bits and the branch output bits. An alternative is the correlation metric, which is just the dot product between the received codeword and the allowable codeword. Other metrics are also used and we will talk about these later.
• These metrics are cumulative, so the path with the best total metric (largest correlation or smallest distance) is the final winner.
DECODING USING VITERBI
ALGORITHM
• Example: Let's do an example. The code is the (2,1,4) code for which we drew the encoder diagrams in the previous slides.
• Assume that the bit sequence 11 11 01 11 01 01 11 was received and no errors occurred.
• The original message for this bit sequence is 1011000, where the last 3 bits are the flush bits; their output is called the tail bits. (A decoder sketch follows after the state table below.)
Input state (s1 s2 s3) | Received bits (R1 R2) | Decoded bit (D1) | Next state (s1 s2 s3)
000 | 00 | 0 | 000
000 | 11 | 1 | 100
001 | 11 | 0 | 000
001 | 00 | 1 | 100
010 | 10 | 0 | 001
010 | 01 | 1 | 101
011 | 01 | 0 | 001
011 | 10 | 1 | 101
100 | 11 | 0 | 010
100 | 00 | 1 | 110
101 | 00 | 0 | 010
101 | 11 | 1 | 110
110 | 01 | 0 | 011
110 | 10 | 1 | 111
111 | 10 | 0 | 011
111 | 01 | 1 | 111
(Decoding trace: the incoming pairs 11 11 01 11 01 01 11 are matched, one pair per step, against the state table above to recover the decoded bits.)
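A compact hard-decision Viterbi sketch in Python for this example, keeping one minimum-Hamming-distance survivor per state at every stage. The branch table is rebuilt from the assumed generators 1111 and 1101, which match the state table above; all names are illustrative.

```python
def build_trellis_214():
    """Branch table for the assumed (2,1,4) code:
    (state, input_bit) -> (next_state, output_pair)."""
    table = {}
    for s in range(8):                      # state = contents of s1 s2 s3
        s1, s2, s3 = (s >> 2) & 1, (s >> 1) & 1, s & 1
        for u in (0, 1):
            v1 = u ^ s1 ^ s2 ^ s3           # assumed generator 1111
            v2 = u ^ s1 ^ s3                # assumed generator 1101
            nxt = (u << 2) | (s1 << 1) | s2 # shift u into the register
            table[(s, u)] = (nxt, (v1, v2))
    return table

def viterbi_decode(received, table, n_states=8):
    """Hard-decision Viterbi: keep one survivor per state per stage."""
    INF = float("inf")
    metric = [0] + [INF] * (n_states - 1)   # start in the all-zero state
    paths = [[] for _ in range(n_states)]
    for r1, r2 in received:
        new_metric = [INF] * n_states
        new_paths = [None] * n_states
        for s in range(n_states):
            if metric[s] == INF:
                continue
            for u in (0, 1):
                nxt, (v1, v2) = table[(s, u)]
                m = metric[s] + (v1 != r1) + (v2 != r2)
                if m < new_metric[nxt]:     # keep the better survivor
                    new_metric[nxt] = m
                    new_paths[nxt] = paths[s] + [u]
        metric, paths = new_metric, new_paths
    best = min(range(n_states), key=lambda s: metric[s])
    return paths[best], metric[best]

table = build_trellis_214()
rx = [(1,1), (1,1), (0,1), (1,1), (0,1), (0,1), (1,1)]   # 11 11 01 11 01 01 11
print(viterbi_decode(rx, table))                         # -> ([1, 0, 1, 1, 0, 0, 0], 0)
```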
ERROR CORRECTION USING VITERBI
ALGORITHM
• Example: Let's redo the same example as before.
• Assume that bit sequence 1011000 was sent.
• If no errors occurred, we would receive: 11 11 01 11 01 01 11
• But let's say we received instead : 01 11 01 11 01 01 11.
One error has occurred. The first bit has been received as 0.
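Continuing the sketch from the previous section, feeding the corrupted sequence into the hypothetical viterbi_decode should recover the original message, with the single flipped bit contributing one unit to the winning path metric:

```python
rx_err = [(0,1), (1,1), (0,1), (1,1), (0,1), (0,1), (1,1)]   # 01 11 01 11 01 01 11
print(viterbi_decode(rx_err, build_trellis_214()))           # -> ([1, 0, 1, 1, 0, 0, 0], 1)
```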
ERROR CORRECTION USING VITERBI
ALGORITHM
• Example: Let's do another example like the last one. The code is (2,1,3), and we will use the number of bit errors (the Hamming distance) as the metric. The metrics are cumulative, so the path with the smallest total metric is the final winner.
• Assume that bit sequence 010110 was sent.
• If no errors occurred, we would receive: 00 11 01 00 10 10
• But let's say we received instead : 10 11 01 00 10 10.
• One error has occurred: the first bit has been received as 1.
THE MESSAGE IS: (010110)
VITERBI DECODING ALGORITHM: HARD vs. SOFT DECISION
• Hard decision: the received sequence is a stream of 0's and 1's (e.g. 01001101…).
• Soft decision: the received sequence is a stream of voltages (e.g. 0.1, -1.1, 0.3, 2.1, …).
VITERBI SOFT DECISION DECODING
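The slides do not spell out the soft-decision metric; a common choice, shown here as an assumption, is to map code bits to ±1 and use the correlation with the received voltages as the branch metric, accumulating it along each path and keeping the largest total.

```python
def soft_branch_metric(output_bits, received_voltages):
    """Correlation branch metric for soft-decision Viterbi (assumed
    mapping: code bit 0 -> -1.0, code bit 1 -> +1.0). Larger is better."""
    return sum((2 * b - 1) * v for b, v in zip(output_bits, received_voltages))

# Branch output 11 against received voltages (0.1, -1.1): 0.1 - 1.1
print(soft_branch_metric((1, 1), (0.1, -1.1)))   # -> -1.0 (up to rounding)
```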
CONVOLUTIONAL CODING VS. BLOCK CODING
Convolutional codes:
• The information bits are spread along the sequence.
• They operate on serial data, one or a few bits at a time.
• There is some 'memory' that remembers the stream of bits that flow by.
• They accept a continuous stream of bits and map it into an output stream, introducing redundancy in the process.
• Useful for real-time applications.
Block codes:
• The information bits are followed by the parity bits.
• They operate on relatively large message blocks (typically up to a couple of hundred bytes).
• Block coding is memoryless (no data dependency between blocks).
• The block coder outputs a unique n-bit data block, so block codes map individual blocks of bits into blocks of codewords.
• Useful for data communication.
ADVANTAGES OF CONVOLUTIONAL
CODES
• Convolution coding is a popular error-correcting coding method used
in digital communications.
• The convolution operation encodes some redundant information into the transmitted signal, thereby improving the error-correcting capability of the transmission.
• Convolution Encoding with Viterbi decoding is a powerful FEC
technique that is particularly suited to a channel in which the
transmitted signal is corrupted mainly by AWGN.
• It is simple and has good performance with low implementation cost.
PRACTICAL EXAMPLE OF CONVOLUTIONAL
CODES
• NASA uses a standard r=1/2,K=7 convolutional code.
• IS-54/136 TDMA Cellular Standard uses a r=1/2,K=6 convolutional
code.
• GSM Cellular Standard uses a r=1/2,K=5 convolutional code.
• IS-95 CDMA Cellular Standard uses a r=1/2, K=9 convolutional code for the forward channel and a r=1/3, K=9 convolutional code for the reverse channel.
• The Galileo space probe used a constraint length K=15 convolutional code.
Thanks…
