
ECE4007 Information Theory and Coding
L T P J C: 3 0 0 4 4
Version: 1.00
Pre-requisite: ECE4001 Digital Communication Systems
Objectives:
Describe and analyze the information source and channel capacity
Differentiate between uniform and non-uniform quantization
Analyze source coding techniques such as Shannon-Fano encoding, Huffman coding, and
arithmetic coding
Apply statistical techniques for signal detection
Construct various channel coding schemes such as block codes, cyclic codes and convolutional
codes
Outcomes:
1. Apply mathematical models that describe the behavior of an information source, channel
capacity, and the performance of source coding and channel coding techniques
2. Solve mathematical problems in source coding and channel coding techniques and implement
them in MATLAB.
Student Learning Outcomes (SLO): 1,2,13,18

Module:1 Introduction: 4 hours SLO: 1,2
Review of Probability Theory, Introduction to information theory

Module:2 Entropy: 6 hours SLO: 1,2
Uncertainty, self-information, average information, mutual information and their properties - Entropy
and information rate of Markov sources - Information measures of continuous random variables.

Module:3 Channel Models and Capacity: 5 hours SLO: 1,2
Importance and types of various channel models - Channel capacity calculation: binary symmetric
channel, binary erasure channel - Shannon's channel capacity and channel coding theorem - Shannon's
limit.

Module:4 Source Coding I: 6 hours SLO: 1,18
Source coding theorem - Huffman coding - Non-binary Huffman codes - Adaptive Huffman coding -
Shannon-Fano-Elias coding - Non-binary Shannon-Fano codes.

Module:5 Source Coding II: 6 hours SLO: 1,18
Arithmetic coding - Lempel-Ziv coding - Run-length encoding and rate distortion function - Overview of
transform coding.

Module:6 Channel Coding I: 8 hours SLO: 1,18
Introduction to error control codes - Block codes, linear block codes, cyclic codes and their properties -
Encoder and decoder design - Serial and parallel concatenated block codes - Convolutional codes:
properties, encoder, tree diagram, trellis diagram, state diagram, transfer function of convolutional
codes - Viterbi decoding - Trellis coding - Reed-Solomon codes.
Module:7 Channel Coding II: 8 hours SLO: 1,18
Serial and parallel concatenated convolutional codes - Block and convolutional interleavers - Turbo
encoder - Iterative turbo decoder - Trellis-coded modulation and set partitioning - LDPC codes.

Module:8 Contemporary Issues: 2 hours SLO: 2

Total Lecture Hours (Inclusive of Tutorial): 45 hours
Text Books:
1. Simon Haykin, Communication Systems, Wiley, 4th Edition, 2013.
2. Ranjan Bose, Information Theory, Coding and Cryptography, Tata McGraw-Hill, Eighteenth
Reprint, June 2015.

Reference Books:
1. John G. Proakis, Digital Communications, McGraw-Hill, 5th Edition, 2014.
2. Bernard Sklar, Digital Communications: Fundamentals and Applications, Pearson Education, 2nd Edition,
2009.
3. Khalid Sayood, Introduction to Data Compression, Elsevier, 4th Edition, 2015.

Typical Projects SLO: 13

1. Efficient image compression technique using a modified SPIHT algorithm
2. Develop compression algorithms using the Discrete Wavelet Transform
3. Compress and decompress an image using modified Huffman coding
4. Apply run-length coding and the Huffman encoding algorithm to compress an image
5. Adaptive Huffman coding of 2D DCT coefficients for image compression
6. Compression of an image using a chaotic map and arithmetic coding
7. Region-of-interest based lossless medical image compression
8. Write a code to build the (3, 1, 3) repetition encoder. Map the encoder output to BPSK symbols and
transmit the symbols through an AWGN channel. Investigate the error correction capability of the (3, 1, 3)
repetition code by comparing its BER performance with that of uncoded transmission (a simulation sketch
is given after this list).
9. Write a code to compare the BER performance and error correction capability of the (3, 1, 3) and (5, 1, 5)
repetition codes. Assume BPSK modulation and an AWGN channel. Also compare the simulated results
with the theoretical results.
10. Write a code to compare the performance of hard-decision and soft-decision Viterbi decoding
algorithms. Assume BPSK modulation and an AWGN channel.
11. Write a code to build an (8, 4, 3) block encoder and decoder. Compare the BER performance of the
(8, 4, 3) block code with that of the (3, 1, 3) repetition code. Assume BPSK modulation and an AWGN
channel (a sketch of the encoder and syndrome decoder is given after this list).
12. Consider the following Extended Vehicular A (EVA) channel power delay profile. Write a code to model
the given profile and estimate the channel capacity. Compare the obtained capacity with that of a
non-fading (AWGN) channel (a modeling sketch is given after this list).

Delay (ns)    Power (dB)
0              0.0
30            -1.5
150           -1.4
310           -3.6
370           -0.6
710           -9.1
1090          -7.0
1730          -12.0
2510          -16.9
13. Performance analysis of various channels (BSC, BEC, noiseless, lossless) under AWGN.
14. FPGA implementation of linear block coding and syndrome decoding.
15. Performance of linear block codes under single errors and burst errors.
16. Performance analysis of convolutional codes under single errors and burst errors.
17. Implementation of Viterbi decoding on FPGA.
18. Efficiency comparison of different interleavers for the turbo encoder.
19. Implementation of a trellis-coded modulator on FPGA.
20. Developing compression algorithms for wireless multimedia sensor networks.
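
The following Python/NumPy sketch illustrates projects 8 and 9 (the syllabus asks for a MATLAB
implementation; the structure translates directly). It assumes hard-decision majority-logic decoding, and
the bit count, random seed, and Eb/N0 sweep are illustrative choices, not values fixed by the syllabus.

```python
import numpy as np

rng = np.random.default_rng(0)

def repetition_ber(n_rep, ebn0_db, n_bits=100_000):
    """BER of an (n, 1, n) repetition code with BPSK over AWGN (hard decisions)."""
    bits = rng.integers(0, 2, n_bits)
    coded = np.repeat(bits, n_rep)                 # repetition encoder
    symbols = 1.0 - 2.0 * coded                    # BPSK mapping: 0 -> +1, 1 -> -1
    ebn0 = 10 ** (ebn0_db / 10)
    sigma = np.sqrt(n_rep / (2 * ebn0))            # energy per coded symbol is Eb / n_rep
    received = symbols + sigma * rng.standard_normal(symbols.size)
    hard = (received < 0).astype(int).reshape(-1, n_rep)
    decoded = (hard.sum(axis=1) > n_rep // 2).astype(int)   # majority-logic decoding
    return np.mean(decoded != bits)

for ebn0_db in range(0, 9, 2):
    print(f"Eb/N0 = {ebn0_db} dB: "
          f"uncoded {repetition_ber(1, ebn0_db):.2e}, "
          f"(3,1,3) {repetition_ber(3, ebn0_db):.2e}, "
          f"(5,1,5) {repetition_ber(5, ebn0_db):.2e}")
```

A useful observation for the report: with hard decisions at a fixed Eb/N0, the rate loss of repetition
coding largely offsets the majority-vote gain, so the coded curves need not beat uncoded BPSK.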
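For project 11, a systematic (8, 4) linear block encoder with single-error-correcting syndrome decoding can
be sketched as follows. The parity submatrix P is an assumed example, since the syllabus does not specify
the code; any systematic generator matrix with the stated parameters can be substituted.

```python
import numpy as np

P = np.array([[1, 1, 0, 1],
              [1, 0, 1, 1],
              [0, 1, 1, 1],
              [1, 1, 1, 0]])                        # assumed 4x4 parity submatrix
G = np.hstack([np.eye(4, dtype=int), P])            # generator matrix [I | P]
H = np.hstack([P.T, np.eye(4, dtype=int)])          # parity-check matrix [P^T | I]

# Syndrome table: map each single-error syndrome to its error position
syndrome_table = {}
for pos in range(8):
    e = np.zeros(8, dtype=int)
    e[pos] = 1
    syndrome_table[tuple(e @ H.T % 2)] = pos

def encode(msg):
    return msg @ G % 2

def decode(word):
    s = tuple(word @ H.T % 2)
    if any(s):                                      # nonzero syndrome: flip the indicated bit
        pos = syndrome_table.get(s)
        if pos is not None:
            word = word.copy()
            word[pos] ^= 1
    return word[:4]                                 # message is the systematic part

msg = np.array([1, 0, 1, 1])
cw = encode(msg)
cw_err = cw.copy()
cw_err[2] ^= 1                                      # inject a single bit error
print("message :", msg)
print("codeword:", cw)
print("decoded :", decode(cw_err))                  # single error corrected
```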
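For project 12, one way to model the tabulated Extended Vehicular A profile is a Rayleigh-fading tapped
delay line, with an ergodic-capacity estimate taken from its frequency response. The 10 ns sampling
period, 1024-point FFT, 10 dB SNR, and number of channel realizations below are assumptions for
illustration only; the delay/power pairs are those listed in the table above.

```python
import numpy as np

rng = np.random.default_rng(1)

delays_ns = np.array([0, 30, 150, 310, 370, 710, 1090, 1730, 2510])
powers_db = np.array([0, -1.5, -1.4, -3.6, -0.6, -9.1, -7, -12, -16.9])
powers = 10 ** (powers_db / 10)
powers /= powers.sum()                      # normalize total tap power to 1

ts_ns, nfft, snr_db = 10, 1024, 10          # assumed sampling period, FFT size, SNR
snr = 10 ** (snr_db / 10)
tap_idx = np.round(delays_ns / ts_ns).astype(int)

cap_fading = []
for _ in range(2000):                       # average over channel realizations
    h = np.zeros(tap_idx.max() + 1, dtype=complex)
    gains = np.sqrt(powers / 2) * (rng.standard_normal(powers.size)
                                   + 1j * rng.standard_normal(powers.size))
    h[tap_idx] += gains                     # Rayleigh-fading tapped delay line
    H = np.fft.fft(h, nfft)                 # frequency response over nfft bins
    cap_fading.append(np.mean(np.log2(1 + snr * np.abs(H) ** 2)))

print("Ergodic capacity (fading):", np.mean(cap_fading), "bit/s/Hz")
print("AWGN capacity            :", np.log2(1 + snr), "bit/s/Hz")
```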

Date of Approval by the Academic Council: 18.03.16
