
INFORMATION ENTROPY FUNDAMENTALS

Uncertainty, Information and Entropy – Source Coding Theorem – Huffman Coding – Shannon-Fano
Coding – Discrete Memoryless Channels – Channel Capacity – Channel Coding Theorem –
Channel Capacity Theorem.
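The first two topics lend themselves to a short worked example. The sketch below (illustrative only; the symbol probabilities are invented) computes the entropy of a discrete memoryless source and the average length of a binary Huffman code for it, illustrating the source coding theorem's bound H <= L < H + 1.

```python
import heapq
from math import log2

def entropy(probs):
    """H(X) = -sum p log2 p, in bits per symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_lengths(probs):
    """Return codeword lengths of a binary Huffman code for the given probabilities."""
    # Each heap entry: (probability, tie-breaking counter, symbol indices in this subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:          # every merge adds one bit to all symbols below it
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]
H = entropy(probs)
L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
print(H, L)   # source coding theorem: H <= L < H + 1
```

Because the probabilities here are dyadic, the Huffman code is exactly optimal and L equals H (1.75 bits per symbol).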

DATA AND VOICE CODING


Differential Pulse Code Modulation – Adaptive Differential Pulse Code Modulation – Adaptive
Subband Coding – Delta Modulation – Coding of Speech Signals at Low Bit Rates (Vocoders, LPC).
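The delta modulation entry above can be illustrated with a minimal sketch (the step size and test signal are arbitrary choices, not from any syllabus): a 1-bit quantizer whose staircase output tracks the input, with the receiver simply integrating the bit stream.

```python
from math import sin, pi

def delta_modulate(signal, step=0.1):
    """1-bit delta modulation: transmit only the sign of (sample - staircase)."""
    approx = 0.0
    bits, staircase = [], []
    for x in signal:
        bit = 1 if x >= approx else 0
        approx += step if bit else -step   # staircase tracks the input
        bits.append(bit)
        staircase.append(approx)
    return bits, staircase

def delta_demodulate(bits, step=0.1):
    """Receiver integrates the bit stream to rebuild the staircase."""
    approx, out = 0.0, []
    for bit in bits:
        approx += step if bit else -step
        out.append(approx)
    return out

signal = [sin(2 * pi * n / 64) for n in range(64)]
bits, staircase = delta_modulate(signal)
assert delta_demodulate(bits) == staircase   # receiver tracks encoder exactly
```

The step size of 0.1 just exceeds the maximum per-sample slope of this sine (about 0.098), so the example avoids slope overload; a smaller step would overload, a larger one would add granular noise.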

ERROR CONTROL CODING


Linear Block Codes – Syndrome Decoding – Minimum Distance Consideration – Cyclic Codes –
Generator Polynomial – Parity Check Polynomial – Encoder for Cyclic Codes – Calculation of
Syndrome – Convolutional Codes.
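As a concrete instance of syndrome decoding for linear block codes, the sketch below uses the standard (7,4) Hamming code, whose parity-check matrix is arranged so the syndrome reads out the error position directly (a textbook construction, shown here only as an illustration).

```python
# Parity-check matrix H of the (7,4) Hamming code; column j is j written in binary.
H = [[0, 0, 0, 1, 1, 1, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [1, 0, 1, 0, 1, 0, 1]]

def syndrome(r):
    """s = H r^T over GF(2)."""
    return [sum(h * b for h, b in zip(row, r)) % 2 for row in H]

def correct(r):
    """Single-error correction: the syndrome value is the 1-based error position."""
    s = syndrome(r)
    pos = s[0] * 4 + s[1] * 2 + s[2]
    if pos:
        r = r.copy()
        r[pos - 1] ^= 1
    return r

codeword = [0, 0, 0, 0, 0, 0, 0]      # the all-zero word is always a codeword
received = codeword.copy()
received[4] ^= 1                      # channel flips bit 5
assert correct(received) == codeword
```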

COMPRESSION TECHNIQUES
Principles – Arithmetic Coding – Image Compression – Graphics Interchange Format – Tagged
Image File Format – Introduction to JPEG Standards.
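The interval-narrowing idea behind arithmetic coding can be shown in a few lines. This is a toy floating-point sketch (real coders use integer arithmetic with renormalization; the source alphabet and probabilities are invented):

```python
def arithmetic_encode(message, probs):
    """Narrow the unit interval once per symbol; any number inside it encodes the message."""
    # Cumulative probability sub-interval for each symbol
    cum, ranges = 0.0, {}
    for sym, p in probs.items():
        ranges[sym] = (cum, cum + p)
        cum += p
    low, high = 0.0, 1.0
    for sym in message:
        span = high - low
        lo, hi = ranges[sym]
        low, high = low + span * lo, low + span * hi
    return low, high   # transmit any value in [low, high)

probs = {'a': 0.5, 'b': 0.25, 'c': 0.25}
low, high = arithmetic_encode("aab", probs)
```

The final interval width equals the probability of the whole message (here 0.5 × 0.5 × 0.25), so a codeword of about -log2 P(message) bits suffices to identify it.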

AUDIO AND VIDEO CODING


Linear Predictive Coding – Code Excited LPC – Perceptual Coding, MPEG Audio Coders – Dolby
Audio Coders – Video Compression – Principles – Introduction to H.261 & MPEG Video
Standards.

References:

1. Simon Haykin, “Communication Systems”, 4th Edition, John Wiley and Sons, 2001.
2. Fred Halsall, “Multimedia Communications: Applications, Networks, Protocols and Standards”,
Pearson Education Asia, 2002.
3. Mark Nelson, “The Data Compression Book”, BPB Publications, 1992.
4. J. Watkinson, “Compression in Video and Audio”, Focal Press, London, 1995.
5. Andre Neubauer, Jurgen Freudenberger, Volker Kuhn, “Coding Theory: Algorithms,
Architectures and Applications”, John Wiley & Sons Ltd, Reprint 2012.
6. R. Bose, “Information Theory, Coding and Cryptography”, TMH, 2007.
7. Robert H. Morelos-Zaragoza, “The Art of Error Correcting Coding”, Second Edition, John
Wiley & Sons Ltd, Reprint 2013.
8. R. Avudaiammal, “Information Coding Techniques”, 2nd Edition, Tata McGraw Hill Education
Pvt. Ltd., 2010.
Comparison of topic coverage across the surveyed syllabi – SREC, PSG, NIT Surathkal (NIT(S)),
Anna University (AU), IIT Madras, Karunya University and the University of Illinois
(√ = topic covered):

S.No  Topic                                                       Coverage
1     Uncertainty, Information and Entropy                        √ √ √ √ √ √ √
2     Source Coding Theorem                                       √ √ √ √ √ √ √
3     Huffman Coding                                              √ √ √ √ √
4     Shannon-Fano Coding                                         √ √ √ √ √
5     Discrete Memoryless Channels                                √ √ √ √ √ √ √
6     Channel Capacity                                            √ √ √ √ √ √ √
7     Channel Coding Theorem                                      √ √ √ √ √ √ √
8     Channel Capacity Theorem                                    √ √ √ √ √ √ √
9     Differential Pulse Code Modulation                          √ √
10    Adaptive Differential Pulse Code Modulation                 √ √
11    Adaptive Subband Coding                                     √ √
12    Delta Modulation                                            √ √
13    Coding of Speech Signals at Low Bit Rates (Vocoders, LPC)   √ √
14    Linear Block Codes – Syndrome Decoding                      √ √ √ √
15    Minimum Distance Consideration                              √ √ √ √
16    Cyclic Codes                                                √ √ √ √ √
17    Generator Polynomial                                        √ √ √ √ √
18    Parity Check Polynomial                                     √ √ √ √ √
19    Encoder for Cyclic Codes                                    √ √ √ √ √
20    Calculation of Syndrome                                     √ √ √ √ √
21    Convolutional Codes                                         √ √ √ √
22    Principles – Arithmetic Coding                              √ √
23    Image Compression                                           √ √
24    Graphics Interchange Format                                 √ √
25    Tagged Image File Format                                    √ √
26    Introduction to JPEG Standards                              √ √
27    Linear Predictive Coding                                    √ √
28    Code Excited LPC                                            √ √
29    Perceptual Coding, MPEG Audio Coders                        √ √
30    Dolby Audio Coders                                          √ √
31    Video Compression – Principles                              √ √
32    Introduction to H.261 & MPEG Video Standards                √ √

NIT Surathkal – EC382
INFORMATION THEORY AND CODING (3-0-0) 3
Communication systems and Information Theory, Measures of Information,
Coding for Discrete sources, Discrete memoryless channels and capacity, Noisy
channel coding theorem, Techniques for coding and decoding, Waveform
channels, Source coding with a Fidelity criterion.

Thomas M. Cover & Joy A. Thomas, Elements of Information Theory, John
Wiley, 1991.
R. G. Gallager, Information Theory and Reliable Communication, Addison
Wesley, 1987.
A. J. Viterbi & J. K. Omura, Principles of Digital Communications and Coding,
McGraw-Hill, 1979.
Department of Electrical Engineering

Indian Institute of Technology Madras

Title : Introduction to information theory and coding

Course No : EE5142

Credits : 4

Prerequisite : Probability and Random Processes, Digital Communications

Syllabus :

1) Entropy, Relative Entropy, and Mutual Information:

Entropy, Joint Entropy and Conditional Entropy, Relative Entropy and Mutual Information,
Chain Rules, Data-Processing Inequality, Fano’s Inequality
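A small numeric sketch ties these quantities together: mutual information computed from an invented joint distribution via the identity I(X;Y) = H(X) + H(Y) - H(X,Y).

```python
from math import log2

def H(ps):
    """Entropy in bits of a probability list."""
    return -sum(p * log2(p) for p in ps if p > 0)

# Joint pmf of (X, Y): a BSC with crossover 0.2 driven by a fair input (invented numbers)
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
px = [sum(p for (x, _), p in joint.items() if x == v) for v in (0, 1)]
py = [sum(p for (_, y), p in joint.items() if y == v) for v in (0, 1)]

# I(X;Y) = H(X) + H(Y) - H(X,Y)
I = H(px) + H(py) - H(list(joint.values()))
```

Since the joint law corresponds to a crossover-0.2 BSC with uniform input, I equals 1 - H2(0.2), about 0.278 bits, which is also that channel's capacity.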

2) Typical Sequences and Asymptotic Equipartition Property:

Asymptotic Equipartition Property Theorem, Consequences of the AEP: Data Compression,
High-Probability Sets and the Typical Set
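The AEP is easy to check empirically. The sketch below (all parameters invented) draws a long i.i.d. Bernoulli sequence and verifies that its per-symbol log-probability is close to the source entropy:

```python
import random
from math import log2

def h2(p):
    """Binary entropy function in bits."""
    return -p * log2(p) - (1 - p) * log2(1 - p)

random.seed(0)
p, n = 0.3, 10_000
x = [1 if random.random() < p else 0 for _ in range(n)]
k = sum(x)

# Per-symbol log-probability of the observed sequence
rate = -(k * log2(p) + (n - k) * log2(1 - p)) / n
assert abs(rate - h2(p)) < 0.05   # AEP: -(1/n) log2 p(X^n) -> H(p)
```

This is exactly the statement that long sequences are overwhelmingly likely to be typical, i.e. to have probability close to 2^(-nH).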

3) Source Coding and Data Compression:

Kraft Inequality, Huffman Codes, Optimality of Huffman Codes
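The Kraft inequality is a one-line check on a proposed set of codeword lengths; the length sets below are arbitrary examples.

```python
def kraft_sum(lengths, D=2):
    """Kraft inequality: a D-ary prefix code with these lengths exists iff sum D^-l <= 1."""
    return sum(D ** -l for l in lengths)

assert kraft_sum([1, 2, 3, 3]) == 1.0   # dyadic lengths: the inequality is tight
assert kraft_sum([1, 1, 2]) > 1         # no binary prefix code has these lengths
```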

4) Channel Capacity:
Symmetric Channels, Properties of Channel Capacity, Jointly Typical Sequences, Channel
Coding Theorem, Fano’s Inequality and the Converse to the Coding Theorem
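For the binary symmetric channel, capacity has the closed form C = 1 - H2(p); a small sketch (the crossover values are chosen only for illustration):

```python
from math import log2

def h2(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """C = 1 - H2(p) bits/channel use for a BSC with crossover probability p."""
    return 1 - h2(p)

print(bsc_capacity(0.0))   # 1.0: noiseless channel
print(bsc_capacity(0.5))   # 0.0: output independent of input
```

Note the symmetry C(p) = C(1 - p): a channel that flips almost every bit is as useful as one that flips almost none, since the receiver can invert its output.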

5) Differential Entropy and Gaussian Channel:

Differential Entropy, AEP for Continuous Random Variables, Properties of Differential Entropy,
Relative Entropy, and Mutual Information, Coding Theorem for Gaussian Channels

6) Linear Binary Block Codes:

Introduction, Generator and Parity-Check Matrices, Repetition and Single-Parity-Check Codes,
Binary Hamming Codes, Error Detection with Linear Block Codes, Weight Distribution and
Minimum Hamming Distance of a Linear Block Code, Hard-decision and Soft-decision
Decoding of Linear Block Codes, Cyclic Codes, Parameters of BCH and RS Codes, Interleaved
and Concatenated Codes

7) Convolutional Codes:

Encoder Realizations and Classifications, Minimal Encoders, Trellis representation, MLSD and
the Viterbi Algorithm, Bit-wise MAP Decoding and the BCJR Algorithm
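The Viterbi algorithm for a small code fits in a short sketch. The example below (illustrative only) uses the common rate-1/2, constraint-length-3 code with octal generators (7, 5) and hard-decision decoding, and corrects a single flipped channel bit:

```python
def conv_encode(bits):
    """Rate-1/2 convolutional encoder, generators g1 = 111 (7), g2 = 101 (5)."""
    s1 = s2 = 0
    out = []
    for u in bits:
        out += [u ^ s1 ^ s2, u ^ s2]
        s1, s2 = u, s1
    return out

def viterbi_decode(received, n_bits):
    """Hard-decision Viterbi: minimum-Hamming-distance path through the state trellis."""
    INF = float("inf")
    # Path metric and survivor input sequence per state (s1, s2); start in state 00
    metrics = {(0, 0): 0, (0, 1): INF, (1, 0): INF, (1, 1): INF}
    paths = {s: [] for s in metrics}
    for t in range(n_bits):
        r1, r2 = received[2 * t], received[2 * t + 1]
        new_metrics = {s: INF for s in metrics}
        new_paths = {}
        for (s1, s2), m in metrics.items():
            if m == INF:
                continue
            for u in (0, 1):
                o1, o2 = u ^ s1 ^ s2, u ^ s2          # branch outputs
                cand = m + (o1 != r1) + (o2 != r2)    # add branch Hamming distance
                nxt = (u, s1)
                if cand < new_metrics[nxt]:           # keep the survivor
                    new_metrics[nxt] = cand
                    new_paths[nxt] = paths[(s1, s2)] + [u]
        metrics, paths = new_metrics, new_paths
    best = min(metrics, key=metrics.get)
    return paths[best]

msg = [1, 0, 1, 1, 0, 0, 1, 0]
coded = conv_encode(msg)
coded[3] ^= 1                     # one channel bit flipped
assert viterbi_decode(coded, len(msg)) == msg
```

This sketch leaves the trellis unterminated and simply takes the best end state; production decoders usually append tail bits to force the encoder back to the zero state.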

Text Books:

1. Elements of Information Theory by Thomas Cover, Joy Thomas
2. Channel Codes: Classical and Modern by William Ryan, Shu Lin

References:

1. Information Theory and Reliable Communication by Robert Gallager

Karunya University

17EC2045 INFORMATION THEORY AND CODING Credits 3:0:0


Prerequisite: 14EC2021 Digital Communication
Course objectives
To learn the basics of information theory
To calculate channel capacity and other measures
To understand the source coding techniques for text, audio and speech
To know the format of image compression
To know the concept of error control techniques
To understand how to calculate the rate and error probabilities.

Course outcome
The students understand the basics of information theory
The students gain knowledge to calculate channel capacity and other measures
The students understand the source coding techniques for text, audio and speech
The students understand the format of image compression
The students understand the concept of error control techniques
The students analyze and apply specific coding methods to calculate rate and error
probabilities.

Unit I - Information Theory: Introduction – Information, Entropy, Information rate, Classification
of codes, Kraft–McMillan inequality, Source coding theorem, Shannon-Fano coding, Huffman
coding – algorithm, tree construction, efficiency – Joint and conditional entropies, Mutual
information and its properties – Discrete memoryless channels – BSC, BEC, Channel capacity
– Shannon limit
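Shannon-Fano coding, listed above, assigns codewords by recursively splitting the probability-sorted alphabet into two nearly equiprobable halves. A minimal sketch with an invented four-symbol source:

```python
def shannon_fano(symbols):
    """Recursive Shannon-Fano coding over (symbol, probability) pairs sorted by probability."""
    if len(symbols) == 1:
        sym, _ = symbols[0]
        return {sym: ""}
    # Find the split point that makes the two halves' probabilities most nearly equal
    total, acc, split = sum(p for _, p in symbols), 0.0, 1
    best = float("inf")
    for i in range(1, len(symbols)):
        acc += symbols[i - 1][1]
        diff = abs(2 * acc - total)
        if diff < best:
            best, split = diff, i
    codes = {}
    for sym, code in shannon_fano(symbols[:split]).items():
        codes[sym] = "0" + code        # upper half gets prefix 0
    for sym, code in shannon_fano(symbols[split:]).items():
        codes[sym] = "1" + code        # lower half gets prefix 1
    return codes

source = [("a", 0.4), ("b", 0.3), ("c", 0.2), ("d", 0.1)]
codes = shannon_fano(source)
```

On this source the method yields the prefix code a=0, b=10, c=110, d=111; unlike Huffman coding, the top-down split is not always optimal, which is why the two are taught side by side.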

Unit II - Source Coding: Text, Audio and Speech: Introduction to text coding, Adaptive Huffman
coding, Arithmetic coding, LZW algorithm – Introduction to audio coding, Perceptual coding,
Masking techniques, Psychoacoustic model, MPEG audio layers I, II, III – Dolby AC-3
– Introduction to speech coding, Channel vocoder, Linear Predictive Coding.
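The LZW algorithm mentioned above builds its phrase dictionary on the fly, and the decoder can rebuild the same dictionary from the code stream alone. A compact sketch (the test string is arbitrary):

```python
def lzw_compress(text):
    """LZW: grow a dictionary of phrases; emit the code of the longest known prefix."""
    table = {chr(i): i for i in range(256)}
    w, out = "", []
    for c in text:
        if w + c in table:
            w += c
        else:
            out.append(table[w])
            table[w + c] = len(table)   # new phrase gets the next free code
            w = c
    if w:
        out.append(table[w])
    return out

def lzw_decompress(codes):
    """Rebuild the same dictionary on the fly; handles the not-yet-defined-code corner case."""
    table = {i: chr(i) for i in range(256)}
    w = table[codes[0]]
    out = [w]
    for code in codes[1:]:
        entry = table[code] if code in table else w + w[0]
        out.append(entry)
        table[len(table)] = w + entry[0]
        w = entry
    return "".join(out)

data = "TOBEORNOTTOBEORTOBEORNOT"
assert lzw_decompress(lzw_compress(data)) == data
```

This is the scheme behind GIF and the optional TIFF compression mode; real implementations cap the code width and reset the table when it fills.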

Unit III - Source Coding: Image and Video: Image and video formats – GIF, TIFF, SIF, CIF and
QCIF – Image compression – READ, JPEG – Video compression – Principles, I, B, P frames –
Motion estimation – Motion compensation – H.261 standard, MPEG standard

Unit IV - Error Control Coding: Block Codes: Introduction to error control coding –
Definition and principles – Hamming weight, Hamming distance, minimum distance decoding
– Single parity codes, Hamming codes, repetition codes – Linear block codes – Cyclic codes
– Syndrome calculation – Encoder and decoder – CRC
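Syndrome calculation for a CRC is polynomial long division over GF(2). The sketch below uses a small illustrative generator, x^3 + x + 1 (not a standardized CRC), to append check bits and verify a received word:

```python
def crc_remainder(bits, poly):
    """Long division over GF(2): append len(poly)-1 zeros, XOR-divide, keep the remainder."""
    n = len(poly) - 1
    reg = list(bits) + [0] * n            # message shifted left by the degree of poly
    for i in range(len(bits)):
        if reg[i]:                        # leading bit set: subtract (XOR) the divisor
            for j, p in enumerate(poly):
                reg[i + j] ^= p
    return reg[-n:]

# Generator x^3 + x + 1 -> coefficient bits 1011 (a small illustrative choice)
poly = [1, 0, 1, 1]
msg = [1, 1, 0, 1, 0]
crc = crc_remainder(msg, poly)
codeword = msg + crc
assert crc_remainder(codeword, poly) == [0, 0, 0]   # valid codeword: zero syndrome
```

Any single-bit error leaves a nonzero remainder (the syndrome), since a generator with more than one term cannot divide a lone power of x.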

Unit V - Error Control Coding: Convolutional Codes: Introduction to convolutional codes
– Code tree – Trellis codes – State diagram – Encoding and decoding – Sequential search
– Viterbi algorithm – Principle of Turbo coding

Text Book:
1. Andre Neubauer, Jurgen Freudenberger, Volker Kuhn, “Coding Theory: Algorithms,
Architectures and Applications”, John Wiley & Sons Ltd, Reprint 2012.
Reference Books:
1. Robert H. Morelos-Zaragoza, “The Art of Error Correcting Coding”, Second Edition, John
Wiley & Sons Ltd, Reprint 2013.
2. R. Avudaiammal, “Information Coding Techniques”, 2nd Edition, Tata McGraw Hill Education
Pvt. Ltd., 2010.
3. R. Bose, “Information Theory, Coding and Cryptography”, TMH, 2007.
4. Fred Halsall, “Multimedia Communications: Applications, Networks, Protocols and Standards”,
Pearson Education Asia, 2002.
5. K. Sayood, “Introduction to Data Compression”, 3rd Edition, Elsevier, 2006.
6. S. Gravano, “Introduction to Error Control Codes”, Oxford University Press, 2007.
7. Amitabha Bhattacharya, “Digital Communication”, TMH, 2006.
University of Illinois
Department of Computer science

Topics: Entropy and differential entropy, mutual information; data compression; channel capacity, the
Gaussian channel; rate-distortion; universal source coding; network information theory. Contemporary
examples and research topics.
