Information Theory and Coding Techniques: IV B. Tech. - I Semester
(19BT70407) Information Theory and Coding Techniques
(PE - 5)
COURSE DESCRIPTION:
Information theory; Channel capacity; Channel coding techniques – Linear block codes, Cyclic
codes, Convolutional codes; Reed-Solomon and Turbo codes.
COURSE OBJECTIVES:
CEO1: To impart knowledge in information theory, coding, decoding and error control
codes.
CEO2: To develop analytical, design and development skills in error coding techniques
for prescribed requirements.
COURSE OUTCOMES: After successful completion of this course, the students will be able to:
CO1: Evaluate entropy for various source coding techniques.
CO3: Analyze various error detection and correction codes to enable reliable data
transmission.
CO4: Understand Reed-Solomon codes, interleaving and concatenated codes, and their
applications.
DETAILED SYLLABUS:
Entropy: Discrete stationary sources, Markov sources, Entropy of a discrete random variable -
joint, conditional, relative entropy, mutual information and conditional mutual information.
Chain rules for entropy, relative entropy and mutual information, Differential entropy - joint,
relative, conditional differential entropy and mutual information.
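These quantities can be illustrated numerically. The sketch below uses a small made-up joint distribution (not from the text) to compute the marginal entropies, the joint entropy H(X,Y), and the mutual information via the identity I(X;Y) = H(X) + H(Y) - H(X,Y):

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Example joint distribution of (X, Y) as a dict {(x, y): probability}
joint = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}

# Marginal distributions of X and Y
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0) + p
    py[y] = py.get(y, 0) + p

H_X = entropy(px.values())
H_Y = entropy(py.values())
H_XY = entropy(joint.values())
I_XY = H_X + H_Y - H_XY   # mutual information I(X;Y), always >= 0
```

For this distribution H(X,Y) = 1.75 bits, and I(X;Y) comes out slightly positive, reflecting the weak dependence between X and Y.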
Lossless Source Coding: Uniquely decodable codes, Instantaneous codes, Kraft’s inequality,
Optimal codes, Huffman code, Shannon’s Source Coding Theorem.
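A minimal sketch of two of these ideas, assuming a binary code alphabet: it derives Huffman codeword lengths for an arbitrary example distribution and verifies Kraft’s inequality, which holds with equality for a complete prefix code:

```python
import heapq

def kraft_sum(lengths):
    """Kraft sum for binary codeword lengths; <= 1 iff a prefix code exists."""
    return sum(2.0 ** -l for l in lengths)

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code for the given probabilities."""
    # Heap entries: (probability, tiebreak counter, symbols in this subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    count = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)   # merge the two least likely subtrees
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1               # every merge deepens their leaves by 1
        count += 1
        heapq.heappush(heap, (p1 + p2, count, s1 + s2))
    return lengths

lengths = huffman_lengths([0.5, 0.25, 0.125, 0.125])  # dyadic example source
```

For this dyadic distribution the lengths are 1, 2, 3, 3, the Kraft sum is exactly 1, and the average length equals the source entropy of 1.75 bits, as the source coding theorem predicts for dyadic probabilities.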
Channel Capacity: Capacity computation for some simple channels, Channel Coding Theorem,
Fano’s inequality and the converse to the Coding Theorem, Equality in the converse to the
coding theorem, The joint source-channel coding theorem, The Gaussian channels - capacity
calculation for band-limited Gaussian channels, Parallel Gaussian channels, Capacity of
channels with colored Gaussian noise.
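For the band-limited Gaussian channel, the capacity formula C = B log2(1 + SNR) can be evaluated directly; the function name below is illustrative:

```python
import math

def awgn_capacity(bandwidth_hz, snr_db):
    """Shannon capacity C = B * log2(1 + SNR) of a band-limited AWGN channel, in bits/s."""
    snr = 10 ** (snr_db / 10)       # convert SNR from dB to a linear power ratio
    return bandwidth_hz * math.log2(1 + snr)
```

For example, a 5 kHz channel with 28 dB SNR at the receiver input yields roughly 46.5 kbit/s.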
Linear Block Codes: Introduction to linear block codes, Generator matrix, Systematic linear
block codes.
Cyclic Codes: Algebraic Structure of Cyclic Codes, Binary Cyclic Code Properties, Encoding in
Systematic Form, Well-Known Block Codes-Hamming Codes, Extended Golay Code, BCH Codes.
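As an illustration of systematic encoding with a generator matrix, the sketch below uses a Hamming (7,4) code with one common choice of parity submatrix P, so that G = [I4 | P] and H = [P^T | I3]; the helper names are illustrative:

```python
# Parity submatrix P for a systematic Hamming (7,4) code (one common choice).
P = [[1, 1, 0],
     [1, 0, 1],
     [0, 1, 1],
     [1, 1, 1]]

def encode(msg):
    """Encode a 4-bit message into a 7-bit systematic codeword [msg | parity]."""
    parity = [sum(msg[i] * P[i][j] for i in range(4)) % 2 for j in range(3)]
    return msg + parity

def syndrome(word):
    """Syndrome s = H * word^T (mod 2); all zeros iff word is a valid codeword."""
    return [(sum(P[i][j] * word[i] for i in range(4)) + word[4 + j]) % 2
            for j in range(3)]

cw = encode([1, 0, 1, 1])   # systematic: first four bits are the message itself
```

A nonzero syndrome matches the column of H at the error position, which is how single-bit errors are located and corrected.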
Total Periods: 45
Topics of Self-study are provided in the lesson plan
TEXT BOOKS:
1. Thomas M. Cover and Joy A. Thomas, Elements of Information Theory, John Wiley & Sons,
1st Edition, 1999.
2. Bernard Sklar, “Digital Communications: Fundamentals and Applications”, Pearson Education.
REFERENCE BOOKS:
1. John G. Proakis, “Digital Communications”, McGraw-Hill, 5th Edition, 2008.
2. Shu Lin and Daniel J. Costello, Jr., “Error Control Coding: Fundamentals and Applications”,
Prentice Hall, 2nd Edition, 2002.
https://nptel.ac.in/courses/117101053
Course Outcomes          | Program Outcomes                                  | Program Specific Outcomes
                         | PO1 PO2 PO3 PO4 PO5 PO6 PO7 PO8 PO9 PO10 PO11 PO12 | PSO1 PSO2 PSO3
CO1                      |  3   3   -   2   -   -   -   -   -   -    -    -    |  -    3    -
CO2                      |  3   3   -   2   -   -   -   -   -   -    -    -    |  -    3    -
CO3                      |  3   3   1   2   -   -   -   -   -   -    -    -    |  -    3    -
CO4                      |  3   2   -   2   -   1   -   -   -   -    -    -    |  -    3    -
Average                  |  3  2.7  1   2   -   1   -   -   -   -    -    -    |  -    3    -
Course Correlation Level |  3   3   1   2   -   1   -   -   -   -    -    -    |  -    3    -
Correlation Level: 3-High; 2-Medium; 1-Low
LESSON PLAN
Name of the Course: (19BT70407) Information Theory and Coding Techniques
Class & Semester: IV B. Tech (ECE) – I Semester
Name(s) of the Faculty Member(s): Dr. V. R. Anitha
UNIT – I: INTRODUCTION

S. No. | Topic | No. of periods | Book(s) followed | Topics for self-study
1. | Entropy: Discrete stationary sources, Markov sources | 1 | T1 | The entropy power inequality and the Brunn–Minkowski inequality, Lempel-Ziv coding, Arithmetic coding
2. | Entropy of a discrete random variable - joint, conditional, relative entropy, mutual information and conditional mutual information | 2 | T1 |
3. | Chain rules for entropy, relative entropy and mutual information | 1 | T1 |
4. | Differential entropy - joint, relative, conditional differential entropy, mutual information | 1 | T1 |
5. | Lossless source coding: Uniquely decodable codes | 1 | T1 |
6. | Instantaneous codes | 1 | T1 |
7. | Kraft’s inequality, Optimal codes | 1 | T1 |
8. | Huffman code | 1 | T1 |
9. | Shannon’s Source Coding Theorem | 1 | T1, R1 |
Total periods required: 10
UNIT – II: CHANNEL CAPACITY

S. No. | Topic | No. of periods | Book(s) followed | Topics for self-study
10. | Capacity computation for some simple channels, Channel Coding Theorem | 1 | T1 | Rate distortion theory, Arimoto-Blahut algorithm
11. | Fano’s inequality and the converse to the Coding Theorem | 2 | T1 |
12. | Equality in the converse to the coding theorem | 1 | T1 |
13. | The joint source-channel coding theorem | 1 | T1 |
14. | The Gaussian channels - capacity calculation for band-limited Gaussian channels | 2 | T1 |
15. | Parallel Gaussian channels | 1 | T1 |
TEXT BOOKS:
T1. Thomas M. Cover and Joy A. Thomas, Elements of Information Theory, John Wiley & Sons,
1st Edition, 1999.
3. a) Describe Shannon’s channel coding theorem for memoryless channels. (6 Marks, L2, CO2, PO1)
   b) Derive the Shannon channel coding theorem. For a system with a bandwidth of 5 kHz and
      an SNR of 28 dB at the input to the receiver, find its information-carrying capacity. (6 Marks, L4, CO2, PO3)
(OR)
4. a) Derive the capacity of channels with colored Gaussian noise. (6 Marks, L4, CO3, PO2)
   b) Derive the channel capacity for a water-filling channel. (6 Marks, L4, CO3, PO3)
1. Dr. P. Narahari Sastry, Associate Professor, Department of ECE,
   Chaitanya Bharathi Institute of Technology, Gandipet, Hyderabad - 500075.
   Mobile: 9948397802; Email: ananditahari@yahoo.com
3. Dr. K. Suresh Reddy, Professor & Head, Department of ECE,
   G. Pulla Reddy Engineering College, Kurnool - 518007.
   Mobile: 9866178937; Email: sureshkatam@gmail.com
4. Dr. G. Sasi Bhushana Rao, Professor, Department of ECE,
   Andhra University College of Engineering, Visakhapatnam - 530003.
   Mobile: 9849747131; Email: sasigps@gmail.com
5. Dr. M. S. S. Rukmini, Professor, Department of ECE,
   Vignan University, Vadlamudi, Guntur - 522213.
   Mobile: 98662 81571; Email: mssrukmini@gmail.com
6. Dr. T. Ramasree, Professor, Dept. of ECE,
   SVU College of Engineering, Tirupati - 517502.
   Mobile: 9346465050; Email: Rama.jaypee@gmail.com