
University Roll No:                                        Paper Code: NEC-031

United College of Engineering & Research
B.Tech VIIth Semester, Branch: EC
FIRST SESSIONAL TEST (2016-17)
Subject: INFORMATION THEORY & CODING
Time: 2 Hours                                              Max Marks: 30

Note: All sections are compulsory.

Part A (Short Answer Type) (10x1=10)
Attempt all Questions

1. Compare and contrast Huffman coding and arithmetic coding.
2. Explain the theory of decodability.
3. What is the relation between the information transmission rate and H(X/Y)?
4. What is the relation between second-order entropy and first-order entropy?
5. Draw the Venn diagram of entropy and mutual information.
6. What is the concavity of entropy?
7. State the channel coding theorem.
8. Write the matrix of any symmetric and non-symmetric channel.
9. Write Jensen's inequality.
10. What is relative entropy?

Part B (Medium Answer Type) (3x4=12)
Attempt any 3 Questions

1. Find the output probabilities for the input probabilities p(x1) = 0.4, p(x2) = 0.5, p(x3) = 0.1.
2. Calculate the efficiency of the following symbols: [x] = [1/16, 1/32, 1/64, 1/8, 1/2, 1/4, 1/64].
3. Calculate the entropy when pk = 0 and when pk = 1.
4. Write the difference between arithmetic coding and Huffman coding.
5. Explain the properties of JPM.

Part C (Long Answer Type) (1x8=08)
Attempt any 1 Question

1. Calculate the channel capacity of the binary erasure channel.
2. A binary channel matrix is given as:
   Determine H(X), H(Y), H(Y/X), and H(X/Y).
3. Prove that mutual information is always non-negative, i.e. I(X ; Y) >= 0.
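
A minimal Python sketch of one possible working for Part B, Question 2 (it also uses the pk = 0 convention asked about in Question 3): it computes the source entropy of the listed probabilities and the efficiency of a code with lengths li = -log2(pi), which is what a Huffman code would assign here since every probability is a negative power of two. This is only an illustrative sketch, not a method prescribed by the paper.

import math

# Symbol probabilities from Part B, Question 2 (they sum to 1).
p = [1/16, 1/32, 1/64, 1/8, 1/2, 1/4, 1/64]

# Source entropy H(X) = -sum(p_i * log2 p_i); a term with p_i = 0 would
# contribute 0 by convention (the convention Question 3 asks about).
H = -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Assumed codeword lengths l_i = -log2 p_i: for these dyadic probabilities
# this is exactly what a Huffman code would assign, so L_avg equals H(X).
L_avg = sum(pi * -math.log2(pi) for pi in p)

print(f"H(X)       = {H:.5f} bits/symbol")      # 1.96875
print(f"L_avg      = {L_avg:.5f} bits/symbol")  # 1.96875
print(f"efficiency = {100 * H / L_avg:.1f} %")  # 100.0 %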
