—————————————————————————————————————————————–

Course: Information Theory and Coding (EC353)        TUTORIAL-1        Marks: 35


Topic: Entropy, Mutual Information, and Perfect Secrecy
—————————————————————————————————————————————–
(1) Calculate the amount of information if the probability of a symbol is p_k = 1/4. (2 marks)
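As a quick self-check, the self-information of a symbol with probability p_k is I_k = log2(1/p_k) bits; a minimal Python sketch with the value from the question:

```python
import math

p_k = 1 / 4                 # symbol probability given in the question
I_k = math.log2(1 / p_k)    # self-information in bits
print(I_k)                  # 2.0
```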

(2) Calculate the amount of information carried by each digit if binary digits occur with equal likelihood in binary PCM. (2 marks)
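A quick check with the same formula, since p(0) = p(1) = 1/2 for equally likely binary digits:

$$ I = \log_2 \frac{1}{1/2} = \log_2 2 = 1 \ \text{bit per digit}. $$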

(3) If I1 is the information carried by message m1 and I2 is the information carried by message m2, find the amount of information carried by the composite message formed by m1 and m2. (2 marks)
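If m1 and m2 are statistically independent (the usual assumption in this exercise), the joint probability factors, so the information adds:

$$ I_{1,2} = \log_2 \frac{1}{p_1 p_2} = \log_2 \frac{1}{p_1} + \log_2 \frac{1}{p_2} = I_1 + I_2. $$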

(4) A zero-memory source has the alphabet X = {x1, x2, x3, x4} with probabilities P(X) = {1/2, 1/4, 1/8, 1/8}. Find the entropy of the source and the entropy of its second-order extension. (2 marks)
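For a zero-memory source, successive symbols are independent, so the second-order extension satisfies H(X^2) = 2 H(X). A short Python check of both parts:

```python
import math
from itertools import product

P = [1/2, 1/4, 1/8, 1/8]                    # source probabilities from the question

H = sum(p * math.log2(1 / p) for p in P)    # H(X) = 1.75 bits/symbol
P2 = [p * q for p, q in product(P, P)]      # probabilities of the 16 symbol pairs
H2 = sum(p * math.log2(1 / p) for p in P2)  # H(X^2) = 3.5 bits = 2 * H(X)
print(H, H2)
```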

(5) A discrete memoryless source C produces two bits at a time. C comprises two independent binary sources A and B whose outputs are equally likely, with each source contributing one bit of every output of C. Find the information carried by each output of C. (2 marks)
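A quick check: the four two-bit outputs of C are equally likely, so each output carries

$$ I_C = \log_2 \frac{1}{(1/2)(1/2)} = \log_2 4 = 2 \ \text{bits}. $$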

(6) A basket contains 52 balls: 26 red, 13 green, and 13 blue. In a random experiment, one ball is drawn from the basket. Find the information conveyed by the outcome that the drawn ball is red. (2 marks)
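A quick check: P(red) = 26/52 = 1/2, so the outcome carries

$$ I_{\text{red}} = \log_2 \frac{1}{1/2} = 1 \ \text{bit}. $$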

(7) A source emits three symbols with probabilities 0.7, 0.2, and 0.1. Find the source entropy, the maximum entropy of the source, the source efficiency, and the redundancy. Here, the source efficiency is defined as the ratio of the source entropy to the maximum entropy of the source, and the redundancy is obtained by subtracting the source efficiency from 1. (2 marks)
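A short Python sketch that follows the definitions stated in the question:

```python
import math

P = [0.7, 0.2, 0.1]                         # symbol probabilities from the question

H = sum(p * math.log2(1 / p) for p in P)    # source entropy, about 1.157 bits/symbol
H_max = math.log2(len(P))                   # maximum entropy, log2(3) ~ 1.585 bits
efficiency = H / H_max                      # about 0.73
redundancy = 1 - efficiency                 # about 0.27
print(H, H_max, efficiency, redundancy)
```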

(8) A discrete source emits one of six unique symbols every Ts ms. The probabilities of these six symbols are 1/2, 1/4, 1/8, 1/16, 1/32, and 1/32. Find the source entropy and the information rate of the source. The information rate is defined as the product of the source entropy and the rate at which symbols are generated by the source. (2 marks)
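A sketch of the computation, assuming (purely for illustration) a symbol interval of Ts = 1 ms; substitute the actual Ts from the question:

```python
import math

P = [1/2, 1/4, 1/8, 1/16, 1/32, 1/32]    # symbol probabilities from the question
T_s = 1e-3                               # ASSUMED symbol interval of 1 ms (illustrative only)

H = sum(p * math.log2(1 / p) for p in P) # source entropy = 1.9375 bits/symbol
r = 1 / T_s                              # symbol rate in symbols per second
R = H * r                                # information rate = entropy * symbol rate
print(H, R)                              # 1.9375 bits/symbol, 1937.5 bits/s for Ts = 1 ms
```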

(9) Find P(X, Y), P(y1), and P(y2) for the binary channel shown below, where the corresponding probabilities are marked on the arrows.

[Figure: binary channel diagram with the transition probabilities on the arrows]

(2 marks)
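Since the diagram is not reproduced here, the sketch below uses hypothetical input and transition probabilities (all numeric values are placeholders, not taken from the question) to show the computation P(x_i, y_j) = P(x_i) P(y_j | x_i) and P(y_j) = sum_i P(x_i, y_j):

```python
# All numeric values below are hypothetical; read the actual ones off the diagram.
P_x = [0.6, 0.4]                   # ASSUMED input probabilities P(x1), P(x2)
P_y_given_x = [[0.9, 0.1],         # ASSUMED transition probabilities P(y_j | x_i)
               [0.2, 0.8]]

# Joint distribution: P(x_i, y_j) = P(x_i) * P(y_j | x_i)
P_xy = [[P_x[i] * P_y_given_x[i][j] for j in range(2)] for i in range(2)]

# Output marginals: P(y_j) = sum over i of P(x_i, y_j)
P_y = [sum(P_xy[i][j] for i in range(2)) for j in range(2)]
print(P_xy)                        # approx [[0.54, 0.06], [0.08, 0.32]]
print(P_y)                         # approx [0.62, 0.38]
```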

(10) Let the joint probability matrix


 is as follows.
0.25 0.00 0.00 0.00
0.10 0.30 0.00 0.00
 
P (A, B) = 0.00 0.05 0.10 0.00. Find the mutual information between A and B? (2 marks)
0.00 0.00 0.05 0.10
0.00 0.00 0.05 0.00
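A Python sketch of the computation: form the marginals from the given joint matrix, then apply I(A; B) = sum over a, b of P(a, b) log2( P(a, b) / (P(a) P(b)) ), skipping zero entries:

```python
import math

# Joint probability matrix P(A, B) from the question (rows: a_i, columns: b_j)
P = [[0.25, 0.00, 0.00, 0.00],
     [0.10, 0.30, 0.00, 0.00],
     [0.00, 0.05, 0.10, 0.00],
     [0.00, 0.00, 0.05, 0.10],
     [0.00, 0.00, 0.05, 0.00]]

P_a = [sum(row) for row in P]                               # row marginals P(a_i)
P_b = [sum(row[j] for row in P) for j in range(len(P[0]))]  # column marginals P(b_j)

# Mutual information, summing only over nonzero joint entries
I = sum(p * math.log2(p / (P_a[i] * P_b[j]))
        for i, row in enumerate(P)
        for j, p in enumerate(row) if p > 0)
print(I)                                                    # approx 1.26 bits
```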
