
09EC352: Information Theory and Coding
Assignment-2
VI Semester ECE, Jan-May 2012

1. Let X indicate the outcome of the roll of a die. If X < 4, the die is
rolled once more, and the outcome is denoted by Y. Define Z = X if X > 3,
and Z = X + Y if X < 4. Find H(X|Z) and I(X; Z).
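A quick numerical check for this problem (a sketch only, assuming a fair
six-sided die and entropies in bits; the assignment expects the analytical
derivation):

    # Enumerate the joint pmf of (X, Z) and compute the entropies from it.
    from collections import defaultdict
    from math import log2

    joint = defaultdict(float)           # joint[(x, z)] = P(X = x, Z = z)
    for x in range(1, 7):
        if x > 3:                        # no second roll: Z = X
            joint[(x, x)] += 1 / 6
        else:                            # second roll: Z = X + Y
            for y in range(1, 7):
                joint[(x, x + y)] += 1 / 36

    def H(dist):
        return -sum(p * log2(p) for p in dist.values() if p > 0)

    pz = defaultdict(float)              # marginal pmf of Z
    for (x, z), p in joint.items():
        pz[z] += p

    H_X = log2(6)                        # X is uniform on {1, ..., 6}
    H_X_given_Z = H(joint) - H(pz)       # chain rule: H(X|Z) = H(X,Z) - H(Z)
    I_XZ = H_X - H_X_given_Z             # I(X;Z) = H(X) - H(X|Z)
    print(H_X_given_Z, I_XZ)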
2. Let X1 and X2 be identically distributed. Let ρ = 1 − H(X2|X1)/H(X1).

(a) Show that ρ = I(X1; X2)/H(X1).
(b) Show that 0 ≤ ρ ≤ 1.
(c) When is ρ = 0?
(d) When is ρ = 1?
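The identities in (a)-(d) are to be proved analytically, but they can be
sanity-checked on any joint pmf with identical marginals. A minimal sketch
(the 2x2 joint matrix below is an arbitrary example, chosen symmetric so
that X1 and X2 have the same marginal):

    import numpy as np

    # P[i, j] = P(X1 = i, X2 = j); symmetric, so the marginals coincide.
    P = np.array([[0.3, 0.2],
                  [0.2, 0.3]])

    def H(p):
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    H1 = H(P.sum(axis=1))                # H(X1)
    H12 = H(P.ravel())                   # H(X1, X2)
    I = 2 * H1 - H12                     # I(X1;X2), using H(X2) = H(X1)

    rho_def = 1 - (H12 - H1) / H1        # 1 - H(X2|X1)/H(X1)
    rho_alt = I / H1                     # part (a): the two should agree
    print(rho_def, rho_alt)              # both lie in [0, 1], per part (b)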

3. Let S0 be the third extension of a zero-memory binary source with
P(0) = p. Another source, S, observes the outputs of S0 and emits a 0, 1,
2, or 3 according to whether the output block of S0 contained 0, 1, 2, or
3 zeros. Find H(S0) and H(S). What does H(S0) − H(S) signify?
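The answers are wanted in terms of p, but the relationship between H(S0)
and H(S) is easy to see numerically. A sketch at a sample value p = 0.3
(an assumption for illustration only):

    from math import comb, log2

    p = 0.3                                        # sample value only
    Hb = -(p * log2(p) + (1 - p) * log2(1 - p))    # entropy of one binary symbol
    H_S0 = 3 * Hb                                  # third extension: 3 symbols/block

    # The observing source emits k = number of zeros in the block of 3,
    # so k ~ Binomial(3, p).
    pk = [comb(3, k) * p**k * (1 - p)**(3 - k) for k in range(4)]
    H_S = -sum(q * log2(q) for q in pk if q > 0)

    # The difference is the information about the positions of the zeros,
    # which counting alone discards.
    print(H_S0, H_S, H_S0 - H_S)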
4. The state transition probabilities of a first-order Markov source with
three symbols are given by pij = q for i = j and pij = (1 − q)/2 for
i ≠ j. Find the value of q that maximizes the source entropy, and find
the maximum entropy.
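This matrix is doubly stochastic, so the stationary distribution is
uniform and the source entropy reduces to the entropy of a single row,
H(q, (1−q)/2, (1−q)/2). A brute-force sweep over q confirms the analytical
maximizer:

    from math import log2

    def source_entropy(q):
        # Entropy of one row of the transition matrix, which equals H(S)
        # here because the stationary state probabilities are uniform.
        row = [q, (1 - q) / 2, (1 - q) / 2]
        return -sum(p * log2(p) for p in row if p > 0)

    best_q = max((k / 1000 for k in range(1, 1000)), key=source_entropy)
    print(best_q, source_entropy(best_q))   # approaches q = 1/3, log2(3) bits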
5. Draw the state diagram and find the stationary state probabilities, the
entropy of the source, and the entropy of the adjoint source for the
Markov sources whose transition probabilities are given below.

(a) M = 3, r = 1. p11 = 0.5, p12 = 0.5, p21 = 0.25, p22 = 0.5, p32 = 0.5,
p33 = 0.5.
(b) S = {0, 1}, r = 2. The four states are S1 = 00, S2 = 01, S3 = 11,
S4 = 10. p11 = 0.6, p21 = 0.5, p33 = 0.6, p41 = 0.5.
(c) M = 3, r = 1. pij = 0.5 for i = j and pij = 0.25 for i ≠ j.
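Part (c) is fully specified, so here is a minimal sketch of the
computation for it (the same recipe applies to (a) and (b) once the
unspecified row entries are filled in so that each row sums to 1):

    import numpy as np

    # Transition matrix of part (c): 0.5 on the diagonal, 0.25 elsewhere.
    P = np.full((3, 3), 0.25) + 0.25 * np.eye(3)

    # Stationary state probabilities: left eigenvector of P for eigenvalue 1.
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1))])
    pi /= pi.sum()

    def H(p):
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    H_S = sum(pi[i] * H(P[i]) for i in range(3))   # entropy of the Markov source
    H_adjoint = H(pi)                              # adjoint = zero-memory source
                                                   # with the stationary symbol pmf
    print(pi, H_S, H_adjoint)                      # pi uniform, H(S) = 1.5 bits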

6. For the Markov sources given in the previous problem, find the
unconditional first- and second-order symbol entropies (G1 and G2) and
verify that G1 > G2 > H(S).
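A sketch of the check for the source of 5(c), assuming Abramson's
definition Gn = H(n-symbol block)/n, with G1 the entropy of the stationary
symbol distribution:

    import numpy as np

    P = np.full((3, 3), 0.25) + 0.25 * np.eye(3)   # matrix of problem 5(c)
    pi = np.full(3, 1 / 3)                         # stationary distribution
                                                   # (doubly stochastic => uniform)

    def H(p):
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    pair = pi[:, None] * P                         # P(s_i then s_j) = pi_i * p_ij
    G1 = H(pi)                                     # first-order symbol entropy
    G2 = H(pair.ravel()) / 2                       # second-order: H(pair)/2
    H_S = sum(pi[i] * H(P[i]) for i in range(3))
    print(G1, G2, H_S)                             # verifies G1 > G2 > H(S)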
