
E2 201: Information Theory (2019)

Homework 3
Instructor: Himanshu Tyagi

Homework Questions

Questions marked ∗ are more difficult.

Q1 Consider binary random variables X, Y , and Z, where X and Z are independent and are both
distributed uniformly and Y = X ⊕ Z.
Find H(X), H(Y ), H(X, Y ), H(X, Y |Z = 0), H(X, Y |Z = 1), and H(X, Y |Z). Use the
properties of entropy to argue that X and Y are independent and that X and Y are not
independent given Z.
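
A quick numerical sanity check of these values (a Python sketch, not part of the assignment; the joint pmf follows the problem statement, the rest is an illustrative choice):

    import numpy as np
    from itertools import product

    def H(p):
        # Shannon entropy in bits of a pmf given as a sequence of probabilities
        p = np.asarray([q for q in p if q > 0], dtype=float)
        return float(-(p * np.log2(p)).sum())

    # X, Z independent fair bits; Y = X xor Z, so P(x, x^z, z) = 1/4
    pXYZ = {(x, x ^ z, z): 0.25 for x, z in product((0, 1), repeat=2)}

    pX  = [sum(v for (x, y, z), v in pXYZ.items() if x == a) for a in (0, 1)]
    pY  = [sum(v for (x, y, z), v in pXYZ.items() if y == b) for b in (0, 1)]
    pXY = [sum(v for (x, y, z), v in pXYZ.items() if (x, y) == ab)
           for ab in product((0, 1), repeat=2)]

    print("H(X) =", H(pX), " H(Y) =", H(pY), " H(X,Y) =", H(pXY))
    # H(X,Y) = H(X) + H(Y) = 2 bits, consistent with X and Y independent
    for z0 in (0, 1):
        cond = [v / 0.5 for (x, y, z), v in pXYZ.items() if z == z0]
        print("H(X,Y|Z=%d) =" % z0, H(cond))
    # both equal 1 bit, so H(X,Y|Z) = 1 < H(X,Y): X, Y dependent given Z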

Q2 Show the following inequalities for entropies:

(a) H(X1, X2, X3) ≤ (1/2)[H(X1, X2) + H(X2, X3) + H(X3, X1)]
(b) H(X1, X2, X3) ≥ (1/2)[H(X1, X2|X3) + H(X2, X3|X1) + H(X3, X1|X2)]
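
Both inequalities can be sanity-checked numerically before proving them. A Python sketch, assuming a randomly drawn joint pmf on three bits (an illustrative choice):

    import numpy as np

    def H(p):
        # entropy in bits; flattens the array and skips zero entries
        p = np.asarray(p, dtype=float).ravel()
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    rng = np.random.default_rng(0)
    p = rng.random((2, 2, 2)); p /= p.sum()   # random joint pmf of (X1, X2, X3)

    H123 = H(p)
    H12, H23, H31 = H(p.sum(axis=2)), H(p.sum(axis=0)), H(p.sum(axis=1))
    H1 = H(p.sum(axis=(1, 2))); H2 = H(p.sum(axis=(0, 2))); H3 = H(p.sum(axis=(0, 1)))

    print(H123 <= 0.5 * (H12 + H23 + H31) + 1e-12)                          # (a)
    # conditional entropies via the chain rule, e.g. H(X1,X2|X3) = H123 - H3
    print(H123 >= 0.5 * ((H123 - H3) + (H123 - H1) + (H123 - H2)) - 1e-12)  # (b)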

Q3 Show that I(X ∧ Y, Z) ≥ max{I(X ∧ Y ), I(X ∧ Z), I(X ∧ Y |Z), I(X ∧ Z|Y )}. Find conditions
for equality.
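
A numerical sanity check of the inequality (Python sketch with a randomly drawn joint pmf; illustrative, not part of the assignment):

    import numpy as np

    def H(p):
        # entropy in bits; flattens the array and skips zero entries
        p = np.asarray(p, dtype=float).ravel()
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    rng = np.random.default_rng(1)
    p = rng.random((2, 2, 2)); p /= p.sum()   # random joint pmf of (X, Y, Z)

    Hx, Hy, Hz = H(p.sum(axis=(1, 2))), H(p.sum(axis=(0, 2))), H(p.sum(axis=(0, 1)))
    Hxy, Hxz, Hyz = H(p.sum(axis=2)), H(p.sum(axis=1)), H(p.sum(axis=0))
    Hxyz = H(p)

    I_x_yz = Hx + Hyz - Hxyz            # I(X ^ (Y,Z))
    I_x_y  = Hx + Hy - Hxy              # I(X ^ Y)
    I_x_z  = Hx + Hz - Hxz              # I(X ^ Z)
    I_x_y_given_z = Hxz + Hyz - Hz - Hxyz
    I_x_z_given_y = Hxy + Hyz - Hy - Hxyz

    print(I_x_yz + 1e-12 >= max(I_x_y, I_x_z, I_x_y_given_z, I_x_z_given_y))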

Q4 Show the following:

(a) min_x max_y f(x, y) ≥ max_y min_x f(x, y);
(b) the sum of two concave functions is concave;
(c) a linear function is both concave and convex;
(d) the sum of a concave function and a convex function can be convex, concave, or neither (give an example of each);
(e) ln x ≤ x − 1 for all x > 0, and 1 − x ≤ e^{−x} for all x ≥ 0.

Q5 A pack of cards is shuffled. Let X denote the random order of the resulting shuffled cards.
Note that X may or may not be uniformly distributed depending on the shuffling strategy.
After the shuffling operation, a card is selected at random (uniformly) and placed on the top.
Let Y denote the new order of the cards. Is H(X) more or less than H(Y )? Justify your
answer.

Q6 For a finite random variable X, let p denote the maximum probability of guessing X correctly. Show that

H(X) ≤ h(p) + (1 − p) log(|X| − 1),

and find the conditions for equality.
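
A numerical sanity check of this bound (Python sketch; the alphabet size 5, the random pmfs, and the seed are illustrative choices):

    import numpy as np

    def H(p):
        # entropy in bits of a pmf; skips zero entries
        p = np.asarray(p, dtype=float).ravel()
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    rng = np.random.default_rng(2)
    for _ in range(1000):
        P = rng.random(5); P /= P.sum()   # random pmf on 5 symbols
        pmax = P.max()                    # optimal guessing probability
        bound = H([pmax, 1 - pmax]) + (1 - pmax) * np.log2(len(P) - 1)
        assert H(P) <= bound + 1e-9
    print("bound held on all trials")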

Q7 Show that if X −◦− Y −◦− Z, then I(X ∧ Y |Z) ≤ I(X ∧ Y ) and find the conditions for equality.
Furthermore, provide an example where I(X ∧ Y |Z) > I(X ∧ Y ).

Q8 Let P and Q be pmfs over finite alphabets X and Y, respectively, and let W : X → Y be a discrete channel. Show that D(W‖Q|P) = Σ_x P(x) D(W_x‖Q) is convex in Q (for a fixed P) and linear in P (for a fixed Q).
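
Convexity in Q can be sanity-checked at a random midpoint (Python sketch; the alphabet sizes and the seed are illustrative):

    import numpy as np

    def D(a, b):
        # KL divergence in bits; assumes supp(a) is contained in supp(b)
        a, b = np.asarray(a, float), np.asarray(b, float)
        m = a > 0
        return float((a[m] * np.log2(a[m] / b[m])).sum())

    def D_cond(W, Q, P):
        # D(W||Q|P) = sum_x P(x) D(W_x || Q), with W_x the x-th row of W
        return sum(P[x] * D(W[x], Q) for x in range(len(P)))

    rng = np.random.default_rng(3)
    P = rng.random(3); P /= P.sum()
    W = rng.random((3, 4)); W /= W.sum(axis=1, keepdims=True)  # channel rows
    Q1 = rng.random(4); Q1 /= Q1.sum()
    Q2 = rng.random(4); Q2 /= Q2.sum()

    mid = D_cond(W, 0.5 * (Q1 + Q2), P)
    avg = 0.5 * D_cond(W, Q1, P) + 0.5 * D_cond(W, Q2, P)
    print(mid <= avg + 1e-12)   # midpoint convexity in Q
    # linearity in P is visible from the definition: D_cond is a P-weighted sum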

Q9 For discrete random variables X, Y, Z, show that the Markov relations X −◦− Y −◦− Z and X −◦− Z −◦− Y hold iff there exist functions f : Y → U and g : Z → U such that (a) P(f(Y) = g(Z)) = 1; (b) X −◦− f(Y) −◦− (Y, Z) holds; (c) X −◦− g(Z) −◦− (Y, Z) holds.

Q10 ∗

(i) Show that H(X1, ..., Xn) ≤ Σ_{i=1}^{n} H(Xi) and find conditions for equality.
(ii) Use the previous inequality to show that for 0 ≤ p ≤ 1/2,

Σ_{i ≤ np} (n choose i) ≤ 2^{nh(p)},

where h(p) denotes the binary entropy function.
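
The bound in (ii) can be sanity-checked numerically (Python sketch; n = 50 and the values of p are illustrative choices):

    from math import comb, floor, log2

    def h(p):
        # binary entropy in bits, with h(0) = h(1) = 0
        return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

    n = 50
    for p in (0.1, 0.25, 0.5):
        lhs = sum(comb(n, i) for i in range(floor(n * p) + 1))
        print(p, lhs <= 2 ** (n * h(p)))   # True for each p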

Q11 (i) Consider random variables X1, ..., Xn, Y where Y takes values in the set {0, 1}^k and X1, ..., Xn are mutually independent. Show that we can always find a coordinate i such that I(Xi ∧ Y) ≤ k/n.
(ii) Show via an example that the conclusion of the previous part doesn't hold if the assumption of mutual independence of X1, ..., Xn is dropped.

Q12 Consider discrete random variables U and X where U is distributed uniformly over {0, 1}^m. We quantize X to b bits (2^b levels) and store the quantized value. Later we form an estimate Û of U from the stored value, which agrees with U with probability of error less than 1/3. Show that we must have quantized to more than (2/3)(m − 1) bits.

Q13 ∗ Using the data processing inequality, show that Pinsker's inequality holds for all distributions iff it holds for binary distributions (those with support {0, 1}). Further, show that Pinsker's inequality holds for binary distributions, that is,

p log(p/q) + (1 − p) log((1 − p)/(1 − q)) ≥ (2/ln 2) · (p − q)^2.
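
The binary inequality can be sanity-checked on a grid before proving it (Python sketch; the grid is an illustrative choice):

    import numpy as np

    def d_bin(p, q):
        # binary KL divergence in bits
        def term(a, b):
            return 0.0 if a == 0 else a * np.log2(a / b)
        return term(p, q) + term(1 - p, 1 - q)

    grid = np.linspace(0.01, 0.99, 99)
    ok = all(d_bin(p, q) + 1e-12 >= (2 / np.log(2)) * (p - q) ** 2
             for p in grid for q in grid)
    print(ok)   # True: the inequality holds on the whole grid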

Q14 ∗

(a) Establish the following variational formula for a discrete distribution P over real numbers: for all λ > 0,

log E_P[2^{λ(X − E_P[X])}] = max_{Q : supp(Q) ⊂ supp(P)} [ λ(E_Q[X] − E_P[X]) − D(Q‖P) ].

(b) Show that if |X| ≤ 1 with probability 1, then E_Q[X] − E_P[X] ≤ d(P, Q).
(c) Conclude that if |X| ≤ 1, then

log E_P[2^{λ(X − E_P[X])}] ≤ λ^2/(2 ln 2).
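
A numerical sanity check of (a) and (c) (Python sketch; the pmf P, the values of X, and λ are illustrative choices, and the maximizer is taken to be the exponentially tilted pmf Q*(x) ∝ P(x)·2^{λx}, the standard maximizer for Gibbs-type variational formulas):

    import numpy as np

    P   = np.array([0.2, 0.5, 0.3])     # illustrative pmf
    X   = np.array([-1.0, 0.2, 0.8])    # illustrative values with |X| <= 1
    lam = 1.5

    def D(q, p):
        # KL divergence in bits; skips zero entries of q
        m = q > 0
        return float((q[m] * np.log2(q[m] / p[m])).sum())

    lhs = np.log2(P @ 2.0 ** (lam * (X - P @ X)))

    # the max in (a) is attained by the tilted pmf Q* proportional to P(x) 2^{lam x}
    Q = P * 2.0 ** (lam * X); Q /= Q.sum()
    rhs = lam * (Q @ X - P @ X) - D(Q, P)
    print(np.isclose(lhs, rhs))                 # True: the max is attained at Q*
    print(lhs <= lam ** 2 / (2 * np.log(2)))    # the bound in (c) holds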
