Homework 3
Instructor: Himanshu Tyagi
Q1 Consider binary random variables X, Y, and Z, where X and Z are independent and both
uniformly distributed, and Y = X ⊕ Z.
Find H(X), H(Y), H(X, Y), H(X, Y|Z = 0), H(X, Y|Z = 1), and H(X, Y|Z). Use the
properties of entropy to argue that X and Y are independent, but that X and Y are not
independent given Z.
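Since all the quantities in Q1 can be obtained by direct enumeration, a small numeric spot-check may help; this sketch and its variable names are illustrative, not part of the problem.

```python
from math import log2

def H(pmf):
    """Shannon entropy in bits of a pmf given as a list of probabilities."""
    return -sum(p * log2(p) for p in pmf if p > 0)

# X, Z independent uniform bits and Y = X xor Z; each (x, z) pair has prob. 1/4.
outcomes = [(x, z, x ^ z) for x in (0, 1) for z in (0, 1)]

def marginal(key):
    m = {}
    for (x, z, y) in outcomes:
        k = key(x, z, y)
        m[k] = m.get(k, 0.0) + 0.25
    return list(m.values())

HX   = H(marginal(lambda x, z, y: x))          # 1 bit
HY   = H(marginal(lambda x, z, y: y))          # 1 bit
HXY  = H(marginal(lambda x, z, y: (x, y)))     # 2 bits, so H(X,Y) = H(X) + H(Y)
HZ   = H(marginal(lambda x, z, y: z))
HXYZ = H(marginal(lambda x, z, y: (x, y, z)))
HXY_given_Z = HXYZ - HZ                        # 1 bit < 2: dependence given Z
# Given Z = z0, (X, Y) is uniform over two pairs, so H(X,Y|Z = z0) = 1 bit.
HXY_given_Z0 = H([0.5, 0.5])
```

Here H(X, Y) = H(X) + H(Y) certifies independence, while H(X, Y|Z) < H(X|Z) + H(Y|Z) certifies conditional dependence.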
Q3 Show that I(X ∧ Y, Z) ≥ max{I(X ∧ Y), I(X ∧ Z), I(X ∧ Y|Z), I(X ∧ Z|Y)}. Find conditions
for equality.
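The inequality in Q3 can be sanity-checked numerically on a randomly drawn joint pmf; the sketch below (helper names are ours) computes all five quantities from entropies.

```python
import random
from math import log2

random.seed(0)

# Random joint pmf of three bits (X, Y, Z).
vals = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
w = [random.random() for _ in vals]
p = {v: wi / sum(w) for v, wi in zip(vals, w)}

def H(key):
    """Entropy (bits) of the pmf obtained by merging outcomes via key."""
    m = {}
    for v, pv in p.items():
        m[key(v)] = m.get(key(v), 0.0) + pv
    return -sum(q * log2(q) for q in m.values() if q > 0)

def I(a, b):        # I(A ∧ B) = H(A) + H(B) - H(A,B)
    return H(a) + H(b) - H(lambda v: (a(v), b(v)))

def Ic(a, b, c):    # I(A ∧ B|C) = H(A,C) + H(B,C) - H(A,B,C) - H(C)
    return (H(lambda v: (a(v), c(v))) + H(lambda v: (b(v), c(v)))
            - H(lambda v: (a(v), b(v), c(v))) - H(c))

X, Y, Z = (lambda v: v[0]), (lambda v: v[1]), (lambda v: v[2])
lhs = I(X, lambda v: (v[1], v[2]))   # I(X ∧ Y,Z)
rhs = max(I(X, Y), I(X, Z), Ic(X, Y, Z), Ic(X, Z, Y))
```

The check passes for any joint pmf, which is consistent with the chain rule I(X ∧ Y, Z) = I(X ∧ Y) + I(X ∧ Z|Y) = I(X ∧ Z) + I(X ∧ Y|Z).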
Q5 A pack of cards is shuffled. Let X denote the random order of the resulting shuffled cards.
Note that X may or may not be uniformly distributed depending on the shuffling strategy.
After the shuffling operation, a card is selected at random (uniformly) and placed on the top.
Let Y denote the new order of the cards. Is H(X) more or less than H(Y )? Justify your
answer.
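Before arguing in general, it may help to explore Q5 on a tiny deck; the 3-card experiment below (the nonuniform weights are an arbitrary illustrative choice) computes H(X) and H(Y) exactly.

```python
from itertools import permutations
from math import log2

# Nonuniform distribution on the 6 orders of a 3-card deck (weights arbitrary).
orders = list(permutations(range(3)))
weights = [1, 2, 3, 4, 5, 6]
pX = {o: w / 21 for o, w in zip(orders, weights)}

# Move the card at a uniformly random position c to the top.
pY = {}
for o, po in pX.items():
    for c in range(3):  # each position chosen with probability 1/3
        new = (o[c],) + tuple(v for i, v in enumerate(o) if i != c)
        pY[new] = pY.get(new, 0.0) + po / 3

H = lambda pmf: -sum(q * log2(q) for q in pmf.values() if q > 0)
HX, HY = H(pX), H(pY)
```

For each fixed position c the move is a bijection on orders, so pY is a mixture of relabelings of pX; concavity of entropy then suggests which direction the comparison should go.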
Q6 For a finite random variable X, let p denote the maximum probability of guessing X correctly.
Show that
Q7 Show that if X −◦− Y −◦− Z, then I(X ∧ Y |Z) ≤ I(X ∧ Y ) and find the conditions for equality.
Furthermore, provide an example where I(X ∧ Y |Z) > I(X ∧ Y ).
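For the last part of Q7, a standard example is X, Y independent uniform bits with Z = X ⊕ Y (note this is not a Markov chain X −◦− Y −◦− Z, so it does not contradict the first part). A quick check:

```python
from math import log2

# X, Y independent uniform bits, Z = X xor Y; each (x, y) pair has prob. 1/4.
p = {(x, y, x ^ y): 0.25 for x in (0, 1) for y in (0, 1)}

def H(key):
    m = {}
    for v, pv in p.items():
        m[key(v)] = m.get(key(v), 0.0) + pv
    return -sum(q * log2(q) for q in m.values() if q > 0)

# I(X ∧ Y) = H(X) + H(Y) - H(X,Y) = 0 (independence)
I_XY = H(lambda v: v[0]) + H(lambda v: v[1]) - H(lambda v: v[:2])
# I(X ∧ Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z) = 1
I_XY_given_Z = (H(lambda v: (v[0], v[2])) + H(lambda v: (v[1], v[2]))
                - H(lambda v: v) - H(lambda v: v[2]))
```

Knowing Z, each of X and Y determines the other, so conditioning creates one full bit of dependence.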
1
Q8 Let P and Q be pmfs over finite alphabets X and Y, respectively, and let W : X → Y be a
discrete channel. Show that D(W‖Q|P) = Σ_x P(x) D(W(·|x)‖Q) is convex in Q (for a fixed P)
and linear in P (for a fixed Q).
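Both properties in Q8 can be spot-checked numerically on random instances; this sketch (with our own helper names) tests a midpoint inequality for convexity in Q and exact equality for linearity in P.

```python
import random
from math import log2

random.seed(1)

def rand_pmf(n):
    w = [random.random() for _ in range(n)]
    return [x / sum(w) for x in w]

def D(P, Q):
    """KL divergence in bits (assumes Q > 0 wherever P > 0)."""
    return sum(p * log2(p / q) for p, q in zip(P, Q) if p > 0)

def Dcond(W, Q, P):
    """D(W‖Q|P) = sum_x P(x) D(W(·|x)‖Q)."""
    return sum(px * D(Wx, Q) for px, Wx in zip(P, W))

P, P2 = rand_pmf(3), rand_pmf(3)
W = [rand_pmf(4) for _ in range(3)]          # channel rows W(·|x)
Q1, Q2 = rand_pmf(4), rand_pmf(4)
Qm = [(a + b) / 2 for a, b in zip(Q1, Q2)]   # midpoint in Q
Pm = [(a + b) / 2 for a, b in zip(P, P2)]    # midpoint in P

# Convexity in Q: average of endpoints minus value at midpoint should be >= 0.
conv_gap = (Dcond(W, Q1, P) + Dcond(W, Q2, P)) / 2 - Dcond(W, Qm, P)
# Linearity in P: value at midpoint should equal average of endpoints exactly.
lin_gap = Dcond(W, Q1, Pm) - (Dcond(W, Q1, P) + Dcond(W, Q1, P2)) / 2
```

Linearity in P is immediate from the defining sum; convexity in Q inherits from convexity of D(P‖Q) in its second argument.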
Q10∗ (i) Show that H(X1, ..., Xn) ≤ Σ_{i=1}^n H(Xi) and find conditions for equality.
(ii) Use the previous inequality to show that for 0 ≤ p ≤ 1/2,
Σ_{i ≤ np} (n choose i) ≤ 2^{nh(p)},
where h denotes the binary entropy function.
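The bound in (ii) can be spot-checked numerically for a few (n, p) pairs; the sketch below is illustrative only.

```python
from math import comb, log2

def h(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

# Check sum_{i <= np} C(n, i) <= 2^{n h(p)} for several (n, p) pairs.
checks = []
for n in (10, 25, 60):
    for p in (0.05, 0.2, 0.5):
        total = sum(comb(n, i) for i in range(int(n * p) + 1))
        checks.append(total <= 2 ** (n * h(p)))
```

At p = 1/2 the right side is the full 2^n, and as p shrinks the bound tightens, which matches the intuition that the tail of the binomial coefficients is exponentially small.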
Q11 (i) Consider random variables X1, ..., Xn, Y where Y takes values in the set {0, 1}^k and
X1, ..., Xn are mutually independent. Show that we can always find a coordinate i such
that I(Xi ∧ Y) ≤ k/n.
(ii) Show via an example that the conclusion of the previous part doesn't hold if the
assumption of mutual independence of X1, ..., Xn is dropped.
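A concrete instance of (i), purely as a sanity check (the choice Y = majority is ours): n = 3 independent fair bits, k = 1, and Y the majority bit; some coordinate must then satisfy I(Xi ∧ Y) ≤ k/n = 1/3.

```python
from math import log2

# n = 3 independent uniform bits; Y = majority(X1, X2, X3) in {0, 1}.
n, k = 3, 1
p = {}
for bits in range(2 ** n):
    x = tuple((bits >> i) & 1 for i in range(n))
    y = int(sum(x) >= 2)
    p[(x, y)] = 1 / 2 ** n

def H(key):
    m = {}
    for v, pv in p.items():
        m[key(v)] = m.get(key(v), 0.0) + pv
    return -sum(q * log2(q) for q in m.values() if q > 0)

# I(Xi ∧ Y) = H(Xi) + H(Y) - H(Xi, Y) for each coordinate i.
Is = [H(lambda v, i=i: v[0][i]) + H(lambda v: v[1])
      - H(lambda v, i=i: (v[0][i], v[1])) for i in range(n)]
```

By symmetry all three mutual informations coincide here, each comfortably below 1/3.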
Q12 Consider discrete random variables U and X where U is distributed uniformly over {0, 1}^m.
We quantize X to b bits (2^b levels) and store the quantized value. Later, we form an estimate
Û of U from the stored value, with probability of error P(Û ≠ U) less than 1/3.
Show that we must have quantized to more than (2/3)(m − 1) bits.
Q13∗ Using the data processing inequality, show that Pinsker's inequality holds for all distributions
iff it holds for binary distributions (those with support {0, 1}). Further, show that Pinsker's
inequality holds for binary distributions, that is,
p log(p/q) + (1 − p) log((1 − p)/(1 − q)) ≥ (2/ln 2) · (p − q)².
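The binary case can be verified numerically over a grid of (p, q) pairs before proving it; a minimal sketch:

```python
from math import log

def d_bin(p, q):
    """Binary KL divergence D((p, 1-p) || (q, 1-q)) in bits."""
    t = lambda a, b: 0.0 if a == 0 else a * log(a / b, 2)
    return t(p, q) + t(1 - p, 1 - q)

# Grid check of p log(p/q) + (1-p) log((1-p)/(1-q)) >= (2/ln 2) (p - q)^2.
grid = [i / 50 for i in range(51)]
violations = [(p, q) for p in grid for q in grid if 0 < q < 1
              and d_bin(p, q) < (2 / log(2)) * (p - q) ** 2 - 1e-9]
```

The endpoints q ∈ {0, 1} are excluded since the divergence is infinite there whenever p lies strictly inside (0, 1), and the inequality is trivial in that case.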
Q14∗ (a) Establish the following variational formula for a discrete distribution P over real
numbers: For all λ > 0,
log E_P[2^{λX}] = max_{Q : supp(Q) ⊂ supp(P)} { λ(E_Q[X] − E_P[X]) − D(Q‖P) }.
(b) Show that if |X| ≤ 1 with probability 1, then E_Q[X] − E_P[X] ≤ d(P, Q).
(c) Conclude that if |X| ≤ 1, then
log E_P[2^{λX}] ≤ λ²/(2 ln 2).
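The bound in (c) can be spot-checked on random examples. One caveat: we additionally center X under P (E_P[X] = 0) in the sketch below — this assumption is ours, read off from how (a) and (b) combine, and the construction is illustrative only.

```python
import random
from math import log2, log

random.seed(2)

# Spot-check log E_P[2^{lam X}] <= lam^2 / (2 ln 2) for |X| <= 1,
# assuming (our reading) that X is centered under P.
xs = [-1.0, -0.25, 0.5, 1.0]                   # support values, |X| <= 1
worst_gap = float("-inf")
for _ in range(200):
    w = [random.random() for _ in xs]
    P = [v / sum(w) for v in w]
    m = sum(p * x for p, x in zip(P, xs))      # E_P[X] before centering
    scale = max(abs(x - m) for x in xs)
    Xc = [(x - m) / scale for x in xs]         # centered, still |X| <= 1
    for lam in (0.1, 0.5, 1.0, 2.0):
        lhs = log2(sum(p * 2 ** (lam * x) for p, x in zip(P, Xc)))
        worst_gap = max(worst_gap, lhs - lam ** 2 / (2 * log(2)))
```

The gap stays strictly negative in every trial, consistent with the sub-Gaussian behaviour of bounded centered random variables.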