
CS683/CMPE683: Spring 2007-2008

Information Theory

Handout #6

Homework 2
(Due Thursday, April 17, 2008 at 5:00pm)
All problems are from the first edition of Cover and Thomas unless stated otherwise.
Problem 1: Problem 2.6 in text
Problem 2: Problem 2.7 in text
Problem 3: Problem 2.21 in text
Problem 4: Problem 2.30 in text (hint: the correct answer may not be obvious)
Problem 5: Problem 2.40 in 2nd edition of C&T (repeated below)
Discrete entropies. Let X and Y be two independent integer-valued random variables. Let
X be uniformly distributed over {1, 2, ..., 8}, and let Pr{Y = k} = 2^{-k}, k = 1, 2, 3, ...
(a) Find H(X).
(b) Find H(Y).
(c) Find H(X + Y, X - Y).
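
(Not required for the assignment.) The following is a minimal Python sketch for numerically sanity-checking entropy values of this kind. The truncation depth K for Y's infinite support is an arbitrary choice made here for illustration.

```python
import math

def entropy(pmf):
    """Shannon entropy, in bits, of a pmf given as an iterable of probabilities."""
    return -sum(p * math.log2(p) for p in pmf if p > 0)

# X uniform on {1, ..., 8}
H_X = entropy([1 / 8] * 8)                              # equals 3 bits

# Y with Pr{Y = k} = 2^{-k}; the infinite support is truncated at K terms
K = 60
H_Y = entropy([2.0 ** (-k) for k in range(1, K + 1)])   # approaches 2 bits

print(H_X, H_Y)
```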
Problem 6: Problem 2.43 in 2nd edition of C&T (repeated below)
Mutual information of heads and tails.
(a) Consider a fair coin flip. What is the mutual information between the top and bottom
sides of the coin?
(b) A six-sided fair die is rolled. What is the mutual information between the top side and
the front face (the side most facing you)?
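
(Not required for the assignment.) A short sketch for checking mutual-information values from an explicit joint pmf; the coin dictionary below is illustrative and encodes the fact that the bottom face is determined by the top face.

```python
import math
from collections import defaultdict

def mutual_information(joint):
    """I(X; Y) in bits, from a joint pmf given as a dict {(x, y): probability}."""
    px, py = defaultdict(float), defaultdict(float)
    for (x, y), p in joint.items():
        px[x] += p
        py[y] += p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Part (a): for a fair coin, only the two consistent (top, bottom) pairs have mass.
coin = {("H", "T"): 0.5, ("T", "H"): 0.5}
print(mutual_information(coin))   # 1 bit, i.e., H(top)
```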
Problem 7: Problem 2.44 in 2nd edition of C&T (repeated below)
Pure randomness. We wish to use a three-sided coin to generate a fair coin toss. Let the
coin X have probability mass function
X = A with probability pA,
X = B with probability pB,
X = C with probability pC,

where pA, pB, pC are unknown.


(a) How would you use two independent flips X1, X2 to generate (if possible) a
Bernoulli(1/2) random variable Z?
(b) What is the resulting maximum expected number of fair bits generated?
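
(Not required for the assignment.) For experimenting with part (a), here is a sketch of one standard construction, a von Neumann-style pairing; it is not necessarily the intended textbook solution, and the probabilities 0.2, 0.3, 0.5 in the simulation are arbitrary stand-ins for the unknown pA, pB, pC.

```python
import random

def fair_bit(x1, x2):
    """Von Neumann-style extraction from two independent flips: an unordered
    pair {a, b} with a != b occurs with equal probability in either order,
    so the order carries a fair bit. Returns None when x1 == x2."""
    if x1 == x2:
        return None
    return 0 if x1 < x2 else 1

# Illustrative simulation with arbitrary biases (the true pA, pB, pC are unknown).
faces, weights = ["A", "B", "C"], [0.2, 0.3, 0.5]
samples = [fair_bit(*random.choices(faces, weights, k=2)) for _ in range(100_000)]
bits = [b for b in samples if b is not None]
print(sum(bits) / len(bits))   # close to 1/2 regardless of the biases
```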
Problem 8: Problem 4.1 in text
Problem 9: Problem 4.2 in text
Problem 10: Problem 4.4 in text
Problem 11: Problem 4.9 in text
