
Information Theory COMM 1003

Spring 2017

Tutorial 2

Problem 1

The input source to a noisy communication channel is a random variable X over the four symbols
a, b, c, d. The output from this channel is a random variable Y over these same four symbols. The
joint distribution of these two random variables is as follows:

              y = a     y = b     y = c     y = d
    x = a      1/8      1/16      1/32      1/32
    x = b      1/16     1/8       1/32      1/32
    x = c      1/16     1/16      1/16      1/16
    x = d      1/4       0         0         0

(a) Write down the marginal (individual) distribution for X and compute the marginal entropy
H(X) in bits.

(b) Write down the marginal distribution for Y and compute the marginal entropy H(Y) in bits.

(c) What is the joint entropy H(X, Y) of the two random variables in bits?

(d) What is the conditional entropy H(Y |X) in bits?

(e) What is the mutual information I(X; Y) between the two random variables in bits?

(f) Provide a lower bound estimate of the channel capacity C for this channel in bits.

Answer:

a) Using the law of total probability, the marginal (individual) distribution for X is (1/4, 1/4, 1/4, 1/4).

The entries in the table are joint probabilities, so they must sum to 1:


Σ_i Σ_j p(x_i, y_j) = 1,  i, j ∈ {a, b, c, d}   (check this yourself!)

p(x, y) = p(y|x) p(x) = p(x|y) p(y)    (1)

From the law of total probability:


p(x) = Σ_j p(x|y_j) p(y_j) = Σ_j p(x, y_j)    (from Eqn. (1))

p(x = a) = p(x = a, y = a) + p(x = a, y = b) + p(x = a, y = c) + p(x = a, y = d) = 1/8 + 1/16 + 1/32 + 1/32 = 1/4,

and similarly p(x = b) = 1/4, p(x = c) = 1/4, p(x = d) = 1/4.

The marginal entropy H(X) = 4 × [−(1/4) log2(1/4)] = 1/2 + 1/2 + 1/2 + 1/2 = 2 bits.

b) The marginal distribution for Y is (1/2, 1/4, 1/8, 1/8); e.g., p(y = a) = 1/8 + 1/16 + 1/16 + 1/4 = 1/2.


The marginal entropy H(Y) = 1/2 + 1/2 + 3/8 + 3/8 = 7/4 bits.
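
These two marginals and entropies can be checked numerically. The following is a small sketch only, assuming Python with NumPy; the array P holds the joint table above, with rows indexed by X and columns by Y:

import numpy as np

# Joint distribution p(x, y); rows = x in {a, b, c, d}, columns = y in {a, b, c, d}
P = np.array([[1/8,  1/16, 1/32, 1/32],
              [1/16, 1/8,  1/32, 1/32],
              [1/16, 1/16, 1/16, 1/16],
              [1/4,  0,    0,    0   ]])

p_x = P.sum(axis=1)                  # marginal of X (row sums)    -> [1/4, 1/4, 1/4, 1/4]
p_y = P.sum(axis=0)                  # marginal of Y (column sums) -> [1/2, 1/4, 1/8, 1/8]
H_X = -np.sum(p_x * np.log2(p_x))    # 2.0 bits
H_Y = -np.sum(p_y * np.log2(p_y))    # 1.75 bits
print(p_x, H_X, p_y, H_Y)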

c) Joint entropy H(X, Y):

H(X, Y) = −Σ_i Σ_j p(x_i, y_j) log2 p(x_i, y_j)

i.e., the sum runs over all 16 probabilities in the joint distribution (of which only 4 different non-zero values appear, with frequencies 1, 2, 6 and 4):

H(X, Y) = (1)(2/4) + (2)(3/8) + (6)(4/16) + (4)(5/32) = 1/2 + 3/4 + 3/2 + 5/8 = 27/8 bits.

d) Conditional entropy:

H(Y|X) = H(X, Y) − H(X) = 27/8 − 16/8 = 11/8 bits.
Or, from the definition:

H(Y|X) = −Σ_i Σ_j p(x_i, y_j) log2 p(y_j|x_i)
       = −Σ_i p(x_i) Σ_j p(y_j|x_i) log2 p(y_j|x_i)
       = Σ_i p(x_i) H(Y | X = x_i),

since p(y_j|x_i) p(x_i) = p(x_i, y_j).

Therefore,
H(Y|X) = (1/4) H(1/2, 1/4, 1/8, 1/8) + (1/4) H(1/4, 1/2, 1/8, 1/8) + (1/4) H(1/4, 1/4, 1/4, 1/4)
       + (1/4) H(1, 0, 0, 0)
       = (1/4)(1/2 + 1/2 + 3/8 + 3/8) + (1/4)(1/2 + 1/2 + 3/8 + 3/8) + (1/4)(1/2 + 1/2 + 1/2 + 1/2) + (1/4)(0)
       = (1/4)(7/4) + (1/4)(7/4) + 1/2 + 0
       = 7/8 + 1/2 = 11/8 bits.

As a check, H(X, Y) = H(X) + H(Y|X) = 16/8 + 11/8 = 27/8 bits, in agreement with part (c).
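
The joint and conditional entropies can be checked in the same way (again only a sketch assuming NumPy, with the same joint table; the zero entries are dropped so that 0·log 0 is treated as 0, exactly as in the hand calculation):

import numpy as np

P = np.array([[1/8,  1/16, 1/32, 1/32],
              [1/16, 1/8,  1/32, 1/32],
              [1/16, 1/16, 1/16, 1/16],
              [1/4,  0,    0,    0   ]])

nonzero = P[P > 0]                          # the 13 non-zero joint probabilities
H_XY = -np.sum(nonzero * np.log2(nonzero))  # 3.375 = 27/8 bits

p_x = P.sum(axis=1)
H_X = -np.sum(p_x * np.log2(p_x))           # 2 bits
H_Y_given_X = H_XY - H_X                    # 1.375 = 11/8 bits
print(H_XY, H_Y_given_X)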

e) Mutual information I(X; Y):


There are three alternative ways to obtain the answer:
I(X; Y) = H(Y) − H(Y|X) = 7/4 − 11/8 = 3/8 bits. Or,
I(X; Y) = H(X) − H(X|Y) = 2 − 13/8 = 3/8 bits, where H(X|Y) = H(X, Y) − H(Y) = 27/8 − 14/8 = 13/8. Or,
I(X; Y) = H(X) + H(Y) − H(X, Y) = 2 + 7/4 − 27/8 = (16 + 14 − 27)/8 = 3/8 bits.
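
All three identities can be verified with the same table (a sketch assuming NumPy; the helper H below is a generic entropy-in-bits function):

import numpy as np

P = np.array([[1/8,  1/16, 1/32, 1/32],
              [1/16, 1/8,  1/32, 1/32],
              [1/16, 1/16, 1/16, 1/16],
              [1/4,  0,    0,    0   ]])

def H(p):
    # Entropy in bits; zero probabilities contribute nothing.
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_X, H_Y, H_XY = H(P.sum(axis=1)), H(P.sum(axis=0)), H(P.flatten())
print(H_Y - (H_XY - H_X))     # I(X;Y) = H(Y) - H(Y|X)        = 0.375 bits
print(H_X - (H_XY - H_Y))     # I(X;Y) = H(X) - H(X|Y)        = 0.375 bits
print(H_X + H_Y - H_XY)       # I(X;Y) = H(X) + H(Y) - H(X,Y) = 0.375 bits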

f) Channel capacity is the maximum, over all possible input distributions, of the mutual
information that the channel establishes between the input and the output. Any particular
measurement of the mutual information for this channel is therefore a lower bound estimate of C,
such as the value of 3/8 bits obtained above.

Problem 2

Consider a binary symmetric communication channel, whose input source is the alphabet X = {0,
1} with probabilities {0.5, 0.5}; whose output alphabet is Y = {0, 1}; and whose channel matrix
is:
[ 1 − ε      ε   ]
[   ε      1 − ε ]

where ε is the probability of transmission error.

a. What is the entropy of the source, H(X)?

b. What is the probability distribution of the outputs, p(Y), and the entropy of this output
distribution, H(Y)?

c. What is the joint probability distribution for the source and the output, p(X, Y), and what is the
joint entropy, H(X, Y)?

d. What is the mutual information of this channel, I(X; Y)?

e. How many values are there for ε for which the mutual information of this channel is maximal?
What are those values, and what then is the capacity of such a channel in bits?

f. For what value of ε is the capacity of this channel minimal? What is the channel capacity in
that case?

Answer:

a) Entropy of the source, H(X), is 1 bit.

b) Output probabilities are p(y = 0) = (0.5)(1 − ε) + (0.5)ε = 0.5 and p(y = 1) = (0.5)(1 − ε) +
(0.5)ε = 0.5. Entropy of this distribution is H(Y) = 1 bit, just as for the entropy H(X) of the input
distribution.
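
A small sketch for this channel (assuming NumPy; ε = 0.1 is only an example value, and the variable names are ad hoc):

import numpy as np

eps = 0.1                             # example error probability (any 0 <= eps <= 1)
p_x = np.array([0.5, 0.5])            # equiprobable input symbols {0, 1}
Q = np.array([[1 - eps, eps],         # channel matrix: rows = input x, columns = output y
              [eps, 1 - eps]])

p_y = p_x @ Q                         # output distribution -> [0.5, 0.5] for every eps
H_Y = -np.sum(p_y * np.log2(p_y))     # 1 bit
print(p_y, H_Y)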

c) The joint probability distribution p(X, Y) and the remaining parts are left as a self-study assignment.

Best Regards,
Eng. Omnia Mahmoud
