
Digital Communication

Tutorial sheet

Q1- In the random experiment of tossing two dice, let the random variable X be the difference of the two outcomes. Calculate the expected value and the variance of X.

Q2- Calculate the capacity of an AWGN channel with a bandwidth of 5 kHz when the signal and noise powers are 10 mW and 1 mW, respectively.

Q3- Calculate the entropy of the random variable of Q1.

Q4- Design the Huffman code for each of the following scenarios, then calculate the entropy and the average codeword length:

a) 0.25  0.25  0.17  0.17  0.16
b) 0.25  0.25  0.25  0.25

Q5- A fair coin is tossed until the first head occurs. Let X denote the number of tosses required; find the entropy H(X). The following expression may be helpful:
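Answers to Q1, Q2, Q3, and Q5 can be checked numerically. A minimal Python sketch follows, under two assumptions not stated on the sheet: that X in Q1 is the first die's outcome minus the second's, and that the infinite entropy sum in Q5 is truncated at a large index.

```python
from itertools import product
from math import log2

# Q1: X = d1 - d2 over the 36 equally likely outcomes of two fair dice.
outcomes = [d1 - d2 for d1, d2 in product(range(1, 7), repeat=2)]
mean = sum(outcomes) / 36
var = sum((x - mean) ** 2 for x in outcomes) / 36

# Q3: entropy of X computed from its probability mass function.
pmf = {x: outcomes.count(x) / 36 for x in set(outcomes)}
H_X = -sum(p * log2(p) for p in pmf.values())

# Q2: Shannon capacity C = B * log2(1 + S/N), with B = 5 kHz and S/N = 10.
capacity = 5000 * log2(1 + 10 / 1)

# Q5: X ~ Geometric(1/2), P(X = n) = 2**-n; truncate the entropy series.
H_geom = -sum(2**-n * log2(2**-n) for n in range(1, 60))

print(mean, var, H_X, capacity, H_geom)
```

For these values the sketch gives E[X] = 0, Var(X) = 35/6 ≈ 5.83, H(X) ≈ 3.27 bits for Q3, C = 5000·log2(11) ≈ 17.3 kbit/s for Q2, and H(X) = 2 bits for Q5.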
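The Q4 designs can likewise be verified with a sketch of binary Huffman construction using `heapq`; `huffman_lengths` is a helper name introduced here, and ties may be broken differently on paper, but the resulting average length is the same.

```python
import heapq
from math import log2

def huffman_lengths(probs):
    """Codeword lengths from a standard binary Huffman merge."""
    # Heap entries: (probability, unique tiebreaker, symbol indices).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:  # every symbol under the merge gains one bit
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

for probs in ([0.25, 0.25, 0.17, 0.17, 0.16], [0.25, 0.25, 0.25, 0.25]):
    L = huffman_lengths(probs)
    avg = sum(p * l for p, l in zip(probs, L))
    H = -sum(p * log2(p) for p in probs)
    print(probs, L, avg, H)
```

For scenario a) the lengths come out as {2, 2, 2, 3, 3} with average 2.33 bits against an entropy of about 2.29 bits; for scenario b) all four codewords have length 2, so the average equals the entropy of 2 bits exactly.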

Q6- A joint ensemble XY has the following joint distribution.


a) What is the joint entropy H(X, Y)?
b) What are the marginal entropies H(X) and H(Y)?
c) For each value of y, what is the conditional entropy H(X | Y = y)?
d) What is the conditional entropy H(X | Y)?
e) What is the conditional entropy H(Y | X)?
f) What is the mutual information I(X; Y)?
