Suhas Diggavi
Homework Set #7
Due Thursday, 27 November 2008, before 12:00 noon, INR 031/032/038
    x   y   p(x, y)
    -------------
    0   0   0.45
    0   1   0.05
    1   0   0.05
    1   1   0.45
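As a quick numerical sanity check (my own sketch, not part of the assignment), the entropies of this joint pmf can be computed directly; note that both marginals are uniform and that H(X, Y) = 1 + H_b(0.1), since the pmf factors as (1/2)(1-p) and (1/2)p with p = 0.1.

```python
import math

# Joint pmf from the table above.
pmf = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}

def H(dist):
    """Entropy in bits of a distribution given as a dict of probabilities."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Marginals of X and Y (the pmf is symmetric in x and y).
px = {0: 0.45 + 0.05, 1: 0.05 + 0.45}
py = dict(px)

print(H(px))    # H(X) = 1.0
print(H(py))    # H(Y) = 1.0
print(H(pmf))   # H(X, Y) = 1 + H_b(0.1) ~ 1.469
```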
(a) The jointly typical set A_eps^(n)(X, Y) is defined as the set of sequences x^n in {0,1}^n, y^n in {0,1}^n that satisfy

    | -(1/n) log p(x^n)      - H(X)    | < eps
    | -(1/n) log p(y^n)      - H(Y)    | < eps
    | -(1/n) log p(x^n, y^n) - H(X, Y) | < eps.

The first two equations correspond to the conditions that x^n and y^n are in A_eps^(n)(X) and A_eps^(n)(Y), respectively. Consider the last condition, which can be rewritten as

    -(1/n) log p(x^n, y^n) in (H(X, Y) - eps, H(X, Y) + eps).
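The three conditions above can be checked mechanically. Here is a small sketch for the pmf of this problem (the function name `jointly_typical` is mine, not from the problem set):

```python
import math

# Joint pmf from the table, with uniform marginals.
PMF = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
PX = {0: 0.5, 1: 0.5}
PY = {0: 0.5, 1: 0.5}

def entropy(dist):
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

HX, HY, HXY = entropy(PX), entropy(PY), entropy(PMF)

def jointly_typical(xs, ys, eps):
    """Check the three conditions defining A_eps^(n)(X, Y)."""
    n = len(xs)
    lx = sum(math.log2(PX[x]) for x in xs)
    ly = sum(math.log2(PY[y]) for y in ys)
    lxy = sum(math.log2(PMF[(x, y)]) for x, y in zip(xs, ys))
    return (abs(-lx / n - HX) < eps and
            abs(-ly / n - HY) < eps and
            abs(-lxy / n - HXY) < eps)

xs = [0] * 10 + [1] * 10
# Equal sequences (k = 0 differences): marginally typical but NOT jointly
# typical, since -(1/n) log p(x^n, y^n) = -log2(0.45) ~ 1.15 < H(X,Y) - eps.
print(jointly_typical(xs, xs, 0.1))   # False

# Sequences differing in k = 2 of 20 places (k/n = p = 0.1): jointly typical.
ys = list(xs); ys[0] ^= 1; ys[10] ^= 1
print(jointly_typical(xs, ys, 0.1))   # True
```

This illustrates the point of the problem: joint typicality is governed by the fraction of places k/n in which the sequences differ.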
Let k be the number of places in which the sequence x^n differs from y^n (k is a function of the two sequences). Then we can write

    p(x^n, y^n) = prod_{i=1}^n p(x_i, y_i)
                = (0.45)^{n-k} (0.05)^k
                = (1/2)^n (1-p)^{n-k} p^k.

An alternative way of looking at this probability is to view the binary symmetric channel as an additive channel Y = X + Z (mod 2), where Z is a binary random variable that is equal to 1 with probability p and is independent of X. In that case,

    p(x^n, y^n) = p(x^n) p(y^n | x^n)
                = p(x^n) p(z^n | x^n)
                = p(x^n) p(z^n)
                = (1/2)^n (1-p)^{n-k} p^k.
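A quick numerical check of this factorization (my own sketch, with p = 0.1 as implied by 0.05 = (1/2) * 0.1):

```python
import math

p = 0.1   # crossover probability: 0.45 = 0.5*(1-p), 0.05 = 0.5*p
PMF = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}

def joint_prob(xs, ys):
    """Product form: p(x^n, y^n) = prod_i p(x_i, y_i)."""
    prob = 1.0
    for x, y in zip(xs, ys):
        prob *= PMF[(x, y)]
    return prob

xs = [0, 1, 1, 0, 1]
ys = [0, 1, 0, 0, 1]    # differs from xs in k = 1 place
n = len(xs)
k = sum(x != y for x, y in zip(xs, ys))

# Closed form: (1/2)^n (1-p)^(n-k) p^k
closed_form = (0.5 ** n) * ((1 - p) ** (n - k)) * (p ** k)
print(joint_prob(xs, ys))   # (1/2)^5 * 0.9^4 * 0.1 = 0.0020503125
```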
Show that the condition that (x^n, y^n) is jointly typical is equivalent to the condition that x^n is typical and z^n = y^n + x^n (mod 2) is typical.

(b) Now consider random coding for the additive channel. As in the proof of the channel coding theorem, assume that 2^{nR} codewords X^n(1), X^n(2), ..., X^n(2^{nR}) are chosen uniformly over the 2^n possible binary sequences of length n. One of these codewords is chosen and sent over the channel. The receiver looks at the received sequence and tries to find a codeword in the code that is jointly typical with it. This corresponds to finding a codeword X^n(i) such that Y^n + X^n(i) is in A_eps^(n)(Z). For a fixed codeword x^n(i), what is the probability that the received sequence Y^n is such that (x^n(i), Y^n) is jointly typical?

(c) Suppose now that a fixed codeword x^n(1) was sent and Y^n was received. Denote by E_j the event

    E_j = {(X^n(j), Y^n) in A_eps^(n)(X, Y)},   j in {2, 3, ..., 2^{nR}},

i.e., the event that the codeword X^n(j), other than x^n(1), is jointly typical with Y^n. Use the union bound to find an upper bound on Pr{E_2 or E_3 or ... or E_{2^{nR}} | x^n(1) was sent}.

(d) Given that a particular codeword was sent, the probability of error (averaged over the probability distribution of the channel and over the random choice of codewords) can be written as

    Pr(error | x^n(1) sent) = sum_{y^n : y^n causes error} p(y^n | x^n(1)).

There are two kinds of error: the first occurs if the received sequence y^n is not jointly typical with the transmitted codeword, and the second occurs if there is another codeword jointly typical with the received sequence. Using the results of the preceding parts, calculate this probability of error. By the symmetry of the random coding argument, this does not depend on which codeword was sent.
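The two probabilities at play in parts (b) and (c) can be seen in a Monte Carlo sketch (the parameters p, n, eps, and the number of trials are my choices, not the problem's): the true noise sequence is typical with probability approaching 1 by the AEP, while for a wrong codeword X^n(j), the sequence Y^n + X^n(j) is uniform on {0,1}^n and is typical only with probability about 2^{-n(1 - H(p))}.

```python
import math
import random

random.seed(0)
p, n, eps, trials = 0.1, 200, 0.2, 5000
Hp = -p * math.log2(p) - (1 - p) * math.log2(1 - p)   # H(Z) ~ 0.469 bits

def typical(zs):
    """z^n in A_eps^(n)(Z): its empirical log-likelihood rate is near H(Z)."""
    k = sum(zs)
    rate = -(k * math.log2(p) + (n - k) * math.log2(1 - p)) / n
    return abs(rate - Hp) < eps

# (i) True noise Z ~ Bernoulli(p): typical with probability -> 1 (AEP).
hit_true = sum(typical([random.random() < p for _ in range(n)])
               for _ in range(trials)) / trials

# (ii) An independent (wrong) codeword makes Y + X(j) uniform on {0,1}^n;
#      it is typical w.p. ~ 2^{-n(1 - H(p))}, astronomically small here.
hit_unif = sum(typical([random.random() < 0.5 for _ in range(n)])
               for _ in range(trials)) / trials

print(hit_true)   # close to 1
print(hit_unif)   # 0.0 in any realistic number of trials
```

The union bound of part (c) then multiplies the per-codeword probability in (ii) by the 2^{nR} - 1 wrong codewords, which stays exponentially small as long as R < 1 - H(p).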
[Figure 2: Two binary symmetric channels; panel (b) is the good channel, with crossover probability P_G (transition labels 1 - P_G and P_G).]

The state of the channel is a random process distributed according to the first-order Markov process with the transition probabilities shown in Fig. 3.

[Figure 3: Transitions of the channel state between the good state G and the bad state B, with transition probabilities g, 1 - g, b, and 1 - b.]
(a) Find the capacity of each of the channels shown in Fig. 2. What is the optimal input distribution for each of them?

(b) Compute the stationary distribution of the state of the channel.

(c) Assume that a genie tells the state of the channel at each time instant to both the encoder and the decoder. Compute the capacity of the channel, C_SI (capacity with side information).

(d) Let the state of the channel be unknown to both encoder and decoder. What can the decoder say about the channel state?

(e) Compare the capacity of the channel when the states are unknown to the encoder/decoder, C_NSI (capacity without side information), to the capacity obtained in part (c).

Hint: Verify the following inequalities and equalities, and argue whether the inequalities are strict or tight:

    C_NSI = max_{p(x)} (1/n) I(X^n; Y^n, S^n)
          = max_{p(x)} (1/n) I(X^n; Y^n | S^n)
          = max_{p(x)} (1/n) ( H(Y^n | S^n) - H(Y^n | X^n, S^n) )
         <= max_{p(x)} (1/n) ( sum_{i=1}^n H(Y_i | S^n) - H(Y^n | X^n, S^n) )
         <= max_{p(x)} (1/n) ( sum_{i=1}^n H(Y_i | S_i) - H(Y^n | X^n, S^n) )
          = max_{p(x)} (1/n) sum_{i=1}^n I(X_i; Y_i | S_i).
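For concreteness, here is a hedged numerical sketch of parts (a)-(c). The values of g, b, P_G, P_B below are placeholders of my own (the problem's figures are not reproduced here), and I assume the convention P(G -> B) = g, P(B -> G) = b from Fig. 3.

```python
import math

g, b = 0.1, 0.3       # assumed state-transition probabilities
PG, PB = 0.01, 0.2    # assumed crossover probabilities of the good/bad BSC

def h(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# (a) Capacity of a BSC with crossover p is 1 - h(p); uniform input is optimal.
C_good, C_bad = 1 - h(PG), 1 - h(PB)

# (b) Stationary distribution of the two-state chain (solve pi P = pi):
pi_G = b / (g + b)
pi_B = g / (g + b)
assert math.isclose(pi_G * g, pi_B * b)   # balance across the G-B cut

# (c) With the state known at both ends, one can code separately per state,
#     giving the stationary average of the per-state capacities.
C_SI = pi_G * C_good + pi_B * C_bad

print(C_good, C_bad, C_SI)
```

With these placeholder numbers the chain spends 75% of the time in the good state, and C_SI lands between C_bad and C_good, weighted accordingly.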
[Figure: (a) The channel of Problem 4, with inputs and outputs 0, 1, 2, 3, 4. (b) The channel of Problem 4 is called the pentagon channel.]
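If the pentagon channel here is the noisy-typewriter-style channel on {0, ..., 4}, where input i is received as i or i+1 (mod 5) with probability 1/2 each (an assumption on my part, since the figure is not reproduced), its capacity can be checked with a Blahut-Arimoto iteration; the closed form for that channel is log2(5) - 1 ~ 1.32 bits, achieved by the uniform input.

```python
import math

# Assumed pentagon channel: W[x][y] = P(Y = y | X = x).
n = 5
W = [[0.0] * n for _ in range(n)]
for x in range(n):
    W[x][x] = 0.5
    W[x][(x + 1) % n] = 0.5

def blahut_arimoto(W, iters=200):
    """Blahut-Arimoto iteration for the capacity of a DMC (bits)."""
    nx, ny = len(W), len(W[0])
    r = [1.0 / nx] * nx                       # input distribution
    for _ in range(iters):
        qy = [sum(r[x] * W[x][y] for x in range(nx)) for y in range(ny)]
        # r'(x) proportional to exp( sum_y W(y|x) log q(x|y) )
        new = [math.exp(sum(W[x][y] * math.log(r[x] * W[x][y] / qy[y])
                            for y in range(ny) if W[x][y] > 0))
               for x in range(nx)]
        total = sum(new)
        r = [v / total for v in new]
    qy = [sum(r[x] * W[x][y] for x in range(nx)) for y in range(ny)]
    cap = sum(r[x] * W[x][y] * math.log2(W[x][y] / qy[y])
              for x in range(nx) for y in range(ny) if W[x][y] > 0)
    return cap, r

C, r_opt = blahut_arimoto(W)
print(C)   # ~ log2(5) - 1 ~ 1.3219
```

Note that this 1.32 bits is the ordinary Shannon capacity; the pentagon is also famous for its zero-error capacity, which is a different quantity.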