
Information Theory and Coding
Prof. Suhas Diggavi
EPFL Winter Semester 2008/2009
Handout #18, Tuesday, 18 November 2008

Homework Set #7
Due Thursday, 27 November 2008, before 12:00 noon, INR 031/032/038

Problem 1 (JOINTLY TYPICAL SEQUENCES)


We consider a binary symmetric channel (BSC) with crossover probability p = 0.1.

Figure 1: Binary symmetric channel with crossover probability 0.1


The input distribution that achieves capacity is the uniform distribution (i.e., p(x) = (1/2, 1/2)), which yields the joint distribution p(x, y) shown below:

(x, y)    p(x, y)
(0, 0)    0.45
(0, 1)    0.05
(1, 0)    0.05
(1, 1)    0.45

The marginal distribution of Y is also (1/2, 1/2).
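As a quick numeric sanity check on this table (my addition, not part of the handout), the entropies and the mutual information, which equals the BSC capacity 1 − H_b(0.1), can be computed in a few lines of Python:

```python
import math

# Joint distribution p(x, y) from the table above (BSC, p = 0.1, uniform input).
p_xy = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}

def H(probs):
    """Entropy in bits of an iterable of probabilities."""
    return -sum(q * math.log2(q) for q in probs if q > 0)

# Marginals of X and Y: both come out uniform, as stated above.
p_x = [sum(v for (x, _), v in p_xy.items() if x == a) for a in (0, 1)]
p_y = [sum(v for (_, y), v in p_xy.items() if y == b) for b in (0, 1)]

print(H(p_x), H(p_y))                      # 1.0, 1.0
print(H(p_xy.values()))                    # H(X,Y) ~ 1.469 bits
print(H(p_x) + H(p_y) - H(p_xy.values()))  # I(X;Y) = 1 - H_b(0.1) ~ 0.531 bits
```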

(a) The jointly typical set A_ε^{(n)}(X, Y) is defined as the set of sequences x^n ∈ {0,1}^n, y^n ∈ {0,1}^n that satisfy

$$\left| -\frac{1}{n} \log p(x^n) - H(X) \right| < \epsilon \qquad (1)$$
$$\left| -\frac{1}{n} \log p(y^n) - H(Y) \right| < \epsilon \qquad (2)$$
$$\left| -\frac{1}{n} \log p(x^n, y^n) - H(X, Y) \right| < \epsilon. \qquad (3)$$

The first two equations correspond to the conditions that x^n and y^n are in A_ε^{(n)}(X) and A_ε^{(n)}(Y), respectively. Consider the last condition, which can be rewritten as

$$-\frac{1}{n} \log p(x^n, y^n) \in \left( H(X, Y) - \epsilon,\; H(X, Y) + \epsilon \right).$$

Let k be the number of places in which the sequence x^n differs from y^n (k is a function of the two sequences). Then we can write

$$p(x^n, y^n) = \prod_{i=1}^{n} p(x_i, y_i) = (0.45)^{n-k}(0.05)^{k} = \left(\frac{1}{2}\right)^{n} (1-p)^{n-k}\, p^{k}.$$

An alternative way of looking at this probability is to view the binary symmetric channel as an additive channel Y = X ⊕ Z, where Z is a binary random variable that is equal to 1 with probability p and is independent of X. In that case,

$$p(x^n, y^n) = p(x^n)\, p(y^n | x^n) = p(x^n)\, p(z^n | x^n) = p(x^n)\, p(z^n) = \left(\frac{1}{2}\right)^{n} (1-p)^{n-k}\, p^{k}.$$
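A short sketch (my addition; the random sequences and n = 20 are arbitrary choices) confirming that the product of the pairwise probabilities matches this closed form:

```python
import math
import random

p = 0.1
p_xy = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}

n = 20
xn = [random.randint(0, 1) for _ in range(n)]
yn = [random.randint(0, 1) for _ in range(n)]
k = sum(x != y for x, y in zip(xn, yn))  # places where x^n and y^n differ

# Product of pairwise probabilities vs. the closed form above.
prod = math.prod(p_xy[(x, y)] for x, y in zip(xn, yn))
closed_form = (0.5 ** n) * ((1 - p) ** (n - k)) * (p ** k)
assert math.isclose(prod, closed_form, rel_tol=1e-9)
```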

Show that the condition that (x^n, y^n) are jointly typical is equivalent to the condition that x^n is typical and z^n = y^n ⊕ x^n is typical.

(b) Now consider random coding for the additive channel. As in the proof of the channel coding theorem, assume that 2^{nR} codewords X^n(1), X^n(2), ..., X^n(2^{nR}) are chosen uniformly over the 2^n possible binary sequences of length n. One of these codewords is chosen and sent over the channel. The receiver looks at the received sequence and tries to find a codeword in the code that is jointly typical with the received sequence. This corresponds to finding a codeword X^n(i) such that Y^n ⊕ X^n(i) ∈ A_ε^{(n)}(Z). For a fixed codeword x^n(i), what is the probability that the received sequence Y^n is such that (x^n(i), Y^n) is jointly typical?

(c) Suppose now that a fixed codeword x^n(1) was sent and Y^n was received. Denote by

$$E_j = \{(X^n(j), Y^n) \in A_\epsilon^{(n)}(X, Y)\}, \qquad j \in \{2, 3, \ldots, 2^{nR}\},$$

the event that codeword j (a codeword other than the one sent) is jointly typical with Y^n. Use the union bound to find an upper bound on Pr{E_2 ∪ E_3 ∪ ... ∪ E_{2^{nR}} | x^n(1) was sent}.

(d) Given that a particular codeword was sent, the probability of error (averaged over the probability distribution of the channel and over the random choice of codewords) can be rewritten as

$$\Pr(\text{error} \mid x^n(1) \text{ sent}) = \sum_{y^n :\, y^n \text{ causes error}} p(y^n | x^n(1)).$$

There are two kinds of error: the first occurs if the received sequence y^n is not jointly typical with the transmitted codeword, and the second occurs if there is another codeword jointly typical with the received sequence. Using the results of the preceding parts, calculate this probability of error. By the symmetry of the random coding argument, this does not depend on which codeword was sent.
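For intuition on parts (a) and (b): since joint typicality reduces to typicality of x^n and of the noise z^n, the probability in part (b) is essentially Pr{Z^n ∈ A_ε^{(n)}(Z)}. Below is a rough Monte Carlo sketch (my own illustration; the parameter values are arbitrary assumptions):

```python
import math
import random

# Illustrative parameters (not from the handout): eps and n chosen arbitrarily.
p, eps, n, trials = 0.1, 0.05, 1000, 2000
H_Z = -p * math.log2(p) - (1 - p) * math.log2(1 - p)  # H(Z) ~ 0.469 bits

def is_typical(zn):
    """Check | -(1/n) log2 p(z^n) - H(Z) | < eps for an i.i.d. Bernoulli(p) z^n."""
    k = sum(zn)  # number of ones in z^n
    log_p = k * math.log2(p) + (len(zn) - k) * math.log2(1 - p)
    return abs(-log_p / len(zn) - H_Z) < eps

# Fraction of noise sequences z^n that land in the typical set;
# by the AEP this fraction tends to 1 as n grows.
hits = sum(is_typical([random.random() < p for _ in range(n)]) for _ in range(trials))
print(hits / trials)
```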

Problem 2 (THE GILBERT-ELLIOTT CHANNEL)


In this problem we study a time-varying channel. The channel has two states, B (bad) and G (good), a binary input alphabet X = {0, 1}, and a quaternary output alphabet Y = {0, 1, 2, 3}. At each time instant the channel is in either the bad or the good state, S_i ∈ {B, G}, and acts like a binary symmetric channel as shown in Fig. 2.

Figure 2: Two binary symmetric channels: (a) the bad channel, with crossover probability P_B; (b) the good channel, with crossover probability P_G.

The state of the channel is a random process distributed according to a first-order Markov process with the transition probabilities shown in Fig. 3.

Figure 3: Transition diagram of the channel states B and G, with transition probabilities g and b.

(a) Find the capacity of each of the channels shown in Fig. 2. What is the optimal input distribution for each of them?

(b) Compute the stationary distribution of the state of the channel.

(c) Assume that there is a genie who tells the state of the channel at each time instant to both the encoder and the decoder. Compute the capacity of the channel with side information, C_SI. (A numeric sketch for parts (b) and (c) follows this problem.)

(d) Let the state of the channel be unknown to both encoder and decoder. What can the decoder say about the channel state?

(e) Compare the capacity of the channel when the states are unknown to the encoder/decoder, C_NSI (capacity without side information), to the capacity obtained in part (c). Hint: Verify the following inequalities and equalities, and argue whether the inequalities are strict or tight:

$$\begin{aligned}
C_{NSI} &= \max_{p(x^n)} \frac{1}{n} I(X^n; Y^n, S^n) \\
&= \max_{p(x^n)} \frac{1}{n} I(X^n; Y^n | S^n) \\
&= \max_{p(x^n)} \frac{1}{n} \left( H(Y^n | S^n) - H(Y^n | X^n, S^n) \right) \\
&\le \max_{p(x^n)} \frac{1}{n} \left( \sum_{i=1}^{n} H(Y_i | S^n) - \sum_{i=1}^{n} H(Y_i | X_i, S_i) \right) \\
&\le \max_{p(x^n)} \frac{1}{n} \left( \sum_{i=1}^{n} H(Y_i | S_i) - \sum_{i=1}^{n} H(Y_i | X_i, S_i) \right) \\
&\le \frac{1}{n} \sum_{i=1}^{n} \max_{p(x_i)} I(X_i; Y_i | S_i).
\end{aligned}$$
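As a numeric companion to parts (b) and (c), here is a minimal Python sketch. All parameter values, and the direction convention for g and b, are my assumptions, since the handout keeps them symbolic:

```python
import math

def h2(q):
    """Binary entropy in bits."""
    return 0.0 if q in (0.0, 1.0) else -q * math.log2(q) - (1 - q) * math.log2(1 - q)

# Placeholder parameters; the handout keeps all of these symbolic.
P_B, P_G = 0.4, 0.01  # crossover probabilities of the bad and good BSCs
g, b = 0.05, 0.1      # assumed convention: g = Pr(B -> G), b = Pr(G -> B)

# Part (b): stationary distribution of the two-state chain, from the
# balance equation pi_B * g = pi_G * b together with pi_B + pi_G = 1.
pi_B = b / (b + g)
pi_G = g / (b + g)

# Part (c): with the state revealed to both ends, a natural answer to check
# is the stationary average of the per-state BSC capacities.
C_B, C_G = 1 - h2(P_B), 1 - h2(P_G)
C_SI = pi_B * C_B + pi_G * C_G
print(pi_B, pi_G, C_SI)
```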

Problem 3 (PARALLEL CHANNELS AND CHANNELS WITH TWO INDEPENDENT LOOKS AT Y)


(a) Consider two discrete memoryless channels (X_1, p(y_1|x_1), Y_1) and (X_2, p(y_2|x_2), Y_2) with capacities C_1 and C_2, respectively. A new channel (X_1 × X_2, p(y_1|x_1) p(y_2|x_2), Y_1 × Y_2) is formed in which x_1 ∈ X_1 and x_2 ∈ X_2 are sent simultaneously, resulting in (y_1, y_2). Find the capacity of this channel.

(b) Let Y_1 and Y_2 be conditionally independent and conditionally identically distributed given X.

1. Show that I(X; Y_1, Y_2) = 2 I(X; Y_1) − I(Y_1; Y_2). (A numeric spot-check of this identity follows this problem.)

2. Show that the capacity of the channel (X, p(y_1, y_2|x), Y_1 × Y_2) is less than twice the capacity of the channel (X, p(y_1|x), Y_1).
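The identity in part (b.1) is easy to spot-check numerically. The sketch below (my addition) takes X ~ Bernoulli(1/2) and lets Y_1, Y_2 be two independent looks at X through a BSC with crossover q; all of these values are illustrative assumptions:

```python
import itertools
import math

# Illustrative setup: X ~ Bernoulli(1/2), and Y1, Y2 are two independent
# looks at X through a BSC with crossover q (values are assumptions).
q = 0.1
p_x = {0: 0.5, 1: 0.5}

def p_y_given_x(y, x):
    return 1 - q if y == x else q

# Joint distribution p(x, y1, y2) with Y1, Y2 conditionally i.i.d. given X.
p = {(x, y1, y2): p_x[x] * p_y_given_x(y1, x) * p_y_given_x(y2, x)
     for x, y1, y2 in itertools.product((0, 1), repeat=3)}

def H(dist):
    """Entropy in bits of a distribution given as a dict of probabilities."""
    return -sum(v * math.log2(v) for v in dist.values() if v > 0)

def marginal(keep):
    """Marginalize p onto the coordinates in `keep` (0 = X, 1 = Y1, 2 = Y2)."""
    out = {}
    for triple, v in p.items():
        key = tuple(triple[i] for i in keep)
        out[key] = out.get(key, 0.0) + v
    return out

I_X_Y12 = H(marginal((0,))) + H(marginal((1, 2))) - H(p)
I_X_Y1 = H(marginal((0,))) + H(marginal((1,))) - H(marginal((0, 1)))
I_Y1_Y2 = H(marginal((1,))) + H(marginal((2,))) - H(marginal((1, 2)))

print(I_X_Y12, 2 * I_X_Y1 - I_Y1_Y2)  # the two numbers agree
```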

Problem 4 (ZERO-ERROR CAPACITY)


A channel with alphabet {0, 1, 2, 3, 4} has transition probabilities of the form

$$p(y|x) = \begin{cases} 1/2 & \text{if } y = x \pm 1 \bmod 5 \\ 0 & \text{otherwise.} \end{cases} \qquad (4)$$

This channel is shown in Fig. 4.

(a) Compute the capacity of this channel in bits.

(b) The zero-error capacity of a channel is the number of bits per channel use that can be transmitted with zero probability of error. Clearly, the zero-error capacity of this channel is at least 1 bit (consider the codebook C = {0, 1}, for example). Find a block code that shows that the zero-error capacity is greater than 1 bit. Can you estimate the exact value of the zero-error capacity? Hint: Consider codes of length 2 for this channel.

Figure 4: The channel of Problem 4: (a) the channel itself; (b) the same channel drawn as a pentagon, which is why it is called the pentagon channel.
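For part (b), a candidate block code can be tested mechanically: two codewords are confusable exactly when their possible-output sets overlap in every coordinate. The checker below is my own sketch; it only demonstrates the trivial length-1 code {0, 1} and leaves the length-2 search as the hint intends:

```python
import itertools

def outputs(x):
    """Possible outputs of the pentagon channel for input x: x +/- 1 (mod 5)."""
    return {(x - 1) % 5, (x + 1) % 5}

def confusable(cw1, cw2):
    """Two codewords can yield the same output sequence iff their possible
    output sets intersect in every coordinate."""
    return all(outputs(a) & outputs(b) for a, b in zip(cw1, cw2))

def is_zero_error(code):
    """A code is zero-error iff no two distinct codewords are confusable."""
    return not any(confusable(c, d) for c, d in itertools.combinations(code, 2))

print(is_zero_error([(0,), (1,)]))  # True: inputs 0 and 1 share no outputs
print(is_zero_error([(0,), (2,)]))  # False: both can produce the output 1
# To attack part (b), search itertools.product(range(5), repeat=2) for the
# largest zero-error code of block length 2.
```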
