
Imperial College London

BSc/MSci EXAMINATION May 2016

This paper is also taken for the relevant Examination for the Associateship

INFORMATION THEORY
For 4th-Year Physics Students
Friday, 13th May 2016: 14:00 to 15:00

The paper consists of two sections: A and B.

Section A contains one question [20 marks total].


Section B contains two questions [30 marks each].

Candidates are required to:


Answer ALL parts of Section A, and ONE QUESTION from Section B.

Marks shown on this paper are indicative of those the Examiners anticipate assigning.

General Instructions

Complete the front cover of each of the TWO answer books provided.

If an electronic calculator is used, write its serial number at the top of the front cover of
each answer book.

USE ONE ANSWER BOOK FOR EACH QUESTION.

Enter the number of each question attempted in the box on the front cover of its corresponding answer book.

Hand in TWO answer books even if they have not both been used.

You are reminded that Examiners attach great importance to legibility, accuracy and
clarity of expression.

© Imperial College London 2016



2016/PO4.4b 1 Go to the next page for questions
SECTION A: COMPULSORY QUESTIONS [20 marks total]

1. A binary source produces the following sequence of 12 symbols


OXXXOXOXXOOX

(i) Calculate the single, joint and conditional probabilities [5 marks]


(ii) Calculate the conditional entropy [3 marks]
[You may assume that the 13th symbol is X in order to have 12 pairs of symbols]
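For checking purposes, the counting in parts (i) and (ii) can be sketched in Python (a minimal sketch; the appended 13th symbol X follows the hint above, and overlapping pairs are used):

```python
from collections import Counter
from math import log2

seq = "OXXXOXOXXOOX" + "X"   # 13th symbol assumed to be X, giving 12 pairs

single = Counter(seq[:12])                    # single-symbol counts (12 symbols)
pairs = Counter(zip(seq, seq[1:]))            # 12 overlapping symbol pairs

p_single = {s: c / 12 for s, c in single.items()}
p_joint = {xy: c / 12 for xy, c in pairs.items()}
# conditional probability P(y|x) = P(x, y) / P(x)
p_cond = {(x, y): p_joint[(x, y)] / p_single[x] for (x, y) in p_joint}

# conditional entropy H(Y|X) = -sum over (x, y) of P(x, y) log2 P(y|x)
H_cond = -sum(p_joint[xy] * log2(p_cond[xy]) for xy in p_joint)
```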

A Morse message transmits dot and dash symbols with probabilities 1/4 and 3/4 respectively. Find the average information received every second if
(iii) A dot and a dash each take 1 s to transmit [3 marks]
(iv) A dot takes 1 s and a dash takes 2 s [3 marks]
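Parts (iii) and (iv) amount to dividing the source entropy by the average symbol duration; a minimal Python sketch:

```python
from math import log2

p = {"dot": 1 / 4, "dash": 3 / 4}
H = -sum(q * log2(q) for q in p.values())   # entropy per symbol, about 0.811 bit

# (iii) dot and dash both take 1 s, so the rate is H / (1 s)
rate_equal = H / 1.0
# (iv) dot takes 1 s, dash takes 2 s: divide by the mean symbol duration
t_avg = (1 / 4) * 1 + (3 / 4) * 2            # 1.75 s per symbol on average
rate_unequal = H / t_avg
```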

Using the system consisting of a transmitter T and a receiver R (see fig. 1), show that the information of an event with probability p0 transmitted over a noisy channel is log2(p1/p0), where p1 is the probability after reception through the noisy channel.
[6 marks]
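One possible line of reasoning (a sketch only; the full argument depends on the system of fig. 1) uses the surprisal I(p) = -log2 p: the receiver's uncertainty about the event falls from -log2 p0 before reception to -log2 p1 after it, so the information conveyed is

\[
I = \log_2\frac{1}{p_0} - \log_2\frac{1}{p_1} = \log_2\frac{p_1}{p_0}.
\]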
[Total 20 marks]

Figure 1



SECTION B: CHOOSE ONE QUESTION [30 marks total]

2. A communication system consists of eight symbols S1...S8, which have probabilities p1...p8 respectively (see table 1).

(i) Find an appropriate Huffman code for this source [4 marks]


(ii) Find the source entropy H [4 marks]
(iii) Find the code length L and the efficiency for both the fixed code and the Huffman code
[6 marks]

Symbol  Probability  Fixed code  Huffman code

S1      0.08         0000
S2      0.1          1000
S3      0.07         0100
S4      0.02         0010
S5      0.2          0001
S6      0.4          1001
S7      0.03         1111
S8      0.1          1110

Table 1
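For parts (i)-(iii), the standard Huffman construction and the entropy and efficiency calculations can be sketched in Python (a sketch only; tie-breaking in the merges means the individual codewords may differ from a hand construction, though the average length does not):

```python
import heapq
from math import log2

probs = {"S1": 0.08, "S2": 0.1, "S3": 0.07, "S4": 0.02,
         "S5": 0.2, "S6": 0.4, "S7": 0.03, "S8": 0.1}

# Build a Huffman code by repeatedly merging the two least probable nodes;
# the integer counter breaks ties so the heap never compares dicts.
heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
heapq.heapify(heap)
tie = len(heap)
while len(heap) > 1:
    p0, _, c0 = heapq.heappop(heap)
    p1, _, c1 = heapq.heappop(heap)
    merged = {s: "0" + w for s, w in c0.items()}
    merged.update({s: "1" + w for s, w in c1.items()})
    heapq.heappush(heap, (p0 + p1, tie, merged))
    tie += 1
codes = heap[0][2]

H = -sum(p * log2(p) for p in probs.values())          # source entropy (bits/symbol)
L_huff = sum(probs[s] * len(codes[s]) for s in probs)  # average Huffman length
L_fixed = 4                                            # the table's fixed code uses 4 bits
eff_huff, eff_fixed = H / L_huff, H / L_fixed
```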

A greyscale image is composed of 200×200 pixels. Each pixel has three possible shades of grey. If the pixel values occur independently, calculate the information in the image:
(iv) Assuming the probabilities of the three shades are the same [3 marks]
(v) Assuming that the probabilities are 1/4, 1/3 and 5/12 [3 marks]
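A minimal Python sketch of the two calculations (information = number of pixels × entropy per pixel, using the independence stated above):

```python
from math import log2

n_pixels = 200 * 200   # independent pixels, three grey shades each

# (iv) equiprobable shades: entropy log2(3) bits per pixel
H_equal = log2(3)
info_equal = n_pixels * H_equal

# (v) shade probabilities 1/4, 1/3 and 5/12
shades = [1 / 4, 1 / 3, 5 / 12]
H_unequal = -sum(p * log2(p) for p in shades)
info_unequal = n_pixels * H_unequal
```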

A (7, 4) Hamming code is used for a binary symmetric channel with bit error
probability p.
(vi) Write down the probability of decoding to a correct codeword [4 marks]
(vii) Evaluate the probability of decoding to a wrong codeword if p = 0.2 [6 marks]
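Since a (7,4) Hamming code corrects every single-bit error pattern, decoding succeeds exactly when at most one of the 7 bits is flipped; a minimal Python sketch:

```python
p = 0.2   # bit error probability of the binary symmetric channel

# A (7,4) Hamming code corrects any single-bit error in a 7-bit block,
# so decoding is correct when 0 or 1 of the 7 transmitted bits flip.
p_correct = (1 - p) ** 7 + 7 * p * (1 - p) ** 6
p_wrong = 1 - p_correct
```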

[Total 30 marks]



3. An information source produces three symbols A, B and C with probabilities 1/4, 1/4 and 1/2, respectively. Errors occur in transmission as shown in table 2.

(i) Find the probabilities of A, B and C at the receiver [7 marks]


(ii) Deduce the information transfer [7 marks]

        ARX  BRX  CRX

ATX      0   1/3  2/3
BTX      0    1    0
CTX     1/3   0   2/3

Table 2
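Part (i) is a matrix-vector product of the source probabilities with the transition probabilities of table 2; a minimal Python sketch using exact fractions:

```python
from fractions import Fraction as F

p_tx = {"A": F(1, 4), "B": F(1, 4), "C": F(1, 2)}   # source probabilities
# rows: transmitted symbol; columns: received symbol (table 2)
trans = {
    "A": {"A": F(0), "B": F(1, 3), "C": F(2, 3)},
    "B": {"A": F(0), "B": F(1), "C": F(0)},
    "C": {"A": F(1, 3), "B": F(0), "C": F(2, 3)},
}

# receiver probabilities: P(y received) = sum over x of P(x sent) P(y | x)
p_rx = {y: sum(p_tx[x] * trans[x][y] for x in p_tx) for y in ("A", "B", "C")}
```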

Find the capacity of a binary symmetric channel with a binary error rate of 1/8
[4 marks]
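A quick numerical check of this sub-question, using the standard BSC capacity formula C = 1 - H2(p):

```python
from math import log2

def H2(p):
    """Binary entropy function, in bits."""
    return -p * log2(p) - (1 - p) * log2(1 - p)

p_err = 1 / 8
C = 1 - H2(p_err)   # BSC capacity: C = 1 - H2(p)
```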

Prove that the information transfer of a binary symmetric channel is a maximum, 1 bit, for p = p̄ = 1/2 (where p and p̄ are the probabilities of an error and of correct transmission) [3 marks]

A good code is able to detect errors.


(iii) Name three examples of coding techniques that allow error detection.
[3 marks]
(iv) Name two examples of coding techniques that allow error detection and correction. [2 marks]

In cryptography we would like perfect secrecy.


(v) How does the entropy of the plaintext change, when compared to the unencrypted case, if we only know the encrypted text and we don't know the key?
[2 marks]
(vi) How is the entropy of the plaintext related to the entropy of the key? [2 marks]

[Total 30 marks]

2016/PO4.4b 4 End of examination paper
