
DIGITAL COMMUNICATIONS

Question Bank 5

1. a) The probabilities of 4 messages are 1/2, 1/4, 1/8 and 1/16. Calculate the entropy. If the source
produces 100 symbols/sec, determine the maximum information rate.
b) Verify that I(X, Y) = H(X) − H(X/Y) = H(Y) − H(Y/X) in the case of discrete information
theory.
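The entropy and information-rate calculation in question 1(a) can be checked with a short sketch. Note that, as printed, the four probabilities sum to 15/16; the sketch simply applies the entropy formula to them as given.

```python
from math import log2

# Probabilities from question 1(a); as printed they sum to 15/16,
# but the entropy formula is applied to them as given.
probs = [1/2, 1/4, 1/8, 1/16]

# Entropy H = sum of p * log2(1/p), in bits per symbol
H = sum(p * log2(1/p) for p in probs)

# Maximum information rate R = r * H for a symbol rate r = 100 symbols/sec
R = 100 * H

print(H)  # 1.625 bits/symbol
print(R)  # 162.5 bits/sec
```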
2. a) Find the entropy of a continuous source where the output message signal of the source is
Gaussian.
b) State and prove Shannon-Hartley Law.
3. a) Using the Shannon-Fano algorithm, find the code words for six messages occurring with
probabilities 1/3, 1/3, 1/6, 1/12, 1/24 and 1/24.
b) Explain the terms: i) Entropy ii) Information rate and iii) Channel Capacity.
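The Shannon-Fano construction of question 3(a) can be sketched recursively: sort the symbols by descending probability, split the list where the two halves are closest to equal probability, and prepend 0/1 to each half. The message labels m1–m6 are assumed names; the exact split at a tie can vary between texts, but the average length for these probabilities is 2.25 bits either way.

```python
def shannon_fano(symbols):
    """Recursive Shannon-Fano coding.
    symbols: list of (name, probability) pairs, sorted by descending probability."""
    if len(symbols) == 1:
        return {symbols[0][0]: ""}
    total = sum(p for _, p in symbols)
    # choose the split point that best balances the two halves
    acc, split, best = 0.0, 1, float("inf")
    for i in range(1, len(symbols)):
        acc += symbols[i - 1][1]
        diff = abs(2 * acc - total)
        if diff < best:
            best, split = diff, i
    codes = {}
    for sym, c in shannon_fano(symbols[:split]).items():
        codes[sym] = "0" + c
    for sym, c in shannon_fano(symbols[split:]).items():
        codes[sym] = "1" + c
    return codes

# The six messages of question 3(a); names m1..m6 are assumed labels
msgs = [("m1", 1/3), ("m2", 1/3), ("m3", 1/6),
        ("m4", 1/12), ("m5", 1/24), ("m6", 1/24)]
codes = shannon_fano(msgs)
avg_len = sum(p * len(codes[s]) for s, p in msgs)
print(codes)
print(avg_len)  # 2.25 bits/message
```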
4. a) Calculate the capacity of a standard 4 KHz telephone channel working in the range of 300 to
3400 Hz with an SNR of 32 dB.
b) Write a short note on the trade-off between bandwidth and SNR.
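The capacity in question 4(a) follows directly from the Shannon-Hartley law, using the usable bandwidth 3400 − 300 = 3100 Hz; a minimal sketch:

```python
from math import log2

B = 3400 - 300             # usable bandwidth of the telephone channel, Hz
snr_db = 32
snr = 10 ** (snr_db / 10)  # 32 dB -> linear ratio of about 1585

C = B * log2(1 + snr)      # Shannon-Hartley: C = B * log2(1 + S/N)
print(C)                   # ~32.96 kb/s
```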
5. a) Verify the following expression: H[X, Y] = H[X] + H[Y/X] = H[Y] + H[X/Y]
b) Let θ1, θ2, θ3 and θ4 have probabilities 1/2, 1/4, 1/8 and 1/8.
i) Calculate H.
ii) Find R if r = 1 message per second.
iii) What is the rate at which binary digits are transmitted if the signal is sent after
encoding θ1, …, θ4 as 00, 01, 10 and 11?
6. A transmitter has an alphabet consisting of four letters [ F, G, H and I ]. The joint probabilities for
the communication are given below:

F G H I
A 0.25 0.00 0.00 0.00
B 0.10 0.30 0.00 0.00
C 0.00 0.05 0.10 0.00
D 0.00 0.00 0.05 0.10
E 0.00 0.00 0.05 0.00
Determine the Marginal, Conditional and Joint entropies for this channel.
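The marginal, conditional and joint entropies asked for in question 6 can be computed directly from the joint probability table above; a sketch using the chain rule H(Y/X) = H(X, Y) − H(X):

```python
from math import log2

# Joint probability table from question 6 (rows A..E, columns F..I)
P = [
    [0.25, 0.00, 0.00, 0.00],
    [0.10, 0.30, 0.00, 0.00],
    [0.00, 0.05, 0.10, 0.00],
    [0.00, 0.00, 0.05, 0.10],
    [0.00, 0.00, 0.05, 0.00],
]

def H(probs):
    """Entropy in bits, skipping zero-probability terms."""
    return sum(p * log2(1/p) for p in probs if p > 0)

px = [sum(row) for row in P]                                   # marginal of the row variable
py = [sum(P[i][j] for i in range(len(P))) for j in range(4)]   # marginal of the column variable

H_xy = H([p for row in P for p in row])                        # joint entropy H(X, Y)
H_x, H_y = H(px), H(py)
H_y_given_x = H_xy - H_x                                       # chain rule: H(Y/X) = H(X,Y) - H(X)
H_x_given_y = H_xy - H_y

print(H_x, H_y, H_xy, H_y_given_x, H_x_given_y)
```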
7. Construct the Optimum Source Code for the following symbols.

X:    X1    X2    X3    X4    X5    X6
P(x): 0.3   0.25  0.2   0.1   0.1   0.05
Find the coding efficiency.
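The optimum (Huffman) code of question 7 can be built with a min-heap: repeatedly merge the two least probable subtrees and prepend a bit to every symbol inside each. A sketch, with efficiency measured as H/L:

```python
import heapq
from math import log2

# Symbol probabilities from question 7
probs = {"X1": 0.3, "X2": 0.25, "X3": 0.2, "X4": 0.1, "X5": 0.1, "X6": 0.05}

# Build the Huffman tree; the counter breaks probability ties deterministically
heap = [(p, i, s) for i, (s, p) in enumerate(probs.items())]
heapq.heapify(heap)
counter = len(heap)
codes = {s: "" for s in probs}
while len(heap) > 1:
    p1, _, left = heapq.heappop(heap)
    p2, _, right = heapq.heappop(heap)
    left = [left] if isinstance(left, str) else left
    right = [right] if isinstance(right, str) else right
    # prepend a bit to every symbol inside each merged subtree
    for s in left:
        codes[s] = "0" + codes[s]
    for s in right:
        codes[s] = "1" + codes[s]
    heapq.heappush(heap, (p1 + p2, counter, left + right))
    counter += 1

H = sum(p * log2(1/p) for p in probs.values())   # source entropy
L = sum(probs[s] * len(codes[s]) for s in probs) # average code length
efficiency = H / L
print(codes)
print(H, L, efficiency)  # L = 2.4 bits/symbol, efficiency ~98.6%
```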


8. a) An independent, discrete source transmits letters selected from an alphabet consisting of 3
letters A, B and C with respective probabilities 0.7, 0.2 and 0.1. If consecutive letters are
statistically independent and two symbol words are transmitted, find the entropy of the system
of such words.
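For question 8(a), independence means P(ab) = P(a)P(b), so the entropy of two-letter words is exactly twice the per-letter entropy; a quick numerical check:

```python
from math import log2

p = {"A": 0.7, "B": 0.2, "C": 0.1}
H1 = sum(q * log2(1/q) for q in p.values())  # entropy per letter

# For statistically independent letters, P(ab) = P(a) * P(b),
# so the entropy of two-letter words is exactly 2 * H1.
H2 = sum(pa * pb * log2(1 / (pa * pb)) for pa in p.values() for pb in p.values())
print(H1, H2)  # H2 == 2 * H1, about 2.31 bits/word
```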

A.S.Rao
Balaji Institute of Technology & Science
b) The noise characteristic of a channel is given by the transition matrix

            | 0.6  0.2  0.2 |
   P(Y/X) = | 0.2  0.6  0.2 |
            | 0.2  0.2  0.6 |

and P(x1) = 1/8, P(x2) = 1/8, P(x3) = 6/8.
Find the mutual information.
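The mutual information of question 8(b) can be checked numerically via I(X; Y) = H(Y) − H(Y/X), forming the joint distribution from P(x) and the transition matrix:

```python
from math import log2

# Channel transition matrix P(y|x) and input probabilities from question 8(b)
PyGx = [[0.6, 0.2, 0.2],
        [0.2, 0.6, 0.2],
        [0.2, 0.2, 0.6]]
px = [1/8, 1/8, 6/8]

# Joint distribution P(x, y) = P(x) * P(y|x) and output marginal P(y)
Pxy = [[px[i] * PyGx[i][j] for j in range(3)] for i in range(3)]
py = [sum(Pxy[i][j] for i in range(3)) for j in range(3)]

def H(probs):
    return sum(p * log2(1/p) for p in probs if p > 0)

# I(X;Y) = H(Y) - H(Y/X), with H(Y/X) = H(X,Y) - H(X)
H_xy = H([p for row in Pxy for p in row])
I = H(py) - (H_xy - H(px))
print(I)  # about 0.129 bits
```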

9. a) A source is transmitting messages Q1, Q2 and Q3 with probabilities P1, P2 and P3. Find the
condition for the entropy of the source to be maximum.

b) Consider an AWGN channel with 4 KHz bandwidth and noise PSD η/2 = 10^-12 W/Hz.
The signal power required at the receiver is 0.1 mW. Find the capacity of the channel.
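Question 9(b) reduces to Shannon-Hartley once the noise power N = ηB is formed from the two-sided PSD; a sketch:

```python
from math import log2

B = 4000                 # bandwidth, Hz
eta_over_2 = 1e-12       # two-sided noise PSD, W/Hz
N = 2 * eta_over_2 * B   # noise power = eta * B = 8e-9 W
S = 0.1e-3               # signal power, 0.1 mW
C = B * log2(1 + S / N)  # Shannon-Hartley capacity
print(C)                 # ~54.4 kb/s
```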
10. a) For the communication system whose channel characteristic is:

[Channel transition diagram: inputs X ∈ {0, 1}, outputs Y ∈ {0, 1, 2}, with transition
probabilities 1/4, 1/2 and 1/4 as labelled.]

Given P(x=0) = P(x=1) = 1/2, verify that I[X, Y] = I[Y, X].

b) For a signal x(t) having a uniform density function in the range 0 ≤ x ≤ 4, find H(X). If
the same signal is amplified by a factor of 8, determine H(X).
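For question 10(b), the differential entropy of a uniform density over [a, b] is log2(b − a); amplifying by 8 stretches the range to [0, 32]:

```python
from math import log2

# Differential entropy of a uniform density over [a, b] is log2(b - a)
H1 = log2(4 - 0)   # x(t) uniform on [0, 4]  -> 2 bits
H2 = log2(32 - 0)  # amplified by 8, uniform on [0, 32] -> 5 bits
print(H1, H2)  # 2.0 5.0
```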
11. a) Calculate the average information content in English language, assuming that each of the 26
characters in the alphabet occurs with equal probability.
b) A discrete memoryless source X has four symbols X1, X2, X3 and X4 with probabilities 0.4, 0.3,
0.2 and 0.1 respectively. Find the amount of information contained in the messages X1 X2 X1 X3
and X4 X3 X3 X2.
12. Two binary channels are connected in cascade as shown.

a) Find the overall matrix of the resultant channel.


b) Find P(z1) and P(z2) when P(x1)=P(x2)=0.5
13. a) An analog signal having a 4 KHz bandwidth is sampled at 1.25 times the Nyquist rate, and each
sample is quantized into one of 256 equally likely levels. Assume that successive samples are
statistically independent.
i) What is the information rate of this source?
ii) Find the bandwidth required for error-free transmission of the output
of the source over a channel with an S/N ratio of 20 dB.
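Both parts of question 13(a) can be sketched numerically: the information rate is (samples/sec) × (bits/sample), and the required bandwidth comes from setting the channel capacity equal to that rate:

```python
from math import log2

fs = 1.25 * (2 * 4000)       # sampling rate: 1.25 x Nyquist = 10,000 samples/sec
bits_per_sample = log2(256)  # 8 bits for 256 equally likely levels
R = fs * bits_per_sample     # information rate = 80 kb/s

snr = 10 ** (20 / 10)        # 20 dB -> linear ratio 100
# Error-free transmission needs C >= R, i.e. B * log2(1 + SNR) >= R
B = R / log2(1 + snr)
print(R, B)  # 80000.0 and ~12015 Hz
```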
b) A discrete memoryless source has five equally likely symbols. Construct a Shannon-Fano
code for the above source.

c) Two dice are thrown and we are told that the sum of the faces is 7. Find the information content
of the above message.
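Question 13(c) follows from counting the ordered dice pairs that sum to 7 and taking the self-information of that event:

```python
from math import log2

# Count ordered dice pairs summing to 7: (1,6), (2,5), ..., (6,1)
favourable = sum(1 for a in range(1, 7) for b in range(1, 7) if a + b == 7)
p = favourable / 36  # 6/36 = 1/6
I = log2(1 / p)      # self-information of the message "the sum is 7"
print(favourable, I)  # 6 and ~2.585 bits
```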

