
This sheet is for 1-mark questions

Sr. No.  Question

1. With more uncertainty about the message, the information carried is ____________.

2. For M equally likely messages, the average amount of information H is ____________.

3. When X and Y are statistically independent, the mutual information I(X;Y) is ____________.

4. The Lempel-Ziv algorithm is a ____________.

5. When the base of the logarithm is e, the unit of measure of information is ____________.

6. The mutual information is symmetric; this means ____________.

7. I(X;Y) is related to the joint entropy H(X,Y) by ____________.

8. (Entropy) H = 0 if Pk = ____________.

9. A prefix code is uniquely decodable, and thereby satisfies ____________.

10. The output of the source encoder is an analog signal.

11. For a DMS of entropy H(S), the average codeword length of a prefix code is bounded as H(S) ≤ ____________.

12. The mutual information ____________.

13. Information rate is defined as ____________.


14. The technique that may be used to increase average information per bit is ____________.

15. Run Length Encoding is used for ____________.


16. A fair coin is tossed repeatedly until a ‘Head’ appears for the first time. Let L be the number of tosses needed to get this first ‘Head’. The entropy H(L) in bits is ____________.
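
A minimal Python check of this value, assuming a fair coin so that P(L = k) = (1/2)^k:

from math import log2

# entropy of the geometric number-of-tosses L; the tail beyond k = 200 is negligible
H = sum((0.5 ** k) * log2(1 / 0.5 ** k) for k in range(1, 200))
print(round(H, 4))   # ~2.0 bits
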
17. A source alphabet consists of N symbols with the probability of the first two symbols being the same. A source encoder increases the probability of the first symbol by a small amount e. After encoding, the entropy of the source ____________.

18. A memoryless source emits n symbols each with a probability p. The entropy of the source as a function of n ____________.

19. A source generates three symbols with probabilities 0.25, 0.25 and 0.50 at a rate of 3000 symbols per second. Assuming independent generation of symbols, the most efficient source encoder would have an average bit rate of ____________.
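
A quick sketch of the calculation (entropy of the symbol set times the symbol rate):

from math import log2

probs = [0.25, 0.25, 0.50]
H = sum(p * log2(1 / p) for p in probs)   # 1.5 bits/symbol
print(H * 3000)                           # 4500.0 bits/sec
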
20. A source produces 4 symbols with probabilities 1/2, 1/4, 1/8 and 1/8. For this source, a practical coding scheme has an average codeword length of 2 bits/symbol. The efficiency of the code is ____________.
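A short Python sketch of the efficiency calculation (source entropy divided by the given average codeword length):

from math import log2

probs = [1/2, 1/4, 1/8, 1/8]
H = sum(p * log2(1 / p) for p in probs)   # 1.75 bits/symbol
print(H / 2.0)                            # 0.875 = 7/8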

21. Consider a discrete memoryless source with alphabet ________ and respective probabilities ________ of occurrence. The entropy of the source (in bits) is ____________.

22. Calculate the entropy of a source with a symbol set containing 64 symbols, each with a probability pi = 1/64.
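A one-line check in Python (for equiprobable symbols the entropy is simply log2 of the alphabet size):

from math import log2

M = 64
print(sum((1 / M) * log2(M) for _ in range(M)))   # 6.0 bits/symbol, i.e. log2(64)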

23. A memoryless source emits n symbols each with a probability p. The entropy of the source as a function of n ____________.

24. An image uses 512 × 512 picture elements. Each of the picture elements can take any of 8 distinguishable intensity levels. The maximum entropy in the above image will be ____________.
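
A quick Python check (maximum entropy means every pixel uses all 8 levels equiprobably):

from math import log2

print(512 * 512 * log2(8))   # 786432.0 bits
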
25. Consider a binary memoryless channel characterized by the transition probability diagram shown in the figure. The channel is ____________.

26. The input X to the Binary Symmetric Channel (BSC) shown in the figure is ‘1’ with probability 0.8. The cross-over probability is 1/7. If the received bit Y = 0, the conditional probability that ‘1’ was transmitted is ____________.
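A minimal Bayes'-rule sketch in Python for this question:

p1, eps = 0.8, 1/7                       # P(X = 1) and the cross-over probability
p_y0 = p1 * eps + (1 - p1) * (1 - eps)   # total probability that Y = 0
print(p1 * eps / p_y0)                   # ~0.4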

27. Two independent random variables X and Y are uniformly distributed in the interval [−1, 1]. The probability that max[X, Y] is less than 1/2 is ____________.

28. Consider four identical 3-faced dice. When the dice are rolled, the faces of the dice appear with the probabilities given below. Which distribution has the maximum entropy?
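
A small Python sketch that compares the entropies of the candidate distributions; the four tuples are taken from the options listed later for this question, and the helper name H is illustrative:

from math import log2

def H(dist):
    return sum(p * log2(1 / p) for p in dist if p > 0)

for d in [(1/2, 1/4, 1/4), (1/3, 1/3, 1/3), (1/6, 2/3, 1/6), (1/4, 1/6, 7/12)]:
    print(d, round(H(d), 3))   # the uniform distribution attains the maximum, log2(3) ~ 1.585
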
29. A source produces three symbols A, B and C with probabilities ________ and ________. The source entropy is ____________.

30. An analog baseband signal, band-limited to 100 Hz, is sampled at the Nyquist rate. The samples are quantized into four message symbols that occur independently with probabilities p1 = p4 = 0.125 and p2 = p3. The information rate (bits/sec) of the message source is ____________.
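
A short Python sketch of the information-rate calculation (Nyquist rate of 200 samples/sec times the symbol entropy; p2 = p3 = 0.375 follows from the probabilities summing to 1):

from math import log2

fs = 2 * 100                              # Nyquist sampling rate, samples/sec
probs = [0.125, 0.375, 0.375, 0.125]
H = sum(p * log2(1 / p) for p in probs)   # ~1.811 bits/symbol
print(fs * H)                             # ~362.3 bits/sec
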
31. A discrete memoryless source has an alphabet ________ with corresponding probabilities 1/2, 1/4, 1/8 and 1/8. The minimum required average codeword length in bits to represent this source for error-free reconstruction is ____________.
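A one-line entropy check in Python, assuming the probabilities 1/2, 1/4, 1/8, 1/8 read from the question:

from math import log2

probs = [1/2, 1/4, 1/8, 1/8]
print(sum(p * log2(1 / p) for p in probs))   # 1.75 bits; the probabilities are dyadic, so a prefix code can meet this bound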

32. In a digital communication system, transmission of successive bits through a noisy channel is assumed to be independent events with error probability p. The probability of at most one error in the transmission of an 8-bit sequence is ____________.
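A minimal Python sketch of the binomial calculation (probability of zero errors plus exactly one error); the helper name and the example value p = 0.01 are illustrative:

from math import comb

def p_at_most_one_error(p, n=8):
    return (1 - p) ** n + comb(n, 1) * p * (1 - p) ** (n - 1)

print(p_at_most_one_error(0.01))   # ~0.9973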

33. Consider a Binary Symmetric Channel (BSC) with probability of error being p. To transmit a bit, say 1, we transmit a sequence of three 1s. The receiver interprets the received sequence as 1 if at least two of the bits are 1. The probability that the transmitted bit will be received in error is ____________.

34. During transmission over a certain binary communication channel, bit errors occur independently with probability p. The probability of at most one bit in error in a block of n bits is given by ____________.

35. Let U and V be two independent and identically distributed random variables such that P(U = +1) = P(U = −1) = 1/2. The entropy H(U + V) in bits is ____________.

36. An analog signal is band-limited to 4 kHz, sampled at the Nyquist rate, and the samples are quantized into 4 levels. The quantized levels are assumed to be independent and equally probable. If we transmit two quantized samples per second, the information rate is ____________.
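A quick check in Python (each of the 4 equiprobable levels carries log2(4) = 2 bits, and two samples are sent per second as stated in the question):

from math import log2

print(2 * log2(4))   # 4.0 bits/sec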

37. A binary communication system makes use of the symbols ‘0’ and ‘1’. There are channel errors. Consider the following events:
x0: a zero is transmitted
x1: a one is transmitted
y0: a zero is received
y1: a one is received
The following probabilities are given: P(x0) = 1/2, P(y0/x0) = 3/4 and P(y0/x1) = 1/2. The information in bits that you obtain when you learn which symbol has been received is ____________.
Options (a / b / c / d) for each question:

1. a) less   b) more   c) very less   d) both a and b
2. a) H = log10 M   b) H = log10 M^2   c) H = log2 M   d) H = 2 log10 M
3. b) 0   c) ln 2   d) can not be determined
4. a) Variable-to-Fixed length algorithm   b) Fixed-to-Variable length algorithm   c) Variable-to-Variable length algorithm   d) Fixed-to-Fixed length algorithm
5. a) Bits   b) Bytes   c) Nats   d) None of the mentioned
6. a) I(X;Y) = I(X;Y)   b) I(X;Y) = I(X;X)   c) I(Y;X) = I(Y;Y)   d) I(X;Y) = I(Y;X)
7. a) I(X;Y) = H(X) − H(X,Y)   b) I(X;Y) = H(X) + H(Y) − H(X,Y)   c) I(X;Y) = H(X) + H(X,Y)   d) I(X;Y) = H(X) + H(Y) − H(X,Y)
8. a) 0   b) 1   c) −1   d) both a and b
9. a) Kraft-McMillan Inequality   b) Shannon-Fano   c) Huffman   d) Extended Huffman
11. a) H(S) − 1   b) H(S) = 1   c) H(S) + 1   d) H(S) * 1
12. a) Is symmetric   b) Always non-negative   c) Both a) and b) are correct   d) None of the above
13. a) Information per unit time   b) Average number of bits of information per second   c) rH   d) All of the above
14. a) Shannon-Fano algorithm   b) Digital modulation techniques   c) ASK   d) FSK
15. a) Reducing the repeated string of characters   b) Correction of error in multiple bits   c) Bit error correction   d) All of the above
16. b) 2   c) 3   d) 4
17. a) increases   b) increases only if N = 2   c) decreases   d) remains the same
18. a) increases as n   b) increases   c) increases as n log n   d) decreases
19. a) 4500 bits/sec   b) 1500 bits/sec   c) 3000 bits/sec   d) 6000 bits/sec
20. a) 7/8   b) 1/2   c) 1/4   d) 1
21. a) 1.5   b) 2.5   c) 2   d) null
22. a) 4 bits/symbol   b) 5 bits/symbol   c) 6 bits/symbol   d) 7 bits/symbol
23. a) increases   b) increases only if N = 2   c) remains the same   d) decreases
24. a) 2097152 bits   b) 786432 bits   c) 648 bits   d) 144 bits
25. a) Lossless   b) Useless   c) Noiseless   d) Deterministic
26. a) 1.5   b) 2.5   c) 0.39   d) null
27. a) 3/4   b) 5/9   c) 1/4   d) 2/3
28. a) (1/2, 1/4, 1/4)   b) (1/3, 1/3, 1/3)   c) (1/6, 2/3, 1/6)   d) (1/4, 1/6, 7/12)
29. a) ___ bit/symbol   b) 1 bit/symbol   c) ___ bit/symbol   d) ___ bit/symbol
30. a) 1.5   b) 2.5   c) 362.2   d) null
31. a) 1.5   b) 2.5   c) 1.74   d) null
32. a) (1 − p)^7 + p^8   b) (1 − p)^8 + 8p(1 − p)^7   c) (1 − p)^8 + p(1 − p)^7   d) (1 − p)^8 + (1 − p)^7
33. a) p^3 + 3p^2(1 − p)   b) p^3   c) (1 − p)^3   d) p^3 + p^2(1 − p)
34. a) p^n   b) np(1 − p)^(n−1) + (1 − p)^n   c) 1 − p^n   d) 1 − (1 − p)^n
35. a) 3/4   b) 1   c) 3/2   d) log2 3
36. a) 1 bit/sec   b) 2 bits/sec   c) 3 bits/sec   d) 4 bits/sec
37. a) 0.91 bits   b) 0.71 bits   c) 0.72 bits   d) 0.81 bits


Correct Answer (partial list): d, a, a, b, c, c
