1/1 Question 1
A deterministic channel has H(Y|X) equal to
a. one or zero
b. one
d. zero
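A deterministic channel maps each input symbol to exactly one output symbol, so there is no remaining uncertainty about Y once X is known, and H(Y|X) = 0. A minimal sketch (the function name and example distributions are illustrative, not from the quiz):

```python
import math

def conditional_entropy(p_x, channel):
    """H(Y|X) = sum over x of p(x) * H(Y | X=x), in bits."""
    h = 0.0
    for px, p_y_given_x in zip(p_x, channel):
        for p in p_y_given_x:
            if p > 0:
                h -= px * p * math.log2(p)
    return h

# Deterministic channel: every row of the transition matrix is one-hot,
# so each inner term log2(1) = 0 and H(Y|X) = 0.
p_x = [0.5, 0.5]
deterministic = [[1.0, 0.0],
                 [0.0, 1.0]]
print(conditional_entropy(p_x, deterministic))  # 0.0
```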
1/1 Question 2
Self-information should be
b. negative
c. positive
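Self-information I(x) = −log2 p(x) is nonnegative because probabilities satisfy 0 < p ≤ 1, so the logarithm is at most zero. A quick illustrative check (the function name is an assumption):

```python
import math

def self_information(p):
    """I(x) = -log2 p(x), in bits; nonnegative since 0 < p <= 1."""
    return -math.log2(p)

# A certain event carries no information; rarer events carry more.
print(self_information(1.0))    # 0.0 bits
print(self_information(0.5))    # 1.0 bit
print(self_information(0.125))  # 3.0 bits
```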
1/1 Question 3
For a binary symmetric channel, the random bits are given as
1/1 Question 4
For M equally likely messages (M ≫ 1), if the rate of information R ≤ C, the probability of error is
a. Arbitrarily small
b. Not predictable
c. unknown
d. Close to unity
0/1 Question 5
Information rate is defined as
d. rH
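The information rate R = rH multiplies the symbol rate r (symbols per second) by the source entropy H (bits per symbol) to give bits per second. A one-line sketch with an illustrative example (numbers are hypothetical):

```python
def information_rate(r, h):
    """R = r * H: symbol rate (symbols/s) times entropy (bits/symbol)."""
    return r * h

# e.g. a source emitting 1000 symbols/s with entropy 2 bits/symbol
print(information_rate(1000, 2.0))  # 2000.0 bits/s
```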
1/1 Question 6
Efficiency of coding is given by
a. H(X) / (Average Length × log D)
b. H(X) / log D
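Coding efficiency compares the source entropy H(X) against the average codeword length scaled by log D, where D is the size of the code alphabet (D = 2 for binary, so log2 D = 1). A small sketch under those assumptions:

```python
import math

def coding_efficiency(h_x, avg_len, d=2):
    """Efficiency = H(X) / (avg_len * log2 D); D is the code alphabet size."""
    return h_x / (avg_len * math.log2(d))

# Hypothetical binary code: entropy 1.75 bits, average length 1.75 symbols
# -> efficiency 1.0, i.e. the code meets the entropy bound.
print(coding_efficiency(1.75, 1.75))  # 1.0
```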
1/1 Question 7
When the base of the logarithm is 2, then the unit of measure of information is
a. Nats
b. bytes
d. bits
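The unit depends only on the logarithm base: base 2 gives bits, the natural logarithm gives nats, and the two differ by a factor of ln 2. A quick numeric check:

```python
import math

p = 0.25
bits = -math.log2(p)  # base-2 logarithm -> bits
nats = -math.log(p)   # natural logarithm -> nats

print(bits)                # 2.0
print(nats)                # ~1.386
print(nats / math.log(2))  # 2.0, since 1 nat = 1/ln(2) bits
```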
0/1 Question 8
Entropy is
a. Amplitude of signal
b. None of these
c. Information in a signal
1/1 Question 9
For a noiseless channel
a. I(X;Y) = H(X) + H(Y)
1/1 Question 10
The relation between entropy and mutual information is
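The standard identity relating these quantities is I(X;Y) = H(X) − H(X|Y) = H(Y) − H(Y|X) = H(X) + H(Y) − H(X,Y). A sketch computing I(X;Y) from a joint distribution (the helper names and example table are illustrative); note that for a noiseless channel the joint mass sits on the diagonal and I(X;Y) = H(X) = H(Y), which also bears on Question 9:

```python
import math

def entropy(ps):
    """H(p) = -sum p log2 p over the nonzero entries, in bits."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), from a joint p(x, y) table."""
    p_x = [sum(row) for row in joint]
    p_y = [sum(col) for col in zip(*joint)]
    h_xy = entropy([p for row in joint for p in row])
    return entropy(p_x) + entropy(p_y) - h_xy

# Noiseless binary channel: output determines input exactly.
joint = [[0.5, 0.0],
         [0.0, 0.5]]
print(mutual_information(joint))  # 1.0 bit, equal to H(X)
```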