
QUIZ 20-21 ITC F1: Section 1

QUIZ 1 19-20 ITC F1 - Grade: 80/100


17BEC0248 Teja, Submission 1

1/1 Question 1
A deterministic channel has H(Y/X) equal to
a. one or zero

b. one

c. none of the above

d. zero
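
The correct choice is (d): in a deterministic channel each input symbol maps to exactly one output symbol, so once X is known there is no remaining uncertainty about Y and H(Y/X) = 0. A minimal numeric check, using an assumed 3-input, 2-output deterministic channel:

```python
from math import log2

p_x = [0.5, 0.3, 0.2]                   # assumed input distribution
p_y_given_x = [[1, 0], [0, 1], [0, 1]]  # deterministic: one-hot rows

# H(Y/X) = sum over x of p(x) * H(Y | X = x); each one-hot row has entropy 0
h_y_given_x = sum(px * -sum(p * log2(p) for p in row if p > 0)
                  for px, row in zip(p_x, p_y_given_x))
print(h_y_given_x)  # 0.0
```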

1/1 Question 2
Self information should be

a. none of the above

b. negative

c. positive

d. Both of the above
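
The intended answer is (c): self-information I(x) = -log2 p(x) is never negative, because 0 < p(x) <= 1 forces log2 p(x) <= 0. A quick sketch (the probabilities below are arbitrary):

```python
from math import log2

for p in (0.5, 0.25, 0.1, 0.01):
    print(p, -log2(p))  # 1.0, 2.0, ~3.32, ~6.64 bits: never negative
```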

1/1 Question 3
For a binary symmetric channel, the random bits are given as

a. Logic 1 given by probability P and logic 0 by (1-P)/2

b. Logic 1 given by probability P and logic 0 by (1-P)

c. Logic 1 given by probability P/2 and logic 0 by 1-P

d. Logic 1 given by probability 1-P and logic 0 by P
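
Answer (b) is the conventional model: logic 1 occurs with probability P and logic 0 with probability 1 - P, so the two probabilities sum to 1 for every P. A small check that the other scalings fail to normalize (the P value is assumed):

```python
P = 0.3  # assumed bias
print(P + (1 - P) / 2)   # option a: 0.65, not a valid distribution
print(P + (1 - P))       # option b: 1.0 for every P
print(P / 2 + (1 - P))   # option c: 0.85, not a valid distribution
print((1 - P) + P)       # option d: also sums to 1, but swaps the labels
```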

1/1 Question 4
For M equally likely messages, M>>1, if the rate of information R ≤ C, the probability of error is

a. Arbitrarily small

b. Not predictable

c. unknown

d. Close to unity
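
This is Shannon's noisy-channel coding theorem, so (a): as long as R <= C, the error probability can be driven arbitrarily close to zero by coding over long blocks. As a concrete capacity figure, a binary symmetric channel with an assumed crossover probability p has the standard capacity C = 1 - H(p):

```python
from math import log2

def h2(p):
    """Binary entropy function H(p) in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

p = 0.11          # assumed crossover probability
C = 1 - h2(p)     # capacity of the BSC, ~0.5 bit per channel use
R = 0.4           # an assumed information rate
print(C, R <= C)  # True: rates up to C are achievable with vanishing error
```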

0/1 Question 5
Information rate is defined as

a. All of the above

b. Average number of bits of information per second

c. Information per unit time

d. rH
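
This one was missed (0/1); the correct choice is (a), because options b, c and d all state the same definition: information rate R = rH, where r is the message rate and H the entropy per message. A worked example with assumed figures:

```python
r = 1000   # assumed message rate, messages per second
H = 2.0    # assumed entropy, bits of information per message
R = r * H  # information per unit time
print(R)   # 2000.0 bits per second
```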

1/1 Question 6
Efficiency of coding is given by
a. H(X) / (Average length x log D)

b. H(X) / log D

c. none of the above

d. H(X) / Average length
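
Answer (a): efficiency = H(X) / (average length x log D), where D is the code alphabet size; for a binary code log2 D = 1, so the expression reduces to option (d)'s H(X) / average length. A sketch with an assumed source and an assumed binary prefix code (0, 10, 11):

```python
from math import log2

probs   = [0.5, 0.25, 0.25]  # assumed source symbol probabilities
lengths = [1, 2, 2]          # codeword lengths of 0, 10, 11
D = 2                        # binary code alphabet

H = -sum(p * log2(p) for p in probs)            # 1.5 bits/symbol
L = sum(p * l for p, l in zip(probs, lengths))  # 1.5 code symbols/symbol
print(H / (L * log2(D)))                        # 1.0: the code is 100% efficient
```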

1/1 Question 7
When the base of the logarithm is 2, then the unit of measure of information is
a. Nats

b. bytes

c. none of the above

d. bits
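
Answer (d). The base of the logarithm fixes the unit: base 2 gives bits, base e gives nats. Conversion is just a change of base, as the sketch below shows:

```python
from math import log, log2

p = 0.25
print(-log2(p))          # 2.0 bits (base-2 logarithm)
print(-log(p))           # ~1.386 nats (natural logarithm)
print(-log(p) / log(2))  # nats divided by ln 2 gives 2.0 bits again
```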

0/1 Question 8
Entropy is

a. Amplitude of signal

b. None of these

c. Information in a signal

d. Average information per message
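
Another miss (0/1): entropy is the average information per message, H = -sum of p_i log2 p_i, i.e. the expected value of each message's self-information. A numeric sketch over an assumed four-message source:

```python
from math import log2

probs = [0.5, 0.25, 0.125, 0.125]     # assumed message probabilities
H = sum(p * -log2(p) for p in probs)  # weight each self-information by p
print(H)                              # 1.75 bits per message on average
```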

1/1 Question 9
For a noiseless channel
a. I(X;Y) = H(X) + H(Y)

b. I(X;Y) = H(X) = H(Y)

c. I(X;Y) = H(X/Y) + H(Y/X)

d. none of the above
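
Answer (b): with no noise, Y determines X and vice versa, so H(X/Y) = H(Y/X) = 0 and I(X;Y) = H(X) = H(Y). A check with an assumed binary source sent over an identity (noiseless) channel:

```python
from math import log2

p_x = [0.7, 0.3]                  # assumed input distribution
joint = [[0.7, 0.0], [0.0, 0.3]]  # noiseless: Y = X, diagonal joint pmf
p_y = [sum(col) for col in zip(*joint)]

H_x = -sum(p * log2(p) for p in p_x)
H_y = -sum(p * log2(p) for p in p_y)
I_xy = sum(pxy * log2(pxy / (p_x[i] * p_y[j]))
           for i, row in enumerate(joint)
           for j, pxy in enumerate(row) if pxy > 0)
print(H_x, H_y, I_xy)  # all ~0.881 bits: I(X;Y) = H(X) = H(Y)
```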

1/1 Question 10
The relation between entropy and mutual information is

a. I(X;Y) = H(X/Y) - H(Y/X)

b. I(X;Y) = H(Y) - H(X)

c. I(X;Y) = H(X) - H(Y)

d. I(X;Y) = H(X) - H(X/Y)
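
Answer (d): mutual information is the uncertainty about X removed by observing Y, I(X;Y) = H(X) - H(X/Y) (equivalently H(Y) - H(Y/X)). A numeric verification on an assumed 2x2 joint distribution:

```python
from math import log2

joint = [[0.4, 0.1],
         [0.1, 0.4]]  # assumed joint pmf p(x, y); rows are x, columns are y
p_x = [sum(row) for row in joint]
p_y = [sum(col) for col in zip(*joint)]

H_x = -sum(p * log2(p) for p in p_x)
H_x_given_y = -sum(pxy * log2(pxy / p_y[j])
                   for row in joint
                   for j, pxy in enumerate(row) if pxy > 0)
I_xy = sum(pxy * log2(pxy / (p_x[i] * p_y[j]))
           for i, row in enumerate(joint)
           for j, pxy in enumerate(row) if pxy > 0)
print(I_xy, H_x - H_x_given_y)  # both ~0.278 bits: the identity holds
```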
