
Multiple Choice Question Template

BRIEF INFORMATION

1. Name of College: T.K.I.E.T., Warananagar

2. Name of Subject: Information Theory & Coding

3. Subject Code: 67526

4. Name of Faculty/ Teacher: Prof. Lokhande Pradipkumar Vitthalrao

5. Designation: Assistant Professor

6. Experience (in Years): 10

7. Status: Approved

8. Cell No.: 8605067454

9. Mail Id: pvlokhande@tkietwarana.ac.in


1) The channel capacity is
a) The maximum information transmitted by one symbol over the channel
b) Information contained in a signal
c) The amplitude of the modulated signal
d) All of the above

2) According to Shannon Hartley theorem,


a) The channel capacity becomes infinite with infinite bandwidth
b) The channel capacity does not become infinite with infinite bandwidth
c) Has a tradeoff between bandwidth and Signal to noise ratio
d) Both b and c are correct

3) Information rate is defined as


a) Information per unit time
b) Average number of bits of information per second
c) rH (symbol rate r × entropy H)
d) All of the above

4) The relation between entropy and mutual information is

a) I(X;Y) = H(X) - H(X/Y)


b) I(X;Y) = H(X/Y) - H(Y/X)
c) I(X;Y) = H(X) - H(Y)
d) I(X;Y) = H(Y) - H(X)
5) A memoryless source refers to

a) No previous information
b) No message storage
c) Emitted message is independent of previous message
d) None of the above

6) If the channel is bandlimited to 6 kHz and the signal-to-noise ratio is 16, what
would be the capacity of the channel?

a) 15.15 kbps
b) 24.74 kbps
c) 30.12 kbps
d) 52.18 kbps
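
Question 6 can be checked directly from the Shannon-Hartley formula C = B log2(1 + S/N). The sketch below (illustrative, not part of the original question set) gives about 24.5 kbps, nearest to option b.

```python
import math

def channel_capacity(bandwidth_hz, snr_linear):
    # Shannon-Hartley theorem: C = B * log2(1 + S/N),
    # with S/N given as a linear power ratio (not in dB).
    return bandwidth_hz * math.log2(1 + snr_linear)

C = channel_capacity(6000, 16)          # B = 6 kHz, S/N = 16
print(round(C / 1000, 2), "kbps")       # 24.52 kbps -> nearest listed option is b
```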

7) Which among the following is/are the essential condition/s for a good error
control coding technique?

a) Faster coding & decoding methods


b) Better error correcting capability
c) Maximum transfer of information in bits/sec
d) All of the above

8) Which among the following represents the code in which codewords consist
of message bits and parity bits separately?

a) Block Codes
b) Systematic Codes
c) Code Rate
d) Hamming Distance
9) In a linear code, the minimum Hamming distance between any two code words
is ______ the minimum weight of any non-zero code word.

a) Less than
b) Greater than
c) Equal to
d) None of the above

10) A Galois field consists of a ______ number of elements.


a) Finite
b) Infinite
c) Both a and b
d) None of the above

11) While decoding a cyclic code, if the received code word is identical to the
transmitted code word, then r(x) mod g(x) is equal to _________
a) Zero
b) Unity
c) Infinity
d) None of the above

12) In decoding of cyclic code, which among the following is also regarded as
'Syndrome Polynomial'?

a) Generator Polynomial
b) Received code word Polynomial
c) Quotient Polynomial
d) Remainder Polynomial
13) For the design of a (4,1) cyclic repetition code, what would be the degree of
the generator polynomial g(x)?
a) 3
b) 1
c) 4
d) 5

14) BCH codes exhibit multiple-error-correcting capability with the provision of
selecting _________.
a) Alphabet size
b) Block length
c) Code rates
d) All of the above

15) While representing the convolutional code by (n,k,m), what does 'm' signify
or represent in it?

a) Coded bits
b) Message bits
c) Memory order
d) All of the above

16) In Viterbi's algorithm, which metric is adopted for decision making?


a) survivors
b) defenders
c) destroyers
d) carriers
17) Calculate the entropy of a source with a symbol set containing 64 symbols,
each with probability 1/64.
a) 64 bits/symbol
b) 6 bits/symbol
c) 4 bits/symbol
d) 1 bit/symbol
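
For equally likely symbols the entropy reduces to log2(M); a quick check of question 17 (an illustrative sketch, not part of the original question set):

```python
import math

def entropy(probs):
    # Source entropy H = -sum(p * log2 p), in bits/symbol.
    return -sum(p * math.log2(p) for p in probs if p > 0)

H = entropy([1 / 64] * 64)   # 64 equiprobable symbols -> log2(64)
print(H, "bits/symbol")      # 6.0 bits/symbol -> option b
```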

18) In a BSC,
a) The entropy of each row of P(Y/X) is the same
b) The sum of each column of P(Y/X) is the same
c) Both options a and b are correct
d) Both options a and b are wrong

19) Apply Huffman coding for the probabilities p(x1)=0.37, p(x2)=0.01, p(x3)=0.33,
p(x4)=0.02, p(x5)=0.16, p(x6)=0.07, p(x7)=0.04. Calculate the code efficiency.
a) 97.47 %
b) 91.58 %
c) 100 %
d) 85.57 %
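
Question 19 can be verified without listing the codewords, using the fact that the average Huffman codeword length equals the sum of the probabilities of all merged (internal) nodes. This is an illustrative sketch, not part of the original question set.

```python
import heapq
import math

def huffman_avg_length(probs):
    # Repeatedly merge the two least probable nodes; the running sum of
    # merged-node probabilities equals sum(p_i * codeword_length_i).
    heap = list(probs)
    heapq.heapify(heap)
    total = 0.0
    while len(heap) > 1:
        a = heapq.heappop(heap)
        b = heapq.heappop(heap)
        total += a + b
        heapq.heappush(heap, a + b)
    return total

p = [0.37, 0.01, 0.33, 0.02, 0.16, 0.07, 0.04]
H = -sum(x * math.log2(x) for x in p)    # source entropy, ~2.115 bits/symbol
L = huffman_avg_length(p)                # average code length, 2.17 bits/symbol
print(f"efficiency = {100 * H / L:.2f} %")   # 97.47 % -> option a
```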

20) Apply Shannon-Fano coding for the probabilities p(x1)=0.05, p(x2)=0.25,
p(x3)=0.08, p(x4)=0.2, p(x5)=0.12, p(x6)=0.3. Calculate the redundancy of the code.

a) 0.06
b) 0.01
c) 0.10
d) 0
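
A sketch for question 20, assuming the usual Shannon-Fano convention of splitting the sorted list where the two groups' totals are most nearly equal (illustrative only; the helper function is mine, not from the question set). Redundancy here means 1 minus the code efficiency H/L.

```python
import math

def shannon_fano_lengths(probs):
    # Sort indices by descending probability, then recursively split
    # where the two halves' probability totals are most nearly equal;
    # each split adds one bit to every codeword in the group.
    order = sorted(range(len(probs)), key=lambda i: -probs[i])
    lengths = [0] * len(probs)

    def split(group):
        if len(group) <= 1:
            return
        total = sum(probs[i] for i in group)
        run, best, best_k = 0.0, float("inf"), 0
        for k in range(len(group) - 1):
            run += probs[group[k]]
            diff = abs(2 * run - total)   # |top-half total - bottom-half total|
            if diff < best:
                best, best_k = diff, k
        for i in group:
            lengths[i] += 1
        split(group[:best_k + 1])
        split(group[best_k + 1:])

    split(order)
    return lengths

p = [0.05, 0.25, 0.08, 0.2, 0.12, 0.3]
L = sum(pi * li for pi, li in zip(p, shannon_fano_lengths(p)))  # 2.38 bits/symbol
H = -sum(pi * math.log2(pi) for pi in p)                        # ~2.36 bits/symbol
print(f"redundancy = {1 - H / L:.2f}")   # 0.01 -> option b
```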
21) Find the mutual information for the channel shown in fig.

a) 0.854 bits/message
b) 0.907 bits/message
c) 0.185 bits/message
d) 0 bit/message

22) Find the channel capacity for the channel shown in fig.

a) 0.067 bits/sec
b) 0.6 bits/sec
c) 0 bits/sec
d) 0.76 bits/sec
23) A discrete source transmits messages X1, X2 & X3 with probabilities
P(X1)=0.3, P(X2)=0.25 and P(X3)=0.45 respectively. The source is connected to a
channel whose conditional probability matrix is as shown in fig. Calculate the
mutual information.

a) 0.85 bits/message
b) 0.75 bits/message
c) 0.95 bits/message
d) 0.1 bits/message

24) In a Linear Block Code, if R=110110 & E=010000, calculate the transmitted code word
a) 110110
b) 000110
c) 100110
d) 010000
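
For question 24, the transmitted code word is recovered by modulo-2 addition of the received word and the error pattern, C = R XOR E. A one-line check (illustrative sketch):

```python
# XOR the received word R with the error pattern E, bit by bit,
# to undo the channel error and recover the transmitted word C.
R = int("110110", 2)
E = int("010000", 2)
C = R ^ E
print(format(C, "06b"))   # 100110 -> option c
```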

25) In an LBC, if the minimum distance is 3, find the error-correcting capability.


a) 3
b) 4
c) 1
d) 2
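
The error-correcting capability of a block code with minimum distance d_min is t = floor((d_min - 1) / 2). A sketch covering question 25 (and the same formula answers question 38):

```python
def error_correcting_capability(d_min):
    # A code with minimum distance d_min corrects
    # t = floor((d_min - 1) / 2) errors.
    return (d_min - 1) // 2

print(error_correcting_capability(3))   # 1 -> option c
print(error_correcting_capability(5))   # 2 (cf. question 38)
```
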
26) The maximum value of a probability is
a) 0
b) 0.5
c) 1
d) 10

27) 1 nat is equal to


a) 3.32 bits
b) 2.32 bits
c) 1.44 bits
d) 3.44 bits

28) The ideal communication channel is defined as a system which has
a) finite C
b) BW=0
c) S/N=0
d) infinite C

29) The coding efficiency is expressed as


a) 1 + redundancy
b) 1 - redundancy
c) 1 / redundancy
d) none of these
30) A (7,4) linear block code has a rate of
a) 7
b) 4
c) 1.75
d) 0.571

31) For GF (2^3), the elements in the set are


a) {1 2 3 4 5 6 7}
b) {0 1 2 3 4 5 6 }
c) {1 2 3 4 5 6 }
d) {0 1 2 3 4 5 6 7}

32) In modulo-2 arithmetic, _________ give the same result.


a) addition & multiplication
b) addition & division
c) addition & subtraction
d) none of the above

33) The _____ of a polynomial is the highest power in the polynomial.

a) range
b) degree
c) power
d) none of these
34) Hamming distance between v=1100001011 and w=1001101001 is
a) 1
b) 5
c) 3
d) 4
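
Question 34 is a direct position-by-position comparison; a short check (illustrative sketch):

```python
def hamming_distance(v, w):
    # Count the bit positions in which two equal-length words differ.
    return sum(a != b for a, b in zip(v, w))

print(hamming_distance("1100001011", "1001101001"))   # 4 -> option d
```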

35) (7, 4) hamming codes are


a) single-error-correcting codes
b) double-error-correcting codes
c) burst-error-correcting codes
d) triple-error-correcting codes

36) In an LBC, the _____ of any two valid code words creates another valid code
word.
a) XORing
b) ORing
c) ANDing
d) NOTing

37) The hamming distance between equal code words is


a) 1
b) n
c) 0
d) 10
38) A code has minimum distance = 5. How many errors can it correct?
a) 3
b) 2
c) 4
d) 1

39) The generator polynomial of a (7,4) cyclic code has degree of

a) 2
b) 3
c) 4
d) 5

40) The syndrome polynomial in a cyclic code solely depends on


a) generator polynomial
b) parity polynomial
c) error polynomial
d) code word

41) For m=4, what is the block length of the BCH code?
a) 16
b) 15
c) 31
d) none of these
42) What is the error-correction capability of a (15,5) BCH code over GF(2^4)?
a) 1
b) 2
c) 3
d) 4

43) A BCH code over GF (2^6) can provide a maximum error-correcting capability
of
a) 6
b) 8
c) 10
d) 12

44) A (63, 15) BCH code over GF (2^6) can provide a maximum error-correcting
capability of
a) 6
b) 8
c) 10
d) 12

45) For an (n,k,m) convolutional code, k represents


a) number of inputs
b) number of outputs
c) number of memory
d) none of these
46) An encoder for a (4, 3, 2) convolutional code has a memory order of
a) 1
b) 2
c) 3
d) 4

47) In a convolutional-encoder state diagram, a bold line represents an input of


a) 0
b) 1
c) 00
d) 11

48) In trellis diagram, the encoder output is written on the _____of each branch.
a) right
b) left
c) top
d) bottom

49) Graphical representation is possible in


a) Linear block code
b) Cyclic code
c) BCH code
d) Convolution code.
50) Viterbi algorithm applies the

a) maximum likelihood principle.


b) minimum likelihood principle.
c) maximum entropy principle.
d) minimum entropy principle.

51) Cyclic code is subclass of


a) Entropy
b) Channel
c) Linear block code
d) Mutual Information.

52) CRC stands for


a) Cyclic Redundancy Check
b) Convolution Redundancy Check
c) Channel Redundancy Check
d) Check Redundancy Code

53) RS codes provide a


a) wide range of code rates.
b) short range of code rates.
c) entropy
d) mutual information.
54) Calculate the entropy for the probabilities 1/4, 1/6 and 1/8.
a) 3.46 bits/message
b) 34.6 bits/message
c) 0.346 bits/message
d) 346 bits/message

55) The figure indicates a _____________ channel.

a) Binary symmetrical
b) Non-symmetrical
c) cascade
d) Binary Erasable

56) Entropy is _______ when both messages are equally likely.


a) minimum
b) maximum
c) 0
d) none of these
57) Which formula is applicable to calculate code efficiency?

a) H / L̄
b) L̄ / H
c) H × L̄
d) H + L̄

58) What is the unit of channel capacity?


a) bits/message
b) bits/sec
c) Both options a and b are correct
d) None of these

59) Calculate the entropy if p(x1)=0.5, p(x2)=0.2, p(x3)=0.2, p(x4)=0.1
a) 76.4 bits/message
b) 76.4 bits/symbol
c) Both options a and b are correct
d) None of these

60) A discrete memoryless source with an alphabet of 5 symbols emits one symbol
every 2 ms; the symbol probabilities are {0.5, 0.25, 0.125, 0.0625, 0.0625}. Find
the average information rate of the source.
a) 900.5 bits/sec
b) 3.750 bits/sec
c) 937.5 bits/sec
d) 1.875 bits/sec
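
A sketch for question 60, assuming (consistent with the listed answer) a symbol rate of one symbol per 2 ms, i.e. 500 symbols/sec. The information rate is R = r · H.

```python
import math

p = [0.5, 0.25, 0.125, 0.0625, 0.0625]
H = -sum(x * math.log2(x) for x in p)   # entropy = 1.875 bits/symbol
r = 1 / 0.002                           # one symbol every 2 ms -> 500 symbols/sec
print(H * r, "bits/sec")                # 937.5 bits/sec -> option c
```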
