
# JNTU ONLINE EXAMINATIONS [Mid 2 - DC]

Get more @ www.UandiStar.org

1. Two binary random variables X and Y are distributed according to the joint distribution given as P(X=Y=0) = P(X=0, Y=1) = P(X=Y=1) = 1/3. Then [01D01]
   a. H(X) = H(Y)
   b. H(X) = 2·H(Y)
   c. H(Y) = 2·H(X)
   d. H(X) + H(Y) = 1

2. An independent discrete source transmits letters from an alphabet consisting of A and B with respective probabilities 0.6 and 0.4. If consecutive letters are statistically independent and two-symbol words are transmitted, the probability of the words with different symbols is [01D02]
   a. 0.52  b. 0.36  c. 0.48  d. 0.24

3. A memoryless source emits 2000 binary symbols/sec, and each symbol has a probability of 0.25 of being 1 and 0.75 of being 0. The minimum number of bits/symbol required for error-free transmission of this source is [01M01]
   a. 0.75  b. 0.81  c. 0.65  d. 0.55

4. Which of the following channel matrices represents a symmetric channel? [01M02]
   a. / b. / c. / d. [the candidate matrices are not reproduced in the source]

5. The capacity of the channel with the channel matrix [not reproduced in the source], where the xi are transmitted messages and the yj are received messages, is [01M03]
   a. log 3 bits  b. log 5 bits  c. log 4 bits  d. 1 bit
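Questions [01M01] and [01D02] above reduce to a source-entropy evaluation and a simple independence argument. A quick Python sketch (not part of the original paper) that checks both:

```python
import math

def entropy(probs):
    """Shannon entropy in bits/symbol of a discrete memoryless source."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# [01M01]: minimum bits/symbol for P(1) = 0.25, P(0) = 0.75
print(round(entropy([0.25, 0.75]), 2))      # → 0.81

# [01D02]: two-symbol words from A/B with P(A) = 0.6, P(B) = 0.4;
# words with different symbols are AB or BA
print(round(0.6 * 0.4 + 0.4 * 0.6, 2))      # → 0.48
```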

6. Information rate of a source is [01S01]
   a. the entropy of the source measured in bits/message
   b. the entropy of the source measured in bits/sec
   c. a measure of the uncertainty of the communication system
   d. maximum when the source is continuous

7. If `a` is an element of a field F, then its additive inverse is [01S02]
   a. 0  b. −a  c. a  d. 1

8. The minimum number of elements that a field can have is [01S03]
   a. 3  b. 2  c. 4  d. 1

9. Which of the following is correct? [01S04]
   a. The syndrome of a received block coded word depends on the transmitted code word
   b. The syndrome of a received block coded word depends on the received code word
   c. The syndrome of a received block coded word depends on the error pattern
   d. The syndrome of a received block coded word under error-free reception consists of all 1's
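Question [01S04] turns on the fact that the syndrome s = H·rᵀ (mod 2) depends only on the error pattern, not on which code word was transmitted. A sketch with an assumed (6,3) parity check matrix H = [Pᵀ | I₃] — the exam's actual matrix is not reproduced, so this H is only illustrative:

```python
# Assumed (6,3) systematic code, H = [P^T | I3] (illustrative, not the exam's matrix)
H = [
    [0, 1, 1, 1, 0, 0],
    [1, 0, 1, 0, 1, 0],
    [1, 1, 0, 0, 0, 1],
]

def syndrome(word):
    """Syndrome s = H * r^T over GF(2)."""
    return [sum(h * b for h, b in zip(row, word)) % 2 for row in H]

codeword = [1, 0, 1, 1, 0, 1]            # message 101 followed by its parity bits
error    = [0, 0, 0, 1, 0, 0]
received = [(c + e) % 2 for c, e in zip(codeword, error)]

print(syndrome(codeword))                  # [0, 0, 0] under error-free reception
print(syndrome(received), syndrome(error)) # identical: depends only on the error pattern
```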


100 % free SMS ON<space>UandIStar to 9870807070 for JNTU Alerts, Technology News, Tips/Tricks, JOB Alerts & more

10. If the output of a continuous source is limited to an average power of σ², then the maximum entropy of the source is [01S05]
    [options garbled in the source]

11. A convolutional encoder of code rate 1/2 consists of a two-stage shift register. The generator sequence of the top adder is (1, 1, ...) and that of the bottom adder is (1, ...) [sequences garbled in the source]. The constraint length of the encoder is [02D01]
    a. 2  b. 3  c. 4  d. 5

12. The parity check matrix of a (6, 3) systematic linear block code is [not reproduced in the source]. If the syndrome vector computed for the received code word is [1 1 0], then for error correction, which bit of the received code word is to be complemented? [02D02]
    [options garbled in the source]

13. The minimum number of bits per message required to encode the output of a source transmitting four different messages with probabilities 0.5, 0.25, 0.125 and 0.125 is [02M01]
    a. 2  b. 1  c. 3  d. 1.75

14. A communication channel is represented by the channel matrix [not reproduced in the source]; the rows correspond to the transmitter X and the columns to the receiver Y. Then the conditional entropy H(Y/X) in bits/message is [02M02]
    a. zero  [remaining options garbled in the source]

15. The channel matrix of a noiseless channel [02M03]
    a. consists of a single nonzero number in each row
    b. consists of a single nonzero number in each column
    c. is an identity matrix
    d. is a square matrix

16. Entropy of a source is [02S01]
    a. the average amount of information conveyed by the communication system
    b. the average amount of information transferred by the channel
    c. the average amount of information available with the source
    d. the average amount of information conveyed by the source to the receiver

17. Relative to hard-decision decoding, soft-decision decoding results in [02S02]
    a. better bit error probability
    b. better coding gain
    c. less circuit complexity
    d. lesser coding gain

18. Which of the following is the essential requirement of a source coding scheme? [02S03]
    a. Comma-free nature of the code words
    b. A minimum Hamming distance of 3
    c. Error detection and correction capability
    d. The received code word should be compatible with a matched filter

19. The transition probabilities of a BSC are represented using [02S04]
    a. Joint probability matrix
    b. Conditional probability matrix
    c. State diagram
    d. Trellis diagram
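The minimum bits-per-message figure asked for in [02M01] is simply the source entropy. A quick check:

```python
import math

probs = [0.5, 0.25, 0.125, 0.125]
h = -sum(p * math.log2(p) for p in probs)
print(h)  # → 1.75
```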

20. A field is [02S05]
    a. an Abelian group under addition
    b. a group with 0 as the multiplicative identity for its members
    c. a group with 1 as the additive identity for its members
    d. a group with 0 as the additive inverse for its members

21. The constraint length of a convolutional encoder of code rate 1/3 is 5. If the input of the encoder is a 5-bit message sequence, the length of the output code word in bits is [03D01]
    a. 33  b. 27  c. 24  d. 30

22. A communication channel is represented by its channel matrix [not reproduced in the source], with rows representing the messages associated with the source and columns representing the messages associated with the receiver. Its capacity in bits is [03D02]
    a. 0.75  b. 0.86  c. 0.96  d. 0.68

23. A binary erasure channel has P(0/0) = P(1/1) = p and P(k/0) = P(k/1) = q. Its capacity in bits/symbol is [03M01]
    a. p  b. q  c. pq  d. p/q

24. When a pair of dice is thrown, the average amount of information contained in the message "The sum of the faces is 7" in bits is [03M02]
    a. log 4  b. log 3  c. log 12  d. log 7

25. A source emits messages A and B with probabilities 0.8 and 0.2 respectively. The redundancy provided by the optimum source coding scheme for the above source is [03M03]
    a. 72 %  b. 27 %  c. 45 %  d. 55 %

26. Information content of a message [03S01]
    a. increases with its certainty of occurrence
    b. increases with its uncertainty of occurrence
    c. is independent of its certainty of occurrence
    d. is the logarithm of its certainty of occurrence

27. Under error-free reception, the syndrome vector computed for the received cyclic code word consists of [03S02]
    a. all ones
    b. alternate 1's and 0's starting with a 1
    c. alternate 0's and 1's starting with a 0
    d. all zeros

28. A continuous source will have maximum entropy if the pdf associated with its output is [03S03]
    a. Poisson  b. Exponential  c. Rayleigh  d. Gaussian

29. Variable-length source coding provides better coding efficiency if all the messages of the source are [03S04]
    a. equiprobable
    b. of different transmission probabilities
    c. discretely transmitted
    d. continuously transmitted

30. Shannon's limit deals with [03S05]
    a. maximum information content of a message
    b. maximum entropy associated with a source
    c. maximum capacity of a channel
    d. maximum bit rate of a source
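The redundancy asked for in [03M03] compares the source entropy against the optimum average code word length (here one binary digit, since there are only two messages). A sketch:

```python
import math

p = [0.8, 0.2]
h = -sum(x * math.log2(x) for x in p)     # ≈ 0.722 bits/message
avg_len = 1.0                             # two messages → one binary digit each
efficiency = h / avg_len
print(round((1 - efficiency) * 100, 1))   # → 27.8 (the "27 %" option, up to rounding)
```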

31. The parity check matrix of a (6, 3) systematic linear block code is [not reproduced in the source]. If the syndrome vector computed for the received code word is [0 1 1], then for error correction, which bit of the received code word is to be complemented? [04D01]
    a. 2  b. 3  c. 4  d. 5

32. A (7, 4) cyclic code has a generator polynomial given as 1 + x + x^3. If the error pattern is 0001000, the corresponding syndrome vector is [04D02]
    a. 001  b. 010  c. 110  d. 100

33. The memory length of a convolutional encoder is 3. If a 5-bit message sequence is applied as the input to the encoder, then for the last message bit to come out of the encoder, the number of extra zeros to be applied to the encoder is [04M01]
    a. 2  b. 3  c. 4  d. 5

34. The code words of a systematic (6, 3) linear block code are 001110, 101011, 010011, 110110, 011101, 111000, 100101 and 000000. Which of the following is also a code word of the code? [04M02]
    a. 000111  b. 000101  c. 000000  d. 101101

35. The syndrome S(x) of a cyclic code is given by the remainder of the division of the received code polynomial by g(x), where V(x) is the transmitted code polynomial, E(x) is the error polynomial and g(x) is the generator polynomial. S(x) is also equal to [04M03]
    a. Remainder of V(x)/g(x)
    b. Remainder of [V(x) + E(x)]/g(x)
    c. Remainder of E(x)/g(x)
    d. Remainder of g(x)/V(x)

36. Source 1 is transmitting two messages with probabilities 0.2 and 0.8, and Source 2 is transmitting two messages with probabilities 0.5 and 0.5. Then [04S01]
    a. Maximum uncertainty is associated with Source 1
    b. Maximum uncertainty is associated with Source 2
    c. Both sources 1 and 2 have the maximum amount of uncertainty
    d. There is no uncertainty associated with either of the two sources

37. A source X and the receiver Y are connected by a noise-free channel. Its capacity is [04S02]
    a. Max H(X)  b. Max H(X/Y)  c. Max H(Y/X)  d. Max H(X,Y)

38. The entropy measure of a continuous source is a [04S03]
    a. Relative measure  b. Absolute measure  c. Linear measure  d. Non-linear measure

39. Which of the following is correct? [04S04]
    a. FEC is used for error control after the receiver makes a decision about the received bit
    b. ARQ is used for error control after the receiver makes a decision about the received bit
    c. FEC is used for error control when the receiver is unable to make a decision about the received bit
    d. FEC and ARQ are not used for error correction

40. Error-free communication may be possible by [04S05]
    a. reducing redundancy during transmission
    b. increasing transmission power to the required level
    c. providing redundancy during transmission
    d. increasing the channel bandwidth

41. For the data word 1010 in a (7, 4) non-systematic cyclic code with the generator polynomial 1 + x + x^3, the code polynomial is [05D01]
    a. 1 + x + x^3 + x^5  b. 1 + x + x^3 + x^4  c. 1 + x^2 + x^3 + x^4  d. 1 + x + x^2 + x^5
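For non-systematic cyclic encoding as in [05D01], the code polynomial is just the GF(2) product m(x)·g(x). A sketch — reading the data word left to right as ascending powers of x is an assumption about the exam's bit ordering:

```python
def gf2_poly_mul(a, b):
    """Multiply two polynomials over GF(2); coefficients listed lowest degree first."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] ^= ai & bj
    return out

m = [1, 0, 1, 0]   # data word 1010 → m(x) = 1 + x^2 (assumed bit order)
g = [1, 1, 0, 1]   # g(x) = 1 + x + x^3
print(gf2_poly_mul(m, g))  # → [1, 1, 1, 0, 0, 1, 0], i.e. 1 + x + x^2 + x^5
```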

42. For the source X transmitting four messages with probabilities 1/2, 1/4, 1/8 and 1/8, the entropy of the source in bits/message is [question ID and options garbled in the source]

43. The output of a continuous source is a uniform random variable in the range [not reproduced in the source]. The entropy of the source in bits/sample is [05D02]
    [options garbled in the source]

44. Maximum coding efficiency can be obtained by using [05M01]
    a. Convolutional codes
    b. Block codes
    c. Only the Shannon-Fano method
    d. Either of the Shannon-Fano and Huffman methods

45. In Modulo-7 addition, 6 + 1 is equal to [05M02]
    a. 7  b. 5  c. 4  d. 0

46. A source is transmitting two messages A and B with probabilities 3/4 and 1/4. The coding efficiency of the first-order extension of the source is [05M03]
    a. 89 %  b. 77 %  c. 92 %  d. 81 %

47. The noise characteristic of a communication channel is given as [not reproduced in the source]; rows represent the source and columns represent the receiver. The channel is a [05M04]
    a. Noise-free channel  b. Asymmetric channel  c. Symmetric channel  d. Deterministic channel

48. The source coding efficiency can be increased by [05S01]
    a. using source extension
    b. increasing the entropy of the source
    c. decreasing the entropy of the source
    d. using binary coding

49. The capacity of a channel with infinite bandwidth is [05S02]
    a. infinite because of infinite bandwidth
    b. finite because of increase in noise power
    c. infinite because of infinite noise power
    d. finite because of finite message word length

50. The Hamming weight of the (6, 3) linear block coded word 101011 is [05S03]
    a. 4  b. 3  c. 5  d. 2

51. The cascade of two binary symmetric channels is a [05S04]
    a. symmetric binary channel
    b. asymmetric binary channel
    c. symmetric quaternary channel
    d. asymmetric quaternary channel

52. In a (6, 3) systematic linear block code, the number of 6-bit code words that are not useful is [06D01]
    a. 56  b. 64  c. 8  d. 45

53. In a (6, 3) systematic linear block code, C is a code word of the code and H is the parity check matrix. Then [06D02]
    a. C·Hᵀ = [1]
    b. The syndrome vector S of the received code word is the same as C·Hᵀ
    c. The syndrome vector is S = [1 1 1] under error-free reception
    d. [option garbled in the source]

54. In a binary symmetric channel, a transmitted 0 is received as 0 with a probability of 1/8. Then the transition probability of the transmitted 0 is [06M01]
    a. 1/8  b. 7/8  c. 6/8  d. 5/8
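Question [05S02] rests on the fact that the AWGN capacity C = B·log2(1 + S/(N0·B)) stays finite as B → ∞, approaching (S/N0)·log2 e because the noise power grows with bandwidth. A numerical check (unit signal power and noise PSD are arbitrary choices for illustration):

```python
import math

def capacity(B, S, N0):
    """Shannon capacity of an AWGN channel: C = B * log2(1 + S/(N0*B))."""
    return B * math.log2(1 + S / (N0 * B))

S, N0 = 1.0, 1.0
limit = (S / N0) * math.log2(math.e)   # ≈ 1.4427 bits/s
print(capacity(1e9, S, N0), limit)     # capacity saturates near the finite limit
```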

55. A source transmitting n messages is connected to a noise-free channel. The capacity of the channel is [06M02]
    a. n bits/symbol  b. n² bits/symbol  c. log n bits/symbol  d. 2n bits/symbol

56. There are four binary words given as 0000, 0001, 0011 and 0111. Which of these cannot be a member of the parity check matrix of a (15, 11) linear block code? [06M03]
    a. 0000  b. 0001  c. 0011  d. 0111

57. If X is the transmitter, Y is the receiver and the channel is noise-free, then the mutual information I(X,Y) is equal to [06S01]
    a. Joint entropy of the source and receiver
    b. Entropy of the source
    c. Conditional entropy of the receiver, given the source
    d. Conditional entropy of the source, given the receiver

58. Which of the following is correct? [06S02]
    a. Source coding introduces redundancy
    b. Channel coding is an efficient way of representing the output of a source
    c. ARQ scheme of error control is applied when the receiver is unable to make a decision about the received bit
    d. ARQ scheme of error control is applied after the receiver makes a decision about the received bit

59. Which of the following is an FEC scheme? [06S03]
    a. Shannon-Fano encoding
    b. Huffman encoding
    c. Non-systematic cyclic codes
    d. Duo-binary encoding

60. A discrete source X transmitting m messages is connected to the receiver Y through a symmetric channel. The capacity of the channel is given as [06S04]
    a. log m − H(X/Y) bits/symbol
    b. log m − H(Y/X) bits/symbol
    c. log m bits/symbol
    d. log m − H(X,Y) bits/symbol

61. If the received code word of a (6, 3) linear block code is 100111, with an error in the ... bit [stem garbled in the source], the corresponding error pattern will be [06S05]
    a. 100000  b. 000001  c. 001000  d. 000010

62. For the data word 1110 in a (7, 4) non-systematic cyclic code with the generator polynomial 1 + x + x^3, the code polynomial is [07D01]
    a. 1 + x + x^3 + x^5  b. 1 + x^2 + x^3 + x^5  c. 1 + x^2 + x^3 + x^4  d. 1 + x^4 + x^5

63. The output of a source is band limited to 6 kHz and is sampled at a rate 2 kHz above the Nyquist rate. If the entropy of the source is 2 bits/sample, then the entropy of the source in bits/sec is [07D02]
    a. 24 Kbps  b. 28 Kbps  c. 12 Kbps  d. 32 Kbps

64. When two fair dice are thrown simultaneously, the information content of the message "the sum of the faces is 12" in bits is [07M01]
    a. 4  b. 4.17  c. 5.17  d. 3.58

65. The encoder of a (7, 4) systematic cyclic encoder with generating polynomial g(x) = 1 + x^2 + x^3 is basically a [07M02]
    a. 4-stage shift register
    b. 3-stage shift register
    c. 11-stage shift register
    d. 22-stage shift register
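Questions [07D02] and [07M01] above chain a sampling-rate computation with per-sample entropy, and evaluate a single self-information term. A sketch of both:

```python
import math

# [07D02]: 6 kHz band-limited source, sampled 2 kHz above the Nyquist rate
nyquist = 2 * 6000            # Hz
fs = nyquist + 2000           # → 14 000 samples/s
bits_per_sample = 2
print(fs * bits_per_sample)   # → 28000 bits/s

# [07M01]: P(sum of two fair dice = 12) = 1/36
print(round(-math.log2(1 / 36), 2))  # → 5.17 bits
```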

66. A received code word of a (7, 4) systematic cyclic code, 1000011, is corrected as 1000111. The corresponding error pattern is [07M03]
    a. 0000100  b. 0001000  c. 0000010  d. 0010001

67. The product of 5 and 6 in Modulo-7 multiplication is [07S01]
    a. 9  b. 1  c. 2  d. 3

68. Which of the following can be the generating polynomial for a (7, 4) systematic cyclic code? [07S02]
    a. x^3 + x + 1  b. x^4 + x^3 + 1  c. x^7 + x^4 + x^3 + 1  d. x^5 + x^2 + 1

69. The time-domain behavior of a convolutional encoder of code rate 1/3 is defined in terms of a set of [07S03]
    a. 3 impulse responses
    b. 3 step responses
    c. 3 ramp responses
    d. 3 sinusoidal responses

70. Which of the following is correct? [07S04]
    a. In an (n,k) systematic cyclic code, each code word is the cyclic shift of another code word of the code
    b. In an (n,k) block code, the sum of two code words is another code word of the code
    c. In a convolutional encoder, the constraint length of the encoder is equal to the tail of the message sequence + 1
    d. Source encoding reduces the probability of transmission errors

71. A linear block code with Hamming distance 5 is a [07S05]
    a. Single-error-correcting and double-error-detecting code
    b. Double-error-detecting code
    c. Triple-error-correcting code
    d. Double-error-correcting code

72. The channel capacity of a BSC with transition probability 1/2 is [08D01]
    a. 0 bits  b. 1 bit  c. 2 bits  d. infinity

73. White noise of PSD η/2 W/Hz is applied to an ideal LPF with one-sided bandwidth of 1 Hz. The two-sided output noise power of the channel is [08D02]
    a. four times the input PSD
    b. thrice the input PSD
    c. twice the input PSD
    d. same as the input PSD

74. A convolutional encoder of code rate 1/2 is a 3-stage shift register with a message word length of 6. The code word length obtained from the encoder (in bits) is [08M01]
    a. 9  b. 18  c. 27  d. 36

75. A source X with entropy 2 bits/message is connected to the receiver Y through a noise-free channel. The conditional entropy of the source given the receiver is H(X/Y), and the joint entropy of the source and the receiver is H(X,Y). Then [08M02]
    a. H(X,Y) = 2 bits/message
    b. H(X/Y) = 2 bits/message
    c. H(X,Y) = 0 bits/message
    d. H(X/Y) = 1 bit/message

76. A channel with independent input and output acts as a [08M03]
    a. lossless network
    b. resistive network
    c. channel with maximum capacity
    d. Gaussian channel

77. Automatic Repeat Request is a [08S01]
    a. source coding scheme
    b. error correction scheme
    c. error control scheme
    d. data conversion scheme
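[07S01] and [08D01] are small closed-form checks: arithmetic modulo 7, and the BSC capacity C = 1 − H(p), which vanishes at p = 1/2 because the output then carries no information about the input. A sketch:

```python
import math

# [07S01]: modulo-7 multiplication
print((5 * 6) % 7)   # → 2

def bsc_capacity(p):
    """C = 1 - H(p) bits/use for a binary symmetric channel with transition probability p."""
    if p in (0.0, 1.0):
        return 1.0
    return 1 + p * math.log2(p) + (1 - p) * math.log2(1 - p)

print(bsc_capacity(0.5))  # → 0.0
```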

78. Channel coding [08S02]
    a. avoids redundancy
    b. reduces transmission efficiency
    c. increases signal power level relative to channel noise
    d. results in reduced transmission bandwidth requirement

79. The information content available with a source is referred to as [08S03]
    a. Mutual information  b. Trans-information  c. capacity  d. entropy

80. In a linear block code [08S04]
    a. the received power varies linearly with that of the transmitted power
    b. the encoder satisfies the superposition principle
    c. parity bits of the code word are linear combinations of the message bits
    d. the communication channel is a linear system

81. In Modulo-4 arithmetic, the product of 3 and 2 is [08S05]
    a. 6  b. 3  c. 2  d. 4

82. For the data word 1110 in a (7, 4) non-systematic cyclic code with the generator polynomial 1 + x^2 + x^3, the code polynomial is [09D01]
    a. 1 + x + x^3 + x^5  b. 1 + x^2 + x^3 + x^5  c. 1 + x^2 + x^3 + x^4  d. 1 + x + x^5

83. In a (7, 4) systematic linear block code, the number of 7-bit code words that are not useful for the user is [09D02]
    a. 16  b. 112  c. 128  d. 96

84. Which of the following is a valid source coding scheme for a source transmitting four messages? [09M01]
    [candidate code word sets garbled in the source]

85. A system has a bandwidth of 4 kHz and an S/N ratio of 28 at the input to the receiver. If the bandwidth of the channel is doubled, then [09M02]
    a. Capacity of the channel gets doubled
    b. Capacity of the channel gets squared
    c. S/N ratio at the input of the receiver gets halved
    d. S/N ratio at the input of the receiver gets doubled

86. The memory length of a convolutional encoder is 4. If a 5-bit message sequence is applied as the input to the encoder, then for the last message bit to come out of the encoder, the number of extra zeros to be applied to the encoder is [09M03]
    a. 3  b. 4  c. 5  d. 6

87. In Modulo-5 multiplication, the product of 4 and 3 is [09S01]
    a. 12  b. 7  c. 2  d. 3

88. Which of the following provides minimum redundancy in coding? [09S02]
    a. (6,3) linear block code
    b. (15,11) linear block code
    c. (6,3) systematic cyclic code
    d. Convolutional code

89. If C is the channel capacity, S is the signal input power of the channel and η/2 is the input noise PSD, then which of the following is Shannon's limit? [09S03]
    [options not reproduced in the source]
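Question [09M02] distinguishes two readings of "bandwidth doubled": with the S/N ratio held fixed, the Shannon-Hartley capacity C = B·log2(1 + S/N) doubles; with a fixed noise PSD, doubling B instead halves the S/N at the receiver input. A check of the fixed-S/N case:

```python
import math

def shannon_capacity(B, snr):
    """Shannon-Hartley capacity in bits/s: C = B * log2(1 + S/N)."""
    return B * math.log2(1 + snr)

c1 = shannon_capacity(4000, 28)   # 4 kHz, S/N = 28  → 4000 * log2(29)
c2 = shannon_capacity(8000, 28)   # bandwidth doubled, S/N held fixed
print(c2 / c1)  # → 2.0
```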

90. A communication channel is fed with an input signal x(t), and the noise in the channel is additive. The power received at the receiver input is [09S04]
    a. Signal power − Noise power
    b. Signal power / Noise power
    c. Signal power + Noise power
    d. Signal power × Noise power

91. The fundamental limit on the average number of bits/source symbol is [09S05]
    a. Information content of a message
    b. Entropy of the source
    c. Mutual information
    d. Channel capacity

92. The parity check matrix of a (6, 3) systematic linear block code is [not reproduced in the source]. If the syndrome vector computed for the received code word is [0 1 0], then for error correction, which bit of the received code word is to be complemented? [10D01]
    a. 2  b. 3  c. 4  d. 5

93. White noise of PSD η/2 is applied to an ideal LPF with one-sided bandwidth of B Hz. The filter provides a gain of 2. If the output power of the filter is 8η, then the value of B in Hz is [10D02]
    a. 2  b. 4  c. 6  d. 8

94. The memory length of a convolutional encoder is 5. If a 6-bit message sequence is applied as the input to the encoder, then for the last message bit to come out of the encoder, the number of extra zeros to be applied to the encoder is [10M01]
    [options garbled in the source]

95. A source is transmitting four messages with equal probability. Then [10M02]
    a. variable-length coding schemes should necessarily be used for optimum source coding efficiency
    b. variable-length coding schemes need not necessarily be used
    c. fixed-length coding schemes should not be used
    d. convolutional codes should be used

96. Which of the following is a valid source coding scheme for a source transmitting five messages? [10M03]
    [candidate code word sets garbled in the source]

97. In Modulo-7 addition, 6 + 4 is equal to [10S01]
    a. 10  b. 2  c. 3  d. 5

98. Which of the following provides minimum redundancy in coding? [10S02]
    a. (15,11) linear block code
    b. Convolutional code
    c. (6,3) systematic cyclic code
    d. (6,3) linear block code

99. Which of the following involves the effect of the communication channel? [10S03]
    a. Information content of a message
    b. Entropy of the source
    c. Mutual information
    d. information rate of the source

100. Which of the following can be the generating polynomial for a (7, 4) systematic cyclic code? [10S04]
    a. x^3 + x^2 + 1  b. x^4 + x^3 + 1  c. x^7 + x^4 + x^3 + 1  d. x^5 + x^2 + 1

101. Which of the following provides the facility to recognize the error at the receiver? [10S05]
    [options garbled in the source]

102. Which of the following coding schemes is linear? [11D01]
    [candidate code sets garbled in the source]

103. If the transition probability of messages 0 and 1 in a communication system is 0.2, the noise matrix of the corresponding communication channel is [11D02]
    [matrix options not reproduced in the source]

104. In a BSC, the rate of information transmission over the channel decreases as [11M01]
    a. transmission probability approaches 0.5
    b. transition probability approaches 1
    c. transmission probability approaches 1
    d. transition probability approaches 0.5

105. A source X is connected to a receiver R through a lossless channel. Then [11M02]
    a. H(Y/X) = 0
    b. H(X,Y) = 0
    c. H(X) = I(X,Y)
    d. H(X/Y) = I(X,Y)

106. Which of the following is a valid source coding scheme for a source transmitting four messages? [11M03]
    [candidate code word sets garbled in the source]

107. The Hamming distance of a triple-error-correcting code is [11S01]
    a. 5  b. 6  c. 7  d. 8

108. A channel whose input is xi and output is yj is deterministic if [11S02]
    a. H(Y/X) = 0
    b. H(X,Y) = 0
    c. H(X/Y) = I(X,Y)
    d. H(X) = I(X,Y)

109. If a memoryless source of information rate R is connected to a channel with channel capacity C, then on which of the following statements is the channel coding for the output of the source based? [11S03]
    a. R must be exactly equal to C
    b. R must be greater than or equal to C
    c. R must be less than or equal to C
    d. Minimum number of bits required to encode the output of the source is its entropy

110. Which of the following is correct? [11S04]
    a. Source coding reduces transmission efficiency
    b. Channel coding improves transmission efficiency
    c. Entropy of a source is a measure of uncertainty of its output
    d. Cyclic code is an ARQ scheme of error control

111. The minimum source code word length of the message of a source is equal to [11S05]
    a. its entropy measured in bits/sec
    b. its entropy measured in bits/message
    c. the channel capacity
    d. the sampling rate required for the source

112. If the transition probability of messages 0 and 1 in a communication system is 0.2, the noise matrix of the corresponding communication channel is [12D01]
    [matrix options not reproduced in the source]
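The "valid source coding scheme" items ([09M01], [10M03], [11M03], …) ask whether a candidate set of code words is decodable without ambiguity. A prefix-freeness check — the usual sufficient condition for instantaneous decodability — sketched in Python (the example code sets are illustrative, not the exam's options):

```python
def is_prefix_free(code):
    """True if no code word is a prefix of another (instantaneously decodable).
    Sorting lexicographically puts any prefix immediately before a word it prefixes,
    so checking adjacent pairs suffices."""
    words = sorted(code)
    return not any(b.startswith(a) for a, b in zip(words, words[1:]))

print(is_prefix_free(["0", "10", "110", "111"]))  # → True
print(is_prefix_free(["0", "01", "011", "111"]))  # → False: "0" is a prefix of "01"
```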

113. In a BSC, if the transition probability of the messages 0 and 1 is P, and if they are of equal transmission probability, then the probability of these symbols appearing at the channel output is [12D02]
    a. 1/2, 1/2
    b. P, 1 − P
    c. 1 − P, P
    d. P, P

114. The number of bits to be used by the efficient source encoder to encode the output of the source is equal to [12M01]
    [options garbled in the source]

115. A source X is connected to a receiver R through a deterministic channel. Then [12M02]
    a. H(X/Y) = 0
    b. H(Y/X) = 0
    c. H(X) = I(X,Y)
    d. H(X/Y) = I(X,Y)

116. Which of the following can be a valid source coding scheme for a source transmitting 3 messages? [12M03]
    [candidate code word sets garbled in the source]

117. For an (n,k) cyclic code, g(x) is the generator polynomial, E(x) is the error polynomial, R(x) is the received code polynomial and C(x) is the transmitted code polynomial. The syndrome polynomial S(x) is [12S01]
    a. Remainder of C(x)/g(x)
    b. Remainder of E(x)/g(x)
    c. E(x)·g(x)
    d. R(x) + g(x)

118. If η/2 is the input noise PSD and S is the input signal power for a communication channel of capacity C, then which of the following is Shannon's limit? [12S02]
    [options not reproduced in the source]

119. The Hamming distance of an error-correcting code capable of correcting 4 errors is [12S03]
    a. 8  b. 9  c. 7  d. 6

120. BCH codes capable of correcting a single error are [12S04]
    a. Systematic linear block codes
    b. Cyclic Hamming codes
    c. Convolutional codes
    d. Non-systematic linear block codes

121. Which of the following provides the facility to recognize the error at the receiver? [12S05]
    a. Parity check codes
    b. ARQ
    c. FEC
    d. differential encoding

122. The output of a source is a continuous random variable uniformly distributed over (0, 2). The entropy of the source in bits/sample is [13D01]
    a. 2  b. 1  c. 0.5  d. 0.1

123. An AWGN low-pass channel with 4 kHz bandwidth is fed with white noise of PSD 10... W/Hz [value garbled in the source]. The noise power at the output of the channel is [13D02]
    a. 4 nW  b. 2 nW  c. 6 nW  d. 8 nW
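Items such as [11S01], [12S03] and [07S05] all use the relation between minimum Hamming distance and correction capability, d = 2t + 1. A sketch:

```python
def min_distance_for(t_correct):
    """Minimum Hamming distance needed to correct t errors: d = 2t + 1."""
    return 2 * t_correct + 1

def errors_corrected(d):
    """Number of errors a code with minimum distance d can correct."""
    return (d - 1) // 2

print(min_distance_for(3))   # → 7  (triple-error-correcting code)
print(min_distance_for(4))   # → 9
print(errors_corrected(5))   # → 2  (distance-5 code corrects double errors)
```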

124. A system has a bandwidth of 3 kHz and an S/N ratio of 29 dB at the input of the receiver. If the bandwidth of the channel gets doubled, then [13M01]
    a. its capacity gets doubled
    b. its capacity gets halved
    c. the corresponding S/N ratio gets doubled
    d. the corresponding S/N ratio gets halved

125. A source is transmitting two symbols A and B with probabilities 7/8 and 1/8 respectively. The average source code word length can be decreased by [13M02]
    a. reducing transmission probability
    b. increasing transmission probability
    c. using a noise-free channel
    d. using pair coding

126. Non-uniqueness of Huffman encoding results in [13M03]
    a. different coding efficiencies
    b. different average code word lengths
    c. different entropies
    d. different sets of source code words

127. Shannon's limit is for [13S01]
    a. maximum entropy of a source
    b. maximum information rate of a source
    c. maximum information content of a message
    d. maximum capacity of a communication channel under infinite bandwidth

128. In Modulo-6 addition, the sum of 1 and 5 is [13S02]
    a. 4  b. 1  c. 2  d. 0

129. FEC and ARQ schemes of error control can be applied for the outputs of [13S03]
    a. Binary symmetric channel only
    b. Binary erasure channel only
    c. Binary symmetric channel and binary erasure channel respectively
    d. Binary erasure channel and binary symmetric channel respectively

130. The Hamming distance of an (n,k) systematic cyclic code is [13S04]
    a. the weight of any non-zero code word
    b. the weight of the code word consisting of all 1's
    c. the weight of the code word consisting of alternate 1's and 0's
    d. the minimum of the weights of all non-zero code words of the code

131. Which of the following is affected by the communication channel? [13S05]
    a. Information content of a message
    b. Entropy of the source
    c. information rate of the source
    d. Mutual information

132. The maximum average amount of information content, measured in bits/sec, associated with the output of a discrete information source transmitting 8 messages at 2000 messages/sec is [14D01]
    a. 6 Kbps  b. 3 Kbps  c. 16 Kbps  d. 4 Kbps

133. Which of the following coding schemes is linear? [14D02]
    [candidate code sets garbled in the source]

134. A communication source is connected to a receiver through a communication channel such that the uncertainty about the transmitted symbol at the receiver, after knowing the received symbol, is zero. Then the information gained by the observer at the receiver is [14M01]
    a. same as the entropy of the source
    b. same as the entropy of the receiver
    c. same as the conditional entropy of the source, given the receiver
    d. same as the joint entropy of the source and the receiver

135. x(t) and n(t) are the signal and the noise, each band limited to 2B Hz, applied to a communication channel band limited to B Hz. The minimum number of samples/sec that should be transmitted to recover the input of the channel at its output is [14M02]
    a. 2B  b. 4B  c. 6B  d. [option garbled in the source]
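Items like [14D02] ask whether a code set is linear, i.e. closed under modulo-2 addition (which also forces the all-zero word into the code). A direct check — the example sets are illustrative, since the exam's option sets are garbled:

```python
def is_linear(code):
    """A binary block code is linear iff it is closed under bitwise XOR."""
    words = {int(w, 2) for w in code}
    return all((a ^ b) in words for a in words for b in words)

print(is_linear(["000", "011", "101", "110"]))  # → True  (even-weight (3,2) code)
print(is_linear(["111", "110", "101"]))         # → False (no all-zero word)
```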

The voice frequency modulating signal of a PCM system is quantized into 16 levels. If the signal is band limited to 3 KHz, the minimum symbol rate of the system is [15M01]
a. 48 kilosymbols/sec
b. 6 kilosymbols/sec
c. 96 kilosymbols/sec
d. 3 kilosymbols/sec

To have 100 % transmission efficiency, the average source code word length of the messages of the source should be [15M02]
a. same as the entropy of the source
b. same as the entropy of the receiver
c. same as the conditional entropy of the source
d. same as the joint entropy of the source and the receiver

A source is transmitting four messages with probabilities 1/2, 1/4, 1/8 and 1/8. Then [15M03]
a. Two different source code word sets can be obtained using Shannon-Fano coding
b. Two different source code word sets can be obtained using Huffman coding
c. Channel coding will reduce the average source code word length
d. Source coding improves the error performance of the communication system

A convolutional encoder has a constraint length of 4, and for each input bit a two-bit word is the output of the encoder. If the input message is of length 5, the exact code rate of the encoder is [15D01]

In a message conveyed through a sequence of independent dots and dashes, the probability of occurrence of a dash is one third of that of a dot. The information content of the word with two dashes, in bits, is [15D02]

A source is transmitting only one message. Then
a. the reception of the message conveys maximum information to the user
b. the reception of the message conveys zero information to the user
c. the message received will be corrupted by noise

The average source code word length per bit can be decreased by [15S01]
a. increasing the entropy of the source
b. extending the order of the source
c. increasing the transmitted power

Trade-off between bandwidth and signal to noise ratio results in [15S02]
a. Shannon's limit
b. the concept of transmitting the given information using various combinations of signal power and bandwidth
c. minimum redundancy

Binary erasure channel is an example of [15S03]
a. a noise free channel
b. a wideband channel
c. a narrowband channel
d. a symmetric channel

In a symmetric channel, [15S04]
a. transmission errors will be less
b. rows and columns of the corresponding channel matrix are identical, except for permutation
c. required transmission power will be less
d. rows and columns of the corresponding channel matrix are identical on permutation basis

Which of the following is correct? [15S05]
a. Mutual information of a communication system is same as the entropy of the source
b. Mutual information of a communication system is the information gained by the observer
c. Mutual information is independent of the channel characteristics
d. Mutual information of a communication system is same as the entropy of the receiver
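The source-coding items in this block can be sanity-checked with a few lines of Python (my own working, base-2 entropy assumed):

```python
import math

# [15M03]: the probabilities 1/2, 1/4, 1/8, 1/8 are all powers of two,
# so the Huffman average code word length equals the entropy exactly.
probs = [0.5, 0.25, 0.125, 0.125]
entropy = -sum(p * math.log2(p) for p in probs)   # 1.75 bits/message

# [15M01]: a 3 kHz band-limited signal must be sampled at the Nyquist
# rate 2 * 3000 = 6000 samples/sec; with one 16-level symbol per sample
# the minimum symbol rate is 6 kilosymbols/sec (option b).
f_m = 3000
symbol_rate = 2 * f_m
```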

In a communication system, the symbol transmission rate is 1000 symbols/sec and, due to noise in the channel, an average of one symbol in each 100 received is incorrect. The number of symbols received in error per second is [16M01]
a. 1
b. 10
c. 100
d. 20

Which of the following is a valid source coding scheme for a source transmitting six messages? [16M02]

The encoder of a (15,11) systematic cyclic code requires [16M03]
a. 4 bit shift register and 4 Modulo-2 adders
b. 3 bit shift register and 4 Modulo-2 adders
c. 4 bit shift register and 3 Modulo-2 adders
d. 3 bit shift register and 3 Modulo-2 adders

A source generates three symbols with probabilities of 0.25, 0.25 and 0.5 at a rate of 3000 symbols/sec. Assuming independent generation of symbols, the most efficient source encoder would have an average bit rate of [16D01]
a. 6000 bps
b. 4500 bps
c. 3000 bps
d. 1500 bps

A memoryless source emits 2000 binary symbols/sec, and each symbol has a probability of 0.25 to be equal to 1 and 0.75 to be equal to 0. The minimum number of bits/sec required for error free transmission of this source is [16D02]
a. 1500
b. 1622
c. 1734
d. 1885

The distance between any code word and an all zero code word of an (n,k) linear block code is referred to as [16S01]
a. Hamming distance of the code
b. Code rate of the code
c. Redundancy of the code
d. Hamming weight of the code word

As per the source coding theorem, it is not possible to find any uniquely decodable code whose average length is [16S02]
a. less than the entropy of the source
b. greater than the entropy of the source
c. equal to the number of messages from the source
d. equal to the efficiency of transmission of the source

The coding efficiency due to second order extension of a source [16S03]
a. is more
b. is less
c. remains unaltered
d. can not be computed

Exchange between bandwidth and signal to noise ratio can be justified based on [16S04]
a. Shannon's limit
b. Shannon's source coding theorem
c. Hartley-Shannon law
d. Shannon's channel coding theorem

A source X is connected to a receiver Y through a noise free channel. Its capacity is [16S05]
a. Maximum of H(X)
b. Maximum of H(X/Y)
c. Maximum of H(Y/X)
d. Maximum of H(X,Y)
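The two data-rate items above reduce to entropy calculations; a short Python sketch of my own working (base-2 logs assumed):

```python
import math

def entropy(ps):
    """Entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in ps)

# [16D01]: probabilities 0.25, 0.25, 0.5 at 3000 symbols/sec.
# H = 1.5 bits/symbol, so the best source encoder averages 4500 bps
# (option b).
rate_16d01 = 3000 * entropy([0.25, 0.25, 0.5])

# [16D02]: binary source with P(1) = 0.25 at 2000 symbols/sec.
# H ~ 0.8113 bits/symbol -> ~1623 bps, i.e. option b (1622) to the
# precision used by the paper.
rate_16d02 = 2000 * entropy([0.25, 0.75])
```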

A communication channel is so noisy that the output Y of the channel is statistically independent of the input X. Then H(X/Y) is equal to [17S04]
a. H(X,Y)
b. H(X)
c. H(Y/X)

Which of the following is correct? [17S01]
a. A noise free channel is not a deterministic channel
b. A noise free channel is of infinite capacity
c. The channel matrix of a noise free channel is an Identity matrix
d. Noise power in the channel will not effect the signal power

In a communication system, information lost in the channel is measured using [17S02]
a. H(X/Y)
b. I(X,Y)

Capacity of a BSC with infinite bandwidth is not infinity, because [17S03]
a. Noise power in the channel and the bandwidth vary linearly
b. Noise power in the channel varies inversely with bandwidth
c. Noise power in the channel is independent of bandwidth

For a noise free channel, [17M01]
a. I(X,Y) = H(X)
b. H(X,Y) = H(X) + H(Y)
c. H(X/Y) = 1
d. H(X/Y) = H(Y/X)

A transmitting terminal has 128 characters and the data sent from the terminal consist of independent sequences of equiprobable characters. The entropy of the terminal in bits/character is [17M02]
a. 10
b. 7
c. 14

For a (7,4) systematic cyclic code with generator polynomial g(x) = 1 + x + x3, the syndrome vector corresponding to the error pattern 0 0 0 0 0 1 0 is [17M03]
a. 100
b. 010
c. 011
d. 111

A zero memory source emits two messages A and B with probabilities of 0.8 and 0.2 respectively. The entropy of the second order extension of the source is [17D01]
a. 1.56 bits/message
b. 1.72 bits/message
c. 1.78 bits/message
d. 1.44 bits/message

A signal amplitude X is a uniform random variable in the range (-1, 1). Its differential entropy is [17D02]
a. 2 bits/sample
b. 4 bits/sample
c. 1 bit/sample
d. 3 bits/sample

The output of a continuous source is a Gaussian random variable with variance σ2 and is band limited to fm Hz. The maximum entropy of the source is [17S05]
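The two numerical 17-series items can be reproduced in a few lines (a sketch of my own working, not an official key):

```python
import math

# [17D01]: H(0.8, 0.2) ~ 0.7219 bits/message; the second order
# extension of a memoryless source doubles the entropy, giving
# 2H ~ 1.44 bits/message.
h1 = -(0.8 * math.log2(0.8) + 0.2 * math.log2(0.2))
h2 = 2 * h1

# [17D02]: the differential entropy of a uniform amplitude on (a, b)
# is log2(b - a); for (-1, 1) this is log2(2) = 1 bit/sample (option c).
h_uniform = math.log2(1 - (-1))
```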

In a communication system, the average amounts of uncertainty associated with the source, the sink, and the source and sink jointly are 0.5, 1.0613 and 1.432 bits/message respectively. Then the information transferred by the channel connecting the source and sink, in bits, is [18M03]
a. 0.1293
b. 0.9933
c. 0.945
d. 0.8707

The output of a continuous source is a uniform random variable on (0,1). Then [18M01]
a. the absolute entropy of the source is zero
b. the relative entropy of the source is zero
c. the output of the source is a Gaussian random variable
d. the source is a discrete memoryless source

The generator sequence of an adder in a convolutional encoder is (1,1,0,1). It is its response for an input sequence of [18M02]

If the generator polynomial of a (7,4) non-systematic cyclic code is g(x) = 1 + x + x3, then the binary word corresponding to x2.g(x) + g(x) is [18D01]
a. 1101110
b. 1101001
c. 1010101
d. 1010111

A (7,4) systematic cyclic code has a generator polynomial g(x) = 1 + x + x3 and a code polynomial V(x) = x + …. Then the remainder of the division V(x)/g(x) is [18D02]

The efficiency of transmission of information can be measured by [18S01]
a. comparing the actual rate of transmission and the maximum limit for the rate of transmission of information over the channel
b. comparing the entropy of the source and the information content of each individual message of the source
c. comparing the entropy of the source and the maximum limit for the rate of transmission of information over the channel
d. comparing the maximum limit for the rate of transmission of information over the channel and the conditional entropy of the receiver

Binary Erasure channel is the mathematical modeling of [18S02]
a. the effect of channel noise resulting in the incorrect decision of the transmitted message bit
b. the inability of the receiver to make a decision about the received message bit in the background of noise
c. the error correction mechanism at the receiving side
d. the error detection mechanism at the receiving side

In a communication system, in which of the following matrices is the sum of each row equal to one? [18S03]
a. Joint probability matrix
b. Channel matrix
c. Conditional probability matrix of the source, given the receiver
d. Generator matrix

If T is the code vector and H is the parity check matrix of a linear block code, then the code is defined by the set of all code vectors for which [18S04]
a. H.HT = 0
b. T.HT = 0
c. HT.T = 0

Which of the following is correct? [18S05]
a. The entropy measure of a continuous source is not an absolute measure
b. A Binary symmetric channel is a noise free channel
c. The channel capacity of a symmetric channel is always 1 bit/symbol
d. Self information and mutual information are one and the same

A BSC has a transition probability of P. The cascade of two such channels is [19D01]
a. an asymmetric channel with transition probability 2P
b. a symmetric channel with transition probability P2
c. a symmetric channel with transition probability 2P(1 - P)
d. an asymmetric channel with transition probability P(1 - P)

A source is transmitting four messages with probabilities of 0.5, 0.25, 0.125 and 0.125. By using Huffman coding, the percentage reduction in the average source code word length is [19D02]
a. 10 %
b. 20 %
c. 12.5 %
d. 25 %

The parity polynomial in the generation of a systematic (7,4) cyclic code for the data word 1 1 0 0 is 1 + x2. The corresponding code word is [19M01]
a. 1 1 0 0 1 0 1
b. 1 1 1 0 0 1 0
c. 1 1 1 0 1 0 0
d. 1 1 0 1 0 1 0
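For the numerical items in this block the arithmetic is short enough to verify directly. Note that the 0.5 / 1.0613 / 1.432 triple in [18M03] is my reconstruction of garbled figures, chosen because it reproduces option a exactly:

```python
# [18M03]: I(X;Y) = H(X) + H(Y) - H(X,Y).
mutual_info = 0.5 + 1.0613 - 1.432        # ~ 0.1293 bits (option a)

# [19D02]: probabilities 0.5, 0.25, 0.125, 0.125.
# Fixed-length coding of 4 messages needs 2 bits/message; Huffman code
# word lengths are 1, 2, 3, 3, giving an average of 1.75 bits.
avg_huffman = 0.5 * 1 + 0.25 * 2 + 0.125 * 3 + 0.125 * 3
reduction_percent = (2 - avg_huffman) / 2 * 100    # 12.5 % (option c)
```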

b.4 b. Then. weight of b. 0.Its Hamming distance is [20D01] 187. and a transmitted bit is identified as 1 respectively. then [19S05] maximum information is conveyed over the channel no information is transmitted over the channel no errors will occur during transmission information loss is zero .. then the average information content of the pair (X. b.11 Get more @ www. www. 190. 4 A source X with symbol rate of 1000 symbols/sec is connected to a receiver Y using a BSC with transition probability P.UandiStar. c. 0. P(X = i/Y=j) measures [19S03] uncertainty about the received bit based on the transmitted certainty about the received bit based on the transmitted certainty about the transmitted bit based on the received uncertainty about the transmitted bit based on the received X and Y are related in one-to-one manner. 0.01. a transmitted bit unidentified. then.The columns of the matrix represent the probability that a transmitted bit is identified as 0.11 00. Convolutional code of code rate 1/2 If X is the transmitted message and Y is the received message.3 d. Tips/Tricks. average information of X b. d. The Hamming distance of the code vectors Ci and Cj is [20M01] a. 0. ) is [19S02] 185. X and Y are the transmitter and the receiver. Then.0001 1. The minimum number of parity bits required for the single error correcting linear block code for 11 data bits is [20M02] a. c. b. 0. c. d. If a. rate of information transmission over the channel in bits per sec is [20M03] free SMS ON<space>UandIStar to 9870807070 for JNTU Alerts.00. d. 6 d. d. If a.a. c. b.6 c. minimum of the weights of Ci and Cj weight of sum of the weights of Ci and Cj 193. c. mutual information of (X. 186. information X2 with out knowing X1.4) systematic Linear Block code c. The Parity check matrix of a linear block code is a. 194.3) systematic cyclic code b. the average information of X after Y is known d.001. 188.. the average information of Y after X is known c. (7. 
JOB Alerts & more.5 the output of the channel is independent of the input.. c. information X2 to some one who knows X1. the probability that the bit is not identified is [20D02] a. in a BSC.1.01. d. m being the number of messages with source 0 0. If a. Techology News.Y) is equal to the average information of Y plus [19S01] a. A source with equally likely outputs is connected to a communication channel with channel matrix .00.org 100 % . The messages of the source are equally likely.Y). (6. 3 b. H(X/Y) in bits is [19S04] 1 log m.UandiStar. The entropy H( a.011. c.. (15. b. 189. b. 4 3 6 5 191.0001 0.org Which of the following is a single error correcting perfect code? [19M03] a. 5 c.11) Hamming code d. Mutual information of X1 and X2 effect of noise in receiving X1 as X2. d..2 192.01. d. 184.
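Two of the items above reduce to one-line computations; sketched below as my own working, not an official key:

```python
# [20M02]: a single-error-correcting code with k = 11 data bits needs
# the smallest m satisfying the Hamming bound 2**m >= k + m + 1,
# which turns out to be 4 parity bits.
k = 11
m = next(m for m in range(1, 10) if 2 ** m >= k + m + 1)

# [19D01]: two cascaded BSCs with crossover probability P flip a bit
# overall only when exactly one stage flips it, i.e. with probability
# 2P(1-P) -- again a symmetric channel (option c). Example with P = 0.1:
P = 0.1
p_cascade = 2 * P * (1 - P)               # 0.18
```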

Entropy coding is a [20S01]
a. variable length coding scheme
b. fixed length coding scheme
c. channel coding scheme
d. differential coding scheme

Which of the following is correct? [20S02]
a. Mutual information is symmetric about the transmitted and received pairs
b. Binary Erasure channel is a symmetric channel
c. The channel matrix gives the joint probabilities of the transmitted and received pairs
d. The channel capacity of a noise free channel is zero

For a BSC with transition probability P, the bit error probability is [20S03]
a. P
b. 1 - P
c. 2P
d. 2(1 - P)

A (4,3) parity check code can [20S04]
a. correct all single error patterns
b. detect all double error patterns
c. detect all triple error patterns
d. correct all single error patterns and detect all double error patterns

A source of information rate 80 Kbps is connected to a communication channel of capacity 66.6 Kbps. Then [20S05]
a. error free transmission is not possible
b. channel coding results in error free transmission
c. source coding will make the errors corrected at the receiver
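The last item is a direct application of Shannon's channel coding theorem; a minimal sketch:

```python
# [20S05]: error free transmission is possible only when the source
# information rate does not exceed the channel capacity; here 80 > 66.6,
# so it is not possible (option a).
source_rate_kbps = 80.0
capacity_kbps = 66.6
error_free_possible = source_rate_kbps <= capacity_kbps
```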