USN: [ | | | | | | | | | ]                                              15EC54

Fifth Semester B.E. Degree Examination, Dec.2017/Jan.2018
Information Theory and Coding

Time: 3 hrs.                                                 Max. Marks: 80

Note: Answer FIVE full questions, choosing one full question from each module.

Module-1
1  a. Derive an expression for the average information content of symbols in a long
      independent sequence.                                          (03 Marks)
   b. For the Markov source shown below, find:
      i)   The stationary distribution
      ii)  The state entropies
      iii) The source entropy
      iv)  G1 and G2, and show that G1 >= G2 >= H(S).                (10 Marks)
      [State diagram of the Markov source: figure not recoverable from the scan]
   c. Define self-information, entropy and information rate.         (03 Marks)

OR
2  a. Mention the different properties of entropy and prove the extremal property.
                                                                     (07 Marks)
   b. A source emits one of four symbols S0, S1, S2 and S3 with probabilities
      [probabilities illegible in the scan]. Show that H(S^2) = 2 H(S).
                                                                     (04 Marks)
   c. In the transmission of a picture, there are about 2.25 x 10^6 pixels/frame.
      For a good reproduction at the receiver, 12 brightness levels are necessary.
      Assume all these levels are equally likely to occur. Find the rate of
      information if one picture is transmitted every 3 minutes. Also compute the
      source efficiency.                                             (05 Marks)

Module-2
3  a. A discrete memoryless source has an alphabet of five symbols with the
      probabilities given below:                                     (10 Marks)

      Symbol      | S0   | S1   | S2   | S3  | S4
      Probability | 0.55 | 0.15 | 0.15 | 0.1 | 0.05

      Compute the Huffman code by placing the composite symbol as high as possible
      and by placing the composite symbol as low as possible. Also find:
      i)  The average codeword length
      ii) The variance of the codeword length, for both cases.
   b. Using Shannon-Fano coding, find code words for the probability distribution
      [distribution illegible in the scan]. Find the average code word length and
      the efficiency.                                                (06 Marks)

OR
4  a. Write a short note on the Lempel-Ziv algorithm.
   b. Derive the source coding theorem.
   c. Apply Shannon's encoding algorithm and generate binary codes for the set of
      messages given below. Also find the variance, code efficiency and redundancy.
                                                                     (06 Marks)

      Message     | M1  | M2   | M3   | M4  | M5
      Probability | 1/8 | 1/16 | 3/16 | 1/4 | 3/8
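As a numerical cross-check for Q3(a), the sketch below builds a binary Huffman code for the given probabilities and reports the codeword lengths, the average length, and the variance. It is a minimal illustration rather than the by-hand procedure the question asks for: `heapq`'s tie-breaking fixes one arbitrary placement of composite symbols, so it reproduces only one of the two required placements (both give the same average length, but generally different variances).

```python
import heapq

def huffman_lengths(probs):
    """Return codeword lengths of a binary Huffman code for `probs`.

    Ties between equal-probability entries are broken by heapq's
    ordering, i.e. one arbitrary placement of composite symbols.
    """
    # Each heap entry: (probability, insertion counter, member symbol indices)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:          # every merge adds one bit to its members
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [0.55, 0.15, 0.15, 0.10, 0.05]
L = huffman_lengths(probs)
avg = sum(p * l for p, l in zip(probs, L))
var = sum(p * (l - avg) ** 2 for p, l in zip(probs, L))
print("lengths:", L)            # [1, 3, 3, 3, 3] for this tie-breaking
print("average:", round(avg, 3), "variance:", round(var, 3))
```

For these probabilities this placement yields lengths (1, 3, 3, 3, 3), average length 1.9 bits/symbol and variance 0.99; the alternative placement gives the same average with a different variance, which is exactly the contrast Q3(a) asks you to exhibit.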
Module-3
5  a. Find the capacity of the discrete channel whose noise matrix is    (04 Marks)

               | 0.8  0.2  0   |
      P(Y|X) = | 0.1  0.8  0.1 |
               | 0    0.2  0.8 |

   b. Define mutual information. List the properties of mutual information and
      prove that I(X; Y) = H(X) + H(Y) - H(X, Y) bits/symbol.        (06 Marks)
   c. A channel has the following characteristics:

               | 1/3  1/3  1/6  1/6 |
      P(Y|X) = | 1/6  1/6  1/3  1/3 |

      with P(X1) = P(X2) = 1/2. Find H(X), H(Y), H(X, Y) and the channel capacity
      if r = 1000 symbols/sec.                                       (06 Marks)

OR
6  a. A binary symmetric channel has the following noise matrix, with source
      probabilities P(X1) = 2/3 and P(X2) = 1/3:

               | 3/4  1/4 |
      P(Y|X) = | 1/4  3/4 |

      i)   Determine H(X), H(Y), H(X, Y), H(Y|X), H(X|Y) and I(X; Y).
      ii)  Find the channel capacity.
      iii) Find the channel efficiency and redundancy.               (08 Marks)
   b. Derive an expression for the efficiency of a Binary Erasure Channel.
                                                                     (05 Marks)
   c. Write a note on differential entropy.                          (03 Marks)

Module-4
7  a. For a systematic (6, 3) linear block code generated by the parity equations
      C4 = d1 (+) d3,  C5 = d1 (+) d2 (+) d3,  C6 = d1 (+) d2:
      i)   Find all possible code vectors.
      ii)  Draw the encoder circuit and the syndrome circuit.
      iii) Detect and correct the code word if the received code word is 110010.
      iv)  Find the Hamming weight of all code vectors, the minimum Hamming
           distance, and the error detecting and correcting capability.
                                                                     (14 Marks)
   b. Define the following:
      i)  Block code and convolutional code
      ii) Systematic and non-systematic code.                        (02 Marks)

OR
8  a. A linear (7, 4) Hamming code is described by the generator polynomial
      g(x) = 1 + x + x^3. Determine the generator matrix and the parity check
      matrix.                                                        (03 Marks)
   b. The generator polynomial for a (15, 7) cyclic code is
      g(x) = 1 + x^4 + x^6 + x^7 + x^8.
      i)  Find the code vector for the message D(x) = x^2 + x^3 + x^4 using the
          cyclic encoder circuit.
      ii) Draw the syndrome calculation circuit and find the syndrome of the
          received polynomial
          Z(x) = 1 + x + x^3 + x^6 + x^8 + x^9 + x^11 + x^14.        (13 Marks)
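The codeword enumeration asked for in Q7(a) can be sketched numerically. The parity equations are only partially legible in the scan, so the ones used here (C4 = d1 (+) d3, C5 = d1 (+) d2 (+) d3, C6 = d1 (+) d2) are an assumption; the procedure is identical for any systematic parity assignment.

```python
from itertools import product

def encode(d1, d2, d3):
    """Systematic (6,3) encoding: message bits followed by parity bits.

    Parity equations are assumed (reconstructed from a garbled scan):
      C4 = d1 ^ d3,  C5 = d1 ^ d2 ^ d3,  C6 = d1 ^ d2
    """
    return (d1, d2, d3, d1 ^ d3, d1 ^ d2 ^ d3, d1 ^ d2)

# All 2^3 = 8 code vectors
codewords = [encode(*m) for m in product((0, 1), repeat=3)]
weights = [sum(c) for c in codewords]

# For a linear code, d_min equals the minimum nonzero codeword weight.
d_min = min(w for w in weights if w > 0)
print("codewords:", codewords)
print("weights:", weights, " d_min:", d_min)
print("detects up to", d_min - 1, "errors; corrects up to", (d_min - 1) // 2)
```

With these assumed equations the code has minimum distance 3, so it detects up to 2 errors and corrects 1, which is the pattern of answer Q7(a)(iv) expects.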
Module-5
9  a. Consider the (3, 1, 2) convolutional code with g1 = 110, g2 = 101, g3 = 111.
                                                                     (12 Marks)
      i)   Draw the encoder block diagram.
      ii)  Find the generator matrix.
      iii) Find the code word corresponding to the information sequence 11101
           using the time-domain and the transform-domain approaches.
   b. Write a short note on BCH codes.                               (04 Marks)

OR
10    For a (2, 1, 3) convolutional encoder with g1 = 1011, g2 = 1101:
                                                                     (16 Marks)
   a. Draw the state diagram.
   b. Draw the code tree.
   c. Draw the trellis diagram and find the code word for the message 1 1 0 1.
   d. Using the Viterbi decoding algorithm, decode the obtained code word if the
      first bit is erroneous.
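The time-domain approach of Q9(a)(iii) is a mod-2 convolution of the information sequence with each generator sequence, followed by interleaving of the output streams. The sketch below uses the generators and message as read from the paper; given the quality of the scan, treat those values as assumptions.

```python
def conv_encode(msg, gens):
    """Time-domain convolutional encoding.

    Each output stream is the mod-2 convolution of `msg` with one
    generator sequence; the streams are then interleaved, emitting one
    bit per generator for every output time instant.
    """
    n_out = len(msg) + max(len(g) for g in gens) - 1
    streams = []
    for g in gens:
        s = [0] * n_out
        for i, m in enumerate(msg):
            for j, gbit in enumerate(g):
                s[i + j] ^= m & gbit       # mod-2 accumulate
        streams.append(s)
    # Interleave: v = (v1_0 v2_0 v3_0, v1_1 v2_1 v3_1, ...)
    return [streams[k][t] for t in range(n_out) for k in range(len(gens))]

# Values as read from Q9 (assumed correct despite the garbled scan)
msg = [1, 1, 1, 0, 1]
gens = [[1, 1, 0], [1, 0, 1], [1, 1, 1]]
code = conv_encode(msg, gens)
print(''.join(map(str, code)))
```

The same routine handles Q10's (2, 1, 3) encoder by passing `gens = [[1, 0, 1, 1], [1, 1, 0, 1]]`, which makes it a convenient check against the trellis and transform-domain answers worked by hand.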
