
Prof. Uf Tureli, Stevens Institute of Technology. Email/tel/fax: utureli@stevens-tech.edu, 201.216.5603/8246. Note: Cover and Thomas, Q. 5.4, 5.8, 5.14, 5.25.

1. Question: Huffman coding. Consider the random variable:

X = (x1, x2, x3, x4, x5, x6, x7) with probabilities (0.49, 0.26, 0.12, 0.04, 0.04, 0.03, 0.02)

(a) Find a binary Huffman code for X. Solution: The Huffman tree for this distribution yields the codewords:

x1 (0.49): 1
x2 (0.26): 00
x3 (0.12): 011
x4 (0.04): 01000
x5 (0.04): 01001
x6 (0.03): 01010
x7 (0.02): 01011

(The merge stages combine 0.02 + 0.03 = 0.05, then 0.04 + 0.04 = 0.08, then 0.05 + 0.08 = 0.13, then 0.12 + 0.13 = 0.25, then 0.25 + 0.26 = 0.51, and finally 0.49 + 0.51 = 1.)

(b) Find the expected codelength for this encoding. Solution: The expected codeword length of the binary Huffman code is E(L) = 0.49(1) + 0.26(2) + 0.12(3) + (0.04 + 0.04 + 0.03 + 0.02)(5) = 2.02 bits.

(c) Find a ternary Huffman code for X. Solution:

x1: 0
x2: 1
x3: 20
x4: 22
x5: 210
x6: 211
x7: 212
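As a cross-check, the binary Huffman construction in (a) can be sketched in a few lines of Python. This is our own helper (the heapq-based merge loop and names are not from the text); it returns only the codeword lengths, which is all the expected-length calculation needs:

```python
import heapq
from itertools import count

def huffman_lengths(probs):
    """Binary Huffman sketch: repeatedly merge the two least-probable
    nodes; every symbol under a merge gets one bit deeper."""
    tie = count()  # tie-breaker so heapq never compares the symbol lists
    heap = [(p, next(tie), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, next(tie), s1 + s2))
    return lengths

probs = [0.49, 0.26, 0.12, 0.04, 0.04, 0.03, 0.02]
L = huffman_lengths(probs)
print(L)                                     # -> [1, 2, 3, 5, 5, 5, 5]
print(sum(p * l for p, l in zip(probs, L)))  # -> 2.02 bits
```

The lengths match the codewords above (1, 00, 011, and four 5-bit words), and the expected length reproduces the 2.02 bits of part (b).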

2. Question: Simple optimum compression of a Markov source. Consider the 3-state Markov process having the transition matrix in Table 1. Thus, the probability that S1 follows S3 is equal to zero. Design 3 codes C1, C2, C3 (one for each state S1, S2, S3), each code mapping elements of the set of Si's into sequences of 0s and 1s, such that this Markov process can be sent with maximal compression by the following scheme: (a) Note the present symbol Si. (b) Select code Ci.

Un-1 \ Un    S1     S2     S3
S1           1/2    1/4    1/4
S2           1/4    1/2    1/4
S3           0      1/2    1/2

Table 1: Transition matrix

Un-1 \ Un    S1     S2     S3
C1 (S1)      0      10     11
C2 (S2)      10     0      11
C3 (S3)      -      0      1

Table 2: Code (C3 needs no codeword for S1, since S1 never follows S3)

(c) Note the next symbol Sj and send the codeword in Ci corresponding to Sj. (d) Repeat for the next symbol.

Solution: A possible solution is shown in Table 2. The average message length of the next symbol, conditioned on the previous state being Si, is just the expected length of the code Ci. Note that this code assignment achieves the conditional entropy lower bound. To find the unconditional average, we have to find the stationary distribution on the states. Let π be the stationary distribution. Then:

π = π P,  where  P = [ 1/2  1/4  1/4 ;  1/4  1/2  1/4 ;  0  1/2  1/2 ]

We can solve this to find that π = (2/9, 4/9, 1/3). Thus the unconditional average number of bits per source symbol is:

E(L) = Σ_{i=1..3} π_i E(L|C_i) = (2/9)(1.5) + (4/9)(1.5) + (1/3)(1) = 4/3 bits/symbol
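The stationary distribution and the average length can be verified numerically. The sketch below is our own (plain Python; the variable names are not from the text): it finds π by power iteration, recomputes E(L) from the codeword lengths in Table 2, and also evaluates the entropy rate for comparison:

```python
import math

# Transition matrix P[i][j] = Pr(next = Sj | current = Si)
P = [[0.5,  0.25, 0.25],
     [0.25, 0.5,  0.25],
     [0.0,  0.5,  0.5]]

# Stationary distribution by power iteration: pi <- pi P until it settles.
pi = [1/3, 1/3, 1/3]
for _ in range(200):
    pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]
print(pi)  # ~ [2/9, 4/9, 1/3]

# Codeword lengths under C1, C2, C3 (C3 never encodes S1; 0 is a placeholder
# there, harmless because P[2][0] = 0).
length = [[1, 2, 2],
          [2, 1, 2],
          [0, 1, 1]]
EL = sum(pi[i] * sum(P[i][j] * length[i][j] for j in range(3))
         for i in range(3))
print(EL)  # ~ 4/3 bits/symbol

# Entropy rate H(X2|X1) = sum_i pi_i * H(row i of P)
H = -sum(pi[i] * p * math.log2(p) for i in range(3) for p in P[i] if p > 0)
print(H)   # ~ 4/3 bits/symbol, equal to EL
```

That E(L) and H come out equal is exactly the point of the problem: each code Ci is matched to the conditional distribution of the next state.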

The entropy rate H of the Markov chain is:

H = H(X2|X1) = Σ_{i=1..3} π_i H(X2|X1 = S_i) = (2/9)(1.5) + (4/9)(1.5) + (1/3)(1) = 4/3 bits/symbol

Thus the unconditional average number of bits per source symbol and the entropy rate H of the Markov chain are equal, because the expected length of each code Ci equals the entropy of the state following state i, H(X2|X1 = Si), and thus maximal compression is obtained.

3. Question: Huffman Code. Find (a) a binary Huffman code and (b) a ternary Huffman code for the random variable X with the probabilities p = (1/21, 2/21, 3/21, 4/21, 5/21, 6/21).

(a) Solution: The Huffman tree for this distribution yields the codewords:

x1 (6/21): 00
x2 (5/21): 10
x3 (4/21): 11
x4 (3/21): 010
x5 (2/21): 0110
x6 (1/21): 0111

(Merge stages: 1/21 + 2/21 = 3/21; 3/21 + 3/21 = 6/21; 4/21 + 5/21 = 9/21; 6/21 + 6/21 = 12/21; 9/21 + 12/21 = 1.)

(b) Solution: Adding one dummy symbol of probability 0 (so that every merge combines exactly three nodes) gives the ternary codewords:

x1 (6/21): 0
x2 (5/21): 1
x3 (4/21): 20
x4 (3/21): 21
x5 (2/21): 220
x6 (1/21): 221
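The ternary case generalizes: a D-ary Huffman code first pads the alphabet with zero-probability dummy symbols until m - 1 is a multiple of D - 1, so every merge combines exactly D nodes. A sketch of that procedure (our own helper, not from the text), using exact fractions so the expected length comes out as 34/21:

```python
import heapq
from itertools import count
from fractions import Fraction as F

def dary_huffman_lengths(probs, D=3):
    """D-ary Huffman sketch: pad with zero-probability dummies so that
    m-1 is a multiple of D-1, then merge D lowest-probability nodes
    at a time; symbols under a merge get one digit deeper."""
    m = len(probs)
    n_dummy = (-(m - 1)) % (D - 1)
    tie = count()  # tie-breaker so heapq never compares the symbol lists
    heap = [(p, next(tie), [i]) for i, p in enumerate(probs)]
    heap += [(F(0), next(tie), []) for _ in range(n_dummy)]  # dummies carry no symbol
    heapq.heapify(heap)
    lengths = [0] * m
    while len(heap) > 1:
        total, syms = F(0), []
        for _ in range(D):
            p, _, s = heapq.heappop(heap)
            total += p
            syms += s
        for i in syms:
            lengths[i] += 1
        heapq.heappush(heap, (total, next(tie), syms))
    return lengths

probs = [F(k, 21) for k in (6, 5, 4, 3, 2, 1)]
L = dary_huffman_lengths(probs)
print(L)                                     # -> [1, 1, 2, 2, 3, 3]
print(sum(p * l for p, l in zip(probs, L)))  # -> 34/21 ternary symbols
```

The lengths (1, 1, 2, 2, 3, 3) agree with the codewords above, and the expected length matches the 34/21 used in part (c).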

(c) Calculate L = Σ p_i l_i in each case. Solution: The expected length of the codewords for the binary Huffman code is 51/21 = 2.43 bits. The ternary code has an expected length of 34/21 = 1.62 ternary symbols.

4. Question: Shannon Code. Consider the following method for generating a code for a random variable X which takes on m values {1, 2, ..., m} with probabilities p1, p2, ..., pm. Assume that the probabilities are ordered such that p1 ≥ p2 ≥ ... ≥ pm. Define F_i = Σ_{k=1}^{i-1} p_k, the sum of the probabilities of all symbols less than i. Then the codeword for i is the number F_i ∈ [0, 1] rounded off to l_i bits, where l_i = ⌈log(1/p_i)⌉. (a) Show that the code constructed by this process is prefix-free and that the average length satisfies H(X) ≤ L < H(X) + 1.

Solution: The Shannon code has l_i = ⌈log(1/p_i)⌉, so that log(1/p_i) ≤ l_i < log(1/p_i) + 1. Multiplying by p_i and summing over i gives H(X) ≤ L = Σ p_i l_i < H(X) + 1.

To prove that the code is a prefix code, note that by the choice of l_i we have 2^(-l_i) ≤ p_i < 2^(-(l_i - 1)). Thus F_j, j > i, differs from F_i by at least 2^(-l_i), and will therefore differ from F_i in at least one place in the first l_i bits of the binary expansion of F_i. Thus the codeword for F_j, j > i, which has length l_j ≥ l_i, differs from the codeword for F_i at least once in the first l_i places. Thus no codeword is a prefix of any other codeword.

(b) Construct the code for the probability distribution (0.5, 0.25, 0.125, 0.125). Solution: We build the following table:

Symbol   Probability   F_i (decimal)   F_i (binary)   l_i   Codeword
1        0.5           0.0             0.0            1     0
2        0.25          0.5             0.10           2     10
3        0.125         0.75            0.110          3     110
4        0.125         0.875           0.111          3     111

The Shannon code in this case achieves the entropy bound 1.75 bits and is optimal.
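The construction in (b) is mechanical enough to script. A minimal sketch, assuming exactly the rounding rule above (the function name is ours): sort the probabilities in decreasing order, accumulate F_i, and take the first l_i bits of its binary expansion:

```python
from math import ceil, log2

def shannon_code(probs):
    """Shannon code sketch: codeword i is F_i (the cumulative probability
    of the preceding symbols) truncated to l_i = ceil(log2(1/p_i)) bits."""
    probs = sorted(probs, reverse=True)
    codewords, F = [], 0.0
    for p in probs:
        l = ceil(log2(1 / p))
        # k-th bit of the binary expansion of F, for k = 1 .. l
        bits = ''.join(str(int(F * 2 ** (k + 1)) % 2) for k in range(l))
        codewords.append(bits)
        F += p
    return codewords

print(shannon_code([0.5, 0.25, 0.125, 0.125]))  # -> ['0', '10', '110', '111']
```

For this dyadic distribution the floats are exact, and the output reproduces the table above; for general probabilities an exact-arithmetic version (e.g. with fractions) would be safer.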
