Huffman coding uses a specific method for choosing the representation for each symbol, resulting in a prefix code (sometimes called a "prefix-free code": the bit string representing any particular symbol is never a prefix of the bit string representing any other symbol) that expresses the most common source symbols using shorter strings of bits than are used for less common source symbols. Huffman was able to design the most efficient compression method of this type: no other mapping of individual source symbols to unique strings of bits will produce a smaller average output size when the actual symbol frequencies agree with those used to create the code.
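The prefix property is what makes greedy, symbol-by-symbol decoding unambiguous: as soon as the bits read so far match a codeword, that symbol can be emitted. A minimal Python sketch (the three-symbol code {a: 0, b: 10, c: 11} is a hypothetical example chosen for illustration, not one from the notes):

```python
def is_prefix_free(code):
    """Return True if no codeword is a prefix of a different codeword."""
    words = list(code.values())
    return not any(
        u != v and v.startswith(u) for u in words for v in words
    )

def decode(bits, code):
    """Greedily decode a bit string using a prefix-free code."""
    inverse = {v: k for k, v in code.items()}
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in inverse:      # a codeword can end here and nowhere else
            out.append(inverse[buf])
            buf = ""
    return "".join(out)

code = {"a": "0", "b": "10", "c": "11"}
assert is_prefix_free(code)
print(decode("010110", code))   # -> abca
```

Because the code is prefix-free, the decoder never has to look ahead or backtrack; each codeword boundary is recognized the moment it is reached.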


3 Huffman Coding


Two Requirements for Optimum Prefix Codes

1. Likely symbols → short codewords; unlikely symbols → long codewords <recall the entropy discussion>
2. The two least likely symbols have codewords of the same length

Why #2? Suppose the two least likely symbols, ai and aj, had codewords of different lengths. The longer codeword's extra bits are unique to it (due to the prefix property), so we could remove them and still have a prefix code… and a lower average code length.

Additional Huffman Requirement

3. The two least likely symbols have codewords that differ only in the last bit

These three requirements lead to a simple way of building a binary tree describing an optimum prefix code: THE Huffman code.

• Build it from the bottom up, starting w/ the two least likely symbols
• The external (leaf) nodes correspond to the symbols
• The internal nodes correspond to "super symbols" in a "reduced" alphabet

Huffman Design Steps

1. Label each node w/ one of the source symbol probabilities
2. Merge the nodes labeled by the two smallest probabilities into a parent node
3. Label the parent node w/ the sum of the two children's probabilities. This parent node is now considered to be a "super symbol" (it replaces its two children symbols) in a reduced alphabet
4. Among the elements in the reduced alphabet, merge the two with the smallest probabilities. If there is more than one such pair, choose the pair that has the "lowest order super symbol" (this assures the minimum-variance Huffman code; see book)
5. Label the parent node w/ the sum of the two children's probabilities
6. Repeat steps 4 & 5 until only a single super symbol remains
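The steps above can be sketched with a binary heap holding the current reduced alphabet; each pop-pop-push is one merge of the two smallest probabilities. This is a generic Python sketch, not code from the book, and it does not implement the minimum-variance tie-break of step 4 (it breaks ties by insertion order instead):

```python
import heapq
import itertools

def huffman_code(probs):
    """probs: dict symbol -> probability. Returns dict symbol -> bit string."""
    counter = itertools.count()   # tie-breaker so heap tuples always compare
    heap = [(p, next(counter), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, code0 = heapq.heappop(heap)   # the two least likely
        p1, _, code1 = heapq.heappop(heap)   # "super symbols"
        # Prepending 0/1 here means the two merged subtrees end up with
        # codewords that differ only in the later (deeper) bits.
        for s in code0:
            code0[s] = "0" + code0[s]
        for s in code1:
            code1[s] = "1" + code1[s]
        # The merged node is a "super symbol" with the summed probability.
        heapq.heappush(heap, (p0 + p1, next(counter), {**code0, **code1}))
    return heap[0][2]

code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
print(sorted(code.items()))
```

For this dyadic example the codeword lengths come out 1, 2, 3, 3, so the average length equals the entropy (1.75 bits/symbol).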

Example of Huffman Design Steps

[Tree figure: steps 1–6 applied to an eight-symbol source; branches labeled 0 and 1, internal "super symbol" nodes labeled with the summed probabilities 0.05, 0.10, 0.20, 0.25, 0.35, 0.60.]

Reading the codewords off the tree:

Probability   Codeword
0.40          0
0.15          100
0.15          110
0.10          1010
0.10          1011
0.05          1110
0.04          11110
0.01          11111
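A quick sanity check of the example's codeword table in Python (a sketch: the pairing of the 0.40 symbol with the one-bit codeword 0 is inferred from the tree sums rather than read directly off the slide):

```python
from math import log2

table = [  # (probability, codeword)
    (0.40, "0"), (0.15, "100"), (0.15, "110"), (0.10, "1010"),
    (0.10, "1011"), (0.05, "1110"), (0.04, "11110"), (0.01, "11111"),
]

# No codeword is a prefix of another -> a uniquely decodable prefix code.
words = [w for _, w in table]
assert not any(u != v and v.startswith(u) for u in words for v in words)

avg_len = sum(p * len(w) for p, w in table)      # expected bits/symbol
entropy = -sum(p * log2(p) for p, _ in table)    # lower bound H(S)
print(f"avg length = {avg_len:.2f}, entropy = {entropy:.2f}")
# -> avg length = 2.55, entropy = 2.48
```

The average length (2.55 bits/symbol) sits just above the entropy (about 2.48 bits/symbol), as the bounds on the next slide require.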

Performance of Huffman Codes

Skip the details; state the results. How close to the entropy H(S) can Huffman get?

Result #1: If all symbol probabilities are powers of two, then l̄ = H1(S). (The info of each symbol is an integer # of bits.)

Result #2: H1(S) ≤ l̄ < H1(S) + 1, where l̄ − H1(S) = redundancy.

Result #3 (refined upper bound):

    l̄ < H1(S) + Pmax,           if Pmax ≥ 0.5
    l̄ < H1(S) + Pmax + 0.086,   if Pmax < 0.5

Note: Large alphabets tend to have small Pmax → Huffman bound better.
Small alphabets tend to have large Pmax → Huffman bound worse.
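Results #1 and #3 are easy to check numerically. A Python sketch (generic code, not from the book; the two test sources are hypothetical):

```python
import heapq
from math import log2

def huffman_avg_length(probs):
    """Average Huffman codeword length via repeated merges of the two
    smallest probabilities (id lists break ties so heap tuples compare)."""
    heap = [(p, [i]) for i, p in enumerate(probs)]
    lengths = [0] * len(probs)
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, ids0 = heapq.heappop(heap)
        p1, ids1 = heapq.heappop(heap)
        for i in ids0 + ids1:        # every leaf under the merge gains a bit
            lengths[i] += 1
        heapq.heappush(heap, (p0 + p1, ids0 + ids1))
    return sum(p * l for p, l in zip(probs, lengths))

dyadic = [0.5, 0.25, 0.125, 0.125]   # all probabilities are powers of two
H = -sum(p * log2(p) for p in dyadic)
assert huffman_avg_length(dyadic) == H == 1.75    # Result #1: zero redundancy

p2 = [0.4, 0.3, 0.2, 0.1]            # non-dyadic source, Pmax = 0.4 < 0.5
H2 = -sum(p * log2(p) for p in p2)
assert H2 <= huffman_avg_length(p2) < H2 + 0.4 + 0.086   # Result #3 bound
```

For the non-dyadic source the average length is 1.9 bits/symbol against an entropy of about 1.846, comfortably inside the refined bound.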

Applications of Huffman Codes

Lossless image compression examples:
– Directly: 1.14 ≤ CR ≤ 1.67
– Differences: 1.66 ≤ CR ≤ 2.65
Not that great!

Text compression example (applied to Ch. 3): Not that great!

Lossless audio compression examples (directly and on differences): Not that great!

So… why have we looked at something so bad?
– Provides a good intro to compression ideas
– Historical result & context
– Huffman is often used as a building block in more advanced methods:
  • Group 3 FAX (lossless)
  • JPEG image (lossy)
  • Etc…

Block Huffman Codes (or "Extended" Huffman Codes)

• Useful when Huffman is not effective due to large Pmax
• Example: IID source w/ P(a1) = 0.8, P(a2) = 0.02, P(a3) = 0.18
  The book shows that Huffman gives 47% more bits than the entropy!!
• Block codes allow better performance, because they allow a noninteger # of bits/symbol
• Note: assuming IID means that no context can be exploited; if the source is not IID, we can do better by exploiting a context model
• Group into n-symbol blocks → map between the original alphabet and a new "extended" alphabet with m^n elements:

    {a1, a2, …, am} → {(a1 a1 … a1), (a1 a1 … a2), …, (am am … am)}    (n symbols per block)

  Need m^n codewords… use the Huffman procedure on the probabilities of the blocks, which are determined using IID:

    P(ai aj … ap) = P(ai) P(aj) … P(ap)
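The gain from blocking can be reproduced directly on the slide's source. A Python sketch comparing bits/symbol for block sizes n = 1 and n = 2 against the entropy H(S) (generic code, not the book's):

```python
import heapq
from itertools import product
from math import log2

def huffman_avg_length(probs):
    """Average Huffman codeword length via repeated smallest-pair merges."""
    heap = [(p, [i]) for i, p in enumerate(probs)]
    lengths = [0] * len(probs)
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, ids0 = heapq.heappop(heap)
        p1, ids1 = heapq.heappop(heap)
        for i in ids0 + ids1:
            lengths[i] += 1
        heapq.heappush(heap, (p0 + p1, ids0 + ids1))
    return sum(p * l for p, l in zip(probs, lengths))

p = [0.8, 0.02, 0.18]
H = -sum(q * log2(q) for q in p)           # ≈ 0.816 bits/symbol

r1 = huffman_avg_length(p)                 # n = 1: 1.2 bits/symbol
# n = 2: the extended alphabet has 3**2 = 9 blocks; IID => probs multiply
pairs = [q1 * q2 for q1, q2 in product(p, p)]
r2 = huffman_avg_length(pairs) / 2         # bits per original symbol

print(f"H = {H:.3f}, n=1 rate = {r1:.3f}, n=2 rate = {r2:.3f}")
# -> H = 0.816, n=1 rate = 1.200, n=2 rate = 0.861
```

This reproduces the slide's 47% figure, since (1.200 − 0.816)/0.816 ≈ 0.47, and shows that blocks of just two symbols cut the rate from 1.2 to about 0.861 bits/symbol.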

Performance of Block Huffman Codes

• Let S(n) denote the block source (with the scalar source IID), R(n) denote the rate of the block Huffman code (bits/block), and H(S(n)) be the entropy of the block source.
• Then, using the bounds discussed earlier (# of bits per n symbols):

    H(S(n)) ≤ R(n) < H(S(n)) + 1

  With R = R(n)/n (# of bits/symbol):

    H(S(n))/n ≤ R < H(S(n))/n + 1/n

• Now, how is H(S(n)) related to H(S)? See p. 53 of the 3rd edition, which uses independence & properties of the log. After much math manipulation we get

    H(S(n)) = H(S) + H(S) + … + H(S) = nH(S)

  Makes sense:
  – Info is additive for independent sequences
  – Independence → no "shared" info between sequence symbols
  – Each symbol in the block gives H(S) bits of info
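The additivity H(S(n)) = nH(S) can be verified numerically for any IID source; a short Python check (the three-symbol probabilities are a hypothetical example):

```python
from itertools import product
from math import log2, prod, isclose

p = [0.5, 0.3, 0.2]                  # hypothetical IID symbol probabilities
H = -sum(q * log2(q) for q in p)     # scalar entropy H(S)
for n in (2, 3):
    # Under IID, block probabilities multiply; entropy of the block source:
    Hn = -sum(prod(c) * log2(prod(c)) for c in product(p, repeat=n))
    assert isclose(Hn, n * H)        # H(S(n)) = n H(S)
print(round(H, 3))                   # -> 1.485
```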

Final Result for Huffman Block Codes w/ IID Source

    H(S) ≤ R < H(S) + 1/n

(n = 1 is the case of "ordinary" single-symbol Huffman we looked at earlier.)

• As blocks get larger, the upper bound tightens: H(S) + 1, H(S) + 1/2, H(S) + 1/4, H(S) + 1/8, … and the rate approaches H(S).
• Thus, longer blocks lead to the "Holy Grail" of compressing down to the entropy… BUT the # of codewords grows exponentially (m^n). Impractical!!!
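The shrinking bound can be watched numerically. A Python sketch for a hypothetical binary IID source with P(0) = 0.9, computing the block-Huffman rate for n = 1…6 and checking it stays inside [H(S), H(S) + 1/n):

```python
import heapq
from itertools import product
from math import log2, prod

def huffman_avg_length(probs):
    """Average Huffman codeword length via repeated smallest-pair merges."""
    heap = [(p, [i]) for i, p in enumerate(probs)]
    lengths = [0] * len(probs)
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, ids0 = heapq.heappop(heap)
        p1, ids1 = heapq.heappop(heap)
        for i in ids0 + ids1:
            lengths[i] += 1
        heapq.heappush(heap, (p0 + p1, ids0 + ids1))
    return sum(p * l for p, l in zip(probs, lengths))

p = [0.9, 0.1]                       # hypothetical binary IID source
H = -sum(q * log2(q) for q in p)     # ≈ 0.469 bits/symbol
rates = {}
for n in range(1, 7):                # note: 2**n codewords needed per n!
    block_probs = [prod(c) for c in product(p, repeat=n)]
    rates[n] = huffman_avg_length(block_probs) / n     # bits/symbol
    assert H - 1e-12 <= rates[n] < H + 1.0 / n         # the bound above
print({n: round(r, 3) for n, r in rates.items()})
```

For n = 1 the rate is stuck at 1.0 bit/symbol (every codeword is at least one bit), while n = 2 already drops it to 0.645; the exponential growth in table size is exactly the impracticality the slide warns about.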
