
1) Consider an 8-faced die whose faces are labeled with the letters A to H. The probabilities of the faces are: A (1/2), B (1/4), C (1/8), D (1/16), E (1/32), F (1/64), G (1/128) and H (1/128).

a) Find the Shannon-Fano and Huffman encodings for the symbols emitted by this source.

b) Compute the entropy of the source and compare it with the average codeword length obtained in a), determining the coding efficiency of the Shannon-Fano and Huffman techniques.
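For dyadic probabilities like these, Huffman codewords reach the entropy bound exactly. A minimal Python sketch of the quantities the exercise asks for (entropy, Huffman codeword lengths, efficiency), demonstrated on a small toy distribution rather than the die above; the helper names are my own:

```python
import heapq
from math import log2

def entropy(probs):
    """Shannon entropy H(X) in bits per symbol."""
    return -sum(p * log2(p) for p in probs.values() if p > 0)

def huffman_lengths(probs):
    """Codeword length of each symbol under the Huffman algorithm."""
    lengths = {s: 0 for s in probs}
    # heap items: (group probability, tiebreak, symbols in the group)
    heap = [(p, i, [s]) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        p1, _, g1 = heapq.heappop(heap)
        p2, _, g2 = heapq.heappop(heap)
        for s in g1 + g2:
            lengths[s] += 1  # every merge adds one bit to each member
        heapq.heappush(heap, (p1 + p2, tie, g1 + g2))
        tie += 1
    return lengths

# Toy distribution (not the die above): dyadic, so efficiency is 100%.
probs = {"A": 0.5, "B": 0.25, "C": 0.25}
L = huffman_lengths(probs)
avg = sum(probs[s] * L[s] for s in probs)
print(L, avg, entropy(probs) / avg)  # efficiency = H(X) / average length
```

The same two helpers answer part b) for any distribution: compute H(X), compute the average length from the Huffman lengths, and take the ratio.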

2) An unfair die with 5 faces has probability of showing face A and of showing face B. The other three faces C, D and E each have probability .

a) Find the Shannon-Fano coding for the symbols emitted by this source.

b) Compute the entropy of the source and compare it with the average codeword length obtained in a), determining the coding efficiency of the Shannon-Fano and Huffman techniques.

3) Create the Shannon-Fano and Huffman codings for the following set of symbols, then compare the average codeword length with the entropy H(X), determining the code's efficiency:

Symbol              Probability
x1                  0.2
x2                  0.18
x3, x4, x5          0.1 each
x6                  0.061
x7                  0.059
x8, x9, x10, x11    0.04 each
x12                 0.03
x13                 0.01
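The Shannon-Fano construction used in exercises 1-3 sorts symbols by probability and recursively splits them into two groups of nearly equal total probability, prefixing one group with 0 and the other with 1. A sketch following that textbook description, demonstrated on a toy dyadic distribution (the function name is my own):

```python
def shannon_fano(probs):
    """Recursive Shannon-Fano: sort by probability, split into two
    groups of (nearly) equal total probability, prefix 0/1, recurse."""
    items = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)

    def build(group, prefix):
        if len(group) == 1:
            return {group[0][0]: prefix or "0"}
        total = sum(p for _, p in group)
        acc, split, best = 0.0, 1, float("inf")
        # choose the split point that best balances the two halves
        for i in range(1, len(group)):
            acc += group[i - 1][1]
            diff = abs(acc - (total - acc))
            if diff < best:
                best, split = diff, i
        codes = build(group[:split], prefix + "0")
        codes.update(build(group[split:], prefix + "1"))
        return codes

    return build(items, "")

# Toy example, not from the exercise:
print(shannon_fano({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}))
```

For non-dyadic distributions like the table above, different tie-breaking choices at the split can produce slightly different (but equally valid) Shannon-Fano codes.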

4) You want to transmit the following phrase to a receiver: this list is very easy, using the ASCII character set to map characters to 7-bit sequences.

a) How many bits are needed to encode the sequence above?

b) What would this sequence look like after applying the Shannon-Fano coding? What is the average length?

c) What would this sequence look like after applying the Huffman coding? What is the average length?

d) Compute the entropy of the source and the efficiency of the codings found in a), b), and c).

5) You want to store data in DNA molecules, using the 4 possible bases.

a) How many bases are needed to store 1 Terabyte of data?

b) You want to store a file containing only 8 symbols with probabilities: 10%, 20%, 20%, 15%, 15%, 10%, 5% and 5%. Propose the most compact possible self-punctuating coding using the bases A, C, T and G, then determine its coding efficiency.

6) You were asked to compactly encode a tongue-twister. This is the phrase: peter piper picked a peck of pickled peppers. The frequency distribution of each symbol is:

a) One way to encode this sequence would be a fixed-size code, with codewords long enough to distinguish the 14 different symbols. How many bytes would be needed to transmit this 44-character phrase using a fixed-size code?

b) Determine the minimum number of bits required to encode the phrase, assuming that each character is independent of its surrounding characters.

c) What is the theoretical contribution of each of the 14 symbols to the average information?

d) Build a code dictionary using the Huffman algorithm for the 14 symbols.

e) Encode the phrase using the code from item d):

i. How many bits are needed?

ii. How does this number compare with the number of bits needed when using the code obtained in a)?

iii. How does this number compare with the information content of the phrase calculated in item b)?
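Items b) and c) rest on the Shannon information content: a symbol with empirical probability p contributes -log2(p) bits per occurrence. A minimal Python sketch of that lower bound, demonstrated on a different phrase so the exercise is not given away (the helper name min_bits is my own):

```python
from collections import Counter
from math import log2

def min_bits(text):
    """Shannon lower bound on the number of bits needed for `text`,
    assuming i.i.d. symbols with their empirical frequencies."""
    n = len(text)
    # each symbol occurring c times contributes c * log2(n / c) bits
    return sum(c * log2(n / c) for c in Counter(text).values())

# Demonstrated on an unrelated phrase:
phrase = "she sells sea shells"
print(Counter(phrase).most_common(3))
print(min_bits(phrase))
```

Comparing this bound with the Huffman total from item e) shows how close Huffman gets to the theoretical minimum.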

7) Consider a source X with symbols x1, x2, x3, x4 encoded with the following codes:

b) Which of them are uniquely decodable? Justify.

8) Explain the codes below:

9) A source X has four symbols x1, x2, x3 and x4 with p(x1) = , p(x2) = and p(x3) = p(x4) = 1/8. Build the Shannon-Fano code for X. Show that this code has 100% efficiency.

10) Given a source X with m equiprobable symbols xi, i = 1, ..., m, let n be the codeword length in a fixed-size encoding. Show that if n = log2(m), then the code efficiency is 100%.

11) In a DNA string there are 4 kinds of bases: G, T, C and A. What is the information contained in a DNA string of size 10 in the following cases?

a) All bases are equiprobable.

b) The bases G and T are twice as probable as the bases C and A.

12) b) Decode the sequence 0011011111000 using the prefix codes identified in a). Try to repeat the same procedure with the other codes.

13) Consider a die with 8 faces labeled with the characters A to H. Assuming that all faces have equal probability, show whether or not it is possible to create a code more efficient than the fixed-size code in this case.

14) A source X has 5 symbols with the following probabilities: p(x1) = 0.4, p(x2) = 0.19, p(x3) = 0.16, p(x4) = 0.15 and p(x5) = 0.1.

a) Create a Shannon-Fano code for X and compute the code efficiency.

b) Repeat for the Huffman code and compare the results.

15) A source X has 5 equiprobable symbols.

a) Create a Shannon-Fano code for X and compute the code efficiency.

b) Repeat for the Huffman code and compare the results.

16) Consider an unfair die with the following probabilities: 1: 0.05; 6: 0.3; 2 through 5: 0.1625 each.

a) Determine the efficiency of the following codes and compare their efficiencies.

17) Given a discrete memoryless source whose alphabet consists of K equiprobable symbols, what conditions must K and the (fixed) codeword length satisfy so that the efficiency is 100%?

18) Joãozinho claims that it is possible to design a method which produces more efficient codes than Huffman. He created the following iterative method:

- 1st iteration: assign 1-bit codewords, in binary order, to the 2 most probable symbols

- 2nd iteration: assign 2-bit codewords, in binary order, to the next 4 most probable symbols

- 3rd iteration: assign 3-bit codewords, in binary order, to the next 8 most probable symbols

...

- Nth iteration: assign N-bit codewords, in binary order, to the next 2^N most probable symbols

a) Apply Joãozinho's method to the following probabilities of an unfair die:

P(X = 1) = 1/4; P(X = 2) = 1/4; P(X = 3) = 1/4; P(X = 4) = 1/8; P(X = 5) = 1/16; P(X = 6) = 1/16

b) Determine the efficiencies of both Joãozinho's and Huffman's codes for the unfair die of item a). Is there something strange about Joãozinho's code? If so, justify.

19) Code types:

i. singular

ii. non-singular, but not uniquely decodable

iii. uniquely decodable, but non-instantaneous

iv. instantaneous

a) Determine the type of each of the following codes and justify:

a1) Morse code:

a3) A = 010, B = 100, C = 101, D = 100

a4) A = 10, B = 11, C = 00, D = 110

a5) A = 0, B = 011, C = 10, D = 01

b) Why does Morse code need a pause (a longer time interval) between the emission of two characters, and an even longer pause between two words?

20) Alice rolled two fair dice and wrote down the sum of the results. Bob must ask a series of yes/no questions to find this number. Describe a strategy that minimizes the number of questions asked. This strategy must be better than the one obtained in exercise 8 of the previous list.

21) Consider the message 122121213. Assuming that each character is 8-bit ASCII, how would it be transmitted using run-length encoding? What is the compression ratio?

22) Consider this famous phrase from JFK (in Portuguese):

A pergunta de cada um de nós não deve ser o que o país pode fazer por nós; mas sim o que cada um de nós pode fazer pelo país

a) How many bytes does this phrase occupy?

b) What is the frequency of each of these words?

c) Build a static dictionary for this phrase, based on the frequencies from b).

d) How much space does the dictionary occupy?

Does this algorithm guarantee space savings? In what situations does the algorithm result in a file bigger than the original?

24) How can the following sequences be encoded as run-length sequences?

a) AAABBBBBBYYYYPPPPPPPPPTKKKKKKKK

b) 111112223333312222221111111333333333

25) What compression ratio was obtained in a) and b) of the previous exercise?
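Run-length encoding, as used in exercises 21, 24 and 25, replaces each run of identical symbols with the symbol and its count. One common convention (symbol followed by a decimal count; the exercises may expect a different output format) can be sketched as:

```python
from itertools import groupby

def rle(s):
    """Replace each run of identical symbols with symbol + decimal count."""
    # groupby yields one (symbol, run iterator) pair per maximal run
    return "".join(f"{ch}{len(list(run))}" for ch, run in groupby(s))

print(rle("AAAB"))  # → A3B1
```

Note that on sequences of digits, like item b) of exercise 24, a symbol-then-count format is ambiguous to decode, which is part of what these exercises probe.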

26) In what cases is lossy compression recommended, and in what cases lossless? Justify.

27) Create a simple method to quantize an image with 8-bit gray levels into an image with 4-bit gray levels. Can the method be easily extended to color (RGB) images?

28) Given the following messages:

i) AAAAAAAAAAAAAAAAAAAABBBBBBBBBBBBBBBBBBBB

ii) ABABABABABABABABABABABABABABABABABABABAB

Would run-length be a good choice to encode message i)? And message ii)? Justify.

29) Consider an image of the German flag at a resolution of 30×40 pixels in the RGB standard with 1 byte per channel, where the first 10 lines correspond to the color black, the next 10 lines to the color red, and the last 10 lines to the color yellow.

a) What is the size of a file containing this image (in bits and in bytes)? Justify.

b) Consider the run-length compression method in which consecutive pixels of the same color are encoded by a binary pattern followed by a value in brackets [X], where X is the number of consecutive repetitions of that color. Suppose that the brackets are represented by 1 byte each (ASCII characters) and X is an integer represented by 4 bytes. Also, the compressed file must start with a header giving the image resolution (two 2-byte integers). What is the size of the previous file after this compression (in bits and in bytes)? What is the compression ratio relative to the file obtained in a)? Justify.

c) Which compression method would be most adequate to compress the German flag image: run-length or the Huffman code? Justify.
