
Analog & Digital Communication

KEE058
Unit - 5
Course Coordinator Mr. Vijit Srivastava
Department of Electronics & Communication
UCER Allahabad
Course content
Hartley-Shannon law
Huffman coding
Shannon-Fano coding
Hartley-Shannon law
The Shannon-Hartley Theorem describes the
maximum amount of error-free digital data that can be
transmitted over a communications channel with a
specified bandwidth in the presence of noise.
The theorem establishes Shannon's channel capacity
for such a communication link: a bound on the
maximum rate of error-free digital data (that is,
information) that can be transmitted with a specified
bandwidth in the presence of noise interference.
C = W log2(1 + S/N)
where
C is the channel capacity in bits per second (bps),
W is the bandwidth of the channel in Hz, and
S/N is the signal-to-noise power ratio (SNR), used as a linear power ratio in the formula.
SNR is generally quoted in dB.
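As a quick numerical illustration of this formula, the short Python sketch below converts an SNR quoted in dB to a linear power ratio and evaluates the capacity. The bandwidth and SNR figures in the example call are assumed values for illustration, not taken from the slides.

import math

def channel_capacity(bandwidth_hz, snr_db):
    # Shannon-Hartley capacity C = W * log2(1 + S/N), in bits per second.
    snr_linear = 10 ** (snr_db / 10)          # convert SNR from dB to a linear power ratio
    return bandwidth_hz * math.log2(1 + snr_linear)

# Assumed example: a 3.1 kHz telephone channel at 30 dB SNR -> roughly 30.9 kbps.
print(channel_capacity(3100, 30))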
Huffman Coding
Huffman coding is a simple and systematic way to design
good variable-length codes.
Huffman coding is a lossless data compression algorithm.
The most frequent symbol is assigned the shortest codeword and the
least frequent symbol the longest codeword.
Huffman coding is used in image compression (JPEG), video
compression (MPEG), and in the codes used by fax machines.
The Huffman coding algorithm can be summarized as
follows:
To construct a Huffman code, first sort the symbol probabilities pi
in decreasing order.
Repeatedly join two nodes with the smallest probabilities
to form a new node with the sum of the probabilities just
joined.
Assign a 0 to one branch and a 1 to the other branch.
The codeword for each symbol is the sequence of 0's and 1's read
along the path from the root to that symbol.
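A minimal Python sketch of this procedure is given below. The function name, the use of a heap, and the dictionary input format are illustrative choices, not part of the slides.

import heapq
import itertools

def huffman_code(probabilities):
    # Build a Huffman code for {symbol: probability}; returns {symbol: codeword}.
    counter = itertools.count()                      # tie-breaker for equal probabilities
    heap = [(p, next(counter), {s: ""}) for s, p in probabilities.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, codes0 = heapq.heappop(heap)          # the two least probable nodes
        p1, _, codes1 = heapq.heappop(heap)
        for s in codes0:                             # one branch gets a leading '0'
            codes0[s] = "0" + codes0[s]
        for s in codes1:                             # the other branch gets a leading '1'
            codes1[s] = "1" + codes1[s]
        heapq.heappush(heap, (p0 + p1, next(counter), {**codes0, **codes1}))
    return heap[0][2]

# Example 1 below as input: prints {'s1': '0', 's2': '10', 's3': '110', 's4': '111'}.
print(huffman_code({"s1": 0.5, "s2": 0.25, "s3": 0.125, "s4": 0.125}))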
Example: 1.
Consider a source with symbols s1, s2, s3, s4 with
probabilities 1/2, 1/4, 1/8, 1/8, respectively. Construct the
Huffman code and calculate average length and efficiency.
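Worked outline (one possible codeword assignment; Huffman trees are not unique, but the codeword lengths are):
s1 = 0, s2 = 10, s3 = 110, s4 = 111.
Average length L = 1(1/2) + 2(1/4) + 3(1/8) + 3(1/8) = 1.75 bits/symbol.
Entropy H = (1/2)log2(2) + (1/4)log2(4) + (1/8)log2(8) + (1/8)log2(8) = 1.75 bits/symbol.
Efficiency = H/L = 100%, since every probability is an exact power of 1/2.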
Shannon-Fano coding
The Shannon-Fano algorithm is an entropy-encoding
technique for lossless data compression of multimedia.
In Shannon–Fano coding, the symbols are arranged in
order from most probable to least probable and then
divided into two sets whose total probabilities are as close
as possible to being equal.
All symbols then have the first digits of their codes
assigned; symbols in the first set receive "0" and symbols
in the second set receive "1". The same splitting is then applied
recursively to each set, appending a further digit at every split,
until each set contains a single symbol.
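A minimal recursive Python sketch of this splitting procedure is given below. The function name, the list-of-pairs input format, and the tie-breaking rule for equally balanced splits are illustrative assumptions.

def shannon_fano(symbols):
    # symbols: list of (symbol, probability) pairs, sorted from most to least probable.
    # Returns {symbol: codeword}.
    if len(symbols) == 1:
        return {symbols[0][0]: ""}                 # single symbol: nothing more to append
    total = sum(p for _, p in symbols)
    running, split, best_diff = 0.0, 1, float("inf")
    for i in range(1, len(symbols)):               # find the most nearly equal split point
        running += symbols[i - 1][1]
        diff = abs(running - (total - running))
        if diff < best_diff:
            best_diff, split = diff, i
    codes = {}
    for s, c in shannon_fano(symbols[:split]).items():
        codes[s] = "0" + c                         # first set gets '0'
    for s, c in shannon_fano(symbols[split:]).items():
        codes[s] = "1" + c                         # second set gets '1'
    return codes

# Example 1 below as input: prints {'s1': '0', 's2': '10', 's3': '110', 's4': '111'}.
print(shannon_fano([("s1", 0.5), ("s2", 0.25), ("s3", 0.125), ("s4", 0.125)]))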
Example: 1.
Consider a source with symbols s1, s2, s3, s4 with
probabilities 1/2, 1/4, 1/8, 1/8, respectively. Construct the
Shannon-Fano code and calculate the average length and efficiency.
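Worked outline: the splits fall exactly on the probability boundaries, first {s1} | {s2, s3, s4} (0.5 vs 0.5), then {s2} | {s3, s4} (0.25 vs 0.25), then {s3} | {s4}. This gives s1 = 0, s2 = 10, s3 = 110, s4 = 111, the same codeword lengths as the Huffman code for this source, so L = 1.75 bits/symbol, H = 1.75 bits/symbol, and the efficiency is again 100%.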
Example: 2
The geometric source of information A generates the
symbols {A0, A1, A2, A3} with the corresponding
probabilities {0.4, 0.2, 0.2, 0.2}. Encode the source
symbols using a Shannon-Fano encoder.
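Worked outline (the equally balanced first split is not unique here, so the codewords depend on the convention used): after sorting, the probabilities are 0.4, 0.2, 0.2, 0.2, and both {A0} | {A1, A2, A3} and {A0, A1} | {A2, A3} give a 0.4/0.6 division. The first choice (the one the sketch above would take) yields A0 = 0, A1 = 10, A2 = 110, A3 = 111; the second yields the fixed-length code A0 = 00, A1 = 01, A2 = 10, A3 = 11. Either way the average length is L = 2.0 bits/symbol, the entropy is H ≈ 1.922 bits/symbol, and the efficiency is about 96.1%.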
Thank you
