
Analog & Digital Communication

KEE058
Unit - 5
Course Coordinator: Mr. Vijit Srivastava
Department of Electronics & Communication
UCER Allahabad
Course content
Information Theory
Entropy
Information Rate
Channel Capacity
Information Theory
Information is the message produced by the source of a
communication system, whether it is analog or digital.
Information theory is a mathematical approach to the
study of the coding of information, along with its
quantification, storage, and communication.
In a communication system, information is
transmitted from a source to a destination.
Entropy
Entropy can be defined as a measure of the average
information content per source symbol.
Claude Shannon, the "father of Information
Theory", provided a formula for it as:

H = p1 log2(1/p1) + p2 log2(1/p2) + ⋯ + pk log2(1/pk)

where pi is the probability of occurrence of the ith symbol.

The amount of uncertainty remaining about the channel
input after observing the channel output is called
the Conditional Entropy.
It is denoted by H(x/y).

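As a quick numerical check (not part of the original slides), the entropy formula can be evaluated in a short Python sketch; the four-symbol probabilities below are hypothetical example values.

import math

def entropy(probs):
    # H = sum of p_i * log2(1/p_i) over all source symbols, in bits per symbol
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Hypothetical four-symbol source
print(entropy([0.5, 0.25, 0.125, 0.125]))  # -> 1.75 bits per symbol
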

The average length formula:

Avg Length = p1 Length(c1) + p2 Length(c2) + ⋯ + pk Length(ck)

where pi is the probability of the ith character and
Length(ci) represents the length of the encoding for ci.
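
A similar sketch evaluates the average length formula for a hypothetical prefix code (codeword lengths of 1, 2, 3, and 3 bits, chosen purely for illustration):

def avg_length(probs, lengths):
    # Avg Length = sum of p_i * Length(c_i), in bits per symbol
    return sum(p * n for p, n in zip(probs, lengths))

print(avg_length([0.5, 0.25, 0.125, 0.125], [1, 2, 3, 3]))  # -> 1.75 bits

For this source the average length equals the entropy computed above, which is the lower bound for any uniquely decodable code.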
Information Rate
The information rate is given by the equation:

R = rH

Here R is the information rate,
H is the entropy (average information), and
r is the rate at which messages are generated.
The information rate R is expressed as the average number
of bits of information per second.
It is calculated as follows:

R = (r in messages per second) * (H in bits per message)
  = bits per second
Ques. 1.
A PCM source transmits four samples (messages) with a
rate 2B samples / second. The probabilities of occurrence
of these 4 samples (messages) are p1 = p4 = 1/8 and p2 =
p3 = 3/8. Find out the information rate of the source.

Sol. 1.
We have four messages with probabilities p1 = p4 = 1/8
and p2 = p3 = 3/8. Average information H (or entropy) is
given by equation as;
H = p1 log2(1/p1) + p2 log2(1/p2) + p3 log2(1/p3) + p4 log2(1/p4)
  = (1/8) log2 8 + (3/8) log2(8/3) + (3/8) log2(8/3) + (1/8) log2 8
  = 1.8 bits / message

Message rate r = 2B samples / second
             = 2B messages / second
Hence, R = rH.
Putting the values of r and H in the above equation:

R = (2B messages / second) * (1.8 bits / message)
  = 3.6B bits / second
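
The arithmetic of this solution can be verified with a short Python sketch; B is treated as a symbolic unit of bandwidth by setting it to 1.

import math

p = [1/8, 3/8, 3/8, 1/8]  # probabilities from Ques. 1
H = sum(pi * math.log2(1 / pi) for pi in p)
print(H)          # -> 1.811..., rounded to 1.8 bits/message in the solution
B = 1.0           # stand-in so R prints in multiples of B
print(2 * B * H)  # -> 3.62..., i.e. about 3.6B bits/second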
Channel Capacity
The channel capacity, C, is defined to be the maximum
rate at which information can be transmitted through a
channel.
The errors in the transmission medium depend on the
energy of the signal, the energy of the noise, and the
bandwidth of the channel. 
If the bandwidth is high, we can pump more data in the
channel. If the signal energy is high, the effect of noise is
reduced.
According to Shannon, the channel capacity, the bandwidth of
the channel, and the signal and noise energies are related by
the formula:

C = W log2(1 + S/N)

where
C is the channel capacity in bits per second (bps),
W is the bandwidth of the channel in Hz, and
S/N is the signal-to-noise power ratio, as a linear power
ratio (SNR is generally measured in dB).
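
A minimal Python sketch of this formula (the 3.1 kHz bandwidth and 30 dB SNR below are hypothetical example values) converts the SNR from dB to a linear power ratio before applying it:

import math

def channel_capacity(bandwidth_hz, snr_db):
    # Shannon-Hartley: C = W * log2(1 + S/N), with S/N as a linear power ratio
    snr_linear = 10 ** (snr_db / 10.0)
    return bandwidth_hz * math.log2(1 + snr_linear)

print(channel_capacity(3100, 30))  # -> about 30,898 bits per second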
Thank you
