KEE058
Unit - 5
Course Coordinator Mr. Vijit Srivastava
Department of Electronics & Communication
UCER Allahabad
Course content
Information Theory
Entropy
Information Rate
Channel Capacity
Information Theory
Information is what the source of a communication system produces, whether the system is analog or digital.
Information theory is a mathematical approach to the coding of information, together with its quantification, storage, and communication.
In a communication system, information is transmitted from a source to a destination.
Entropy
Entropy can be defined as a measure of the average
information content per source symbol.
Claude Shannon, the "father of information theory", provided a formula for it:

H = Σ pi log2(1/pi) bits per symbol

where pi is the probability of occurrence of the i-th source symbol and the sum runs over all source symbols.
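As a quick illustration of the formula, here is a minimal Python sketch; the four-symbol distribution and the function name entropy are assumed for the example, not taken from the slides.

import math

def entropy(probs):
    # H = sum of p_i * log2(1 / p_i); zero-probability symbols contribute nothing
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

p = [0.5, 0.25, 0.125, 0.125]   # assumed example distribution
print(entropy(p))                # 1.75 bits per symbol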
The amount of uncertainty remaining about the channel input after observing the channel output is called conditional entropy.
It is denoted by H(X/Y).
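A sketch of how H(X/Y) could be computed from a joint distribution P(X, Y); the dictionary layout and the noisy-binary-channel numbers below are assumptions for illustration.

import math

def conditional_entropy(joint):
    # joint maps (x, y) pairs to P(X = x, Y = y)
    # H(X/Y) = sum over (x, y) of P(x, y) * log2(P(y) / P(x, y))
    py = {}
    for (x, y), p in joint.items():
        py[y] = py.get(y, 0.0) + p      # marginal distribution P(y)
    return sum(p * math.log2(py[y] / p)
               for (x, y), p in joint.items() if p > 0)

# Assumed joint distribution of a noisy binary channel (illustration only)
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
print(conditional_entropy(joint))       # about 0.722 bits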
Sol. 1.
We have four messages with probabilities p1 = p4 = 1/8 and p2 = p3 = 3/8. The average information H (or entropy) is given by

H = P1 log2(1/P1) + P2 log2(1/P2) + P3 log2(1/P3) + P4 log2(1/P4)
  = 2 × (1/8) log2 8 + 2 × (3/8) log2 (8/3)
  = 0.75 + 1.061
  ≈ 1.811 bits per message
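The arithmetic above can be checked numerically; this short sketch simply re-evaluates the same four-term sum.

import math

probs = [1/8, 3/8, 3/8, 1/8]    # the message probabilities from Sol. 1
H = sum(p * math.log2(1 / p) for p in probs)
print(H)                         # about 1.811 bits per message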
Channel Capacity
The Shannon-Hartley theorem gives the capacity of a band-limited channel with additive white Gaussian noise:

C = W log2(1 + S/N)

where
C is the channel capacity in bits per second (bps)
W is the bandwidth of the channel in Hz
S/N is the signal-to-noise power ratio (SNR). SNR is generally quoted in dB and must be converted to a linear ratio before it is used in the formula.
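A minimal sketch of the capacity formula in Python, including the dB-to-linear conversion; the 3100 Hz / 30 dB telephone-channel figures are assumed example values.

import math

def channel_capacity(bandwidth_hz, snr_db):
    # Shannon-Hartley: C = W * log2(1 + S/N), with S/N as a linear power ratio
    snr_linear = 10 ** (snr_db / 10)    # convert SNR from dB: S/N = 10^(dB/10)
    return bandwidth_hz * math.log2(1 + snr_linear)

print(channel_capacity(3100, 30))       # roughly 30,900 bps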
Thank you