
Information Theory

Uncertainty,
Information, Entropy

Dr. Reba P,
ASP, ECE
Claude E. Shannon



Uncertainty
Look at these statements
1. There will be daylight tomorrow
2. United States invades Iran
3. Iran invades the United States

What do you notice?

• Statement 1: Conveys no information, since the event is certain
• Statement 2: Conveys some information
• Statement 3: Conveys even more information, since the event is far less likely



Uncertainty
Amount of information ∝ uncertainty of the message

If p is the probability of occurrence of a message, then the amount of information I conveyed by that message increases as p decreases.

When p → 1, I → 0; when p → 0, I → ∞.

Which function behaves this way?



Information
• If m_1, m_2, …, m_M are the messages of a source with respective probabilities of occurrence p_1, p_2, …, p_M, where p_1 + p_2 + … + p_M = 1, then the amount of information conveyed by message m_k is
  I_k = log₂(1/p_k)
• Unit: bit
• Here p_k is the probability of occurrence of m_k

• If two equally likely (equiprobable) messages occur, with p_1 = p_2 = 1/2, and the receiver correctly identifies the message, then the amount of information conveyed is
  I = log₂(2) = 1 bit (see the sketch below)
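A minimal sketch of this information measure in Python (the helper name information_bits is ours, not from the slides):

```python
import math

def information_bits(p: float) -> float:
    """Information conveyed by a message of probability p, in bits."""
    if not 0 < p <= 1:
        raise ValueError("probability must lie in (0, 1]")
    return math.log2(1 / p)

# Two equiprobable messages: correctly identifying one conveys exactly 1 bit.
print(information_bits(0.5))   # 1.0
# A certain message conveys no information at all.
print(information_bits(1.0))   # 0.0
```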



Contd…
• Similarly, if m_1 and m_2 are two independent messages with probabilities p_1 and p_2, then the total amount of information conveyed by correctly identifying these two messages is
  I = log₂(1/p_1) + log₂(1/p_2) = log₂(1/(p_1 · p_2))

• If 8 equally likely, independent messages occur (each with probability 1/8), then
  I = log₂(8) = 3 bits per message

• In general, if M equally likely, independent messages occur, each with probability 1/M, then
  I = log₂(M) bits per message (see the numeric check below)



• “Greater amount of information is conveyed when the receiver correctly
identifies less likely messages”
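A short numeric check of these two patterns; the probability values in the second half are illustrative only, not taken from the slides:

```python
import math

# M equally likely messages: each conveys log2(M) bits.
for M in (2, 4, 8, 16):
    print(M, "messages ->", math.log2(M), "bits per message")

# Two independent messages identified together convey the sum of their
# individual informations, because their probabilities multiply.
p1, p2 = 0.5, 0.25                                          # illustrative values
joint = math.log2(1 / (p1 * p2))
print(joint, "==", math.log2(1 / p1) + math.log2(1 / p2))   # both 3.0 bits
```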



Properties of Information Measure:
1. If p_k = 1 (the message is certain), then I_k = 0: no information is gained.

2. I_k ≥ 0 for 0 ≤ p_k ≤ 1: a message conveys some or no information, but never a loss of information.

3. If p_k < p_j, then I_k > I_j: a less probable message conveys more information.

4. If two independent messages are correctly identified, then the total amount of information conveyed is the sum of their individual informations: I(m_k, m_j) = I(m_k) + I(m_j). (A quick check follows below.)
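A quick numeric check of the four properties, using illustrative probability values that are not from the slides:

```python
import math

I = lambda p: math.log2(1 / p)            # information measure, in bits

assert I(1.0) == 0.0                      # 1: certain message, no information
assert I(0.3) >= 0.0                      # 2: never a loss of information
assert I(0.1) > I(0.3)                    # 3: less probable => more information
# 4: independent messages -> informations add (probabilities multiply)
assert math.isclose(I(0.3 * 0.1), I(0.3) + I(0.1))
print("all four properties hold for the sample values")
```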



Tutorial
A source emits one of four possible symbols during each signalling interval. The symbols occur with the probabilities:
• p₀ = 0.4
• p₁ = 0.3
• p₂ = 0.2
• p₃ = 0.1

Find the amount of information gained by observing the source emit each of these symbols. (A worked check follows below.)
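One way to work this out, reusing the information measure from earlier (values rounded here, not quoted from the slides):

```python
import math

probabilities = {"s0": 0.4, "s1": 0.3, "s2": 0.2, "s3": 0.1}
for symbol, p in probabilities.items():
    print(f"I({symbol}) = {math.log2(1 / p):.3f} bits")
# I(s0) ≈ 1.322, I(s1) ≈ 1.737, I(s2) ≈ 2.322, I(s3) ≈ 3.322 bits
```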



Entropy

Average information per message: entropy of a source

Consider a memoryless source emitting M messages m_1, m_2, …, m_M, whose respective probabilities are p_1, p_2, …, p_M with p_1 + p_2 + … + p_M = 1. The mean (or average) information per message emitted by the source is called the entropy.

Entropy:  H = Σ_k p_k · I_k = Σ_k p_k · log₂(1/p_k)  bits/message
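A minimal sketch of this average as code (the helper name entropy_bits is ours, not from the slides):

```python
import math

def entropy_bits(probs) -> float:
    """Average information per message (entropy), in bits/message."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# Illustrative distribution (same values as the earlier tutorial):
print(entropy_bits([0.4, 0.3, 0.2, 0.1]))   # ≈ 1.846 bits/message
```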



Entropy of a binary memoryless source
• Message 0 occurs with probability p₀
• Message 1 occurs with probability p₁ = 1 − p₀
• Then the entropy is
  H = −p₀ log₂(p₀) − (1 − p₀) log₂(1 − p₀)  bits/symbol

• If p₀ = 0, then H = 0 (symbol 1 is certain)
• If p₀ = 1, then H = 0 (symbol 0 is certain)
• If p₀ = 1/2, then H = 1 bit, the maximum (see the sketch below)
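A sketch of the binary entropy function covering these three cases (the function name binary_entropy is ours, not from the slides):

```python
import math

def binary_entropy(p0: float) -> float:
    """Entropy of a binary memoryless source with P(0) = p0, in bits/symbol."""
    if p0 in (0.0, 1.0):
        return 0.0                       # one symbol is certain: no uncertainty
    p1 = 1.0 - p0
    return -p0 * math.log2(p0) - p1 * math.log2(p1)

print(binary_entropy(0.0))   # 0.0 -> symbol 1 is certain
print(binary_entropy(1.0))   # 0.0 -> symbol 0 is certain
print(binary_entropy(0.5))   # 1.0 -> maximum uncertainty, 1 bit/symbol
```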



Properties of Entropy
• If the K events are equiprobable, i.e. p_k = 1/K for every k,
• then H = log₂(K), the maximum possible entropy (proved under Maximum Entropy below)



Tutorial 1
• Consider a discrete memoryless source with source alphabet {s₀, s₁, …} and corresponding symbol probabilities p₀, p₁, …

• Find the entropy of the source.



Tutorial 2
• Consider a discrete memoryless source with source alphabet {s₀, s₁, …} and corresponding symbol probabilities p₀, p₁, …

• Find the entropy of the source.



Tutorial 3
• Consider the second-order extension of a discrete memoryless source S with source alphabet {s₀, s₁, …} and individual symbol probabilities p₀, p₁, …

• The extended source S² emits ordered pairs of source symbols, and the probability of each pair is the product of the corresponding individual symbol probabilities.

• Find the entropy of the extended source (a sketch of the computation follows below).
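The slide's numeric probabilities did not survive extraction, so the sketch below assumes a source with probabilities {1/2, 1/4, 1/4} purely for illustration. It builds the second-order extension and confirms the standard result H(S²) = 2·H(S):

```python
import math
from itertools import product

def entropy_bits(probs):
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# Assumed source probabilities (illustrative only, not the slide's values).
source = {"s0": 0.5, "s1": 0.25, "s2": 0.25}

# Second-order extension: ordered pairs of independent source symbols,
# with probabilities multiplied.
extension = {a + b: pa * pb
             for (a, pa), (b, pb) in product(source.items(), repeat=2)}

h1 = entropy_bits(source.values())        # 1.5 bits/symbol
h2 = entropy_bits(extension.values())     # 3.0 bits/pair
print(h1, h2, math.isclose(h2, 2 * h1))   # H(S^2) = 2 * H(S)
```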



Maximum Entropy
• The entropy of a discrete memoryless source S is bounded as
  0 ≤ H(S) ≤ log₂(K)
  where K represents the total number of symbols in the source alphabet S.

Proof (lower bound):
• Each term p_k log₂(1/p_k) is always non-negative, since 0 ≤ p_k ≤ 1;
• it equals zero if and only if p_k = 0 or p_k = 1.
• Similarly, H(S) = 0 if and only if p_k = 1 for some k and all other p_k's are zero.


Contd..
To prove the upper bound H(S) ≤ log₂(K), consider any two probability distributions {p_k} and {q_k} on the alphabet S of a discrete memoryless source.

Consider the term
  Σ_k p_k log₂(q_k / p_k) = (1/ln 2) · Σ_k p_k ln(q_k / p_k)

• Note: ln(x) ≤ x − 1 for x > 0, with equality if and only if x = 1.



Contd…

Applying ln(x) ≤ x − 1 with x = q_k / p_k:
  Σ_k p_k log₂(q_k / p_k) ≤ (1/ln 2) · Σ_k p_k (q_k / p_k − 1)
                           = (1/ln 2) · (Σ_k q_k − Σ_k p_k) = 0

Hence Σ_k p_k log₂(1/p_k) ≤ Σ_k p_k log₂(1/q_k), with equality if and only if q_k = p_k for all k.



Contd…
Suppose q_k = 1/K for all k, i.e. the distribution with equiprobable symbols. Then
  Σ_k p_k log₂(1/q_k) = log₂(K) · Σ_k p_k = log₂(K)
and therefore
  H(S) = Σ_k p_k log₂(1/p_k) ≤ log₂(K)
with equality if and only if p_k = 1/K for all k. (A numeric check follows below.)
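A small numeric sanity check of the bound 0 ≤ H(S) ≤ log₂(K), using randomly generated distributions (our own illustration, not part of the slides):

```python
import math
import random

def entropy_bits(probs):
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

K = 6
for _ in range(5):
    weights = [random.random() for _ in range(K)]
    probs = [w / sum(weights) for w in weights]
    h = entropy_bits(probs)
    assert 0.0 <= h <= math.log2(K) + 1e-12     # bound proved above

# Equality holds only for the equiprobable distribution.
print(entropy_bits([1 / K] * K), math.log2(K))  # both ≈ 2.585
```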


