Uncertainty, Information, Entropy
Dr. Reba P, ASP, ECE
Claude E Shannon
• If two equally likely (equiprobable) messages occur, with $p_1 = p_2 = \tfrac{1}{2}$, then, if the receiver correctly identifies the message, the amount of information conveyed is
• $I_k = \log_2\left(\tfrac{1}{p_k}\right)$
• $I = \log_2\left(\tfrac{1}{1/2}\right) = \log_2 2 = 1$ bit
- If two independent messages are correctly identified, then the total amount of information conveyed is the sum of their individual informations: $I(m_1 m_2) = I(m_1) + I(m_2)$.
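The two properties above (1 bit for an equiprobable binary message, and additivity for independent messages) can be sketched in Python; the probabilities used are illustrative values, not from the slides:

```python
from math import log2

def self_information(p):
    """Information (in bits) conveyed by a message of probability p: I = log2(1/p)."""
    return log2(1.0 / p)

# Two equiprobable messages: each conveys exactly 1 bit.
print(self_information(0.5))  # -> 1.0

# Additivity for independent messages: I(m1 m2) = I(m1) + I(m2),
# because the joint probability of independent messages is p1 * p2.
p1, p2 = 0.5, 0.25
print(self_information(p1 * p2))                    # -> 3.0
print(self_information(p1) + self_information(p2))  # -> 3.0
```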
Entropy
• If $H(S)$ is the entropy of a discrete memoryless source with alphabet $\{s_0, s_1, \ldots, s_{K-1}\}$ and symbol probabilities $p_k$, where $H(S) = \sum_{k=0}^{K-1} p_k \log_2\left(\tfrac{1}{p_k}\right)$, then $0 \le H(S) \le \log_2 K$.
• If $H(S) = 0$, then $p_k = 1$ for some $k$ (no uncertainty), and conversely.
• If $H(S) = \log_2 K$, then $p_k = \tfrac{1}{K}$ for all $k$ (maximum uncertainty), and conversely.
Proof:
• Each term $p_k \log_2\left(\tfrac{1}{p_k}\right)$ is always non-negative, so $H(S) \ge 0$.
• $p_k \log_2\left(\tfrac{1}{p_k}\right) = 0$ if and only if $p_k = 0$ or $p_k = 1$.
• Similarly, $H(S) = 0$
if and only if $p_k = 1$ for some $k$ and all other $p_k$'s are zero
Contd..
• This establishes the lower bound: $H(S) \ge 0$, with equality if and only if the emitted symbol is known with certainty.
Digital Communication 01/04/2023 15
Contd..
To prove the upper bound $H(S) \le \log_2 K$, consider any two probability distributions $\{p_0, p_1, \ldots, p_{K-1}\}$ and $\{q_0, q_1, \ldots, q_{K-1}\}$ on the
alphabet $S = \{s_0, s_1, \ldots, s_{K-1}\}$ of a discrete memoryless source.
Consider the term
$\sum_{k=0}^{K-1} p_k \log_2\left(\dfrac{q_k}{p_k}\right) = \dfrac{1}{\ln 2} \sum_{k=0}^{K-1} p_k \ln\left(\dfrac{q_k}{p_k}\right)$
• Note: $\ln x \le x - 1$ for $x > 0$, with equality if and only if $x = 1$.
Contd..
- Applying this inequality with $x = \dfrac{q_k}{p_k}$,
$\sum_{k=0}^{K-1} p_k \log_2\left(\dfrac{q_k}{p_k}\right) \le \dfrac{1}{\ln 2} \sum_{k=0}^{K-1} p_k \left(\dfrac{q_k}{p_k} - 1\right) = \dfrac{1}{\ln 2} \left(\sum_{k=0}^{K-1} q_k - \sum_{k=0}^{K-1} p_k\right) = 0$
- Choosing $q_k = \tfrac{1}{K}$ for all $k$ gives $H(S) - \log_2 K \le 0$, i.e. $H(S) \le \log_2 K$, with equality if and only if $p_k = \tfrac{1}{K}$ for all $k$.
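This inequality (Gibbs' inequality) can be verified numerically; the random distributions below are illustrative, not from the slides:

```python
import random
from math import log2

def relative_term(p, q):
    """sum_k p_k log2(q_k / p_k); Gibbs' inequality says this is <= 0."""
    return sum(pk * log2(qk / pk) for pk, qk in zip(p, q) if pk > 0)

def random_dist(K):
    """A random probability distribution on K symbols (normalized positives)."""
    x = [random.random() for _ in range(K)]
    s = sum(x)
    return [v / s for v in x]

random.seed(0)
K = 8
p = random_dist(K)
q = random_dist(K)

# The term is never positive, and is zero exactly when q = p.
assert relative_term(p, q) <= 1e-12
assert abs(relative_term(p, p)) < 1e-12

# Choosing q_k = 1/K turns the inequality into H(S) <= log2 K.
H = sum(pk * log2(1.0 / pk) for pk in p)
assert H <= log2(K) + 1e-12
```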
Contd..