INFORMATION THEORY AND CODING
• Entropy
• Probabilistic information measure
WHAT IS INFORMATION THEORY?
• A mathematical approach to the study of the coding of information, along with the quantification, storage, and communication of information.
INFORMATION THEORY: UNCERTAINTY
• Consider a set of messages m = {m0, m1, m2, ..., mn}
• with corresponding probabilities p = {p0, p1, p2, ..., pn}
• When an event only has a chance of occurring (e.g. p = 0.5), its outcome is uncertain.
• The less probable an event, the more information its occurrence carries; an event that is certain carries no information.
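As a quick numeric illustration of this idea, here is a minimal Python sketch (the function name `self_information` is ours, not from the slides): a less probable event carries more information.

```python
import math

def self_information(p):
    """Self-information I = log2(1/p), in bits, of an event with probability p."""
    return math.log2(1 / p)

# A fair coin flip (p = 0.5) conveys 1 bit; a rare event (p = 0.01) conveys far more.
print(self_information(0.5))   # 1.0
print(self_information(0.01))  # ≈ 6.64
```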
INFORMATION THEORY: MEASURE OF INFORMATION
• Measure of information: the information content of a message
• m = {m0, m1, m2, ..., mn}
• p = {p0, p1, p2, ..., pn}
• The total probability is given by: Σ pi = 1 (sum over i = 1 to n)
• Formula:

  Ik = log2(1/pk)

• The unit depends on the logarithm base: base 2 → bits, base e → nats, base 10 → decits.
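The role of the logarithm base can be sketched with Python's `math.log`, which accepts an optional base argument (the helper name `information` is an assumption, not from the slides):

```python
import math

def information(p, base=2):
    """Information content log_base(1/p): base 2 gives bits, e gives nats, 10 gives decits."""
    return math.log(1 / p, base)

p = 0.25
print(information(p, 2))        # 2.0 bits
print(information(p, math.e))   # ≈ 1.386 nats
print(information(p, 10))       # ≈ 0.602 decits
```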
INFORMATION THEORY: PROPERTIES
• Uncertainty is directly proportional to information: the more uncertain an outcome, the more information its occurrence conveys.
• If there are M = 2^N equiprobable messages, then the amount of information carried by each message will be N bits (I = log2(M) = N).
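The second property can be checked numerically; a minimal sketch with N = 5, so M = 32 equiprobable messages:

```python
import math

# With M = 2**N equiprobable messages, each message carries
# I = log2(M) = N bits. Here N = 5, so M = 32.
N = 5
M = 2 ** N
bits_per_message = math.log2(M)
print(bits_per_message)  # 5.0
```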
ENTROPY(H)
• Entropy is the average information per message: H = Σ pk log2(1/pk) bits/message
• Source efficiency, η = H / Hmax, where Hmax = log2(M) is the entropy of M equiprobable messages
• Redundancy, Re = 1 − η
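Entropy, source efficiency, and redundancy can be computed together for a small example source. This sketch assumes Hmax = log2(M) for a source with M symbols; the probability distribution below is purely illustrative:

```python
import math

def entropy(probs):
    """Shannon entropy H = sum of p * log2(1/p), in bits per message."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

probs = [0.5, 0.25, 0.125, 0.125]   # example source distribution (sums to 1)
H = entropy(probs)                  # 1.75 bits/message
H_max = math.log2(len(probs))       # 2.0 bits (equiprobable maximum)
efficiency = H / H_max              # η = H / Hmax
redundancy = 1 - efficiency         # Re = 1 - η
print(H, H_max, efficiency, redundancy)
```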
INFORMATION RATE