
Information Theory

Dr. Muhammad Imran Farid


Topics:
▪ Entropy

Information:
▪ Tomorrow the sun will rise in the east.

▪ Pakistan wins the series against Africa by 4-0.

▪ There is snowfall in Lahore.

• The less probable (more surprising) an event is, the more information its occurrence carries.

Information:
Let the probability of occurrence of an event be P and its information be I:

P → 1  ⟹  I → 0
P → 0  ⟹  I → ∞

For an event E with probability P(E), the information is

I(E) = log(1/P(E))

Now suppose the event E consists of two independent sub-events e1 and e2, with probabilities P(e1) and P(e2), so that P(E) = P(e1) P(e2). Then

I(E) = log(1/P(E)) = log(1/(P(e1) P(e2))) = log(1/P(e1)) + log(1/P(e2))
     = I(e1) + I(e2)
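As a quick illustration of this additive property, here is a minimal Python sketch (the probabilities are made up for the example):

```python
import math

def self_information(p):
    """Self-information I(E) = log2(1/P(E)) in bits, for an event of probability p."""
    return math.log2(1.0 / p)

# Two independent sub-events with illustrative probabilities
p_e1, p_e2 = 0.5, 0.25

# The joint event E has probability P(e1) * P(e2), so its information
# equals the sum of the individual informations.
print(self_information(p_e1 * p_e2))                    # 3.0 bits
print(self_information(p_e1) + self_information(p_e2))  # 1.0 + 2.0 = 3.0 bits
```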
Information: Example
▪ TV image
▪ Let a TV frame have 576 lines x 720 pixels = 414720 pixels
▪ Let each pixel take one of 10 gray levels
▪ The number of distinct TV images that can be formed is 10^414720
▪ Assuming all images are equally likely, the probability of any particular image is P(E) = 1/10^414720
▪ I(E) = log2(1/P(E)) = 414720 x log2(10) ≈ 1.4x10^6 bits (checked in the sketch below)
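A quick check of this arithmetic in Python (a sketch; the pixel count and gray-level figures are taken from the slide):

```python
import math

pixels = 576 * 720      # 414720 pixels per frame
gray_levels = 10        # each pixel takes one of 10 equally likely levels

# Each image is assumed equiprobable: P(E) = 1 / 10**pixels,
# so I(E) = log2(10**pixels) = pixels * log2(10) bits.
info_bits = pixels * math.log2(gray_levels)
print(f"{info_bits:.3e} bits")  # about 1.4e+06 bits
```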

Towards Entropy:
▪ Let a source S emit a sequence of symbols at positions 1, 2, ......, i, ......, N. How do we associate a measure of information with this source?

Source alphabet: {s1, s2, ....., sq}
Symbol probabilities: P(s1), P(s2), ...... P(sq)

S → zero-memory source
• We are calling S a zero-memory source because all the symbols it emits are independent of each other

Towards Entropy:
Consider a long sequence of N symbols emitted by the source: s1 s2 .......... sN

symbol s1 occurs n1 times, each occurrence carrying I(s1), for a total of n1 I(s1)
symbol s2 occurs n2 times, for a total of n2 I(s2)
...
symbol sq occurs nq times, for a total of nq I(sq)

Total information in the sequence = n1 I(s1) + n2 I(s2) + .... + nq I(sq)

Average information per symbol:

H(S) = [n1 I(s1) + n2 I(s2) + .... + nq I(sq)] / N

For a long sequence, ni/N ≈ P(si), which leads to the entropy formula on the next slide.
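A small simulation can make this averaging concrete (a sketch; the three-symbol alphabet and its probabilities are assumed purely for illustration):

```python
import math
import random
from collections import Counter

# Assumed zero-memory source: alphabet and probabilities are illustrative
symbols = ["a", "b", "c"]
probs   = [0.5, 0.25, 0.25]

N = 100_000
sequence = random.choices(symbols, weights=probs, k=N)

# Average information per symbol measured from the sequence:
# (1/N) * sum over symbols of n_i * I(s_i), with I(s_i) = log2(1/P(s_i))
counts = Counter(sequence)
avg_info = sum(counts[s] * math.log2(1.0 / p) for s, p in zip(symbols, probs)) / N

exact = sum(p * math.log2(1.0 / p) for p in probs)
print(avg_info, exact)  # both close to 1.5 bits/symbol
```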
Entropy:
H(S) = Σ I(si) P(si) = Σ P(si) log(1/P(si)),  where the sums run over i = 1, ....., q
• This is the average amount of information that we get from the source per symbol
• H(S) is called the entropy of S
• H(S) is non-negative
• H(S) is maximum when all the symbols are equally probable (see the sketch below)
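As a rough sketch of the formula above (the function name and example distributions are my own, not from the slides):

```python
import math

def entropy(probabilities):
    """H(S) = sum over i of P(s_i) * log2(1 / P(s_i)), in bits (terms with P = 0 contribute 0)."""
    return sum(p * math.log2(1.0 / p) for p in probabilities if p > 0)

# Four equally likely symbols attain the maximum possible entropy log2(4) = 2 bits
print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0
# A skewed distribution over the same four symbols has lower entropy
print(entropy([0.7, 0.1, 0.1, 0.1]))       # about 1.36 bits
```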
Entropy:
▪ Binary source (assuming zero memory binary source)

S  0,1
P 0  w
P 1  1  w  w
H  S   w log w1  w log w1  SourceEntropy
H  w   w log w1  w log w1  Entropy function

Entropy:
▪ If we plot H(w) vs w, we will get something like this

[Plot: H(w) versus w, for w from 0 to 1]

• For a zero-memory binary source, if both symbols are equally probable (w = 0.5) then H(S) = 1 bit, which is the maximum entropy for a zero-memory binary source (see the sketch below)
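A minimal sketch of the entropy function H(w), evaluated at a few values of w to show the maximum of 1 bit at w = 0.5 (the sampled values are my own choice):

```python
import math

def binary_entropy(w):
    """H(w) = w*log2(1/w) + (1-w)*log2(1/(1-w)); taken as 0 at w = 0 or w = 1."""
    if w in (0.0, 1.0):
        return 0.0
    return w * math.log2(1.0 / w) + (1 - w) * math.log2(1.0 / (1 - w))

for w in (0.0, 0.1, 0.25, 0.5, 0.75, 1.0):
    print(f"H({w}) = {binary_entropy(w):.4f} bits")
# The output peaks at H(0.5) = 1.0000, the maximum for a zero-memory binary source
```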
Entropy Example:
▪ Suppose we have the numbers 1 to 16 arranged in a 4x4 grid and one grid element is secretly marked. Find out which grid element is marked.

1  2  3  4
5  6  7  8
9  10 11 12
13 14 15 16

• A better way to find the marked element is to ask yes/no questions that halve the grid each time:
• Does the marked element lie in the bottom half? (Y)
• Does it lie on the left side? (Y)
• Does it lie in the top half of what remains? (Y)
• Does it lie on the left? (N)
• We only require 4 questions before we can find out which number in the grid is marked (see the sketch below).

Mathematically, the probability of each symbol is 1/16, so

H(S) = Σ Pi log(1/Pi) = 16 x (1/16) x log2(16) = 4 bits,  summing over the 16 equally likely cells

• This shows the connection between the mathematical entropy and the physical number of yes/no questions needed.
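A minimal sketch of this questioning strategy as a halving (binary) search over the 16 cells; the flattened cell numbering and the choice of marked cell are assumptions for illustration:

```python
import math

cells = list(range(1, 17))   # 4x4 grid, cells numbered 1..16
marked = 10                  # the secretly marked cell (illustrative)

# Each yes/no question halves the remaining candidates,
# so log2(16) = 4 questions always suffice.
questions = 0
while len(cells) > 1:
    half = cells[:len(cells) // 2]
    questions += 1
    # "Is the marked cell in this half of the remaining candidates?"
    cells = half if marked in half else cells[len(cells) // 2:]

print(questions, cells[0])   # 4 questions, cell 10
print(math.log2(16))         # H(S) = 4 bits for 16 equally likely cells
```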
