
Information theory lecture-3

Discrete memoryless information source

“A memoryless source is one in which the symbol probabilities remain unchanged with the passage of time.”
Consider a source that generates a sequence of symbols drawn from the alphabet
U = {u1, u2, u3, …, un}
A group of consecutive symbols is known as a word. If a word of length L is denoted by V, then the number of all possible words equals n^L.
The amount of information generated by a discrete memoryless source is the entropy

H(u) = \sum_{i=1}^{n} P_i \log_2 \frac{1}{P_i}
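As a quick check of this formula, here is a minimal Python sketch (the helper name `entropy` is my own, not from the lecture):

```python
import math

def entropy(probs):
    """Entropy H(u) = sum_i P_i * log2(1/P_i) of a discrete memoryless source.

    Terms with P_i = 0 contribute nothing (lim p->0 of p*log(1/p) = 0),
    so they are skipped to avoid a division by zero.
    """
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# A fair binary source carries exactly 1 bit per symbol.
print(entropy([0.5, 0.5]))
```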

The maximum amount of information (achieved when all symbols are equiprobable) is

Max H(u) = \log_2 n
Information source redundancy

R = 1 - \frac{H(u)}{Max\, H(u)}
1. If a source generates codewords with equal probability of occurrence, then
   H(u) = Max H(u) → R = 0
2. If a source generates one codeword with probability 1 while the other probabilities are 0, then
   H(u) = 0 → R = 1
From 1 and 2 it follows that the value of the redundancy lies between 0 and 1.
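A small Python sketch of the redundancy formula, confirming the two boundary cases above (the helper name `redundancy` is my own):

```python
import math

def redundancy(probs):
    """R = 1 - H(u) / Max H(u), where Max H(u) = log2(n)."""
    h = sum(p * math.log2(1 / p) for p in probs if p > 0)
    return 1 - h / math.log2(len(probs))

# Case 1: equiprobable symbols give R = 0.
print(redundancy([0.25, 0.25, 0.25, 0.25]))
# Case 2: one certain symbol gives R = 1.
print(redundancy([1.0, 0.0]))
```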

Example-1


A binary memoryless source generates the symbols 0 and 1 with probabilities 1/4 and 3/4 respectively; the alphabet is U = {0, 1}. Compute the source redundancy.
H(u) = P \log_2 \frac{1}{P} + (1-P) \log_2 \frac{1}{1-P}
     = \frac{1}{4} \log_2 4 + \frac{3}{4} \log_2 \frac{4}{3} = 0.5 + 0.311 = 0.811 bits

Max H(u) = \log_2 2 = 1

R = 1 - \frac{H(u)}{Max\, H(u)} = 1 - \frac{0.811}{1} = 0.189
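The arithmetic in Example-1 can be verified with a few lines of Python:

```python
import math

p = 1 / 4  # P(0) = 1/4, P(1) = 3/4, as in Example-1
h = p * math.log2(1 / p) + (1 - p) * math.log2(1 / (1 - p))
r = 1 - h / math.log2(2)  # Max H(u) = log2(2) = 1
print(round(h, 3), round(r, 3))  # 0.811 0.189
```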
Example-2
Consider the binary source of Example-1 and assume words of length 3 are being formed. Find:
a. Number of possible words
b. The probability of each word
c. The amount of information
d. Redundancy
Solution
a. The number of possible words: n^L = 2^3 = 8,
   U = {u1, u2, …, u8}
b. The probability of each word equals the product of the individual symbol probabilities:

U    P(u)                   U    P(u)
000  1/4*1/4*1/4 = 1/64     100  3/4*1/4*1/4 = 3/64
001  1/4*1/4*3/4 = 3/64     101  3/4*1/4*3/4 = 9/64
010  1/4*3/4*1/4 = 3/64     110  3/4*3/4*1/4 = 9/64
011  1/4*3/4*3/4 = 9/64     111  3/4*3/4*3/4 = 27/64
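The table above can be reproduced by enumerating all 2^3 words and multiplying the per-symbol probabilities; using exact fractions keeps the results in the same form as the table:

```python
from itertools import product
from fractions import Fraction

p = {"0": Fraction(1, 4), "1": Fraction(3, 4)}  # symbol probabilities from Example-1

words = {}
for bits in product("01", repeat=3):
    word = "".join(bits)
    prob = Fraction(1)
    for b in bits:
        prob *= p[b]  # memoryless: word probability is the product
    words[word] = prob

for w, pr in sorted(words.items()):
    print(w, pr)

# Sanity check: the eight word probabilities must sum to 1.
assert sum(words.values()) == 1
```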

c. Amount of information

H(u) = \sum_{i=1}^{n} P_i \log_2 \frac{1}{P_i}


H(u) = \frac{1}{64} \log_2 64 + \frac{3 \times 3}{64} \log_2 \frac{64}{3} + \frac{3 \times 9}{64} \log_2 \frac{64}{9} + \frac{27}{64} \log_2 \frac{64}{27} = 2.434 bits

As expected for a memoryless source, this equals L times the per-symbol entropy: 3 × 0.811 = 2.434 bits.

d. Redundancy

Max H(u) = \log_2 8 = 3

R = 1 - \frac{H(u)}{Max\, H(u)} = 1 - \frac{2.434}{3} = 0.189
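Example-2 can likewise be checked numerically; note that the word-level redundancy comes out the same as the symbol-level redundancy of Example-1, since the source is memoryless:

```python
import math
from itertools import product

p = {"0": 0.25, "1": 0.75}  # symbol probabilities from Example-1

# Probabilities of all 8 three-symbol words.
probs = [math.prod(p[b] for b in bits) for bits in product("01", repeat=3)]

h = sum(q * math.log2(1 / q) for q in probs)
r = 1 - h / math.log2(len(probs))  # Max H(u) = log2(8) = 3
print(round(h, 3), round(r, 3))  # 2.434 0.189
```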
