
Communications Systems II - Lec. 2, Fourth stage, 2020-2021

2. Information theory

As mentioned before, there are three main parts in any communication
system: the transmitter, the channel, and the receiver. One more important
aspect needs to be considered at the transmitter and receiver (in digital
systems): the presence of an encoder and a decoder, as shown below:

Information → Encoder → Channel → Decoder → Rx. point

Figure 2.1 Digital communication systems.

The encoder comprises the source encoder, the channel encoder, and the
modulator, while the decoder comprises the demodulator, the channel decoder,
and the source decoder. In communications and information processing, a code
is a system of rules for converting information (such as a letter, word,
sound, image, or gesture) into another form, sometimes shortened or secret,
for communication through a channel or for storage in a storage medium.
Classical information theory is concerned with three things: the information
rate (R), the channel capacity (C), and coding. The engineer's role is to
ensure that R < C so that the information passes through the channel with few
errors; coding, in turn, helps the information at rate R pass through the
channel reliably. Both R and C are measured in bits/sec, and the basic unit
of information is the bit.

2.1 Information measure

2.1.1 Self-information

An event carries an amount of information that is inversely proportional to
its probability of occurrence. Let I denote the information of an event that
occurs with probability p; then:

$$ I \propto \frac{1}{p} \qquad (2.1) $$
1. If an event always happens, its probability is 1 and it carries no information (I = 0).
2. If an event seldom happens, it carries a large amount of information.

Formalizing the above relationship, we get:


$$ I = \log_2 \frac{1}{p} \ \text{bits} \qquad (2.2) $$

Equation (2.2) is known as Shannon's first law of information theory.
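As a quick numerical check, here is a minimal Python sketch of equation (2.2); the function name self_information is ours, chosen for illustration:

```python
import math

def self_information(p: float) -> float:
    """Self-information I = log2(1/p) in bits, per equation (2.2)."""
    return math.log2(1.0 / p)

# A certain event (p = 1) carries no information; rarer events carry more.
print(self_information(1.0))    # 0.0 bits
print(self_information(0.5))    # 1.0 bit
print(self_information(0.125))  # 3.0 bits
```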

Example 2.1: A communication source emits the following sequence of symbols:

ABBCABBBCCBAACBBBABC

Evaluate the information carried by each symbol type.

Solution:

Here we have three types of information, A, B, and C, and they are not
equally likely, so we need to estimate the probability of occurrence of each
type:

N = total number of symbols = 20
M = number of types = 3 (A, B, and C)
$$ P_A = \frac{N_A}{N} = \frac{5}{20} = \frac{1}{4}, \qquad I_A = \log_2 \frac{1}{P_A} = \log_2 4 = 2 \ \text{bits} $$

$$ P_B = \frac{N_B}{N} = \frac{10}{20} = \frac{1}{2}, \qquad I_B = \log_2 \frac{1}{P_B} = \log_2 2 = 1 \ \text{bit} $$

$$ P_C = \frac{N_C}{N} = \frac{5}{20} = \frac{1}{4}, \qquad I_C = \log_2 \frac{1}{P_C} = \log_2 4 = 2 \ \text{bits} $$
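The same result can be checked programmatically. The following is a small sketch (the variable names are ours) that counts the symbols of Example 2.1 and evaluates each type's probability and self-information:

```python
from collections import Counter
import math

sequence = "ABBCABBBCCBAACBBBABC"  # source output from Example 2.1
counts = Counter(sequence)        # N_A = 5, N_B = 10, N_C = 5
N = len(sequence)                 # N = 20 symbols in total

for symbol in sorted(counts):
    p = counts[symbol] / N        # P_k = N_k / N
    info = math.log2(1.0 / p)     # I_k = log2(1/P_k)
    print(f"P_{symbol} = {p}, I_{symbol} = {info} bits")
# Output: P_A = 0.25, I_A = 2.0; P_B = 0.5, I_B = 1.0; P_C = 0.25, I_C = 2.0
```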
2.1.2 Entropy and Information rate

From our experience in mathematics, we know there are three types of
averaging:

1. Arithmetic mean = (A + B)/2
2. Geometric mean = √(AB)
3. Probabilistic mean, which is a bit different and is explained through the entropy concept below.

Let us start with the total information:


$$ I_{AT} = \text{total information in A} = N_A I_A = N P_A I_A = N P_A \log_2 \frac{1}{P_A} $$

Similarly,

$$ I_{BT} = \text{total information in B} = N_B I_B = N P_B I_B = N P_B \log_2 \frac{1}{P_B} $$

$$ I_{CT} = \text{total information in C} = N_C I_C = N P_C I_C = N P_C \log_2 \frac{1}{P_C} $$

Therefore, the total information of the source is $I_T = I_{AT} + I_{BT} + I_{CT}$:

$$ I_T = N P_A \log_2 \frac{1}{P_A} + N P_B \log_2 \frac{1}{P_B} + N P_C \log_2 \frac{1}{P_C} $$

or, for a source with M symbol types,

$$ I_T = \sum_{k=1}^{M} N P_k \log_2 \frac{1}{P_k} $$

Dividing by the number of symbols N gives the average information per symbol:

$$ \frac{I_T}{N} = \sum_{k=1}^{M} P_k \log_2 \frac{1}{P_k} $$

This average information is called the entropy (H):

$$ H = \sum_{k=1}^{M} P_k \log_2 \frac{1}{P_k} \quad \frac{\text{bits}}{\text{symbol}} \qquad (2.3) $$

Equation (2.3) is called Shannon's second law of information theory.
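Equation (2.3) translates directly into code. Below is a minimal sketch (the entropy helper is ours), applied to the source of Example 2.1:

```python
import math

def entropy(probs) -> float:
    """Entropy H = sum of P_k * log2(1/P_k), in bits/symbol (equation 2.3)."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Source of Example 2.1: P_A = 1/4, P_B = 1/2, P_C = 1/4
print(entropy([0.25, 0.5, 0.25]))  # 1.5 bits/symbol
```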

Now, if two sources have the same average information but one is faster than
the other, it becomes important to define the information rate (R), which is
given by:

$$ R = \frac{H}{\tau} \quad \frac{\text{bits}}{\text{sec}} $$

where $\tau$ is the average symbol duration.

Example 2.2: A discrete source emits one of five symbols once every
millisecond. The symbol probabilities are 0.5, 0.25, 0.125, 0.0625, and
0.0625, respectively. Find the entropy and the information rate.


Solution:

$$ H = \sum_{k=1}^{M} P_k \log_2 \frac{1}{P_k} $$

$$ H = \frac{1}{2}\log_2 2 + \frac{1}{4}\log_2 4 + \frac{1}{8}\log_2 8 + \frac{1}{16}\log_2 16 + \frac{1}{16}\log_2 16 $$

$$ H = 0.5 + 0.5 + 0.375 + 0.25 + 0.25 = 1.875 \quad \frac{\text{bits}}{\text{symbol}} $$

$$ R = \frac{H}{\tau} = \frac{1.875}{10^{-3}} = 1875 \quad \frac{\text{bits}}{\text{sec}} $$
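As a cross-check of Example 2.2, the following sketch computes H and R numerically (the variable names are ours):

```python
import math

probs = [0.5, 0.25, 0.125, 0.0625, 0.0625]      # the five symbol probabilities
tau = 1e-3                                      # one symbol every millisecond

H = sum(p * math.log2(1.0 / p) for p in probs)  # entropy in bits/symbol
R = H / tau                                     # information rate in bits/sec

print(H)  # 1.875
print(R)  # 1875.0
```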

Dr. Hasanain A. Hasan Department of Electrical Engineering, University of Misan
