Principles of Communication
Lecture 35 Transcript
INFORMATION
Zainab Ali
210108062
EEE
Contents
1 INFORMATION
   1.1 Basic Idea of Information Theory
   1.2 Communication Engineering
      1.2.1 General Model
   1.3 Information Source
      1.3.1 Discrete Memoryless Source
   1.4 Measure of Information
      1.4.1 Properties of Information
      1.4.2 Self Information
   1.5 Entropy
      1.5.1 Binary Entropy
      1.5.2 Joint and Conditional Entropy
   1.6 Studied So far
1 INFORMATION
In this lecture, we'll start by understanding information in regard to communication and how it's connected to the probability of occurrence of events. We'll then learn how to quantify it.
1.3 Information Source
Information arises only from randomness, and it originates from an information source: the provider of the data or message to be communicated. The information can take various forms, such as text, audio, video, or any other data type. For example, an image can act as the source of information, with its content represented as pixels.
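As a rough sketch (not from the lecture), one way to picture such a source is the discrete memoryless source listed in the contents: it emits symbols from a fixed alphabet, each drawn independently according to a fixed probability distribution. The alphabet and probabilities below are made-up examples.

```python
import random

# Hypothetical four-symbol alphabet and emission probabilities for a
# discrete memoryless source (DMS): every symbol is drawn independently
# of all previously emitted symbols.
alphabet = ["a", "b", "c", "d"]
probabilities = [0.5, 0.25, 0.125, 0.125]

def emit(n):
    """Draw n independent symbols from the source."""
    return random.choices(alphabet, weights=probabilities, k=n)

print(emit(10))  # e.g. ['a', 'c', 'a', 'a', 'b', ...]
```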
2. Since we know that
\[
\text{information content of an event} \;\propto\; \frac{1}{\text{probability of the event}},
\]
this implies that a certain event (one that occurs with probability 1) carries no information. Hence, I(pi) is a continuous, decreasing function of pi.
It can be proved that the only function that satisfies all the above properties
is the logarithmic function; i.e., I(x) = −log(x).
The base of the logarithm is not important and defines the unit by which
the information is measured. If the base is 2, the information is expressed
in bits, and if the natural logarithm is employed, the unit is nats. From this
point on we assume all logarithms are in base 2 and all information is given
in bits.
The quantity above is the information contained in the event in which the random variable takes a value ai.
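As a quick numerical sketch (my own example, not from the lecture), the self-information of an event follows directly from its probability; the probabilities below are arbitrary.

```python
import math

def self_information(p, base=2):
    """Self-information I(p) = -log(p): base 2 gives bits, base e gives nats."""
    return -math.log(p, base)

print(self_information(1.0))          # zero: a certain event carries no information
print(self_information(0.5))          # 1.0 bit
print(self_information(0.125))        # 3.0 bits: rarer events carry more information
print(self_information(0.5, math.e))  # ~0.693 nats
```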
But what about the information contained in the random variable as a whole? This is where the concept of entropy comes in, which we cover in the next section.
1.5 Entropy
We can define the information content of the source as the weighted average
of the self-information of all source outputs. This is justified by the fact
that various source outputs appear with their corresponding probabilities.
Therefore, the information revealed by an unidentified source output is the
weighted average of the self-information of the various source outputs; i.e.,
\[
H(X) = \sum_{i=1}^{N} p_i\, I(p_i) = -\sum_{i=1}^{N} p_i \log p_i \tag{1.5.1}
\]
where 0 log 0 = 0. Note that there exists a slight abuse of notation here.
One would expect H(X) to denote a function of the random variable X and,
hence, be a random variable itself. However, H(X) is a function of the PMF
of the random variable X and is, therefore, a number.
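To make the definition concrete, here is a minimal sketch (with a made-up PMF) that evaluates H(X) in bits, using the convention 0 log 0 = 0.

```python
import math

def entropy(pmf):
    """H(X) = -sum(p_i * log2(p_i)), skipping zero-probability terms (0 log 0 = 0)."""
    return -sum(p * math.log2(p) for p in pmf if p > 0)

# Hypothetical PMF of a four-symbol source.
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits
print(entropy([1.0, 0.0]))                 # zero entropy: a deterministic source
print(entropy([0.25] * 4))                 # 2.0 bits: the uniform PMF gives maximum entropy
```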
1.5.1 Binary Entropy
For a binary (Bernoulli) source with P(X = 1) = p, the entropy reduces to the binary entropy function Hb(p) = −p log p − (1 − p) log(1 − p), plotted in Figure 1.

Figure 1: The binary entropy function.

To find the value of p that maximizes Hb(p), differentiate:
\[
\frac{dH_b(p)}{dp} = -\frac{d}{dp}\bigl(p \log p + (1-p)\log(1-p)\bigr)
\]
Now, we set the derivative equal to zero to find the maximum:
\[
\frac{dH_b(p)}{dp} = 0
\]
Solving for p:
\[
-\frac{d}{dp}\bigl(p \log p + (1-p)\log(1-p)\bigr) = 0
\]
You can solve the above equation to find the value of p that maximizes the binary entropy function Hb(p); a short worked solution is sketched below. For a Bernoulli random variable, the result is p = 1/2: the entropy is maximized when both outcomes are equally likely to occur. Hence,

When randomness is maximum, entropy is maximum.
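For reference, here is a brief worked sketch of that step (not spelled out in the lecture); logarithms are base 2, as assumed earlier.
\begin{align*}
\frac{dH_b(p)}{dp}
  &= -\frac{d}{dp}\bigl(p \log p + (1-p)\log(1-p)\bigr) \\
  &= -\Bigl(\log p + \tfrac{1}{\ln 2}\Bigr) + \Bigl(\log(1-p) + \tfrac{1}{\ln 2}\Bigr) \\
  &= \log\frac{1-p}{p}.
\end{align*}
Setting this equal to zero gives (1 − p)/p = 1, i.e., p = 1/2.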
1.5.2 Joint and Conditional Entropy
Just as joint and conditional probabilities are introduced, one can introduce joint and conditional entropies. These concepts are especially important when dealing with sources with memory.
\[
H(X \mid Y) = -\sum_{x_1, x_2, \ldots, x_n} p(x_1, x_2, \ldots, x_n \mid Y)\, \log p(x_1, x_2, \ldots, x_n \mid Y) \tag{1.5.5}
\]
In general, we have:
\[
H(X_n \mid X_1, \ldots, X_{n-1}) = -\sum_{x_n} p(x_n \mid X_1, \ldots, X_{n-1})\, \log p(x_n \mid X_1, \ldots, X_{n-1})
\]
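As a small illustration (my own sketch with an arbitrary joint PMF): for a fixed value Y = y, the conditional entropy is just the entropy of the conditional PMF p(x | y), and weighting by p(y) gives the average conditional entropy H(X | Y).

```python
import math

def entropy(pmf):
    """H = -sum(p * log2(p)), with the convention 0 log 0 = 0."""
    return -sum(p * math.log2(p) for p in pmf if p > 0)

# Hypothetical joint PMF p(x, y), stored as joint[y][x].
joint = {
    "y0": {"x0": 0.25, "x1": 0.25},
    "y1": {"x0": 0.40, "x1": 0.10},
}

# H(X | Y) = sum over y of p(y) * H(X | Y = y)
h_x_given_y = 0.0
for y, row in joint.items():
    p_y = sum(row.values())                            # marginal p(y)
    cond_pmf = [p_xy / p_y for p_xy in row.values()]   # conditional p(x | y)
    h_x_given_y += p_y * entropy(cond_pmf)

print(h_x_given_y)  # average conditional entropy H(X | Y), in bits
```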
1.6 Studied So far
In conclusion, we've covered the core principles of Information Theory in this lecture. We began by introducing the fundamental idea of Information Theory and its practical applications in Communication Engineering. From there, we explored the concept of Information Sources, focusing on the simplified Discrete Memoryless Source. We learned how to measure information and discussed the properties of information, particularly Self Information. Entropy, a central concept, was explained, including its application in Binary Entropy and more complex scenarios involving Joint and Conditional Entropy. This knowledge lays the groundwork for understanding and using information effectively in further lectures.