
Consider two random variables (RVs) X and Y.

The joint distribution of the random variables X and Y is given by

p(x_j, y_k) = P(X = x_j, Y = y_k) = p(y_k | x_j) p(x_j)

The marginal probability distribution of the random variable Y is then

p(y_k) = Σ_j p(x_j, y_k) = Σ_j p(y_k | x_j) p(x_j)

i.e. given the prior probabilities p(x_j) and the conditional probabilities p(y_k | x_j), the probabilities p(y_k) can be calculated.
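As an illustration, the sketch below carries out this calculation numerically for a hypothetical binary X and Y; the prior p(x_j) and the conditional probabilities p(y_k | x_j) are assumed values chosen only for the example, not taken from the slides.

```python
import numpy as np

# Hypothetical binary example; the probabilities are assumed values for illustration.
p_x = np.array([0.6, 0.4])            # prior probabilities p(x_j)
p_y_given_x = np.array([[0.9, 0.1],   # p(y_k | x_0)
                        [0.2, 0.8]])  # p(y_k | x_1)

# Joint distribution: p(x_j, y_k) = p(y_k | x_j) p(x_j)
p_xy = p_y_given_x * p_x[:, None]

# Marginal of Y: p(y_k) = sum over j of p(y_k | x_j) p(x_j)
p_y = p_xy.sum(axis=0)

print(p_xy)  # joint probabilities p(x_j, y_k)
print(p_y)   # marginal probabilities p(y_k)
```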


Conditional entropy
Entropy of a RV given another RV

➢ The conditional entropy of X, given Y = y_k, is

H(X | Y = y_k) = Σ_j p(x_j | y_k) log2( 1 / p(x_j | y_k) )

➢ This is a random variable taking the values H(X | Y = y_0), ..., H(X | Y = y_{K-1}) with probabilities p(y_0), ..., p(y_{K-1}) respectively.

➢ The expectation of the entropy H(X | Y = y_k) over the RV Y is therefore given by

H(X | Y) = Σ_k p(y_k) H(X | Y = y_k) = Σ_k Σ_j p(x_j, y_k) log2( 1 / p(x_j | y_k) )
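A minimal sketch of this computation, reusing the same assumed example distribution as above (the numbers are illustrative only):

```python
import numpy as np

# Same assumed example distribution as above.
p_x = np.array([0.6, 0.4])
p_y_given_x = np.array([[0.9, 0.1],
                        [0.2, 0.8]])
p_xy = p_y_given_x * p_x[:, None]   # joint p(x_j, y_k)
p_y = p_xy.sum(axis=0)              # marginal p(y_k)
p_x_given_y = p_xy / p_y            # posterior p(x_j | y_k); each column sums to 1

# H(X | Y = y_k) = sum over j of p(x_j | y_k) log2( 1 / p(x_j | y_k) )
H_X_given_yk = -(p_x_given_y * np.log2(p_x_given_y)).sum(axis=0)

# H(X | Y) = sum over k of p(y_k) H(X | Y = y_k)
H_X_given_Y = (p_y * H_X_given_yk).sum()

print(H_X_given_yk)  # one entropy value per outcome y_k
print(H_X_given_Y)   # average conditional entropy H(X | Y)
```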

The conditional entropy H(X | Y) is the average amount of uncertainty remaining about the RV X after the RV Y has been observed.
Mutual Information

➢ The entropy H(X) accounts for the uncertainty about X before observing Y, and the conditional entropy H(X|Y) accounts for the uncertainty about X after observing Y; hence

I(X ; Y) = H(X) − H(X | Y)

➢ The mutual information I(X;Y) is a measure of the uncertainty about a RV X that is resolved by observing another RV Y.
➢ We use the formula for entropy,

H(X) = Σ_j p(x_j) log2( 1 / p(x_j) )

and the formula for conditional entropy,

H(X | Y) = Σ_k Σ_j p(x_j, y_k) log2( 1 / p(x_j | y_k) )

➢ We know that p(x_j) = Σ_k p(x_j, y_k), so we can write

H(X) = Σ_j Σ_k p(x_j, y_k) log2( 1 / p(x_j) )

➢ Substituting H(X) and H(X | Y) in the expression for I(X;Y) we get

I(X ; Y) = Σ_j Σ_k p(x_j, y_k) log2( p(x_j | y_k) / p(x_j) )

➢ We can also define

I(Y ; X) = H(Y) − H(Y | X)

➢ The mutual information I(Y;X) is a measure of the uncertainty about a RV Y that is resolved by observing another RV X.

Properties of Mutual Information

1. Symmetry

The mutual information is symmetric in the sense that

I(X ; Y) = I(Y ; X)

2. Nonnegativity

I(X ; Y) ≥ 0

3. Expansion of the mutual information

I(X ; Y) = H(X) + H(Y) − H(X, Y)

Note: H(X, Y) = H(Y, X)
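These three properties can be checked numerically for the same assumed example distribution used in the earlier sketches; a minimal sketch:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability array (zero entries ignored)."""
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

# Same assumed example distribution as above.
p_x = np.array([0.6, 0.4])
p_y_given_x = np.array([[0.9, 0.1],
                        [0.2, 0.8]])
p_xy = p_y_given_x * p_x[:, None]                          # joint p(x_j, y_k)
p_y = p_xy.sum(axis=0)                                     # marginal p(y_k)

H_X, H_Y, H_XY = entropy(p_x), entropy(p_y), entropy(p_xy.ravel())
H_X_given_Y = -(p_xy * np.log2(p_xy / p_y)).sum()          # H(X | Y)
H_Y_given_X = -(p_xy * np.log2(p_xy / p_x[:, None])).sum() # H(Y | X)

I_XY = H_X - H_X_given_Y
I_YX = H_Y - H_Y_given_X

print(I_XY, I_YX)            # 1. Symmetry: the two values agree
print(I_XY >= 0)             # 2. Nonnegativity: True
print(H_X + H_Y - H_XY)      # 3. Expansion: equals I(X;Y)
```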
