
Joint Entropy

• Two sources of information X and Y giving messages x1, x2, x3…xm
and y1, y2, y3…yn respectively.
• Can be inputs and outputs of a communication system.
• Joint event of X and Y considered.
• Should have a complete probability scheme, i.e. the sum over all possible combinations of the joint event of X and Y should be 1:

\sum_{i=1}^{m} \sum_{j=1}^{n} p(x_i, y_j) = 1
• The joint entropy is calculated in the same way as the marginal entropy.
• Information delivered when one pair (xi, yj) occurs once is -log p(xi, yj).
• The number of times this pair can occur is Nij out of a total of N.
• Information for the Nij occurrences of this particular combination is -Nij log p(xi, yj).
• Total information over all possible combinations of i and j is

-\sum_{i=1}^{m} \sum_{j=1}^{n} N_{ij} \log p(x_i, y_j)
Joint Entropy
• H(X,Y) = (Total information)/N, with p(xi, yj) = Nij/N.
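As a quick numeric sketch of this counting argument (the pair counts Nij below are made up purely for illustration), the average information obtained as Total/N agrees with the -Σ p log p form given next:

```python
import numpy as np

# Hypothetical pair counts N_ij (illustrative values only, not from the notes).
N_ij = np.array([[40, 10],
                 [20, 30]])
N = N_ij.sum()                        # total number of observed pairs

p_ij = N_ij / N                       # p(xi, yj) = Nij / N

# Total information = -sum over i,j of Nij * log2 p(xi, yj); divide by N for the average.
total_information = -(N_ij * np.log2(p_ij)).sum()
H_from_counts = total_information / N

# The same value from H(X,Y) = -sum p(xi, yj) log2 p(xi, yj).
H_formula = -(p_ij * np.log2(p_ij)).sum()

print(H_from_counts, H_formula)       # both ≈ 1.846 bits per pair
```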

• JOINT ENTROPY H(X,Y):

H(X,Y) = -\sum_{i=1}^{m} \sum_{j=1}^{n} p(x_i, y_j) \log p(x_i, y_j)
• ALSO,

\sum_{i=1}^{m} p(x_i, y_j) = p(y_j)

\sum_{j=1}^{n} p(x_i, y_j) = p(x_i)
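A minimal sketch of the completeness check and the two marginalisation identities above, assuming an illustrative joint probability matrix P (not taken from the notes):

```python
import numpy as np

# Assumed joint probabilities p(xi, yj): rows index xi (i = 1..m), columns index yj (j = 1..n).
P = np.array([[0.25, 0.10, 0.05],
              [0.05, 0.30, 0.25]])

# Complete probability scheme: the joint probabilities sum to 1.
assert np.isclose(P.sum(), 1.0)

p_y = P.sum(axis=0)    # p(yj) = sum over i of p(xi, yj)  (column sums)
p_x = P.sum(axis=1)    # p(xi) = sum over j of p(xi, yj)  (row sums)

print("p(yj):", p_y)   # [0.3 0.4 0.3]
print("p(xi):", p_x)   # [0.4 0.6]
```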
Joint Entropy

• H(X) = Average information per character at the source, i.e. the entropy of the source.
• H(Y) = Average information per character at the destination, i.e. the entropy at the receiver.
• H(X,Y) = Average information per pair of transmitted and received characters, i.e. the average uncertainty of the communication system as a whole.
• In H(X,Y), both X and Y are uncertain.
• H(Y/X) = A specific xi is transmitted (known); one of the permissible yj is received with a given probability. This conditional entropy is the measure of information about the receiver given that a particular xi was transmitted. It represents the NOISE or ERROR in the channel.
• H(X/Y) = A specific yj is received (known); it may be the result of one of the xi with a given probability. This conditional entropy is the measure of information about the source given that a particular yj was received. It is the measure of equivocation, i.e. how well the input content can be recovered from the output.
Conditional Entropy
• A complete probability scheme is required.
• Bayes' theorem:
p(x_i, y_j) = p(x_i) p(y_j/x_i) = p(y_j) p(x_i/y_j)
• For a particular yj received, it can be ONLY from one of x1, x2, x3, …, xm:
p(x1/yj) + p(x2/yj) + p(x3/yj) + … + p(xm/yj) = 1

\sum_{i=1}^{m} p(x_i/y_j) = 1
Similarly,

\sum_{j=1}^{n} p(y_j/x_i) = 1
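The sketch below, again assuming an illustrative joint matrix P, obtains p(xi/yj) and p(yj/xi) from the Bayes relation and confirms that both conditional schemes are complete:

```python
import numpy as np

# Assumed joint probabilities p(xi, yj); rows are xi, columns are yj (illustrative values).
P = np.array([[0.25, 0.10, 0.05],
              [0.05, 0.30, 0.25]])
p_x = P.sum(axis=1)                 # p(xi)
p_y = P.sum(axis=0)                 # p(yj)

P_x_given_y = P / p_y               # p(xi/yj) = p(xi, yj) / p(yj)  (divide each column by p(yj))
P_y_given_x = P / p_x[:, None]      # p(yj/xi) = p(xi, yj) / p(xi)  (divide each row by p(xi))

# A received yj must have come from some xi: each column of p(xi/yj) sums to 1.
print(P_x_given_y.sum(axis=0))      # [1. 1. 1.]
# A transmitted xi must produce some yj: each row of p(yj/xi) sums to 1.
print(P_y_given_x.sum(axis=1))      # [1. 1.]
```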
• Suppose a particular yj is received. Then

H(X/y_j) = -\sum_{i=1}^{m} p(x_i/y_j) \log p(x_i/y_j)
• The average conditional entropy H(X/Y) is obtained by averaging these entropies over all yj.
• The number of times H(X/yj) occurs = the number of times yj occurs = Nyj.
• H(X/Y) = (1/N) (Ny1 · H(X/y1) + Ny2 · H(X/y2) + Ny3 · H(X/y3) + …)

• H(X/Y) = \sum_{j=1}^{n} p(y_j) H(X/y_j)

• H(X/Y) = -\sum_{i=1}^{m} \sum_{j=1}^{n} p(y_j) p(x_i/y_j) \log p(x_i/y_j)

• H(X/Y) = -\sum_{i=1}^{m} \sum_{j=1}^{n} p(x_i, y_j) \log p(x_i/y_j) ;  H(Y/X) = -\sum_{i=1}^{m} \sum_{j=1}^{n} p(x_i, y_j) \log p(y_j/x_i)
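A short sketch, on the same assumed joint matrix P, of averaging H(X/yj) over the received symbols and cross-checking it against the double-sum form of H(X/Y):

```python
import numpy as np

# Assumed joint probabilities p(xi, yj); rows are xi, columns are yj (illustrative values).
P = np.array([[0.25, 0.10, 0.05],
              [0.05, 0.30, 0.25]])
p_y = P.sum(axis=0)                       # p(yj)
P_x_given_y = P / p_y                     # p(xi/yj)

# Entropy of X for each individual received yj: H(X/yj) = -sum_i p(xi/yj) log2 p(xi/yj)
H_X_given_each_y = -(P_x_given_y * np.log2(P_x_given_y)).sum(axis=0)

# Average conditional entropy: H(X/Y) = sum_j p(yj) H(X/yj)
H_X_given_Y = (p_y * H_X_given_each_y).sum()

# Equivalent double-sum form: H(X/Y) = -sum_ij p(xi, yj) log2 p(xi/yj)
H_X_given_Y_alt = -(P * np.log2(P_x_given_y)).sum()

print(H_X_given_Y, H_X_given_Y_alt)       # the two forms agree
```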
RELATION AMONG ENTROPIES
H(X,Y) = H(X) + H(Y/X) = H(Y) + H(X/Y)
• H(X,Y) = -\sum_{i=1}^{m} \sum_{j=1}^{n} p(x_i, y_j) \log p(x_i, y_j)

By Bayes' theorem, p(x_i, y_j) = p(x_i) p(y_j/x_i) = p(y_j) p(x_i/y_j), so

• H(X,Y) = -\sum_{i=1}^{m} \sum_{j=1}^{n} p(x_i, y_j) \log [p(x_i) p(y_j/x_i)]

= -\sum_{i=1}^{m} \sum_{j=1}^{n} p(x_i, y_j) \log p(x_i) - \sum_{i=1}^{m} \sum_{j=1}^{n} p(x_i, y_j) \log p(y_j/x_i)

Since \sum_{j=1}^{n} p(x_i, y_j) = p(x_i), the first term is H(X) and the second term is H(Y/X), giving

H(X,Y) = H(X) + H(Y/X)
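A quick numerical check of this relation, again on an assumed joint matrix P chosen only for illustration:

```python
import numpy as np

# Assumed joint probabilities p(xi, yj) (illustrative values only).
P = np.array([[0.25, 0.10, 0.05],
              [0.05, 0.30, 0.25]])
p_x, p_y = P.sum(axis=1), P.sum(axis=0)            # marginals p(xi), p(yj)

def H(p):
    """Entropy in bits of a probability array (assumes no zero entries)."""
    return -(p * np.log2(p)).sum()

H_XY = H(P)                                        # joint entropy H(X,Y)
H_X, H_Y = H(p_x), H(p_y)                          # marginal entropies
H_Y_given_X = -(P * np.log2(P / p_x[:, None])).sum()   # H(Y/X), the channel noise
H_X_given_Y = -(P * np.log2(P / p_y)).sum()             # H(X/Y), the equivocation

print(np.isclose(H_XY, H_X + H_Y_given_X))         # True
print(np.isclose(H_XY, H_Y + H_X_given_Y))         # True
```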
