PRINCIPLES OF COMMUNICATION SYSTEMS
Differential Entropy
Objectives
• To discuss
– Joint entropy
– Conditional entropy
– Differential entropy
Joint Entropy
• The joint entropy of two discrete random variables (X, Y) is defined as
H(X, Y) = \sum_{k=0}^{K-1} \sum_{j=0}^{J-1} p(x_j, y_k) \log_2 \frac{1}{p(x_j, y_k)}
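As a quick numerical sketch of the definition above (the joint pmf values below are illustrative, not from the slides):

```python
import math

# Illustrative joint pmf p(x_j, y_k) for two binary random variables
# (example values: uniform over the four outcomes)
p_xy = [[0.25, 0.25],
        [0.25, 0.25]]

def joint_entropy(p):
    """H(X, Y) = sum_k sum_j p(x_j, y_k) * log2(1 / p(x_j, y_k))."""
    return sum(pjk * math.log2(1.0 / pjk)
               for row in p for pjk in row if pjk > 0)

print(joint_entropy(p_xy))  # uniform over 4 outcomes -> 2.0 bits
```

For the uniform joint pmf, all four outcomes are equally likely, so H(X, Y) = log2(4) = 2 bits.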
Conditional Entropy
• The conditional entropy of random variable X, given Y, is defined as
H(X \mid Y) = \sum_{k=0}^{K-1} \sum_{j=0}^{J-1} p(x_j, y_k) \log_2 \frac{1}{p(x_j \mid y_k)}
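A minimal numerical sketch of this definition, using p(x_j | y_k) = p(x_j, y_k) / p(y_k) (the joint pmf values are illustrative, not from the slides):

```python
import math

# Illustrative joint pmf p(x_j, y_k); rows index x_j, columns index y_k
p_xy = [[0.5, 0.0],
        [0.25, 0.25]]

def conditional_entropy(p):
    """H(X|Y) = sum_k sum_j p(x_j, y_k) * log2(1 / p(x_j | y_k))."""
    K = len(p[0])
    # marginal p(y_k) = sum_j p(x_j, y_k)
    p_y = [sum(row[k] for row in p) for k in range(K)]
    h = 0.0
    for row in p:
        for k, pjk in enumerate(row):
            if pjk > 0:
                # log2(1 / p(x|y)) = log2(p(y) / p(x, y))
                h += pjk * math.log2(p_y[k] / pjk)
    return h

print(conditional_entropy(p_xy))
```

When X and Y are independent, H(X | Y) reduces to H(X), since knowing Y tells us nothing about X.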
• The chain rule relates joint and conditional entropy:
H(X, Y) = H(Y) + H(X \mid Y)
H(X, Y) = H(X) + H(Y \mid X)
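The chain rule can be verified numerically for any joint pmf; this sketch uses an illustrative (non-uniform) example, not values from the slides:

```python
import math

def entropy(probs):
    """H = sum p * log2(1/p) over nonzero probabilities."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Illustrative joint pmf p(x_j, y_k) (example values)
p_xy = [[0.125, 0.375],
        [0.25, 0.25]]

h_joint = entropy([p for row in p_xy for p in row])            # H(X, Y)
p_y = [sum(row[k] for row in p_xy) for k in range(2)]          # marginal p(y_k)
h_y = entropy(p_y)                                             # H(Y)
# H(X|Y), using p(x|y) = p(x, y) / p(y)
h_x_given_y = sum(p_xy[j][k] * math.log2(p_y[k] / p_xy[j][k])
                  for j in range(2) for k in range(2) if p_xy[j][k] > 0)

print(abs(h_joint - (h_y + h_x_given_y)) < 1e-12)  # chain rule holds
```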
Differential Entropy
• The differential entropy of a continuous random variable X with pdf f(x) is
h(X) = \int_{-\infty}^{\infty} f(x) \log_2 \frac{1}{f(x)} \, dx
• The chain rule and mutual information carry over to the continuous case:
h(X, Y) = h(Y) + h(X \mid Y)
I(X; Y) = h(Y) - h(Y \mid X)
• Scaling a random variable shifts its differential entropy:
h(aX) = h(X) + \log_2 |a|
Differential Entropy of a Gaussian Random Variable
• For a zero-mean Gaussian random variable X with pdf
f(x) = \frac{1}{\sqrt{2\pi\sigma^2}} \, e^{-x^2 / 2\sigma^2}
• Substituting f(x) into the definition of h(X):
h(X) = \int_{-\infty}^{\infty} f(x) \left[ \frac{1}{2} \log_2 (2\pi\sigma^2) + \frac{x^2}{2\sigma^2} \log_2 e \right] dx
h(X) = \frac{1}{2} \log_2 (2\pi\sigma^2) \int_{-\infty}^{\infty} f(x) \, dx + \frac{\log_2 e}{2\sigma^2} \int_{-\infty}^{\infty} x^2 f(x) \, dx
• Since \int f(x) \, dx = 1 and \int x^2 f(x) \, dx = \sigma^2:
h(X) = \frac{1}{2} \log_2 (2\pi\sigma^2) + \frac{\log_2 e}{2\sigma^2} \, \sigma^2
h(X) = \frac{1}{2} \log_2 (2\pi\sigma^2) + \frac{1}{2} \log_2 e
h(X) = \frac{1}{2} \log_2 (2\pi e \sigma^2)
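The closed form h(X) = (1/2) log2(2πeσ²) can be checked by numerically integrating the definition of differential entropy; this is a self-contained sketch (σ = 1.5 is an illustrative choice):

```python
import math

sigma = 1.5  # illustrative standard deviation

def f(x):
    """Zero-mean Gaussian pdf with variance sigma^2."""
    return math.exp(-x * x / (2 * sigma**2)) / math.sqrt(2 * math.pi * sigma**2)

# Trapezoidal integration of -f(x) * log2(f(x)) over a wide range;
# the Gaussian tails beyond +/-10 sigma contribute negligibly.
a, b, n = -10 * sigma, 10 * sigma, 200000
dx = (b - a) / n
h_numeric = 0.0
for i in range(n + 1):
    x = a + i * dx
    fx = f(x)
    term = -fx * math.log2(fx) if fx > 0 else 0.0
    h_numeric += term * (0.5 if i in (0, n) else 1.0)
h_numeric *= dx

h_closed = 0.5 * math.log2(2 * math.pi * math.e * sigma**2)
print(abs(h_numeric - h_closed) < 1e-6)  # the derivation's closed form matches
```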
Summary
• The following topics were discussed in detail:
– Joint entropy
– Conditional entropy
– Differential entropy
Test your understanding
Prove that H(X, Y) = H(Y) + H(X \mid Y)
References
1. Proakis, J. G. and Salehi, M., Fundamentals of Communication Systems, Pearson Education, Second Edition, 2006.
2. Haykin, S., Digital Communications, John Wiley, 2005.
3. https://nptel.ac.in/courses
Thank you !