
UEC 1404

PRINCIPLES OF COMMUNICATION SYSTEMS
Differential Entropy
Objectives
• To discuss
– Joint entropy
– Conditional entropy
– Differential entropy
Joint Entropy
• The joint entropy of two discrete random variables (X, Y) is defined as

H(X, Y) = \sum_{k=0}^{K-1} \sum_{j=0}^{J-1} p(x_j, y_k) \log_2 \frac{1}{p(x_j, y_k)}
Conditional Entropy
• The conditional entropy of random variable X, given Y, is defined as

H(X \mid Y) = \sum_{k=0}^{K-1} \sum_{j=0}^{J-1} p(x_j, y_k) \log_2 \frac{1}{p(x_j \mid y_k)}

• Similarly, the conditional entropy of random variable Y, given X, is defined as

H(Y \mid X) = \sum_{k=0}^{K-1} \sum_{j=0}^{J-1} p(x_j, y_k) \log_2 \frac{1}{p(y_k \mid x_j)}
Relation between Conditional and Joint Entropies
• The joint and conditional entropies of random variables X and Y are related as

H(X, Y) = H(Y) + H(X \mid Y)

H(X, Y) = H(X) + H(Y \mid X)
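As a quick numeric sanity check (a Python sketch with an arbitrary 2×2 joint pmf chosen for illustration, not taken from the slides), the chain rule H(X, Y) = H(Y) + H(X | Y) can be verified directly from the definitions above:

```python
import math

# Example joint pmf p(x_j, y_k) -- arbitrary values chosen for illustration
p = [[0.4, 0.1],   # p(x_0, y_0), p(x_0, y_1)
     [0.2, 0.3]]   # p(x_1, y_0), p(x_1, y_1)

def H_joint(p):
    # H(X, Y) = sum over j,k of p(x_j, y_k) * log2(1 / p(x_j, y_k))
    return sum(q * math.log2(1 / q) for row in p for q in row if q > 0)

def H_marg(probs):
    # entropy of a marginal distribution
    return sum(q * math.log2(1 / q) for q in probs if q > 0)

p_y = [sum(p[j][k] for j in range(2)) for k in range(2)]  # marginal p(y_k)

# H(X | Y) = sum over j,k of p(x_j, y_k) * log2(p(y_k) / p(x_j, y_k))
H_XgY = sum(p[j][k] * math.log2(p_y[k] / p[j][k])
            for j in range(2) for k in range(2) if p[j][k] > 0)

print(round(H_joint(p), 6))           # H(X, Y)
print(round(H_marg(p_y) + H_XgY, 6))  # H(Y) + H(X|Y): same value
```

Both printed values agree, illustrating the relation for this example pmf.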
Differential Entropy
• The differential entropy of a continuous random variable X, with probability density function f_X(x), is denoted by h(X) and is defined as

h(X) = -\int_{-\infty}^{\infty} f_X(x) \log f_X(x)\, dx

• The joint and conditional differential entropies, and the mutual information, satisfy

h(X, Y) = h(Y) + h(X \mid Y)

I(X; Y) = h(Y) - h(Y \mid X)
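As a sketch of the relation I(X; Y) = h(Y) − h(Y | X), consider an additive Gaussian noise channel Y = X + N (an assumed example, not from the slides), with X and N independent zero-mean Gaussians. Using the Gaussian differential-entropy formula derived later in these slides:

```python
import math

# Assumed example: Y = X + N with X ~ N(0, sx2), N ~ N(0, sn2), independent
sx2, sn2 = 4.0, 1.0   # arbitrary variances chosen for illustration

def h_gauss(var):
    # differential entropy (in bits) of a Gaussian: (1/2) log2(2*pi*e*var)
    return 0.5 * math.log2(2 * math.pi * math.e * var)

h_Y = h_gauss(sx2 + sn2)    # Y is Gaussian with variance sx2 + sn2
h_YgX = h_gauss(sn2)        # given X = x, Y = x + N, so h(Y|X) = h(N)
I = h_Y - h_YgX             # mutual information I(X; Y)

print(round(I, 6))          # equals (1/2) log2(1 + sx2/sn2)
```

The result reduces to (1/2) log2(1 + sx2/sn2), the familiar Gaussian-channel form.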
Differential Entropy
• Determine the differential entropy of a random variable X uniformly distributed on [0, a].

h(X) = -\int_{0}^{a} \frac{1}{a} \log \frac{1}{a}\, dx = \log a
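A numeric check of this result (Python sketch; the interval length a = 4 and the grid size are arbitrary choices, not from the slides) approximates the defining integral with a midpoint rule and compares it with log2 a:

```python
import math

a = 4.0        # arbitrary interval length (assumption for illustration)
n = 100000     # number of quadrature subintervals
dx = a / n
f = 1.0 / a    # uniform density on [0, a]

# midpoint-rule approximation of h(X) = -integral of f(x) log2 f(x) dx
h = sum(-f * math.log2(f) * dx for _ in range(n))

print(round(h, 6))   # matches log2(a) = 2.0 for a = 4
```

Since the density is constant, the sum equals −f log2(f) · a = log2 a up to floating-point rounding.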
Differential Entropy
• Determine the differential entropy of a zero-mean Gaussian random variable with variance \sigma^2.

f(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-x^2 / (2\sigma^2)}

h(X) = -\int_{-\infty}^{\infty} \log\!\left[ \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-x^2 / (2\sigma^2)} \right] f(x)\, dx

h(X) = -\int_{-\infty}^{\infty} \left[ \log \frac{1}{\sqrt{2\pi\sigma^2}} - \frac{x^2}{2\sigma^2} \log e \right] f(x)\, dx
Differential Entropy

h(X) = -\int_{-\infty}^{\infty} \log \frac{1}{\sqrt{2\pi\sigma^2}}\, f(x)\, dx + \int_{-\infty}^{\infty} \frac{x^2}{2\sigma^2} \log e\, f(x)\, dx

h(X) = \frac{1}{2} \log 2\pi\sigma^2 + \frac{\log e}{2\sigma^2} \int_{-\infty}^{\infty} x^2 f(x)\, dx

h(X) = \frac{1}{2} \log 2\pi\sigma^2 + \frac{\log e}{2\sigma^2}\, \sigma^2

h(X) = \frac{1}{2} \log 2\pi e \sigma^2
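The closed form above can be checked numerically (Python sketch; the variance, integration range, and grid size are arbitrary choices, not from the slides) by approximating −∫ f(x) log2 f(x) dx with a midpoint rule:

```python
import math

sigma2 = 2.0   # arbitrary variance (assumption for illustration)

def f(x):
    # zero-mean Gaussian density with variance sigma2
    return math.exp(-x * x / (2 * sigma2)) / math.sqrt(2 * math.pi * sigma2)

# midpoint-rule approximation of h(X) = -integral of f(x) log2 f(x) dx (bits)
lo, hi, n = -30.0, 30.0, 200000
dx = (hi - lo) / n
h_numeric = 0.0
x = lo + 0.5 * dx
for _ in range(n):
    fx = f(x)
    if fx > 0:
        h_numeric += -fx * math.log2(fx) * dx
    x += dx

h_closed = 0.5 * math.log2(2 * math.pi * math.e * sigma2)
print(round(h_numeric, 6), round(h_closed, 6))  # the two values agree
```

The numeric estimate matches (1/2) log2(2πeσ²) to high precision for this variance.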
Summary
• The following topics were discussed in detail:
– Joint entropy
– Conditional entropy
– Differential entropy
Test your understanding

Prove that H(X, Y) = H(Y) + H(X \mid Y).
References
1. J. G. Proakis and M. Salehi, Fundamentals of Communication Systems, Pearson Education, Second Edition, 2006.
2. S. Haykin, Digital Communications, John Wiley, 2005.
3. https://nptel.ac.in/courses
Thank you!
