http://ocw.mit.edu
For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.
16.36: Communication Systems Engineering
Lecture 3
Eytan Modiano
Information content of a random variable
Random variable X: the outcome of a random experiment.
A discrete R.V. takes on values from a finite set of possible outcomes (e.g., a fair die takes values in $\{1,\ldots,6\}$, each with probability $1/6$).
PMF: $P(X = y) = P_X(y)$
Measure of Information
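As a sketch of the standard measure used in the rest of the lecture (supplied here for reference, consistent with the entropy sums on the later slides):

$$I(x) = \log_2\!\left(\frac{1}{P_X(x)}\right) = -\log_2 P_X(x) \quad \text{bits},$$

so rare outcomes carry more information; a certain event ($P_X(x) = 1$) carries 0 bits, and a fair coin flip carries 1 bit per outcome.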
Entropy
$X \in \{x_1, \ldots, x_M\}$
Binary example: $X = x_1$ with probability $p$; $X = x_2$ with probability $(1-p)$.
Not surprising that the result of a binary experiment can be conveyed using one bit, as the sketch below makes precise.
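To make the one-bit claim concrete, here is the entropy definition together with the binary case (consistent with the sums used in the proof below):

$$H(X) = \sum_{i=1}^{M} p_i \log_2\!\left(\frac{1}{p_i}\right), \qquad H_{\text{binary}}(p) = p\log_2\frac{1}{p} + (1-p)\log_2\frac{1}{1-p},$$

which is maximized at $p = 1/2$, where $H(X) = 1$ bit: one bit suffices exactly when the two outcomes are equally likely.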
Bounds on entropy: $0 \le H(X) \le \log_2(M)$ (A: lower bound; B: upper bound)
Proof of A is obvious: every term $P_i \log_2(1/P_i)$ is nonnegative.
Proof of B requires the log inequality $\ln(y) \le y - 1$, with equality iff $y = 1$.
[Figure: plot of $\ln(x)$ against the line $y = x - 1$.]
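A quick numeric spot-check of these bounds (a minimal Python sketch; the example PMFs are illustrative, not from the slides):

```python
import math

def entropy(pmf):
    """H(X) in bits: sum of p * log2(1/p), skipping zero-probability outcomes."""
    return sum(p * math.log2(1.0 / p) for p in pmf if p > 0)

M = 4
pmfs = {
    "uniform": [1.0 / M] * M,          # should hit the upper bound log2(M)
    "certain": [1.0, 0.0, 0.0, 0.0],   # should hit the lower bound 0
    "skewed":  [0.5, 0.25, 0.125, 0.125],
}
for name, pmf in pmfs.items():
    H = entropy(pmf)
    assert -1e-12 <= H <= math.log2(M) + 1e-12
    print(f"{name}: H = {H:.3f} bits (log2(M) = {math.log2(M):.3f})")
```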
Proof, continued
Consider the sum

$$\sum_{i=1}^{M} P_i \log\!\left(\frac{1}{M P_i}\right) = \frac{1}{\ln 2}\sum_{i=1}^{M} P_i \ln\!\left(\frac{1}{M P_i}\right) \le \frac{1}{\ln 2}\sum_{i=1}^{M} P_i\!\left(\frac{1}{M P_i} - 1\right) = \frac{1}{\ln 2}\sum_{i=1}^{M}\left(\frac{1}{M} - P_i\right) = 0,$$

by the log inequality, with equality when $P_i = \frac{1}{M}$.

Writing this in another way:

$$\sum_{i=1}^{M} P_i \log\!\left(\frac{1}{M P_i}\right) = \sum_{i=1}^{M} P_i \log\!\left(\frac{1}{P_i}\right) + \sum_{i=1}^{M} P_i \log\!\left(\frac{1}{M}\right) \le 0, \quad \text{equality when } P_i = \frac{1}{M}.$$

That is,

$$\sum_{i=1}^{M} P_i \log\!\left(\frac{1}{P_i}\right) \le \sum_{i=1}^{M} P_i \log(M) = \log(M).$$
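The key step above can be spot-checked numerically; a minimal sketch with randomly drawn PMFs (the sampling is illustrative only):

```python
import math
import random

random.seed(0)
M = 5
for _ in range(1000):
    w = [random.random() for _ in range(M)]
    P = [x / sum(w) for x in w]
    # Key inequality from the proof: sum_i P_i * log2(1/(M*P_i)) <= 0
    s = sum(p * math.log2(1.0 / (M * p)) for p in P)
    assert s <= 1e-12

# Equality case: the uniform PMF P_i = 1/M gives exactly 0
uniform_sum = sum((1.0 / M) * math.log2(1.0 / (M * (1.0 / M))) for _ in range(M))
print("uniform case:", uniform_sum)  # prints 0.0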
Joint Entropy
Joint entropy:

$$H(X,Y) = \sum_{x,y} p(x,y)\log\!\left(\frac{1}{p(x,y)}\right)$$

Conditional entropy, given a particular observation $Y = y$:

$$H(X \mid Y = y) = \sum_{x} p(x \mid Y = y)\log\!\left(\frac{1}{p(x \mid Y = y)}\right)$$

Conditional entropy, averaged over $Y$:

$$H(X \mid Y) = \sum_{x,y} p(x,y)\log\!\left(\frac{1}{p(x \mid Y = y)}\right)$$

With multiple variables:

$$H(X_n \mid X_1,\ldots,X_{n-1}) = \sum_{x_1,\ldots,x_n} p(x_1,\ldots,x_n)\log\!\left(\frac{1}{p(x_n \mid x_1,\ldots,x_{n-1})}\right)$$
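These definitions are easy to exercise on a small joint PMF; a Python sketch (the joint distribution below is a made-up example) that also checks the two-variable chain rule $H(X,Y) = H(Y) + H(X \mid Y)$ used on the next slide:

```python
import math

# Hypothetical joint PMF p(x, y) for X, Y in {0, 1}; values chosen for illustration.
p = {(0, 0): 0.40, (0, 1): 0.10,
     (1, 0): 0.20, (1, 1): 0.30}

def H_joint(p):
    """H(X,Y) = sum_{x,y} p(x,y) log2(1/p(x,y))."""
    return sum(q * math.log2(1.0 / q) for q in p.values() if q > 0)

def marginal(p, axis):
    m = {}
    for xy, q in p.items():
        m[xy[axis]] = m.get(xy[axis], 0.0) + q
    return m

def H_marginal(p, axis):
    return sum(q * math.log2(1.0 / q) for q in marginal(p, axis).values() if q > 0)

def H_X_given_Y(p):
    """H(X|Y) = sum_{x,y} p(x,y) log2(1/p(x|y)), with p(x|y) = p(x,y)/p(y)."""
    py = marginal(p, axis=1)
    return sum(q * math.log2(py[y] / q) for (x, y), q in p.items() if q > 0)

# Chain rule: H(X,Y) = H(Y) + H(X|Y)
assert abs(H_joint(p) - (H_marginal(p, 1) + H_X_given_Y(p))) < 1e-12
print(f"H(X,Y) = {H_joint(p):.3f} bits, H(X|Y) = {H_X_given_Y(p):.3f} bits")
```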
Rules for entropy
1. Chain rule: $H(X_1,\ldots,X_n) = H(X_1) + H(X_2 \mid X_1) + \cdots + H(X_n \mid X_1,\ldots,X_{n-1})$
4. $H(X_1,\ldots,X_n) \le H(X_1) + H(X_2) + \cdots + H(X_n)$ (with equality iff the $X_i$ are independent)
Proof: use the chain rule and notice that $H(X \mid Y) \le H(X)$:
entropy is not increased by additional information.
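Spelling out that proof step (a one-line sketch combining the two rules above):

$$H(X_1,\ldots,X_n) = \sum_{i=1}^{n} H(X_i \mid X_1,\ldots,X_{i-1}) \le \sum_{i=1}^{n} H(X_i),$$

since conditioning on more variables never increases entropy.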
Mutual Information
X, Y random variables
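For reference, the standard definition in terms of the entropies above:

$$I(X;Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X) = H(X) + H(Y) - H(X,Y),$$

the reduction in uncertainty about $X$ from observing $Y$; $I(X;Y) \ge 0$, with equality if and only if $X$ and $Y$ are independent.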