Lecture 2
• Entropy
• Mutual Information
• Uncertainty should be a function of the probabilities p1, · · · , pM:
H(p1, · · · , pM)
• Monotonicity: for M equally likely outcomes, write f(M) = H(1/M, · · · , 1/M). Then M < M′ should give
f(M) < f(M′).
Picking one person at random from a classroom should involve less
uncertainty than picking one at random from the entire US.
• Additivity: choosing one of ML equally likely outcomes in two stages (first one of M groups, then one of L elements within the chosen group) should satisfy
f(ML) = f(M) + f(L)
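A quick sanity check of the two axioms above. The slide's uniqueness result implies that, for uniform outcomes, f(M) = log2 M; the population figures below are made-up stand-ins for the classroom-vs-US comparison.

```python
import math

# For M equally likely outcomes, the uncertainty is f(M) = log2(M)
# (a sketch; the lecture's uniqueness result justifies this form).
def f(M):
    return math.log2(M)

# Monotonicity: M < M' implies f(M) < f(M')
# (illustrative sizes: a 30-person classroom vs. the US population)
assert f(30) < f(330_000_000)

# Additivity: f(M * L) = f(M) + f(L)
M, L = 8, 16
assert math.isclose(f(M * L), f(M) + f(L))
```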
• Grouping rule (Problem 2.27 in Text). Dividing the outcomes into two
groups, first choosing a group (with the group's total probability), and
then picking an element within the chosen group, does not change the
total uncertainty.
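A numeric check of the grouping rule, using the standard identity H(p1, p2, p3) = H(p1, p2 + p3) + (p2 + p3) H(p2/(p2+p3), p3/(p2+p3)); the specific probabilities below are arbitrary illustrative values.

```python
import math

def H(*p):
    """Shannon entropy in bits; 0 log 0 = 0 by convention."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Grouping rule: split the outcomes (p1, p2, p3) into groups {1} and {2, 3}.
p1, p2, p3 = 0.5, 0.3, 0.2
q = p2 + p3  # total probability of the second group

lhs = H(p1, p2, p3)                       # uncertainty all at once
rhs = H(p1, q) + q * H(p2 / q, p3 / q)    # group choice + within-group choice
assert math.isclose(lhs, rhs)
```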
• The only function that satisfies the requirements is the entropy function
H(p1, · · · , pM) = − ∑_{i=1}^{M} pi log2 pi
• Convention: 0 log 0 = 0
• Binary entropy H(p) = −p log2 p − (1 − p) log2(1 − p):
– Concave in p
– Maximized at p = 1/2, where H(1/2) = 1 bit
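The two properties above can be checked numerically; this is a minimal sketch (a grid search for the maximum and a midpoint test for concavity), not a proof.

```python
import math

def Hb(p):
    """Binary entropy in bits, with the 0 log 0 = 0 convention."""
    return -sum(x * math.log2(x) for x in (p, 1 - p) if x > 0)

# Maximized at p = 1/2, where Hb = 1 bit
assert math.isclose(Hb(0.5), 1.0)
assert all(Hb(0.5) >= Hb(0.01 * i) for i in range(1, 100))

# Concavity (midpoint check): Hb((a+b)/2) >= (Hb(a) + Hb(b)) / 2
a, b = 0.2, 0.7
assert Hb((a + b) / 2) >= (Hb(a) + Hb(b)) / 2
```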
Example: how to ask questions?
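One common version of this example (assumed here, since the slide's details did not survive extraction): X takes 4 values with pmf (1/2, 1/4, 1/8, 1/8), and we identify X with yes/no questions, asking about the most likely values first. The expected number of questions then equals the entropy.

```python
import math

# Hypothetical illustration: X has pmf (1/2, 1/4, 1/8, 1/8).
# Ask "Is X = a?"; if no, "Is X = b?"; if no, "Is X = c?".
# The number of questions for each outcome (tree depths) is 1, 2, 3, 3.
pmf = [1/2, 1/4, 1/8, 1/8]
depths = [1, 2, 3, 3]

expected_questions = sum(p * d for p, d in zip(pmf, depths))
entropy = -sum(p * math.log2(p) for p in pmf)
assert math.isclose(expected_questions, entropy)  # both equal 1.75
```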
Proof:
[Figure: diagram relating H(X) and H(Y)]
Joint pmf p(x, y) (x across columns, y down rows):
1/8  1/16 1/32 1/32
1/16 1/8  1/32 1/32
1/16 1/16 1/16 1/16
1/4  0    0    0
X marginal: (1/2, 1/4, 1/8, 1/8)
Y marginal: (1/4, 1/4, 1/4, 1/4)
H(X) = 7/4 bits, H(Y) = 2 bits
Conditional entropy: H(X|Y) = 11/8 bits, H(Y|X) = 13/8 bits
H(Y|X) ≠ H(X|Y)
Mutual information: I(X; Y) = H(X) − H(X|Y) = 0.375 bit
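The numbers in this example can be verified directly. The joint pmf below is assumed to be the standard table from the text (Cover & Thomas); it is consistent with the marginals, conditional entropies, and mutual information stated on this slide.

```python
import math
from fractions import Fraction as F

# Assumed joint pmf p(x, y); rows indexed by y, columns by x.
P = [
    [F(1, 8),  F(1, 16), F(1, 32), F(1, 32)],
    [F(1, 16), F(1, 8),  F(1, 32), F(1, 32)],
    [F(1, 16), F(1, 16), F(1, 16), F(1, 16)],
    [F(1, 4),  F(0),     F(0),     F(0)],
]

def H(p):
    """Shannon entropy in bits; 0 log 0 = 0 by convention."""
    return -sum(float(x) * math.log2(x) for x in p if x > 0)

px = [sum(row[j] for row in P) for j in range(4)]   # X marginal
py = [sum(row) for row in P]                        # Y marginal

HX, HY = H(px), H(py)
HXY = H([x for row in P for x in row])              # joint entropy
HX_given_Y = HXY - HY                               # chain rule
HY_given_X = HXY - HX
I = HX + HY - HXY                                   # mutual information

assert math.isclose(HX, 7 / 4) and math.isclose(HY, 2.0)
assert math.isclose(HX_given_Y, 11 / 8) and math.isclose(HY_given_X, 13 / 8)
assert math.isclose(I, 0.375)
```

Note that H(X|Y) ≠ H(Y|X) here, while I(X; Y) = H(X) − H(X|Y) = H(Y) − H(Y|X) is symmetric.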
Dr. Yao Xie, ECE587, Information Theory, Duke University
[Figure: entropy and mutual information for sequences X1, · · · , Xn and Y1, · · · , Ym, showing
H(X1, · · · , Xn), H(Y1, · · · , Ym | X1, · · · , Xn), H(Y1, · · · , Ym),
and I(X1, · · · , Xn; Y1, · · · , Ym)]