Information Theory
Entropy, Joint Entropy, Conditional Entropy, Mutual Information, Source Coding, Channel Coding
❑ The probabilities of the source symbols $y_k$ sum to one: $\sum_{k=1}^{K} P(y_k) = 1$.
❑ Examples: English Alphabet, Morse Code, Braille Code
[Diagram: Data Source emits $X$ → Channel → $Y$. Two extremes: $X$ is independent of $Y$ (the channel conveys no information) and $X$ is completely dependent on $Y$ (the channel conveys full information).]
$H(X) = -\sum_{x \in \mathcal{X}} P(x) \log P(x)$
❑ The entropy of a random variable $X$ is a measure of its uncertainty or ambiguity.
❑ Observing $X$ removes that uncertainty; consequently, $H(X)$ can be thought of as the amount of information acquired by observing $X$.
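As a quick illustration, here is a minimal Python sketch of this definition (the example distribution $P(a_1)=0.9$, $P(a_2)=0.1$ is chosen to match the pair probabilities in the Huffman example below):

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum P(x) log2 P(x), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A binary source with P(a1) = 0.9, P(a2) = 0.1
print(entropy([0.9, 0.1]))  # ~0.469 bits/symbol
print(entropy([0.5, 0.5]))  # 1.0 bit/symbol (maximum uncertainty)
```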
[Figure: the same symbol represented in different codes. Spoken: [ɑ]; written: A, a; sign language; Morse code; Braille code.]
[Huffman code tree for pairs of symbols from data source $X$:]

Pair         Probability   Codeword
$a_1 a_1$    0.81          0
$a_1 a_2$    0.09          10
$a_2 a_1$    0.09          110
$a_2 a_2$    0.01          111

(Tree construction: merge 0.01 and 0.09 into 0.10, then 0.10 and 0.09 into 0.19, then 0.19 and 0.81 into 1, assigning a 0/1 branch label at each merge.)
❑ $\bar{L} = (0.81 \cdot 1 + 0.09 \cdot 2 + 0.09 \cdot 3 + 0.01 \cdot 3)/2 = 1.29/2 = 0.645$ bits/symbol.
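A short sketch verifying this average length (the code table is taken from the example above; the division by 2 reflects that each codeword covers a pair of source symbols):

```python
# Huffman code for symbol pairs, as in the example above
code = {
    "a1a1": (0.81, "0"),
    "a1a2": (0.09, "10"),
    "a2a1": (0.09, "110"),
    "a2a2": (0.01, "111"),
}

# Average codeword length in bits per pair, then per source symbol
bits_per_pair = sum(p * len(cw) for p, cw in code.values())
print(bits_per_pair)      # 1.29 bits/pair
print(bits_per_pair / 2)  # 0.645 bits/symbol
```

The result sits between the source entropy (about 0.469 bits/symbol for $P(a_1)=0.9$, $P(a_2)=0.1$) and the 1 bit/symbol of a fixed-length code, consistent with the source coding theorem.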
$P(\mathbf{y} \mid \mathbf{x}) = \prod_{i=1}^{n} P(y_i \mid x_i)$
❑ This means the channel is memoryless: each output $y_i$ depends only on the corresponding input $x_i$.
[Diagram: binary symmetric channel. Each input bit is received correctly with probability $1-p$ and flipped with probability $p$.]
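To make the product form above concrete, here is a minimal sketch assuming the binary symmetric channel from the diagram:

```python
def bsc_likelihood(x, y, p):
    """P(y|x) over a memoryless binary symmetric channel:
    the product of per-bit transition probabilities."""
    prob = 1.0
    for xi, yi in zip(x, y):
        prob *= p if xi != yi else (1 - p)
    return prob

# Transmit 0110, receive 0100: one bit flipped
print(bsc_likelihood("0110", "0100", p=0.1))  # 0.1 * 0.9**3 = 0.0729
```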
$C' = \max_{p(x)} I(X,Y)$ bits/symbol

$C = C' R_c$ bits/s
❑ $R_c$ is the symbol rate, $C'$ is the maximum mutual information that can be transferred per channel use, and $C$ is the maximum bit rate that can be transmitted.
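As a concrete case, maximizing $I(X,Y)$ for the binary symmetric channel above gives the known closed form $C' = 1 - H_b(p)$; a sketch (the symbol rate value is an assumption for illustration):

```python
import math

def binary_entropy(p):
    """H_b(p) = -p log2 p - (1-p) log2 (1-p)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel, bits/symbol."""
    return 1.0 - binary_entropy(p)

C_prime = bsc_capacity(0.1)  # ~0.531 bits/symbol
Rc = 1000                    # assumed symbol rate, symbols/s
C = C_prime * Rc             # maximum bit rate, bits/s
print(C_prime, C)
```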
$H(Y|X) = -\int\!\!\int P(y \mid x) \log_2 P(y \mid x) \, dy \, P(x) \, dx$

$C' = 0.5 \log_2 \frac{\sigma_y^2}{\sigma_n^2}$

❑ Since the signal and the noise are independent, $\sigma_y^2 = \sigma_x^2 + \sigma_n^2$, so

$C' = 0.5 \log_2\left(1 + \frac{\sigma_x^2}{\sigma_n^2}\right) = 0.5 \log_2\left(1 + \frac{S}{N}\right)$
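A quick numeric check of the last formula (the SNR value is an assumption for illustration):

```python
import math

def gaussian_capacity(snr):
    """C' = 0.5 * log2(1 + S/N), bits per channel symbol."""
    return 0.5 * math.log2(1.0 + snr)

# Example: SNR of 15 (about 11.8 dB)
print(gaussian_capacity(15))  # 2.0 bits/symbol
```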
[Plot: capacity curve dividing the bandwidth-limited region (more spectrally efficient operation) from the power-limited region.]
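If the plot follows the usual capacity chart of spectral efficiency $\eta = C/B$ against $E_b/N_0$ (an assumption here, since only the region labels survive), the dividing curve follows from $C = B \log_2(1 + S/N)$ with $S/N = \eta \, E_b/N_0$, giving $E_b/N_0 = (2^\eta - 1)/\eta$:

```python
import math

def min_ebn0(eta):
    """Minimum Eb/N0 (linear) to operate at spectral efficiency
    eta = C/B bits/s/Hz, from eta = log2(1 + eta * Eb/N0)."""
    return (2.0 ** eta - 1.0) / eta

for eta in (0.1, 1.0, 2.0, 4.0, 8.0):
    ebn0_db = 10.0 * math.log10(min_ebn0(eta))
    print(f"eta = {eta:>4} bits/s/Hz -> Eb/N0 >= {ebn0_db:5.2f} dB")
```

High spectral efficiencies demand sharply more power per bit (the bandwidth-limited region), while as $\eta \to 0$ the required $E_b/N_0$ approaches the $-1.59$ dB Shannon limit (the power-limited region).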