
Module 5: Lecture 39

Information Theory: Channel Capacity

Dr. S. M. Zafaruddin


Associate Professor
Dept. of EEE, BITS Pilani, Pilani Campus



Objectives of Today's Lecture

Source Coding: Examples


Channel Capacity



Measure of Information

I = log2(1/P) bits/message, the self-information of a message occurring with probability P.


Entropy: average information.
H(m) = −∑_{i=1}^{n} Pi log2 Pi bits/message.
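A quick numeric check of these two definitions (a minimal sketch; the example probabilities are illustrative, not from the slides):

```python
import math

def entropy(probs):
    """Average information H(m) = -sum(Pi * log2(Pi)) in bits/message."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Self-information of a single message with probability P = 1/4:
P = 0.25
print(math.log2(1 / P))            # 2.0 bits

# Entropy of a three-message source:
print(entropy([0.5, 0.25, 0.25]))  # 1.5 bits/message
```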



Variable Length Encoding: Example

Encode the following source using fixed-length and variable-length (Shannon-Fano and Huffman/compact) codes:
A with PA = 0.20, B with PB = 0.20, C with PC = 0.10, D with PD = 0.50.



Solution
H(m) = −PD log2(PD) − PA log2(PA) − PB log2(PB) − PC log2(PC) = 1.76 bits/message.
Shannon-Fano:
D=0
A = 11
B = 100
C = 101
Huffman:
D=0
A = 11
B = 100
C = 101
L̄ = 1 × PD + 2 × PA + 3 × PB + 3 × PC = 1.8 bits/message.
Efficiency = H(m)/L̄ = 1.76/1.8 = 97.78%.
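The average length and efficiency can be checked programmatically. A minimal Huffman sketch using Python's heapq (tie-breaking may produce codewords different from the slide's, but the average length is the same, 1.8 bits/message):

```python
import heapq
import math

def huffman(probs):
    """Build a Huffman code; returns {symbol: codeword}."""
    # Heap entries are (probability, tiebreaker, {symbol: partial code}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)  # two least probable subtrees
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

probs = {"A": 0.20, "B": 0.20, "C": 0.10, "D": 0.50}
codes = huffman(probs)
H = -sum(p * math.log2(p) for p in probs.values())
L = sum(probs[s] * len(c) for s, c in codes.items())
print(codes)
print(f"H = {H:.2f} bits, L = {L:.2f} bits, efficiency = {H / L:.2%}")
# H ≈ 1.76, L = 1.8, efficiency ≈ 97.8%
```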
Problem: Source Extension

A source emits two messages with probabilities 1/3 and 2/3, randomly and independently. Find the Shannon-Fano code for the second-order extension of the source.



Solution

Second-order extension: there are four new messages, with probabilities 4/9, 2/9, 2/9, 1/9 (in decreasing order).
Shannon-Fano encoding: the codes are 0, 10, 110, 111, respectively.
Huffman coding proceeds similarly.
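A sketch of the construction (exact fractions avoid rounding; the symbol names m1, m2 are placeholders):

```python
from fractions import Fraction
from itertools import product

# Original source: two messages emitted independently with P = 1/3 and 2/3.
p = {"m1": Fraction(1, 3), "m2": Fraction(2, 3)}

# Second-order extension: all ordered pairs; probabilities multiply.
ext = {a + b: p[a] * p[b] for a, b in product(p, repeat=2)}
probs = sorted(ext.values(), reverse=True)
print(probs)  # 4/9, 2/9, 2/9, 1/9

# Shannon-Fano codes for the sorted probabilities:
codes = ["0", "10", "110", "111"]
L = sum(float(q) * len(c) for q, c in zip(probs, codes))
print(f"{L:.3f} bits per extended message = {L / 2:.3f} bits per source message")
# 1.889 and 0.944; compare H(m) = (1/3)log2(3) + (2/3)log2(3/2) ≈ 0.918 bits,
# so the extension codes closer to the entropy than the unextended source can.
```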



Channel Capacity: Motivation

Pe = k1 Q(√(k2 Eb/N0))



Channel Capacity: Motivation

To make Pe → 0, we need Eb → ∞.
Eb = Pb/Rb, where Pb is the signal power and Rb the bit rate.
To increase Eb: either increase Pb or decrease Rb.
There is always a practical limit on Pb.
Hence, to get zero-error transmission this way, Rb → 0.
Shannon's channel capacity: zero-error transmission is achievable even without Rb → 0.
In essence: if Rb ≤ C, then Pe → 0 is achievable, where C is the channel capacity.
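A numeric illustration of why driving Pe to zero by brute force is hopeless (a sketch; k1 = 1 and k2 = 2 are assumed, BPSK-like values, and Q is implemented via erfc):

```python
import math

def Q(x):
    """Gaussian tail probability, Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * math.erfc(x / math.sqrt(2))

k1, k2 = 1.0, 2.0  # assumed constants (they depend on the modulation scheme)
for EbN0_dB in (0, 4, 8, 12):
    EbN0 = 10 ** (EbN0_dB / 10)
    Pe = k1 * Q(math.sqrt(k2 * EbN0))
    print(f"Eb/N0 = {EbN0_dB:2d} dB -> Pe = {Pe:.2e}")
# Pe only tends to zero as Eb/N0 (hence Eb) grows without bound.
```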



Condition for Maximum Entropy

Intuitively?
Maximize H(m) = −∑_{i=1}^{n} Pi log2 Pi under the constraint ∑_{i=1}^{n} Pi = 1.
Lagrangian: f(P1, P2, ..., Pn) = −∑_{i=1}^{n} Pi log2 Pi + λ ∑_{i=1}^{n} Pi.
df/dPj = −Pj (1/Pj) log2 e − log2 Pj + λ = −log2 Pj + λ − log2 e, j = 1, 2, ..., n.
Setting the derivatives to zero: P1 = P2 = ... = Pn = 2^λ/e.
Invoking the constraint: n (2^λ/e) = 1, hence
P1 = P2 = ... = Pn = 1/n.
Maximum entropy: H(m) = log2 n.
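A quick numeric check of this result for n = 4 (the skewed distribution is arbitrary):

```python
import math

def H(probs):
    """Entropy in bits/message."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

n = 4
print(H([1 / n] * n), math.log2(n))  # 2.0 2.0: uniform achieves log2(n)
print(H([0.5, 0.2, 0.2, 0.1]))       # ~1.76 < 2.0: any skew lowers H
```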



Discrete Memoryless Channel: Binary

Find the channel capacity of the channel shown in Figure 1:

[Figure 1: a binary channel with inputs 0 and 1 and outputs a and b.]



Discrete Memoryless Channel: M-ary

Find the channel capacity of an n-input and n-output channel as shown in the following figure.

[Figure: an n-input, n-output channel with inputs x1, x2, x3, ..., xn and outputs y1, y2, y3, ..., yn.]
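If, as the diagrams suggest, these channels are noiseless and one-to-one (each input is received as a distinct output with certainty), then no information is lost and the capacity equals the maximum source entropy from the previous slide. A one-line check under that assumed reading:

```python
import math

# Noiseless, one-to-one channel: every input symbol is recovered
# exactly, so C equals the maximum source entropy, log2(n) bits/symbol,
# achieved by equiprobable inputs.
for n in (2, 4, 8):
    print(f"n = {n}: C = {math.log2(n):.0f} bits/symbol")
```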



Binary Symmetric Channel (BSC)



Mutual Information I(X; Y)

Mutual information:
I(X; Y) = H(X) − H(X|Y) = H(Y) − H(Y|X) bits/message (or bits/symbol).
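These quantities can be computed from a joint distribution; a minimal sketch with an assumed 2×2 joint PMF (it uses the equivalent identity I(X; Y) = H(X) + H(Y) − H(X, Y)):

```python
import math

def H(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Assumed joint PMF P(x, y); rows index x, columns index y.
Pxy = [[0.4, 0.1],
       [0.1, 0.4]]
Px = [sum(row) for row in Pxy]            # marginal of X
Py = [sum(col) for col in zip(*Pxy)]      # marginal of Y
Hxy = H([p for row in Pxy for p in row])  # joint entropy H(X, Y)

I = H(Px) + H(Py) - Hxy  # = H(X) - H(X|Y) = H(Y) - H(Y|X)
print(f"I(X;Y) = {I:.4f} bits/symbol")    # 0.2781 for this PMF
```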



Channel Capacity C

Channel capacity: C = max_{P(xi)} I(X; Y) bits per symbol, where the maximum is taken over all input distributions P(xi).
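For the BSC of slide 13, this maximization can be carried out numerically; a sketch assuming an illustrative crossover probability p = 0.1, checked against the standard closed form C = 1 − H(p):

```python
import math

def h(p):
    """Binary entropy function, in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def I_bsc(q, p):
    """I(X;Y) for a BSC with P(X=1) = q and crossover probability p."""
    y1 = q * (1 - p) + (1 - q) * p  # P(Y=1)
    return h(y1) - h(p)             # H(Y) - H(Y|X), and H(Y|X) = h(p)

p = 0.1
C = max(I_bsc(i / 1000, p) for i in range(1001))  # grid search over q
print(f"C = {C:.4f} bits/symbol; closed form 1 - h(p) = {1 - h(p):.4f}")
# The maximum occurs at q = 1/2 (equiprobable inputs).
```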

