INFORMATION
• It is a quantitative measure of information.
• The most unexpected (least probable) events give the maximum information.
• Also,
$$\sum_{i=1}^{m} p(x_i, y_j) = p(y_j), \qquad \sum_{j=1}^{n} p(x_i, y_j) = p(x_i)$$
• p(xi, yj) =
        y1     y2     y3     y4
  x1    0.25   0      0      0
  x2    0.1    0.3    0      0
  x3    0      0.05   0.1    0
  x4    0      0      0.05   0.1
  x5    0      0      0.05   0
• Similarly,
$$\sum_{j=1}^{n} p(y_j / x_i) = 1$$
CONDITIONAL ENTROPY H(X/Y)
• Suppose yj is received. The conditional entropy of X given this particular output is
$$H(X/y_j) = -\sum_{i=1}^{m} p(x_i/y_j)\,\log p(x_i/y_j)$$
• The average conditional entropy is obtained by averaging H(X/yj) over all yj.
• Number of times H(X/yj) occurs = number of times yj occurs = N_yj.
• H(X/Y) = (1/N) [N_y1 · H(X/y1) + N_y2 · H(X/y2) + N_y3 · H(X/y3) + …]
$$H(X/Y) = \sum_{j=1}^{n} p(y_j)\, H(X/y_j)$$
$$H(X/Y) = -\sum_{i=1}^{m}\sum_{j=1}^{n} p(y_j)\, p(x_i/y_j)\,\log p(x_i/y_j)$$
$$H(X/Y) = -\sum_{i=1}^{m}\sum_{j=1}^{n} p(x_i, y_j)\,\log p(x_i/y_j)$$
• Similarly,
$$H(Y/X) = -\sum_{i=1}^{m}\sum_{j=1}^{n} p(x_i, y_j)\,\log p(y_j/x_i)$$
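A short Python sketch, added for illustration (not part of the original notes), evaluating the two equivalent forms of H(X/Y) above for the joint matrix shown earlier: the p(yj)-weighted average of H(X/yj), and the double sum over p(xi, yj). Both give the same value.

```python
from math import log2

def conditional_entropy_forms(pxy):
    """pxy[i][j] = p(x_i, y_j). Returns both forms of H(X/Y) in bits/symbol."""
    m, n = len(pxy), len(pxy[0])
    py = [sum(pxy[i][j] for i in range(m)) for j in range(n)]  # p(y_j) = sum_i p(x_i, y_j)

    # Form 1: H(X/Y) = sum_j p(y_j) * H(X/y_j)
    form1 = 0.0
    for j in range(n):
        if py[j] > 0:
            h_x_given_yj = -sum((pxy[i][j] / py[j]) * log2(pxy[i][j] / py[j])
                                for i in range(m) if pxy[i][j] > 0)
            form1 += py[j] * h_x_given_yj

    # Form 2: H(X/Y) = -sum_i sum_j p(x_i, y_j) * log p(x_i / y_j)
    form2 = -sum(pxy[i][j] * log2(pxy[i][j] / py[j])
                 for i in range(m) for j in range(n) if pxy[i][j] > 0)
    return form1, form2

# Joint matrix from the slide above
P = [[0.25, 0,    0,    0  ],
     [0.1,  0.3,  0,    0  ],
     [0,    0.05, 0.1,  0  ],
     [0,    0,    0.05, 0.1],
     [0,    0,    0.05, 0  ]]
print(conditional_entropy_forms(P))  # both ≈ 0.809 bits/symbol
```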
PROBLEMS - 1
1. A transmitter has an alphabet consisting of five letters x1, x2, x3, x4, x5 and the receiver has an alphabet consisting of four letters y1, y2, y3, y4. The following joint probabilities are given.
y1 y2 y3 y4
x1 0.25 0 0 0
x2 0.1 0.3 0 0
x3 0 0.05 0.1 0
x4 0 0 0.05 0.1
x5 0 0 0.05 0
• The matrix above is P(X,Y).
• H(X) = 2.066 bits / symbol
• H(Y) = 1.856 bits / symbol
• H(X,Y) = 2.665 bits / symbol
• H(X/Y) = 0.809 bits / symbol
• H(Y/X) = 0.6 bits / symbol
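The quoted values can be recomputed directly from the joint matrix; a minimal added sketch (it uses the relation H(X,Y) = H(X) + H(Y/X) = H(Y) + H(X/Y) stated on the next slide):

```python
from math import log2

def H(probs):
    """Entropy in bits of a list of probabilities (zeros are skipped)."""
    return -sum(p * log2(p) for p in probs if p > 0)

PXY = [[0.25, 0,    0,    0  ],
       [0.1,  0.3,  0,    0  ],
       [0,    0.05, 0.1,  0  ],
       [0,    0,    0.05, 0.1],
       [0,    0,    0.05, 0  ]]

px = [sum(row) for row in PXY]                       # marginal p(x_i)
py = [sum(row[j] for row in PXY) for j in range(4)]  # marginal p(y_j)
joint = [p for row in PXY for p in row]

HX, HY, HXY = H(px), H(py), H(joint)
print(f"H(X)   = {HX:.3f}")        # ≈ 2.066
print(f"H(Y)   = {HY:.3f}")        # ≈ 1.857
print(f"H(X,Y) = {HXY:.3f}")       # ≈ 2.665
print(f"H(X/Y) = {HXY - HY:.3f}")  # ≈ 0.809
print(f"H(Y/X) = {HXY - HX:.3f}")  # ≈ 0.600
```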
RELATION AMONG ENTROPIES
H(X,Y) = H(X) + H(Y/X) = H(Y) + H(X/Y)
$$H(X,Y) = -\sum_{i=1}^{m}\sum_{j=1}^{n} p(x_i, y_j)\,\log p(x_i, y_j)$$
• Bayes' theorem: p(xi, yj) = p(xi) p(yj/xi) = p(yj) p(xi/yj)
$$H(X,Y) = -\sum_{i=1}^{m}\sum_{j=1}^{n} p(x_i, y_j)\,\log\,[\,p(x_i)\, p(y_j/x_i)\,]$$
• Splitting the logarithm into log p(xi) + log p(yj/xi) gives the two sums H(X) and H(Y/X), hence H(X,Y) = H(X) + H(Y/X).
• Using log₂ z ≤ (z − 1) log₂ e:
$$H(X/Y) - H(X) = \sum_{i}\sum_{j} p(x_i, y_j)\,\log_2 \frac{p(x_i)}{p(x_i/y_j)} \;\le\; \log_2 e \sum_{i}\sum_{j} p(x_i, y_j)\left[\frac{p(x_i)}{p(x_i/y_j)} - 1\right] = [1 - 1]\log_2 e = 0$$
• H(X/Y) − H(X) ≤ 0, i.e., H(X/Y) ≤ H(X): observing the output cannot increase the entropy of the input.
PROBLEMS - 2
• Identify the following probability scheme and find all
entropies.
• Given P(x) = [ 0.3 0.4 0.3]
• Channel diagram: each input xi goes to the corresponding output yi with probability 3/5 and to each of the other two outputs with probability 1/5, i.e.
P(Y/X) =
  3/5  1/5  1/5
  1/5  3/5  1/5
  1/5  1/5  3/5
PROBLEMS - 3
• [Channel diagram: three inputs x1, x2, x3 and three outputs y1, y2, y3 with transition probabilities p and 1 − p, as shown in the figure.]
PROBLEMS - 4
• P(Y/X) =
  0.8  0.1  0.1
  0.1  0.8  0.1
  0.1  0.1  0.8
• p(x) = (0.3  0.2  0.5)
• Find all entropies.
PROBLEMS – 6
• Joint probability matrix:
  3/40  1/40  1/40
  1/20  3/20  1/20
  1/8   1/8   3/8
• p(x) = (1/4  12/40  18/40)
• Find all entropies.
Show that in a discrete noise-free channel both conditional entropies are zero.
• In a noise-free channel each input is connected to exactly one output, so the only non-zero joint probabilities are P(x1, y1), P(x2, y2), …, P(xm, yn).
• If the joint probability matrix has identical columns, or identical rows,
$$P(X,Y) = \begin{bmatrix} p_1 & p_1 & \cdots & p_1 \\ p_2 & p_2 & \cdots & p_2 \\ \vdots & & & \vdots \\ p_m & p_m & \cdots & p_m \end{bmatrix} \quad\text{OR}\quad \begin{bmatrix} p_1 & p_2 & \cdots & p_n \\ p_1 & p_2 & \cdots & p_n \\ \vdots & & & \vdots \\ p_1 & p_2 & \cdots & p_n \end{bmatrix}$$
$$\sum_{i=1}^{m}\sum_{j=1}^{n} p(x,y) = 1, \qquad \text{and in this case } p(x,y) = p(x)\,p(y)$$
• For the identical-column case, the total probability is n·Σ pi = 1, so
$$\sum_{i=1}^{m} p_i = \frac{1}{n}$$
• The input marginals are then p(x) = (np1, np2, np3, …, npm), while every p(yj) = 1/n, confirming p(x,y) = p(x) p(y).
EFFICIENCY
• η = I(X,Y) / C, where C is the channel capacity.
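As an added illustration (the numbers below are assumed, not from the notes), this sketch evaluates η = I(X,Y)/C for a binary symmetric channel with error probability p = 0.1 and input distribution (0.8, 0.2); the capacity expression C = 1 − H(p) for a BSC is the one used later in these notes for the cascaded-channel discussion.

```python
from math import log2

def Hb(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -(p * log2(p) + (1 - p) * log2(1 - p))

def mutual_information_bsc(px0, p):
    """I(X;Y) = H(Y) - H(Y/X) for a BSC with error probability p."""
    py0 = px0 * (1 - p) + (1 - px0) * p   # p(y = 0)
    return Hb(py0) - Hb(p)                # H(Y/X) = H(p) for any input distribution

p, px0 = 0.1, 0.8                         # assumed example values
I = mutual_information_bsc(px0, p)
C = 1 - Hb(p)                             # BSC capacity (equiprobable inputs)
print(f"I(X,Y) = {I:.3f} bits, C = {C:.3f} bits, efficiency = {I / C:.3f}")
```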
ASSIGNMENT
1. Find the mutual information for a noiseless channel.
2. Given p(x) = (1/4, 2/5, 3/20, 3/20, 1/20), find the efficiency for the following joint probability matrix:
1/4 0 0 0
1/10 3/10 0 0
0 1/20 1/10 0
0 0 1/20 1/10
0 0 1/20 0
3. The channel matrix is
  2/3  1/3
  1/3  2/3
If p(x1) = ¾ and p(x2) = ¼,
find H(X), H(Y), H(X/Y), H(Y/X), I(X,Y), C and
efficiency.
ASSIGNMENT
4. a. For a given channel find mutual information I(X,Y) with
p(x) = {½, ½}
y1 y2 y3
x1 2/3 1/3 0
x2 0 1/6 5/6
b. If the receiver decides that x1 is the most likely transmitted symbol when y2 is received, other conditional probabilities remaining the same, then calculate I(X,Y).
HINT: b) Find p(X/Y). Change p(x1/y2) to 1 and p(x2/y2) to 0. The rest remains the same. Calculate I.
Types of Channels – Symmetric channel
• Every row of the channel matrix contains the same set of numbers p1 to pJ.
• Every column of the channel matrix contains the same set of numbers q1 to qJ.
• Example:
$$P(Y/X) = \begin{bmatrix} 1-\epsilon & \epsilon \\ \epsilon & 1-\epsilon \end{bmatrix}, \qquad 0 \le \epsilon \le 1$$
• H(Y/X) is independent of the input distribution and is solely determined by the channel matrix:
$$H(Y/X) = -\sum_{i=1}^{m}\sum_{j=1}^{n} p(x_i, y_j)\,\log p(y_j/x_i)$$
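An added numerical sketch of this claim: for the symmetric matrix [1−ε, ε; ε, 1−ε] (ε = 0.2 chosen arbitrarily), H(Y/X) comes out the same for every input distribution.

```python
from math import log2

def H_Y_given_X(px, pyx):
    """H(Y/X) = -sum_i sum_j p(x_i) p(y_j/x_i) log p(y_j/x_i)."""
    return -sum(px[i] * pyx[i][j] * log2(pyx[i][j])
                for i in range(len(px)) for j in range(len(pyx[0]))
                if pyx[i][j] > 0)

eps = 0.2
PYX = [[1 - eps, eps],
       [eps, 1 - eps]]

for px in ([0.5, 0.5], [0.9, 0.1], [0.3, 0.7]):
    print(px, round(H_Y_given_X(px, PYX), 4))   # same value (≈ 0.7219) each time
```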
Types of Channels – Symmetric channel
$$H(Y/X) = -\sum_{i=1}^{m}\sum_{j=1}^{n} p(x_i, y_j)\,\log p(y_j/x_i)$$
• Example channel (X1 feeds Y1, Y2; X2 feeds Y3, Y4, Y5; X3 feeds Y6):
P(Y/X) =
       Y1    Y2    Y3    Y4     Y5     Y6
  X1   1/2   1/2   0     0      0      0
  X2   0     0     3/5   3/10   1/10   0
  X3   0     0     0     0      0      1
Types of Channels – Deterministic channel
• The input uniquely specifies the output.
• H(Y/X) = 0
• The matrix has one and only one non-zero element in each row.
• Also, the sum of each row should be 1.
• Elements are either 0 or 1.
• Example (X1, X2 feed Y1; X3, X4, X5 feed Y2; X6 feeds Y3):
P(Y/X) =
  1  0  0
  1  0  0
  0  1  0
  0  1  0
  0  1  0
  0  0  1
Types of Channels – Useless channel
• X and Y are independent for all input distributions.
• H(X/Y) = H(X)
• Proof of sufficiency:
• Assume the channel matrix has identical rows; the source is NOT equiprobable.
• For every output yj,
$$p(y_j) = \sum_{i=1}^{m} p(x_i, y_j)$$
• [Diagrams: a general binary channel with transition probabilities p11, p12, p21, p22, and a binary symmetric channel with P(0→0) = P(1→1) = q and P(0→1) = P(1→0) = p, where p + q = 1.]
• For the channel x1 → y1 (probability q), x1 → y3 (probability p), x2 → y2 (probability q), x2 → y3 (probability p):
• P(Y/X) =
   q   0   p
   0   q   p
• With equiprobable inputs, p(X,Y) =
   q/2   0     p/2
   0     q/2   p/2
• p(X/Y) =
   1   0   1/2
   0   1   1/2
• Two identical BSCs (correct-transmission probability q, error probability p) connected in cascade form an equivalent binary channel from X to Z with probabilities p' and q':
P(Z/X) =
        Z1   Z2
  X1    q'   p'
  X2    p'   q'
• q' = p² + q² = 1 − 2pq
• p' = 2pq
• Find C.
• C = 1 + p' log p' + q' log q'
• C = 1 + 2pq log (2pq) + (1 − 2pq) log (1 − 2pq)
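A brief added check of this result: the sketch evaluates the cascaded capacity and compares it with a single BSC (p = 0.1 is an arbitrary example value).

```python
from math import log2

def Hb(p):
    return 0.0 if p in (0.0, 1.0) else -(p * log2(p) + (1 - p) * log2(1 - p))

p = 0.1                        # assumed example error probability
q = 1 - p
p_prime = 2 * p * q            # overall error probability of the cascade
C_single = 1 - Hb(p)
C_cascade = 1 - Hb(p_prime)    # = 1 + 2pq log(2pq) + (1-2pq) log(1-2pq)
print(f"single BSC: C = {C_single:.3f} bits, cascade of two: C = {C_cascade:.3f} bits")
```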
• Find C for the binary channel
$$P(Y/X) = \begin{bmatrix} P_{11} & P_{12} \\ P_{21} & P_{22} \end{bmatrix}$$
• C = 0.58 bits
EXTENSION OF ZERO-MEMORY SOURCE
• A binary alphabet can be extended to S² to give 4 words: 00, 01, 10, 11.
• A binary alphabet can be extended to S³ to give 8 words: 000, 001, 010, 011, 100, 101, 110, 111.
• For m messages with m = 2ⁿ, an nth extension of the binary source is required.
EXTENSION OF ZERO-MEMORY SOURCE
• A zero-memory source S has alphabet {x1, x2, …, xm}.
• The nth extension of S, called Sⁿ, is a zero-memory source with m' symbols {y1, y2, …, ym'}.
• Each yj = (xj1, xj2, …, xjn).
• p(yj) = p(xj1) p(xj2) ⋯ p(xjn).
• H(Sⁿ) = nH(S)
• Prove
H(Sn) = nH(S)
• A zero-memory source S has alphabet {x1, x2, …, xm} with probability of xi equal to pi.
• The nth extension of S, called Sⁿ, is a zero-memory source with m' = mⁿ symbols {y1, y2, …, ym'}.
• yj = (xj1, xj2, …, xjn) and p(yj) = p(xj1) p(xj2) ⋯ p(xjn).
$$\sum_{S^n} p(y_j) = \sum_{S^n} p(x_{j1})\,p(x_{j2})\cdots p(x_{jn}) = \sum_{j_1}\sum_{j_2}\cdots\sum_{j_n} p(x_{j1})\,p(x_{j2})\cdots p(x_{jn}) = \Big[\sum_{j_1} p(x_{j1})\Big]\Big[\sum_{j_2} p(x_{j2})\Big]\cdots\Big[\sum_{j_n} p(x_{jn})\Big] = 1$$
Using this,
$$H(S^n) = -\sum_{S^n} p(y_j)\,\log p(y_j)$$
H(Sn) = nH(S)
• H(Sn) = -∑ Sn p(y(j)) log {p(xj1) p(xj2 )…p(xjn )}
• = -∑ Sn p(y(j)) log p(xj1) -∑ Sn p(y(j)) log p(xj2 )…n times
• Here each term..
• -∑ Sn p(y(j)) log p(xj1)
• = -∑ Sn {p(xj1) p(xj2 )…p(xjn )} log p(xj1)
• = -∑j1 p(xj1)∑j2 p(xj2 )∑j3… …p(xjn ) log p(xj1)
• = -∑j1 p(xj1) log p(xj1) ∑j2 p(xj2 )∑j3p(xj2 ) …
• = -∑s p(xj1) log p(xj1) = H(S)
• H(Sn) = -∑s p(xj1) log p(xj1) … n times
• H(Sn) = n H(S)
EXTENSION OF ZERO-MEMORY SOURCE
• Problem 1: Find entropy of third extension S3 of a
binary source with p(0) = 0.25 and p(1) = 0.75. Show
that extended source satisfies complete probability
scheme and its entropy is three times primary
entropy.
• Problem 2: Consider a source with alphabets x1, x2 ,
x3, x4 with probabilities p{xi} ={ ½, ¼, 1/8, 1/8}.
Source is extended to deliver messages with two
symbols. Find the new alphabets and their
probabilities. Show that extended source satisfies
complete probability scheme and its entropy is twice
the primary entropy.
EXTENSION OF ZERO-MEMORY SOURCE
• Problem 1: 8 combinations
• H(S) ≈ 0.811 bits/symbol, H(S³) ≈ 2.434 bits/symbol
• Problem 2: 16 combinations
• H(S) = 1.75 bits/symbol H(S2) = 3.5 bits/symbol
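An added numerical sketch that reproduces these answers by enumerating the extended alphabets explicitly (requires Python 3.8+ for math.prod):

```python
from math import log2, prod
from itertools import product

def H(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

def extension_entropy(p_symbols, n):
    """Entropy of the n-th extension: words are all length-n symbol sequences,
    each with probability equal to the product of its symbol probabilities."""
    return H([prod(word) for word in product(p_symbols, repeat=n)])

# Problem 1: binary source, p(0) = 0.25, p(1) = 0.75, third extension
p1 = [0.25, 0.75]
print(H(p1), extension_entropy(p1, 3))   # ≈ 0.811 and ≈ 2.434 (= 3·H(S))

# Problem 2: four symbols with p = {1/2, 1/4, 1/8, 1/8}, second extension
p2 = [0.5, 0.25, 0.125, 0.125]
print(H(p2), extension_entropy(p2, 2))   # 1.75 and 3.5 (= 2·H(S))
```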
EXTENSION OF BINARY CHANNELS
• BSC with correct-transmission probability q and error probability p:
$$P(Y/X) = \begin{bmatrix} q & p \\ p & q \end{bmatrix}$$
• Channel matrix for the second extension of the channel:
P(Y/X) =
        00    01    10    11
   00   q²    pq    pq    p²
   01   pq    q²    p²    pq
   10   pq    p²    q²    pq
   11   p²    pq    pq    q²
• For capacity of the un-extended channel, use the input distribution {0.5, 0.5}.
• For capacity of the extended channel, use the input distribution {0.25, 0.25, 0.25, 0.25}.
EXTENSION OF BINARY CHANNELS
• Show that I(X², Y²) = 2 I(X, Y) = 2(1 + q log q + p log p).
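An added sketch (p = 0.1 chosen arbitrarily) that builds the second-extension matrix as a Kronecker product of the BSC matrix with itself and verifies I(X², Y²) = 2 I(X, Y) for equiprobable inputs.

```python
from math import log2

def mutual_information(px, pyx):
    """I(X;Y) = H(Y) - H(Y/X) for input distribution px and channel matrix pyx."""
    n_out = len(pyx[0])
    py = [sum(px[i] * pyx[i][j] for i in range(len(px))) for j in range(n_out)]
    HY = -sum(p * log2(p) for p in py if p > 0)
    HYX = -sum(px[i] * pyx[i][j] * log2(pyx[i][j])
               for i in range(len(px)) for j in range(n_out) if pyx[i][j] > 0)
    return HY - HYX

def kron(A, B):
    """Kronecker product of two matrices given as lists of lists."""
    return [[a * b for a in ra for b in rb] for ra in A for rb in B]

p = 0.1                         # assumed example error probability
q = 1 - p
BSC = [[q, p],
       [p, q]]
BSC2 = kron(BSC, BSC)           # 4x4 matrix over words {00, 01, 10, 11}

I1 = mutual_information([0.5, 0.5], BSC)
I2 = mutual_information([0.25] * 4, BSC2)
print(I1, I2, 2 * I1)           # I2 ≈ 2·I1 = 2(1 + p·log2 p + q·log2 q)
```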
CASCADED CHANNELS
• Show that I(X, Z) = 1 − H(2pq) for two cascaded BSCs,
• while for a single BSC, I(X, Y) = 1 − H(p),
• where H(p) = −(p log p + q log q).
• Also, N = ηB, where η/2 is the two-sided noise power spectral density.
• The formula C = B log₂(1 + S/N) is derived for a Gaussian channel.
• It can be applied to general physical systems because:
1. Channels encountered in physical systems are generally approximately Gaussian.
2. The result obtained for a Gaussian channel often provides a lower bound on the performance of a system operating over a non-Gaussian channel.
PROOF:
• Consider a source generating messages in the form of fixed voltage levels.
• The level spacing is λσ, where σ is the RMS value of the noise and λ is a large enough constant.
• [Waveform s(t): levels at ±λσ/2, ±3λσ/2, ±5λσ/2, …, each held for a duration T.]
• There are M possible messages, i.e., M levels.
• Assuming all messages are equally likely, the average signal power is S = (M² − 1)(λσ)²/12.
• Hence M = [1 + 12S/(λσ)²]^{1/2} = [1 + 12S/(λ²N)]^{1/2}, with N = σ².
• With M equally likely messages, H = log₂ M.
• Square pulses acquire finite rise and fall times in passing through the channel; rise time t_r = 0.5/B = 1/(2B).
• The signal is detected correctly if T is at least equal to t_r, so take T = 1/(2B).
• Message rate r = 1/T = 2B messages/s.
• C = R_max = 2B · H = 2B · ½ · log₂[1 + 12S/(λ²N)]
• Taking λ² = 12 (so that 12/λ² = 1), C = R_max = 2B · ½ · log₂[1 + S/N]
• C = B log₂[1 + S/N]
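A tiny added example of evaluating C = B log₂(1 + S/N); the bandwidth and SNR below are arbitrary illustrative values, not taken from the slides.

```python
from math import log2

def capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley channel capacity in bits per second."""
    return bandwidth_hz * log2(1 + snr_linear)

B = 4000.0        # Hz (assumed example value)
snr_db = 30.0     # dB (assumed example value)
snr = 10 ** (snr_db / 10)
print(f"C = {capacity(B, snr):.0f} bits/s")   # ≈ 39,869 bits/s
```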
SHANNON’s LIMIT
• In an ideal noiseless system, N = 0, so S/N → ∞; with B → ∞, C → ∞.
• In a practical system, N cannot be 0.
– As B increases, the noise power N = ηB also increases, so S/N falls.
– Initially the growth of B dominates and C increases with B.
– Later, the fall in S/N offsets the growth of B, and the increase in C with B gradually reduces.
– C reaches a finite upper bound as B → ∞.
– This bound is called Shannon's limit.
SHANNON’s LIMIT
• Gaussian channel has noise spectral density - η/2
• N = η/2 * 2B = ηB
• C = B log2 [ 1+ S / ηB]
$$C = \frac{S}{\eta}\cdot\frac{\eta B}{S}\,\log_2\!\Big[1 + \frac{S}{\eta B}\Big] = \frac{S}{\eta}\,\log_2\!\Big[1 + \frac{S}{\eta B}\Big]^{\eta B/S}$$
$$\lim_{X \to 0}\,(1+X)^{1/X} = e, \qquad X = \frac{S}{\eta B}$$
• At Shannon’s limit, B → ∞, S / ηB → 0
• C ∞ = (S / η ) log2 e
• C ∞ = 1.44 (S / η )
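An added numerical illustration of the limit: with S/η fixed, C = B log₂(1 + S/ηB) grows with B but saturates at C∞ = (S/η) log₂ e ≈ 1.44 S/η. The value S/η = 1000 is an arbitrary choice.

```python
from math import log2, e

S_over_eta = 1000.0                              # S/η in Hz (assumed example value)
for B in (1e2, 1e3, 1e4, 1e5, 1e6):
    C = B * log2(1 + S_over_eta / B)
    print(f"B = {B:>9.0f} Hz  ->  C = {C:7.1f} bits/s")
print(f"limit: C_inf = {S_over_eta * log2(e):.1f} bits/s")   # ≈ 1442.7 bits/s
```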
CHANNELS WITH FINITE MEMORY
• Statistically independent sequence: the occurrence of a particular symbol during a symbol interval does NOT alter the probability of occurrence of symbols during any other interval.
• Most practical sources are statistically dependent.
• For a source emitting English text:
– After Q, the next letter is U with (practically) 100% probability.
– After a consonant, the probability of occurrence of a vowel is higher, and vice versa.
• Statistical dependence reduces the amount of information coming out of the source.
CHANNELS WITH FINITE MEMORY
Statistically dependent source:
• A source emits a symbol every interval of T.
• Each symbol can have any value from x1, x2, x3, …, xm.
• The positions are s1, s2, s3, …, s(k−1), sk.
• Each position can take any of the possible x values.
• The probability of occurrence of xi at position sk depends on the symbols at the previous (k − 1) positions.
• Such sources are called Markov sources of order (k − 1).
• Conditional probability: p(xi / s1, s2, s3, …, s(k−1))
• The behavior of a Markov source can be predicted from its state diagram.
SECOND ORDER MARKOV SOURCE-
STATE DIAGRAM
• SECOND ORDER MARKOV SOURCE: the next symbol depends on the 2 previous symbols.
• Let m = 2 (symbols 0, 1).
• Number of states = 2² = 4.
• A – 00, B – 01, C – 10, D – 11.
• MEMORY = 2.
• [State diagram with transition probabilities 0.6, 0.4, 0.5 and 0.5 between the states; the state equations are written from the diagram.]
• H(s1, s2, xi) = ?
• H(s1, s2, xi) = 2 × (1/6) log₂ 6 + 6 × (1/9) log₂ 9 ≈ 2.97 bits
• H(xi / s1, s2) = ?
• Since the four states are equiprobable, H(s1, s2) = 2 bits.
• H(xi / s1, s2) = H(s1, s2, xi) − H(s1, s2) = 2.97 − 2 = 0.97 bits
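An added arithmetic check of these figures: two joint probabilities of 1/6 and six of 1/9 give H(s1, s2, xi) ≈ 2.97 bits, and subtracting H(s1, s2) = 2 bits leaves ≈ 0.97 bits.

```python
from math import log2

joint = [1/6] * 2 + [1/9] * 6
assert abs(sum(joint) - 1) < 1e-12               # complete probability scheme
H_joint = -sum(p * log2(p) for p in joint)
print(round(H_joint, 2), round(H_joint - 2, 2))  # ≈ 2.97 and ≈ 0.97
```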
PROBLEMS
• Two-state Markov source with states A and B: P(A→A) = P(B→B) = 2/3 and P(A→B) = P(B→A) = 1/3.
PROBLEMS
• Identify the states and find all entropies.
• [State diagram: four states 1–4 (labeled A, B, C, D) with transition probabilities 1/8, 7/8, 1/4, 3/4, 3/4, 1/4, 7/8, 1/8 and one transition of probability 1, as shown in the figure.]
CONTINUOUS COMMUNICATION
CHANNELS
Continuous channels with continuous noise
• Signals like video and telemetry are analog.
• Modulation techniques like AM, FM, PM, etc., are continuous (analog).
• Channel noise is white Gaussian noise.
• We need to find the rate of transmission of information when analog signals are contaminated by continuous noise.
CONTINUOUS COMMUNICATION
CHANNELS
$$H(X) = -\int_{-\infty}^{\infty} p(x)\,\log_2 p(x)\, dx, \qquad \text{where } \int_{-\infty}^{\infty} p(x)\, dx = 1 \;\text{(complete probability scheme)}$$
$$H(Y) = -\int_{-\infty}^{\infty} p(y)\,\log_2 p(y)\, dy$$
$$H(X,Y) = -\int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty} p(x,y)\,\log_2 p(x,y)\, dx\, dy$$
$$H(X/Y) = -\int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty} p(x,y)\,\log_2 p(x/y)\, dx\, dy$$
$$H(Y/X) = -\int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty} p(x,y)\,\log_2 p(y/x)\, dx\, dy$$
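As a quick added illustration of these definitions (not from the original slides), the differential entropy of a uniform density on [0, a] is

$$p(x) = \frac{1}{a},\; 0 \le x \le a \quad\Rightarrow\quad H(X) = -\int_0^a \frac{1}{a}\,\log_2\frac{1}{a}\, dx = \log_2 a \ \text{bits}$$

Unlike discrete entropy, this can be negative (whenever a < 1).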
• a³b/3 = 1
• so b = 3/a³