• Mutual information
  \[ I(X;Y) \triangleq \sum_{x}\sum_{y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)} \]
  [Venn diagram: H(X) and H(Y) overlap in I(X;Y); the non-overlapping parts are H(X|Y) and H(Y|X)]
  \[ I(X;Y) = I(Y;X) \]
  \[ I(X;Y) = H(Y) - H(Y|X) = H(X) - H(X|Y) \]
  \[ I(X;Y) = H(X) + H(Y) - H(X,Y) \]
  \[ I(X;X) = H(X) \]
  \[ H(X,Y) = H(X) + H(Y|X) = H(Y) + H(X|Y) \]
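The identities above are easy to verify numerically. A minimal sketch, assuming an arbitrary (hypothetical) 2×2 joint pmf — any valid joint distribution works:

```python
import numpy as np

# Hypothetical 2x2 joint pmf p(x, y); illustrative values only.
p_xy = np.array([[0.30, 0.20],
                 [0.10, 0.40]])
p_x = p_xy.sum(axis=1)   # marginal p(x)
p_y = p_xy.sum(axis=0)   # marginal p(y)

def H(p):
    """Entropy in bits; zero-probability entries contribute nothing."""
    p = np.asarray(p).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Conditional entropy H(Y|X) = sum_x p(x) H(Y | X = x)
H_Y_given_X = sum(p_x[i] * H(p_xy[i] / p_x[i]) for i in range(len(p_x)))

# I(X;Y) straight from the definition
I = sum(p_xy[i, j] * np.log2(p_xy[i, j] / (p_x[i] * p_y[j]))
        for i in range(2) for j in range(2) if p_xy[i, j] > 0)

assert np.isclose(I, H(p_y) - H_Y_given_X)        # I = H(Y) - H(Y|X)
assert np.isclose(I, H(p_x) + H(p_y) - H(p_xy))   # I = H(X) + H(Y) - H(X,Y)
assert np.isclose(H(p_xy), H(p_x) + H_Y_given_X)  # chain rule for H(X,Y)
```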
• For \( E[X^2] = \sigma^2 \),
  \[ h(X) \le \frac{1}{2}\log(2\pi e \sigma^2) \quad \text{[bits]} \]
  with equality only for X Gaussian
• Mutual information (continuous case),
  \[ I(X;Y) = \iint f(x,y)\,\log\frac{f(x,y)}{f(x)\,f(y)}\, dx\, dy \]
Fano’s Inequality
For \( P_e = \Pr(\hat{X} \neq X) \),
\[ H(X \mid \hat{X}) \le H_b(P_e) + P_e \log(|\mathcal{X}| - 1) \]
where \( H_b(\cdot) \) is the binary entropy function.
[Block diagram: message ω → encoder (α) → x → channel → y → decoder (β) → ω̂]
\[ f(y|x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, \exp\left(-\frac{(y-x)^2}{2\sigma^2}\right) \]
with
\[ n^{-1} \sum_{m=1}^{n} x_m^2(i) \le P, \quad i \in I_M \]
2 Encoding: \( \omega = i \Rightarrow x_1^n = \alpha(i) = x_1^n(i) \) transmitted
3 Decoding: \( y_1^n \) received \( \Rightarrow \hat{\omega} = \beta(y_1^n) \)
• A rate
  \[ R \triangleq \frac{\log M}{n} \]
  is achievable (subject to the power constraint P) if there exists a sequence of \( (\lceil 2^{nR} \rceil, n) \) codes with codewords satisfying the power constraint, and such that the average probability of error
  \[ P_e^{(n)} = \Pr(\hat{\omega} \neq \omega) \]
  tends to 0 as n → ∞.
• The capacity C is the supremum of all achievable rates.
\[ f(x) = \frac{1}{\sqrt{2\pi(P-\varepsilon)}}\, \exp\left(-\frac{x^2}{2(P-\varepsilon)}\right) \]
With this f(x) and the channel f(y|x), let
\[ I_\varepsilon = \iint f(y|x)\,f(x)\, \log \frac{f(y|x)}{\int f(y|x)\,f(x)\,dx}\, dx\, dy = \frac{1}{2}\log\left(1 + \frac{P-\varepsilon}{\sigma^2}\right) \]
• the mutual information between input and output when the channel is “driven by” \( f(x) = N(0, P - \varepsilon) \)
\[ f_n = f_n(x_1^n, y_1^n) = \frac{1}{n}\log\frac{f(y_1^n|x_1^n)}{f(y_1^n)} = \frac{1}{n}\sum_{m=1}^{n} \log\frac{f(y_m|x_m)}{f(y_m)} \]
and let \( T_\varepsilon^{(n)} \) be the set of \( (x_1^n, y_1^n) \) such that \( f_n > I_\varepsilon - \varepsilon \). Declare \( \hat{\omega} = i \) if \( x_1^n(i) \) is the only codeword such that \( (x_1^n(i), y_1^n) \in T_\varepsilon^{(n)} \), and in addition
\[ n^{-1} \sum_{m=1}^{n} x_m^2(i) \le P, \]
otherwise set \( \hat{\omega} = 0 \).
and
\[ E_i = \left\{ \left(x_1^n(i),\, x_1^n(1) + w_1^n\right) \in T_\varepsilon^{(n)} \right\} \]
then
\[ \pi_n = P(E_0 \cup E_1^c \cup E_2 \cup \cdots \cup E_M) \le P(E_0) + P(E_1^c) + \sum_{i=2}^{M} P(E_i) \]
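The typicality argument rests on the statistic \( f_n \) concentrating around \( I_\varepsilon \) by the law of large numbers. A quick Monte Carlo sketch, with illustrative (assumed) values of P and σ² and ε taken to 0:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed illustrative parameters: input power P, noise variance sigma2.
P, sigma2, n = 1.0, 0.5, 200_000
s2 = P + sigma2                      # output variance when x ~ N(0, P)

x = rng.normal(0.0, np.sqrt(P), n)               # Gaussian codeword letters
y = x + rng.normal(0.0, np.sqrt(sigma2), n)      # AWGN channel output

# Per-letter information density log2 f(y|x)/f(y), in bits
log_fyx = -0.5 * np.log2(2 * np.pi * sigma2) - (y - x) ** 2 / (2 * sigma2 * np.log(2))
log_fy  = -0.5 * np.log2(2 * np.pi * s2)     - y ** 2        / (2 * s2 * np.log(2))
f_n = np.mean(log_fyx - log_fy)

I = 0.5 * np.log2(1 + P / sigma2)    # the limit of I_eps as eps -> 0
print(f_n, I)                        # f_n is close to I for large n
```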
\[ h(y_m) \le \frac{1}{2}\log 2\pi e(\sigma^2 + P_m) \]
and hence \( I(x_m(\omega); y_m) \le \frac{1}{2}\log(1 + P_m/\sigma^2) \).
Thus, using concavity of the logarithm in the second step,
\[ R \le \frac{1}{n}\sum_{m=1}^{n} \frac{1}{2}\log\left(1 + \frac{P_m}{\sigma^2}\right) + \alpha_n \le \frac{1}{2}\log\left(1 + \frac{n^{-1}\sum_m P_m}{\sigma^2}\right) + \alpha_n \le \frac{1}{2}\log\left(1 + \frac{P}{\sigma^2}\right) + \alpha_n \to \frac{1}{2}\log\left(1 + \frac{P}{\sigma^2}\right) \ \text{as } n \to \infty \]
\[ \Rightarrow\quad C \le \frac{1}{2}\log\left(1 + \frac{P}{\sigma^2}\right) \]
Theorem
A memoryless Gaussian channel with noise variance σ² and power constraint P has capacity
\[ C = \frac{1}{2}\log\left(1 + \frac{P}{\sigma^2}\right) \]
That is, all rates R < C and no rates R > C are achievable.
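The theorem is easy to evaluate directly; a minimal sketch with illustrative SNR values:

```python
import numpy as np

def awgn_capacity(P, sigma2):
    """Capacity of the memoryless Gaussian channel, bits per channel use."""
    return 0.5 * np.log2(1 + P / sigma2)

# At 0 dB SNR (P = sigma2) the capacity is exactly half a bit per use.
assert np.isclose(awgn_capacity(1.0, 1.0), 0.5)

# At 20 dB SNR the capacity is about 3.33 bits per use.
print(awgn_capacity(100.0, 1.0))
```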
[Figure: x(t) → H(f) → y(t); input and output restricted to (−T/2, T/2)]
• codebook
  \[ \mathcal{C} = \{x_1(t), \ldots, x_M(t)\} \]
• power constraint
  \[ \frac{1}{T}\int_{-T/2}^{T/2} x_i^2(t)\, dt \le P \]
• rate
  \[ R = \frac{\log M}{T} \]
Mikael Skoglund, Theoretical Foundations of Wireless 19/23
\[ C = \frac{1}{2}\int_{F(\beta)} \log\frac{|H(f)|^2 \cdot \beta}{N(f)}\, df \]
\[ P = \int_{F(\beta)} \left(\beta - \frac{N(f)}{|H(f)|^2}\right) df \]
where
\[ F(\beta) = \left\{ f : N(f)\,|H(f)|^{-2} \le \beta \right\} \]
where
\[ [x]^+ = \begin{cases} x, & x \ge 0 \\ 0, & x < 0 \end{cases} \]
\[ C = W \log\left(1 + \frac{P}{W N_0}\right) \quad \text{[bits per second]} \]
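A quick evaluation of this formula; the values of W, P, and N₀ below are hypothetical, chosen so the SNR term comes out to exactly 1:

```python
import numpy as np

# Hypothetical values: bandwidth W in Hz, power P in watts, noise PSD N0 in W/Hz.
W, P, N0 = 1.0e6, 1.0e-3, 1.0e-9

# P / (N0 * W) = 1, so C = W * log2(2) = W = 1 Mbit/s.
C = W * np.log2(1 + P / (N0 * W))   # bits per second
print(C)
```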
For white noise, \( N(f) = N_0/2 \), this specializes to
\[ C = \frac{1}{2}\int_{F(\beta)} \log\left(\frac{2\beta\,|H(f)|^2}{N_0}\right) df \]
\[ P = \int_{F(\beta)} \left(\beta - \frac{N_0}{2}\,\frac{1}{|H(f)|^2}\right) df \]
where \( F(\beta) = \left\{ f : \beta \ge \frac{N_0}{2}\,|H(f)|^{-2} \right\} \)
• Optimal signal spectrum
  \[ S(f) = \left[\beta - \frac{N_0}{2}\,\frac{1}{|H(f)|^2}\right]^+ \]
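The water-filling solution above can be sketched numerically: discretize f, bisect on the water level β until the power constraint is met, then evaluate S(f) and C. The channel \( |H(f)|^2 = 1/(1+f^2) \) and all numeric values below are illustrative assumptions, not from the slides:

```python
import numpy as np

# Assumed illustrative parameters: white-noise PSD N0 and power budget.
N0, P_budget = 2.0, 4.0
f = np.linspace(0.0, 2.0, 4001)     # one-sided frequency grid
df = f[1] - f[0]
H2 = 1.0 / (1.0 + f ** 2)           # assumed example channel |H(f)|^2
noise = (N0 / 2) / H2               # the "floor" N(f) / |H(f)|^2

def power_used(beta):
    """Power poured above the floor for a given water level beta."""
    return np.sum(np.maximum(beta - noise, 0.0)) * df

# Bisection on beta: power_used is continuous and increasing in beta.
lo, hi = noise.min(), noise.min() + P_budget / df
for _ in range(100):
    beta = 0.5 * (lo + hi)
    if power_used(beta) < P_budget:
        lo = beta
    else:
        hi = beta

S = np.maximum(beta - noise, 0.0)   # optimal spectrum S(f) = [beta - floor]^+
# On F(beta), 1 + S/floor = beta/floor, so this matches the capacity integral.
C = 0.5 * np.sum(np.log2(1.0 + S / noise)) * df
print(beta, C)
```

Bisection works here because the poured power is monotone in β; any one-dimensional root finder would do.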