I(X,Y) = H(X) + H(Y) - H(X,Y) = H(X) - H(X/Y) = H(Y) - H(Y/X)

The capacity of discrete channels:

C = max I(X,Y) = max [H(X) - H(X/Y)] = max [H(Y) - H(Y/X)] [bits/symbol]

For the noiseless channel, H(X/Y) = H(Y/X) = 0, so:

C = max I(X,Y) = max H(X) = log2(n) [bits/symbol]

For the channel with independent input and output, H(X,Y) = H(X) + H(Y), H(X/Y) = H(X), H(Y/X) = H(Y) and I(X,Y) = 0, so:

C = 0 [bits/symbol]
Discrete symmetric channel: the capacity of a discrete symmetric channel is reached for a uniform distribution of the probabilities, p(x1) = p(x2) = ... = p(xn) = 1/n:

C = log2(n) + Σ(i=1..n) (1/n) Σ(j=1..m) p(yj/xi) log2 p(yj/xi) [bits/symbol]

(for a symmetric noise matrix the inner sum is the same for every row i)
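As a numerical cross-check, the formula above can be evaluated from one row of the noise matrix of a square symmetric channel; a minimal Python sketch (the function name `symmetric_capacity` is my own, and the error probability 0.1 is only an illustrative value):

```python
from math import log2

def symmetric_capacity(row):
    """Capacity of a discrete symmetric channel with n = len(row) symbols:
    C = log2(n) + sum_j p(yj/xi) * log2(p(yj/xi)), using any row i."""
    n = len(row)
    return log2(n) + sum(p * log2(p) for p in row if p > 0)

# Binary symmetric channel with error probability 0.1 (illustrative value)
print(round(symmetric_capacity([0.9, 0.1]), 3))  # -> 0.531
```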
The binary symmetric channel:

[P(Y/X)] = | 1-p   p  |
           |  p   1-p |

C = 1 + p log2(p) + (1-p) log2(1-p) [bits/symbol], reached for p(x1) = p(x2) = 1/2

The binary channel with erasure:

[P(Y/X)] = | 1-q   q    0  |
           |  0    q   1-q |

C = 1 - q [bits/symbol], reached for p(x1) = p(x2) = 1/2
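Both capacities can be checked by computing I(X,Y) = H(Y) - H(Y/X) directly for equiprobable inputs; a short Python sketch (the helper `capacity_uniform` is my own, and it assumes the uniform input distribution reaches capacity, which holds for these symmetric matrices):

```python
from math import log2

def capacity_uniform(P):
    """I(X,Y) = H(Y) - H(Y/X) for p(xi) = 1/n, with noise matrix P = [p(yj/xi)]."""
    n, m = len(P), len(P[0])
    py = [sum(P[i][j] for i in range(n)) / n for j in range(m)]  # output probabilities
    HY = -sum(p * log2(p) for p in py if p > 0)
    HYX = -sum(P[i][j] * log2(P[i][j])
               for i in range(n) for j in range(m) if P[i][j] > 0) / n
    return HY - HYX

q = 0.2  # illustrative erasure probability
bec = [[1 - q, q, 0], [0, q, 1 - q]]
print(round(capacity_uniform(bec), 6))  # -> 0.8, i.e. 1 - q
```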
The binary channel with errors and erasure:

[P(Y/X)] = | 1-p-q    p     q |
           |   p    1-p-q   q |

where p is the error probability and q the erasure probability. The same channel can also be written with the erasure output as the middle column:

[P(Y/X)] = | 1-p-q   q     p   |
           |   p     q   1-p-q |
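For the errors-and-erasure matrix, the transinformation for equiprobable inputs can be evaluated numerically in the same way; a sketch with illustrative values p = 0.1, q = 0.2 (not taken from the text):

```python
from math import log2

def h(probs):
    """Entropy in bits, ignoring zero entries."""
    return -sum(p * log2(p) for p in probs if p > 0)

p, q = 0.1, 0.2                          # illustrative error / erasure probabilities
P = [[1 - p - q, p, q],
     [p, 1 - p - q, q]]                  # noise matrix [P(Y/X)]
py = [(a + b) / 2 for a, b in zip(*P)]   # p(yj) for p(x1) = p(x2) = 1/2
I = h(py) - h(P[0])                      # H(Y) - H(Y/X); both rows give the same H(Y/X)
print(round(I, 3))  # -> 0.365
```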
Exercises
Calculate:
a)
b)
c)
d)
e)
4. A source with the probabilities p(x1) = 3/4 and p(x2) = 1/4 is connected to the binary symmetric channel with the noise matrix

[P(Y/X)] = | 2/3  1/3 |
           | 1/3  2/3 |

Calculate:
a) The average error H(Y/X)
b) The transinformation I(X,Y)
c) The channel capacity C
5. Two binary channels with the transition graphs below are linked in series
x1
x2
0.8
y1
z1
0.5
0.7
y2
z2
w1
w2
Determine:
a) The transition graph of the equivalent channel
b) The value of q such that the equivalent channel is a binary symmetrical channel
c) The capacity of the equivalent channel
6. Two binary channels with the transition graphs below are linked in series
x1
0.9
y1
z1
0.4
w1
q
x2
0.8
y2
z2
w2
Determine:
a) The transition graph of the equivalent channel
b) The value of q such that the equivalent channel is a binary symmetrical channel
Solutions

1.
a) We calculate the source entropy:
We need to determine H(Y). For this we calculate the probabilities p(yj).
We have:
2.
We know that:
We have:
Because
Then the matrix P(Y/X) is calculated from the matrix P(X,Y) by dividing the elements of every row i by p(xi). We have:
In a similar way we calculate the elements of the matrix P(X/Y) by dividing the elements of each column j of the matrix P(X,Y) by p(yj).
3.
a) We need to calculate the probabilities p(xi).
So we have
We have
c)
Conclusion: this channel has very high perturbations, so it transmits no information.
4.
a)
p(x1,y1) = p(x1) p(y1/x1) = (3/4)(2/3) = 1/2
p(x1,y2) = p(x1) p(y2/x1) = (3/4)(1/3) = 1/4
p(x2,y1) = p(x2) p(y1/x2) = (1/4)(1/3) = 1/12
p(x2,y2) = p(x2) p(y2/x2) = (1/4)(2/3) = 1/6

P(X,Y) = | 1/2   1/4 |
         | 1/12  1/6 |

H(Y/X) = - Σ(i=1..2) Σ(j=1..2) p(xi,yj) log2 p(yj/xi)

H(Y/X) = - (1/2) log2(2/3) - (1/4) log2(1/3) - (1/12) log2(1/3) - (1/6) log2(2/3) = 0.92 [bits/symbol]
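The arithmetic above is easy to verify in Python; a short check of H(Y/X) from the joint probabilities p(xi,yj) and the conditionals p(yj/xi) of this exercise:

```python
from math import log2

pxy = [1/2, 1/4, 1/12, 1/6]   # p(x1,y1), p(x1,y2), p(x2,y1), p(x2,y2)
pyx = [2/3, 1/3, 1/3, 2/3]    # p(y1/x1), p(y2/x1), p(y1/x2), p(y2/x2)
HYX = -sum(pj * log2(pc) for pj, pc in zip(pxy, pyx))
print(round(HYX, 2))  # -> 0.92
```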
b)
I(X,Y) = H(X) + H(Y) - H(X,Y) = H(X) - H(X/Y) = H(Y) - H(Y/X)

Using the third relation, we need to determine H(Y):

p(y1) = Σ(i=1..2) p(xi) p(y1/xi) = p(x1) p(y1/x1) + p(x2) p(y1/x2) = 1/2 + 1/12 = 7/12

p(y2) = Σ(i=1..2) p(xi) p(y2/xi) = p(x1) p(y2/x1) + p(x2) p(y2/x2) = 1/4 + 1/6 = 5/12

Or, in matrix form:

P(Y) = P(X) P(Y/X) = [3/4  1/4] | 2/3  1/3 | = [7/12  5/12]
                                | 1/3  2/3 |

H(Y) = - Σ(j=1..2) p(yj) log2 p(yj) = - (7/12) log2(7/12) - (5/12) log2(5/12) = 0.98 [bits/symbol]

So we have:

I(X,Y) = H(Y) - H(Y/X) = 0.98 - 0.92 = 0.06 [bits/symbol]
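A quick numerical check of H(Y) and of the transinformation:

```python
from math import log2

py = [7/12, 5/12]          # p(y1), p(y2)
HY = -sum(p * log2(p) for p in py)
HYX = 0.92                 # average error from part a)
print(round(HY, 2))        # -> 0.98
print(round(HY - HYX, 2))  # I(X,Y) = H(Y) - H(Y/X) -> 0.06
```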
c)
The channel is a binary symmetric channel, so we have:

C = 1 + (2/3) log2(2/3) + (1/3) log2(1/3) = 0.082 [bits/symbol]
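The capacity of the binary symmetric channel with error probability 1/3 can be confirmed numerically:

```python
from math import log2

p = 1/3  # error probability of the binary symmetric channel
C = 1 + p * log2(p) + (1 - p) * log2(1 - p)
print(round(C, 3))  # -> 0.082
```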
5.
a)
p(y1/x1) + p(y2/x1) = 1  =>  p(y2/x1) = 1 - 0.8 = 0.2
p(y1/x2) = 1 - 0.7 = 0.3
p(w2/z1) = 1 - 0.5 = 0.5
p(w1/z2) = 1 - q

P(W/X) = P(Y/X) P(W/Z)

P(Y/X) = | 0.8  0.2 |     P(W/Z) = | 0.5  0.5 |
         | 0.3  0.7 |              | 1-q   q  |

P(W/X) = | 0.6-0.2q    0.4+0.2q  |
         | 0.85-0.7q   0.15+0.7q |

The equivalent channel X -> W has the transition probabilities p(w1/x1) = 0.6-0.2q, p(w2/x1) = 0.4+0.2q, p(w1/x2) = 0.85-0.7q, p(w2/x2) = 0.15+0.7q.

b)
The equivalent channel is binary symmetric when 0.6-0.2q = 0.15+0.7q (equivalently 0.4+0.2q = 0.85-0.7q), so 0.45 = 0.9q and q = 0.5. Then:

P(W/X) = | 0.5  0.5 |
         | 0.5  0.5 |

c)
For a binary symmetric channel with error probability 0.5:

C = 1 + 0.5 log2(0.5) + 0.5 log2(0.5) = 0 [bits/symbol]

so the equivalent channel does not transmit information.
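The series connection and the value q = 0.5 can be verified by multiplying the two noise matrices; a small Python sketch (the helper `matmul` is my own):

```python
def matmul(A, B):
    """Noise matrix of two channels in series: P(W/X) = P(Y/X) * P(W/Z)."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

q = 0.5
PYX = [[0.8, 0.2], [0.3, 0.7]]   # first channel
PWZ = [[0.5, 0.5], [1 - q, q]]   # second channel
PWX = matmul(PYX, PWZ)
print([[round(x, 3) for x in row] for row in PWX])  # -> [[0.5, 0.5], [0.5, 0.5]]
```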
6.
a)
p(y2/x1) = 1 - 0.9 = 0.1
p(y1/x2) = 1 - 0.8 = 0.2
p(w2/z1) = 1 - 0.4 = 0.6
p(w2/z2) = 1 - q

P(W/X) = P(Y/X) P(W/Z)

P(Y/X) = | 0.9  0.1 |     P(W/Z) = | 0.4  0.6 |
         | 0.2  0.8 |              |  q   1-q |

P(W/X) = | 0.36+0.1q   0.64-0.1q |
         | 0.08+0.8q   0.92-0.8q |

b)
The equivalent channel is binary symmetric when 0.36+0.1q = 0.92-0.8q (equivalently 0.64-0.1q = 0.08+0.8q), so 0.9q = 0.56 and q = 0.56/0.9 = 0.622. Then:

P(W/X) = | 0.422  0.578 |
         | 0.578  0.422 |
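The resulting matrix can be verified by multiplying the two noise matrices with q = 0.56/0.9; a small Python sketch (the helper `matmul` is my own):

```python
def matmul(A, B):
    """Noise matrix of two channels in series: P(W/X) = P(Y/X) * P(W/Z)."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

q = 0.56 / 0.9                   # value found in part b)
PYX = [[0.9, 0.1], [0.2, 0.8]]   # first channel
PWZ = [[0.4, 0.6], [q, 1 - q]]   # second channel
PWX = matmul(PYX, PWZ)
print([[round(x, 3) for x in row] for row in PWX])  # -> [[0.422, 0.578], [0.578, 0.422]]
```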