
S3. Information Transmission Theory


The channel transforms the input symbols of [X] into the output symbols of [Y].
The input entropy: H(X) = -\sum_{i=1}^{n} p(x_i) \log_2 p(x_i) [bits/symbol]

The output entropy: H(Y) = -\sum_{j=1}^{m} p(y_j) \log_2 p(y_j) [bits/symbol]

The joint entropy: H(X,Y) = -\sum_{i=1}^{n} \sum_{j=1}^{m} p(x_i, y_j) \log_2 p(x_i, y_j) [bits/symbol]

The equivocation H(X/Y): H(X/Y) = -\sum_{i=1}^{n} \sum_{j=1}^{m} p(x_i, y_j) \log_2 p(x_i/y_j) [bits/symbol]

The mean error: H(Y/X) = -\sum_{i=1}^{n} \sum_{j=1}^{m} p(x_i, y_j) \log_2 p(y_j/x_i) [bits/symbol]

The transinformation (mutual information):

I(X,Y) = H(X) + H(Y) - H(X,Y) = H(X) - H(X/Y) = H(Y) - H(Y/X)
The capacity of discrete channels:

C = \max I(X,Y) = \max [H(X) - H(X/Y)] = \max [H(Y) - H(Y/X)] [bits/symbol]
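As a numerical illustration of these relations, the sketch below computes all the quantities above from a joint probability matrix p(x_i, y_j). It is only an illustration; the helper name entropies_from_joint and the example values are our own choices, not part of the seminar formulas.

```python
import numpy as np

def entropies_from_joint(p_xy):
    """All channel entropies (in bits/symbol) from the joint matrix p(x_i, y_j)."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1)                      # marginal p(x_i)
    p_y = p_xy.sum(axis=0)                      # marginal p(y_j)

    def H(p):                                   # entropy, ignoring zero-probability terms
        p = p[p > 0]
        return float(-np.sum(p * np.log2(p)))

    H_X, H_Y, H_XY = H(p_x), H(p_y), H(p_xy.ravel())
    H_X_given_Y = H_XY - H_Y                    # equivocation H(X/Y)
    H_Y_given_X = H_XY - H_X                    # mean error H(Y/X)
    I_XY = H_X + H_Y - H_XY                     # transinformation I(X,Y)
    return H_X, H_Y, H_XY, H_X_given_Y, H_Y_given_X, I_XY

# example values chosen only for illustration
print(entropies_from_joint([[1/2, 1/4], [1/12, 1/6]]))
```

Here the conditional entropies are obtained from the chain rules H(X/Y) = H(X,Y) - H(Y) and H(Y/X) = H(X,Y) - H(X), which are equivalent to the double-sum definitions above.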

Discrete channel without perturbations:

H(X,Y) = H(X) = H(Y) = I(X,Y); H(X/Y) = H(Y/X) = 0

C = \max I(X,Y) = \max H(X) = \log_2 n [bits/symbol]

Discrete channel with heavy perturbations:

H(X,Y) = H(X) + H(Y); H(X/Y) = H(X); H(Y/X) = H(Y); I(X,Y) = 0

C = 0 [bits/symbol]
Discrete symmetric channel: the capacity is reached for a uniform distribution of the input probabilities, p(x1) = p(x2) = ... = p(xn) = 1/n:
C = \sum_{i=1}^{n} \frac{1}{n} \sum_{j=1}^{m} p(y_j/x_i) \log_2 p(y_j/x_i) + \log_2 m [bits/symbol]
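This relation can be checked numerically. The sketch below (helper name of our own choosing) evaluates I(X,Y) for a uniform input directly from a transition matrix P(Y/X); for a symmetric channel this value is the capacity C.

```python
import numpy as np

def capacity_uniform_input(P):
    """I(X,Y) in bits/symbol for a uniform input; for a symmetric channel this is C."""
    P = np.asarray(P, dtype=float)          # rows are p(y_j / x_i), each row sums to 1
    n = P.shape[0]
    p_x = np.full(n, 1.0 / n)               # uniform input, p(x_i) = 1/n
    p_y = p_x @ P                           # output distribution p(y_j)
    H_Y = -np.sum(p_y[p_y > 0] * np.log2(p_y[p_y > 0]))
    logP = np.where(P > 0, np.log2(np.where(P > 0, P, 1.0)), 0.0)
    H_Y_given_X = -np.sum(p_x[:, None] * P * logP)   # mean error H(Y/X)
    return float(H_Y - H_Y_given_X)

print(capacity_uniform_input([[2/3, 1/3], [1/3, 2/3]]))   # ~0.082 bits/symbol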

Types of discrete channels

The discrete binary channel

[P(Y/X)] = \begin{pmatrix} p & 1-p \\ 1-p & p \end{pmatrix}

C = 1 + p \log_2 p + (1-p) \log_2 (1-p) [bits/symbol]
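For instance, the formula can be evaluated with a short helper (an illustrative sketch; the function name is ours):

```python
from math import log2

def bsc_capacity(p):
    """C = 1 + p*log2(p) + (1-p)*log2(1-p) for the binary symmetric channel."""
    return 1 + sum(s * log2(s) for s in (p, 1 - p) if s > 0)

print(bsc_capacity(2/3))   # ~0.082 bits/symbol (the channel of exercise 4)
```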

The binary erasure channel

[P(Y/X)] = \begin{pmatrix} 1-q & q & 0 \\ 0 & q & 1-q \end{pmatrix}

C = 1 - q [bits/symbol] for p(x1) = p(x2) = 1/2.
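A quick numerical check of C = 1 - q, assuming a uniform input p(x1) = p(x2) = 1/2 (sketch only; the function name is ours):

```python
from math import log2

def bec_mutual_information(q):
    """I(X,Y) for the erasure channel with p(x1) = p(x2) = 1/2; equals C = 1 - q."""
    p_y = [(1 - q) / 2, q, (1 - q) / 2]                            # outputs y1, erasure, y2
    H_Y = -sum(p * log2(p) for p in p_y if p > 0)
    H_Y_given_X = -sum(s * log2(s) for s in (1 - q, q) if s > 0)   # same for both rows
    return H_Y - H_Y_given_X

print(bec_mutual_information(0.3), 1 - 0.3)   # both ~0.7 bits/symbol
```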
The binary channel with errors and erasure

[P(Y/X)] = \begin{pmatrix} 1-p-q & q & p \\ p & q & 1-p-q \end{pmatrix}

C = 1 - q + p \log_2 p + (1-p-q) \log_2 (1-p-q) - (1-q) \log_2 (1-q) [bits/symbol]

for p(x1) = p(x2) = 1/2.
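The closed-form expression can be cross-checked against the direct computation I(X,Y) = H(Y) - H(Y/X) with uniform input. This is only a sketch; the function names are our own.

```python
from math import log2

def eec_capacity(p, q):
    """Closed-form C of the binary errors-and-erasures channel (uniform input)."""
    return (1 - q + p * log2(p) + (1 - p - q) * log2(1 - p - q)
            - (1 - q) * log2(1 - q))

def eec_mutual_information(p, q):
    """Direct I(X,Y) = H(Y) - H(Y/X) for the same channel, as a cross-check."""
    p_y = [(1 - q) / 2, q, (1 - q) / 2]
    H_Y = -sum(s * log2(s) for s in p_y if s > 0)
    H_Y_given_X = -sum(s * log2(s) for s in (1 - p - q, q, p) if s > 0)
    return H_Y - H_Y_given_X

print(eec_capacity(0.1, 0.2), eec_mutual_information(0.1, 0.2))   # both ~0.365
```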

The ternary symmetric channel

[P(Y/X)] = \begin{pmatrix} 1-p & q & q \\ q & 1-p & q \\ q & q & 1-p \end{pmatrix}, with p = 2q so that each row sums to 1

C = \log_2 3 + (1-p) \log_2 (1-p) + 2q \log_2 q [bits/symbol]


for p(x1) = p(x2) = p(x3) = 1/3.
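For example (sketch only; the function name is ours):

```python
from math import log2

def ternary_symmetric_capacity(q):
    """C = log2(3) + (1-p)*log2(1-p) + 2q*log2(q), with p = 2q (uniform input)."""
    p = 2 * q
    return log2(3) + (1 - p) * log2(1 - p) + 2 * q * log2(q)

print(ternary_symmetric_capacity(0.05))   # ~1.02 bits/symbol
```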

Exercises

1. A transmission channel has the following noise matrix:

The input symbols are x1, x2 and x3 with the probabilities:

a) Calculate the mean information quantity given by a received symbol
b) Calculate the information quantity that is lost when transmitting a message with 400 symbols
c) Calculate the information quantity received using the previous assumptions
2. We have the following joint probability matrix of the input and output of a channel:

Calculate the conditional entropies H(Y/X) and H(X/Y).


3. The joint probability matrix of a transmission channel is:

Calculate:
a) The input entropy
b) The output entropy
c) The joint entropy
d) The mean error and equivocation
e) The transinformation and channel capacity

4. A binary symmetrical channel is characterized by the transition matrix:


P(Y/X) = \begin{pmatrix} 2/3 & 1/3 \\ 1/3 & 2/3 \end{pmatrix}

The input probabilities are: p(x1) = 3/4, p(x2) = 1/4.
Calculate:
a) The average error H(Y/X)
b) The transinformation I(X,Y)
c) The channel capacity C
5. Two binary channels with the transition graphs below are linked in series
[Transition graphs: first channel X → Y with p(y1/x1) = 0.8 and p(y2/x2) = 0.7; second channel Z → W with p(w1/z1) = 0.5 and p(w2/z2) = q]
Determine:
a) The transition graph of the equivalent channel
b) The value of q such that the equivalent channel is a binary symmetrical channel
c) The capacity of the equivalent channel
6. Two binary channels with the transition graphs below are linked in series
[Transition graphs: first channel X → Y with p(y1/x1) = 0.9 and p(y2/x2) = 0.8; second channel Z → W with p(w1/z1) = 0.4 and p(w1/z2) = q]

Determine:
a) The transition graph of the equivalent channel
b) The value of q such that the equivalent channel is a binary symmetrical channel
Solutions
1.
a) We calculate the source entropy:

b) We calculate the mean error for receiving one symbol:

The error for a message with n = 400 symbols is:

c) The total information quantity received will be:

We need to determine H(Y). For this we need to calculate the probabilities p(yj).

We have:

We calculate the output entropy:

The total information quantity received will be:

2.
We know that:

We have:

Because p(y_j/x_i) = p(x_i, y_j) / p(x_i), the matrix P(Y/X) is calculated from the matrix P(X,Y) by dividing the elements of every row i by p(x_i). We have:

We calculate the mean error:

In a similar way we calculate the elements of the matrix P(X/Y) by dividing the elements of each
column j of the matrix P(X,Y) by p(yj).

The equivocation is:

3.
a) We need to calculate the probabilities p(xi).

So we have

The input entropy is:

b) Using the relation:

We have

And the output entropy:

c)

d) The conditional entropies are:

e) We compute the transinformation and the channel capacity:

Conclusion: this channel has very heavy perturbations, so it transmits no information.
4.
a) The joint probabilities p(x_i, y_j) = p(x_i) p(y_j/x_i) are:

p(x_1, y_1) = p(x_1) p(y_1/x_1) = (3/4)(2/3) = 1/2
p(x_1, y_2) = p(x_1) p(y_2/x_1) = (3/4)(1/3) = 1/4
p(x_2, y_1) = p(x_2) p(y_1/x_2) = (1/4)(1/3) = 1/12
p(x_2, y_2) = p(x_2) p(y_2/x_2) = (1/4)(2/3) = 1/6

P(X,Y) = \begin{pmatrix} 1/2 & 1/4 \\ 1/12 & 1/6 \end{pmatrix}

H(Y/X) = -\sum_{i=1}^{2} \sum_{j=1}^{2} p(x_i, y_j) \log_2 p(y_j/x_i)
= -(1/2)\log_2(2/3) - (1/4)\log_2(1/3) - (1/12)\log_2(1/3) - (1/6)\log_2(2/3) = 0.92 [bits/symbol]
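A quick numerical check of part a) (an illustrative sketch only; the variable names are ours):

```python
from math import log2

P_XY = [[1/2, 1/4], [1/12, 1/6]]          # joint probabilities p(x_i, y_j)
P_Y_given_X = [[2/3, 1/3], [1/3, 2/3]]    # transition probabilities p(y_j / x_i)

H_Y_given_X = -sum(P_XY[i][j] * log2(P_Y_given_X[i][j])
                   for i in range(2) for j in range(2))
print(round(H_Y_given_X, 2))   # 0.92 bits/symbol
```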

b) I(X,Y) = H(X) + H(Y) - H(X,Y) = H(X) - H(X/Y) = H(Y) - H(Y/X)

Using the third relation, we need to determine H(Y):

p(y_1) = \sum_{i=1}^{2} p(x_i) p(y_1/x_i) = p(x_1) p(y_1/x_1) + p(x_2) p(y_1/x_2) = (3/4)(2/3) + (1/4)(1/3) = 7/12
p(y_2) = \sum_{i=1}^{2} p(x_i) p(y_2/x_i) = p(x_1) p(y_2/x_1) + p(x_2) p(y_2/x_2) = (3/4)(1/3) + (1/4)(2/3) = 5/12

Or, in matrix form:

P(Y) = P(X) \cdot P(Y/X) = \begin{pmatrix} 3/4 & 1/4 \end{pmatrix} \begin{pmatrix} 2/3 & 1/3 \\ 1/3 & 2/3 \end{pmatrix} = \begin{pmatrix} 7/12 & 5/12 \end{pmatrix}

H(Y) = -\sum_{j=1}^{2} p(y_j) \log_2 p(y_j) = -(7/12)\log_2(7/12) - (5/12)\log_2(5/12) = 0.98 [bits/symbol]

So we have:

I(X,Y) = H(Y) - H(Y/X) = 0.98 - 0.92 = 0.06 [bits/symbol]
c) The channel is a binary symmetrical channel, so:

C = 1 + (2/3)\log_2(2/3) + (1/3)\log_2(1/3) = 0.082 [bits/symbol]
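The results of parts b) and c) can be verified numerically (sketch only; the variable names are ours):

```python
from math import log2

P_Y_given_X = [[2/3, 1/3], [1/3, 2/3]]
p_x = [3/4, 1/4]

p_y = [sum(p_x[i] * P_Y_given_X[i][j] for i in range(2)) for j in range(2)]   # [7/12, 5/12]
H_Y = -sum(p * log2(p) for p in p_y)
H_Y_given_X = -sum(p_x[i] * P_Y_given_X[i][j] * log2(P_Y_given_X[i][j])
                   for i in range(2) for j in range(2))
I_XY = H_Y - H_Y_given_X
C = 1 + (2/3) * log2(2/3) + (1/3) * log2(1/3)   # binary symmetric channel capacity
print(round(H_Y, 2), round(I_XY, 2), round(C, 3))   # 0.98  0.06  0.082
```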
5.
a) From the transition graphs, the remaining probabilities are:

p(y_1/x_1) + p(y_2/x_1) = 1
p(y_2/x_1) = 1 - 0.8 = 0.2
p(y_1/x_2) = 1 - 0.7 = 0.3
p(w_2/z_1) = 1 - 0.5 = 0.5
p(w_1/z_2) = 1 - q

The equivalent channel has the transition matrix:

P(W/X) = P(Y/X) \cdot P(W/Z) = \begin{pmatrix} 0.8 & 0.2 \\ 0.3 & 0.7 \end{pmatrix} \begin{pmatrix} 0.5 & 0.5 \\ 1-q & q \end{pmatrix} = \begin{pmatrix} 0.6 - 0.2q & 0.4 + 0.2q \\ 0.85 - 0.7q & 0.15 + 0.7q \end{pmatrix}

[Equivalent transition graph: x1 → w1 with probability 0.6 - 0.2q, x2 → w2 with probability 0.15 + 0.7q, and the crossed branches 0.4 + 0.2q and 0.85 - 0.7q]

b) The equivalent channel is a binary symmetrical channel when its diagonal elements are equal:

0.6 - 0.2q = 0.15 + 0.7q (equivalently 0.4 + 0.2q = 0.85 - 0.7q), so 0.45 = 0.9q and q = 0.5

P(W/X) = \begin{pmatrix} 0.5 & 0.5 \\ 0.5 & 0.5 \end{pmatrix}

c) C = 1 + 0.5 \log_2 0.5 + 0.5 \log_2 0.5 = 0 [bits/symbol]
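The cascade computation can be reproduced numerically (a sketch using numpy, with names of our own choosing):

```python
import numpy as np

P_Y_given_X = np.array([[0.8, 0.2], [0.3, 0.7]])

def equivalent_channel(q):
    """P(W/X) of the series connection, P(W/X) = P(Y/X) @ P(W/Z)."""
    P_W_given_Z = np.array([[0.5, 0.5], [1 - q, q]])
    return P_Y_given_X @ P_W_given_Z

q = 0.45 / 0.9                       # symmetry condition 0.6 - 0.2q = 0.15 + 0.7q
print(q, equivalent_channel(q))      # q = 0.5, all entries 0.5, hence C = 0
```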


6.
a) From the transition graphs, the remaining probabilities are:

p(y_1/x_1) + p(y_2/x_1) = 1
p(y_2/x_1) = 1 - 0.9 = 0.1
p(y_1/x_2) = 1 - 0.8 = 0.2
p(w_2/z_1) = 1 - 0.4 = 0.6
p(w_2/z_2) = 1 - q

P(W/X) = P(Y/X) \cdot P(W/Z) = \begin{pmatrix} 0.9 & 0.1 \\ 0.2 & 0.8 \end{pmatrix} \begin{pmatrix} 0.4 & 0.6 \\ q & 1-q \end{pmatrix} = \begin{pmatrix} 0.36 + 0.1q & 0.64 - 0.1q \\ 0.08 + 0.8q & 0.92 - 0.8q \end{pmatrix}

[Equivalent transition graph: x1 → w1 with probability 0.36 + 0.1q, x2 → w2 with probability 0.92 - 0.8q, and the crossed branches 0.64 - 0.1q and 0.08 + 0.8q]

b) The equivalent channel is a binary symmetrical channel when its diagonal elements are equal:

0.36 + 0.1q = 0.92 - 0.8q, so 0.9q = 0.56 and q = 0.622

P(W/X) = \begin{pmatrix} 0.422 & 0.578 \\ 0.578 & 0.422 \end{pmatrix}
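The same numerical check for this exercise (sketch only):

```python
import numpy as np

P_Y_given_X = np.array([[0.9, 0.1], [0.2, 0.8]])

def equivalent_channel(q):
    """P(W/X) of the series connection, with P(W/Z) = [[0.4, 0.6], [q, 1 - q]]."""
    P_W_given_Z = np.array([[0.4, 0.6], [q, 1 - q]])
    return P_Y_given_X @ P_W_given_Z

q = 0.56 / 0.9                                        # symmetry condition 0.36 + 0.1q = 0.92 - 0.8q
print(round(q, 3), equivalent_channel(q).round(3))    # q = 0.622, diagonal ~0.422
```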
