
Fayoum University
Faculty of Engineering
Department of Electrical Engineering
Communication ECE 407
Second Semester 2016/2017
Dr. Tamer Barakat
TA/ Muhamad Dakheel

Midterm Preparation

[1]

Solution:

[2]

Solution:

[3]

Solution:

[4]

Solution:

[5]

Solution:

[6]

Solution:

[7]

Solution:

[8]

Solution:

[9]

Solution:

[10]

Solution:

[11]

Solution:

[12]

If Y = aX + b, prove that,

h(Y) = h(X) + log2|a|

Solution:
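The scaling property can be sanity-checked with closed-form differential entropies (a minimal sketch; the Gaussian choice and the values of a, b, and sigma are arbitrary, picked only for illustration):

```python
import math

def h_gaussian(sigma):
    """Differential entropy (bits) of a Gaussian with standard deviation sigma."""
    return 0.5 * math.log2(2 * math.pi * math.e * sigma**2)

a, b = -3.0, 1.5   # arbitrary scale and shift (the shift b does not affect entropy)
sigma = 0.7        # arbitrary standard deviation for X

h_X = h_gaussian(sigma)
h_Y = h_gaussian(abs(a) * sigma)   # Y = aX + b is Gaussian with std dev |a|*sigma

# h(Y) = h(X) + log2|a|
assert abs(h_Y - (h_X + math.log2(abs(a)))) < 1e-12
```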

[13]

Solution:

[14]

Solution:

[15]

[16]

Solution:

[17]

Solution:

[18]

Solution:

[19]

Solution:

[20]

Solution:

[21]

Solution:

[22]

Solution:

[23]

Solution:

[24]

Solution:

[25]

Solution:

[26]

Solution:

[27]

Solution:

[28]

Solution:

[29]

Solution:

[30]

Solution:

[31]

Solution:

[32]

Solution:

[33]

Solution:

[34]

Solution:

[35]

Solution:

[36]

Solution:

[37]

Solution:

[38]

Solution:

[39]

Solution:

[40]

Solution:

[41]

Solution:

[42]

Solution:

[43]

The binary erasure channel has four inputs and five outputs as described in the figure, and the corresponding transition matrix is described in the next figure. The inputs are labeled 0, 1, 2, and 3 with given probabilities expressed in terms of a single parameter p, and the outputs are labeled 0, 1, 2, 3, and E.

(i) Find the p that gives a maximum I(X; Y).

(ii) According to the result in (i), find the corresponding channel capacity.

Solution:

H(Y) = -Σ_y p(y) log2 p(y)

Now, let's define the horseshoe function, or entropy function. By definition, the entropy of a binary source with probability p is H(p), sometimes referred to as H(p, 1 - p), or simply Ω(p),

Ω(p) = H(p, 1 - p) = -p log2(p) - (1 - p) log2(1 - p)

And therefore, H(Y) can be collected into this form, and hence,

H(Y) = H(p, 1 - p)

We'd simply derive,

dH(p, 1 - p)/dp = log2((1 - p)/p)

And therefore, if the probabilities are not multiplied by any constants, the derivative is always the log of the second probability over the first one. If both probabilities are multiplied by the same constant c, of course, the log2[(1 - p)/p] term is multiplied by that constant,

dH(cp, c(1 - p))/dp = c log2((1 - p)/p)

Also, we calculate H(Y|X); since H(Y|X) = Σ_x p(x) H(Y|X = x), it is linear in p, so its derivative with respect to p is a constant, call it -c.

So, the mutual information is,

I(X; Y) = H(Y) - H(Y|X) = H(p, 1 - p) - H(Y|X)

(i) Trying to maximize the mutual information and using the derivative of the entropy function,

dI(X; Y)/dp = log2((1 - p)/p) + c

Solving for the derivative equal to zero,

log2((1 - p)/p) + c = 0

(1 - p)/p = 2^(-c)

p = 1/(1 + 2^(-c))

(ii) Substituting this p into the mutual information, we find the corresponding channel capacity,

C = I(X; Y)|p = 1/(1 + 2^(-c))
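The entropy-function machinery above is easy to check numerically (a minimal sketch; the value c = 1 is an arbitrary stand-in for the constant that comes from H(Y|X) in the actual problem):

```python
import math

def H2(p):
    """Binary entropy function H(p, 1 - p) in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def dH2(p):
    """Closed-form derivative: dH(p, 1 - p)/dp = log2((1 - p)/p)."""
    return math.log2((1 - p) / p)

# Check the closed form against a central finite difference.
p, eps = 0.3, 1e-6
numeric = (H2(p + eps) - H2(p - eps)) / (2 * eps)
assert abs(numeric - dH2(p)) < 1e-5

# Maximizing f(p) = H2(p) + c*p: setting log2((1 - p)/p) + c = 0
# gives p* = 1 / (1 + 2**(-c)).
c = 1.0                        # arbitrary stand-in constant
p_star = 1 / (1 + 2 ** (-c))
assert abs(dH2(p_star) + c) < 1e-9
```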

[44]

Find the channel capacity for the following cascaded channels:

Solution:

We find the overall channel: we find all possible paths from one input to one output, and then we sum the probabilities of all the paths.

We notice that the overall channel is just a binary symmetric channel with crossover probability of e/2, which leaves us with two points:

1) The probability distribution of the inputs that achieves the maximum information rate is uniform over the inputs, that is,

p(x = 0) = p(x = 1) = 1/2

2) The channel capacity can be simply calculated using the well-known BSC capacity,

C = 1 - H(e/2)
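The path-summing step is exactly a product of the per-stage transition matrices. A minimal sketch (the two stage matrices below are hypothetical, since the original figure is not reproduced here; the cascade in the problem itself works out to a BSC with crossover e/2):

```python
import math

def cascade(a, b):
    """Overall transition matrix of two cascaded channels.

    Multiplying the row-stochastic matrices sums the probabilities of
    all paths from each input to each output.
    """
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def bsc_capacity(q):
    """C = 1 - H(q) in bits for a BSC with crossover probability q."""
    if q in (0.0, 1.0):
        return 1.0
    return 1 + q * math.log2(q) + (1 - q) * math.log2(1 - q)

# Hypothetical stages: two BSCs with crossovers q1 and q2.
q1, q2 = 0.1, 0.2
P = cascade([[1 - q1, q1], [q1, 1 - q1]],
            [[1 - q2, q2], [q2, 1 - q2]])

q = P[0][1]          # overall crossover: q1*(1 - q2) + q2*(1 - q1)
C = bsc_capacity(q)
```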

[45]

A channel has an input ensemble X consisting of two numbers, used with equal probabilities,

P(X = x1) = P(X = x2) = 1/2

The output Y is the sum of the input X and an independent noise random variable Z with

f_Z(z) = { 1/2,  0 ≤ z ≤ 2
         { 0,    otherwise

The conditional probability density of Y is

f(y|x) = { 1/2,  0 ≤ y - x ≤ 2
         { 0,    otherwise

i) Find and sketch the output probability density.

ii) Find the channel capacity.

Solution:

(i)

f_Y(y) = f_{X+Z}(y) = f_X(y) * f_Z(y)

After the convolution, f_Y is piecewise constant:

f_Y(y) = { 1/4,  on the two end intervals (only one input value contributes)
         { 1/2,  on the middle interval (both input values contribute)
         { 0,    otherwise

(ii) Simply using the relation,

I(X; Y) = h(Y) - h(Y|X)

We also notice that,

h(Y|X) = h(Z)

So, simply by knowing the probability distribution and with the absence of any unknowns, h(Y) is the sum of the contributions of the three constant pieces,

h(Y) = -Σ (width · height · log2(height)) over the three pieces

Also,

h(Y|X) = h(Z) = log2(2) = 1 bit

So,

C = I(X; Y) = h(Y) - 1
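A numerical version of the argument, under assumed values (X ∈ {0, 1} equiprobable and Z ~ Uniform(0, 2) are assumptions made for this sketch; they reproduce the three-piece output density above):

```python
import math

# Assumed model: X in {0, 1} each with probability 1/2, Z ~ Uniform(0, 2),
# and Y = X + Z. The mixture of the two shifted uniforms gives a
# piecewise-constant output density:
#   f_Y = 1/4 on (0, 1),  1/2 on (1, 2),  1/4 on (2, 3).
pieces = [(1.0, 0.25), (1.0, 0.50), (1.0, 0.25)]   # (width, height)

# Sanity check: the density integrates to 1.
assert abs(sum(w * h for w, h in pieces) - 1.0) < 1e-12

# Differential entropy (bits) of a piecewise-constant density.
h_Y = -sum(w * h * math.log2(h) for w, h in pieces)
h_Z = math.log2(2.0)           # h(Z) for Uniform(0, 2)

C = h_Y - h_Z                  # I(X;Y) = h(Y) - h(Y|X), and h(Y|X) = h(Z)
```

With these assumed numbers, h(Y) = 1.5 bits, h(Z) = 1 bit, and C = 0.5 bit.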

[46] Three binary symmetric channels are connected in cascade, as shown in the figure below. Assuming that all channels have the same crossover probability, p,

[Binary symmetric channel 1] --> [Binary symmetric channel 2] --> [Binary symmetric channel 3]

(i) Find the transition matrix for the overall channel in terms of p.

(ii) If p = 0.25, find the capacity of the cascaded connection.

(iii) Compare the capacity of the overall channel to the capacity of channel 1, and comment on the results.

Solution:

(i) To find the overall transition matrix, we find all paths from one input to one output,

1st path: direct path, (1 - p)(1 - p)(1 - p) = (1 - p)^3

2nd path: (p)(p)(1 - p) = p^2 (1 - p)

3rd path: (1 - p)(p)(p) = p^2 (1 - p)

4th path: (p)(1 - p)(p) = p^2 (1 - p)

So, the overall probability from 0 to 0 is,

p(0|0) = (1 - p)^3 + 3 p^2 (1 - p)

Similarly, we can find the probability from 0 to 1,

p(1|0) = p^3 + 3 p (1 - p)^2

So, the overall transition matrix is,

P = [ (1 - p)^3 + 3 p^2 (1 - p)     p^3 + 3 p (1 - p)^2       ]
    [ p^3 + 3 p (1 - p)^2           (1 - p)^3 + 3 p^2 (1 - p) ]

(ii) For p = 0.25,

P = [ 0.5625   0.4375 ]
    [ 0.4375   0.5625 ]

And the channel capacity for any BSC is,

C = 1 - H(0.4375) ≈ 0.0113 bits

which means this channel is almost useless.

(iii) For a single BSC,

C = 1 - H(0.25) ≈ 0.1887 bits

The capacity of a single channel is much better than that of the cascaded connection, but still bad.
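The whole computation in (i)-(iii) can be verified by cubing the single-stage matrix (a minimal check; `matmul` is a small helper defined here):

```python
import math

def H2(p):
    """Binary entropy function in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def matmul(a, b):
    """2x2 matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

p = 0.25
M = [[1 - p, p], [p, 1 - p]]        # one BSC stage
P = matmul(matmul(M, M), M)         # overall matrix of three cascaded stages

q = P[0][1]                          # overall crossover probability
assert abs(q - (3 * p * (1 - p) ** 2 + p ** 3)) < 1e-12   # q = 0.4375

C_cascade = 1 - H2(q)                # ≈ 0.0113 bits
C_single = 1 - H2(p)                 # ≈ 0.1887 bits
assert C_cascade < C_single
```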

[47] State whether the following are true or false by circling the correct choice (justification NOT required).

(a) T / F   H(X|X) = 0.

True: H(X|X = x) = 0 for every x.

(b) T / F   H(X) > H(X|Y).

False: when X and Y are independent, H(X) = H(X|Y), so the strict inequality can fail.

(c) T / F   I(X; Y) ≤ H(Y).

True: I(X; Y) = H(Y) - H(Y|X) ≤ H(Y).

(d) T / F   If X and Y are independent, then H(X|Y) = H(Y|X).

False: if X and Y are independent, then H(X|Y) = H(X) and H(Y|X) = H(Y), which are not equal in general.

(e) T / F   If X and Y are not independent, then I(X; Y) > 0.

True: this is because I(X; Y) ≥ 0 with equality iff X and Y are independent.
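These statements can be sanity-checked numerically on a small joint pmf (the joint distribution below is an arbitrary dependent example chosen for the sketch):

```python
import math

def H(probs):
    """Entropy in bits of a pmf given as an iterable of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Arbitrary dependent joint pmf p(x, y) on {0, 1} x {0, 1}.
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}
px = [0.5, 0.5]                  # marginal of X
py = [0.6, 0.4]                  # marginal of Y

H_X, H_Y = H(px), H(py)
H_XY = H(pxy.values())
I = H_X + H_Y - H_XY             # mutual information I(X; Y)

assert I > 0                     # (e): X, Y dependent  =>  I(X;Y) > 0
assert I <= H_Y + 1e-12          # (c): I(X;Y) <= H(Y)
assert H_X - I <= H_X            # (b): H(X|Y) = H(X) - I(X;Y) <= H(X)
```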
