
Communications Systems II - Lec. 4
Fourth stage 2020-2021

4.1 Background

$$P(x_i/y_j) = \frac{P(y_j/x_i)\,P(x_i)}{P(y_j)} \qquad (4.1\ a)$$

$$P(y_j/x_i) = \frac{P(x_i/y_j)\,P(y_j)}{P(x_i)} \qquad (4.1\ b)$$

Equations (4.1) are called Bayes' rule. In Bayes' rule, one conditional
probability is expressed in terms of the reversed conditional probability.
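As a quick numeric sketch (the values below are illustrative assumptions, not taken from these notes), Bayes' rule can be checked in a few lines of Python:

```python
# Bayes' rule: P(x/y) = P(y/x) * P(x) / P(y)
# All three input values are assumed for illustration only.
p_x = 0.4          # prior probability that symbol x was transmitted
p_y_given_x = 0.9  # forward (channel) conditional probability
p_y = 0.5          # total probability of receiving y

# Equation (4.1 a): posterior probability that x was sent given y
p_x_given_y = p_y_given_x * p_x / p_y
print(p_x_given_y)              # ~0.72

# Equation (4.1 b): the reversed conditional recovers P(y/x)
print(p_x_given_y * p_y / p_x)  # ~0.9
```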
Example 4.1

Solution


Example 4.2: A communication source produces information in the form of
encrypted data using three symbols (L, M, N), which are related by the
following probabilities:

  i    P(i)
  L    1/3
  M    16/27
  N    2/27

  P(j/i)    j = L    j = M    j = N
  i = L     0        0.8      0.2
  i = M     0.5      0.5      0
  i = N     0.5      0.4      0.1

Find the conditional entropy H(j/i).

Solution:

The joint probability is p(i, j) = p(i) p(j/i).

Then,
P(L,L) = P(L) * P(L/L) = (1/3) * 0   = 0
P(L,M) = P(L) * P(M/L) = (1/3) * 0.8 = 0.267
P(L,N) = P(L) * P(N/L) = (1/3) * 0.2 = 0.067
P(M,L) = P(M) * P(L/M) = (16/27) * 0.5 = 0.296

And so on for the other probabilities, which gives the following table of
joint probabilities P(i, j):

  P(i,j)   i = L    i = M    i = N
  j = L    0        0.296    ?
  j = M    0.267    ?        ?
  j = N    0.067    ?        ?

Therefore:

$$H(j/i) = \sum_{i=1}^{n} \sum_{j=1}^{m} P(i, j)\, \log_2 \frac{1}{P(j/i)} \qquad \frac{\text{bit}}{\text{symbol}}$$
The amount of entropy is ? (left as homework).
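A short Python sketch (an addition to these notes, not part of the original solution) that builds the joint-probability table from P(i) and P(j/i) and evaluates the double sum above:

```python
from math import log2

# Source probabilities P(i) and transition probabilities P(j/i) from Example 4.2
p_i = {"L": 1/3, "M": 16/27, "N": 2/27}
p_j_given_i = {
    "L": {"L": 0.0, "M": 0.8, "N": 0.2},
    "M": {"L": 0.5, "M": 0.5, "N": 0.0},
    "N": {"L": 0.5, "M": 0.4, "N": 0.1},
}

# Joint probabilities P(i,j) = P(i) * P(j/i)
p_joint = {(i, j): p_i[i] * p_j_given_i[i][j] for i in p_i for j in p_i}

# Conditional entropy H(j/i) in bit/symbol; zero-probability terms contribute 0
h = sum(p * log2(1 / p_j_given_i[i][j])
        for (i, j), p in p_joint.items() if p > 0)
print(f"H(j/i) = {h:.3f} bit/symbol")
```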


4.2 Channel Matrix and Mutual Information

A discrete memoryless channel (DMC) is a channel whose output at a given time
is a function of the channel input at that time and not a function of previous
inputs. A DMC is specified by the following three quantities:

- An input alphabet, x.
- An output alphabet, y.
- The conditional probability distribution p(y/x).

It is called memoryless since the current output depends only on the current
input, not on previous inputs.

$$[P(y/x)] = \begin{bmatrix} P(y_1/x_1) & P(y_2/x_1) & \cdots & P(y_m/x_1) \\ P(y_1/x_2) & P(y_2/x_2) & \cdots & P(y_m/x_2) \\ \vdots & \vdots & & \vdots \\ P(y_1/x_n) & P(y_2/x_n) & \cdots & P(y_m/x_n) \end{bmatrix} \qquad (4.3\ a)$$

$$P(y_j) = \sum_{i=1}^{n} P(x_i)\,P(y_j/x_i) \qquad (4.3\ b)$$

$$P(x_i, y_j) = P(x_i)\,P(y_j/x_i) \qquad (4.3\ c)$$

Thus, if the input symbol probabilities P(xi) and the channel matrix are known,
the a posteriori conditional probabilities can be computed from equation (4.3).
The a posteriori conditional probability P(xi/yj) is the probability that xi
was transmitted when yj is received.

$$P(x_i/y_j) = \frac{P(y_j/x_i)\,P(x_i)}{P(y_j)} \qquad (4.3)$$
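As an illustrative sketch (an addition; the binary symmetric channel below is assumed, not taken from the notes), equations (4.3 b) and (4.3) can be evaluated directly:

```python
# A posteriori probabilities from the input probabilities and the channel
# matrix. The values below assume a binary symmetric channel with error
# probability 0.1, chosen only for illustration.
p_x = [0.6, 0.4]            # input symbol probabilities P(xi)
p_y_given_x = [[0.9, 0.1],  # channel matrix: rows are xi, columns are yj
               [0.1, 0.9]]

n, m = len(p_x), len(p_y_given_x[0])

# Equation (4.3 b): output probabilities P(yj) = sum_i P(xi) P(yj/xi)
p_y = [sum(p_x[i] * p_y_given_x[i][j] for i in range(n)) for j in range(m)]

# Equation (4.3): a posteriori probabilities P(xi/yj) = P(yj/xi) P(xi) / P(yj)
p_x_given_y = [[p_y_given_x[i][j] * p_x[i] / p_y[j] for j in range(m)]
               for i in range(n)]
print(p_y)          # [0.58, 0.42]
print(p_x_given_y)
```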

Mutual information can be derived and expressed in terms of the input symbol
probabilities and the channel matrix:

$$I(x; y) = \sum_{i=1}^{n} \sum_{j=1}^{m} P(x_i, y_j)\, \log_2 \frac{P(x_i, y_j)}{P(x_i)\,P(y_j)} \qquad (4.4)$$

where P(y_j) and P(x_i, y_j) are given by equations (4.3 b) and (4.3 c).

Because I(x; y) in Equation (4.4) is symmetric with respect to x and y, it
follows that:

$$I(x; y) = H(x) - H(x/y) \qquad (4.5\ a)$$

$$I(x; y) = H(y) - H(y/x) \qquad (4.5\ b)$$

The quantity H(y/x) is the conditional entropy of y given x; it is the average
uncertainty about the received symbol when the transmitted symbol is known.
Equation (4.5 b) can be re-written as:

$$I(x; y) = H(y) + \sum_{i=1}^{n} \sum_{j=1}^{m} P(x_i)\,P(y_j/x_i)\, \log_2 P(y_j/x_i) \qquad (4.6)$$
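The decomposition in equation (4.5 b) can be checked numerically. The sketch below (an addition, reusing the assumed binary symmetric channel from the previous sketch) computes I(x; y) = H(y) - H(y/x):

```python
from math import log2

p_x = [0.6, 0.4]            # assumed input probabilities, as before
p_y_given_x = [[0.9, 0.1],  # assumed channel matrix
               [0.1, 0.9]]

n, m = len(p_x), len(p_y_given_x[0])
p_y = [sum(p_x[i] * p_y_given_x[i][j] for i in range(n)) for j in range(m)]

# H(y): entropy of the received symbols
h_y = -sum(p * log2(p) for p in p_y if p > 0)

# H(y/x): average uncertainty about y when x is known, as in equation (4.6)
h_y_given_x = -sum(p_x[i] * p_y_given_x[i][j] * log2(p_y_given_x[i][j])
                   for i in range(n) for j in range(m)
                   if p_y_given_x[i][j] > 0)

# Equation (4.5 b)
print(f"I(x;y) = {h_y - h_y_given_x:.4f} bit/symbol")
```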

Dr. Hasanain A. Hasan Department of Electrical Engineering, University of Misan
