Lec 4
4.1 Background
P(xi|yj) = P(yj|xi) P(xi) / P(yj)        (4.1 a)
P(yj|xi) = P(xi|yj) P(yj) / P(xi)        (4.1 b)
Equations (4.1) are called Bayes' rule. In Bayes' rule, one conditional
probability is expressed in terms of the reversed conditional probability.
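As a quick numeric illustration of reversing a conditional probability with Bayes' rule (the probabilities here are made up for the sketch, not taken from the example below):

```python
# Bayes' rule: P(x|y) = P(y|x) * P(x) / P(y)
# Hypothetical values for illustration only.
p_x = 0.4          # prior probability P(x)
p_y_given_x = 0.9  # forward conditional P(y|x)
p_y = 0.6          # P(y), e.g. obtained from the total-probability rule

# Reversed conditional P(x|y):
p_x_given_y = p_y_given_x * p_x / p_y
print(round(p_x_given_y, 6))  # 0.6
```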
Example 4.1
Solution:
Proceeding in the same way for the remaining probabilities, we obtain the
following table of conditional probabilities:
P(j/i):
i \ j    L        M        N
L        0        0.296    ?
M        0.267    ?        ?
N        0.067    ?        ?
Therefore:

H(Y|X) = ∑_{i=1}^{n} ∑_{j=1}^{m} P(xi, yj) log2( 1 / P(yj|xi) )   bit/symbol
The value of the entropy is left as homework.
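The double-sum formula above can be sketched in code. The joint probabilities below are hypothetical illustration values, not the entries of the table (those are the homework):

```python
import math

# Hypothetical joint distribution P(x_i, y_j) for a 2x2 case
# (row i = input symbol, column j = output symbol).
P_joint = [[0.3, 0.2],
           [0.1, 0.4]]

# Marginals P(x_i), needed to form the conditionals P(y_j|x_i).
P_x = [sum(row) for row in P_joint]

# H(Y|X) = sum_i sum_j P(x_i, y_j) * log2( 1 / P(y_j|x_i) )  [bit/symbol]
H = 0.0
for i, row in enumerate(P_joint):
    for p_xy in row:
        if p_xy > 0:  # skip zero-probability pairs (their contribution is 0)
            p_y_given_x = p_xy / P_x[i]
            H += p_xy * math.log2(1.0 / p_y_given_x)

print(H)  # conditional entropy in bit/symbol
```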
A discrete memoryless channel (DMC) is a channel whose output at a given time
is a function of the channel input at that time, not of previous inputs; it is
called memoryless because the current output depends only on the current
input. A DMC is specified by the following three quantities:
the input alphabet X = {x1, x2, …, xn}        (4.3 a)
the output alphabet Y = {y1, y2, …, ym}        (4.3 b)
the channel matrix [P(yj|xi)], i = 1, …, n, j = 1, …, m        (4.3 c)
Thus, if the input symbol probabilities P(xi) and the channel matrix are
known, the a posteriori conditional probabilities can be computed from
equation (4.3).
(4.3)
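A minimal sketch of that computation for a hypothetical binary channel (the priors and channel matrix below are illustrative assumptions): the output probabilities P(yj) follow from the total-probability rule, and the a posteriori probabilities P(xi|yj) from Bayes' rule (4.1).

```python
# Hypothetical input symbol probabilities P(x_i).
P_x = [0.5, 0.5]

# Hypothetical channel matrix: row i holds P(y_j | x_i).
P_y_given_x = [[0.9, 0.1],
               [0.2, 0.8]]

n = len(P_x)              # number of input symbols
m = len(P_y_given_x[0])   # number of output symbols

# Total probability: P(y_j) = sum_i P(y_j|x_i) * P(x_i).
P_y = [sum(P_y_given_x[i][j] * P_x[i] for i in range(n)) for j in range(m)]

# Bayes' rule: a posteriori probabilities P(x_i|y_j).
P_x_given_y = [[P_y_given_x[i][j] * P_x[i] / P_y[j] for j in range(m)]
               for i in range(n)]

print(P_y)                  # output symbol probabilities
print(P_x_given_y[0][0])    # P(x_0|y_0) = 0.9*0.5/P(y_0), about 0.818
```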
(4.4)
(4.5 b)
The quantity H(Y|X) is the conditional entropy of Y given X; it is the
average uncertainty about the received symbol when the transmitted symbol is
known. Equation (4.5 b) can be rewritten as:
(4.6)
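Whatever exact form equation (4.6) takes in the original, one standard identity behind such rewrites uses P(xi, yj) = P(xi) P(yj|xi) to turn the joint sum into a prior-weighted sum of per-input entropies, H(Y|X) = ∑_i P(xi) ∑_j P(yj|xi) log2(1/P(yj|xi)). A numeric check with hypothetical values:

```python
import math

# Hypothetical priors and channel matrix (row i holds P(y_j|x_i)).
P_x = [0.6, 0.4]
P_y_given_x = [[0.7, 0.3],
               [0.1, 0.9]]

# Form 1: joint sum, H = sum_ij P(x_i)P(y_j|x_i) * log2( 1/P(y_j|x_i) ).
h_joint = sum(P_x[i] * P_y_given_x[i][j]
              * math.log2(1.0 / P_y_given_x[i][j])
              for i in range(2) for j in range(2))

# Form 2: prior-weighted per-input entropies, H = sum_i P(x_i) H(Y|X=x_i).
h_rows = sum(P_x[i] * sum(p * math.log2(1.0 / p) for p in P_y_given_x[i])
             for i in range(2))

print(abs(h_joint - h_rows) < 1e-12)  # True: the two forms agree
```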