
Flores, Angelo M.

Principles of Communication
BSIT 302

07 Task Performance

Assume that there are 4 equiprobable input states, 𝑥1 = 35, 𝑥2 = 65, 𝑥3 = 95, and 𝑥4 = 125, and 3 equiprobable values for the channel noise, 𝜂1 = 5, 𝜂2 = 10, and 𝜂3 = 15. Identify the following:

a. Input Entropy

H(X) = −∑ᵢ₌₁ⁿ pᵢ · log₂(pᵢ)

With n = 4 and pᵢ = 1/4 for every state:

H(X) = −∑ᵢ₌₁⁴ (1/4) · log₂(1/4)
H(X) = −4 · ((1/4) · log₂(1/4))
H(X) = −log₂(1/4)
H(X) = −(−2)
H(X) = 2

The input entropy is 2 bits.
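This 2-bit result can be checked numerically; a minimal Python sketch (the helper name `entropy_bits` is my own):

```python
from math import log2

def entropy_bits(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Four equiprobable input states, each with probability 1/4
print(entropy_bits([1/4] * 4))  # 2.0
```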

b. Noise Entropy
The noise entropy is the entropy of the noise source alone:

H(η) = −∑ⱼ P(ηⱼ) · log₂[P(ηⱼ)]

With 3 equiprobable noise values, P(ηⱼ) = 1/3 for each:

H(η) = −3 · ((1/3) · log₂(1/3))
H(η) = −log₂(1/3)
H(η) = log₂(3) ≈ 1.585

The noise entropy is log₂(3) ≈ 1.585 bits. (Summing over all 12 input-noise pairs instead gives the joint entropy log₂(12), not the noise entropy.)
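The same direct sum verifies the entropy of three equiprobable noise values (a small sketch):

```python
from math import log2

# Three equiprobable noise values, each with probability 1/3
H_noise = -sum((1/3) * log2(1/3) for _ in range(3))
print(round(H_noise, 3))  # 1.585
```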

c. Outputs for each equiprobable input state.

Output = Input + Noise

1. For x1 = 35:

• Output 1: 35 + 5 = 40
• Output 2: 35 + 10 = 45
• Output 3: 35 + 15 = 50

2. For x2 = 65:

• Output 1: 65 + 5 = 70
• Output 2: 65 + 10 = 75
• Output 3: 65 + 15 = 80

3. For x3 = 95:

• Output 1: 95 + 5 = 100
• Output 2: 95 + 10 = 105
• Output 3: 95 + 15 = 110

4. For x4 = 125:

• Output 1: 125 + 5 = 130
• Output 2: 125 + 10 = 135
• Output 3: 125 + 15 = 140
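The twelve sums listed in (1) through (4) can be generated with a nested comprehension (a quick sketch):

```python
inputs = [35, 65, 95, 125]   # x1..x4
noise = [5, 10, 15]          # eta1..eta3

# Output = Input + Noise, for every input/noise combination
outputs = {x: [x + n for n in noise] for x in inputs}
for x, ys in outputs.items():
    print(f"x = {x}: {ys}")
```

Note that all 12 resulting values are distinct, which matters for part (d).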

d. Output Entropy
Since the input and noise are independent and all 12 input-plus-noise sums are distinct, each output occurs with probability (1/4) · (1/3) = 1/12:

H(Y) = −∑ⱼ₌₁¹² (1/12) · log₂(1/12)
H(Y) = −12 · ((1/12) · (−3.585))
H(Y) = log₂(12) ≈ 3.585

The output entropy is approximately 3.585 bits, which equals H(X) + H(η) = 2 + log₂(3), as expected for independent input and noise.
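Putting the pieces together, the output entropy can be verified numerically (a sketch; it assumes, per part (c), that all 12 outputs are distinct and equiprobable):

```python
from math import log2

inputs = [35, 65, 95, 125]
noise = [5, 10, 15]

# All 12 input-plus-noise sums are distinct, so each output has P(y) = 1/12
outputs = [x + n for x in inputs for n in noise]
assert len(set(outputs)) == len(outputs)

H_Y = -sum((1/12) * log2(1/12) for _ in outputs)
print(round(H_Y, 3))  # 3.585
```

Because no two input-noise pairs produce the same output, H(Y) equals the joint entropy H(X) + H(η) = 2 + log₂(3) bits.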
