Information Theory
web: www.en.polyu.edu.hk/~em/mypage.htm
Normal office hours: 8:15 am – 5:30 pm
A.1
References
Textbook:
S. Haykin, Communication Systems, John Wiley, 2001.
References:
J. G. Proakis, Digital Communications, McGraw Hill, 2001.
A.2
Assessment
Assessment:
Examination 60%
Continuous work 40%
– Test 15%
– Practical & Assignments 25%
A.3
Syllabus
Week 1-2 Information Theory
Week 1 Introduction
Information Contents
Entropy
A.4
Syllabus
Week 3-5 Source Coding and Channel Coding
Week 3 Huffman Coding
Lempel-Ziv Coding
A.5
Syllabus
Week 6-10 Baseband Transmission
Week 6 Matched filter
Error rate of PAM systems
Syllabus
Week 10 Eye Patterns
Digital Subscriber Lines
Channel Equalization
A.7
Syllabus
Week 11-14 Passband Transmission
Week 11 Coherent PSK
A.8
Other arrangements
Tutorial
– follows each topic
– weeks 2, 3, 5, 9 and 14
Laboratory
– weeks 9 to 14
Test
– week 8
Assignment
– after each lecture
A.9
[Block diagram of a communication system: the message passes from the information source through the transmitter, the channel (where noise is added), and the receiver to the destination]
A.10
Components
Information source
produces a message (or a sequence of messages) to be
transmitted to the destination
Transmitter
operates on the message to produce a signal suitable for
transmission over the channel
Channel
medium used to transmit the signal from the transmitter to
the receiver. Noise may be added.
A.11
Components
Receiver
performs the reverse function of the transmitter
Destination
the person or device for which the message is intended
A.12
A. INFORMATION THEORY
Introduction
The purpose of a communication system is to carry
information-bearing baseband signals from one place to
another over a communication channel.
A.13
Introduction
Examples:
Baseband signals: Audio signals and video signals
Channel: Optical fiber and free space
Questions:
Information-bearing baseband signals: ?
Signals containing no information: ?
A.14
Information Measurement
The amount of information about an event is closely related
to its probability of occurrence.
P ↑ ⇒ I ↓ ;  P ↓ ⇒ I ↑
A.15
Information Measurement
P(Si) = Pi, with Pi ≥ 0 and ∑_i Pi = 1
A.16
Example 1-2
Question
If a receiver always receives the symbol S1 (i.e., P(S1) = 1), how much
information is received?
A.17
Example 1-2
Conclusion
Information is related to the probability of transmission of a
symbol.
Probability = 1 ⇒ No information
A.18
Example 3
Information from 2 independent sources will be the sum of
the information from each separately.
Example:
Consider a source with at least two symbols. If a receiver receives a
symbol S1 and then a symbol S2, the received information is
I(S1 S2) = I(S1) + I(S2)
A.19
Example 3
Conclusion
Information of a message of M symbols =
sum of information of each of the M symbols.
Question
Which function satisfies the above criteria (A.20 & A.18)?
a. first-order polynomial: ax + b
b. sinusoidal function: cos or sin
c. logarithmic function: log
d. exponential function: e^x
A.20
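Why the logarithm works (anticipating the definition on the next slide): for independent symbols, P(S1 S2) = P1 P2, so
I(S1 S2) = −log2(P1 P2) = −log2 P1 − log2 P2 = I(S1) + I(S2),
and a certain symbol (P = 1) gives −log2 1 = 0, i.e., no information.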
Definition of Information
Consider a set of M symbols (messages) m1, m2, ..., mM with
probabilities of occurrence P1, P2, ..., PM, respectively,
and P1 + P2 + ... + PM = 1.
The amount of information or information content in the k-th
symbol is
I(mk) = −log2 Pk     Unit: bit
A.21
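As a minimal sketch of this definition (the function name and example probabilities are illustrative only, not from the notes):

import math

def information_content(p: float) -> float:
    # Information content I = -log2(p), in bits, of a symbol with probability p.
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    return -math.log2(p)

print(information_content(1.0))   # 0.0 bits: a certain symbol carries no information
print(information_content(0.25))  # 2.0 bits
print(information_content(0.75))  # ~0.415 bits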
Sum of information
Information of a message = information of symbol m1 + information of
symbol m2 + information of symbol m3 + ...
A.22
Sum of Information
Therefore the information in M one-symbol messages is equal
to the information in one M-symbol message.
A.23
Example
A binary source m1 & m2; P(m1) = 1/2; P(m2) = 1/2
I(m1) = −log2(1/2) = 1 bit;  I(m2) = −log2(1/2) = 1 bit
Note: log2 B = log10 B / log10 2
A.24
Example 5-6
A binary source m1 & m2; P(m1) = 1/4; P(m2) = 3/4
I(m1) = −log2(1/4) = 2 bits
I(m2) = −log2(3/4) ≈ 0.415 bits
A.25
Entropy
A.26
Entropy
Consider a source that emits one of M possible symbols
S1, S2, ..., SM in a statistically independent manner with
probabilities P1, P2, ..., PM, respectively (i.e., the occurrence of a
symbol at any given time does not influence the symbol
emitted at any other time).
A.27
Entropy
In a long sequence of N emitted symbols, Si will occur, on average,
Pi N times as N → ∞. The total information in the sequence is
It = ∑_i (Pi N) I(Si), so the average information per symbol (the entropy) is
H = It / N = −∑_i Pi log2(Pi) bits/symbol
A.28
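A minimal sketch of this calculation (the helper name is an assumption, not from the notes):

import math

def entropy(probs):
    # H = -sum(p * log2(p)) in bits/symbol; terms with p == 0 contribute nothing.
    if abs(sum(probs) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.25, 0.75]))  # ~0.811 bits/symbol, as in the examples that follow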
Maximum Entropy
The entropy is maximum when P1 = P2 = ... = PM, i.e., the next
emitted symbol is equally likely to be any one of {Si}.
The ratio H / Hmax is called the source efficiency or relative
entropy.
A.29
Example
Source with 2 symbols, with probabilities P1 = 6/8, P2 = 2/8,
in a message of N = 8 symbols.
Total information: It = ∑_{i=1}^{2} N Pi I(Si)
H = It / N = 0.811 bits/symbol
A.30
Example
Source with symbols '0' and '1' and probabilities P1 = 1/4, P2 = 3/4
H = ∑_{i=1}^{2} Pi I(Si)
  = −(1/4) log2(1/4) − (3/4) log2(3/4)
  = 0.811 bits/symbol
A.31
Example
A source produces symbols from the alphabet {S1, S2, S3, S4}
with the following probabilities:
P1 = 1/8, P2 = 3/8, P3 = 3/8, P4 = 1/8
H = −∑_{i=1}^{4} Pi log2(Pi)
  = 1.8 bits/symbol
Note that if P1 = P2 = P3 = P4 = 1/4,
H = Hmax = ∑_{i=1}^{4} (1/4) log2(4) = 2 bits/symbol
A.32
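Checking both cases numerically with the entropy() helper sketched earlier:

print(entropy([1/8, 3/8, 3/8, 1/8]))  # ~1.81 bits/symbol (1.8 on the slide)
print(entropy([1/4, 1/4, 1/4, 1/4]))  # 2.0 bits/symbol = Hmax for M = 4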
Example
Consider source alphabet {0,1}
Probability of producing symbol ‘0’ = p, 0<p<1
Probability of producing symbol ‘1’ = 1-p
A.33
Example
[Plot: H (binary entropy) against p, rising from 0 at p = 0 to a peak of 1 at p = 1/2 and falling back to 0 at p = 1]
H reaches its maximum when dH/dp = 0,
which gives p = 1/2 and Hmax = 1 bit/symbol.
A.34
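A quick numerical check of this result (a sketch, not part of the notes): sweep p over (0, 1) and find where the binary entropy peaks.

import math

def binary_entropy(p):
    # H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits/symbol, for 0 < p < 1.
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

ps = [i / 1000 for i in range(1, 1000)]
best_p = max(ps, key=binary_entropy)
print(best_p, binary_entropy(best_p))  # ~0.5 and ~1.0 bit/symbol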
Example
In general, when there are M symbols in the source alphabet,
maximum entropy occurs when all symbols are equally
probable, i.e., P1 = P2 = ... = PM.
As P1 + P2 + ... + PM = 1, this gives P1 = P2 = ... = PM = 1/M, and
Hmax = ∑_i Pi log2(1/Pi)
     = ∑_i (1/M) log2 M
     = log2 M bits/symbol
A.35
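The same result can be checked with the entropy() helper sketched earlier (M = 8 is an arbitrary choice):

import math
M = 8
print(entropy([1 / M] * M))  # 3.0 bits/symbol
print(math.log2(M))          # 3.0 bits/symbol = Hmax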
Information Rate
If a source emits rs symbols per second, the information rate is
R = rs H bits/sec.
A.36
Example
A source produces one symbol out of 256 (all equally likely) at a rate
of 8000 symbols/sec. Determine the information rate.
H = ∑_i Pi log2(1/Pi)
  = ∑_i (1/256) log2 256
  = log2 256
  = 8 bits/symbol
R = rs H = 8000 × 8 = 64000 bits/sec
A.37
Example
An analog signal is bandlimited to B Hz, sampled at the
Nyquist rate, quantized into 4 levels. The quantization levels
Q1 , Q2 , Q3 , Q4 are independent and occur with probabilities
P1 = 1/8, P2 = 3/8, P3 = 3/8, P4 = 1/8
H = ∑_i Pi log2(1/Pi) = 1.8 bits/symbol
rs = 2B symbols/sec (Nyquist rate)
R = rs H = 3.6B bits/sec
If all quantized samples are equally probable,
H = 2 bits/symbol and R = rs H = 4B bits/sec
A.38
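A numerical check of the two rate examples above, reusing the entropy() helper sketched earlier (in the second case B is symbolic, so the rate is printed per hertz of bandwidth):

# Slide A.37: 256 equally likely symbols at 8000 symbols/sec.
H_256 = entropy([1 / 256] * 256)      # 8.0 bits/symbol
print(8000 * H_256)                   # R = 64000 bits/sec

# Slide A.38: 4 quantization levels, rs = 2B symbols/sec.
H_4 = entropy([1/8, 3/8, 3/8, 1/8])   # ~1.81 bits/symbol
print(2 * H_4)                        # R ≈ 3.6 (× B) bits/sec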
Summary
Entropy of a source:
H = −∑_{i=1}^{M} Pi log2(Pi) bits/symbol
A.39
Application