Information Theory
Telecommunication Engineering
Review
What we discussed in the last lecture:
Communication systems as a whole
The components of communication systems
We have seen that communication systems deliver
information, so how do we measure information?
We have also seen that a signal exists only when it carries
power, and power is usually expressed in units of dBm or
dBW. So we will also discuss dB math in this lecture.
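As a quick preview of the dB math to come, here is a minimal sketch (not from the slides) of the two standard power conversions; the helper names watts_to_dbm and watts_to_dbw are hypothetical:

```python
import math

def watts_to_dbm(p_watts: float) -> float:
    """Power in dBm: decibels relative to 1 milliwatt."""
    return 10 * math.log10(p_watts / 1e-3)

def watts_to_dbw(p_watts: float) -> float:
    """Power in dBW: decibels relative to 1 watt."""
    return 10 * math.log10(p_watts)

print(watts_to_dbm(1.0))  # 30.0  (1 W = 30 dBm)
print(watts_to_dbw(0.5))  # about -3.01 dBW
```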
Information Measure
The purpose of a communication system is to
transmit information from a source to a sink
How do we measure information?
Compare:
“SBY is Indonesia’s President for 2009-2014”
“Gayus will be elected President in 2014”
Which statement has the higher probability? Intuitively,
the less probable a message is, the more information it
carries, and this is exactly what the information measure
formalizes.
Information Measure
The information measure of a digital source with n symbols is

H = \sum_{j=1}^{n} P_j I_j = -\sum_{j=1}^{n} P_j \log_2 P_j \quad [\text{bits/symbol}]

where I_j = \log_2(1/P_j) is the information carried by the
j-th symbol. This quantity H is called the entropy of the source.
The maximum value of entropy is achieved when the
M messages are equiprobable, i.e. P_j = 1/M for all j
The maximum entropy is then

H_{max} = \log_2 M
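To make these definitions concrete, here is a minimal Python sketch (not part of the lecture) that computes H and checks that the equiprobable case attains \log_2 M:

```python
import math

def entropy(probs):
    """Entropy H = -sum(P_j * log2 P_j) in bits/symbol.

    Symbols with probability 0 contribute nothing, since
    p * log2(p) tends to 0 as p -> 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# An equiprobable M-symbol source attains H_max = log2(M)
M = 4
print(entropy([1 / M] * M))  # 2.0
print(math.log2(M))          # 2.0
```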
If the symbols are emitted at a fixed rate of r_s symbols
per second, then the source information rate (or simply
source rate) can be defined as (in bits/sec)

R = r_s H = \frac{H}{T}

where T is the time for emitting one symbol, or

T = \frac{1}{r_s}
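Under the same notation, the source rate is a one-liner; this sketch assumes the entropy function from the previous snippet:

```python
def source_rate(probs, r_s):
    """Source information rate R = r_s * H in bits/sec,
    given the symbol rate r_s in symbols/sec."""
    return r_s * entropy(probs)
```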
Redundancy
The difference between the actual entropy of a source
and the maximum entropy the source could have if its
symbols were independent and equiprobable is called
redundancy
For an M-symbol alphabet, the redundancy is

Red = H_{max} - H = \log_2 M - H
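And the redundancy, again as a sketch building on the entropy function above:

```python
def redundancy(probs):
    """Redundancy Red = log2(M) - H for an M-symbol alphabet."""
    return math.log2(len(probs)) - entropy(probs)
```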
Example
Find the entropy of a source that emits one of three
symbols A, B, and C in a statistically independent
sequence with probabilities 0.5, 0.25, 0.25, respectively
If the source emits one of those symbols once every
millisecond, find the source rate
Determine the redundancy of the source
Solution
The entropy can be calculated as

H = -(0.5 \log_2 0.5 + 0.25 \log_2 0.25 + 0.25 \log_2 0.25)
  = 0.5 + 0.5 + 0.5 = 1.5 \text{ bits/symbol}

Since one symbol is emitted every T = 1 ms, the symbol rate
is r_s = 1/T = 1000 symbols/sec, so

R = r_s H = 1000 \times 1.5 = 1.5 \times 10^3 = 1500 \text{ bits/sec}
The redundancy can be calculated as
Red = \log_2 3 - H = 1.5850 - 1.5 = 0.0850 \text{ bits/symbol}
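The example's numbers can be reproduced with the sketches above (the helper names are the hypothetical ones introduced earlier):

```python
probs = [0.5, 0.25, 0.25]            # symbols A, B, C

print(entropy(probs))                # 1.5 bits/symbol
print(source_rate(probs, r_s=1000))  # 1500.0 bits/sec (one symbol per ms)
print(redundancy(probs))             # ~0.0850 bits/symbol
```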