
www.ee.ui.ac.id/wasp

Information Theory

Telecommunication Engineering
Review
• What we discussed in the last lecture:
  • Communication systems as a whole
  • The components of communication systems
• We know that communication systems deliver information, so how do we measure information?
• We also know that a signal is present only when it carries power, and power is usually expressed in dBm or dBW, so we will also cover dB math in this lecture
Information Measure
• The purpose of a communication system is to transmit information from a source to a sink
• How do we measure information?
• Compare:
  • "SBY is Indonesia's President for 2009-2014"
  • "Gayus will be elected President in 2014"
• Which statement has the higher probability, and which one carries more information?
Information Measure
• Information measure for a digital source: a message with probability P_j carries
  I_j = \log_2(1/P_j) = -\log_2 P_j   [bits]
• Messages with small probability carry more information
• Messages with large probability carry less information
• A message with P = 1 carries zero information
• A message with P = 0 would carry infinite information
Information Measure
• The base of the logarithm determines the unit of the information measure:
  • base 2: bits
  • base e (natural logarithm, ln): nats
  • base 10: hartleys
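As an aside (not part of the original slides), here is a minimal Python sketch of this self-information measure with a selectable logarithm base; the function name self_information and its input checking are my own choices:

```python
import math

def self_information(p, base=2):
    """Information carried by a message of probability p.

    base=2 gives bits, base=math.e gives nats, base=10 gives hartleys.
    """
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    return -math.log(p, base)

print(self_information(1.0))      # 0.0   -> a certain message carries no information
print(self_information(0.5))      # 1.0   bit
print(self_information(0.5, 10))  # ~0.301 hartleys
```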
Example
• A source puts out one of five possible messages during each message interval. The probabilities of these messages are:
  p1 = 0.5
  p2 = 0.25
  p3 = 0.125
  p4 = 0.0625
  p5 = 0.0625
• Find the information content of each message.
Solution
• The information content of each message is
  I(m_1) = -\log_2 0.5 = 1 bit
  I(m_2) = -\log_2 0.25 = 2 bits
  I(m_3) = -\log_2 0.125 = 3 bits
  I(m_4) = I(m_5) = -\log_2 0.0625 = 4 bits
Terminology
• Symbol: a member of a source alphabet
• Baud: rate of symbol transmission, e.g. 100 baud = 100 symbols/s
• Bit: quantity of information carried by a symbol with selection probability P = 0.5
• Bit rate: rate of information transmission (bits/s)
• In the special case where 1 symbol carries 1 bit, the baud rate equals the bit rate
Entropy
• The information content varies from message to message because the probabilities P_j are not equal
• We need an average information measure for the source
• The average information measure (entropy) of a digital source is
  H = \sum_{j=1}^{n} P_j I_j = -\sum_{j=1}^{n} P_j \log_2 P_j   [bits/symbol]
Entropy
• The maximum value of entropy is achieved when the messages are equiprobable
• For an alphabet of M symbols, the maximum entropy is
  H_max = \log_2 M
• If the symbols are emitted at a fixed rate r_s (symbols/s), the source information rate (or simply source rate), in bits/s, can be defined as
  R = r_s H = H / T
• where T = 1 / r_s is the time taken to emit one symbol
Redundancy
• The difference between the actual entropy of a source and the maximum entropy the source could have if its symbols were independent and equiprobable is called redundancy
• For an M-symbol alphabet the redundancy is
  Red = H_max - H = \log_2 M - H
Example
• Find the entropy of a source that emits one of three symbols A, B, and C in a statistically independent sequence with probabilities 0.5, 0.25, and 0.25, respectively
• If the source emits one of these symbols once every millisecond, find the source rate
• Determine the redundancy of the source
Solution
• The entropy can be calculated as
  H = \frac{1}{2}\log_2 2 + \frac{1}{4}\log_2 4 + \frac{1}{4}\log_2 4
    = 0.5 + 0.5 + 0.5
    = 1.5 bits/symbol
• Since one symbol is emitted every millisecond, r_s = 1000 symbols/s, so the source rate is
  R = r_s H = 1000 \times 1.5 = 1500 bits/s
• The redundancy can be calculated as
  Red = \log_2 3 - H = 1.5850 - 1.5 = 0.0850 bits/symbol
