Digital Communication (ICE M3011)
Lecture on Chapter 1: Introduction
By Dr. M. Golam Rashed (golamrashed@ru.ac.bd)
Analog or Digital?
Analog Communication System
• Information is carried in the amplitude, phase, or frequency of the signal.
• Example of an analog signal: a speech signal
Digital Communication System
• Examples of digital signals: computer data, telegraph signals
Analog-to-Digital Conversion
Sources of information are either digital in nature or converted into digital form by design.
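As a sketch of "converted into digital form by design", the short Python snippet below (not from the lecture; the sampling rate, bit depth, and helper name are illustrative) samples a sine wave and uniformly quantizes it:

```python
import math

def quantize(samples, n_bits, v_max=1.0):
    """Uniformly quantize samples in [-v_max, v_max] to 2**n_bits levels."""
    levels = 2 ** n_bits
    step = 2 * v_max / levels
    out = []
    for x in samples:
        # Clip to the full-scale range, then map to a level index.
        x = max(-v_max, min(v_max, x))
        idx = min(levels - 1, int((x + v_max) / step))
        out.append(idx)
    return out

# Sample a 1 kHz sine at 8 kHz and quantize each sample to 3 bits.
fs, f = 8000, 1000
analog = [math.sin(2 * math.pi * f * n / fs) for n in range(8)]
digital = quantize(analog, n_bits=3)
```

Each analog sample is thus replaced by one of a finite set of level indices, which is exactly what makes the representation digital.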
Satellite Communication System
• It consists of ……
• a satellite in geostationary orbit,
• an uplink from a ground station, and
• a downlink to another ground station.
• The uplink and downlink operate at microwave frequencies (uplink frequency > downlink frequency).
• Viewed as a repeater in the sky, the satellite permits communication (one ground station to another) over long distances at high bandwidth and relatively low cost.
Discrete Random Variable
• A variable is a quantity whose value changes.
• A discrete variable is a variable whose value is obtained by
counting.
• Examples: number of students present
number of red marbles in a jar
number of heads when flipping three coins
students’ grade level
• A continuous variable is a variable whose value is obtained
by measuring.
• Examples: height of students in class
weight of students in class
time it takes to get to school
distance traveled between classes
• A discrete random variable S has a countable number of
possible values.
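The "number of heads when flipping three coins" example can be simulated to see that a discrete random variable takes only countably many values. This Python sketch is illustrative and not part of the lecture:

```python
import random

random.seed(0)  # fixed seed so the simulation is repeatable

def heads_in_three_flips():
    """One realization of the discrete random variable
    'number of heads when flipping three coins'."""
    return sum(random.randint(0, 1) for _ in range(3))

# However many times we repeat the experiment, the observed
# values stay inside the countable set {0, 1, 2, 3}.
samples = [heads_in_three_flips() for _ in range(10000)]
```

A continuous variable such as a student's height, by contrast, could take any value in an interval and would not be confined to such a finite set.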
Discrete Sources
A discrete source is a DATA SOURCE whose output is a sequence of symbols from a known discrete alphabet.

Fixed-Length Codes for Discrete Sources
The source output is encoded in blocks, and each such block is restricted to have the same block length L, which is why the code is called a fixed-length code.
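As a minimal sketch (not from the lecture): for an alphabet of K symbols, a fixed-length binary code needs codewords of length L = ceil(log2 K). The function name and example alphabet below are illustrative:

```python
import math

def fixed_length_code(alphabet):
    """Assign every symbol a binary codeword of the same
    length L = ceil(log2(K)), where K is the alphabet size."""
    K = len(alphabet)
    L = math.ceil(math.log2(K))
    # Enumerate the symbols and write each index as an L-bit string.
    return {sym: format(i, f'0{L}b') for i, sym in enumerate(alphabet)}

# K = 5 symbols -> every codeword is L = 3 bits long.
code = fixed_length_code(['a', 'b', 'c', 'd', 'e'])
```

All codewords share the same length, so the decoder can split the received bit stream into codewords by simple counting.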
Entropy of a Discrete Memoryless Source
• Let I(sk) be the discrete random variable that takes on the values
I(s0), I(s1), I(s2), …, I(sK-1) with probabilities p0, p1, p2, …, pK-1,
respectively.
• The mean value of I(sk) over the source alphabet S is given by
  H(S) = E[I(sk)]
       = Σ (k = 0 to K-1) pk I(sk)
       = Σ (k = 0 to K-1) pk log2(1/pk)
• This important quantity, H(S), is called the entropy of a discrete
memoryless source with source alphabet S.
• It is the measure of the average information content per symbol.
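The entropy sum above translates directly into code. This Python helper is an illustrative sketch, not part of the lecture:

```python
import math

def entropy(probs):
    """H = sum_k pk * log2(1/pk), the average information
    content per symbol, in bits."""
    # Terms with pk = 0 contribute nothing, so they are skipped.
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# A fair binary source attains the maximum entropy of 1 bit/symbol,
# while a certain (p = 1) source carries no information at all.
H_fair = entropy([0.5, 0.5])
H_sure = entropy([1.0])
```

Skewing the probabilities away from uniform always lowers H, matching the intuition that predictable symbols carry less information.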
Shannon’s Theorem
C = B * log2(1 + S/N)
where C is the achievable channel capacity (in bits per second),
B is the bandwidth of the line (in Hz),
S is the average signal power, and
N is the average noise power.
The signal-to-noise ratio (S/N) is usually expressed in decibels (dB),
given by the formula:
10 * log10(S/N)
Shannon’s Theorem (Cont…)
For a telephone line with B = 3000 Hz and S/N = 1000 (30 dB):
C = 3000 * log2(1001)
which is a little less than 30 kbps.
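The capacity formula can be checked numerically. The helper names below are illustrative; the figures B = 3000 Hz and S/N = 1000 (30 dB) are those implied by the C = 3000 * log2(1001) calculation above:

```python
import math

def channel_capacity(bandwidth_hz, snr_linear):
    """Shannon capacity C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

def snr_db(snr_linear):
    """Express a linear S/N ratio in decibels: 10 * log10(S/N)."""
    return 10.0 * math.log10(snr_linear)

# Telephone line: B = 3000 Hz, S/N = 1000 -> just under 30 kbps.
C = channel_capacity(3000, 1000)
snr = snr_db(1000)  # 30 dB
```

Doubling the bandwidth doubles C, whereas doubling S/N only adds about B bits per second, since S/N sits inside the logarithm.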
Instant Test-2….
For B = 10 MHz and S/N = 100 (20 dB):
C = 10000000 * log2(101)
which is about 66.6 Mbps.
Source Encoder
An important problem in communication is the efficient
representation of data generated by a discrete source. The
process by which this representation is accomplished is called
source encoding.
Let Lmin denote the minimum possible value of the average codeword length L.
We then define the code efficiency of the source encoder as
  η = Lmin / L
With L >= Lmin, we clearly have η <= 1. The encoder is said to be efficient
when η approaches unity.
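As a sketch of the efficiency definition (not from the lecture): by the source-coding theorem, Lmin equals the source entropy H, an assumption this illustrative helper builds in:

```python
import math

def code_efficiency(probs, avg_code_length):
    """eta = Lmin / Lbar, taking Lmin = H (the source entropy, in
    bits/symbol) per the source-coding theorem; eta <= 1, and the
    code is efficient when eta approaches 1."""
    H = sum(p * math.log2(1.0 / p) for p in probs if p > 0)
    return H / avg_code_length

# Four equiprobable symbols coded with 2 bits each: H = 2, so eta = 1.
eta_perfect = code_efficiency([0.25] * 4, 2.0)

# A skewed source under the same 2-bit code wastes capacity: eta < 1.
eta_skewed = code_efficiency([0.5, 0.25, 0.25], 2.0)
```

The skewed case shows why variable-length codes exist: shortening the codewords of likely symbols pushes the average length down toward H.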
Discrete Memoryless Sources
Suppose that a probabilistic experiment involves the observation of the
output emitted by a discrete source every unit of time. The source output
is modelled as a discrete random variable, S, which takes on symbols from
a fixed finite alphabet
  S = {s0, s1, …, sK-1}
with probabilities
  P(S = sk) = pk,  k = 0, 1, …, K-1
Discrete Memoryless Sources
Of course, this set of probabilities must satisfy the condition
  p0 + p1 + … + pK-1 = 1, with pk >= 0 for every k.
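The normalization condition is easy to check in code. This small validator is an illustrative sketch (the function name and tolerance are not from the lecture):

```python
def is_valid_pmf(probs, tol=1e-9):
    """Check the DMS condition: every pk >= 0 and sum_k pk = 1
    (to within a small floating-point tolerance)."""
    return all(p >= 0 for p in probs) and abs(sum(probs) - 1.0) <= tol

# A proper probability assignment over a 3-symbol alphabet...
ok = is_valid_pmf([0.5, 0.25, 0.25])
# ...versus one whose probabilities do not sum to 1.
bad = is_valid_pmf([0.5, 0.25])
```

Any list of symbol probabilities failing this check cannot describe a discrete memoryless source.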