
ICE-3221

(Digital Communication)

Lecture on
Chapter-1: Introduction
By
Dr. M. Golam Rashed
(golamrashed@ru.ac.bd)

Department of Information and Communication Engineering (ICE)


University of Rajshahi, Rajshahi-6205, Bangladesh
ICE3221: Digital Communication
100 Marks [70% Exam, 20% Quizzes/Class Tests, 10% Attendance]

4 Credits, 4 Periods/week, Lectures: 44, Exam time: 3 hours

(Students should answer Six questions out of Eight, taking not more than Three from each)
Section-A
Introduction:
Sources and signals, Basic signal processing operations in digital communication, Channels
for digital communication, Channel capacity theorem, Channel coding theorem.
 
Detection and Estimation:
Model of digital communication system, Gram-Schmidt orthogonalization procedure,
Geometric interpretation of signal, Detection of signals in noise, Probability of error,
Correlation receiver, Matched filter receiver, Estimation: Concept and criteria, Maximum
Likelihood Estimation, Wiener filters, Adaptive filters, Linear prediction.
 
Error Control Coding:
Rationale for coding and types of codes, Discrete memoryless channels, Linear block
codes, Cyclic codes, Maximum likelihood decoding of convolutional codes, Distance
properties of convolutional codes, Trellis codes.
Section-B
Digital Modulation Techniques:
Digital modulation, Factors that influence the choice of digital modulation, Phase modulation,
Pulse modulation: Types, PWM, PPM, Linear modulation: BPSK, DPSK, QPSK, π/4 QPSK,
Offset QPSK, QAM; M-ary modulation techniques, Power spectrum, Bandwidth efficiency,
Spread spectrum modulation technique: FHSS and DSSS.
 
Waveform Coding Techniques:
Sampling theorem, Reconstruction of a message process from its samples, Signal distortion in
sampling, PAM, PCM, Channel noise, Quantization noise, SNR, Robust quantization.
 
Baseband Shaping for Data Transmission:
Power spectra of discrete PAM signals, Inter-symbol interference, Nyquist criterion, Correlation
coding, Eye pattern, Baseband M-ary PAM systems, Adaptive equalization for data transmission.
Recommended Books:
1. S Haykin: Digital Communication Systems
2. Kennedy & Davis: Electronic Communication Systems
3. Theodore S. Rappaport : Wireless Communications: Principles
& Practice.
 
Purpose of a Communication System ICE M3011

… is to transport an information-bearing signal from a source to a user destination via a communication channel.

A communication system can be of which type?

Analog or digital
Analog Communication System ICE M3011

In an analog communication system….


 the information-bearing signal varies continuously in both amplitude and time, and
 it is used directly to modify some characteristic of a sinusoidal carrier wave:

Amplitude, Frequency, or Phase
Digital Communication System ICE M3011

In a digital communication system…………….


 The information-bearing signal is processed so that it can be represented by a sequence of discrete messages.
Factors of Choosing Digital Communications
ICE M3011

 The impact of the computer, not only as a source of data but also as a tool for communication, and the demands of other digital services such as telex and facsimile.
 The use of digital communication offers flexibility and compatibility, in that the adoption of a common digital format makes it possible for a transmission system to sustain many different sources of information in a flexible manner.
Factors of Choosing Digital Communications
ICE M3011

 The improved reliability made possible by the use of digital communications.
 The availability of wide-band channels provided by geostationary satellites, optical fiber, and coaxial cables.
 The ever-increasing availability of integrated solid-state electronic technology.
Sources of Information ICE M3011
A source of information generates a message, examples of which include:
 Human Voice,
 Television picture,
 Teletype data,
 Atmospheric temperature, and pressure.

The messages mentioned above are not electrical in nature, so a TRANSDUCER is used to convert them into an electrical waveform called the message signal.
The waveform is also referred to as a baseband signal.
The term “baseband” is used to designate the band of
frequencies representing the message signal generated at the
source
Message Signals ICE M3011

Are message signals analog or digital?

Both.

An Analog Signal is one in which both amplitude and time vary continuously over their respective intervals.

Examples of Analog signals:
 A speech signal,
 A television signal, and
 A signal representing atmospheric temperature or pressure at some location.
Digital Message Signals ICE M3011

In a Digital Signal, both amplitude and time take on discrete


values.
Examples of Digital signals:

Computer data
Telegraph signals
Analog-to-Digital Conversion ICE M3011

An analog signal can always be converted into digital form by combining three basic operations in sequence.

In the sampling operation

Only sample values of the


analog signal at uniformly
spaced discrete instants of
time are retained
Analog-to-Digital Conversion (Con’t) ICE M3011

In the quantizing operation

Each sample value is


approximated by the nearest
level in a finite set of discrete
levels
Analog-to-Digital Conversion (Con’t) ICE M3011

In the Encoding operation


The selected level is
represented by a code word
that consists of a prescribed
number of code elements.

Here the code word consists of four binary digits (bits), with the last bit assigned the role of a sign bit that signifies whether the sample value in question is positive or negative.
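
To make the three operations concrete, here is a minimal Python sketch of the sampling, quantizing, and encoding steps. The sampling rate, quantizer step size, and test signal are illustrative assumptions; only the 4-bit code word with a trailing sign bit follows the example above.

import numpy as np

# Illustrative parameters (not from the slides): 1 kHz sampling of a 50 Hz tone,
# 3 magnitude bits plus 1 sign bit = the 4-bit code word described above.
fs = 1000              # sampling rate in Hz
step = 0.25            # quantizer step size
n_mag_bits = 3         # bits used for the magnitude

# Sampling: retain only values of the analog signal at uniformly spaced instants.
t = np.arange(0, 0.02, 1 / fs)
samples = np.sin(2 * np.pi * 50 * t)

def quantize(x):
    # Quantizing: approximate each sample by the nearest of a finite set of levels.
    level = int(round(abs(x) / step))
    return min(level, 2 ** n_mag_bits - 1)   # clip to the largest representable level

def encode(x):
    # Encoding: represent the selected level by a code word; the last bit is the sign bit.
    sign_bit = '0' if x >= 0 else '1'
    return format(quantize(x), f'0{n_mag_bits}b') + sign_bit

code_words = [encode(x) for x in samples]
print(code_words[:5])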
Basic Signal Processing Operations in
Digital Communication ICE M3011

Sources of information are either digital in nature or converted into digital form by design.

THREE basic signal-processing operations are identified:


 Source Coding
 Channel Coding, and
 Modulation.
Basic Signal Processing Operations in
Digital Communication (Con’t) ICE M3011

 The encoder maps the digital signal generated at the source


output into another signal in digital form.
 The mapping is one-to-one, and the objective is to eliminate or
reduce redundancy so as to provide an efficient representation of
the source output.
Basic Signal Processing Operations in
Digital Communication (Con’t) ICE M3011

 The source decoder simply performs the inverse mapping, thereby delivering to the user destination a reproduction of the original digital source output.
 The primary benefit thus gained from the application of source
coding is a reduced bandwidth requirement.
Basic Signal Processing Operations in
Digital Communication (Con’t) ICE M3011

 The objective of the Channel Encoder (CE) is to map the incoming digital signal into a channel input, and of the Channel Decoder (CD) to map the channel output into an output digital signal.
 The CE and CD are used to minimize the effect of channel noise by introducing controlled redundancy, thereby providing reliable communication over the noisy channel.
Channels for Digital Communication
ICE M3011
There are five specific channels used in digital communications:
• Telephone Channel
• Coaxial Cables Channel
• Optical Fibers Channel
• Microwave Radio Channel, and
• Satellite Channel
Channels for Digital Communication
ICE M3011
Telephone Channel
• It is designed to provide voice-grade communication,
• This channel has a band-pass characteristic occupying the frequency range 300 Hz to 3400 Hz.
Channels for Digital Communication
Coaxial Cable Channel ICE M3011

• Consists of a single-wire conductor centered inside an outer conductor, the two being insulated from each other by means of a dielectric material.
• It has a relatively wide bandwidth, and
• freedom from external interference.
• Its operational characteristics require the use of closely spaced repeaters.

 Efficient digital transmission systems using coaxial cables have been built to operate at a data rate of 274 Mb/s, with repeaters spaced at 1 km intervals.
Channels for Digital Communication
Optical Fiber Channel ICE M3011

• It consists of a very fine inner core made of silica glass,


surrounded by a concentric layer called cladding that is also
made of glass.
• Refractive index of the core > refractive index of the cladding.
• Light propagates through the core by the principle of total internal reflection.

Compared to coaxial cables, optical fibers are smaller, and they offer higher transmission bandwidths and longer repeater separations.
Channels for Digital Communication
Microwave Radio Channel ICE M3011

• Operates over a line-of-sight link.


• Consists basically of a transmitter and a receiver
that are equipped with antennas of their own.
• The antennas are placed on towers at sufficient height
• The operating frequencies range from about 1 to 30 GHz.
Channels for Digital Communication
Satellite Channel ICE M3011

• It consists of ……
• a satellite in geostationary orbit
• An uplink from the ground station, and
• A downlink to another ground station
• The uplink and downlink operate at microwave frequencies (uplink frequency > downlink frequency).
• Viewed as a repeater in the sky, permitting communication
(one ground station to another) over long distances at high
bandwidth and relatively low cost
Discrete Random Variable ICE M3011
• A variable is a quantity whose value changes.
• A discrete variable is a variable whose value is obtained by
counting. 
• Examples:     number of students present
                          number of red marbles in a jar
                          number of heads when flipping three coins
                          students’ grade level
• A continuous variable is a variable whose value is obtained
by measuring.
 
• Examples:   height of students in class
                        weight of students in class
                        time it takes to get to school
                        distance traveled between classes
• A discrete random variable S has a countable number of
possible values.
Discrete Sources
ICE M3011
It is a DATA SOURCE whose output is a sequence of symbols from a
known discrete alphabet,

This alphabet could be the alphanumeric characters, the characters on a computer keyboard, or the English letters.

Fixed-length codes for discrete sources:


The simplest approach to encoding a discrete source into binary digits
is to create a code C that maps each symbol x of the alphabet into a
distinct codeword C(x), where C(x) is a block of binary digits.

Each such block is restricted to have the same block length L, which is why the code is called a fixed-length code.
Fixed-length codes for discrete sources: Example
ICE M3011

If the alphabet consists of the 8 symbols (a, b, c, d, e, f, g, h), then the following fixed-length code with block length L = 3 could be used.
C(a)=000
C(b)=001
C(c)=010
C(d)=011
C(e)=100
C(f)=101
C(g)=110
C(h)=111
The source output x1, x2, x3, … would be encoded into the output C(x1)C(x2)…, and thus the encoded output contains L bits per symbol.
For the above example, the source sequence bad… would be encoded into 001000011…
Note that the output bits are simply run together.
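
As a small illustration, the fixed-length code of the example above can be written directly as a lookup table in Python; the encode/decode helpers below are a sketch of the "run the blocks together" idea, not part of the original slides.

# Fixed-length code from the example above: 8 symbols, block length L = 3 bits.
code = {'a': '000', 'b': '001', 'c': '010', 'd': '011',
        'e': '100', 'f': '101', 'g': '110', 'h': '111'}

def encode(symbols):
    # Map each source symbol to its codeword and simply run the blocks together.
    return ''.join(code[s] for s in symbols)

def decode(bits, L=3):
    # Split the bit stream into L-bit blocks and invert the one-to-one mapping.
    inverse = {cw: s for s, cw in code.items()}
    return ''.join(inverse[bits[i:i + L]] for i in range(0, len(bits), L))

print(encode('bad'))                 # -> 001000011, as in the example
assert decode(encode('bad')) == 'bad'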
Discrete Memoryless Source ICE M3011
• Let the source output be modeled as a discrete random variable, S, which takes on symbols from a fixed alphabet
𝒮 = {s0, s1, …, sK-1}
with probabilities P(S = sk) = pk, k = 0, 1, …, K-1.
• Of course, this set of probabilities must satisfy the condition
p0 + p1 + … + pK-1 = 1
• We assume that the symbols emitted by the source during


successive signaling intervals are statistically independent.
• A source having these properties is called a discrete memoryless source.
• Memoryless in the sense that the symbol emitted at any time is independent of all previous choices.
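
The memoryless property can be illustrated with a short simulation: the sketch below (using an assumed four-symbol alphabet and probabilities, not values from the slides) draws statistically independent symbols and checks that their relative frequencies approach the assigned probabilities.

import numpy as np

# Illustrative alphabet and probabilities; they must sum to 1.
alphabet = ['s0', 's1', 's2', 's3']
p = [0.5, 0.25, 0.125, 0.125]

rng = np.random.default_rng(0)
# Memoryless: each emitted symbol is drawn independently of all previous choices.
emitted = rng.choice(alphabet, size=100_000, p=p)

for sym, prob in zip(alphabet, p):
    freq = np.mean(emitted == sym)
    print(f"{sym}: assigned p = {prob:.3f}, observed frequency = {freq:.3f}")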
Surprise, Uncertainty, Information ICE M3011

• Consider the event S=sk , describing the emission of symbol

sk by the source with probability pk .

• Clearly, if the probability pk = 1 and pi = 0 for all i ≠ k, then there is no “surprise”, and therefore no information, when symbol sk is emitted.
• If the source symbols occur with different probabilities, and the probability pk is low, there is more “surprise”, and therefore more “information”, when symbol sk is emitted by the source.
Surprise, Uncertainty, Information ICE M3011

• The words “uncertainty”, “Surprise”, and “Information” are


all related.
• Before the event S = sk occurs, there is an amount of uncertainty.
• When the event S = sk occurs, there is an amount of surprise.
• After the occurrence of the event S = sk, there is a gain in the amount of information.
Entropy ICE M3011

• Let I(sk) = log2(1/pk) denote the amount of information gained after observing symbol sk. I(sk) is a discrete random variable that takes on the values I(s0), I(s1), …, I(sK-1) with probabilities p0, p1, …, pK-1, respectively.
• The mean value of I(sk) over the source alphabet 𝒮 is given by

H(𝒮) = E[I(sk)] = ∑k pk I(sk) = ∑k pk log2(1/pk),  where the sum runs over k = 0, 1, …, K-1.

• This important quantity, H(𝒮), is called the entropy of a discrete memoryless source with source alphabet 𝒮.
• It is a measure of the average information content per symbol.
Some Properties of Entropy ICE M3011

• H(𝒮) = 0 if and only if the probability pk = 1 for some k and the remaining probabilities in the set are all zero. This lower bound on entropy corresponds to no uncertainty.
• H(𝒮) = log2 K if and only if pk = 1/K for all k. This upper bound on entropy corresponds to maximum uncertainty.
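
A minimal Python sketch of the entropy formula and of the two bounds above; the probability sets used are illustrative assumptions.

import math

def entropy(probs):
    # H = sum over k of pk * log2(1/pk), in bits per symbol; terms with pk = 0 contribute 0.
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

print(entropy([0.5, 0.25, 0.125, 0.125]))   # 1.75 bits/symbol for this illustrative source
print(entropy([1.0, 0.0, 0.0, 0.0]))        # lower bound: 0 (no uncertainty)
print(entropy([0.25, 0.25, 0.25, 0.25]))    # upper bound: log2(4) = 2 (maximum uncertainty)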
How FAST can we transmit information over a
communication channel…..? ICE M3011

Suppose a source sends r messages per second, and the


entropy of a message is H bits per message.
The information rate is R = r H bits/second.
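
As a quick worked example of R = rH, with assumed figures rather than values from the slides:

import math

r = 2000                                        # messages per second (assumed)
p = [0.5, 0.25, 0.125, 0.125]                   # assumed message probabilities
H = sum(pk * math.log2(1 / pk) for pk in p)     # entropy: 1.75 bits per message
R = r * H                                       # information rate: 3500 bits/second
print(f"H = {H} bits/message, R = {R} bits/second")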
# Channel Capacity Theorem
A given communication system has a maximum rate of information C
known as the channel capacity.
• If the information rate R is less than C, then one can approach
arbitrarily small error probabilities by using INTELLIGENT CODING
TECHNIQUES.
• To get lower error probabilities, the encoder has to work on
longer blocks of signal data. This entails longer delays and higher
computational requirements.
Thus, if R ≤ C, then transmission may be accomplished without error even in the presence of noise.
Shannon’s Theorem
ICE M3011
Shannon's Theorem gives an upper bound to the capacity of a link, in
bits per second (bps), as a function of the available bandwidth and the
signal-to-noise ratio of the link.

The Theorem can be stated as:

C = B * log2(1+ S/N)
where C is the achievable channel capacity,
B is the bandwidth of the line,
S is the average signal power and
N is the average noise power.
The signal-to-noise ratio (S/N) is usually expressed in decibels (dB)
given by the formula:
10 * log10(S/N)
Shannon’s Theorem (Cont…)
ICE M3011

Here is a graph showing the relationship between C/B and S/N


(in dB):
Instant Test-1…. ICE M3011

 Use of Shannon's Theorem.


Modem
Consider a typical telephone line with a signal-to-noise ratio of 30 dB and an audio bandwidth of 3 kHz.

What will be the maximum data rate of the telephone line?

The Shannon Theorem stated as:


C = B * log2(1 + S/N)

A signal-to-noise ratio of 30 dB corresponds to S/N = 10^(30/10) = 1000, so
C = 3000 * log2(1001) ≈ 29,902 bps,
which is a little less than 30 kbps.
Instant Test-2…. ICE M3011

 Use of Shannon's Theorem.


Satellite TV Channel

Consider a satellite TV channel with a signal-to-noise ratio of 20 dB and a video bandwidth of 10 MHz.
What will be the maximum data rate of the satellite TV channel?
The Shannon Theorem stated as:
C = B * log2(1 + S/N)

A signal-to-noise ratio of 20 dB corresponds to S/N = 10^(20/10) = 100, so
C = 10,000,000 * log2(101) ≈ 66.6 Mbps.
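
The two instant tests can be checked with a short sketch of Shannon's formula; the helper below first converts the S/N from dB to a linear ratio, then applies C = B * log2(1 + S/N).

import math

def channel_capacity(bandwidth_hz, snr_db):
    # Shannon capacity C = B * log2(1 + S/N), with the S/N supplied in dB.
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# Instant Test 1: telephone line, 3 kHz bandwidth, 30 dB S/N.
print(channel_capacity(3_000, 30))        # ~29,902 bps, a little less than 30 kbps

# Instant Test 2: satellite TV channel, 10 MHz bandwidth, 20 dB S/N.
print(channel_capacity(10_000_000, 20))   # ~66.6 Mbps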
Source Encoder
ICE M3011
An important problem in communication is the efficient
representation of data generated by a discrete source. The
process by which this representation is accomplished is called
source encoding.

The device that performs the representation is called a source


encoder.

Functional Requirements of Source Encoder:


1. The code-words produced by the encoder are in binary
form.
2. The source code is uniquely decodable, so that the
original source sequence can be reconstructed perfectly
from the encoded binary sequence.
Code Efficiency of the Source Encoder
ICE M3011

Consider the arrangement shown in the figure below, which depicts a discrete memoryless source whose output is converted by the source encoder into a stream of 0s and 1s, denoted by bk. We assume that the source has an alphabet with K different symbols, and that the k-th symbol sk occurs with probability pk, k = 0, 1, …, K-1. Let the binary code-word assigned to symbol sk by the encoder have length lk, measured in bits.

[Figure: Discrete memoryless source → (sk) → Source encoder → (bk) → Binary sequence]
Code Efficiency of the Source Encoder
ICE M3011

We define the average code-word length, L, of the source encoder as

L = ∑k pk lk,  where the sum runs over k = 0, 1, …, K-1.

In physical terms, the parameter L represents the average number of bits per source symbol used in the source encoding process.

Let Lmin denote the minimum possible value of L. We then define the code efficiency of the source encoder as

η = Lmin / L

With L >= Lmin, we clearly have η <= 1. The encoder is said to be efficient when η approaches unity.
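
A short sketch of the two definitions above, using an assumed four-symbol source and codeword lengths; taking Lmin equal to the source entropy H(𝒮) is an extra assumption here (Shannon's source-coding theorem), not something stated on this slide.

import math

p = [0.5, 0.25, 0.125, 0.125]    # assumed symbol probabilities pk
l = [1, 2, 3, 3]                 # assumed codeword lengths lk in bits

L_avg = sum(pk * lk for pk, lk in zip(p, l))    # average code-word length L
H = sum(pk * math.log2(1 / pk) for pk in p)     # source entropy H(S)
eta = H / L_avg                                 # efficiency, assuming Lmin = H(S)
print(f"L = {L_avg} bits/symbol, H = {H} bits/symbol, efficiency = {eta:.2f}")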
Discrete Memoryless Sources
ICE M3011
Suppose that a probabilistic experiment involves the
observation of the output emitted by a discrete
source every unit of time. The source output is modelled as a discrete random variable, S, which takes on symbols from a fixed finite alphabet 𝒮,

with probabilities
P(S = sk) = pk,  k = 0, 1, …, K-1.
Discrete Memoryless Sources
ICE M3011
Of course, this set of probabilities must satisfy the condition

p0 + p1 + … + pK-1 = 1

We assume that the symbols emitted by the source during successive signaling intervals are statistically independent.
A source having the properties just described is called a discrete memoryless source.
Discrete Memoryless Channels ICE M3011

• A discrete memoryless channel is a statistical model with an input X


and an output Y that is a noisy version of X; Both X and Y are
random variables.
• Every unit of time, the channel accepts an input symbol X selected
from an alphabet X, and in response, it emits an output symbol Y
from an alphabet Y.
• The channel is said to be “discrete” when both of the alphabets X and Y have finite sizes. It is said to be “memoryless” when the current output symbol depends only on the current input symbol and not on any of the previous ones.
Discrete Memoryless Channels ICE M3011

Figure below depicts the view of a discrete memoryless


channel

The channel is described in terms of an input alphabet
X = {x0, x1, x2, …, xJ-1},
an output alphabet
Y = {y0, y1, y2, …, yK-1},
and a set of transition probabilities
p(yk | xj) = P(Y = yk | X = xj)  for all j and k.
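
A discrete memoryless channel can be sketched as a matrix of transition probabilities. The example below assumes a binary symmetric channel with crossover probability 0.1 (an illustrative case, not one defined on the slides); each output symbol is drawn using only the current input symbol.

import numpy as np

# Rows are inputs xj, columns are outputs yk: P[j, k] = p(yk | xj).
P = np.array([[0.9, 0.1],       # input 0 -> output 0 with prob 0.9, output 1 with prob 0.1
              [0.1, 0.9]])      # input 1 -> output 1 with prob 0.9, output 0 with prob 0.1

rng = np.random.default_rng(1)

def dmc(inputs, P):
    # Memoryless: every output depends only on the current input, via its row of P.
    return np.array([rng.choice(P.shape[1], p=P[x]) for x in inputs])

x = rng.integers(0, 2, size=10)   # a short input sequence over the alphabet {0, 1}
y = dmc(x, P)
print("input :", x)
print("output:", y)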
