
EC 8501 - DIGITAL COMMUNICATION
REGULATION 2017
V SEMESTER ECE A & B
AY 2020-21
BATCH: 2018-22

Handled by: MS. P. MALATHI, ASSOC. PROF., ECE
SCOPE OF THE COURSE
Core/ITES companies offering jobs with a strong knowledge in Electronic Circuits:
Samsung, Robert Bosch, Dell, Fujitsu, IBM, Panasonic, Havells, Microsoft, Bajaj Electronics, Intel, Texas Instruments, National Instruments, Microchip, HCL, Wipro

Opportunities in Government organizations:
Indian Engineering Services (IES) Examination, BSNL JTO/TTA, DRDO, ISRO, BEL, ONGC, Power Grid Corporation of India Limited, Hindustan Aeronautics Limited (HAL), National Thermal Power Corporation (NTPC), Doordarshan, All India Radio (AIR), Bharat Heavy Electricals Limited (BHEL)

Higher Studies in India: GATE
PRE-REQUISITES
EC 8252- II SEMESTER- ELECTRONIC DEVICES
EC 8351- III SEMESTER- ELECTRONIC CIRCUITS I
EC 8392- III SEMESTER- DIGITAL ELECTRONICS
OBJECTIVES
• To study the limits set by Information Theory
• To study the various waveform coding schemes
• To learn the various baseband transmission schemes
• To understand the various band pass signaling schemes
• To know the fundamentals of channel coding

COURSE OUTCOMES
At the end of the course the students would be able to (tools for assessment in parentheses):
CO-1: Design PCM systems (Quiz, Assignments)
CO-2: Design and implement baseband transmission schemes (Quiz, Assignments)
CO-3: Design and implement band pass signaling schemes (Quiz, Assignments)
CO-4: Analyze the spectral characteristics of band pass signaling schemes and their noise performance (Quiz, Assignments)
CO-5: Design error control coding schemes (Quiz, Assignments, Case Study)
SYLLABUS - EC 8501 DIGITAL COMMUNICATION (L-3 T-0 P-0 C-3)

UNIT I INFORMATION THEORY (9)
Discrete Memoryless source, Information, Entropy, Mutual Information - Discrete Memoryless channels – Binary Symmetric Channel, Channel Capacity - Hartley-Shannon law - Source coding theorem - Shannon-Fano & Huffman codes.

UNIT II WAVEFORM CODING & REPRESENTATION (9)
Prediction filtering and DPCM - Delta Modulation - ADPCM & ADM principles - Linear Predictive Coding - Properties of Line codes - Power Spectral Density of Unipolar/Polar RZ & NRZ – Bipolar NRZ - Manchester.

UNIT III BASEBAND TRANSMISSION & RECEPTION (9)
ISI – Nyquist criterion for distortionless transmission – Pulse shaping – Correlative coding - Eye pattern – Receiving Filters - Matched Filter, Correlation receiver, Adaptive Equalization.

UNIT IV DIGITAL MODULATION SCHEME (9)
Geometric Representation of signals - Generation, detection, PSD & BER of Coherent BPSK, BFSK & QPSK - QAM - Carrier Synchronization - Structure of Non-coherent Receivers - Principle of DPSK.

UNIT V ERROR CONTROL CODING (9)
Channel coding theorem - Linear Block codes - Hamming codes - Cyclic codes - Convolutional codes - Viterbi Decoder.
TEXT BOOKS, REFERENCE BOOKS, NPTEL LINKS & WEB RESOURCES
TEXT BOOKS:
1. S. Haykin, "Digital Communications", John Wiley, 2005 (Units I-V)
REFERENCE BOOKS:
1. B. Sklar, "Digital Communication Fundamentals and Applications", 2nd Edition, Pearson Education, 2009
2. B. P. Lathi, "Modern Digital and Analog Communication Systems", 3rd Edition, Oxford University Press, 2007
3. H. P. Hsu, Schaum's Outline Series, "Analog and Digital Communications", TMH, 2006
4. J. G. Proakis, "Digital Communication", 4th Edition, Tata McGraw Hill, 2001

NPTEL LINK:
1. https://nptel.ac.in/courses/117/101/117101051/
2. https://nptel.ac.in/courses/108/102/108102096/
3. https://nptel.ac.in/courses/108/101/108101113/

WEB RESOURCES:
1. https://www.youtube.com/watch?v=Z0Ylnk8zXRo
2. https://www.youtube.com/watch?v=qhjj6WG7Rgc
3. https://www.youtube.com/watch?v=S8X49TQxH-o
4. https://swayam.gov.in/nd1_noc20_ee17/preview
INTERLINKING OF ALL UNITS

UNIT 1: Information Theory → UNIT 2: Waveform Coding → UNIT 3: Baseband Transmission and Reception → UNIT 4: Digital Modulation Techniques → UNIT 5: Error Control Codes
PRATHYUSHA ENGINEERING COLLEGE
DEPARTMENT OF ECE
EC 8501 DIGITAL COMMUNICATION

UNIT I INFORMATION THEORY (9)
• Discrete Memoryless source, Information, Entropy, Mutual Information
• Discrete Memoryless channels – Binary Symmetric Channel, Channel Capacity – Hartley-Shannon law
• Source coding theorem
• Shannon-Fano & Huffman codes

Mathematical models for information source
• Discrete source with alphabet $X = \{x_1, x_2, \ldots, x_L\}$
• Symbol probabilities $p_k = P[X = x_k]$, with $\sum_{k=1}^{L} p_k = 1$
Discrete Memoryless Source
• Discrete memoryless source (DMS): the source outputs are independent random variables $\{X_i\},\ i = 1, 2, \ldots, N$
• Discrete stationary source:
– Source outputs are statistically dependent
– Stationary: the joint probabilities of $(x_1, x_2, \ldots, x_n)$ and $(x_{1+m}, x_{2+m}, \ldots, x_{n+m})$ are identical for all shifts $m$
– Characterized by the joint PDF $p(x_1, x_2, \ldots, x_n)$
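As a quick illustration of the DMS model, here is a minimal NumPy sketch; the alphabet and probabilities below are assumed for illustration, not taken from the slides. Each output symbol is drawn independently from the same pmf, which is exactly the memoryless property:

```python
import numpy as np

# Hypothetical DMS: four-symbol alphabet with an assumed pmf (must sum to 1)
symbols = np.array(["x1", "x2", "x3", "x4"])
probs = np.array([0.5, 0.25, 0.125, 0.125])

rng = np.random.default_rng(0)
# Memoryless: every symbol is drawn independently from the same distribution
sequence = rng.choice(symbols, size=20, p=probs)
print(sequence)
```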


Measure of information
• Entropy of random variable $X$ with alphabet $\{x_1, x_2, \ldots, x_L\}$: a measure of uncertainty or ambiguity in $X$
$$H(X) = -\sum_{k=1}^{L} P[X = x_k] \log P[X = x_k]$$
• A measure of the information that knowledge of $X$ provides, i.e. the information content of $X$ per symbol
• Unit: bits ($\log_2$) or nats ($\log_e$) per symbol
• By convention, we define $0 \log 0 = 0$
• Entropy depends on the probabilities of $X$, not on the values of $X$
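A minimal sketch of this computation, using the $0 \log 0 = 0$ convention; the example pmfs are illustrative:

```python
import math

def entropy(probs, base=2):
    """H(X) = -sum p*log(p), with the convention 0*log(0) = 0."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# The four-symbol pmf used earlier gives H = 1.75 bits/symbol
print(entropy([0.5, 0.25, 0.125, 0.125]))   # 1.75
# A uniform pmf maximizes entropy: H = log2(4) = 2 bits
print(entropy([0.25, 0.25, 0.25, 0.25]))    # 2.0
```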
Shannon's fundamental paper in 1948
“A Mathematical Theory of Communication”
Can we define a quantity which will measure how much
information is “produced” by a process?
He wants this measure $H(p_1, p_2, \ldots, p_n)$ to satisfy:
1) $H$ should be continuous in the $p_i$
2) If all $p_i$ are equal, $H$ should be monotonically increasing with $n$
3) If a choice can be broken down into two successive choices, the original $H$ should be the weighted sum of the individual values of $H$
Example of property 3:
$$H\!\left(\tfrac{1}{2}, \tfrac{1}{3}, \tfrac{1}{6}\right) = H\!\left(\tfrac{1}{2}, \tfrac{1}{2}\right) + \tfrac{1}{2}\, H\!\left(\tfrac{2}{3}, \tfrac{1}{3}\right)$$
The only $H$ satisfying the three assumptions is of the form:
$$H = -K \sum_{i=1}^{n} p_i \log p_i$$

K is a positive constant.

Binary entropy function
$$H(p) = -p \log_2 p - (1-p) \log_2 (1-p)$$
• $H = 0$: no uncertainty
• $H = 1$: maximum uncertainty (1 bit for binary information)
(Plot: binary entropy $H(p)$ versus probability $p$.)
Mutual information
• Two discrete random variables: $X$ and $Y$
$$I(X;Y) = \sum_{x,y} P[X=x, Y=y]\, I(x;y) = \sum_{x,y} P[X=x, Y=y] \log \frac{P[x \mid y]}{P[x]} = \sum_{x,y} P[x,y] \log \frac{P[x,y]}{P[x]\,P[y]}$$
• Measures the information that knowing either variable provides about the other
• What if X and Y are fully independent or fully dependent?
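The following sketch evaluates the last expression above from a joint pmf matrix; the two example joints are assumed for illustration and answer the question: independence gives $I = 0$, full dependence gives $I = H(X)$.

```python
import numpy as np

def mutual_information(p_xy):
    """I(X;Y) = sum p(x,y) log2( p(x,y) / (p(x)p(y)) ) from a joint pmf matrix."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of X (rows)
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of Y (columns)
    mask = p_xy > 0                          # skip zero entries (0*log0 = 0)
    return float((p_xy[mask] * np.log2(p_xy[mask] / (p_x @ p_y)[mask])).sum())

# Independent X and Y: joint = product of marginals, so I(X;Y) = 0
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
# Fully dependent (Y = X): I(X;Y) = H(X) = 1 bit
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0
```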
$$I(X;Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X) = H(X) + H(Y) - H(X,Y)$$

Some properties
• $I(X;Y) = I(Y;X)$
• $I(X;Y) \ge 0$
• $I(X;X) = H(X)$
• $I(X;Y) \le \min\{H(X), H(Y)\}$
• $0 \le H(X) \le \log L$ — entropy is maximized when the probabilities are equal
• If $Y = g(X)$, then $H(Y) \le H(X)$
Joint and conditional entropy


• Joint entropy:
$$H(X, Y) = -\sum_{x,y} P[X=x, Y=y] \log P[X=x, Y=y]$$
• Conditional entropy of $Y$ given $X$:
$$H(Y \mid X) = \sum_{x} P[X=x]\, H(Y \mid X=x) = -\sum_{x,y} P[X=x, Y=y] \log P[Y=y \mid X=x]$$
Joint and conditional entropy


• Chain rule for entropies:
$$H(X_1, X_2, \ldots, X_n) = H(X_1) + H(X_2 \mid X_1) + H(X_3 \mid X_1, X_2) + \cdots + H(X_n \mid X_1, X_2, \ldots, X_{n-1})$$
• Therefore,
$$H(X_1, X_2, \ldots, X_n) \le \sum_{i=1}^{n} H(X_i)$$
• If the $X_i$ are i.i.d.,
$$H(X_1, X_2, \ldots, X_n) = nH(X)$$
Lossless coding of information source
• Source sequence of length $n$ ($n$ assumed large): $\mathbf{x} = [X_1, X_2, \ldots, X_n]$
• Each $X_i \in \mathcal{X} = \{x_1, x_2, \ldots, x_L\}$, with $p_i = P[X = x_i]$
• Without any source coding, we need $\log L$ bits per symbol

Lossless source coding


• Typical sequence
– The number of occurrences of $x_i$ is roughly $np_i$
– As $n \to \infty$, any observed $\mathbf{x}$ will (with high probability) be "typical"
$$\log P[\mathbf{x}] \approx \log \prod_{i=1}^{L} p_i^{\,np_i} = \sum_{i=1}^{L} np_i \log p_i = -nH(X)$$
$$P[\mathbf{x}] \approx 2^{-nH(X)} \quad \text{— all typical sequences have the same probability}$$
$$P[\mathbf{x} \text{ is typical}] \to 1 \ \text{as} \ n \to \infty$$
• Number of typical sequences $\approx \dfrac{1}{P[\mathbf{x}]} = 2^{nH(X)}$
• Since typical sequences are almost certain to occur, it is sufficient to encode only these typical sequences
• How many bits per symbol do we need now?
$$R = \frac{nH(X)}{n} = H(X) \le \log L$$
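A small simulation illustrating this typicality argument, for an assumed binary source with $P[1] = 0.2$: the per-symbol log-probability of a long observed sequence concentrates around $H(X) = H_b(0.2) \approx 0.722$ bits.

```python
import numpy as np

p, n = 0.2, 100_000                        # assumed source parameter and length
rng = np.random.default_rng(1)
x = rng.random(n) < p                      # one long i.i.d. source sequence
k = x.sum()                                # number of ones (roughly n*p)

# log2-probability of the observed sequence: P(x) = p^k (1-p)^(n-k)
log_prob = k * np.log2(p) + (n - k) * np.log2(1 - p)
h = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))
print(-log_prob / n, h)                    # both ≈ 0.722 bits/symbol
```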
Shannon’s First Theorem - Lossless Source Coding

Let $X$ denote a discrete memoryless source. There exists a lossless source code for $X$ at rate $R$ if
$$R \ge H(X) \ \text{bits per transmission}$$

For discrete stationary source…

R  H ( X )
1
 lim H ( X 1 , X 2 , , X k )
k  k

 lim H ( X k | X 1 , X 2 , , X k 1 )
k 

Lossless source coding algorithms


• Variable-length coding algorithm
– Symbols with higher probability are assigned shorter code words
– Choose the code lengths $\{n_k\}$ to minimize the average rate: $\min_{\{n_k\}} \bar{R} = \sum_{k=1}^{L} n_k P(x_k)$
– E.g. Huffman coding
• Fixed-length coding algorithm
– E.g. Lempel-Ziv coding
Huffman coding algorithm

Worked example for a seven-symbol source (the tree construction from P(x1)…P(x7) was shown graphically):
x1 → 00, x2 → 01, x3 → 10, x4 → 110, x5 → 1110, x6 → 11110, x7 → 11111
H(X) = 2.11 bits/symbol; average codeword length R̄ = 2.21 bits per symbol
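A compact heap-based Huffman sketch. Since the slide shows the probabilities P(x1)…P(x7) only graphically, the pmf below is assumed for illustration and will generally not reproduce the R̄ = 2.21 figure above:

```python
import heapq
import itertools

def huffman_code(probs):
    """Build a binary Huffman code; probs maps symbol -> probability."""
    counter = itertools.count()            # tie-breaker so dicts are never compared
    heap = [(p, next(counter), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)    # two least probable nodes
        p2, _, c2 = heapq.heappop(heap)
        # prepend a bit to every codeword in each merged subtree
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, next(counter), merged))
    return heap[0][2]

# Assumed probabilities (illustrative only)
probs = {"x1": 0.35, "x2": 0.20, "x3": 0.15, "x4": 0.12,
         "x5": 0.10, "x6": 0.05, "x7": 0.03}
code = huffman_code(probs)
avg_len = sum(probs[s] * len(w) for s, w in code.items())
print(code)
print("average length:", avg_len, "bits/symbol")
```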
Channel models and channel capacity


• Channel models
– Input sequence $\mathbf{x} = (x_1, x_2, \ldots, x_n)$; output sequence $\mathbf{y} = (y_1, y_2, \ldots, y_n)$
– A channel is memoryless if
$$P[\mathbf{y} \mid \mathbf{x}] = \prod_{i=1}^{n} P[y_i \mid x_i]$$

Binary symmetric channel (BSC) model

Source data → Channel encoder → Binary modulator → Channel → Demodulator and detector → Channel decoder → Output data
The composite modulator–channel–detector forms a discrete-input, discrete-output channel.



Transition diagram: input 0 → output 0 with probability $1-p$ and 0 → 1 with probability $p$; input 1 → output 1 with probability $1-p$ and 1 → 0 with probability $p$.
$$P[Y=0 \mid X=1] = P[Y=1 \mid X=0] = p$$
$$P[Y=1 \mid X=1] = P[Y=0 \mid X=0] = 1-p$$

Discrete memoryless channel (DMC)


Inputs $\{x_0, x_1, \ldots, x_{M-1}\}$; outputs $\{y_0, y_1, \ldots, y_{Q-1}\}$. The transition probabilities $P[y \mid x]$ can be arranged in a matrix.
Discrete-input continuous-output channel


Y  X N
If N is additive white Gaussian noise…
( y  x )2
1 
2 2
p ( y | x)  e
2
2
n
p( y1 , y2 ,  , yn | x1 , x2 , , xn )   p( yi | xi )
i 1

Discrete-time AWGN channel


yi  xi  ni

2
E[ X ]  P
• Power constraint x  ( x1 , x2 , , xn )
• For input sequence with large
n 1 n 1 2
n
 x
i 1
2
i 
n
x P

AWGN waveform channel


Source data → Channel encoder → Modulator → Physical channel → Demodulator and detector → Channel decoder → Output data
• Assume the channel has bandwidth $W$, with frequency response $C(f) = 1$ for $f \in [-W, +W]$
$$y(t) = x(t) + n(t)$$

AWGN waveform channel

• Power constraint:
$$E[X^2(t)] \le P, \qquad \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x^2(t)\, dt \le P$$

AWGN waveform channel


• How do we define probabilities that characterize the channel? Expand the waveforms over an orthonormal basis $\{\phi_j(t),\ j = 1, 2, \ldots, 2WT\}$:
$$x(t) = \sum_j x_j \phi_j(t), \qquad n(t) = \sum_j n_j \phi_j(t), \qquad y(t) = \sum_j y_j \phi_j(t), \qquad y_j = x_j + n_j$$
• Equivalent to $2W$ uses per second of a discrete-time channel

AWGN waveform channel


• The power constraint becomes:
$$\lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x^2(t)\, dt = \lim_{T \to \infty} \frac{1}{T} \sum_{j=1}^{2WT} x_j^2 = \lim_{T \to \infty} \frac{1}{T}\, 2WT\, E[X^2] = 2W E[X^2] \le P$$
• Hence,
$$E[X^2] \le \frac{P}{2W}$$
Hartley Shannon Law
(Stated on the slide as an image; the statement and derivation follow in the next slides.)
Channel capacity
• After source coding, we have a binary sequence of length $n$
• The channel causes bit errors with probability $p$
• As $n \to \infty$, the number of sequences with $np$ errors is
$$\binom{n}{np} = \frac{n!}{(np)!\,\big(n(1-p)\big)!} \approx 2^{nH_b(p)}$$


Channel capacity
• To reduce errors, we use only a subset of all possible sequences:
$$M = \frac{2^n}{2^{nH_b(p)}} = 2^{n(1-H_b(p))}$$
• Information rate [bits per transmission]:
$$R = \frac{1}{n} \log_2 M = 1 - H_b(p) \qquad \text{(capacity of the binary channel)}$$


Channel capacity
$$0 \le R \le 1 - H_b(p) \le 1$$
We cannot transmit more than 1 bit per channel use.
Channel encoder: add redundancy. The $2^n$ different binary sequences of length $n$ contain the information; for transmission we map them to longer binary sequences of length $m$, of which $2^m$ are available.
Channel capacity
• Capacity of an arbitrary discrete memoryless channel:
$$C = \max_{\mathbf{p}} I(X;Y)$$
• Maximize the mutual information between input and output over all input distributions $\mathbf{p} = (p_1, p_2, \ldots, p_{|\mathcal{X}|})$
• Shannon's Second Theorem (noisy channel coding):
– $R < C$: reliable communication is possible
– $R > C$: reliable communication is impossible


Channel capacity of the BSC
For the binary symmetric channel with $P[X=1] = P[X=0] = \tfrac{1}{2}$:
$$C = 1 + p \log_2 p + (1-p) \log_2 (1-p) = 1 - H_b(p)$$

Channel capacity
Discrete-time AWGN channel with an input power constraint:
$$Y = X + N, \qquad E[X^2] \le P$$
For large $n$,
$$\frac{1}{n} \lVert \mathbf{y} \rVert^2 \approx E[X^2] + E[N^2] = P + \sigma^2, \qquad \frac{1}{n} \lVert \mathbf{y} - \mathbf{x} \rVert^2 = \frac{1}{n} \lVert \mathbf{n} \rVert^2 \approx \sigma^2$$

Channel capacity
Discrete-time AWGN channel with an input power constraint $Y = X + N$, $E[X^2] \le P$.
Maximum number of distinguishable signals (sphere packing):
$$M = \left( \frac{\sqrt{n(P + \sigma^2)}}{\sqrt{n\sigma^2}} \right)^{n} = \left( 1 + \frac{P}{\sigma^2} \right)^{n/2}$$
Transmission rate:
$$R = \frac{1}{n} \log_2 M = \frac{1}{2} \log_2 \left( 1 + \frac{P}{\sigma^2} \right)$$
The same result can be obtained by directly maximizing $I(X;Y)$ subject to the power constraint.
Channel capacity
Band-limited waveform AWGN channel with input power constraint:
• Equivalent to $2W$ uses per second of the discrete-time channel
$$C = \frac{1}{2} \log_2 \left( 1 + \frac{P/2W}{N_0/2} \right) = \frac{1}{2} \log_2 \left( 1 + \frac{P}{N_0 W} \right) \ \text{bits/channel use}$$
$$C = 2W \cdot \frac{1}{2} \log_2 \left( 1 + \frac{P}{N_0 W} \right) = W \log_2 \left( 1 + \frac{P}{N_0 W} \right) \ \text{bits/s}$$
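A sketch of the Hartley-Shannon computation; the usage example reuses the telephone-channel numbers that appear in the question bank at the end of this unit (W = 3 kHz, SNR = 20 dB):

```python
import math

def shannon_capacity(W_hz, snr_db):
    """C = W log2(1 + SNR) bits/s for a band-limited AWGN channel."""
    snr = 10 ** (snr_db / 10)              # convert dB to linear SNR
    return W_hz * math.log2(1 + snr)

# Telephone channel: W = 3 kHz, SNR = 20 dB (linear SNR = 100)
print(shannon_capacity(3000, 20))          # ≈ 19,975 bits/s
```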
Channel capacity
$$C = W \log_2 \left( 1 + \frac{P}{N_0 W} \right)$$
As $W \to \infty$,
$$C \to \frac{P}{N_0} \log_2 e \approx 1.44\, \frac{P}{N_0}$$

Channel capacity
• Bandwidth efficiency:
$$r = \frac{R}{W} = \log_2 \left( 1 + \frac{P}{N_0 W} \right), \qquad \mathcal{E}_b = \frac{P T_s}{\log_2 M} = \frac{P}{R}$$
$$r = \log_2 \left( 1 + \frac{\mathcal{E}_b R}{N_0 W} \right) = \log_2 \left( 1 + r\, \frac{\mathcal{E}_b}{N_0} \right)$$
• Relation of bandwidth efficiency and power efficiency:
$$\frac{\mathcal{E}_b}{N_0} = \frac{2^r - 1}{r}, \qquad r \to 0:\ \frac{\mathcal{E}_b}{N_0} \to \ln 2 = -1.6\ \text{dB}$$
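A quick numeric check that $\mathcal{E}_b/N_0 = (2^r - 1)/r$ approaches the Shannon limit $\ln 2 \approx -1.59$ dB as $r \to 0$ (the r values are illustrative):

```python
import math

# Eb/N0 = (2**r - 1)/r approaches ln 2 ≈ 0.693 (about -1.59 dB) as r -> 0
for r in (4.0, 1.0, 0.1, 0.001):
    ebn0 = (2**r - 1) / r
    print(f"r = {r:>6}: Eb/N0 = {ebn0:.4f} ({10 * math.log10(ebn0):+.2f} dB)")
```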


Shannon Fano Code
(The worked Shannon-Fano coding examples on these slides were presented as images; a code sketch of the algorithm follows.)
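A recursive Shannon-Fano sketch under the usual "split into halves of nearly equal probability" rule. The example pmf adapts the Part B question below; its listed probabilities {0.4, 0.19, 0.16, 0.15, 0.15} sum to 1.05, so the last value is taken as 0.10 here to form a valid pmf (illustrative only):

```python
def shannon_fano(symbols):
    """Shannon-Fano coding: symbols is a list of (symbol, probability) pairs.
    Returns a dict mapping each symbol to its binary codeword."""
    symbols = sorted(symbols, key=lambda sp: sp[1], reverse=True)
    code = {}

    def split(group, prefix):
        if len(group) == 1:
            code[group[0][0]] = prefix or "0"   # single-symbol edge case
            return
        total = sum(p for _, p in group)
        running, best_diff, cut = 0.0, float("inf"), 1
        # choose the split point that makes the two halves' masses closest
        for i in range(1, len(group)):
            running += group[i - 1][1]
            diff = abs(total - 2 * running)
            if diff < best_diff:
                best_diff, cut = diff, i
        split(group[:cut], prefix + "0")
        split(group[cut:], prefix + "1")

    split(symbols, "")
    return code

src = [("S0", 0.4), ("S1", 0.19), ("S2", 0.16), ("S3", 0.15), ("S4", 0.10)]
code = shannon_fano(src)
print(code)
print("avg length:", sum(p * len(code[s]) for s, p in src), "bits/symbol")
```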
QUIZ AND ASSIGNMENTS

Quizzes (link will be shared):
QUIZ-1: Unit 1
QUIZ-2: Units 2, 3
QUIZ-3: Units 4, 5

Assignments (link will be shared):
A1: Units 1, 2
A2: Unit 3
A3: Unit 4
IES QUESTIONS
(Presented as slide images.)
GATE QUESTIONS
(Adapted from Gatestudy.com; presented as slide images.)
PREVIOUS YEAR AU QUESTIONS-PART A
1. What is entropy? Give its mathematical equation.
2. Define source coding. State the significance of source coding.
3. What is BSC?
4. Why is the Huffman code called a minimum redundancy code?
5. An event has six possible outcomes with probabilities {1/2, 1/4, 1/8, 1/16, 1/32, 1/32}. Solve for the entropy of the system.
6. Outline the concept of a discrete memoryless source.
7. Calculate the amount of information if p_k = 1/4.
8. Identify the properties of entropy.
9. Describe information rate.
10. Interpret the theory of mutual information.
11. Describe the concept of a discrete memoryless channel.
12. List out the properties of Hamming distance.
13. Evaluate the Hamming distance between the code words C1={1,0,0,0,1,1,1} and C2={0,0,0,1,0,1,1}.
14. State the properties of mutual information.
15. Examine the types of discrete memoryless channels.
16. Give the main idea of channel capacity.
17. Summarize Shannon's law.
18. Formulate the steps involved in Shannon-Fano coding.
19. Distinguish the various source coding techniques.
20. Revise the steps involved in Huffman coding.
PREVIOUS YEAR AU QUESTIONS-PART B
1. Enumerate the Shannon-Fano algorithm and Huffman coding with a suitable example.
2. Five symbols of the alphabet of a discrete memoryless source and their probabilities are given below: S = {S0, S1, S2, S3, S4}, P(S) = {0.4, 0.19, 0.16, 0.15, 0.15}. Predict the symbols using Huffman coding and calculate the average codeword length and efficiency.
3. Illustrate the following with equations: (i) Uncertainty (ii) Information (iii) Entropy and its properties.
4. (i) Infer Hamming codes. Analyse the conditions which Hamming codes have to satisfy. (ii) Examine the following terms: code efficiency, channel data rate and code rate.
5. Five symbols of the alphabet of a discrete memoryless source and their probabilities are given below: S = {S0, S1, S2, S3, S4}, P(S) = {0.4, 0.19, 0.16, 0.15, 0.15}. Point out the symbols using Shannon-Fano coding and calculate the average codeword length and efficiency.
6. (i) Summarize source coding with a block diagram and mention its functional requirements. (ii) Deduce the equations for average codeword length and coding efficiency using entropy.
7. (i) Give the main idea of a discrete memoryless channel and its matrix form involving transition probabilities. (ii) Relate the concept of the binary symmetric channel with the binary communication channel and the binary erasure channel.
8. Interpret the following: (i) Mutual information and its properties. (ii) Channel capacity and its equation.
9. Five symbols of the alphabet of a discrete memoryless source and their probabilities are given below: S = {S0, S1, S2, S3, S4}, P(S) = {0.4, 0.2, 0.2, 0.1, 0.1}. Construct the symbols using Huffman coding and calculate the average codeword length and efficiency.
10. Brief the properties of entropy. Describe the BSC and BEC with their channel diagrams and transition matrices.
11. Five symbols of the alphabet of a discrete memoryless source and their probabilities are given below: S = {S0, S1, S2, S3, S4}, P(S) = {0.4, 0.2, 0.2, 0.1, 0.1}. Show the symbols using Shannon-Fano coding and calculate the average codeword length and efficiency.
12. A telephone channel has a bandwidth of 3 kHz. Predict the channel capacity of the telephone channel for an SNR of 20 dB. Estimate the minimum SNR required to support a rate of 5 kbps.
13. A telephone channel has a bandwidth of 3 kHz and an output SNR of 20 dB. The source has a total of 512 symbols, all equiprobable. Point out the following: (i) channel capacity, (ii) information content per symbol, (iii) maximum symbol rate for which error-free transmission is possible.
PREVIOUS YEAR AU QUESTIONS-PART C
1. The source of information A generates the symbols {A0, A1, A2, A3, A4} with the corresponding probabilities {0.4, 0.3, 0.15, 0.1, 0.05}. Evaluate the code for the source symbols using Huffman and Shannon-Fano encoders and compare their efficiencies.
2. The source of information A generates the symbols {A0, A1, A2, A3, A4, A5} with the corresponding probabilities {0.45, 0.41, 0.4, 0.3, 0.29, 0.05}. Evaluate the code for the source symbols using Huffman and Shannon-Fano encoders and compare their efficiencies.
