REGULATION 2017
V SEMESTER ECE A & B
AY 2020-21
BATCH: 2018-22
Handled by
Ms. P. Malathi, Assoc. Prof., ECE
SCOPE OF THE COURSE
Core/ITES companies offering jobs with a strong knowledge in Electronic Circuits:
Samsung, Robert Bosch, Dell, Fujitsu, IBM, Panasonic, Havells, Microsoft, Bajaj Electronics, Intel, Texas Instruments, National Instruments, Microchip, HCL, Wipro.

Opportunities in Government organizations:
Indian Engineering Services (IES) Examination, BSNL JTO/TTA, DRDO, ISRO, BEL, ONGC, Power Grid Corporation of India Limited, Hindustan Aeronautics Limited (HAL), National Thermal Power Corporation (NTPC), Doordarshan, All India Radio (AIR), Bharat Heavy Electricals Limited (BHEL).

Higher Studies in India:
GATE
PRE-REQUISITES
EC 8252 - II SEMESTER - ELECTRONIC DEVICES
EC 8351 - III SEMESTER - ELECTRONIC CIRCUITS I
EC 8392 - III SEMESTER - DIGITAL ELECTRONICS
OBJECTIVES
NPTEL LINK:
1. https://nptel.ac.in/courses/117/101/117101051/
2. https://nptel.ac.in/courses/108/102/108102096/
3. https://nptel.ac.in/courses/108/101/108101113/
WEB RESOURCES:
1. https://www.youtube.com/watch?v=Z0Ylnk8zXRo
2. https://www.youtube.com/watch?v=qhjj6WG7Rgc
3. https://www.youtube.com/watch?v=S8X49TQxH-o
4. https://swayam.gov.in/nd1_noc20_ee17/preview
INTERLINKING OF ALL UNITS
UNIT 1: INFORMATION THEORY
Information – Measure of information
• Entropy of a random variable X: a measure of the uncertainty or ambiguity in X, where $X \in \{x_1, x_2, \ldots, x_L\}$:
$$H(X) = -\sum_{k=1}^{L} P[X = x_k] \log P[X = x_k]$$
• Example (grouping property of entropy):
$$H\left(\tfrac{1}{2}, \tfrac{1}{3}, \tfrac{1}{6}\right) = H\left(\tfrac{1}{2}, \tfrac{1}{2}\right) + \tfrac{1}{2}\, H\left(\tfrac{2}{3}, \tfrac{1}{3}\right)$$
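The definition and the grouping identity above are easy to verify numerically. A minimal Python sketch (the helper name `entropy` is ours, not from the slides):

```python
import math

def entropy(probs, base=2):
    """Shannon entropy: -sum p*log(p), skipping zero-probability terms."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Grouping identity: H(1/2, 1/3, 1/6) = H(1/2, 1/2) + (1/2) * H(2/3, 1/3)
lhs = entropy([1/2, 1/3, 1/6])
rhs = entropy([1/2, 1/2]) + (1/2) * entropy([2/3, 1/3])
print(lhs, rhs)  # both ~1.459 bits
```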
Shannon’s fundamental paper in 1948:
“A Mathematical Theory of Communication”
The only H satisfying the three assumptions is of the form
$$H = -K \sum_{i=1}^{n} p_i \log p_i$$
where K is a positive constant.
For a binary source with $P[X = 1] = p$:
$$H(X) = -p \log p - (1 - p) \log(1 - p)$$
• H = 0: no uncertainty
• H = 1: most uncertainty
• 1 bit for binary information
(Plot: H(X) versus the probability p, peaking at 1 bit for p = 1/2.)
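A small sketch of the binary entropy function at the endpoints and at p = 1/2 (the helper name `Hb` is ours):

```python
import math

def Hb(p):
    """Binary entropy function in bits; Hb(0) = Hb(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(p, round(Hb(p), 3))  # 0, 0.811, 1.0, 0.811, 0 -- peak at p = 1/2
```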
Mutual information
• Two discrete random variables, X and Y:
$$I(X;Y) = \sum_{x,y} P[X=x, Y=y]\, I(x;y) = \sum_{x,y} P[X=x, Y=y] \log \frac{P[x \mid y]}{P[x]} = \sum_{x,y} P[X=x, Y=y] \log \frac{P[x,y]}{P[x]\,P[y]}$$
• Measures the information that knowing either variable provides about the other.
• What if X and Y are fully independent or fully dependent?
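The two extremes can be checked with small joint tables (an illustrative numpy sketch; the tables are ours, not from the slides):

```python
import numpy as np

def mutual_information(pxy):
    """I(X;Y) in bits from a joint probability matrix pxy[i, j] = P[X=i, Y=j]."""
    px = pxy.sum(axis=1, keepdims=True)   # marginal P[X]
    py = pxy.sum(axis=0, keepdims=True)   # marginal P[Y]
    mask = pxy > 0
    return float((pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])).sum())

independent = np.array([[0.25, 0.25],
                        [0.25, 0.25]])    # X, Y independent -> I = 0
dependent   = np.array([[0.5, 0.0],
                        [0.0, 0.5]])      # Y = X -> I = H(X) = 1 bit
print(mutual_information(independent), mutual_information(dependent))
```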
$$I(X;Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X) = H(X) + H(Y) - H(X,Y)$$
Some properties:
• $I(X;Y) = I(Y;X)$
• $I(X;Y) \ge 0$
• $I(X;X) = H(X)$
• $I(X;Y) \le \min\{H(X), H(Y)\}$
• $0 \le H(X) \le \log L$; entropy is maximized when the probabilities are equal.
• If $Y = g(X)$, then $H(Y) \le H(X)$.
$$H(X,Y) = -\sum_{x,y} P[X=x, Y=y] \log P[X=x, Y=y]$$
$$H(Y \mid X) = \sum_{x} P[X=x]\, H(Y \mid X=x) = -\sum_{x,y} P[X=x, Y=y] \log P[Y=y \mid X=x]$$
• Therefore,
$$H(X_1, X_2, \ldots, X_n) \le \sum_{i=1}^{n} H(X_i)$$
• If the $X_i$ are i.i.d.,
$$H(X_1, X_2, \ldots, X_n) = nH(X)$$
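A numerical check of the chain rule $H(X,Y) = H(X) + H(Y \mid X)$ and of the subadditivity bound above (a sketch; the joint table is ours, chosen for illustration):

```python
import numpy as np

def entropy_np(p):
    """Entropy in bits of a probability array, ignoring zero entries."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

pxy = np.array([[0.3, 0.1],
                [0.2, 0.4]])     # illustrative joint P[X=x, Y=y]
px = pxy.sum(axis=1)             # marginal P[X]
py = pxy.sum(axis=0)             # marginal P[Y]

# H(Y|X) = sum_x P[X=x] * H(Y | X=x), from the normalized rows of pxy
H_y_given_x = sum(px[x] * entropy_np(pxy[x] / px[x]) for x in range(len(px)))

print(entropy_np(pxy), entropy_np(px) + H_y_given_x)        # chain rule: equal
print(entropy_np(pxy) <= entropy_np(px) + entropy_np(py))   # subadditivity: True
```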
6.3 Lossless coding of information source
$$p_i = P[X = x_i]$$
Encoding blocks of n source symbols needs about $nH(X)$ bits, so the rate is
$$R = \frac{nH(X)}{n} = H(X) \le \log L \ \text{bits per symbol}$$
Lossless source coding
Shannon’s First Theorem – Lossless Source Coding:
lossless coding is possible provided $R \ge H(X)$, where for a stationary source $H(X)$ is the entropy rate
$$H(X) = \lim_{k \to \infty} \frac{1}{k} H(X_1, X_2, \ldots, X_k) = \lim_{k \to \infty} H(X_k \mid X_1, X_2, \ldots, X_{k-1})$$
Example: a seven-symbol source with probabilities $P(x_1), \ldots, P(x_7)$ (values shown in the code-tree figure) and the code:
x1 → 00
x2 → 01
x3 → 10
x4 → 110
x5 → 1110
x6 → 11110
x7 → 11111
H(X) = 2.11, R = 2.21 bits per symbol
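A compact version of the standard greedy Huffman construction (a sketch; the five-symbol source below is illustrative, since the slide’s probabilities appear only in the figure):

```python
import heapq

def huffman_code(probs):
    """Build a Huffman code; returns {symbol: codeword} for a prob dict."""
    # Heap of (probability, tie-breaker, {symbol: partial codeword}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)   # two least probable groups
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

probs = {"x1": 0.4, "x2": 0.2, "x3": 0.2, "x4": 0.1, "x5": 0.1}
code = huffman_code(probs)
avg_len = sum(probs[s] * len(w) for s, w in code.items())
print(code, avg_len)  # average length 2.2 bits/symbol for this source
```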
A channel is memoryless if
$$P[\mathbf{y} \mid \mathbf{x}] = \prod_{i=1}^{n} P[y_i \mid x_i]$$
(Figure: a discrete memoryless channel mapping source data $x_0, \ldots, x_{M-1}$ to output data $y_0, \ldots, y_{Q-1}$.)
The transition probabilities $P[y \mid x]$ can be arranged in a matrix.
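For example, the transition matrix of a binary symmetric channel and the output distribution it induces (an illustrative numpy sketch; the crossover value is assumed):

```python
import numpy as np

p = 0.1                                  # crossover probability (illustrative)
P = np.array([[1 - p, p],                # row x holds P[Y = y | X = x]
              [p, 1 - p]])

px = np.array([0.5, 0.5])                # input distribution P[X]
py = px @ P                              # output distribution P[Y]
print(py)                                # [0.5, 0.5] for the symmetric channel
```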
Discrete-time channel:
• Power constraint: $E[X^2] \le P$
• For an input sequence $\mathbf{x} = (x_1, x_2, \ldots, x_n)$ with large n,
$$\frac{1}{n} \sum_{i=1}^{n} x_i^2 \approx E[X^2] \le P \quad\Longrightarrow\quad \|\mathbf{x}\|^2 \le nP$$
Waveform channel (input waveform → output waveform):
• Power constraint: $E[X^2(t)] \le P$
$$\lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x^2(t)\, dt \le P$$
Expand the noise and the received waveform over an orthonormal basis $\{\phi_j(t),\ j = 1, 2, \ldots, 2WT\}$:
$$n(t) = \sum_j n_j \phi_j(t), \qquad y(t) = \sum_j y_j \phi_j(t), \qquad y_j = x_j + n_j$$
Equivalent to 2W uses per second of a discrete-time channel.
The power constraint becomes $2W\, E[X^2] \le P$.
• Hence,
$$E[X^2] \le \frac{P}{2W}$$
Hartley-Shannon Law
Channel capacity
• After source coding, we have a binary sequence of length n.
• The channel causes a bit-error probability p.
• As $n \to \infty$, the number of sequences with np errors is
$$\binom{n}{np} = \frac{n!}{(np)!\,(n(1-p))!} \approx 2^{nH_b(p)}$$
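The $2^{nH_b(p)}$ approximation can be checked with Python’s exact binomial coefficient (a quick sketch; the values of n and p are illustrative):

```python
import math

def Hb(p):
    """Binary entropy function in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

n, p = 1000, 0.1
exact = math.log2(math.comb(n, int(n * p)))  # log2 of C(n, np)
print(exact, n * Hb(p))  # ~464 vs ~469; the ratio tends to 1 as n grows
```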
Channel capacity
• To reduce errors, we use a subset of all possible sequences:
$$M = \frac{2^n}{2^{nH_b(p)}} = 2^{n(1-H_b(p))}$$
• Information rate [bits per transmission]:
$$R = \frac{1}{n}\log_2 M = 1 - H_b(p) \quad \text{(capacity of the binary channel)}$$
• Since $0 \le R \le 1 - H_b(p) \le 1$, we cannot transmit more than 1 bit per channel use.
Channel capacity
• Capacity of an arbitrary discrete memoryless channel:
$$C = \max_{p(x)} I(X;Y)$$
Channel capacity
• For the binary symmetric channel, the maximizing input is $P[X=1] = P[X=0] = \tfrac{1}{2}$:
$$C = 1 + p \log_2 p + (1-p)\log_2(1-p) = 1 - H_b(p)$$
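A numerical confirmation that the uniform input maximizes I(X;Y) on the BSC, giving $C = 1 - H_b(p)$ (a sketch; the grid search over input distributions is ours):

```python
import numpy as np

def mutual_info_bsc(q, p):
    """I(X;Y) in bits for BSC crossover p and input P[X=1] = q."""
    pxy = np.array([[(1 - q) * (1 - p), (1 - q) * p],
                    [q * p, q * (1 - p)]])           # joint P[X, Y]
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return float((pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])).sum())

p = 0.11
qs = np.linspace(0.01, 0.99, 99)
best = max(qs, key=lambda q: mutual_info_bsc(q, p))
print(best, mutual_info_bsc(best, p))  # q = 0.5, C = 1 - Hb(0.11) ~ 0.5 bits
```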
Channel capacity
• Discrete-time AWGN channel with an input power constraint:
$$Y = X + N, \qquad E[X^2] \le P$$
• For large n,
$$\frac{1}{n}\|\mathbf{y}\|^2 \approx E[X^2] + E[N^2] \le P + \sigma^2$$
$$\frac{1}{n}\|\mathbf{y}-\mathbf{x}\|^2 = \frac{1}{n}\|\mathbf{n}\|^2 \approx \sigma^2$$
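The concentration behind the sphere-packing argument shows up immediately in simulation (an illustrative sketch; the values of P and σ² are assumed):

```python
import numpy as np

rng = np.random.default_rng(0)
n, P, sigma2 = 100_000, 1.0, 0.25        # assumed illustrative values

x = rng.normal(0, np.sqrt(P), n)         # input at the power limit
noise = rng.normal(0, np.sqrt(sigma2), n)
y = x + noise

print(np.mean(y ** 2))        # ~ P + sigma2 = 1.25
print(np.mean((y - x) ** 2))  # ~ sigma2     = 0.25
```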
Channel capacity
• Discrete-time AWGN channel with an input power constraint: $Y = X + N$, $E[X^2] \le P$.
• Maximum number of symbols to transmit (sphere packing):
$$M = \left(\frac{\sqrt{n(P+\sigma^2)}}{\sqrt{n\sigma^2}}\right)^{n} = \left(1 + \frac{P}{\sigma^2}\right)^{n/2}$$
• Transmission rate:
$$R = \frac{1}{n}\log_2 M = \frac{1}{2}\log_2\left(1 + \frac{P}{\sigma^2}\right)$$
Channel capacity
$$C = W \log_2\left(1 + \frac{P}{N_0 W}\right) \ \text{bits per second}$$
• As $W \to \infty$:
$$C \to \frac{P}{N_0}\log_2 e \approx 1.44\,\frac{P}{N_0}$$
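Plugging in numbers (a sketch; the 3 kHz bandwidth and 20 dB SNR match the telephone-channel exercise in the question bank below, with SNR = P/(N₀W)):

```python
import math

def capacity(W, snr):
    """Shannon-Hartley capacity in bits/s for bandwidth W (Hz) and linear SNR."""
    return W * math.log2(1 + snr)

W = 3000                  # telephone channel bandwidth, Hz
snr = 10 ** (20 / 10)     # 20 dB -> 100
print(capacity(W, snr))   # ~19,975 bits/s
```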
GATE QUESTIONS
(Adapted from Gatestudy.com; the question slides are not reproduced here.)
PREVIOUS YEAR AU QUESTIONS-PART A
1. What is entropy? Give its mathematical equation.
2. Define source coding. State the significance of source coding.
3. What is BSC?
4. Why is the Huffman code called a minimum-redundancy code?
5. An event has six possible outcomes with probabilities {1/2, 1/4, 1/8, 1/16, 1/32, 1/32}. Solve for the entropy of the system.
6. Outline the concept of a discrete memoryless source.
7. Calculate the amount of information if $p_k = 1/4$.
8. Identify the properties of entropy.
9. Describe information rate.
10. Interpret the theory of mutual information.
11. Describe the concept of a discrete memoryless channel.
12. List out the properties of Hamming distance.
13. Evaluate the Hamming distance between the following code words: C1 = {1,0,0,0,1,1,1} and C2 = {0,0,0,1,0,1,1}.
14. State the properties of mutual information.
15. Examine the types of discrete memoryless channels.
16. Give the main idea of channel capacity.
17. Summarize Shannon's law.
18. Formulate the steps involved in Shannon-Fano coding.
19. Distinguish the various source coding techniques.
20. Revise the steps involved in Huffman coding.
PREVIOUS YEAR AU QUESTIONS-PART B
1. Enumerate the Shannon-Fano algorithm and Huffman coding with a suitable example.
2. Five symbols of the alphabet of a discrete memoryless source and their probabilities are given below: S = {S0, S1, S2, S3, S4}, P(S) = {0.4, 0.19, 0.16, 0.15, 0.15}. Predict the symbols using Huffman coding and calculate the average codeword length and efficiency.
3. Illustrate the following with equations:
(i) Uncertainty
(ii) Information
(iii) Entropy and its properties
11. Five symbols of the alphabet of a discrete memoryless source and their probabilities are given below: S = {S0, S1, S2, S3, S4}, P(S) = {0.4, 0.2, 0.2, 0.1, 0.1}. Show the symbols using Shannon-Fano coding and calculate the average codeword length and efficiency.
12. A telephone channel has a bandwidth of 3 kHz. Predict the channel capacity of the telephone channel for an SNR of 20 dB. Estimate the minimum SNR required to support a rate of 5 kbps.
13. A telephone channel has a bandwidth of 3 kHz and an output SNR of 20 dB. The source has a total of 512 symbols and all symbols are equiprobable. Point out the following:
(i) channel capacity
(ii) information content per symbol
(iii) maximum symbol rate for which error-free transmission is possible
PREVIOUS YEAR AU QUESTIONS-PART C
1. The source of information A generates the symbols {A0, A1, A2, A3, A4} with the corresponding probabilities {0.4, 0.3, 0.15, 0.1, 0.05}. Evaluate the code for the source symbols using Huffman and Shannon-Fano encoders and compare their efficiency.
2. The source of information A generates the symbols {A0, A1, A2, A3, A4, A5} with the corresponding probabilities {0.45, 0.41, 0.4, 0.3, 0.29, 0.05}. Evaluate the code for the source symbols using Huffman and Shannon-Fano encoders and compare their efficiency.