Assessments:
• C1: 30 Marks
– Class performance (10 Marks)
– Lab performance (10 Marks)
– Review test (10 Marks)
• C2: 30 Marks
– Class performance (10 Marks)
– Lab performance (10 Marks)
– Viva on theory + Lab (10 Marks)
• C3: 40 Marks
– Summative Theory Assessment
Electrical Communication System
Digital Signals:
• The source draws from an alphabet of M >= 2 different symbols and produces output symbols at some average rate r.
[Block diagram: message source → source coder → error-control coder → line coder → pulse shaping → pulse modulator, with example bit streams (1011, 110010) shown between stages]
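Such a discrete memoryless source is easy to simulate. Below is a minimal Python sketch; the alphabet, probabilities, and output length are illustrative assumptions, not values from the slides.

```python
import random

# Illustrative M-ary discrete source (M = 4); probabilities are assumed.
alphabet = ["A", "B", "C", "D"]
probs = [0.5, 0.25, 0.125, 0.125]   # must sum to 1

def emit(n):
    """Emit n symbols, each drawn independently (memoryless source)."""
    return random.choices(alphabet, weights=probs, k=n)

print("".join(emit(10)))   # e.g. "ACABDAACBA"
```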
C. E. Shannon, “A Mathematical Theory of Communication,” The Bell System Technical Journal, Vol. 27, pp. 379–423, July 1948.
Shannon’s Answer:
• The only mathematical function that retains the previously stated properties of information for a symbol produced by a discrete source is
Ii = log(1/Pi)
• The base of the logarithm defines the unit of information; with base 2 the unit is bits.
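As a quick numeric check of this formula (the symbol probabilities below are assumed for illustration):

```python
import math

# Assumed symbol probabilities, for illustration only.
probs = {"A": 0.5, "B": 0.25, "C": 0.125, "D": 0.125}

for symbol, p in probs.items():
    info = math.log2(1 / p)      # Ii = log2(1/Pi), in bits
    print(f"{symbol}: Pi = {p}, Ii = {info} bits")
# Rarer symbols carry more information: A -> 1 bit, D -> 3 bits.
```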
• Shannon said, “Von Neumann told me, ‘You should call it (average information) entropy, for two reasons.
– In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name.
– In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.’”
http://hyperphysics.phy-astr.gsu.edu/hbase/therm/entrop.html
Coding for a Memoryless Source
• Generally the information source is not of the designer's choice; source coding is therefore done so that the source appears equally likely to the channel.
• Coding should neither generate nor destroy any information produced by the source, i.e., the rate of information at the input and output of a source coder should be the same.
• If a source with entropy H(X) generates symbols at a rate of r symbols/sec, then the information rate is
R = r*H(X), with R <= r*log(M)
• If a binary encoder emitting bits at rate rb is used, then its output information rate = rb*Ω(p) <= rb (with equality if the 0's and 1's are equally likely in the coded sequence; Ω is the binary entropy function).
• Thus, as per the basic principle of coding theory,
R {= r*H(X)} <= rb ;  H(X) <= rb/r ;  H(X) <= N
where N = rb/r is the average codeword length in bits per source symbol.
• Code efficiency = H(X)/N <= 100% (illustrated in the sketch below).
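A minimal Python sketch of these quantities, assuming a four-symbol source, a symbol rate r, and an average code length N (all values illustrative):

```python
import math

# Assumed source: four symbols with these probabilities (illustrative only).
probs = [0.5, 0.25, 0.125, 0.125]
r = 1000          # assumed symbol rate, symbols/sec
N = 1.75          # assumed average codeword length, bits/symbol

H = sum(p * math.log2(1 / p) for p in probs)   # entropy H(X), bits/symbol
R = r * H                                      # information rate, bits/sec
print(f"H(X) = {H} bits/symbol")               # 1.75
print(f"R = {R} bits/sec, bound r*log2(M) = {r * math.log2(len(probs))}")
print(f"Code efficiency = {H / N:.0%}")        # 100% when N = H(X)
```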
Unique Decipherability (Kraft’s Inequality)
• A uniquely decipherable code with codeword lengths Ni must satisfy Kraft’s inequality (a quick check follows below):
K = ∑ 2^(-Ni) <= 1
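The Kraft sum can be checked in a couple of lines (the codeword lengths below are assumed for illustration):

```python
lengths = [1, 2, 3, 3]                    # assumed codeword lengths Ni
K = sum(2 ** -n for n in lengths)         # Kraft sum
print(K, "OK" if K <= 1 else "violates Kraft's inequality")   # 1.0 OK
```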
Source Coding Theorem
• We know that an optimum code requires K = 1 and
H(X) <= N < H(X) + φ
where φ should be very small.
Proof:
• It is known that ∑ Pi*log(Qi/Pi) <= 0.
• As per Kraft’s inequality, 1 = (1/K)*∑ 2^(-Ni); thus it can be assumed that Qi = 2^(-Ni)/K (so that all the Qi add up to 1).
• Substituting gives ∑ Pi*{log(1/Pi) – Ni – log(K)} <= 0.
• Equality holds when Pi = Qi = 2^(-Ni)/K; with K = 1 this gives Ni = log(1/Pi), i.e., Ni = Ii: the length of a codeword should be proportional to its information (inversely proportional to its probability).
• Samuel Morse applied this principle long before Shannon proved it mathematically.
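The bound is easy to check numerically. The sketch below (source probabilities assumed) uses the lengths Ni = ceil(log2(1/Pi)), which satisfy Kraft’s inequality and give H(X) <= N < H(X) + 1:

```python
import math

probs = [0.4, 0.3, 0.2, 0.1]                             # assumed probabilities
lengths = [math.ceil(math.log2(1 / p)) for p in probs]   # Ni = ceil(log2(1/Pi))

K = sum(2 ** -n for n in lengths)                 # Kraft sum, must be <= 1
H = sum(p * math.log2(1 / p) for p in probs)      # entropy H(X)
N = sum(p * n for p, n in zip(probs, lengths))    # average codeword length

print(f"K = {K} (<= 1)")
print(f"H(X) = {H:.3f} <= N = {N:.3f} < H(X)+1 = {H + 1:.3f}")
```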
Source Coding Techniques
• Huffman coding: add the two least symbol probabilities and rearrange, repeating until only two elements remain, then back-trace for the code (a sketch follows after this list).
• nth extension: form a group by combining n consecutive symbols, then code the group as one super-symbol.
• Lempel–Ziv: build a table of previously seen phrases to compress binary data.
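A minimal Huffman sketch in Python using a heap; the symbols and probabilities are assumed for illustration:

```python
import heapq

def huffman(probs):
    """Build a Huffman code: repeatedly merge the two least-probable nodes."""
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p0, _, code0 = heapq.heappop(heap)   # least probable subtree
        p1, _, code1 = heapq.heappop(heap)   # second least probable
        # Prefix '0' to one subtree's codes and '1' to the other's.
        merged = {s: "0" + c for s, c in code0.items()}
        merged.update({s: "1" + c for s, c in code1.items()})
        heapq.heappush(heap, (p0 + p1, count, merged))
        count += 1
    return heap[0][2]

print(huffman({"A": 0.5, "B": 0.25, "C": 0.125, "D": 0.125}))
# e.g. {'A': '0', 'B': '10', 'C': '110', 'D': '111'}
```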
Predictive Run Encoding
• A ‘run of n’ means n successive 0’s followed by a 1.
• With k-digit codewords, let m = 2^k – 1.
• A k-digit binary codeword is sent in place of a ‘run of n’, where 0 <= n <= m – 1.
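A minimal sketch of this scheme, assuming (as in common run-length codes, though not stated explicitly above) that the all-ones codeword m signals a run of m zeros that continues into the next codeword:

```python
def run_encode(bits, k):
    """Replace each 'run of n' (n zeros then a 1) with a k-bit count."""
    m = 2 ** k - 1
    out, n = [], 0
    for b in bits:
        if b == "0":
            n += 1
            if n == m:               # assumed: all-ones word = run continues
                out.append(format(m, f"0{k}b"))
                n = 0
        else:                        # the terminating '1'
            out.append(format(n, f"0{k}b"))
            n = 0
    return "".join(out)

print(run_encode("000000010001", k=3))  # runs of 7 and 3 -> '111' '000' '011'
```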
Discrete Channel Examples
• Binary Erasure Channel (BEC): two source symbols and three receiver symbols (two-threshold detection at the receiver).
Mutual Information
• Shannon suggested the following expression for mutual information (MI), which satisfies both of the above conditions:
I(xi;yj) = log {P(xi|yj) / P(xi)}
I(X;Y) = ∑ P(xi,yj)*I(xi;yj)
(summed over all possible values of i and j)
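A direct Python sketch of this double sum, evaluated for the BEC introduced above (the erasure probability and input distribution are assumed values):

```python
import math

def mutual_information(p_x, channel):
    """I(X;Y) = sum over i,j of P(xi,yj) * log2(P(xi|yj)/P(xi))."""
    n_y = len(channel[0])
    p_y = [sum(p_x[i] * channel[i][j] for i in range(len(p_x)))
           for j in range(n_y)]
    I = 0.0
    for i, px in enumerate(p_x):
        for j, py in enumerate(p_y):
            p_xy = px * channel[i][j]          # joint P(xi, yj)
            if p_xy > 0:
                p_x_given_y = p_xy / py        # P(xi|yj) by Bayes' rule
                I += p_xy * math.log2(p_x_given_y / px)
    return I

alpha = 0.1                                    # assumed erasure probability
bec = [[1 - alpha, alpha, 0.0],                # P(yj | x=0): '0', erasure, '1'
       [0.0, alpha, 1 - alpha]]                # P(yj | x=1)
print(mutual_information([0.5, 0.5], bec))     # 1 - alpha = 0.9 bits/symbol
```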
Mutual Information of a BSC
• For a BSC with error probability α and source probability p = P(x=1), the receiver entropy is H(Y) = Ω(α + p – 2*p*α), so I(X;Y) = Ω(α + p – 2*p*α) – Ω(α).
Discrete Channel Capacity
• Discrete channel capacity (per symbol): Cs = max I(X;Y), the maximum taken over source probabilities.
• If ‘s’ symbols/sec is the maximum symbol rate allowed by the channel, then the channel capacity C = s*Cs bits/sec, i.e., the maximum rate of information transfer.
• For the BSC, Ω(α + p – 2*p*α) varies with the source probability p and reaches a maximum value of unity at (α + p – 2*p*α) = 1/2; in particular, Ω(α + p – 2*p*α) = 1 if p = 1/2, irrespective of α (it is already proved that Ω(1/2) = 1).
• In accordance with the coding principle, the rate of information generated by the binary encoder should equal the rate of information over the channel (if the source were connected directly to the channel):
Ω(p)*rb = s*H(X); on taking the maximum of both sides, rb = s*Cs = C
• We have already proved that rb >= R, since otherwise Kraft’s inequality would be violated; thus C >= R.
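A short sketch verifying that the BSC mutual information Ω(α + p – 2*p*α) – Ω(α) peaks at p = 1/2, giving Cs = 1 – Ω(α) (the value of α is assumed):

```python
import math

def omega(q):
    """Binary entropy function Ω(q) in bits."""
    if q in (0.0, 1.0):
        return 0.0
    return q * math.log2(1 / q) + (1 - q) * math.log2(1 / (1 - q))

alpha = 0.1                                    # assumed BSC error probability

def bsc_mi(p):
    return omega(alpha + p - 2 * p * alpha) - omega(alpha)

# Scan p: the maximum occurs at p = 1/2, where Cs = 1 - Ω(α).
best_p = max((i / 100 for i in range(1, 100)), key=bsc_mi)
print(best_p, bsc_mi(best_p))                  # 0.5, ~0.531 bits/symbol
print(1 - omega(alpha))                        # Cs = 1 - Ω(0.1) ≈ 0.531
```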
Hartley–Shannon Law
C = B*log2(1 + S/N) bits/sec
where B is the channel bandwidth in Hz and S/N is the signal-to-noise power ratio.
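A quick worked computation (the bandwidth and SNR values below are assumed, chosen to resemble a telephone-grade channel):

```python
import math

B = 3100                        # assumed bandwidth in Hz
snr_db = 30                     # assumed signal-to-noise ratio in dB
snr = 10 ** (snr_db / 10)       # convert dB to a power ratio (= 1000)

C = B * math.log2(1 + snr)      # Hartley-Shannon capacity
print(f"C = {C:.0f} bits/sec")  # ~30,898 bits/sec
```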