Wassim Hamidouche
Associate Professor
INSA Rennes, CNRS, IETR - UMR 6164, FRANCE
whamidou@insa-rennes.fr
Lossless coding
The decoded signal is an exact replica of the original
There is no loss of information between the encoder and the
decoder
Exploit statistical redundancy to achieve a bit rate close to the
entropy of the source
E = H / l̄    (1)
A coding scheme can also be characterized by its redundancy, R:
R = (l̄ − H) / H × 100%    (2)
Letter   Probability   C1    C2    C3
a1       0.5           0     0     0
a2       0.25          0     1     10
a3       0.125         1     00    110
a4       0.125         10    11    111
Average length         1.25  1.25  1.75
Coding redundancy
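As a quick numerical check, Eqs. (1) and (2) can be evaluated for the source above and code C3 (a minimal Python sketch; the variable names are mine):

```python
import math

# Source probabilities and the prefix code C3 from the table above
probs = [0.5, 0.25, 0.125, 0.125]
c3 = ["0", "10", "110", "111"]

H = -sum(p * math.log2(p) for p in probs)           # source entropy (bits/symbol)
l_bar = sum(p * len(c) for p, c in zip(probs, c3))  # average code length
E = H / l_bar                # efficiency, Eq. (1)
R = (l_bar - H) / H * 100    # redundancy in %, Eq. (2)
print(H, l_bar, E, R)  # → 1.75 1.75 1.0 0.0
```

Here C3 reaches the source entropy exactly, so its redundancy is zero.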
Huffman algorithm (decoding, fragment)
Ensure: Symbol
10:   i ← i + 1
11:  until Leaf node reached
12: until i == K
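The encoding side — building the code by repeatedly merging the two least probable nodes — can be sketched in Python with a min-heap (a minimal illustration, not the lecture's exact pseudocode; `huffman_code` is my name):

```python
import heapq, itertools

def huffman_code(probs):
    """Huffman construction: repeatedly merge the two least probable nodes,
    prefixing '0'/'1' to the codewords of each merged subtree."""
    tie = itertools.count()  # tie-breaker so equal probabilities compare cleanly
    heap = [(p, next(tie), {i: ""}) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)  # least probable node
        p2, _, c2 = heapq.heappop(heap)  # second least probable node
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, next(tie), merged))
    return heap[0][2]

print(huffman_code([0.5, 0.25, 0.125, 0.125]))
```

The exact codewords depend on tie-breaking, but the lengths (1, 2, 3, 3) match code C3 in the table above.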
Σ_{i=1}^{N} 2^{−l_i} ≤ 1.    (3)
l̄ = Σ_{i=1}^{N} l_i P_i = H(S).    (4)
l̄ = Σ_{i=1}^{N} ⌈log2(1/P_i)⌉ P_i < Σ_{i=1}^{N} (log2(1/P_i) + 1) P_i = H(S) + 1.    (5)
c_Q = [ 8 5 3 0 0 2 0 0 0 1 0 0 0 0 0 0 ]
Run-length coding (RLC)
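A common RLC convention codes each nonzero coefficient as a (zero-run, value) pair, with (0, 0) as an end-of-block marker; a sketch under that assumption (`rlc` is my name):

```python
def rlc(coeffs):
    """Run-length code: (number of preceding zeros, value) pairs,
    terminated by a (0, 0) end-of-block marker."""
    pairs, run = [], 0
    for c in coeffs:
        if c == 0:
            run += 1          # extend the current run of zeros
        else:
            pairs.append((run, c))
            run = 0
    pairs.append((0, 0))      # end of block: only zeros remain
    return pairs

print(rlc([8, 5, 3, 0, 0, 2, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0]))
# → [(0, 8), (0, 5), (0, 3), (2, 2), (3, 1), (0, 0)]
```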
[Diagram: JPEG coding chain for the AC coefficients — quantiser (quantiser tables) followed by entropy coder (Huffman tables)]
C_Q =
[  42   16    2    0    0    0    0    0
  −21  −15    0    0    0    0    0    0
   10    3   −3    0    0    0    0    0
   −2    2    2    0    0    0    0    0
    0   −1    0    0    0    0    0    0
    0    0    0    0    0    0    0    0
    0    0    0    0    0    0    0    0
    0    0    0    0    0    0    0    0 ]

Scanned into a one-dimensional vector (zig-zag order):

C_Q = [ 16 −21 10 −15 0 0 0 3 −2 0 · · · 2 −3 0 0 0 0 0 2 −1 0 · · · ]
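The reordering of the 2-D block into a 1-D vector is commonly a zig-zag scan along the anti-diagonals, which can be sketched as follows (assuming the rows of C_Q not listed are zero; `zigzag` is my name):

```python
def zigzag(block):
    """Scan an n x n block along anti-diagonals, alternating direction
    (JPEG zig-zag order), and return a flat list."""
    n = len(block)
    idx = sorted(((i, j) for i in range(n) for j in range(n)),
                 key=lambda p: (p[0] + p[1],                       # diagonal index
                                p[0] if (p[0] + p[1]) % 2 else -p[0]))  # direction
    return [block[i][j] for i, j in idx]

# Quantised 8x8 block from the example (unlisted rows assumed zero)
C_Q = [[42, 16, 2, 0, 0, 0, 0, 0],
       [-21, -15, 0, 0, 0, 0, 0, 0],
       [10, 3, -3, 0, 0, 0, 0, 0],
       [-2, 2, 2, 0, 0, 0, 0, 0],
       [0, -1, 0, 0, 0, 0, 0, 0]] + [[0] * 8 for _ in range(3)]

print(zigzag(C_Q)[:10])
```

The DC coefficient (42) leads the scan; in JPEG it is coded separately from the AC run-length pairs.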
Unary codes:
Golomb codes:
Golomb–Rice codes:
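For illustration, a Golomb–Rice encoder with divisor m = 2^k can be sketched as follows (an assumption of mine: the unary part is written as q ones terminated by a zero, one common convention; `rice_encode` is my name):

```python
def rice_encode(i, k):
    """Golomb-Rice code of non-negative integer i with parameter k (m = 2**k):
    the quotient i >> k in unary, then the remainder on k bits."""
    q, r = i >> k, i & ((1 << k) - 1)   # quotient and remainder by m = 2**k
    return "1" * q + "0" + format(r, f"0{k}b")

print(rice_encode(9, 2))  # → 11001
```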
1) Encode i = 228:
ζ_i = ⌊log2(i + 1)⌋ = 7
v_i = 1 + i − 2^{ζ_i} = 101_{10} = 1100101_2
The code is then: 00000001 1100101
2) Decode the code 000001 00111:
ζ_i = 5 and v_i = 00111_2 = 7_{10}, so i = v_i − 1 + 2^{ζ_i} = 38
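Both worked examples can be reproduced with a short order-0 exponential-Golomb coder (a sketch; the function names are mine):

```python
def expgolomb_encode(i):
    """Order-0 exponential-Golomb code of a non-negative integer i:
    zeta leading zeros, then (i + 1) in binary (zeta + 1 bits)."""
    z = (i + 1).bit_length() - 1        # zeta_i = floor(log2(i + 1))
    return "0" * z + bin(i + 1)[2:]

def expgolomb_decode(code):
    """Inverse: count leading zeros, read the next zeta + 1 bits, subtract 1."""
    z = code.index("1")                 # number of leading zeros
    return int(code[z:2 * z + 1], 2) - 1

print(expgolomb_encode(228))            # → 000000011100101
print(expgolomb_decode("00000100111"))  # → 38
```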
l(0) = Px(−1) = 0,  u(0) = Px(M − 1) = 1
After the first symbol, i = 1:
l(1) = Px(x1 − 1),  u(1) = Px(x1)
After the second symbol, i = 2, we have
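The interval update generalises to l(i) = l(i−1) + (u(i−1) − l(i−1))·Px(x_i − 1), and similarly for u(i) with Px(x_i). A small sketch of this recursion (the function name and the 3-symbol example source are my assumptions):

```python
def arith_intervals(symbols, probs):
    """Track the arithmetic-coding interval [l, u) over a symbol sequence.
    Px is the cumulative distribution, with Px(-1) = 0 and Px(M-1) = 1."""
    Px = [0.0]
    for p in probs:
        Px.append(Px[-1] + p)      # Px[k + 1] = P(X <= k)
    l, u = 0.0, 1.0                # l(0), u(0): the whole unit interval
    for x in symbols:
        w = u - l                  # current interval width
        l, u = l + w * Px[x], l + w * Px[x + 1]
    return l, u

# assumed example: P = (0.5, 0.25, 0.25), coded sequence x = (1, 0)
print(arith_intervals([1, 0], [0.5, 0.25, 0.25]))  # → (0.5, 0.625)
```

Each coded symbol shrinks the interval by a factor equal to its probability, which is what makes the final codeword length approach the sequence's self-information.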
l_v = ⌈log2(1 / ∏_k P(k))⌉ + 1    (12)
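Eq. (12) can be checked numerically (a sketch; `codeword_length` is my name, and it takes the probabilities of the coded symbols):

```python
import math

def codeword_length(symbol_probs):
    """Eq. (12): bits needed to single out the final interval, whose
    width is the product of the coded symbols' probabilities."""
    p = math.prod(symbol_probs)
    return math.ceil(math.log2(1 / p)) + 1

print(codeword_length([0.5, 0.25]))  # → 4
```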
[Figure: arithmetic coding interval subdivision, with bounds 0.125, 0.4375, 0.875]