Mohamed Hamada
Software Engineering Lab
The University of Aizu
Email: hamada@u-aizu.ac.jp
URL: http://www.u-aizu.ac.jp/~hamada
Today’s Topics
INFORMATION TRANSFER ACROSS CHANNELS

Source (sent messages) → Source encoder (source coding) → Channel encoder (channel coding) → symbols → Channel → Channel decoder (channel decoding) → Source decoder (source decoding) → Receiver (received messages)
Capacity vs Efficiency
Digital Communication Systems

Source coding techniques:
1. Huffman Code.
3. Lempel-Ziv Code.

Information Source → Source Encoder → Channel Encoder → Modulator → Channel → De-Modulator → Channel Decoder → Source Decoder
Digital Communication Systems

Types of information source:
1. Memoryless information source.
2. Information source with memory:
   a. stochastic process
   b. Markov process
   c. ergodic process

Information Source → Source Encoder → Channel Encoder → Modulator → Channel → De-Modulator → Channel Decoder → Source Decoder → User of Information
Information Source
1. Memoryless information source.

A source S emits a sequence of symbols U1, U2, U3, ... over time, with entropy H(S). The source is memoryless when each output is independent of all previous outputs:

P(x = a | x1 x2 ... xn) = P(x = a)
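For a memoryless source, the entropy is simply H(S) = -Σ p(s) log2 p(s) over the symbol probabilities. A minimal sketch (the symbol distribution below is illustrative, not from the slides):

```python
import math

def entropy(probs):
    """Shannon entropy H(S) in bits for a memoryless source."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical symbol distribution, for illustration only.
print(entropy([0.5, 0.25, 0.25]))  # → 1.5
```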
2. Information source with memory: stochastic process.

3. Information source with memory: Markov process.
Sources with memory

General source: |S| states, S ∈ {1, 2, ..., |S|}; output U1, U2, ..., UL (subscript = time).

Markov source: |S| states, S ∈ {1, 2, ..., |S|}; output U1, U2, ..., UL (subscript = time); new state = f(old state, output).
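The "new state = f(old state, output)" rule can be sketched as a tiny generator in which each output depends only on the current state. Everything below (state count, probabilities, the seed) is illustrative, not from the slides:

```python
import random

def markov_outputs(T, state, n, rng):
    """Generate n outputs from a Markov source: at each step the next
    state (which is also the output) depends only on the current state,
    drawn according to row T[state] of the transition matrix."""
    outputs = []
    for _ in range(n):
        state = rng.choices(range(len(T)), weights=T[state])[0]
        outputs.append(state)
    return outputs

T = [[0.9, 0.1],    # hypothetical 2-state source
     [0.2, 0.8]]
seq = markov_outputs(T, 0, 10, random.Random(0))
print(seq)  # a length-10 sequence of states in {0, 1}
```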
Markov information source (Process) (Chain)
Let an experiment have a finite number n of possible outcomes A = {a1, ..., an}, called states. The transition matrix T collects the conditional probabilities p(aj | ai) of moving from state ai to state aj:

          a1         a2        ...   an
    a1    p(a1|a1)   p(a2|a1)  ...   p(an|a1)
    a2    p(a1|a2)   p(a2|a2)  ...   p(an|a2)
T =                  ...
    an    p(a1|an)   p(a2|an)  ...   p(an|an)
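Each row of T is a conditional probability distribution, so it must be nonnegative and sum to 1. A small validation sketch (the helper name is ours, not from the slides):

```python
def is_stochastic(T, tol=1e-9):
    """Check that every row of a transition matrix sums to 1
    and contains no negative probabilities."""
    return all(
        abs(sum(row) - 1.0) <= tol and min(row) >= 0
        for row in T
    )

# Transition matrix of the Nikkei example used later in the slides.
T = [[0.6, 0.2, 0.2],
     [0.5, 0.3, 0.2],
     [0.4, 0.1, 0.5]]
print(is_stochastic(T))  # → True
```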
Transition Diagram

Shannon Diagram:

Example: Consider the Nikkei stock index, which on a certain day has the following information source. Today the index is up with probability 0.5, down (dn) with probability 0.2, and unchanged (un) with probability 0.3, and the transition probabilities to tomorrow's state are:

  up → up: 0.6   up → dn: 0.2   up → un: 0.2
  dn → up: 0.5   dn → dn: 0.3   dn → un: 0.2
  un → up: 0.4   un → dn: 0.1   un → un: 0.5

Tomorrow: p(up) = 0.5 × 0.6 + 0.2 × 0.5 + 0.3 × 0.4 = 0.52
Tomorrow: p(dn) = 0.5 × 0.2 + 0.2 × 0.3 + 0.3 × 0.1 = 0.19
Tomorrow: p(un) = 0.5 × 0.2 + 0.2 × 0.2 + 0.3 × 0.5 = 0.29

In general, the distribution after r steps is Pr = P0 × T^r
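The one-step update Pr = P0 × T can be checked numerically; this sketch reproduces the 0.52 / 0.19 / 0.29 figures from the example:

```python
def step(p0, T):
    """One step of a Markov chain: next distribution = p0 × T."""
    n = len(T)
    return [sum(p0[i] * T[i][j] for i in range(n)) for j in range(n)]

T = [[0.6, 0.2, 0.2],   # up → up, dn, un
     [0.5, 0.3, 0.2],   # dn → up, dn, un
     [0.4, 0.1, 0.5]]   # un → up, dn, un
p0 = [0.5, 0.2, 0.3]    # today: p(up), p(dn), p(un)

print([round(x, 2) for x in step(p0, T)])  # → [0.52, 0.19, 0.29]
```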
Transition Diagram

Example: Consider the Nikkei stock index, which on a certain day has the following information source:

  Index        up    down   unchanged
  Probability  0.5   0.2    0.3          (= P0)

        0.6  0.2  0.2
  T =   0.5  0.3  0.2
        0.4  0.1  0.5

Find the probability of the index after 2 days.

After one day (Today → Tomorrow): p(up) = 0.52, p(dn) = 0.19, p(un) = 0.29. Applying T once more:

  P2 = P0 × T^2 = (0.5, 0.2, 0.3) × T × T
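Iterating the same update twice gives the two-day distribution asked for above. A short sketch; the slide does not state the answer, so treat the printed values as our own calculation:

```python
def step(p, T):
    """One Markov step: next distribution = p × T."""
    n = len(T)
    return [sum(p[i] * T[i][j] for i in range(n)) for j in range(n)]

T = [[0.6, 0.2, 0.2],
     [0.5, 0.3, 0.2],
     [0.4, 0.1, 0.5]]
p0 = [0.5, 0.2, 0.3]

p2 = step(step(p0, T), T)   # P2 = P0 × T^2
print([round(x, 3) for x in p2])  # → [0.523, 0.19, 0.287]
```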
Information Source

Ergodic Markov Process

The three classes are nested: ergodic processes ⊂ Markov processes ⊂ stochastic processes.
2. If, from any state, we can reach all other states, then the process is ergodic.
3. If some state(s) are not reachable from any other state, then the process is NOT ergodic.
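Criterion 2 above is a reachability test on the transition diagram: the chain is ergodic in this sense when every state can reach every other state through edges of nonzero probability. A sketch using a simple depth-first search (the function names are ours):

```python
def reachable(T, start):
    """Set of states reachable from `start` via nonzero-probability edges."""
    seen, stack = set(), [start]
    while stack:
        i = stack.pop()
        if i in seen:
            continue
        seen.add(i)
        stack.extend(j for j, p in enumerate(T[i]) if p > 0)
    return seen

def is_ergodic(T):
    """Ergodic in the slides' sense: every state reaches every other state."""
    n = len(T)
    return all(reachable(T, s) == set(range(n)) for s in range(n))

# Nikkei example: every transition has positive probability, so it is ergodic.
T = [[0.6, 0.2, 0.2],
     [0.5, 0.3, 0.2],
     [0.4, 0.1, 0.5]]
print(is_ergodic(T))  # → True
```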
Example 1:
Consider the process with the following transition matrix:

          a     b     c
    a     0     0     1
T = b     0.25  0     0.75
    c     0     1     0

Is this an ergodic process?

Answer
From the transition matrix we can build the transition diagram; the nonzero transitions are:

  a → c : 1
  b → a : 0.25
  b → c : 0.75
  c → b : 1

From every state we can reach all the others (for example a → c → b → a), so the process is ergodic.
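The same conclusion can be confirmed mechanically with a reachability check over the nonzero entries of T (a self-contained sketch; the helper name is ours):

```python
def reaches_all(T, start):
    """Depth-first search over nonzero-probability edges from `start`;
    returns True if every state of the chain is reachable."""
    seen, stack = set(), [start]
    while stack:
        i = stack.pop()
        if i not in seen:
            seen.add(i)
            stack.extend(j for j, p in enumerate(T[i]) if p > 0)
    return seen == set(range(len(T)))

# States a, b, c of Example 1.
T = [[0, 0, 1],
     [0.25, 0, 0.75],
     [0, 1, 0]]
print(all(reaches_all(T, s) for s in range(3)))  # → True: ergodic
```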