2-3 Discrete time - Markov Chain
[Figure: sample path f(t) plotted against t]
2-4 Discrete time - Markov Chain
[Figure: sample path f(t) plotted against t]
2-5 Markov Chain - In a Random Walk
[Figure: sample path f(t) vs t, with each step labeled "naik" (up) or "turun" (down)]
2-6 Markov Chain - In a Random Walk
[Figure: sample path f(t) vs t, with each step labeled "naik" (up) or "turun" (down)]
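The up/down ("naik"/"turun") paths on these slides can be generated by a simple random walk. Below is a minimal sketch, assuming independent steps that go up with probability `p_up` (the function name and parameters are illustrative, not from the slides):

```python
import random

def random_walk(n_steps, p_up=0.5, seed=0):
    """Simulate a simple random walk: each step goes up ("naik") with
    probability p_up or down ("turun") with probability 1 - p_up.
    Returns the positions f(t) for t = 0, 1, ..., n_steps."""
    rng = random.Random(seed)
    path = [0]
    for _ in range(n_steps):
        step = 1 if rng.random() < p_up else -1
        path.append(path[-1] + step)
    return path

path = random_walk(10)
print(path)
```

Because each step depends only on the current position, the walk already has the Markov property discussed on the following slides.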
2-7 Markov Chain - Random Walk and Bernoulli Process
[Figure: histogram of event frequencies for the outcomes NAIK (up) and TURUN (down) at events 0 and 1, alongside a sample path f(t) vs t]
2-9 Continuous Time - Markov Chain
[Figure: sample path f(t) plotted against t]
2-10 Markov Chain
Stochastic process that takes values in a countable set
Example: {0,1,2,...,m}, or {0,1,2,...}
Elements represent possible states
Chain jumps from state to state
Memoryless (Markov) Property: Given the present state, future jumps of the chain are independent of past history
Markov Chains: discrete- or continuous-time
2-11 Discrete-Time Markov Chain
Discrete-time stochastic process {Xn : n = 0,1,2,...}
Takes values in {0,1,2,...}
Memoryless property:
P{X_{n+1} = j | X_n = i, X_{n-1} = i_{n-1}, ..., X_0 = i_0} = P{X_{n+1} = j | X_n = i}
Transition probabilities: P_ij = P{X_{n+1} = j | X_n = i}
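The memoryless property means a simulation only ever needs the current state to draw the next one. A minimal sketch (the matrix values are assumptions for illustration, not from the slides):

```python
import random

def simulate_chain(P, x0, n_steps, seed=42):
    """Simulate a discrete-time Markov chain with transition matrix P,
    where P[i][j] = P{X_{n+1} = j | X_n = i}. The next state is drawn
    from the current state alone -- the memoryless (Markov) property."""
    rng = random.Random(seed)
    states = [x0]
    for _ in range(n_steps):
        i = states[-1]
        states.append(rng.choices(range(len(P[i])), weights=P[i])[0])
    return states

# Hypothetical two-state example: p01 = 0.3, p10 = 0.6.
P = [[0.7, 0.3],
     [0.6, 0.4]]
path = simulate_chain(P, x0=0, n_steps=20)
print(path)
```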
2-12 Two-State Markov Chain
If {Xn : n = 0,1,2,...} and Xn takes values in {0,1}, then the transition probability matrix is:

          0     1
P =  0 [ p00   p01 ]
     1 [ p10   p11 ]

where rows index the state at time n and columns the state at time n+1.
2-15 Four-State Markov Chain
If {Xn : n = 0,1,2,...} and Xn takes values in {0,1,2,3}, then the transition probability matrix is:

          0     1     2     3
P =  0 [ p00   p01   p02   p03 ]
     1 [ p10   p11   p12   p13 ]
     2 [ p20   p21   p22   p23 ]
     3 [ p30   p31   p32   p33 ]
2-16 Chapman-Kolmogorov Equations
n-step transition probabilities:
P^n_ij = P{X_{n+m} = j | X_m = i},   n, m ≥ 0;  i, j ≥ 0
Chapman-Kolmogorov equations:
P^{n+m}_ij = Σ_{k=0}^{∞} P^n_ik P^m_kj,   n, m ≥ 0;  i, j ≥ 0
P^0_ij = 1 if i = j, 0 if i ≠ j
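In matrix form the Chapman-Kolmogorov equations say P^{n+m} = P^n · P^m. A small numerical check, using an assumed two-state matrix (values are illustrative):

```python
def matmul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def matpow(P, n):
    """n-step transition matrix P^n; P^0 is the identity
    (P^0_ij = 1 if i == j else 0, as on the slide)."""
    result = [[1.0 if i == j else 0.0 for j in range(len(P))]
              for i in range(len(P))]
    for _ in range(n):
        result = matmul(result, P)
    return result

# Assumed two-state transition matrix for illustration.
P = [[0.7, 0.3],
     [0.6, 0.4]]

# Chapman-Kolmogorov: P^(n+m)_ij = sum_k P^n_ik * P^m_kj.
n, m = 2, 3
lhs = matpow(P, n + m)
rhs = matmul(matpow(P, n), matpow(P, m))
for i in range(2):
    for j in range(2):
        assert abs(lhs[i][j] - rhs[i][j]) < 1e-12
print(lhs)
```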
2-17 Proof of Chapman-Kolmogorov
2-18 Chapman-Kolmogorov Equations
Given the transition probability matrix

       p00   p01
P =
       p10   p11

then

             p00   p01     p00   p01
P^2 = P·P =              ×
             p10   p11     p10   p11

In matrix form:
π_n = π_{n-1} P = π_{n-2} P^2 = ... = π_0 P^n
If the time-dependent distribution converges to a limit, π = lim_{n→∞} π_n, then π is called the stationary distribution and satisfies π = π P.
Existence of the stationary distribution depends on the structure of the Markov chain.
2-20 State Probabilities - Stationary Distribution
π = lim_{n→∞} π_n, where π_n = π_{n-1} P = π_{n-2} P^2 = ... = π_0 P^n
If

       p00   p01
P =
       p10   p11

then
lim_{n→∞} π_n = lim_{n→∞} π_0 P^n
with π_n = π_0 P^n and P^n = P · P^{n-1}.
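The limit π_n = π_0 P^n can be computed by iterating the update π_n = π_{n-1} P. A minimal sketch for a two-state chain with assumed values p01 = 0.3, p10 = 0.6 (for this case the limit has the closed form π = (p10, p01) / (p01 + p10)):

```python
def step(pi, P):
    """One update pi_n = pi_{n-1} P (row vector times matrix)."""
    n = len(pi)
    return [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]

# Assumed two-state transition matrix for illustration.
P = [[0.7, 0.3],
     [0.6, 0.4]]

pi = [1.0, 0.0]        # pi_0: start in state 0
for _ in range(100):   # iterate pi_n = pi_0 P^n toward the limit
    pi = step(pi, P)

# Closed form for two states: pi = (0.6, 0.3) / 0.9 = (2/3, 1/3)
assert abs(pi[0] - 2 / 3) < 1e-9
assert abs(pi[1] - 1 / 3) < 1e-9
# Stationarity: pi = pi P
assert all(abs(a - b) < 1e-9 for a, b in zip(pi, step(pi, P)))
print(pi)
```

Starting from π_0 = (0, 1) instead gives the same limit, illustrating that the stationary distribution of this chain does not depend on the initial state.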
2-21 Example 1: Transforming a Process into a Markov Chain
2-22
2-23 Example 2: Transforming a Process into a Markov Chain
2-24
2-25
Example 3: Camera Inventory
2-27
2-28
2-29
2-30
2-31
2-32
2-33 Example 4: Probability Class
2-34
2-35 Example 5: Spiders and Fly
2-36
2-37 A Markov Chain in Finance
Examples: Chapman-Kolmogorov Equations
Examples: Raining Example - Chapman-Kolmogorov Equations
2-40
Examples: Two-Day Raining Example - Chapman-Kolmogorov Equations
2-42
Examples: Camera Inventory - Chapman-Kolmogorov Equations
2-44
2-45
2-46 Doing the Computation with Minitab
Enter the transition probability data into columns C1 C2 C3 C4
Copy C1-C4 into matrix m1
and so on
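The same workflow (columns C1-C4 copied into a matrix, then multiplied repeatedly) can be mirrored in Python for readers without Minitab. The 4x4 values below are illustrative assumptions, not the matrix from the slides:

```python
# Minitab workflow mirrored in Python: the four "columns" become the
# rows of a 4x4 transition matrix, which is then multiplied by itself
# to get multi-step transition probabilities.
# These values are illustrative assumptions.
P = [[0.80, 0.10, 0.05, 0.05],
     [0.20, 0.60, 0.10, 0.10],
     [0.10, 0.20, 0.50, 0.20],
     [0.05, 0.05, 0.10, 0.80]]

def matmul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P2 = matmul(P, P)   # two-step probabilities, like m1 * m1 in Minitab
print([round(x, 4) for x in P2[0]])
```

Repeating the multiplication (P2 = matmul(P2, P), and so on) gives the n-step matrices; each row must always sum to 1.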