
Example of Markov Chains

                              NEW STATE
                       (Result of the new match)
                          1      2      3

                   1  [  1/2    1/4    1/4  ]
CURRENT STATE      2  [  3/5    1/5    1/5  ]
(Result of the
last match)        3  [  1/4    1/4    1/2  ]

i) What is the probability that at the next trial the system will move from state 1 to state 3?
ii) What is the probability that at the next trial the system will move from state 3 to state 2?

Solution:

i) We are looking at p13, which is the element in row 1 and column 3, that is 1/4 = 0.25.
ii) We are looking at p32, which is the element in row 3 and column 2, that is 1/4 = 0.25.
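As a quick check, the two answers can be read directly off the transition matrix with NumPy (a sketch; the matrix values are those of the example above, and NumPy's 0-based indexing shifts each row and column index down by one):

```python
import numpy as np

# Transition matrix from the example: entry p_ij is the probability that
# the system moves from state i to state j at the next trial.
P = np.array([[1/2, 1/4, 1/4],
              [3/5, 1/5, 1/5],
              [1/4, 1/4, 1/2]])

p13 = P[0, 2]  # row 1, column 3 (NumPy indexes from 0)
p32 = P[2, 1]  # row 3, column 2
print(p13, p32)  # 0.25 0.25
```

Each row sums to 1, as every row of a transition matrix must.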

Example:

Given a probability schedule for Dry (D) and Wet (W) weather conditions as follows:
                 NEW
               D      W
CURRENT   D [ 0.4    0.6 ]
          W [ 0.3    0.7 ]
Determine the probability that,

i) If it is wet today, what is the probability that tomorrow it will be dry?


ii) If today it is dry, what is the probability that it will be wet after two days?
Solution

Consider a transition matrix,

C = [ 0.4  0.6
      0.3  0.7 ]

i) If it is wet today, then we are looking at the probability value in row 2 and column 1, that is p21, which is 0.3.
ii) To answer this we need to use a probability tree. That is,

                     D = 0.4
          D = 0.4
                     W = 0.6      0.4(0.6) = 0.24
Today D
                     D = 0.3
          W = 0.6
                     W = 0.7      0.6(0.7) = 0.42

The probability that it will be wet after two days is 0.24 + 0.42 = 0.66

Using a transition Matrix

C = [ 0.4  0.6
      0.3  0.7 ]

If today is Dry, then

S0 = [ 1  0 ]

S1 = [ 1  0 ] [ 0.4  0.6 ]  =  [ 0.4  0.6 ]
              [ 0.3  0.7 ]

S2 = [ 0.4  0.6 ] [ 0.4  0.6 ]  =  [ 0.34  0.66 ]
                  [ 0.3  0.7 ]

The probability that it will be wet after two days is 0.66
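The same two-day computation can be carried out in NumPy (a sketch of the matrix-multiplication method above):

```python
import numpy as np

C = np.array([[0.4, 0.6],
              [0.3, 0.7]])   # rows and columns ordered D, W
S0 = np.array([1.0, 0.0])    # today is dry

S1 = S0 @ C                  # state after one day:  ≈ [0.4 0.6]
S2 = S1 @ C                  # state after two days: ≈ [0.34 0.66]
print(round(S2[1], 2))       # probability of wet after two days: 0.66
```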

Example:

There are three major political parties in a country: the National Movement (N), the Platform party (P) and the Forum party (F). The following schedule gives the probabilities that the parties will win elections:
                    After Election
                    N     F     P

Before      N  [  0.7   0.2   0.1 ]
Election    F  [  0.5   0.2   0.3 ]
            P  [  0.4   0.2   0.4 ]
This gives rise to a transition matrix,

         0.7  0.2  0.1
T =  [   0.5  0.2  0.3  ]
         0.4  0.2  0.4

If the National Movement (N) is in power now, what is the probability that the Platform party (P) will be in power after two elections?

Solution

To answer this we need to use a probability tree. That is,

Now        After first election      After second election

N  ── 0.7 ──  N  ── 0.1 ──  P        0.7(0.1) = 0.07
N  ── 0.2 ──  F  ── 0.3 ──  P        0.2(0.3) = 0.06
N  ── 0.1 ──  P  ── 0.4 ──  P        0.1(0.4) = 0.04

(Only the branches ending in P are shown.)     Total = 0.17

If N is in power now, then the probability that P will win after two elections is 0.17.

In general, if T denotes a transition matrix for a Markov chain and Sk is the state matrix after k trials, then the state matrix Sk+1 after the next trial is given by,

Sk+1 = Sk T

Let us use the example above where

         0.7  0.2  0.1
T =  [   0.5  0.2  0.3  ]
         0.4  0.2  0.4

The initial state matrix,

S0 = [ 1 0 0]

Therefore,

                          0.7  0.2  0.1          N    F    P
S1 = S0 T = [ 1  0  0 ] [ 0.5  0.2  0.3 ]  =  [ 0.7  0.2  0.1 ]
                          0.4  0.2  0.4

                                0.7  0.2  0.1
S2 = S1 T = [ 0.7  0.2  0.1 ] [ 0.5  0.2  0.3 ]  =  [ 0.63  0.2  0.17 ]
                                0.4  0.2  0.4

Therefore the probability that Platform party will be in power after second election is 0.17,
which is the same as we obtained above.

New state = State before × Transition matrix

Sk+1 = Sk T

Applying this relation repeatedly from the initial state gives

Sk = S0 T^k
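The relation Sk = S0 T^k makes the election example a one-line computation in NumPy (a sketch):

```python
import numpy as np

T = np.array([[0.7, 0.2, 0.1],
              [0.5, 0.2, 0.3],
              [0.4, 0.2, 0.4]])   # rows and columns ordered N, F, P
S0 = np.array([1.0, 0.0, 0.0])    # N in power now

# S_k = S0 T^k; after two elections, k = 2
S2 = S0 @ np.linalg.matrix_power(T, 2)
print(S2)   # ≈ [0.63 0.2 0.17]; P wins with probability 0.17
```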

Use of Matrix Diagonalization

If a square matrix A is diagonalizable, it can be written as

A = P D P⁻¹

where D is a diagonal matrix of eigenvalues and the columns of P are the corresponding eigenvectors. This transformation constitutes another way of presenting A, just as in arithmetic 40, when factorized, can be written as 40 = 2 × 4 × 5.

It is noted that

A² = (P D P⁻¹)(P D P⁻¹) = P D (P⁻¹ P) D P⁻¹ = P D I D P⁻¹ = P D² P⁻¹

Powers of a diagonal matrix are easy to compute, since each diagonal entry is simply raised to the given power. For example,

[ 2  0  0 ]⁴      [ 16   0   0 ]
[ 0  3  0 ]   =   [  0  81   0 ]
[ 0  0  1 ]       [  0   0   1 ]

Using the same computation we find that,

A^k = P D^k P⁻¹
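This identity can be checked numerically with NumPy's eigendecomposition (a sketch; np.linalg.eig returns the eigenvalues together with a matrix whose columns are the eigenvectors):

```python
import numpy as np

A = np.array([[0.4, 0.6],
              [0.3, 0.7]])

eigvals, P = np.linalg.eig(A)           # columns of P are eigenvectors
Dk = np.diag(eigvals ** 3)              # D^3 is trivial: cube the diagonal

A3_diag = P @ Dk @ np.linalg.inv(P)     # A^3 via diagonalization
A3_direct = np.linalg.matrix_power(A, 3)
print(np.allclose(A3_diag, A3_direct))  # True
```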

Returning to the weather example,

C = [ 0.4  0.6
      0.3  0.7 ]

Let λ be an eigenvalue of C.


The characteristic equation is

|C − λI| = 0

      | 0.4−λ    0.6   |
 =>   |  0.3    0.7−λ  |  = 0

 =>  (0.4−λ)(0.7−λ) − 0.18 = λ² − 1.1λ + 0.1 = 0

          1.1 ± √(1.1² − 4(0.1))
 =>  λ  = ──────────────────────
                    2

 =>  λ1 = 1 and λ2 = 0.1

This means the diagonal matrix is

D = [ 1    0
      0  0.1 ]
Corresponding eigenvectors

For λ1 = 1,

(C − λI)x = 0

[ 0.4  0.6 ]   [ 1  0 ]   [ −0.6    0.6 ]
[ 0.3  0.7 ] − [ 0  1 ] = [  0.3   −0.3 ]

This gives an augmented matrix

[ −0.6    0.6 | 0 ]
[  0.3   −0.3 | 0 ]

−(1/0.6) R1 → R1
 (1/0.3) R2 → R2

[ 1  −1 | 0 ]
[ 1  −1 | 0 ]

R2 − R1 → R2

[ 1  −1 | 0 ]
[ 0   0 | 0 ]

Thus for x2 = t, x1 = t; hence for λ1 = 1 the corresponding basic eigenvector is

X1 = [ 1 ]
     [ 1 ]

Similarly, for λ2 = 0.1, we have

[ 0.4  0.6 ]   [ 0.1   0  ]   [ 0.3  0.6 ]
[ 0.3  0.7 ] − [  0   0.1 ] = [ 0.3  0.6 ]

This gives an augmented matrix

[ 0.3  0.6 | 0 ]
[ 0.3  0.6 | 0 ]

(1/0.3) R1 → R1
(1/0.3) R2 → R2

[ 1  2 | 0 ]
[ 1  2 | 0 ]

R2 − R1 → R2

[ 1  2 | 0 ]
[ 0  0 | 0 ]

For x2 = t, then x1 = −2t, thus

X2 = [ −2 ]
     [  1 ]
Therefore,

P = [ 1  −2 ]          P⁻¹ = (1/3) [  1  2 ]
    [ 1   1 ]                      [ −1  1 ]

Hence

C² = P D² P⁻¹ = [ 1  −2 ] [ 1   0   ]       [  1  2 ]   [ 0.34  0.66 ]
                [ 1   1 ] [ 0  0.01 ] (1/3) [ −1  1 ] = [ 0.33  0.67 ]

S2 = S0 C² = [ 1  0 ] [ 0.34  0.66 ]  =  [ 0.34  0.66 ]
                      [ 0.33  0.67 ]

The probability that it will be wet after two days is 0.66.
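The hand computation can be verified in NumPy using the eigenvector matrix P and diagonal matrix D found above (a sketch):

```python
import numpy as np

P = np.array([[1.0, -2.0],
              [1.0,  1.0]])      # columns are the eigenvectors X1, X2
D = np.diag([1.0, 0.1])          # corresponding eigenvalues 1 and 0.1

C2 = P @ np.linalg.matrix_power(D, 2) @ np.linalg.inv(P)
S2 = np.array([1.0, 0.0]) @ C2   # today is dry

print(round(S2[1], 2))           # 0.66
```

For a two-day forecast, squaring C directly would be just as easy; diagonalization pays off when a large power C^k is needed.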


Therefore, for Markov processes, probabilities can be determined using probability trees, matrix multiplication and, for cases that require extended periods, transition matrix diagonalization.
