Markov is particularly remembered for his study of Markov chains: sequences of random variables in which the future variable is determined by the present variable but is independent of the way in which the present state arose from its predecessors. This work launched the theory of stochastic processes.
Terminologies of Markov Model

    States
    Transition Probabilities
States in MM

    Timestamp:  Today      Tomorrow
                Rain       No-Rain
                No-Rain    Rain
                Rain       Rain
                No-Rain    No-Rain

The states are {Rain, No-Rain}.
States and Transition Probabilities Matrix (TPM) in MM

    Timestamp:  Today      Tomorrow
                Rain       No-Rain
                No-Rain    Rain
                Rain       Rain
                No-Rain    No-Rain

TPM (rows = Today, columns = Tomorrow):

                Rain     No-Rain
    Rain        0.5      0.5
    No-Rain     0.5      0.5
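As a sketch of how the TPM is used (my own code, not from the slides): multiplying today's distribution over {Rain, No-Rain} by the TPM gives tomorrow's distribution. The 50/50 split for today is an assumption for illustration, as is the helper name `step`.

```python
P = [[0.5, 0.5],   # row: today = Rain    -> [P(Rain), P(No-Rain)] tomorrow
     [0.5, 0.5]]   # row: today = No-Rain

def step(dist, P):
    """One Markov step: multiply the row vector `dist` by the TPM `P`."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

today = [0.5, 0.5]          # assumed P(Rain), P(No-Rain) for today
tomorrow = step(today, P)
print(tomorrow)             # [0.5, 0.5] -- this TPM has a uniform stationary distribution
```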
First and second state matrices:

    S1 = S0 P
    S2 = S1 P = S0 P^2

Kth state matrix:

    Sk = S(k-1) P = S0 P^k
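The recursion above can be sketched in code: applying one vector-matrix multiplication per step gives Sk = S0 P^k without ever forming P^k explicitly. The function names are mine, not from the slides.

```python
def next_state(s, P):
    """Sk = S(k-1) P for a row vector s and a square TPM P."""
    n = len(P)
    return [sum(s[i] * P[i][j] for i in range(n)) for j in range(n)]

def state_after(s0, P, k):
    """Sk = S0 P^k, computed by applying next_state k times."""
    s = list(s0)
    for _ in range(k):
        s = next_state(s, P)
    return s
```

Repeated multiplication is O(k n^2), which is fine for small k; for large k one would square the matrix instead.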
An example: An insurance company classifies drivers as low-risk if they are accident-free for one year. Past records indicate that 98% of the drivers in the low-risk category (L) will remain in that category the next year, and 78% of the drivers who are not in the low-risk category (L') one year will be in the low-risk category the next year.

TPM:

             L       L'
    L       0.98    0.02
    L'      0.78    0.22

Initial-state matrix:

    S0 = [0.90  0.10]

P(X1 = L) = S1(L) = ?

    S1 = S0 P = [0.96   0.04]     (columns: L, L')
    S2 = S1 P = [0.972  0.028]
Finding the kth state matrix

    S4 = S0 P^4
       = [0.90  0.10] [0.98  0.02]^4
                      [0.78  0.22]
       = [0.97488  0.02512]

After four steps, the proportion of low-risk drivers has increased to 0.97488 (97.488%).
Types of MC Problems:

How do we calculate the probability after n steps?

Example: P(X3 = 2 | X1 = 1) = p12(2) = ?   (the two-step transition probability from state 1 to state 2)
Notations:

    q0  initial probabilities of the states
    q1  probabilities of the states after 1 time period
    ...
    qn  probabilities of the states after n time periods
P(X2 = B | X0 = A) = pAB(2) = ?

Example 1:

P(X2 = B | X0 = A) = pAB(2) = 0.24
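The transition diagram behind Example 1 is not included in this text, so the numbers below are an assumption: the two-state chain shown is one chain consistent with pAB(2) = 0.24. The point of the sketch is that n-step probabilities are simply the entries of P^n.

```python
def matmul(X, Y):
    """Multiply two square matrices given as lists of rows."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P = [[0.7, 0.3],   # row A: P(A->A), P(A->B)   (assumed values)
     [0.9, 0.1]]   # row B: P(B->A), P(B->B)

P2 = matmul(P, P)           # two-step transition matrix
print(round(P2[0][1], 5))   # pAB(2) = 0.7*0.3 + 0.3*0.1 = 0.24
```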
Types of MC Problems: