
Probability and Random Processes
Lecture no. (18)

Dr Sherif Rabia
Department of Engineering Mathematics and Physics,
Faculty of Engineering, Alexandria University
sherif.rabia@gmail.com
sherif.rabia@alexu.edu.eg
4.3 Markov chains
- 1-step transition matrix
- System dynamics
- Initial distribution
- m-step transition matrix
- Complete characterization



Transition probabilities
The probability of moving from state i to state j in one step:
pij = P[Xn+1 = j | Xn = i]

Assumed to be the same for all n (time-homogeneous).


Example (Taxi position)
P[X3 = 6 | X2 = 5] = P[X10 = 6 | X9 = 5] = P[X19 = 6 | X18 = 5] = p56

The 1-step transition probability matrix


In general,
P = [ p11  p12  ...  p1L
      p21  p22  ...  p2L
      ...
      pL1  pL2  ...  pLL ]

Example (Taxi position)
P = [ 0.1   0.4   0.4   0.1
      0.2   0.4   0.2   0.2
      0.25  0.25  0.25  0.25
      0     0.1   0.1   0.8  ]
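As an illustration, the taxi matrix can be stored and sanity-checked numerically. A minimal sketch, assuming Python with numpy; each row is a conditional pdf and so must sum to 1.

import numpy as np

# 1-step transition matrix of the taxi example
P_taxi = np.array([
    [0.10, 0.40, 0.40, 0.10],
    [0.20, 0.40, 0.20, 0.20],
    [0.25, 0.25, 0.25, 0.25],
    [0.00, 0.10, 0.10, 0.80],
])

# Row i is the conditional pdf P[X_{n+1} = . | X_n = i], so each row sums to 1
assert np.allclose(P_taxi.sum(axis=1), 1.0)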
Examples
Example 18.1
A particle moves through the shown grid:
[Grid figure: positions 1, 2 (top row) and 3, 4 (bottom row)]
At each time n, the particle makes a random movement to another reachable position.
Let {Xn, n = 0, 1, 2, . . .} be the particle position at time n where X0 is its
initial position.
Is {Xn} a Markov chain? If yes, write its transition matrix.
Example 18.2
Consider repeated tosses of a die.
Let {Xn, n = 0, 1, 2, . . .} be the maximum of the numbers that appear in
the first (n + 1) trials, where X0 is the number appearing on the first trial.
Is {Xn} a Markov chain? If yes, write its transition matrix.
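For Example 18.2, if the chain is indeed Markov, one way to build its 6×6 transition matrix numerically is sketched below (assuming Python with numpy): given a current maximum i, the next toss keeps the maximum at i with probability i/6 and raises it to j > i with probability 1/6 for each such j.

import numpy as np

# Transition matrix for the running maximum of die tosses (Example 18.2)
P_max = np.zeros((6, 6))
for i in range(1, 7):
    P_max[i - 1, i - 1] = i / 6       # toss shows a value <= current maximum i
    for j in range(i + 1, 7):
        P_max[i - 1, j - 1] = 1 / 6   # toss shows j > i, new maximum is j

assert np.allclose(P_max.sum(axis=1), 1.0)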



Dynamics of the Markov chain
The transition matrix contains almost all the information needed to
determine the probability distribution of the state of the process at any time.
One more piece of information (the initial distribution) will be
needed later.
Example 18.3
Let {Xn, n = 0, 1, 2, . . .} be a Markov chain with state space {1, 2} and
the shown transition matrix
P = [ 0.50  0.50
      0.25  0.75 ]
Evaluate
(a) P[X1 = 2 | X0 = 1]
(b) P[X5 = 1 | X4 = 2, X3 = 1]
(c) P[X6 = 1, X7 = 2 | X5 = 2]
(d) P[X9 = 2, X10 = 1, X11 = 1 | X8 = 2]

How to evaluate P[X1 = 2]? P[X2 = 2 | X0 = 1]?
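A sketch of how parts (a)-(d) reduce to products of 1-step transition probabilities via the Markov property, assuming Python with numpy:

import numpy as np

P = np.array([[0.50, 0.50],
              [0.25, 0.75]])
p = lambda i, j: P[i - 1, j - 1]    # helper for 1-based state labels

a = p(1, 2)                         # (a) P[X1=2 | X0=1] = p12
b = p(2, 1)                         # (b) only X4 = 2 matters, = p21
c = p(2, 1) * p(1, 2)               # (c) P[X6=1, X7=2 | X5=2] = p21 p12
d = p(2, 2) * p(2, 1) * p(1, 1)     # (d) P[X9=2, X10=1, X11=1 | X8=2] = p22 p21 p11
print(a, b, c, d)                   # 0.5  0.25  0.125  0.09375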
Initial distribution π(0)
The initial distribution is the pdf of X0.
It gives probabilistic information about the starting state of the process.
This pdf looks like

x0      | 1      2      ...  L
f0(x0)  | f0(1)  f0(2)  ...  f0(L)

π(0) = [f0(1), f0(2), . . ., f0(L)]
For example, π(0) = [0.5, 0.25, 0.25] means
P(X0 = 1) = 0.5, P(X0 = 2) = 0.25, P(X0 = 3) = 0.25

Example 18.3 (contd)
Evaluate (given π(0) = [0.2 0.8])
(e) P[X0 = 1, X1 = 2]
(f) P[X1 = 2]
(g) P[X1 = 2, X2 = 1]
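A sketch of how the initial distribution enters parts (e)-(g), assuming Python with numpy and the given π(0) = [0.2, 0.8]:

import numpy as np

P = np.array([[0.50, 0.50],
              [0.25, 0.75]])
pi0 = np.array([0.2, 0.8])     # given initial distribution

e = pi0[0] * P[0, 1]           # (e) P[X0=1, X1=2] = f0(1) p12
pi1 = pi0 @ P                  # distribution of X1
f = pi1[1]                     # (f) P[X1=2]
g = pi1[1] * P[1, 0]           # (g) P[X1=2, X2=1] = P[X1=2] p21
print(e, f, g)                 # 0.1  0.7  0.175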
m-step transition probabilities
pij(m) = P[Xn+m = j | Xn = i]

For example,
p12(3) = P[Xn+3 = 2 | Xn = 1]
p43(5) = P[Xn+5 = 3 | Xn = 4]

Example 18.3 (contd)
P = [ 0.50  0.50
      0.25  0.75 ]
Evaluate
(h) P[X3 = 1 | X1 = 1]
(i) P[X3 = 2 | X1 = 1]

[Path diagrams: state (1, 2) versus time (0, 1, 2, 3), showing the two-step paths out of state 1]

p11(2) = p11 p11 + p12 p21
p12(2) = p11 p12 + p12 p22
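The 2-step formulas above can be checked against the matrix product P·P for Example 18.3, which also answers (h) and (i). A sketch assuming Python with numpy:

import numpy as np

P = np.array([[0.50, 0.50],
              [0.25, 0.75]])

# sum over the intermediate state visited at the middle time step
p11_2 = P[0, 0] * P[0, 0] + P[0, 1] * P[1, 0]
p12_2 = P[0, 0] * P[0, 1] + P[0, 1] * P[1, 1]

P2 = P @ P
print(p11_2, p12_2)            # 0.375  0.625  -> answers to (h) and (i)
print(P2[0, 0], P2[0, 1])      # same values from the matrix product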
m-step transition probability matrix P(m)

2-step transition probability matrix
P(2) = [ p11(2)  p12(2)
         p21(2)  p22(2) ]

m-step transition probability matrix
P(m) = [ p11(m)  p12(m)  ...  p1L(m)
         p21(m)  p22(m)  ...  p2L(m)
         ...
         pL1(m)  pL2(m)  ...  pLL(m) ]

In general, P(m) = P^m.

Example 18.3 (contd)
P = [ 0.50  0.50
      0.25  0.75 ]
Evaluate
(j) P[X3 = 1 | X1 = 2]
(k) P[X4 = 1 | X1 = 1]
(l) P[X0 = 1, X2 = 2] (given π(0) = [0.2 0.8])

Complete characterization: having the 1-step transition matrix and the
initial distribution of a Markov chain, we have a complete
characterization of the process.
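A sketch of using P(m) = P^m (together with the initial distribution) for parts (j)-(l), assuming Python with numpy:

import numpy as np

P = np.array([[0.50, 0.50],
              [0.25, 0.75]])
pi0 = np.array([0.2, 0.8])     # given initial distribution

P2 = np.linalg.matrix_power(P, 2)
P3 = np.linalg.matrix_power(P, 3)

j = P2[1, 0]                   # (j) P[X3=1 | X1=2] = p21(2)
k = P3[0, 0]                   # (k) P[X4=1 | X1=1] = p11(3)
l = pi0[0] * P2[0, 1]          # (l) P[X0=1, X2=2] = f0(1) p12(2)
print(j, k, l)                 # 0.3125  0.34375  0.125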



Homework
Sections covered
12.1,
12.2 (partially)

Problems
12.1.1, 12.1.2, 12.1.4, 12.1.5



End of the course

