
Homework 4 (August 24, 2018 - August 31, 2018)

MTH 412 (2018)


Applied Stochastic Process

Topics Covered: We have proved that a particle starting from a recurrent state will return to that state infinitely often with probability 1. We have also obtained $\lim_{n \to \infty} p_{ii}^{(n)}$ and $\lim_{n \to \infty} p_{ij}^{(n)}$.
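For reference, in the usual notation ($\mu_j$ the mean recurrence time of state $j$, $f_{ij}$ the probability of ever reaching $j$ from $i$), these limits take the standard form for an aperiodic recurrent state $j$:
\[
\lim_{n \to \infty} p_{jj}^{(n)} = \frac{1}{\mu_j},
\qquad
\lim_{n \to \infty} p_{ij}^{(n)} = \frac{f_{ij}}{\mu_j},
\]
with the convention $1/\mu_j = 0$ when $\mu_j = \infty$ (null recurrence).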

1. Consider the two-dimensional random walk on the full infinite plane. Prove that if all the transition probabilities are equal, i.e. from any state the probabilities of moving up, down, left and right are all 1/4, then it is a recurrent Markov Chain.
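A quick numerical sanity check (a sketch, not a proof), using the standard counting identity $p_{00}^{(2n)} = \binom{2n}{n}^2 / 4^{2n}$ for the symmetric walk; it shows $p_{00}^{(2n)}$ decaying like $1/(\pi n)$, so $\sum_n p_{00}^{(n)}$ diverges, which is the recurrence criterion.

# Sanity check for Problem 1: compare p_00^(2n) with 1/(pi*n).
from math import comb, pi

for n in (10, 100, 1000):
    p_2n = comb(2 * n, n) ** 2 / 4 ** (2 * n)
    print(n, p_2n, 1 / (pi * n))   # the last two columns agree ever more closely
# Since p_00^(2n) behaves like 1/(pi*n), the series sum_n p_00^(n) diverges.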

2. Consider the two-dimensional random walk on the full infinite plane. Prove that if the transition probabilities are all non-zero but not all equal, then it is a transient Markov Chain.
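An illustration only (it treats one representative unequal case, with hypothetical probabilities 0.3, 0.2, 0.3, 0.2 for right, left, up, down): simulated paths drift away from the origin, consistent with transience when the drift is non-zero.

# Simulate a biased 2D walk and report the final distance from the origin.
import random

def final_distance(steps, probs, rng):
    # probs = (right, left, up, down)
    x = y = 0
    for _ in range(steps):
        u = rng.random()
        if u < probs[0]:
            x += 1
        elif u < probs[0] + probs[1]:
            x -= 1
        elif u < probs[0] + probs[1] + probs[2]:
            y += 1
        else:
            y -= 1
    return abs(x) + abs(y)

rng = random.Random(0)
print([final_distance(10_000, (0.3, 0.2, 0.3, 0.2), rng) for _ in range(5)])
# typically of order 2000, reflecting the mean drift of 0.1 per step in each coordinate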

3. Consider the following transition probability matrix


 
\[
P = \begin{pmatrix}
1/2 & 1/2 & 0 & 0 & 0 & 0 & \cdots \\
1/2 & 0 & 1/2 & 0 & 0 & 0 & \cdots \\
1/2 & 0 & 0 & 1/2 & 0 & 0 & \cdots \\
\vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \ddots
\end{pmatrix}
\]
Prove that the Markov Chain is an irreducible aperiodic recurrent Markov Chain. Find $\lim_{n \to \infty} p_{00}^{(n)}$.
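A numerical check (illustration only; the truncation size 50 and the power 200 are arbitrary choices): truncate the chain to its first N states, keep the truncated matrix stochastic, and look at the $(0,0)$ entry of a high power. Replacing 0.5 by 2/3 gives the analogous check for Problem 4.

# Truncated-matrix check for the chain of Problem 3.
import numpy as np

def truncated_P(N, p):
    # row i: probability p back to state 0, probability 1-p on to state i+1
    P = np.zeros((N, N))
    for i in range(N):
        if i + 1 < N:
            P[i, 0] = p
            P[i, i + 1] = 1 - p
        else:
            P[i, 0] = 1.0          # the last truncated row is sent back to 0
    return P

P = truncated_P(50, 0.5)
print(np.linalg.matrix_power(P, 200)[0, 0])   # prints a value close to 0.5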

4. Consider the following transition probability matrix


 
\[
P = \begin{pmatrix}
2/3 & 1/3 & 0 & 0 & 0 & 0 & \cdots \\
2/3 & 0 & 1/3 & 0 & 0 & 0 & \cdots \\
2/3 & 0 & 0 & 1/3 & 0 & 0 & \cdots \\
\vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \ddots
\end{pmatrix}
\]
Prove that the Markov Chain is an irreducible aperiodic recurrent Markov Chain. Find $\lim_{n \to \infty} p_{00}^{(n)}$.
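One way to see what value to expect (a sketch, not the required proof): starting from 0, the chain first returns to 0 at step $n$ exactly when it moves up $n-1$ times in a row and then drops back, so
\[
f_{00}^{(n)} = \left(\tfrac{1}{3}\right)^{n-1}\tfrac{2}{3},
\qquad
\mu_0 = \sum_{n \ge 1} n\, f_{00}^{(n)} = \tfrac{2}{3} \sum_{n \ge 1} n \left(\tfrac{1}{3}\right)^{n-1} = \tfrac{2}{3} \cdot \tfrac{1}{(1 - 1/3)^2} = \tfrac{3}{2},
\]
and the limit theorem recalled above then suggests $\lim_{n \to \infty} p_{00}^{(n)} = 1/\mu_0 = 2/3$.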

5. Consider the following transition probability matrix


 
\[
P = \begin{pmatrix}
p & 1-p & 0 & 0 & 0 & 0 & \cdots \\
p & 0 & 1-p & 0 & 0 & 0 & \cdots \\
p & 0 & 0 & 1-p & 0 & 0 & \cdots \\
\vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \ddots
\end{pmatrix}
\]
Here $0 < p < 1$. Find the values of $p$ such that the Markov Chain is an irreducible aperiodic recurrent Markov Chain. Find $\lim_{n \to \infty} p_{00}^{(n)}$.
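A simulation sketch (the value p = 0.3 and the number of excursions are arbitrary choices): estimate the mean return time to state 0 empirically and compare it with $1/p$.

# Monte Carlo estimate of the mean return time to 0 for the chain of Problem 5.
import random

def return_time(p, rng):
    # run the chain from state 0 until it comes back to 0
    state, steps = 0, 0
    while True:
        steps += 1
        state = 0 if rng.random() < p else state + 1
        if state == 0:
            return steps

rng = random.Random(0)
p = 0.3
times = [return_time(p, rng) for _ in range(100_000)]
print(sum(times) / len(times), 1 / p)   # the two numbers should be close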

6. Consider the following transition probability matrix
 
\[
P = \begin{pmatrix}
p_0 & 1-p_0 & 0 & 0 & 0 & 0 & \cdots \\
p_1 & 0 & 1-p_1 & 0 & 0 & 0 & \cdots \\
p_2 & 0 & 0 & 1-p_2 & 0 & 0 & \cdots \\
\vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \ddots
\end{pmatrix}
\]
Here $0 < p_i < 1$ for all $i$. Find $f_{00}^{(n)}$. Show that $\sum_{n=1}^{m+1} f_{00}^{(n)} = 1 - u_m$, where $u_m = \prod_{i=0}^{m} (1 - p_i)$. Prove that $u_m \to 0$ if and only if $\sum_{i=0}^{\infty} p_i = \infty$.
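A numerical illustration (the two sequences $p_i$ below are arbitrary examples, not part of the problem): $u_m \to 0$ when $\sum_i p_i$ diverges, while $u_m$ stabilises at a positive value when $\sum_i p_i$ converges.

# Compare u_m for a divergent and a convergent choice of (p_i).
def u(m, p):
    prod = 1.0
    for i in range(m + 1):
        prod *= 1 - p(i)
    return prod

diverging = lambda i: 1 / (i + 2)         # sum p_i = infinity
converging = lambda i: 1 / (i + 2) ** 2   # sum p_i < infinity

for m in (10, 100, 1000, 10_000):
    print(m, u(m, diverging), u(m, converging))
# the middle column tends to 0, the last column settles near 0.5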

7. Consider the following transition probability matrix with the state space S = {0, 1, 2, 3, 4, 5}
 
\[
P = \begin{pmatrix}
1/3 & 2/3 & 0 & 0 & 0 & 0 \\
2/3 & 1/3 & 0 & 0 & 0 & 0 \\
0 & 0 & 1/4 & 3/4 & 0 & 0 \\
0 & 0 & 1/5 & 4/5 & 0 & 0 \\
1/4 & 0 & 1/4 & 0 & 1/4 & 1/4 \\
1/6 & 1/6 & 1/6 & 1/6 & 1/6 & 1/6
\end{pmatrix}
\]
Find the recurrent and transient classes. Find $\lim_{n \to \infty} p_{ij}^{(n)}$ for all $i, j$.
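A numerical check (illustration only; the power 200 is an arbitrary choice): a high power of P makes the limiting rows visible.

# High power of the 6x6 matrix of Problem 7.
import numpy as np

P = np.array([
    [1/3, 2/3, 0,   0,   0,   0  ],
    [2/3, 1/3, 0,   0,   0,   0  ],
    [0,   0,   1/4, 3/4, 0,   0  ],
    [0,   0,   1/5, 4/5, 0,   0  ],
    [1/4, 0,   1/4, 0,   1/4, 1/4],
    [1/6, 1/6, 1/6, 1/6, 1/6, 1/6],
])
print(np.round(np.linalg.matrix_power(P, 200), 4))
# the rows started from states 4 and 5 carry no long-run mass on states 4 and 5,
# consistent with those two states being transient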

8. Consider the following transition probability matrix with the state space S = {0, 1, 2, 3}

\[
P = \begin{pmatrix}
1/6 & 1/3 & 1/2 & 0 \\
1/2 & 1/2 & 0 & 0 \\
1/6 & 1/3 & 1/2 & 0 \\
0 & 1/6 & 1/3 & 1/2
\end{pmatrix}
\]
Find the recurrent and transient classes. Find $\lim_{n \to \infty} p_{ij}^{(n)}$ for all $i, j$.
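A different numerical angle (illustration only; it assumes, as the zero column suggests, that $\{0, 1, 2\}$ is a closed class): solve $\pi P = \pi$, $\sum_j \pi_j = 1$ on that class and compare with a high power of the full matrix.

# Stationary row on the (apparently) closed class {0, 1, 2} of Problem 8.
import numpy as np

P = np.array([
    [1/6, 1/3, 1/2, 0  ],
    [1/2, 1/2, 0,   0  ],
    [1/6, 1/3, 1/2, 0  ],
    [0,   1/6, 1/3, 1/2],
])
C = P[:3, :3]                                   # restriction to {0, 1, 2}
A = np.vstack([C.T - np.eye(3), np.ones((1, 3))])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)                                       # candidate limiting row
print(np.round(np.linalg.matrix_power(P, 100), 4))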

9. A Markov Chain on a finite state space, for which all the recurrent states are absorbing,
is called an absorbing chain. Show that for a finite absorbing chain the transition
probability matrix $P$ can be written as follows:
\[
P = \begin{pmatrix} I & 0 \\ R_1 & Q \end{pmatrix}
\]
Find $P^n$.
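A worked first step (a sketch, not the full answer): block multiplication gives
\[
P^2 = \begin{pmatrix} I & 0 \\ R_1 + Q R_1 & Q^2 \end{pmatrix},
\]
which suggests the pattern $P^n = \begin{pmatrix} I & 0 \\ (I + Q + \cdots + Q^{n-1}) R_1 & Q^n \end{pmatrix}$, to be verified by induction on $n$.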

10. Let a Markov Chain have $m$ states. Prove that if a state $j$ can be reached from state $i$, then it can be reached in $m - 1$ steps or less.
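The combinatorial content can be checked on examples with a breadth-first search on the transition graph (a sketch; the example matrix is an arbitrary 4-state chain, not part of the problem).

# Shortest reachability distances on the transition graph of an example chain.
from collections import deque

def reach_lengths(P, i):
    # smallest number of steps needed to reach each state from i
    m = len(P)
    dist = {i: 0}
    queue = deque([i])
    while queue:
        u = queue.popleft()
        for v in range(m):
            if P[u][v] > 0 and v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

example = [
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
    [1.0, 0.0, 0.0, 0.0],
]
print(reach_lengths(example, 0))   # every shortest length is at most 3 = m - 1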
