MA225 Probability Theory and Random Processes July - November 2017
Problem Sheet 11 NS

1. Check whether the following processes are Markov chains or not. If they are, then find the one-step
state transition probability matrix P and draw the state transition probability diagram. Also, find the
initial distribution. Find the communicating classes and classify the states as transient or
recurrent. Determine the limiting and the stationary distributions, if they exist.

(a) Consider the experiment of tossing a coin repeatedly. Let p be the probability of head in a single
trial. The state of the system at time n is the number of heads minus the number of tails in first
n tosses.
(b) Consider a sequence of trials, each consisting of placing a ball at random in one of N given cells.
Let there be an indefinite supply of balls and let Xn be the number of empty cells after n balls
are placed.
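The transition structure in part (b) can be sanity-checked numerically: if Xn = k cells are currently empty, the next ball lands in an empty cell with probability k/N. A minimal sketch in Python (the function name empty_cell_dist is illustrative, not from the sheet):

```python
def empty_cell_dist(N, n):
    """Exact distribution of the number of empty cells after n balls
    are placed uniformly at random into N cells (the chain of 1(b))."""
    dist = {N: 1.0}                      # before any ball, all N cells are empty
    for _ in range(n):
        new = {}
        for k, pr in dist.items():
            if k > 0:
                # the ball lands in an empty cell with probability k/N
                new[k - 1] = new.get(k - 1, 0.0) + pr * k / N
            # otherwise the number of empty cells is unchanged
            new[k] = new.get(k, 0.0) + pr * (1 - k / N)
        dist = new
    return dist

# The mean number of empty cells should match the known value N * (1 - 1/N)**n
d = empty_cell_dist(3, 2)
mean = sum(k * p for k, p in d.items())
```

For N = 3 and n = 2 this gives P(one empty cell) = 2/3, P(two empty cells) = 1/3, with mean 4/3 = 3·(2/3)².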

2. Let, in a Markov chain, 0 be an absorbing state (that is, p00 = 1). For j > 0, let pjj = p and pj,j−1 = q,
where p + q = 1. Find the probability fj0^(n) that absorption at state 0 takes place exactly at the nth step.
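A closed-form answer to problem 2 can be cross-checked by direct dynamic programming over the not-yet-absorbed states; a sketch in Python (f_absorb is an illustrative name, and the recursion simply applies the given transition probabilities):

```python
def f_absorb(j, n, p):
    """P(first visit to 0 happens exactly at step n | X_0 = j),
    for the chain with p_jj = p and p_{j,j-1} = q = 1 - p."""
    q = 1 - p
    dist = {j: 1.0}                      # distribution over surviving states
    for step in range(1, n + 1):
        new, hit = {}, 0.0
        for k, pr in dist.items():
            new[k] = new.get(k, 0.0) + pr * p           # stay at k
            if k == 1:
                hit += pr * q                           # 1 -> 0: absorbed now
            else:
                new[k - 1] = new.get(k - 1, 0.0) + pr * q
        if step == n:
            return hit
        dist = new
```

For example, starting from j = 2 the quickest absorption needs two consecutive down-moves, so f_absorb(2, 2, p) should equal q², and f_absorb(2, 3, p) should equal 2·p·q² (one stay among the first two steps).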

3. The one-step transition probability matrix for a Markov chain with state space {0, 1, 2, 3, 4} is given by

           |  0    1    0    0    0  |
           | 1/3  2/3   0    0    0  |
     P =   |  0   1/2   0    0   1/2 |
           |  0    0    0   1/4  3/4 |
           |  0    0    0   1/2  1/2 |

(a) Indicate the communicating classes and all the closed sets in this Markov chain. Also classify the
states into recurrent or transient states with proper justification.
(b) Find the period of state 0.
(c) Find if possible a proper subset of a closed communicating class which is again closed.
(d) Find if possible the stationary distribution of the Markov chain corresponding to a closed subset
C of S containing state 3. What will be the limiting distribution of the Markov chain with state
space C?
(e) Find the values of f01, f22, f23, f24, f20, f21, and lim_{n→∞} p01^(n), lim_{n→∞} p33^(n), lim_{n→∞} p22^(n).
(f) What is the expected number of times the system is in state 2?
(g) Suppose the chain starts in state 1, what is the expected number of steps until the system is in
state 4?
(h) Suppose the chain starts in state 2, what is the probability that the system will enter state 0
before it enters state 3?
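The limiting behaviour asked about in parts (d) and (e) can be checked numerically by raising P to a large power in plain Python (no external libraries; matmul and matpow are illustrative helpers, not part of the sheet):

```python
# Transition matrix of problem 3, states ordered 0..4
P = [
    [0,    1,    0, 0,    0   ],
    [1/3,  2/3,  0, 0,    0   ],
    [0,    1/2,  0, 0,    1/2 ],
    [0,    0,    0, 1/4,  3/4 ],
    [0,    0,    0, 1/2,  1/2 ],
]

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def matpow(A, n):
    """Naive n-th matrix power (n >= 1); fine for a quick check."""
    R = A
    for _ in range(n - 1):
        R = matmul(R, A)
    return R

Pn = matpow(P, 200)
# Row 3 of P^n converges to the stationary distribution of the closed class {3, 4};
# row 0 converges to that of the closed class {0, 1}.
```

Both closed classes here are aperiodic (p11 > 0 and p33 > 0), so the rows of P^n settle down quickly and can be compared against the stationary distributions found by solving π = πP.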

4. Find an example of a finite state space Markov chain that has at least one transient state and whose
limiting (long-run) distribution exists and is the same as its stationary distribution.
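As a sanity check on what problem 4 is asking for (one minimal construction, not necessarily the intended answer): a two-state chain in which state 1 is transient, state 0 is absorbing, and the limiting distribution from every start equals the unique stationary distribution (1, 0). A quick verification in Python:

```python
# States {0, 1}: 0 is absorbing, 1 jumps to 0 with probability 1, so 1 is transient.
P = [[1.0, 0.0],
     [1.0, 0.0]]

def step(dist, P):
    """One step of the chain: dist -> dist P."""
    return [sum(dist[i] * P[i][j] for i in range(len(P)))
            for j in range(len(P))]

dist = [0.0, 1.0]          # start in the transient state 1
for _ in range(5):
    dist = step(dist, P)
# dist is now (1, 0), and (1, 0) also satisfies pi = pi P,
# so the limiting and stationary distributions coincide.
```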