MTH6141 Random Processes, Spring 2012, Exercise Sheet 5

Please drop your answers to Questions 1 and 2 in the red box on the second floor of the maths building by 13:00 on Tuesday 28th February 2012. You are strongly encouraged to attempt all questions. Please send comments and corrections to m.jerrum@qmul.ac.uk.

1. Consider the Markov chain on state space S = {1, 2, 3} with transition matrix

   P = \begin{pmatrix} 0 & 2/3 & 1/3 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{pmatrix}.

For each state i ∈ S calculate directly the distribution of R_i (the time of first return to i) and E(R_i). Use these values, together with Theorem 1.15, to write down the equilibrium distribution w = (w_1, w_2, w_3) for the chain. Check your answer directly against the definition of equilibrium distribution.
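(The following sketch is not part of the original sheet.) If you want a numerical sanity check on Question 1, the short Python script below, assuming NumPy is available, finds the equilibrium distribution of the matrix above as a left eigenvector and estimates each E(R_i) by simulation. The matrix P is simply transcribed from the question as printed here, so adjust it if your copy of the sheet reads differently.

import numpy as np

# Transition matrix from Question 1 (row k gives the distribution of the
# next state when the chain is currently in state k + 1).
P = np.array([[0.0, 2/3, 1/3],
              [1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])

# Equilibrium distribution: the left eigenvector of P for eigenvalue 1,
# normalised so that its entries sum to 1.
vals, vecs = np.linalg.eig(P.T)
w = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
w = w / w.sum()
print("equilibrium distribution:", w)

# Monte Carlo estimate of E(R_k), the expected time of first return to k.
rng = np.random.default_rng(0)

def mean_return_time(k, runs=20_000):
    total = 0
    for _ in range(runs):
        state, steps = k, 0
        while True:
            state = rng.choice(3, p=P[state])
            steps += 1
            if state == k:
                break
        total += steps
    return total / runs

for k in range(3):
    print(f"estimated E(R_{k + 1}):", mean_return_time(k))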
2. Suppose that a and b are states in the same communicating class of a Markov chain. Prove that there exist constants u ∈ N and α > 0 such that

   p_{aa}^{(t+u)} ≥ α p_{bb}^{(t)},

for all t ∈ N.

Hence deduce that null recurrence is a class property (that is, if a is a null recurrent state then b is also null recurrent).

3. Consider the Markov chain on state space S = {0, 1, 2, . . . } with transition probabilities

   p_{i,j} = \begin{cases} \frac{i+1}{i+2}, & \text{if } j = i + 1; \\ \frac{1}{i+2}, & \text{if } j = 0; \\ 0, & \text{otherwise}. \end{cases}

Calculate f_{00}^{(t)} and f_{00} for this chain and deduce from this that the chain is recurrent. Calculate E(R_0) and decide whether the chain is positive recurrent or null recurrent.
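(This sketch is also not part of the sheet.) For Question 3 it can help to see the numbers before doing the calculation: the plain-Python loop below computes f_{00}^{(t)} numerically, by tracking the probability that the chain has kept climbing for t − 1 steps without returning to 0, and prints running totals of Σ_t f_{00}^{(t)} and Σ_t t·f_{00}^{(t)} (the latter is E(R_0) when it converges).

# Numerical exploration of Question 3.  From state i the chain either moves
# to i + 1 (probability (i+1)/(i+2)) or drops back to 0 (probability 1/(i+2)),
# so the only way to avoid returning to 0 is to keep climbing.
survive = 1.0   # probability of no return to 0 during the first t - 1 steps
sum_f = 0.0     # running total of f_00^(t)
sum_tf = 0.0    # running total of t * f_00^(t)

for t in range(1, 1_000_001):
    state = t - 1                           # state occupied just before time t
    f_t = survive * (1.0 / (state + 2))     # climb for t - 1 steps, then drop
    sum_f += f_t
    sum_tf += t * f_t
    survive *= (state + 1) / (state + 2)    # keep climbing instead
    if t in (10, 100, 1_000, 10_000, 100_000, 1_000_000):
        print(f"t = {t:>9}: sum of f = {sum_f:.6f}, sum of t*f = {sum_tf:.3f}")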


4. In lectures we considered an asymmetric random walk on the integers Z with specific probabilities (2/3 and 1/3) for moving up and down. More generally, we could consider the walk with transition probabilities

   p_{ij} = \begin{cases} p, & \text{if } j = i + 1; \\ 1 - p, & \text{if } j = i - 1; \\ 0, & \text{otherwise}. \end{cases}

Show that state 0 is transient whenever 0 < p < 1 and p ≠ 1/2.
Hint. Repeat the analysis used in the lectures for the special case p = 2/3, and then finish with a little calculus.

5∗. Consider the random walk on N with a reflecting barrier at 0, i.e. the Markov chain with transition probabilities

   p_{ij} = \begin{cases} 1/2, & \text{if } i \neq 0 \text{ and } |j - i| = 1; \\ 1, & \text{if } i = 0 \text{ and } j = 1; \\ 0, & \text{otherwise}. \end{cases}

Prove that state 0 is recurrent. Is state 999 recurrent?
Hint. Try comparing the above random walk with the symmetric walk on Z that we analysed in lectures.
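(As before, the following sketch is not part of the sheet.) For Questions 4 and 5 it can be instructive to simulate before proving anything: the plain-Python function below estimates the probability that the walk started at 0 returns to 0 within a fixed number of steps, for a few values of p. The cut-off max_steps is arbitrary, so the numbers are only suggestive, but comparing p = 1/2 with p ≠ 1/2 hints at the dichotomy the questions are about; a small modification (forcing a step from 0 up to 1) gives the analogous experiment for the reflected walk of Question 5.

import random

def return_probability(p, max_steps=1_000, runs=5_000, seed=0):
    """Estimate the probability that the walk on Z which moves up with
    probability p and down with probability 1 - p, started at 0, returns
    to 0 within max_steps steps."""
    rng = random.Random(seed)
    returns = 0
    for _ in range(runs):
        position = 0
        for _ in range(max_steps):
            position += 1 if rng.random() < p else -1
            if position == 0:
                returns += 1
                break
    return returns / runs

for p in (0.5, 0.55, 2/3):
    print(f"p = {p:.3f}: estimated return probability {return_probability(p):.3f}")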
Other activities for reading week

• Try some of the exercises on the earlier sheets that you missed out first time.
• Revise the properties of the Poisson and exponential distributions from MTH 5121 Probability Models.