
# Stat 150 Stochastic Processes

Spring 2009

## Lecture 8: First passage and occupation times for random walk

Lecturer: Jim Pitman

## First passage and occupation times for random walk

Gambler's ruin problem on $\{0, 1, \ldots, b\}$ with $P(a, a-1) = P(a, a+1) = 1/2$, and with $0$ and $b$ absorbing. As defined in the last class,
$$T_{0,b} := \text{first } n : X_n \in \{0, b\}, \qquad m_{a,b} := E_a(T_{0,b}), \qquad m_{0,b} = m_{b,b} = 0.$$
Solve this system of equations:
$$m_{a,b} = 1 + \tfrac{1}{2}\left(m_{a+1,b} + m_{a-1,b}\right).$$

Recall the harmonic equation $h(a) = \tfrac{1}{2} h(a+1) + \tfrac{1}{2} h(a-1)$:

[Figure: a harmonic function is linear, with $h(a)$ equal to the average of $h(a-1)$ and $h(a+1)$.]

Now graphically, $h(a) = 1 + \tfrac{1}{2} h(a+1) + \tfrac{1}{2} h(a-1)$ looks like this: a concave function whose value at $a$ exceeds by $1$ the average of its values at $a-1$ and $a+1$.


[Figure: a concave function, with $h(a)$ exceeding the average of $h(a-1)$ and $h(a+1)$ by $1$.]
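The system of equations above can also be solved exactly by machine, which is a useful check on the derivation that follows. A minimal sketch in Python (the function name and elimination method are mine, not from the lecture):

```python
from fractions import Fraction

def mean_absorption_times(b):
    """Solve m_a = 1 + (m_{a+1} + m_{a-1})/2 with m_0 = m_b = 0 exactly.

    Rearranged, m_{a+1} = 2*m_a - m_{a-1} - 2, so every m_a is an affine
    function of the unknown m_1: m_a = alpha_a * m_1 + beta_a.  The
    boundary condition m_b = 0 then pins down m_1.
    """
    alpha = [Fraction(0), Fraction(1)]  # coefficients of m_1
    beta = [Fraction(0), Fraction(0)]   # constant terms
    for a in range(1, b):
        alpha.append(2 * alpha[a] - alpha[a - 1])
        beta.append(2 * beta[a] - beta[a - 1] - 2)
    m1 = -beta[b] / alpha[b]            # enforce m_b = 0
    return [alpha[a] * m1 + beta[a] for a in range(b + 1)]

# Matches the closed form a*(b - a) derived in the text.
m = mean_absorption_times(12)
print([int(x) for x in m])
# [0, 11, 20, 27, 32, 35, 36, 35, 32, 27, 20, 11, 0]
```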

Simplify the main equation to
$$\frac{m_{a+1,b} + m_{a-1,b}}{2} - m_{a,b} = -1,$$
which shows that $a \mapsto m_{a,b}$ is a concave function. The simplest concave function is a parabola, that is, a quadratic function of $a$. The function must vanish at $a = 0$ and $a = b$, which suggests the solution $m_{a,b} = c\,a(b-a)$ for some $c > 0$. Simple algebra shows this works for $c = 1$, meaning that $m_{a,b} := a(b-a)$ solves the equations. The solution is obviously unique subject to the boundary conditions, hence
$$E_a(T_{0,b}) = a(b-a).$$
Note that the maximum is attained at $a = b/2$ if $b$ is even and at $a = (b \pm 1)/2$ if $b$ is odd.

Exercise: Do this by a martingale argument. For $S_n = a + X_1 + \cdots + X_n$ with no barriers at $0$ and $b$, and $X_n = \pm 1$ with probabilities $(1/2, 1/2)$, check that $(S_n^2 - n)$ is a martingale. Now argue that this process stopped at time $T_{0,b}$ is still a martingale, and deduce the result.

## Occupation times

In the previous example, take a state $j$ with $1 \le j \le b-1$ and calculate the expected number of visits to $j$ before $T_{0,b}$. Denote by $N_j$ the number of times the walk hits $j$.
Observe that
$$\sum_{j=1}^{b-1} N_j = T_{0,b}.$$
To make this true, we must agree that for $j = a$ we count the visit at time $0$. (E.g., with $0 < a = 1 < b = 2$, the walk must hit $0$ or $b$ at time $1$.)


Formally,
$$N_j := \sum_{n=0}^{T_{0,b}-1} 1(S_n = j).$$

[Figure: a sample path from $a$, marking 5 visits to level $j$ before absorption.]

First, discuss the case $j = a$. Key observation: at every return to $a$, the walk starts a fresh Markov chain from $a$. If we let $p = P_a(T_{0,b} \text{ before returning to } a) = P(a, b)$, we see
$$P_a(N_a = 1) = p, \qquad P_a(N_a = 2) = (1-p)p, \qquad \ldots, \qquad P_a(N_a = k) = (1-p)^{k-1} p.$$

Then
$$E_a(N_a) = \sum_{k=1}^{\infty} k (1-p)^{k-1} p = 1/p.$$

Now find $p = P(a, b)$. Method: condition on the first step. With probability $1/2$ the walk goes up; starting at $a+1$,
$$P_{a+1}(\text{hit } b \text{ before } a) = P_1(\text{hit } b-a \text{ before } 0) = \frac{1}{b-a}.$$
Similarly,
$$P_{a-1}(\text{hit } 0 \text{ before } a) = 1 - P_{a-1}(\text{hit } a \text{ before } 0) = 1 - \frac{a-1}{a} = \frac{1}{a}.$$

Thus
$$P(a, b) = \frac{1}{2}\left(\frac{1}{a} + \frac{1}{b-a}\right) = \frac{b}{2a(b-a)}, \qquad \text{so} \qquad E_a(N_a) = \frac{2a(b-a)}{b}.$$
Now consider the case $0 < j < a$.
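The value $E_a(N_a) = 2a(b-a)/b$ is easy to corroborate by simulation. A quick Monte Carlo sketch (function name and parameters are mine, chosen for illustration):

```python
import random

def visits_to_start(a, b, rng):
    """Run one walk from a until it hits 0 or b; count visits to a,
    including the visit at time 0."""
    visits = 0
    s = a
    while 0 < s < b:
        if s == a:
            visits += 1
        s += rng.choice((-1, 1))
    return visits

rng = random.Random(0)
a, b, trials = 3, 10, 20000
est = sum(visits_to_start(a, b, rng) for _ in range(trials)) / trials
exact = 2 * a * (b - a) / b   # = 4.2 here
print(round(est, 2), exact)
```

With 20,000 trials the sample mean should land within a few percent of the exact value.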


There are two cases. For $0 < j < a$,
$$E_a(N_j) = P_a(\text{hit } j \text{ before } b)\, E_j(N_j) + P_a(\text{hit } b \text{ before } j)\, E_a(N_j \mid \text{hit } b \text{ before } j)$$
$$= \frac{b-a}{b-j} \cdot \frac{2j(b-j)}{b} \qquad \left(\text{since } E_a(N_j \mid \text{hit } b \text{ before } j) = 0\right)$$
$$= \frac{2j(b-a)}{b}, \qquad \text{for } 1 \le j \le a.$$
The graph below shows that for $a \le j \le b-1$ we must get
$$E_a(N_j) = \frac{2a(b-j)}{b}.$$

[Figure: the tent-shaped graph of $j \mapsto E_a(N_j)$, rising to its peak at $j = a$ and falling back to $0$ at $j = b$.]
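Both formulas for $E_a(N_j)$ can be verified exactly: the matrix of expected visit counts satisfies the first-step equations $E_a(N_j) = 1(a = j) + \tfrac{1}{2} E_{a+1}(N_j) + \tfrac{1}{2} E_{a-1}(N_j)$, with $E_0(N_j) = E_b(N_j) = 0$. A sketch checking the closed form against these equations (all names are mine):

```python
from fractions import Fraction

def G(a, j, b):
    """Closed form for E_a(N_j) from the lecture (tent-shaped in j)."""
    a, j, b = Fraction(a), Fraction(j), Fraction(b)
    if j <= a:
        return 2 * j * (b - a) / b
    return 2 * a * (b - j) / b

# First-step check: E_a(N_j) = 1(a=j) + (E_{a+1}(N_j) + E_{a-1}(N_j))/2,
# with E_0(N_j) = E_b(N_j) = 0 at the absorbing boundaries.
b = 9
ok = all(
    G(a, j, b) == (a == j) + (G(a + 1, j, b) + G(a - 1, j, b)) / 2
    for a in range(1, b)
    for j in range(1, b)
)
print(ok)  # True
```

Exact rational arithmetic avoids any floating-point slack in the comparison.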

Final check:
$$\sum_{j=1}^{b-1} E_a(N_j) = E_a(T_{0,b}) = a(b-a).$$
Indeed,
$$\sum_{j=1}^{b-1} E_a(N_j) = \sum_{j=1}^{a-1} E_a(N_j) + E_a(N_a) + \sum_{j=a+1}^{b-1} E_a(N_j)$$
$$= \sum_{j=1}^{a-1} 2j\,\frac{b-a}{b} + 2a\,\frac{b-a}{b} + \sum_{j=a+1}^{b-1} 2\,\frac{a}{b}\,(b-j)$$
$$= \frac{a(a-1)(b-a)}{b} + \frac{2a(b-a)}{b} + \frac{a(b-a-1)(b-a)}{b} = \frac{a(b-a)}{b}\bigl[(a-1) + 2 + (b-a-1)\bigr] = a(b-a).$$
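The final check is routine arithmetic; a quick script of my own confirming the identity for a range of $(a, b)$:

```python
from fractions import Fraction

def total_expected_visits(a, b):
    """Sum E_a(N_j) over j = 1..b-1 using the two closed forms."""
    lower = sum(Fraction(2 * j * (b - a), b) for j in range(1, a))
    middle = Fraction(2 * a * (b - a), b)
    upper = sum(Fraction(2 * a * (b - j), b) for j in range(a + 1, b))
    return lower + middle + upper

# Should reproduce E_a(T_{0,b}) = a*(b - a) for every 0 < a < b.
assert all(
    total_expected_visits(a, b) == a * (b - a)
    for b in range(2, 30)
    for a in range(1, b)
)
print("check passed")
```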