
Stochastic Processes Spring Semester, 2018-2019

Tutorial 3: April 11
Lecturer: Leonid Mytnik, TA: Segev Shlomov Scribe: TA

3.1 Reminder
3.1.1 Stopping Time
1. A random variable T is a stopping time with respect to a filtration {F_t} if the event

{T ≤ t} ∈ F_t ∀t

2. {F_t} is a filtration if each F_t is a σ-algebra and the family is increasing:

F_1 ⊆ F_2 ⊆ ... ⊆ F_n

3.1.2 Branching Process


1. X_k^i : number of children of particle k in generation i

2. The numbers of children X_k^i are iid random variables

3. Zn : size of population at time n

4. Each branching process starts with a single particle: Z0 = 1

5. The population size at time n = 1 will be the number of children of the first particle: Z_1 = X_1^1. In the second generation, each particle from the previous generation will or will not generate children, so we have Z_2 = ∑_{k=1}^{Z_1} X_k^2. Generalizing, we have that:

Z_n = ∑_{k=1}^{Z_{n−1}} X_k^n
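The recursion above is easy to simulate. Below is a minimal sketch (the function name `simulate_branching` and the offspring law are illustrative choices, not part of the notes):

```python
import random

def simulate_branching(n_generations, offspring_sampler, seed=None):
    """Return the list [Z_0, Z_1, ..., Z_n] for a branching process with Z_0 = 1.

    offspring_sampler(rng) draws one value of X_k^i (the number of children
    of a single particle), iid across particles and generations.
    """
    rng = random.Random(seed)
    sizes = [1]  # Z_0 = 1
    for _ in range(n_generations):
        # Z_n = sum over the Z_{n-1} current particles of their offspring counts
        z_next = sum(offspring_sampler(rng) for _ in range(sizes[-1]))
        sizes.append(z_next)
        if z_next == 0:  # extinction: every later generation is also empty
            break
    return sizes

# Illustrative offspring law: each particle has 0, 1 or 2 children with equal probability
path = simulate_branching(10, lambda rng: rng.choice([0, 1, 2]), seed=1)
print(path)
```

A deterministic sampler that always returns 2 yields the doubling sequence 1, 2, 4, 8, ..., exactly as the recursion predicts.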

3.1.3 Wald’s Theorem


Theorem 3.1 Let X_1, X_2, ... be iid with E|X_1| < ∞. Let T be a stopping time with respect to F_k = σ(X_1, ..., X_k), and let E(T) < ∞. Then E(∑_{i=1}^T X_i) = E(X_1)E(T).


Proof
Let X := ∑_{i=1}^T X_i and µ := E(X_i).

Notice that:

E(X) = E(∑_{i=1}^T X_i) = E(∑_{i=1}^∞ X_i 1_{T≥i})

So, using monotone convergence (assume first that X_i ≥ 0; the general case follows by splitting X_i into positive and negative parts), we have that:

E(∑_{i=1}^∞ X_i 1_{T≥i}) = ∑_{i=1}^∞ E(E(X_i 1_{T≥i} | σ(X_1, ..., X_{i−1})))

Notice that {T ≥ i} = {T ≤ i − 1}^c ∈ σ(X_1, ..., X_{i−1}), so 1_{T≥i} is measurable with respect to σ(X_1, ..., X_{i−1})

Then, by independence of X_i from σ(X_1, ..., X_{i−1}), and recalling that µ = E(X_i), we have that:

∑_{i=1}^∞ E(X_i 1_{T≥i}) = ∑_{i=1}^∞ µE(1_{T≥i})

Now, since the expectation of an indicator is the probability of its event, E(1_{T≥i}) = P(T ≥ i), we have that:

∑_{i=1}^∞ µE(1_{T≥i}) = µ ∑_{i=1}^∞ P(T ≥ i)

Using the tail formula, the final sum equals:

µ ∑_{i=1}^∞ P(T ≥ i) = µE(T)

Remark
Tail formula: If T is integer-valued with P(T ≥ 0) = 1, then E(T) = ∑_{i=1}^∞ P(T ≥ i)
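The tail formula can be sanity-checked numerically. Here T is taken geometric on {1, 2, ...} with P(T = i) = p(1 − p)^{i−1} (my choice of example), so that P(T ≥ i) = (1 − p)^{i−1} and E(T) = 1/p:

```python
# Tail-formula check for a geometric stopping time T on {1, 2, ...}:
# P(T >= i) = (1 - p)**(i - 1), and the tail sum should equal E(T) = 1/p.
p = 0.25
tail_sum = sum((1 - p) ** (i - 1) for i in range(1, 2000))  # truncated tail sum
print(tail_sum)  # ≈ 1/p = 4.0
```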

Then, we reached:

E(X) = µE(T )
Hence, we proved the theorem using monotone convergence and independence. □
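Wald's identity can also be checked by Monte Carlo. In the sketch below (an illustrative setup, not from the lecture), the X_i are uniform on {1, 2} and T is the first i with X_i = 2, so E(X_i) = 1.5, E(T) = 2, and Wald predicts E(∑_{i=1}^T X_i) = 3:

```python
import random

rng = random.Random(0)

def one_run():
    """Sum X_1 + ... + X_T for one realization, where X_i is uniform on {1, 2}
    and T is the first index with X_i = 2 (a stopping time: the decision to
    stop at time i uses only X_1, ..., X_i)."""
    total = 0
    while True:
        x = rng.choice([1, 2])
        total += x
        if x == 2:
            return total

n = 100_000
estimate = sum(one_run() for _ in range(n)) / n
print(estimate)  # should be close to E(X_i) * E(T) = 1.5 * 2 = 3
```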

3.2 Exercises
E1
Let N, M be two stopping times with respect to {F_t}. Prove the following:
1. min{N, M} is also a stopping time

2. max{N, M} is also a stopping time

3. the sum N + M is also a stopping time

Sol
Starting with (2), we have to show that max{N, M} is a stopping time, i.e. that for all t

A_t = {max{N, M} ≤ t} ∈ F_t

Since {N ≤ t}, {M ≤ t} ∈ F_t and a σ-algebra is closed under intersection:

A_t = {N ≤ t} ∩ {M ≤ t} ∈ F_t

So the result follows from the definition.

Now for (1), let A_t = {min{N, M} ≤ t}.

It is enough to show that the complement A_t^c = {min{N, M} > t} is in F_t. Since {N > t} = {N ≤ t}^c ∈ F_t and likewise {M > t} ∈ F_t:

A_t^c = {N > t} ∩ {M > t} ∈ F_t

and hence A_t ∈ F_t. (Equivalently, A_t = {N ≤ t} ∪ {M ≤ t}.)

We conclude by showing that N + M is also a stopping time.

We want to show that the event A_t = {N + M ≤ t} ∈ F_t. In discrete time this is immediate:

{N + M ≤ t} = ∪_{k=0}^{t} ({N = k} ∩ {M ≤ t − k}) ∈ F_t

since {N = k} ∈ F_k ⊆ F_t and {M ≤ t − k} ∈ F_{t−k} ⊆ F_t. In continuous time one uses a trick with positive rationals: for r, s ∈ Q+ with r + s < t we have {N < r} ∈ F_r ⊆ F_t and {M < s} ∈ F_s ⊆ F_t, so

{N + M < t} = ∪_{r,s∈Q+, r+s<t} ({N < r} ∩ {M < s}) ∈ F_t

Thus, we have shown that {max{N, M} ≤ t} = {N ≤ t} ∩ {M ≤ t} and {min{N, M} ≤ t} = {N ≤ t} ∪ {M ≤ t} follow from the definition, and that the sum N + M is also a stopping time by using the trick with positive rational numbers.

E2
Show that T is a (discrete) stopping time if and only if {T = k} ∈ F_k ,
∀k = 1, 2, ...

Sol

Being an iff statement, we have to prove it in two parts.

First part:
If {T = j} ∈ F_j for every j, then for each j ≤ k the filtration definition gives {T = j} ∈ F_j ⊆ F_k. As a consequence,

{T ≤ k} = ∪_{j=1}^{k} {T = j} ∈ F_k

Second part:
If T is a stopping time, then {T ≤ k} ∈ F_k, and also {T ≤ k − 1} ∈ F_{k−1} ⊆ F_k by the filtration definition. Then we can write

{T = k} = {T ≤ k} \ {T ≤ k − 1} ∈ F_k

since a σ-algebra is closed under set difference, concluding our proof.

E3
Given a branching process, calculate Var(Z_n).

Sol
Assume µ := E(X_k^n) = 1 and let σ² := Var(X_k^n). We claim Var(Z_n) = nσ².

Recall the law of total variance:

Var(X) = E(Var(X|Y)) + Var(E(X|Y))


We split it in two parts:

1. E(X|Y) = E(Z_n|Z_{n−1}). From the definition of Z_n we have that:

E(Z_n|Z_{n−1}) = E(∑_{k=1}^{Z_{n−1}} X_k^n | Z_{n−1})

Because the X_k^n are independent of Z_{n−1}, this equals:

E(∑_{k=1}^{Z_{n−1}} X_k^n | Z_{n−1}) = µZ_{n−1}

2. Var(X|Y) = Var(Z_n|Z_{n−1}). From the definition of Z_n we have that:

Var(Z_n|Z_{n−1}) = Var(∑_{k=1}^{Z_{n−1}} X_k^n | Z_{n−1})

Considering that Z_{n−1} is measurable with respect to the conditioning σ-algebra and that the X_k^n are iid and independent of Z_{n−1}:

Var(∑_{k=1}^{Z_{n−1}} X_k^n | Z_{n−1}) = Z_{n−1}σ²

Then, returning to the variance equation:

Var(Z_n) = E(Z_{n−1}σ²) + Var(µZ_{n−1})



Since E(Z_{n−1}σ²) = σ²E(Z_{n−1}) = µ^{n−1}σ² and Var(µZ_{n−1}) = µ²Var(Z_{n−1}), the induction hypothesis Var(Z_{n−1}) = (n − 1)σ² together with the assumption µ = 1 gives:

Var(Z_n) = µ^{n−1}σ² + µ²(n − 1)σ² = σ² + (n − 1)σ² = nσ²

Thus, we proved our initial claim that the variance of Z_n is nσ², using conditional expectations and the independence principle.
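The claim Var(Z_n) = nσ² (under µ = 1) can also be checked by simulation. The sketch below uses the offspring law uniform on {0, 1, 2} (an illustrative choice with mean µ = 1 and variance σ² = 2/3), so for n = 4 the theory predicts Var(Z_4) = 4 · (2/3) ≈ 2.67:

```python
import random

def z_n(n, rng):
    """One sample of Z_n with offspring uniform on {0, 1, 2} (mean 1, variance 2/3)."""
    z = 1  # Z_0 = 1
    for _ in range(n):
        z = sum(rng.choice([0, 1, 2]) for _ in range(z))
        if z == 0:  # extinction
            break
    return z

rng = random.Random(42)
n, runs = 4, 200_000
samples = [z_n(n, rng) for _ in range(runs)]
mean = sum(samples) / runs                          # theory: E(Z_n) = mu**n = 1
var_est = sum((s - mean) ** 2 for s in samples) / runs
print(mean, var_est)  # variance should be close to n * sigma^2 = 8/3
```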
