Martingales. The martingale is a very useful concept in probability theory, especially in its applications to mathematical finance, but also in physics. To illustrate its power, consider the following problem. At each second a Monkey types a letter on a 26-letter keyboard, with each letter chosen uniformly at random and independently of all other letters typed. Let τ1 be the first time the Monkey produces, in consecutive order, the letters abcd, and let τ2 be the first time the Monkey produces, in consecutive order, the letters abba.
Question for you: how big do you think E[τ1] is?
Second question: is E[τ1] = E[τ2], E[τ2] > E[τ1] or E[τ2] < E[τ1]?
The martingale is a useful concept that will help us calculate E[τ2].
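As a sanity check on intuition, here is a minimal Monte Carlo sketch (a Python illustration added for intuition, not part of the formal development; the integer encoding of letters and the trial count are arbitrary choices) that estimates both expectations by brute force.

```python
import random

rng = random.Random(0)

def first_hit_time(pattern, alphabet_size=26):
    """First time the pattern appears as consecutive letters."""
    window, t = [], 0
    while True:
        t += 1
        window.append(rng.randrange(alphabet_size))
        if len(window) > len(pattern):
            window.pop(0)
        if window == pattern:
            return t

abcd = [0, 1, 2, 3]   # encode a, b, c, d as 0, 1, 2, 3
abba = [0, 1, 1, 0]

trials = 100          # each trial needs ~26**4 letters, so this takes a while
for name, pattern in (("abcd", abcd), ("abba", abba)):
    print(name, sum(first_hit_time(pattern) for _ in range(trials)) / trials)
```

Both estimates land near 26^4 = 456976, but the true difference between them (applying the gambler argument developed at the end of this section to these patterns gives E[τ1] = 26^4 and E[τ2] = 26^4 + 26, so the gap is exactly 26) is far smaller than the Monte Carlo error; this is precisely why a martingale calculation, rather than simulation, is needed to answer the second question.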
Definition 49. Let (Ω, F, P) be a probability triple. A sequence (Fn)n≥0 of σ-subalgebras of F is called a filtration if Fn ⊆ Fn+1 for all n.
Remark 50. A quadruple (Ω, F, (Fn)n≥1, P) is called a filtered probability space.

Definition 51. We say a sequence of random variables (Xn)n≥1 on the probability triple (Ω, F, P) is adapted with respect to the filtration (Fn)n≥1 if Xn ∈ mFn for all n.

Definition 52. Let (Ω, F, P) be a probability triple. Let (Fn)n≥0 be a filtration on (Ω, F, P). A sequence of random variables (Mn)n≥0 is called a martingale with respect to (Fn)n≥0 if (Mn)n≥0 is:
1. Integrable: E(|Mn|) < ∞ for all n;
2. Adapted: Mn ∈ mFn for all n;
3. “Fair game”: For n ≥ 1,
E[Mn | Fn−1] = Mn−1.
Definition 53. Let (Ω, F, P) be a probability triple. Let (Fn)n≥0 be a filtration on (Ω, F, P). A sequence of random variables (Mn)n≥0 is called a supermartingale with respect to (Fn)n≥0 if (Mn)n≥0 is:
1. Integrable: E(|Mn|) < ∞ for all n;
2. Adapted: Mn ∈ mFn for all n;
3. “Unfavourable game”: For n ≥ 1,
E[Mn | Fn−1] ≤ Mn−1.
Definition 54. Let (Ω, F, P) be a probability triple. Let (Fn)n≥0 be a filtration on (Ω, F, P). A sequence of random variables (Mn)n≥0 is called a submartingale with respect to (Fn)n≥0 if (Mn)n≥0 is:
1. Integrable: E(|Mn|) < ∞ for all n;
2. Adapted: Mn ∈ mFn for all n;
3. “Favourable game”: For n ≥ 1,
E[Mn | Fn−1] ≥ Mn−1.
Definition 55. Let (Xn)n≥1 be a sequence of random variables on (Ω, F, P). The natural filtration (Fn)n≥1 generated by (Xn)n≥1 is defined as
Fn = σ(X1, . . . , Xn).
Remark 56. The natural filtration generated by (Xn)n≥1 is the smallest filtration (Fn)n≥1 such that (Xn)n≥1 is adapted with respect to (Fn)n≥1.

Example 57. Let (Xn)n≥1 be an infinite sequence of independent L1 random variables on (Ω, F, P) such that E[Xn] = 0 for all n. Let (Fn)n≥1 be the natural filtration generated by (Xn)n≥1. If Sn = Σ_{k=1}^n Xk, then (Sn)n≥1 is a martingale with respect to (Fn)n≥1.
To see this, we check the conditions one by one:
1. E[|Sn|] ≤ Σ_{k=1}^n E[|Xk|] < ∞;
2. Since X1, . . . , Xn ∈ mFn, we have Sn = Σ_{k=1}^n Xk ∈ mFn.
3. Note that
E[Sn+1 | Fn] = E[Sn + Xn+1 | Fn]
= E[Sn | Fn] + E[Xn+1 | Fn]
= Sn + E[Xn+1]   (Sn ∈ mFn, Xn+1 indep of Fn)
= Sn.
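As a quick numerical illustration of Example 57 (a hedged Python sketch; the ±1 step distribution and the test event A = {Sn > 0} are our own illustrative choices), the defining property E[Sn+1 | Fn] = Sn can be checked in integrated form, E[Sn+1 1A] = E[Sn 1A] for A ∈ Fn:

```python
import random

rng = random.Random(1)
paths = 100_000
n = 10

# Simulate (S_n, S_{n+1}) for many paths, steps uniform on {-1, +1}.
s_n, s_np1 = [], []
for _ in range(paths):
    s = sum(rng.choice((-1, 1)) for _ in range(n))
    s_n.append(s)
    s_np1.append(s + rng.choice((-1, 1)))

# Martingale property in integrated form: E[S_{n+1} 1_A] = E[S_n 1_A]
# for the event A = {S_n > 0}, which lies in F_n.
lhs = sum(y for x, y in zip(s_n, s_np1) if x > 0) / paths
rhs = sum(x for x in s_n if x > 0) / paths
print(lhs, rhs)   # the two numbers agree up to Monte Carlo error
```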

Example 58. Let (Xn)n≥1 be an infinite sequence of independent L1 random variables on (Ω, F, P) such that E[Xn] = 1 for all n. Let Mn = Π_{k=1}^n Xk. Let (Fn)n≥1 be the natural filtration generated by (Xn)n≥1. We will show that (Mn)n≥1 is a martingale with respect to (Fn)n≥1.
1. As X1, . . . , Xn are independent,
E[|X1 · · · Xn|] = E[|X1|] · · · E[|Xn|] < ∞.
2. Mn = X1 · · · Xn ∈ mFn, since X1, . . . , Xn ∈ mFn.
3. Finally,
E[Mn+1 | Fn] = E[X1 · · · Xn+1 | Fn]
= X1 · · · Xn E[Xn+1 | Fn]   (taking out what’s known)
= X1 · · · Xn E[Xn+1]   (independence)
= X1 · · · Xn
= Mn.

Example 59. Let X be an L1 random variable on (Ω, F, P). Let (Fn)n≥1 be a filtration on (Ω, F, P). If Mn = E[X | Fn], then (Mn)n≥1 is a martingale with respect to (Fn)n≥1.
1. By Prop 36.8,
|E[X | Fn]| ≤ E[|X| | Fn] a.s.,
and hence
E[|E[X | Fn]|] ≤ E[E[|X| | Fn]] = E[|X|] < ∞,
and so Mn ∈ L1.
2. By definition of conditional expectation, E[X | Fn] ∈ mFn;
3. By the tower property,
E[Mn+1 | Fn] = E[E[X | Fn+1] | Fn]
= E[X | Fn]
= Mn.

Remark 60. In particular, given a random variable X : [0, 1) → R on ([0, 1), B([0, 1)), Leb), we can define
Gn = σ([k/2^n, (k+1)/2^n) : 0 ≤ k ≤ 2^n − 1).
If we let Xn := E[X | Gn], that is,
Xn(ω) = (∫_{[k/2^n, (k+1)/2^n)} X dLeb) / Leb([k/2^n, (k+1)/2^n))   if ω ∈ [k/2^n, (k+1)/2^n),
then (Xn)n≥1 is a martingale.
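The conditional expectation in Remark 60 is concrete enough to compute: E[X | Gn] is constant on each dyadic interval and equals the average of X over it. A small sketch (a Python illustration; the choice X(ω) = ω² and the midpoint Riemann sum are our own illustrative assumptions):

```python
import math

def dyadic_cond_exp(X, n, omega, samples=1000):
    """Approximate E[X | G_n](omega): the average of X over the dyadic
    interval [k/2^n, (k+1)/2^n) containing omega (midpoint Riemann sum)."""
    k = math.floor(omega * 2**n)
    lo, width = k / 2**n, 1 / 2**n
    return sum(X(lo + width * (i + 0.5) / samples) for i in range(samples)) / samples

X = lambda w: w * w          # an illustrative integrable X on [0, 1)
for n in range(1, 8):
    print(n, dyadic_cond_exp(X, n, omega=0.3))
# The printed values approach X(0.3) = 0.09 as the dyadic intervals shrink.
```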

Proposition 61. Let (Mn)n≥1 be a submartingale on (Ω, F, (Fn)n≥1, P). Then
1. for all 1 ≤ m ≤ n,
E[Mn | Fm] ≥ Mm a.s.;
2. for all n,
E[Mn] ≥ E[M1];
3. (−Mn)n≥1 is a supermartingale.
Statements 1., 2., 3. also hold for supermartingales with the inequalities reversed and, in 3., “supermartingale” replaced by “submartingale”.
Proof. 1. We fix n ∈ N and prove by induction on n − m. The case n − m = 1 is true due to point 3. in the definition of a submartingale.
Assume that
E[Mn | Fm] ≥ Mm a.s.
Then
E[Mn | Fm−1] = E[E[Mn | Fm] | Fm−1] a.s.   (tower)
≥ E[Mm | Fm−1] a.s.   (induction hypothesis and monotonicity of conditional expectation)
≥ Mm−1 a.s.   (submartingale property)
2. Take m = 1 in 1.,
E[Mn | F1] ≥ M1 a.s.,
and take expectations:
E[Mn] ≥ E[M1].
3. If Mn ∈ L1 then −Mn ∈ L1.
If Mn ∈ mFn then −Mn ∈ mFn.
Finally,
E[−Mn+1 | Fn] = −E[Mn+1 | Fn]
≤ −Mn a.s. □

Remark 62. If (Xn)n≥1 and (Yn)n≥1 are both martingales with respect to the same filtration (Fn)n≥1, then (Xn + Yn)n≥1 is a martingale with respect to (Fn)n≥1. However, this is not necessarily true if (Xn)n≥1 and (Yn)n≥1 are martingales with respect to different filtrations.

Suppose we invest in a stock whose price at time n, Mn, is a martingale. If I bought Cn units of stock at time n and sold all of them at time n + 1, then my profit at time n + 1 due to stock purchased at time n would be
Cn(Mn+1 − Mn).
Across the whole time from time 0 to time n, my total profit will be
(∫C dM)n = Σ_{k=0}^{n−1} Ck(Mk+1 − Mk).
Obviously there will have to be a condition on (Cn)n≥0 that it is based on information up to and including time n, that is, Cn ∈ mFn.
Lemma 63. Let (Ω, F, P) be a probability triple. Let (Fn)n≥0 be a filtration on (Ω, F, P). Let (Cn)n≥0 be adapted to (Fn)n≥0 and suppose there exists A ∈ R such that
|Cn| ≤ A a.s. ∀n ≥ 0.
Let (Mn)n≥0 be a martingale with respect to (Fn)n≥0. The sequence ((∫C dM)n)n≥0 of random variables defined by
(∫C dM)n = 0 if n = 0, and (∫C dM)n = Σ_{k=0}^{n−1} Ck(Mk+1 − Mk) if n ≥ 1,
is a martingale.
R
Remark 64. The condition |Cn | ≤ A a.s. ∀n ≥ 0 is there to ensure that [ CdM ]n ∈
L1 . It can be replaced by both Cn ∈ L2 and Mn ∈ L2 because it will guarrantee
Cn (Mn+1 − Mn ) ∈ L1 (by Hölder inequality).
Proof. This is simply checking the definition item by item.
Integrability can be checked using
|Σ_{k=0}^{n−1} Ck(Mk+1 − Mk)| ≤ A Σ_{k=0}^{n−1} (|Mk+1| + |Mk|) a.s.
and the fact that (Mn)n≥0 is integrable.
Adapted: as (∫C dM)n is a continuous function of (C0, . . . , Cn−1) and (M0, . . . , Mn), we have (∫C dM)n ∈ mFn.
Fair game: for n ≥ 2,
E[(∫C dM)n | Fn−1] = E[Σ_{k=0}^{n−1} Ck(Mk+1 − Mk) | Fn−1]
= E[Σ_{k=0}^{n−2} Ck(Mk+1 − Mk) | Fn−1] + E[Cn−1(Mn − Mn−1) | Fn−1]
= Σ_{k=0}^{n−2} Ck(Mk+1 − Mk) + Cn−1 E[Mn − Mn−1 | Fn−1]
= (∫C dM)n−1.
We now check the n = 1 case. Note that
E[(∫C dM)1 | F0] = E[C0(M1 − M0) | F0]
= C0(E[M1 | F0] − M0)
= 0 = (∫C dM)0. □
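A numerical sanity check of Lemma 63 (a Python sketch; the random walk, the strategy Ck = 1{Mk < 0} and the horizon are our own illustrative choices): whatever bounded adapted strategy we use, the transform keeps constant expectation 0, since a martingale transform is again a martingale.

```python
import random

rng = random.Random(2)

def transform_path(n_steps=20):
    """One path of (int C dM)_n for a simple random walk M and the adapted,
    bounded strategy C_k = 1_{M_k < 0} (bet only when the price is negative)."""
    M, total = 0, 0.0
    for _ in range(n_steps):
        C = 1.0 if M < 0 else 0.0   # uses only information available at time k
        step = rng.choice((-1, 1))  # the increment M_{k+1} - M_k
        total += C * step
        M += step
    return total

paths = 200_000
print(sum(transform_path() for _ in range(paths)) / paths)  # ~ 0 = E[(int C dM)_0]
```

In words: you cannot beat a fair game with a bounded adapted strategy.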

Exercise 65. Let (Ω, F, P) be a probability triple. Let (Fn)n≥0 be a filtration on (Ω, F, P). Let (Cn)n≥0 be adapted to (Fn)n≥0 and suppose there exists A ∈ R such that
0 ≤ Cn ≤ A a.s. ∀n ≥ 0.
Let (Mn)n≥0 be a supermartingale (respectively submartingale) with respect to (Fn)n≥0. The sequence ((∫C dM)n)n≥0 of random variables defined by
(∫C dM)n = 0 if n = 0, and (∫C dM)n = Σ_{k=0}^{n−1} Ck(Mk+1 − Mk) if n ≥ 1,
is a supermartingale (respectively submartingale).


Stopping times. One possible trading strategy is to keep £1 on the stock until you stop at some time τ. The decision of whether to stop at time n or not, or equivalently whether τ = n or not, has to be made using information up to time n; in other words, {τ = n} ∈ Fn for all n ∈ N ∪ {0}. For example, a non-stopping time would be “stop just before the stock price is about to crash”, or mathematically, τ := min{n : Xn+1 = 0.9Xn}, because it requires us to look one step ahead to decide whether to stop now. A stopping time would be, for example, τ2 := min{n : Xn = 100}, because we know to stop when the price hits 100 for the first time.
Definition 66. Let (Ω, F, P) be a probability triple. Let (Fn)n≥0 be a filtration on (Ω, F, P). A stopping time τ with respect to (Fn)n≥0 is a random variable on (Ω, F, P) such that
{τ = n} ∈ Fn
for all n ≥ 0.
Example 67. Let (Xn)n≥1 be an adapted process on (Ω, F, (Fn)n≥0, P). For A ∈ B(R), let
τA = min{k : Xk ∈ A}.
Then τA is a stopping time because
(τA = n) = [∩_{k=1}^{n−1} (Xk ∉ A)] ∩ (Xn ∈ A) ∈ Fn.
However,
τ̃A = max{k : Xk ∈ A}
is not a stopping time in general. To give an indication why (and this is not a proof), note that
(τ̃A = n) = (Xn ∈ A) ∩ [∩_{k=n+1}^∞ (Xk ∉ A)],
which involves events like (Xn+1 ∉ A) that require us to look beyond time n.
Lemma 68. τ is a stopping time if and only if (τ ≤ k) ∈ Fk for all k.

Proof. If (τ ≤ k) ∈ Fk for all k, then for all n > 0, (τ ≤ n) ∈ Fn and (τ ≤ n − 1) ∈ Fn−1 ⊆ Fn. Therefore, (τ = n) = (τ ≤ n)\(τ ≤ n − 1) ∈ Fn. For n = 0, (τ = n) = (τ ≤ n) ∈ F0.
If τ is a stopping time, then since Fk ⊆ Fn for all k ≤ n, we have (τ = k) ∈ Fn for all k ≤ n. Therefore, (τ ≤ n) = ∪_{k=0}^n (τ = k) ∈ Fn. □
Filtration at a stopping time. Sometimes it is useful to consider the information up to a stopping time. The idea is to include all events A for which, if we know the value of τ, then we know whether A happened or not.
Definition 69. Let (Ω, F, P) be a probability triple. Let (Fn)n≥0 be a filtration on (Ω, F, P). Let σ be a stopping time with respect to (Fn)n≥0. We will use Fσ to denote the set
{A ∈ F : A ∩ {σ = n} ∈ Fn ∀n}.
Lemma. Let τ be a stopping time on a filtered space (Ω, F, (Fn)n≥0, P). Then Fτ is a σ-algebra.
Proof. Note that Ω ∈ Fτ because Ω ∩ {τ = n} = {τ = n} ∈ Fn. If A ∈ Fτ, then A^c ∩ (τ = n) = (τ = n)\[A ∩ (τ = n)] ∈ Fn. Let (Ak)k≥0 be a sequence of events in Fτ; then Ak ∩ (τ = n) ∈ Fn for all k, and therefore
[∪_{k=0}^∞ Ak] ∩ (τ = n) = ∪_{k=0}^∞ (Ak ∩ (τ = n)) ∈ Fn. □
Exercise. Let τ be a stopping time on a filtered space (Ω, F, (Fn)n≥0, P). Then A ∈ Fτ if and only if A ∩ (τ ≤ n) ∈ Fn for all n.
Solution: If A ∩ (τ ≤ n) ∈ Fn for all n, then
A ∩ (τ = n) = A ∩ (τ ≤ n)\[A ∩ (τ ≤ n − 1)] ∈ Fn.
If A ∈ Fτ, then A ∩ (τ = k) ∈ Fk ⊆ Fn for all k ≤ n, and
A ∩ (τ ≤ n) = ∪_{k=0}^n [A ∩ (τ = k)] ∈ Fn.
Exercise. If τ, σ are stopping times, then τ ∧ σ, τ ∨ σ and τ + σ are stopping times.
Solution: Note that if τ, σ are stopping times then
(τ ∧ σ ≤ n) = (τ ≤ n) ∪ (σ ≤ n) ∈ Fn,
(τ ∨ σ ≤ n) = (τ ≤ n) ∩ (σ ≤ n) ∈ Fn,
(τ + σ ≤ n) = ∪_{k=0}^n [(τ = k) ∩ (σ ≤ n − k)] ∈ Fn   (since (σ ≤ n − k) ∈ Fn−k ⊆ Fn).
Remark. τ − σ is in general not a stopping time. To give an indication why (this is not a proof), note that
(τ − σ = n) = ∪_{k=n}^∞ (τ = k) ∩ (σ = k − n),
which involves events like (τ = n + 1) that force us to look beyond time n.
Example. Let (Xn)n≥0 be an adapted sequence of real-valued random variables, let τ be a finite stopping time, and let A be a Borel set. Then
(Xτ ∈ A) ∩ (τ = n) = (Xn ∈ A) ∩ (τ = n) ∈ Fn,
and hence (Xτ ∈ A) ∈ Fτ.

Theorem 70. (Optional Stopping Theorem) Let (Ω, F, P) be a probability triple. Let (Fn)n≥0 be a filtration on (Ω, F, P). Let σ and τ be stopping times such that
σ ≤ τ a.s.
Suppose further that there exists N ∈ N such that
τ ≤ N a.s.
Let (Mn)n≥0 be a martingale with respect to (Fn)n≥0. Then:
(1) For all A ∈ Fσ,
(1A(Mτ∧n − Mσ∧n))n≥0
is a martingale with respect to (Fn)n≥0.
(2) Mτ ∈ L1 and
(1.4) E[Mτ | Fσ] = Mσ.
The theorem also holds with “martingale” replaced by “supermartingale” or, respectively, “submartingale” throughout, and E[Mτ | Fσ] = Mσ replaced by E[Mτ | Fσ] ≤ Mσ or, respectively, E[Mτ | Fσ] ≥ Mσ in (2).

Proof. Part (1) simply involves applying Lemma 63 with
Ck = 1{ω : σ(ω) ≤ k < τ(ω)} 1A,
where A ∈ Fσ. It is evident that 0 ≤ Ck ≤ 1, but we need to check that Ck ∈ mFk for all k. Note
Ck = 1_{A ∩ (σ ≤ k) ∩ (τ ≤ k)^c}.
As A ∩ (σ ≤ k) ∈ Fk (since A ∈ Fσ) and (τ ≤ k) ∈ Fk, we have Ck ∈ mFk.
Now
(∫C dM)n = Σ_{k=0}^{n−1} Ck(Mk+1 − Mk)
= Σ_{k=0}^{n−1} 1{σ ≤ k < τ} 1A (Mk+1 − Mk)
= Σ_{k=0}^∞ 1(k < n) 1{σ ≤ k < τ} 1A (Mk+1 − Mk)
= 1A Σ_{k=0}^∞ 1_{(k < n) ∩ {σ ≤ k < τ}} (Mk+1 − Mk).
Note that
(k < n) ∩ {σ ≤ k < τ} ⟺ σ ∧ n ≤ k < τ ∧ n.
To see the ⟹ direction, note that k < n and σ ≤ k imply σ ∧ n = σ and so σ ∧ n ≤ k. Moreover, k < τ and k < n imply that k < τ ∧ n. For the reverse direction, if k < τ ∧ n, then k < n and k < τ. Also, since k < n and σ ∧ n ≤ k, we must have σ ≤ k.

Therefore,
(∫C dM)n = 1A Σ_{k=0}^∞ 1(σ∧n ≤ k < τ∧n) (Mk+1 − Mk)
= 1A Σ_{k=σ∧n}^{(τ∧n)−1} (Mk+1 − Mk),
with the “−1” due to the inequality k < τ ∧ n being equivalent to k ≤ (τ ∧ n) − 1. Due to telescoping-sum cancellation,
(∫C dM)n = 1A (Mτ∧n − Mσ∧n),
and so by Lemma 63, n ↦ 1A(Mτ∧n − Mσ∧n) is a martingale.


For (2), to show Mτ ∈ L1, we note that each term of the martingale in (1) is in L1, so if we take n = N, σ = 0 and A = Ω in (1), then the hypotheses of the Theorem are satisfied and we have
Mτ − M0 ∈ L1,
but M0 ∈ L1 due to M being a martingale, so Mτ ∈ L1. As the argument applies to any stopping time bounded by N, we may apply it with σ in place of τ (σ ≤ τ ≤ N) to get Mσ ∈ L1.
To show (1.4), we note that:
(a) For a Borel set B, we have
(Mσ ∈ B) ∩ (σ = n) = (Mn ∈ B) ∩ (σ = n) ∈ Fn,
and hence Mσ is Fσ-measurable.
(b) We already argued that Mσ ∈ L1.
(c) We need to show that
E[Mτ 1A] = E[Mσ 1A] ∀A ∈ Fσ.
By (1), since n ↦ 1A(Mτ∧n − Mσ∧n) is a martingale, we have for all n
E[1A(Mτ∧n − Mσ∧n)] = E[1A(Mτ∧0 − Mσ∧0)]
= E[1A(M0 − M0)]
= 0.
Now take n = N; then since τ, σ ≤ N a.s., we have τ ∧ N = τ and σ ∧ N = σ a.s., and so
E[1A(Mτ − Mσ)] = 0,
as required.
For the submartingale case, the only difference when proving (1) is that we use Exercise 65 instead of Lemma 63. For (2), we need to use Exercise 71 below instead of the definition of conditional expectation. More precisely, we need to show that
E[E[Mτ | Fσ]1A] ≥ E[Mσ 1A]
for all A ∈ Fσ. By the definition of conditional expectation, we have E[E[Mτ | Fσ]1A] = E[Mτ 1A], and so the above is equivalent to
E[Mτ 1A] ≥ E[Mσ 1A].

However, from the submartingale version of (1) we have that n ↦ 1A(Mτ∧n − Mσ∧n) is a submartingale, and so by taking n = N (so that τ ∧ N = τ and σ ∧ N = σ), we have
E[1A(Mτ − Mσ)] ≥ E[1A(Mτ∧0 − Mσ∧0)] = 0.
The supermartingale case is exactly the same as the submartingale case with inequalities reversed. □

Exercise 71. Let X, Y be L1, G-measurable random variables on (Ω, F, P), where G is a σ-subalgebra of F. If for all A ∈ G we have
E[(X − Y)1A] ≥ 0,
then X ≥ Y a.s.
Solution: As X, Y are G-measurable, X − Y is G-measurable. Therefore, A := (X − Y < 0) ∈ G, and we have
E[(X − Y)1A] ≤ 0;
combining this with the given inequality in the Exercise, we have E[(X − Y)1A] = 0. But since (X − Y)1A ≤ 0 for all ω, we have by Measure Theory/Maths of Random Events that
(X − Y)1A = 0 a.s.
As X − Y < 0 on A, we must have P(A) = 0, or equivalently X ≥ Y a.s.

Corollary 72. Let (Mn)n≥0 be a martingale on (Ω, F, (Fn)n≥0, P). Let τ′ be a stopping time. Then (Mτ′∧n)n≥0 is a martingale with respect to (Fn)n≥0. The same holds with “martingale” replaced by “submartingale” or “supermartingale” throughout.
Proof. Let N ∈ N. By taking τ = τ′ ∧ N, σ = 0 and A = Ω in Theorem 70, n ↦ Mτ′∧N∧n − M0 is a martingale. Since E[M0 | Fn] = M0 (as M0 ∈ mF0 ⊆ mFn and M0 ∈ L1), we have that n ↦ Mτ′∧N∧n is a martingale for all N ∈ N. In particular, we have Mτ′∧n ∈ L1 (by taking N = n) and Mτ′∧n ∈ mFn. Moreover, taking N = n + 1 and using that n ↦ Mτ′∧N∧n is a martingale, we have
E[Mτ′∧N∧(n+1) | Fn] = Mτ′∧N∧n a.s.,
which, because N = n + 1, gives
E[Mτ′∧(n+1) | Fn] = Mτ′∧n a.s. □
Corollary 73. (Also called Optional Stopping Theorem) Let (Mn)n≥0 be a martingale on (Ω, F, (Fn)n≥0, P). Let τ be a stopping time satisfying at least ONE of the three conditions below:
(i) there exists N ∈ N such that τ ≤ N a.s.;
(ii) τ < ∞ a.s. and there exists K ∈ R such that |Mn| ≤ K for all n a.s.;
(iii) E[τ] < ∞ and there exists K ∈ R such that |Mn+1 − Mn| ≤ K for all n a.s.
Then E[Mτ] = E[M0].
If “martingale” is replaced by “submartingale” or, respectively, “supermartingale”, then we will have E[Mτ] ≥ E[M0] or, respectively, E[Mτ] ≤ E[M0].

Proof. By Corollary 72, E[Mτ∧n] = E[M0] for all n.
For (i), we take n = N, which ensures that Mτ∧n = Mτ and so
E[Mτ] = E[M0].
For (ii), since τ < ∞ a.s., we have for almost all ω that there exists Nω ∈ N such that for all n ≥ Nω,
τ(ω) ∧ n = τ(ω).
As a result, for all n ≥ Nω,
|Mτ(ω)∧n(ω) − Mτ(ω)(ω)| = 0,
and this means Mτ∧n → Mτ a.s. as n → ∞.
Moreover, for all n,
|Mτ∧n| ≤ K,
and so by the Dominated Convergence Theorem, as n → ∞,
E[Mτ∧n] → E[Mτ].
Since E[Mτ∧n] = E[M0] for all n, we have
E[Mτ] = E[M0].
For (iii), we note that
Mτ∧n − M0 = Σ_{k=0}^{τ−1} (M(k+1)∧n − Mk∧n).
We have
|M(k+1)∧n − Mk∧n| = 0 if k ≥ n, and |M(k+1)∧n − Mk∧n| ≤ K if k < n,
which implies that
|Mτ∧n − M0| ≤ Kτ.
By the Dominated Convergence Theorem, since E[τ] < ∞ and we know from the same argument as in (ii) that Mτ∧n → Mτ (E[τ] < ∞ implies τ < ∞ a.s.), as n → ∞,
E[Mτ∧n − M0] → E[Mτ − M0].
However, E[Mτ∧n − M0] = 0, and so E[Mτ − M0] = 0. □
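As an illustration of Corollary 73 (a Python sketch; the symmetric ±1 walk and the barrier a = 10 are our own illustrative choices), consider a simple random walk stopped on first exiting (−a, a): this stopping time has E[τ] < ∞ and the increments are bounded by 1, so condition (iii) applies and E[Mτ] = E[M0] = 0.

```python
import random

rng = random.Random(3)

def stopped_value(a=10):
    """Simple random walk run until it first hits -a or +a; this stopping
    time has finite expectation and |M_{n+1} - M_n| <= 1, so (iii) applies."""
    M = 0
    while abs(M) < a:
        M += rng.choice((-1, 1))
    return M

paths = 50_000
print(sum(stopped_value() for _ in range(paths)) / paths)  # ~ 0 = E[M_0]
```

By contrast, if τ were the first hitting time of +a alone, then Mτ = a ≠ 0 = E[M0]; this does not contradict the Corollary, because none of (i)-(iii) holds there (E[τ] = ∞ and the walk is unbounded below).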
Monkey typing Shakespeare. We now recall the motivating application of martingales mentioned earlier: at each second, a Monkey types one letter, uniformly at random and independently of all other letters. Let τ be the first time the Monkey completes the sequence of letters ANA. We wish to find E[τ].
The trick is to construct a martingale Mn = An − n, where it is possible to calculate Aτ quite easily; then Corollary 73 (assuming the conditions of the Corollary are satisfied) implies that E[Aτ] − E[A0] = E[τ]. Following the construction of (Mn)n≥0 in Probability with Martingales by D. Williams, we imagine a casino in which the aforementioned Monkey is located. Just before the Monkey types the jth letter, exactly one gambler arrives, whom we call Gambler j. Gambler j puts £1 on the Monkey typing A at time j. If the Monkey types any letter other than A at time j, then Gambler j gets nothing back (losing the £1 stake) and leaves the casino. If the Monkey types A at time j, then Gambler j gets back £26 (which includes the £1 Gambler j put in). In this case, Gambler j puts all £26 on the Monkey typing the letter N at time j + 1. If the Monkey does not type N at time j + 1, then Gambler j loses the £26 and leaves the casino. Otherwise, Gambler j gets back £26² (which includes the £26 put in at time j + 1), which Gambler j then puts on the Monkey typing the letter A at time j + 2. If the Monkey types A at time j + 2, then Gambler j gets back £26³ and leaves; otherwise Gambler j leaves with nothing. The martingale Mn is supposed to represent the total net profit of all gamblers up to just before the Monkey types the (n + 1)th letter.
To define the filtration, let
Xn = 1 if the nth letter the Monkey typed is A;
Xn = 2 if the nth letter the Monkey typed is N;
Xn = 3 if the nth letter the Monkey typed is any of the other 24 letters;
and let Fn be the σ-algebra generated by X1, X2, . . . , Xn. This means we only care about whether the Monkey typed A, N or neither, rather than the exact letters the Monkey types.
We fix j for the rest of this paragraph and first focus on the net profit of Gambler j. Let
ξ_j^j = 25 if the Monkey types A at time j, and −1 otherwise;
ξ_{j+1}^j = 25 if the Monkey types N at time j + 1, and −1 otherwise;
ξ_{j+2}^j = 25 if the Monkey types A at time j + 2, and −1 otherwise;
ξ_n^j = 0 for all n ∉ {j, j + 1, j + 2}.
Then for all n, E[ξ_n^j] = 0 (and in fact the “25” is chosen to ensure exactly this). The sequence (ξ_n^j)n≥0 is a sequence of independent random variables: for n ∈ {j, j + 1, j + 2}, ξ_n^j depends only on the Monkey’s nth letter, which means ξ_j^j, ξ_{j+1}^j, ξ_{j+2}^j are independent random variables, and hence for Borel sets Aj, Aj+1, Aj+2,
P(ξ_j^j ∈ Aj, ξ_{j+1}^j ∈ Aj+1, ξ_{j+2}^j ∈ Aj+2) = P(ξ_j^j ∈ Aj) P(ξ_{j+1}^j ∈ Aj+1) P(ξ_{j+2}^j ∈ Aj+2),
and so
P(ξ_1^j ∈ A1, . . . , ξ_n^j ∈ An) = P(ξ_j^j ∈ Aj) P(ξ_{j+1}^j ∈ Aj+1) P(ξ_{j+2}^j ∈ Aj+2) if all of A1, . . . , Aj−1, Aj+3, . . . contain 0, and = 0 otherwise,
showing that (ξ_n^j)n≥0 is an independent sequence of random variables with zero expectation. By Example 57, for each j, the process (Y_n^j)n≥0 defined by
Y_n^j = Σ_{k=1}^n ξ_k^j for n > 0, and Y_0^j = 0,
is a martingale with respect to (Fn)n≥1. Motivated by Gambler j’s gambling strategy, we define C_k^j to be the stake Gambler j places on the (k + 1)th letter:
C_k^j = 26^{k−j+1} if j − 1 ≤ k ≤ j + 1 and ξ_i^j = 25 for all j ≤ i ≤ k (the latter condition being vacuous when k = j − 1, so that C_{j−1}^j = 1), and C_k^j = 0 otherwise.
Note that |C_k^j| ≤ 26³ for all k and j, and (C_n^j)n≥0 is adapted. By Lemma 63, ((∫C^j dY^j)_n)n≥0 is a martingale. The random variable (∫C^j dY^j)_n represents the net profit of Gambler j immediately after the nth letter has been typed.

Now we allow j to vary. Let Mn = Σ_{j=1}^n (∫C^j dY^j)_n, representing the net profit of all gamblers put together up to time n. Note that (Mn)n≥0 is also a martingale with respect to (Fn)n≥0 because
E[Mn+1 | Fn] = E[Σ_{j=1}^{n+1} (∫C^j dY^j)_{n+1} | Fn]
= Σ_{j=1}^{n+1} E[(∫C^j dY^j)_{n+1} | Fn]
= Σ_{j=1}^{n+1} (∫C^j dY^j)_n,   as n ↦ (∫C^j dY^j)_n is a martingale.
Note that, as ξ_k^{n+1} = 0 for all k ≤ n, we have (∫C^{n+1} dY^{n+1})_n = 0. Therefore,
E[Mn+1 | Fn] = Σ_{j=1}^n (∫C^j dY^j)_n = Mn a.s.

We now wish to apply Corollary 73(iii) to (Mn)n≥0. First we note that |Mn+1 − Mn| is the size of the net profit resulting from the Monkey’s (n + 1)th letter. At each time, there are at most 3 active gamblers, each having a net profit or loss of no more than £26³, and therefore for all n,
|Mn+1 − Mn| ≤ 3 · 26³.
To show that E[τ] < ∞, we will use the following Exercise from Sheet 4:
Exercise 74. Let τ be a stopping time on (Ω, F, (Fn)n≥0, P). Suppose there exist K > 0 and ε > 0 such that for all n,
E[1(τ ≤ n+K) | Fn] ≥ ε a.s.
Then E[τ] < ∞.
Here note that
Xn+1 = 1, Xn+2 = 2, Xn+3 = 1 ⟹ τ ≤ n + 3,
and therefore
1(Xn+1=1, Xn+2=2, Xn+3=1) ≤ 1(τ ≤ n+3),
and therefore
E[1(τ ≤ n+3) | Fn] ≥ E[1(Xn+1=1, Xn+2=2, Xn+3=1) | Fn].
By independence of Xn+1, Xn+2, Xn+3 from Fn, we have
E[1(Xn+1=1, Xn+2=2, Xn+3=1) | Fn] = E[1(Xn+1=1, Xn+2=2, Xn+3=1)]
= P(Xn+1 = 1, Xn+2 = 2, Xn+3 = 1)
= (1/26)³.
Therefore, by Exercise 74,
E[τ] < ∞.
By Corollary 73(iii), we have
E[Mτ] = E[M0].

Note that M0 = 0 (since (∫C^j dY^j)_0 = 0 for each j). Recall that Mτ is the net profit of all gamblers up to just after the Monkey has typed the first occurrence of ANA. At this time, Gambler τ − 2 has a net profit of 26³ − 1, Gambler τ − 1 has a net profit of −1, and Gambler τ has a net profit of 26 − 1. Gamblers 1 to τ − 3 have a net profit of −1 each. Therefore the total net profit is
Mτ = 26³ + 26 − τ,
and hence
0 = E[M0] = E[Mτ] = 26³ + 26 − E[τ],
from which we obtain E[τ] = 26³ + 26.
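The answer E[τ] = 26³ + 26 = 17602 can be checked by simulation (a Python sketch; encoding letters as integers with A = 0 and N = 13 is an arbitrary choice made for illustration):

```python
import random

rng = random.Random(4)

def tau_ana():
    """First time the consecutive letters A, N, A appear (A=0, N=13)."""
    last3 = (-1, -1, -1)
    t = 0
    while last3 != (0, 13, 0):
        t += 1
        last3 = (last3[1], last3[2], rng.randrange(26))
    return t

trials = 1000   # each trial takes ~17602 letters on average
print(sum(tau_ana() for _ in range(trials)) / trials)  # close to 26**3 + 26 = 17602
```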
Theorem 75. (Martingale Inequality) Let (Mn)n≥0 be a submartingale on (Ω, F, P, (Fn)n≥0). Let λ > 0. Then for N ∈ N,
P(max_{0≤n≤N} Mn ≥ λ) ≤ (1/λ) E[MN 1(max_{0≤n≤N} Mn ≥ λ)].
Proof. We will apply the Optional Stopping Theorem to
τ = min{0 ≤ n ≤ N : Mn ≥ λ} if there exists 0 ≤ k ≤ N with Mk ≥ λ, and τ = N if Mk < λ for all 0 ≤ k ≤ N.
(Note that (τ ≤ k) = ∪_{n=0}^k (Mn ≥ λ) ∈ Fk for k < N and (τ ≤ N) = Ω, so τ is indeed a stopping time.) Since τ ≤ N, by the Optional Stopping Theorem 70 (submartingale version, applied with the pair τ ≤ N),
E[Mτ 1F] ≤ E[MN 1F]
for any F ∈ Fτ.
We will show that
F = (max_{0≤n≤N} Mn ≥ λ) ∈ Fτ.
If k < N, since
τ ≤ k ⟹ max_{0≤n≤N} Mn ≥ λ,
we therefore have
F ∩ (τ ≤ k) = (τ ≤ k) ∈ Fk.
Now since (τ ≤ N) = Ω,
F ∩ (τ ≤ N) = {ω : max_{0≤n≤N} Mn ≥ λ} = ∪_{n=0}^N (Mn ≥ λ) ∈ FN.
Take F = (max_{0≤n≤N} Mn ≥ λ). Note that on F we have Mτ ≥ λ, and therefore
λ P(max_{0≤n≤N} Mn ≥ λ) ≤ E[Mτ 1F] ≤ E[MN 1F],
which is the claimed inequality. □

Corollary 76. Let (Mn)n≥0 be a submartingale on (Ω, F, P, (Fn)n≥0). Let λ > 0. Then for N ∈ N,
P(max_{0≤n≤N} Mn ≥ λ) ≤ (1/λ) E[M_N^+],
where M_N^+ = max(MN, 0).

Proof. By Theorem 75,
P(max_{0≤n≤N} Mn ≥ λ) ≤ (1/λ) E[MN 1(max_{0≤n≤N} Mn ≥ λ)]
≤ (1/λ) E[M_N^+ 1(max_{0≤n≤N} Mn ≥ λ)]
≤ (1/λ) E[M_N^+]. □
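A numerical check of Theorem 75 (a Python sketch; taking Mn = |Sn| for a simple random walk Sn, which is a submartingale by the conditional Jensen inequality recalled below, with arbitrary illustrative N and λ):

```python
import random

rng = random.Random(5)
N, lam, paths = 100, 15, 20_000

hits, weighted = 0, 0.0
for _ in range(paths):
    S, running_max = 0, 0
    for _ in range(N):
        S += rng.choice((-1, 1))
        running_max = max(running_max, abs(S))
    if running_max >= lam:
        hits += 1
        weighted += abs(S)   # accumulates M_N on the event {max >= lam}
print(hits / paths, weighted / (lam * paths))
# The first number (the probability) should not exceed the second (the bound).
```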

We would like to prove an Lp version of the martingale inequality, using the martingale inequality above. This requires the use of Hölder’s inequality.
As ST342 (Mathematics of Random Events) students have not yet covered Hölder’s inequality, we will review it as follows (just as we did for MA359 students with the Monotone Class Theorem, which was covered in ST342).
Recall we showed the conditional Jensen’s inequality: for a convex function g, a random variable X such that g(X), X ∈ L1, and a σ-subalgebra G of F,
E[g(X) | G] ≥ g(E[X | G]) a.s.
Taking G = {Ω, ∅}, we have the unconditional version of Jensen’s inequality:
E[g(X)] ≥ g(E[X]).
Recall that a function g is convex if for all x < y < z,
g(y) ≤ ((z − y)/(z − x)) g(x) + ((y − x)/(z − x)) g(z),
or equivalently
(1.5) (g(y) − g(x))/(y − x) ≤ (g(z) − g(y))/(z − y).
Suppose that g is twice continuously differentiable and g′′ ≥ 0; then g′ is increasing and, by the Mean Value Theorem, (1.5) holds. In particular, if g(x) = x^p with p ≥ 1, then g′′(x) = p(p − 1)x^{p−2} ≥ 0 for x > 0, and so for all non-negative random variables X such that E[X] < ∞ and E[X^p] < ∞,
E[X^p]^{1/p} ≥ E[X].
Lemma 77. (Hölder’s inequality) Let X, Y be non-negative random variables and let p, q ∈ (1, ∞) be such that 1/p + 1/q = 1. Then
∫_Ω XY dP ≤ (∫_Ω X^p dP)^{1/p} (∫_Ω Y^q dP)^{1/q}.

Proof. The inequality is trivial if Y = 0 a.s., and also if ‖X‖_Lp = ∞ or ‖Y‖_Lq = ∞. Otherwise,
‖Y‖_Lq = (∫ Y^q dP)^{1/q} > 0.
We consider
(1/‖Y‖_Lq^q) ∫_Ω XY dP = (1/‖Y‖_Lq^q) ∫_Ω XY 1(Y>0) dP
= ∫_Ω X Y^{1−q} 1(Y>0) · (Y^q/‖Y‖_Lq^q) dP.
Define Q(A) = ∫_A (Y^q/‖Y‖_Lq^q) dP. Then Q is a probability measure. It is immediate, by linearity, that for all simple functions f,
(1.6) ∫_Ω f (Y^q/‖Y‖_Lq^q) dP = ∫_Ω f dQ,
and the Monotone Convergence Theorem gives (1.6) for all non-negative measurable f. In particular,
∫_Ω X Y^{1−q} 1(Y>0) (Y^q/‖Y‖_Lq^q) dP = ∫_Ω X Y^{1−q} 1(Y>0) dQ.
Now by Jensen’s inequality (with g(x) = x^p) applied to ∫(· · ·) dQ,
(1/‖Y‖_Lq^q) ∫_Ω XY dP = ∫_Ω X Y^{1−q} 1(Y>0) dQ ≤ (∫_Ω X^p Y^{p−pq} 1(Y>0) dQ)^{1/p}.
Since 1/p + 1/q = 1 gives p − pq = −q, the right-hand side equals
(∫_Ω X^p Y^{−q} 1(Y>0) (Y^q/‖Y‖_Lq^q) dP)^{1/p} = (∫_Ω X^p 1(Y>0) dP)^{1/p} / ‖Y‖_Lq^{q/p}
≤ ‖X‖_Lp / ‖Y‖_Lq^{q/p},
and hence
∫_Ω XY dP ≤ ‖X‖_Lp ‖Y‖_Lq^{q − q/p} = ‖X‖_Lp ‖Y‖_Lq,
since q − q/p = q(1 − 1/p) = q · (1/q) = 1. □


Lemma 78. Let X and Y be non-negative random variables. If for all λ > 0,
P(X ≥ λ) ≤ E[Y 1(X≥λ)] / λ,
then for all 1 < p < ∞, we have
‖X‖_p ≤ (p/(p − 1)) ‖Y‖_p.
[Recall that ‖X‖_p = (∫ |X(ω)|^p dP(ω))^{1/p}.]

Proof. We assume that ‖Y‖_p < ∞; otherwise the result is trivial.
Case 1: ‖X‖_p < ∞.
Note that
E[X^p] = E[∫_0^X pλ^{p−1} dλ] = E[∫_0^∞ pλ^{p−1} 1(λ≤X) dλ].
By Fubini’s theorem (used twice),
E[X^p] = ∫_0^∞ pλ^{p−1} P(X ≥ λ) dλ
≤ ∫_0^∞ pλ^{p−2} E[Y 1(X≥λ)] dλ
= E[Y ∫_0^X pλ^{p−2} dλ]
= (p/(p − 1)) E[Y X^{p−1}].
By Hölder’s inequality (with exponents p and p/(p − 1)),
E[Y X^{p−1}] ≤ E[Y^p]^{1/p} E[X^p]^{(p−1)/p}.
Therefore, dividing by E[X^p]^{(p−1)/p} (which is finite in Case 1),
E[X^p]^{1/p} ≤ (p/(p − 1)) E[Y^p]^{1/p}.
Case 2: ‖X‖_p = ∞.
In this case, for n ∈ N, Xn = X ∧ n ∈ Lp.
We first claim that
P(X ∧ n ≥ λ) ≤ E[Y 1(X∧n≥λ)] / λ.
Note that for n ≥ λ,
X ≥ λ ⟺ X ∧ n ≥ λ.
Therefore, if n ≥ λ,
P(X ∧ n ≥ λ) ≤ E[Y 1(X∧n≥λ)] / λ.
If n < λ,
P(X ∧ n ≥ λ) = 0 = E[Y 1(X∧n≥λ)],
and so the claim is proved.
Applying Case 1 to Xn, we have
E[(Xn)^p]^{1/p} ≤ (p/(p − 1)) E[Y^p]^{1/p}.
By the Monotone Convergence Theorem,
E[X^p]^{1/p} ≤ (p/(p − 1)) E[Y^p]^{1/p},
which in particular shows that ‖X‖_p is finite after all, so this case in fact cannot occur when ‖Y‖_p < ∞. □
Corollary 79. (Martingale Lp inequality) Let N ∈ N ∪ {0}. Let (Mn)n≥0 be a submartingale on (Ω, F, P, (Fn)n≥0) such that
max_{0≤n≤N} Mn ≥ 0 a.s.
Then for 1 < p < ∞,
‖max_{0≤n≤N} Mn‖_p ≤ (p/(p − 1)) ‖MN‖_p.
Proof. This comes from applying the Martingale Inequality (Theorem 75) and Lemma 78 with X = max_{0≤n≤N} Mn and Y = |MN|. □

Martingale Convergence Theorem. Given a sequence (an)n≥0 of real numbers, define recursively
σ1 = min{n ≥ 0 : an < a},
τ1 = min{n ≥ σ1 : an > b},
...
σk = min{n ≥ τk−1 : an < a},
(1.7) τk = min{n ≥ σk : an > b},
with the convention that min ∅ = ∞. Define the number of upcrossings of [a, b] completed on or before index N as
UN((an)n≥0, [a, b]) = max{k ∈ {1, . . . , N} : τk ≤ N},
with max ∅ := 0, and similarly
U∞((an)n≥0, [a, b]) = max{k ≥ 1 : τk < ∞}.
If the set {k ≥ 1 : τk < ∞} is unbounded, then we write U∞((an)n≥0, [a, b]) = ∞.
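The recursion (1.7) is easy to implement directly; here is a short sketch (the example sequence is an arbitrary illustration) that counts completed upcrossings of [a, b]:

```python
def upcrossings(a_seq, a, b):
    """Number of upcrossings of [a, b] completed by a_seq, following the
    sigma/tau recursion (1.7): wait for a value < a, then wait for a
    value > b, and count each completed pair."""
    count = 0
    below = False          # True once the current sigma_k has occurred
    for x in a_seq:
        if not below and x < a:
            below = True   # sigma_k: dipped below a
        elif below and x > b:
            below = False  # tau_k: rose above b, one upcrossing completed
            count += 1
    return count

print(upcrossings([0, -1, 2, -1, 0, 3, -2], a=0, b=1))  # prints 2
```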
Corollary 80. Let (an)n≥0 be a sequence of real numbers. For all α < β, if U∞((an)n≥0, [α, β]) < ∞, then at least one of the two sets
Uβ = {n ≥ 0 : an > β},
Lα = {n ≥ 0 : an < α}
is finite.
Proof. We will prove this by contrapositive. Suppose that both Uβ and Lα are infinite sets. We will prove that U∞((an)n≥0, [α, β]) = ∞. It is sufficient to show that σi < ∞ and τi < ∞ for all i. Note that:
1. σ1 < ∞ because Lα ≠ ∅, so there is at least one n such that an < α;
2. if σi < ∞, then τi < ∞ because Uβ is infinite, so there is at least one number k satisfying k ≥ σi and ak > β;
3. if τi < ∞, then σi+1 < ∞ because Lα is infinite, so there is at least one number k satisfying k ≥ τi and ak < α.
By induction, σi, τi < ∞ for all i, and hence U∞((an)n≥0, [α, β]) = ∞. □
Lemma 81. Let (an)n≥0 be a sequence of real numbers such that
1. lim inf_{n→∞} |an| < ∞;
2. for all rational numbers α < β,
U∞((an)n≥0, [α, β]) < ∞.
Then (an)n≥0 converges.
Proof. By 1.,
m ↦ inf_{n≥m} |an|
is a nondecreasing sequence converging to lim inf_{n→∞} |an| < ∞, so there exists M ∈ R such that
inf_{n≥m} |an| < M
for all m. In particular, by the approximation property of the infimum, there is an infinite sequence of natural numbers k1 < k2 < k3 < . . . such that for all m,
|a_{km}| < M + 1.
By the Bolzano–Weierstrass theorem, (a_{km})m≥1 has a convergent subsequence (a_{k_{nj}})j≥1, converging to a limit l. Let ε > 0. There exists N ∈ N such that for all j ≥ N,
(1.8) l − ε/2 < a_{k_{nj}} < l + ε/2.
Let q1 < q2 be rational numbers such that q1, q2 ∈ (l + ε/2, l + ε). As (1.8) implies that Lq1 is infinite, Corollary 80 (applied to [q1, q2]) shows Uq2 is finite. Likewise, if r1 < r2 are rational numbers with r1, r2 ∈ (l − ε, l − ε/2), then (1.8) implies Ur2 is an infinite set, so Lr1 is a finite set. Therefore, for all but finitely many n,
l − ε < r1 ≤ an ≤ q2 < l + ε,
implying that (an)n≥0 converges (to l). □
Lemma 82. Let (Mn)n≥0 be an adapted sequence of random variables on (Ω, F, (Fn)n≥0, P). Then for each i, the times τi and σi defined in (1.7), with (an)n≥0 replaced by (Mn)n≥0, are stopping times.
Proof. We will prove this by strong induction. For k ∈ N, we have
(σ1 ≤ k) = ∪_{j=0}^k (Mj < α) ∈ Fk,
(τ1 ≤ k) = ∪_{j=0}^k [(σ1 = j) ∩ ∪_{m=j}^k (Mm > β)] ∈ Fk.
Assume that σi and τi are stopping times. Then
(σi+1 ≤ k) = ∪_{j=0}^k [(τi = j) ∩ ∪_{m=j}^k (Mm < α)] ∈ Fk,
(τi+1 ≤ k) = ∪_{j=0}^k [(σi+1 = j) ∩ ∪_{m=j}^k (Mm > β)] ∈ Fk. □
The corollary below ensures that the expectation E[UN(Mn, [α, β])] makes sense.
Corollary 83. Let (Mn)n≥0 be an adapted sequence of random variables on (Ω, F, (Fn)n≥0, P). Then for each α < β, UN(Mn, [α, β]) is a random variable.
Proof. Note that
(UN(Mn, [α, β]) ≥ j) = (τj ≤ N) ∈ F,
since τj is a stopping time by Lemma 82. □
Lemma 84. (Upcrossing Lemma) Let (Mn)n≥0 be a supermartingale on (Ω, F, P, (Fn)n≥0). Let N ∈ N. Then
E[(a − MN)+] / (b − a) ≥ E[UN(Mn, [a, b])].
Proof. We will use the martingale transform. Let (Cn)n≥0 be exactly 1 “during an upcrossing of (Mn)n≥0” and 0 when (Mn)n≥0 is not in the process of an upcrossing. More precisely, (Cn)n≥0 corresponds to the strategy of:
1. Place £1 at the first time (Mn)n≥0 falls below a.
2. Keep placing £1 until (but not including) the first time (Mn)n≥0 goes above b.
3. Stop placing bets until (Mn)n≥0 falls below a again.
4. Place £1 at the first time (Mn)n≥0 falls below a again, and then repeat 2., 3. and 4. indefinitely.
Mathematically,
Cn = 1 if ∃i : σi ≤ n < τi, and Cn = 0 otherwise.
Since σi and τi are both stopping times,
(σi ≤ n < τi) = (σi ≤ n) ∩ (τi ≤ n)^c ∈ Fn,
and so (Cn)n≥0 is an adapted sequence. Moreover,
Cn = 1_{∪i (σi ≤ n < τi)} ∈ {0, 1},
and so it is bounded. As a result, Exercise 65 is applicable.
Let i* be the largest integer such that
τ_{i*} ≤ N,
or equivalently i* = UN(Mn, [a, b]). This (Cn)n≥0 is chosen to ensure that
Σ_{k=0}^{N−1} Ck(Mk+1 − Mk) = Σ_{k=0}^{N−1} Σ_{i=1}^{i*+1} 1(σi ≤ k < τi)(Mk+1 − Mk)
= Σ_{i=1}^{i*+1} Σ_{k=0}^{N−1} 1(σi ≤ k < τi)(Mk+1 − Mk)
= Σ_{i=1}^{i*} Σ_{k=σi}^{τi−1} (Mk+1 − Mk) + Σ_{k=0}^{N−1} 1(σ_{i*+1} ≤ k)(Mk+1 − Mk)
= Σ_{i=1}^{i*} (M_{τi} − M_{σi}) + Σ_{k=0}^{N−1} 1(σ_{i*+1} ≤ k)(Mk+1 − Mk).
Note that for all i ≤ i*,
M_{τi} − M_{σi} ≥ b − a.
If σ_{i*+1} ≤ N − 1, then
Σ_{k=0}^{N−1} 1(σ_{i*+1} ≤ k)(Mk+1 − Mk) = MN − M_{σ_{i*+1}}.
Since M_{σ_{i*+1}} < a, we have
Σ_{k=0}^{N−1} 1(σ_{i*+1} ≤ k)(Mk+1 − Mk) ≥ MN − a.
If σ_{i*+1} > N − 1, then
Σ_{k=0}^{N−1} 1(σ_{i*+1} ≤ k)(Mk+1 − Mk) = 0.
Therefore,
Σ_{k=0}^{N−1} 1(σ_{i*+1} ≤ k)(Mk+1 − Mk) ≥ min(MN − a, 0) = −(a − MN)+.

Therefore,
Σ_{k=0}^{N−1} Ck(Mk+1 − Mk) ≥ UN(Mn, [a, b])(b − a) − (a − MN)+.
However, as n ↦ Σ_{k=0}^{n−1} Ck(Mk+1 − Mk) is a supermartingale (Exercise 65),
E[Σ_{k=0}^{N−1} Ck(Mk+1 − Mk)] ≤ E[(∫C dM)0] = 0,
and hence
0 ≥ E[UN(Mn, [a, b])](b − a) − E[(a − MN)+],
and therefore
E[(a − MN)+] / (b − a) ≥ E[UN(Mn, [a, b])]. □

Lemma 85. Let (an)n≥0 be a sequence of real numbers. Then
lim_{N→∞} UN(an, [α, β]) = U∞(an, [α, β]).
Proof. If U∞(an, [α, β]) < ∞, let i* = U∞(an, [α, β]); then there is no upcrossing after time τ_{i*}, because there are i* upcrossings in total and τ_{i*} is the completion time of the final upcrossing. Therefore, for all N ≥ τ_{i*},
UN(an, [α, β]) = U∞(an, [α, β]),
and so the Lemma follows in this case by letting N → ∞.
If U∞(an, [α, β]) = ∞, then τi < ∞ for all i. Note that
U_{τi}(an, [α, β]) = i,
as τi is the completion time of the ith upcrossing. Therefore, given any M ∈ N, for any N ≥ τM,
UN(an, [α, β]) ≥ M,
implying that UN(an, [α, β]) → ∞ as N → ∞. □

Theorem 86. (Martingale Convergence Theorem) Let (Mn)n≥0 be a supermartingale on (Ω, F, P, (Fn)n≥0). Suppose that
sup_{n≥0} E[|Mn|] < ∞.
Let M∞ := lim sup_{n→∞} Mn. Then (Mn)n≥0 converges a.s., and the limit is equal to M∞ a.s. Moreover, M∞ ∈ L1.
Proof. By Lemma 81 (applied pathwise, intersecting the almost-sure events over the countably many rational pairs a < b), it is sufficient to show that
P[lim inf_{n→∞} |Mn| < ∞] = 1
and that for all rational numbers a < b,
(1.9) P(U∞(M, [a, b]) < ∞) = 1.
For these, it is sufficient to show that E[lim inf_{n→∞} |Mn|] < ∞ and E[U∞(M, [a, b])] < ∞.

By Fatou’s Lemma,
(1.10) E[lim inf_{n→∞} |Mn|] ≤ lim inf_{n→∞} E[|Mn|] ≤ sup_{n≥0} E[|Mn|] < ∞.
By Lemma 84,
E[UN(Mn, [a, b])] ≤ E[(a − MN)+] / (b − a) ≤ (|a| + sup_{n≥0} E[|Mn|]) / (b − a).
By the Monotone Convergence Theorem (using Lemma 85 and Corollary 83),
E[U∞(Mn, [a, b])] ≤ (|a| + sup_{n≥0} E[|Mn|]) / (b − a) < ∞.
Finally, since Mn → M∞ a.s., we have |M∞| = lim inf_{n→∞} |Mn| a.s., so E[|M∞|] < ∞ follows from (1.10). □
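To see the theorem in action, take the product martingale of Example 58 with Xk uniform on {1/2, 3/2}: then Mn ≥ 0 and E[|Mn|] = E[Mn] = 1 for all n, so sup_n E[|Mn|] < ∞ and Theorem 86 applies. A Python sketch (the step distribution and path length are our own illustrative choices):

```python
import random

rng = random.Random(6)

# Product martingale of Example 58 with X_k uniform on {1/2, 3/2}:
# E[X_k] = 1, M_n >= 0 and E[|M_n|] = 1, so sup_n E[|M_n|] < infinity.
for path in range(3):
    M = 1.0
    for _ in range(200):
        M *= rng.choice((0.5, 1.5))
    print(path, M)   # each path has (essentially) reached its a.s. limit
```

Each path converges, and in fact the a.s. limit is 0 (log Mn is a random walk with negative drift), even though E[Mn] = 1 for every n; so Mn → M∞ almost surely but not in L1.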
