Martingales
Thomas Verdebout
Université Libre de Bruxelles
2015
1. A short introduction.
2. Basic probability review.
3. Martingales.
3.1. Definitions and examples.
3.2. Stopping times and the optional stopping theorem.
3.3. Sub- and super-martingales, limiting results.
4. Markov chains.
5. Markov processes, Poisson processes.
6. Brownian motions.
A lot of examples...
Example 1
Let $X_1, X_2, \ldots$ be independent, integrable r.v.s with common mean 0, and let $Y_n := \sum_{i=1}^n X_i$. Then $\{Y_n\}$ is a martingale w.r.t. $\{X_n\}$.
Indeed,
(i) is trivial.
(ii) is trivial.
(iii) $E[Y_{n+1} \mid X_1, \ldots, X_n] = Y_n + E[X_{n+1} \mid X_1, \ldots, X_n] = Y_n + E[X_{n+1}] = Y_n$.
Similarly, if $X_1, X_2, \ldots$ are independent, integrable r.v.s with common mean $\mu$, then $\{\sum_{i=1}^n X_i - n\mu\}$ is a martingale w.r.t. $\{X_n\}$.
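A quick numerical sanity check of Example 1 (a sketch only; the $\pm 1$ step distribution, sample sizes, and seed are arbitrary choices): a martingale has constant expectation, so the sample mean of $Y_n$ should stay near 0 for every $n$.

```python
import numpy as np

# Sketch: simulate the partial-sum martingale of Example 1.
# The +/-1 step distribution, sample sizes and seed are arbitrary choices.
rng = np.random.default_rng(0)
n_paths, n_steps = 100_000, 20

X = rng.choice([-1.0, 1.0], size=(n_paths, n_steps))  # mean-0 steps
Y = X.cumsum(axis=1)                                  # Y_n = X_1 + ... + X_n

# Martingales have constant expectation: E[Y_n] = 0 for every n.
mean_Y = Y.mean(axis=0)
print(np.abs(mean_Y).max())
```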
Example 2
Let $X_1, X_2, \ldots$ be independent, integrable r.v.s with common mean 1, and let $Y_n := \prod_{i=1}^n X_i$. Then $\{Y_n\}$ is a martingale w.r.t. $\{X_n\}$.
Indeed,
(i) is trivial.
(ii) is trivial.
(iii) $E[Y_{n+1} \mid X_1, \ldots, X_n] = Y_n \, E[X_{n+1}] = Y_n$ (by independence).
Similarly, if $X_1, X_2, \ldots$ are independent, integrable r.v.s with common mean $\mu \neq 0$, then $\{\mu^{-n} \prod_{i=1}^n X_i\}$ is a martingale w.r.t. $\{X_n\}$.
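The product martingale of Example 2 can be checked the same way (a sketch; the Uniform(0.5, 1.5) mean-1 factors and sample sizes are arbitrary assumptions):

```python
import numpy as np

# Sketch: simulate the product martingale of Example 2.
# Uniform(0.5, 1.5) factors (mean 1) are an arbitrary assumption.
rng = np.random.default_rng(1)
n_paths, n_steps = 200_000, 10

X = rng.uniform(0.5, 1.5, size=(n_paths, n_steps))  # E[X_i] = 1
Y = X.cumprod(axis=1)                               # Y_n = X_1 * ... * X_n

# Constant expectation along the path: E[Y_n] = 1 for every n.
print(Y.mean(axis=0))
```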
Example 2 (ctd.)
The previous example is related to basic models for stock prices.
Assume $X_1, X_2, \ldots$ are positive, independent, and integrable r.v.s, and let $Y_n := c \prod_{i=1}^n X_i$, where $c$ is the initial price.
The quantity $X_i - 1$ is the change in the value of the stock over a fixed time interval (one day, say) as a fraction of its current value. This multiplicative model (i) ensures nonnegativity of the price and (ii) is compatible with the fact that fluctuations in the value of a stock are roughly proportional to its price.
Various models are obtained for various distributions of the $X_i$'s.

Let $X_1, X_2, \ldots$ be i.i.d., with $P[X_i = 1] = p$ and $P[X_i = -1] = q = 1 - p$, and let $Y_n := \sum_{i=1}^n X_i$.
The stochastic process $\{Y_n\}$ is called a random walk.
Remarks:
- Of course, from Example 1, we know that $\{(\sum_{i=1}^n X_i) - n(p-q)\}$ is a martingale w.r.t. $\{X_n\}$.
- Also, $\{(q/p)^{Y_n}\}$ is a martingale w.r.t. $\{X_n\}$: (i) and (ii) are trivial, and (iii) follows since $E[(q/p)^{X_i}] = p(q/p) + q(p/q) = q + p = 1$.
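Both random-walk martingales can be checked numerically (a sketch; $p = 0.6$ and the sample sizes are arbitrary choices):

```python
import numpy as np

# Sketch: the two martingales attached to the p/q random walk.
# p = 0.6 and the sample sizes are arbitrary choices.
p, q = 0.6, 0.4
rng = np.random.default_rng(2)
n_paths, n_steps = 100_000, 15

X = rng.choice([1.0, -1.0], p=[p, q], size=(n_paths, n_steps))
Y = X.cumsum(axis=1)
n = np.arange(1, n_steps + 1)

# (a) Y_n - n(p - q) has constant expectation 0.
drift_corrected = Y - n * (p - q)
# (b) (q/p)^{Y_n} has constant expectation 1, since
#     E[(q/p)^{X_i}] = p(q/p) + q(p/q) = q + p = 1.
exp_mart = (q / p) ** Y

print(np.abs(drift_corrected.mean(axis=0)).max(),
      np.abs(exp_mart.mean(axis=0) - 1.0).max())
```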
Remarks:
- If $\{Z_n\}$ is a branching process with $Z_0 = 1$ and offspring mean $\mu$, then $\{Z_n/\mu^n\}$ is a martingale (exercise; (i) is trivial).
In particular, $E\big[\frac{Z_n}{\mu^n}\big] = E\big[\frac{Z_0}{\mu^0}\big] = 1$. Hence, $E[Z_n] = \mu^n$ for all $n$.
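The identity $E[Z_n] = \mu^n$ can be checked by simulation (a sketch; Poisson(1.2) offspring and the sample sizes are arbitrary assumptions):

```python
import numpy as np

# Sketch: check E[Z_n] = mu^n for a Galton-Watson branching process.
# Poisson(1.2) offspring and the sample sizes are arbitrary assumptions.
rng = np.random.default_rng(3)
mu, n_paths, n_gen = 1.2, 50_000, 8

Z = np.ones(n_paths, dtype=np.int64)   # Z_0 = 1 on every path
for _ in range(n_gen):
    # The Z individuals reproduce independently; a sum of Z i.i.d.
    # Poisson(mu) offspring counts is Poisson(mu * Z).
    Z = rng.poisson(mu * Z)

mean_ratio = Z.mean() / mu ** n_gen    # should be close to E[Z_0] = 1
print(mean_ratio)
```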
3.2. Stopping times and the optional stopping theorem.
Stopping times
(i) just makes (almost) sure that one will stop at some point.
The measurability requirement can be stated in several equivalent ways:
$$[T = n] \in \mathcal{A}_n \ \forall n \iff [T \le n] \in \mathcal{A}_n \ \forall n \iff [T > n] \in \mathcal{A}_n \ \forall n.$$
Indeed, the first equivalence follows from $[T \le n] = \bigcup_{k=1}^{n} [T = k]$ and $[T = n] = [T \le n] \setminus [T \le n-1]$, and the second from $[T > n] = [T \le n]^c$.
Stopped martingale
A key lemma:
Lemma: Let $(Y_n)$ be a martingale w.r.t. $(\mathcal{A}_n)$, and let $T$ be a stopping time w.r.t. $(\mathcal{A}_n)$. Then $\{Z_n := Y_{\min(n,T)}\}$ is a martingale w.r.t. $(\mathcal{A}_n)$.
Proof: note that
$$Z_n = Y_{\min(n,T)} = \sum_{k=0}^{n-1} Y_k I_{[T=k]} + Y_n I_{[T \ge n]}.$$
So
- each summand is $\mathcal{A}_n$-measurable, which gives (ii);
- $|Z_n| \le \sum_{k=0}^{n} |Y_k|$ is integrable, which gives (i).
Stopped martingale
(iii): we have
$$E[Z_{n+1} \mid \mathcal{A}_n] - Z_n = E[Z_{n+1} - Z_n \mid \mathcal{A}_n] = E[(Y_{n+1} - Y_n) I_{[T \ge n+1]} \mid \mathcal{A}_n]$$
$$= E[(Y_{n+1} - Y_n) I_{[T > n]} \mid \mathcal{A}_n] = I_{[T > n]} \, E[Y_{n+1} - Y_n \mid \mathcal{A}_n] = I_{[T > n]} \big( E[Y_{n+1} \mid \mathcal{A}_n] - Y_n \big) = 0.$$
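The lemma can be illustrated numerically: stopping a fair walk at the first hitting time of a level leaves its expectation at $E[Y_0] = 0$. A sketch, with arbitrary level, horizon, and sample sizes:

```python
import numpy as np

# Sketch: a stopped martingale keeps constant expectation.
# T = first hitting time of +3; level, horizon and sizes are arbitrary.
rng = np.random.default_rng(4)
n_paths, n_steps = 100_000, 30

X = rng.choice([1.0, -1.0], size=(n_paths, n_steps))
Y = np.concatenate([np.zeros((n_paths, 1)), X.cumsum(axis=1)], axis=1)

hit = Y >= 3
# First n with Y_n >= 3; if the level is never hit within the horizon,
# min(n_steps, T) = n_steps anyway, so cap T there.
T = np.where(hit.any(axis=1), hit.argmax(axis=1), n_steps)

Z = Y[np.arange(n_paths), np.minimum(n_steps, T)]  # Z_n = Y_{min(n,T)}
print(Z.mean())  # should be close to E[Y_0] = 0
```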
Stopped martingale
$E[Y_T] = E[Y_0]$ does not always hold.
Example: the doubling strategy, for which the winnings are
$$Y_n = \sum_{i=1}^{n} C_i X_i,$$
where $P[X_i = 1] = P[X_i = -1] = \frac{1}{2}$ and $C_i = 2^{i-1} b$.
Example (gambler's ruin): let the random walk start at $Y_0 = k$ and let $T$ be the first time it hits $0$ or $m$ ($0 < k < m$). Writing $p_k := P[Y_T = 0]$ for the ruin probability, one finds
$$p_k = \frac{m-k}{m} \ \text{ if } p = q = \tfrac{1}{2}, \qquad p_k = \frac{(q/p)^k - (q/p)^m}{1 - (q/p)^m} \ \text{ if } p \neq q.$$
For $p \neq q$, consider the martingale $R_n := Y_n - n(p-q)$ (Example 1). The optional stopping theorem gives $E[R_T] = E[R_0]$, where
$$E[R_T] = E[Y_T - T(p-q)] = E[Y_T] - (p-q)E[T], \qquad E[Y_T] = 0 \cdot p_k + m(1-p_k),$$
and
$$E[R_0] = E[Y_0 - \min(0,T)(p-q)] = E[Y_0] = E[k] = k.$$
Hence,
$$E[T] = \frac{m(1-p_k) - k}{p-q} = \frac{m\big(1-(q/p)^k\big) - k\big(1-(q/p)^m\big)}{(p-q)\big(1-(q/p)^m\big)}.$$
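A Monte-Carlo sketch of the $E[T]$ formula (the choices $p = 0.6$, $k = 3$, $m = 10$ are arbitrary):

```python
import numpy as np

# Sketch: compare the optional-stopping formula for E[T] with
# simulation.  p = 0.6, k = 3, m = 10 are arbitrary choices.
p, k, m = 0.6, 3, 10
q = 1.0 - p
r = q / p

p_k = (r ** k - r ** m) / (1.0 - r ** m)      # ruin probability
ET_formula = (m * (1.0 - p_k) - k) / (p - q)  # expected duration

rng = np.random.default_rng(5)
n_paths = 20_000
durations = np.empty(n_paths)
for j in range(n_paths):
    y, t = k, 0
    while 0 < y < m:                          # play until ruin or m
        y += 1 if rng.random() < p else -1
        t += 1
    durations[j] = t

print(ET_formula, durations.mean())
```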
3.3. Sub- and super-martingales, limiting results.
Remarks:
(a) If the first win occurs at game $n$, the winnings are
$$-\sum_{i=0}^{n-2} 2^i b + 2^{n-1} b = b.$$
(b) The strategy almost surely stops:
$$P[T < \infty] = \sum_{n=1}^{\infty} P[\text{$n-1$ first results are "odd", then "even"}] = \sum_{n=1}^{\infty} \Big(\frac{1}{2}\Big)^{n-1} \frac{1}{2} = \sum_{n=1}^{\infty} \Big(\frac{1}{2}\Big)^{n} = 1.$$
But
(c) The expected amount you lose just before you win is
$$0 \cdot \frac{1}{2} + b \Big(\frac{1}{2}\Big)^{2} + (b + 2b)\Big(\frac{1}{2}\Big)^{3} + \ldots + \Big(\sum_{i=0}^{n-2} 2^i b\Big)\Big(\frac{1}{2}\Big)^{n} + \ldots = \infty.$$
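The remarks above can be reproduced by simulation: the doubling strategy always nets exactly $b$, at the cost of exponentially growing stakes. A sketch ($b = 1$ and the seed are arbitrary choices):

```python
import numpy as np

# Sketch: the doubling strategy on fair coin flips always ends with
# winnings exactly b, with stake C_n = 2^(n-1) b.  b = 1 is arbitrary.
rng = np.random.default_rng(6)
b, n_paths = 1.0, 10_000

final = np.empty(n_paths)
for j in range(n_paths):
    winnings, n = 0.0, 0
    while True:
        n += 1
        stake = 2.0 ** (n - 1) * b   # double the stake after each loss
        if rng.random() < 0.5:       # win: pocket the stake and stop
            winnings += stake
            break
        winnings -= stake            # loss: keep doubling
    final[j] = winnings

print(final.min(), final.max())      # every path ends at exactly b
```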
$\mathcal{A}_0 = \{\emptyset, \Omega\}$.
$$Y_n^{(C)} := \sum_{i=1}^{n} C_i X_i.$$
A natural question:
Is there any way to choose $(C_n)$ so that $(Y_n^{(C)})$ is "nice"?
Consider the "blind" strategy $C_n = 1$ (for all $n$), that consists in betting 1 euro in each game, and denote by $(Y_n = \sum_{i=1}^n X_i)$ the corresponding process of winnings.
Then, here is the answer:
Theorem: Let $(C_n)$ be a gambling strategy with nonnegative and bounded r.v.s. Then if $(Y_n)$ is a martingale, so is $(Y_n^{(C)})$. If $(Y_n)$ is a submartingale, so is $(Y_n^{(C)})$. And if $(Y_n)$ is a supermartingale, so is $(Y_n^{(C)})$.
Proof:
(i) is trivial.
(ii): $|Y_n^{(C)}| \le \sum_{i=1}^n a_i |X_i|$, where $a_i$ is a bound on $C_i$.
(iii): since $X_{n+1} = Y_{n+1} - Y_n$ and $C_{n+1}$ is $\mathcal{A}_n$-measurable,
$$E[Y_{n+1}^{(C)} \mid \mathcal{A}_n] = E[Y_n^{(C)} + C_{n+1}(Y_{n+1} - Y_n) \mid \mathcal{A}_n] = Y_n^{(C)} + C_{n+1}\big(E[Y_{n+1} \mid \mathcal{A}_n] - Y_n\big),$$
which is $= Y_n^{(C)}$ in the martingale case, $\ge Y_n^{(C)}$ in the submartingale case, and $\le Y_n^{(C)}$ in the supermartingale case (using $C_{n+1} \ge 0$).
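A numerical sketch of the theorem: a predictable, nonnegative, bounded strategy cannot create drift from fair steps. The particular strategy "bet 2 after a loss, else 1" is an arbitrary choice:

```python
import numpy as np

# Sketch: a predictable, nonnegative, bounded strategy keeps the fair
# game fair.  "Bet 2 after a loss, else 1" is an arbitrary choice.
rng = np.random.default_rng(7)
n_paths, n_steps = 100_000, 20

X = rng.choice([1.0, -1.0], size=(n_paths, n_steps))

YC = np.zeros(n_paths)       # Y_n^{(C)} = sum_i C_i X_i
C = np.ones(n_paths)         # C_1 = 1 (constant, hence predictable)
for i in range(n_steps):
    YC += C * X[:, i]
    # The next stake depends only on the outcome just observed (the past).
    C = np.where(X[:, i] < 0, 2.0, 1.0)

print(YC.mean())             # E[Y_n^{(C)}] = 0 for every n
```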
Convergence of martingales
Theorem: let $(Y_n)$ be a submartingale w.r.t. $(\mathcal{A}_n)$. Assume that, for some $M$, $E[Y_n^+] \le M$ for all $n$. Then
(i) there exists $Y$ such that $Y_n \to Y$ a.s. as $n \to \infty$.
(ii) If $E[|Y_0|] < \infty$, then $E[|Y|] < \infty$.
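An illustration sketch of the theorem, using the martingale $M_n = (q/p)^{Y_n}$ from the random-walk example ($p = 0.6$ is an arbitrary choice): $M_n$ is nonnegative with $E[M_n] = 1$, so the theorem applies; here $Y_n \to +\infty$ a.s. forces $M_n \to 0$ a.s., so the a.s. limit need not capture the (constant) expectations.

```python
import numpy as np

# Sketch: the nonnegative martingale M_n = (q/p)^{Y_n} with p = 0.6
# converges a.s. (here to 0) although E[M_n] = 1 for every n.
p, q = 0.6, 0.4
rng = np.random.default_rng(8)
n_paths, n_steps = 2_000, 2_000

X = rng.choice([1, -1], p=[p, q], size=(n_paths, n_steps))
Y_final = X.sum(axis=1)              # drift: E[Y_n] = n(p - q) = 400
M_final = (q / p) ** Y_final.astype(float)

print(M_final.max())                 # all sampled paths already tiny
```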
Case 1: $\mu \le 1$.
If $p_0 := P[X_{n,i} = 0] > 0$, then $Z_n \xrightarrow{\text{a.s.}} Z_\infty := 0$ as $n \to \infty$.
Proof: let $g(s) := \sum_{k=0}^{\infty} s^k p_k$, where $p_k$, $k \in \mathbb{N}_0$, are the offspring probabilities, denote the generating function of the offspring distribution. Then
$$E[s^{Z_{n+1}} \mid Z_n] = \big( E[s^{X_{n+1,1}}] \big)^{Z_n} = \Big( \sum_{k=0}^{\infty} s^k p_k \Big)^{Z_n} = g(s)^{Z_n}.$$
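The generating function $g$ drives the extinction analysis: iterating $s \mapsto g(s)$ from $0$ gives $P[Z_n = 0]$ for successive $n$, and its limit, the smallest fixed point of $g$ in $[0,1]$, is the extinction probability. A sketch for a subcritical case (the offspring law $p_0, p_1, p_2 = 0.5, 0.3, 0.2$, so $\mu = 0.7 \le 1$, is an arbitrary assumption):

```python
import numpy as np

# Sketch: extinction probability as the limit of g-iterates from 0.
# Offspring law p_0, p_1, p_2 = 0.5, 0.3, 0.2 (mu = 0.7) is arbitrary.
probs = np.array([0.5, 0.3, 0.2])
mu = float((np.arange(len(probs)) * probs).sum())   # offspring mean

def g(s):
    """Generating function g(s) = sum_k s^k p_k."""
    return float(sum(p * s ** k for k, p in enumerate(probs)))

s = 0.0
for _ in range(200):
    s = g(s)        # s is now P[Z_n = 0] for the n-th iterate

print(mu, s)        # subcritical case: extinction is certain (s -> 1)
```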