
Chapter 5

Continuous-time Martingales

5.1 Definitions

Let (Ω, F, P ) be a probability space, and let (Ft , t ≥ 0) be a collection of sub-σ-algebras
of F, so that Ft ⊂ F for all t ≥ 0. Let Xt ≡ X(t), t ≥ 0, be a stochastic process.

Definition 5.1

• (Ft , t ≥ 0) is a filtration if Fs ⊂ Ft , ∀ 0 ≤ s ≤ t
(an increasing stream of information).

• Given {Xt , t ≥ 0}, Ft = σ(Xs , 0 ≤ s ≤ t) is called the natural filtration.

• (Ω, F, {Ft , t ≥ 0}, P ) is a filtered probability space.

• X = (Xt , t ≥ 0) is adapted w.r.t. {Ft } if Xt ∈ Ft , ∀t ≥ 0.

• A stochastic process (Xt , t ≥ 0) is said to be previsible or predictable w.r.t. (Ft ) if
Xt ∈ Ft− for all t ≥ 0, where

Ft− = σ( ∪_{h>0} F_{t−h} ) = σ( ∪_{n=1}^∞ F_{t−1/n} ).

(i.e., we only have information about X before t, but not at or after t.)

Definition 5.2 X is a martingale w.r.t. {Ft : t ≥ 0} if

(i) E|Xt | < ∞ for all t < ∞,


(ii) X is adapted to {Ft : t ≥ 0},
(iii) E[Xt |Fs ] = Xs for all 0 ≤ s ≤ t.

Remarks:

• X is a submartingale or supermartingale if (iii) above is replaced by

E[Xt |Fs ] ≥ Xs or E[Xt |Fs ] ≤ Xs , respectively.

• There are at least two ways to turn a submartingale (e.g., Bt² or exp(σBt )) into a
martingale: subtraction and division. Both approaches are useful in different circumstances.

• The first approach is a simple application of the Doob-Meyer decomposition, while the
second is useful in change of measures (see Girsanov theorem later).
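Concretely, the two approaches in the remark above can be written out as follows (a sketch; σ is a fixed constant, and we use the fact that E e^{σB_t} = e^{σ²t/2} since B_t ∼ N(0, t)):

```latex
% Subtraction: remove the increasing part (Doob--Meyer)
B_t^2 \;=\; \underbrace{\bigl(B_t^2 - t\bigr)}_{\text{martingale}}
       \;+\; \underbrace{t}_{\text{increasing}} .

% Division: normalize by the expectation
\frac{e^{\sigma B_t}}{\mathbb{E}\, e^{\sigma B_t}}
  \;=\; \frac{e^{\sigma B_t}}{e^{\sigma^2 t/2}}
  \;=\; \exp\!\Bigl(\sigma B_t - \tfrac{\sigma^2}{2}\, t\Bigr).
```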

5.2 Processes derived from Brownian motion, Bt

Example 5.1 The following are martingales w.r.t. the natural filtration.

Bt ,   Bt² − t,   Bt³ − 3tBt ,   exp(σBt ) / E exp(σBt ) = exp( σBt − (σ²/2) t ).

Proof. Let us just check Bt , leaving the others as exercises.

• Clearly, Bt ∈ σ(Bs , 0 ≤ s ≤ t).

• E|Bt | < ∞ as Bt ∼ N (0, t).

• E(Bt |Fs ) = E[(Bt − Bs ) + Bs |Fs ] = Bs + E[(Bt − Bs )|Fs ] = Bs + E[(Bt − Bs )] = Bs .
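As a quick numerical sanity check (not part of the proof), we can simulate the marginal B_t ∼ N(0, t) and verify that the expectations of the three martingales above match their time-0 values: E Bt = 0, E(Bt² − t) = 0, and E exp(σBt − σ²t/2) = 1. The sample size and parameter values below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_paths, t, sigma = 200_000, 2.0, 0.5

# B_t ~ N(0, t): simulate the time-t marginal directly
B_t = rng.normal(0.0, np.sqrt(t), size=n_paths)

print(np.mean(B_t))                                       # ~ 0
print(np.mean(B_t**2 - t))                                # ~ 0
print(np.mean(np.exp(sigma * B_t - 0.5 * sigma**2 * t)))  # ~ 1
```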

5.3 Processes derived from Poisson process

Let Nt be a Poisson process with intensity λ, and let Ft = σ(Ns , 0 ≤ s ≤ t). Then Nt is a
submartingale, while Nt − λt is a martingale (the compensated Poisson process).

Proof. Recall that Nt ∼ Poisson(λt).

1. Nt is a submartingale since

(a) Nt ∈ σ(Ns , 0 ≤ s ≤ t);


(b) E|Nt | = λt < ∞;
(c) for 0 ≤ s < t, we have

E(Nt |Fs ) = E[(Nt − Ns ) + Ns |Fs ] = Ns + E[(Nt − Ns )|Fs ]


= Ns + E[(Nt − Ns )] = Ns + (t − s)λ ≥ Ns .

2. Nt − λt is a martingale since

(a) Nt − λt ∈ σ(Ns , 0 ≤ s ≤ t);


(b) E|Nt − λt| ≤ ENt + λt = 2λt < ∞;
(c) for 0 ≤ s < t, using (c) of part 1, we have

E(Nt − λt|Fs ) = E(Nt |Fs ) − λt = Ns + (t − s)λ − λt = Ns − λs.
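The two conclusions above can also be checked numerically. Simulating the time-t marginal Nt ∼ Poisson(λt), the sample mean of Nt should sit near λt (the upward drift of the submartingale), while the compensated process Nt − λt should have mean near 0. The parameter values below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, t, n_paths = 3.0, 5.0, 200_000

# N_t ~ Poisson(lam * t): simulate the time-t marginal
N_t = rng.poisson(lam * t, size=n_paths)

M_t = N_t - lam * t  # compensated Poisson process at time t

print(np.mean(N_t))  # ~ lam * t = 15 (submartingale drifts upward)
print(np.mean(M_t))  # ~ 0 (martingale: mean stays at its initial value)
```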

5.4 Doob-Meyer’s decomposition theorem

Theorem 5.1 (Doob-Meyer’s decomposition theorem)

1. If X is a (local) submartingale, there exist a local martingale M (t) and a unique increasing,
predictable, locally integrable process A(t) such that

X(t) = X(0) + M (t) + A(t).

2. If X(t) is a submartingale of Dirichlet class (D), then the process A is integrable, that is,
supt EA(t) < ∞, and M (t) is a uniformly integrable martingale.
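The compensated Poisson process of Section 5.3 gives the simplest concrete instance of this decomposition:

```latex
N_t \;=\; N_0
  \;+\; \underbrace{(N_t - \lambda t)}_{M(t)\text{, martingale}}
  \;+\; \underbrace{\lambda t}_{A(t)\text{, increasing, predictable}},
\qquad N_0 = 0 .
```

Here A(t) = λt is deterministic, hence predictable, and EA(t) = λt < ∞ for each t, so A is locally integrable.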

5.5 Exercises
1. Which of the following stochastic processes Xt are adapted to σ(Bs , 0 ≤ s ≤ t):

(i) Xt = ∫_0^t Bs ds,

(ii) Xt = max_{0≤s≤t} Bs ,

(iii) Xt = B_{t+1} − Bt .

2. Given a r.v. H ∈ FT with H ∈ L1 (T < ∞), show that Mt := E(H|Ft ) is a martingale.

3. Suppose that {Xt }t≥0 is a process with independent increments and EXt ≡ C. Let
Ft = σ(Xs , 0 ≤ s ≤ t). Show that Xt is a martingale w.r.t. Ft .

4. (Optional) Define Ft = σ({Bs , s ≤ t}).

(a) For any θ ∈ R, show that Xt = exp{θBt − (θ²/2) t} is a martingale.

(b) Define the Hermite polynomials Hn (t, x) by

exp{θx − (θ²/2) t} := Σ_{n=0}^∞ (θⁿ/n!) Hn (t, x).

i. From (a), E(Xt |Fs ) = Xs , ∀ 0 ≤ s ≤ t; show that

Σ_{n=0}^∞ (θⁿ/n!) E( Hn (t, Bt ) | Fs ) = Σ_{n=0}^∞ (θⁿ/n!) Hn (s, Bs ).

ii. Further, by comparing coefficients of θⁿ, show that Hn (t, Bt ) is a martingale for each n ≥ 1.

iii. Find Hn (t, Bt ) for n = 1, 2, 3, 4.
