
Brownian Motion

Exercises
Exercise 9.1. Let $Z$ be a standard normal random variable. For all $t \geq 0$, let $X_t = \sqrt{t}\,Z$. The stochastic process $X = \{X_t : t \geq 0\}$ has continuous paths and, for all $t \geq 0$, $X_t \sim N(0, t)$. Is $X$ a Brownian motion? Justify. (ref. Baxter and Rennie, p. 49)
Exercise 9.2. Let $W$ and $\widetilde{W}$ be two independent Brownian motions and let $\rho$ be a constant in the unit interval. For all $t \geq 0$, let $X_t = \rho W_t + \sqrt{1-\rho^2}\,\widetilde{W}_t$. The stochastic process $X = \{X_t : t \geq 0\}$ has continuous paths and, for all $t \geq 0$, $X_t \sim N(0, t)$. Is $X$ a Brownian motion? Justify. (ref. Baxter and Rennie, p. 49)
Exercise 9.3. Let $W$ be a Brownian motion on the filtered probability space $(\Omega, \mathcal{F}, \mathbb{F} = \{\mathcal{F}_t : t \geq 0\}, P)$. Let $X_t = \exp\left[\sigma W_t - \frac{\sigma^2}{2} t\right]$. Show that $X = \{X_t : t \geq 0\}$ is a martingale.
Exercise 9.4. Let $W$ be a Brownian motion on the filtered probability space $(\Omega, \mathcal{F}, \mathbb{F} = \{\mathcal{F}_t : t \geq 0\}, P)$. Show that $\{W_t^2 - t : t \geq 0\}$ is a martingale.
Exercise 9.5. Let $W$ be a Brownian motion. Show that
\[ \mathrm{Cov}[W_t, W_s] = \min(s, t). \]
Exercise 9.6. Let $W$ be a Brownian motion. Show that

(i) for all $s > 0$, $\{W_{t+s} - W_s : t \geq 0\}$,

(ii) $\{-W_t : t \geq 0\}$,

(iii) $\{c W_{t/c^2} : t \geq 0\}$,

(iv) $\{V_t : t \geq 0\}$, where $V_0 = 0$ and $V_t = t W_{1/t}$ if $t > 0$,

are Brownian motions.


Exercise 9.7. Let $B$ be a four-dimensional Brownian motion with
\[ \mathrm{Corr}[B_t] = \begin{pmatrix} 1 & 0.5 & 0.8 & 0.1 \\ 0.5 & 1 & 0.3 & 0.4 \\ 0.8 & 0.3 & 1 & 0.1 \\ 0.1 & 0.4 & 0.1 & 1 \end{pmatrix}. \]
Find the matrix $A$ such that $B = AW$, where $W$ is a four-dimensional Brownian motion with independent components.
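A standard way to obtain such a matrix is the Cholesky factorization of the correlation matrix: if $A$ is lower triangular with $AA^{\top} = \mathrm{Corr}[B_t]$, then $B = AW$ has the required correlation structure. A minimal numerical sketch (NumPy is used here purely for illustration):

```python
import numpy as np

# Correlation matrix of the four-dimensional Brownian motion B from Exercise 9.7
corr = np.array([
    [1.0, 0.5, 0.8, 0.1],
    [0.5, 1.0, 0.3, 0.4],
    [0.8, 0.3, 1.0, 0.1],
    [0.1, 0.4, 0.1, 1.0],
])

# Cholesky factorization: A is lower triangular with A @ A.T == corr,
# so B_t = A @ W_t has Corr[B_t] = corr when W has independent components.
A = np.linalg.cholesky(corr)
print(np.round(A, 4))
```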

Solutions
1 Exercise 9.1
No, since for $0 \leq s \leq t < \infty$,
\[ \mathrm{Var}[X_t - X_s] = \mathrm{Var}\left[\sqrt{t}\,Z - \sqrt{s}\,Z\right] = \left(\sqrt{t} - \sqrt{s}\right)^2 \mathrm{Var}[Z] = t - 2\sqrt{ts} + s \neq t - s. \]
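The failure of the increment-variance property can be checked numerically; the sketch below, with arbitrarily chosen times $s$ and $t$, compares $(\sqrt{t}-\sqrt{s})^2$ with $t - s$:

```python
import math

s, t = 1.0, 4.0  # arbitrary sample times with 0 <= s <= t

# Var[X_t - X_s] for X_t = sqrt(t) * Z: Var[(sqrt(t) - sqrt(s)) Z] = (sqrt(t) - sqrt(s))^2
var_increment = (math.sqrt(t) - math.sqrt(s)) ** 2

print(var_increment, t - s)  # 1.0 vs 3.0: the increment variance is not t - s
```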

2 Exercise 9.2
Yes. It suffices to verify that (i) the time increments are independent and (ii) for all $0 \leq s \leq t < \infty$, $X_t - X_s \sim N(0, t-s)$.

(ii)
\[ X_t - X_s = \underbrace{\rho \left(W_t - W_s\right)}_{\sim\, N(0,\, \rho^2 (t-s))} + \underbrace{\sqrt{1-\rho^2}\left(\widetilde{W}_t - \widetilde{W}_s\right)}_{\sim\, N(0,\, (1-\rho^2)(t-s))}. \]
Since the two terms on the right-hand side are independent Gaussian random variables with zero expectation, their sum is also a zero-expectation Gaussian random variable. Finally,
\[ \mathrm{Var}[X_t - X_s] = \rho^2 (t-s) + (1-\rho^2)(t-s) = t - s, \]
which completes the first part.
(i) Let $0 \leq t_1 \leq t_2 \leq t_3 \leq t_4 < \infty$. Since $X_{t_2} - X_{t_1}$ and $X_{t_4} - X_{t_3}$ have a Gaussian distribution, it suffices to show that their covariance is zero:
\begin{align*}
&\mathrm{Cov}\left[X_{t_2} - X_{t_1},\, X_{t_4} - X_{t_3}\right] \\
&= \mathrm{Cov}\left[\rho \left(W_{t_2} - W_{t_1}\right) + \sqrt{1-\rho^2}\left(\widetilde{W}_{t_2} - \widetilde{W}_{t_1}\right),\; \rho \left(W_{t_4} - W_{t_3}\right) + \sqrt{1-\rho^2}\left(\widetilde{W}_{t_4} - \widetilde{W}_{t_3}\right)\right] \\
&= \rho^2\, \mathrm{Cov}\left[W_{t_2} - W_{t_1},\, W_{t_4} - W_{t_3}\right] + \rho\sqrt{1-\rho^2}\, \mathrm{Cov}\left[W_{t_2} - W_{t_1},\, \widetilde{W}_{t_4} - \widetilde{W}_{t_3}\right] \\
&\quad + \rho\sqrt{1-\rho^2}\, \mathrm{Cov}\left[\widetilde{W}_{t_2} - \widetilde{W}_{t_1},\, W_{t_4} - W_{t_3}\right] + (1-\rho^2)\, \mathrm{Cov}\left[\widetilde{W}_{t_2} - \widetilde{W}_{t_1},\, \widetilde{W}_{t_4} - \widetilde{W}_{t_3}\right] \\
&= 0,
\end{align*}
since the independence of the Brownian increments implies that
\[ \mathrm{Cov}\left[W_{t_2} - W_{t_1},\, W_{t_4} - W_{t_3}\right] = 0 \quad \text{and} \quad \mathrm{Cov}\left[\widetilde{W}_{t_2} - \widetilde{W}_{t_1},\, \widetilde{W}_{t_4} - \widetilde{W}_{t_3}\right] = 0, \]
and the independence between the two Brownian motions implies that
\[ \mathrm{Cov}\left[W_{t_2} - W_{t_1},\, \widetilde{W}_{t_4} - \widetilde{W}_{t_3}\right] = 0 \quad \text{and} \quad \mathrm{Cov}\left[\widetilde{W}_{t_2} - \widetilde{W}_{t_1},\, W_{t_4} - W_{t_3}\right] = 0. \]

3 Exercise 9.3
(i) Integrability:
\begin{align*}
E\left[|X_t|\right] &= E\left[\left|\exp\left(\sigma W_t - \frac{\sigma^2}{2} t\right)\right|\right] = E\left[\exp\left(\sigma W_t - \frac{\sigma^2}{2} t\right)\right] \\
&= \int_{-\infty}^{\infty} \exp\left(\sigma w - \frac{\sigma^2}{2} t\right) \frac{1}{\sqrt{2\pi t}} \exp\left(-\frac{w^2}{2t}\right) dw \\
&= \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi t}} \exp\left(-\frac{w^2 - 2\sigma t w + \sigma^2 t^2}{2t}\right) dw \\
&= \underbrace{\int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi t}} \exp\left(-\frac{(w - t\sigma)^2}{2t}\right) dw}_{\text{integral of the } N(t\sigma,\, t) \text{ density}} = 1 < \infty.
\end{align*}

(ii) Since $X_t$ is a continuous function of $\mathcal{F}_t$-measurable random variables, $X_t$ is itself $\mathcal{F}_t$-measurable.

(iii) For all $0 \leq s \leq t < \infty$,
\begin{align*}
E\left[X_t \mid \mathcal{F}_s\right] &= X_s\, E\left[\frac{X_t}{X_s} \,\Big|\, \mathcal{F}_s\right] \quad \text{since } X_s > 0 \\
&= X_s\, E\left[\frac{\exp\left(\sigma W_t - \frac{\sigma^2}{2} t\right)}{\exp\left(\sigma W_s - \frac{\sigma^2}{2} s\right)} \,\Big|\, \mathcal{F}_s\right] \\
&= X_s\, E\left[\exp\left(\sigma \left(W_t - W_s\right) - \frac{\sigma^2}{2}(t-s)\right) \Big|\, \mathcal{F}_s\right] \\
&= X_s\, E\left[\exp\left(\sigma \left(W_t - W_s\right) - \frac{\sigma^2}{2}(t-s)\right)\right] \quad \text{since } W_t - W_s \text{ is independent of } \mathcal{F}_s \\
&= X_s \int_{-\infty}^{\infty} \exp\left(\sigma w - \frac{\sigma^2}{2}(t-s)\right) \frac{1}{\sqrt{2\pi (t-s)}} \exp\left(-\frac{w^2}{2(t-s)}\right) dw \\
&= X_s \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi (t-s)}} \exp\left(-\frac{w^2 - 2\sigma (t-s) w + \sigma^2 (t-s)^2}{2(t-s)}\right) dw \\
&= X_s \underbrace{\int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi (t-s)}} \exp\left(-\frac{(w - (t-s)\sigma)^2}{2(t-s)}\right) dw}_{\text{integral of the } N((t-s)\sigma,\, t-s) \text{ density}} = X_s.
\end{align*}
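The integrability computation above says in particular that $E[X_t] = 1$ for every $t$, which is easy to confirm by Monte Carlo simulation (the values of $\sigma$ and $t$ below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)
sigma, t, n = 0.5, 2.0, 400_000  # arbitrary sigma and t

# X_t = exp(sigma * W_t - sigma^2 * t / 2) should have expectation 1 for every t.
W_t = rng.normal(0.0, np.sqrt(t), size=n)
X_t = np.exp(sigma * W_t - 0.5 * sigma**2 * t)

print(X_t.mean())  # should be close to 1
```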

4 Exercise 9.4
First, $W_t^2 - t$ is $\mathcal{F}_t$-measurable since it is a continuous function of $W_t$, which is $\mathcal{F}_t$-measurable. Second,
\[ E\left[\left|W_t^2 - t\right|\right] \leq E\left[W_t^2\right] + t = t + t = 2t < \infty. \]
Third, for all $0 \leq s \leq t$,
\begin{align*}
E\left[W_t^2 - t \mid \mathcal{F}_s\right] &= E\left[\left(W_t - W_s + W_s\right)^2 - t \mid \mathcal{F}_s\right] \\
&= E\left[\left(W_t - W_s\right)^2 + 2 W_s \left(W_t - W_s\right) + W_s^2 - t \mid \mathcal{F}_s\right] \\
&= \underbrace{E\left[\left(W_t - W_s\right)^2 \mid \mathcal{F}_s\right]}_{=\, E\left[(W_t - W_s)^2\right] \,=\, t - s} + 2 W_s \underbrace{E\left[W_t - W_s \mid \mathcal{F}_s\right]}_{=\, E\left[W_t - W_s\right] \,=\, 0} + W_s^2 - t \\
&= W_s^2 - s.
\end{align*}
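The conditional expectation can be illustrated by Monte Carlo simulation: conditioning on a fixed value $W_s = w_s$ and averaging $W_t^2 - t$ should return approximately $w_s^2 - s$ (the values of $s$, $t$ and $w_s$ below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
s, t, n = 1.0, 3.0, 200_000
w_s = 0.7  # an arbitrary fixed value of W_s to condition on

# Given W_s = w_s, W_t = w_s + (W_t - W_s) with an independent N(0, t - s) increment.
W_t = w_s + rng.normal(0.0, np.sqrt(t - s), size=n)

print((W_t**2 - t).mean())  # should be close to w_s^2 - s = -0.51
```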

5 Exercise 9.5
Without loss of generality, let $0 < s < t$. Then
\begin{align*}
\mathrm{Cov}\left[W_t, W_s\right] &= \mathrm{Cov}\left[W_t - W_s + W_s,\, W_s\right] \\
&= \mathrm{Cov}\left[W_t - W_s,\, W_s\right] + \mathrm{Cov}\left[W_s, W_s\right] \\
&= \mathrm{Cov}\left[W_t - W_s,\, W_s - W_0\right] + \mathrm{Var}\left[W_s\right] \\
&= 0 + s \quad \text{because the increments of } W \text{ are independent} \\
&= \min(s, t) \quad \text{since } s < t.
\end{align*}
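A quick Monte Carlo check of the covariance formula, building $(W_s, W_t)$ from independent increments exactly as in the proof:

```python
import numpy as np

rng = np.random.default_rng(1)
s, t, n = 1.0, 4.0, 200_000  # arbitrary times with 0 < s < t

# Build (W_s, W_t) from independent increments: W_s is the increment on [0, s],
# and W_t = W_s + an independent increment on [s, t].
W_s = rng.normal(0.0, np.sqrt(s), size=n)
W_t = W_s + rng.normal(0.0, np.sqrt(t - s), size=n)

print(np.cov(W_s, W_t)[0, 1])  # should be close to min(s, t) = 1.0
```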

6 Exercise 9.6
(i) Let $Z_t = W_{t+s} - W_s$.

(MB1) $Z_0 = W_s - W_s = 0$.

(MB2) Since $Z_{t_k} - Z_{t_{k-1}} = \left(W_{t_k+s} - W_s\right) - \left(W_{t_{k-1}+s} - W_s\right) = W_{t_k+s} - W_{t_{k-1}+s}$, and since for all $0 \leq t_0 < t_1 < \cdots < t_k$ the random variables $W_{t_1+s} - W_{t_0+s},\, W_{t_2+s} - W_{t_1+s},\, \ldots,\, W_{t_k+s} - W_{t_{k-1}+s}$ are independent, it follows that for all $0 \leq t_0 < t_1 < \cdots < t_k$ the random variables $Z_{t_1} - Z_{t_0},\, Z_{t_2} - Z_{t_1},\, \ldots,\, Z_{t_k} - Z_{t_{k-1}}$ are independent.

(MB3) For all $u, t \geq 0$ such that $u < t$, $Z_t - Z_u = \left(W_{t+s} - W_s\right) - \left(W_{u+s} - W_s\right) = W_{t+s} - W_{u+s}$ has a Gaussian distribution with expectation $0$ and variance $(t+s) - (u+s) = t - u$.

(MB4) For all $\omega \in \Omega$, the path $t \mapsto Z_t(\omega) = W_{t+s}(\omega) - W_s(\omega)$ is continuous since $t \mapsto W_t(\omega)$ is continuous.

(ii) Let $Y_t = -W_t$.

(MB1) $Y_0 = -W_0 = 0$.

(MB2) Since $Y_{t_k} - Y_{t_{k-1}} = W_{t_{k-1}} - W_{t_k}$ and for all $0 \leq t_0 < t_1 < \cdots < t_k$ the random variables $W_{t_1} - W_{t_0},\, W_{t_2} - W_{t_1},\, \ldots,\, W_{t_k} - W_{t_{k-1}}$ are independent, the random variables $W_{t_0} - W_{t_1},\, W_{t_1} - W_{t_2},\, \ldots,\, W_{t_{k-1}} - W_{t_k}$ are also independent, which implies that $Y_{t_1} - Y_{t_0},\, Y_{t_2} - Y_{t_1},\, \ldots,\, Y_{t_k} - Y_{t_{k-1}}$ are independent.

(MB3) For all $s, t \geq 0$ such that $s < t$, $Y_t - Y_s = W_s - W_t$ has a Gaussian distribution with expectation $0$ and variance $t - s$.

(MB4) For all $\omega \in \Omega$, the path $t \mapsto Y_t(\omega) = -W_t(\omega)$ is continuous since $t \mapsto W_t(\omega)$ is continuous.
(iii) Let $X_t = c W_{t/c^2}$.

(MB1) $X_0 = c W_0 = 0$.

(MB2) Since $X_{t_k} - X_{t_{k-1}} = c W_{t_k/c^2} - c W_{t_{k-1}/c^2}$ and for all $0 \leq t_0 < t_1 < \cdots < t_k$ the random variables $W_{t_1/c^2} - W_{t_0/c^2},\, W_{t_2/c^2} - W_{t_1/c^2},\, \ldots,\, W_{t_k/c^2} - W_{t_{k-1}/c^2}$ are independent, the random variables $c W_{t_1/c^2} - c W_{t_0/c^2},\, \ldots,\, c W_{t_k/c^2} - c W_{t_{k-1}/c^2}$ are also independent, which implies that $X_{t_1} - X_{t_0},\, X_{t_2} - X_{t_1},\, \ldots,\, X_{t_k} - X_{t_{k-1}}$ are independent.

(MB3) Since $cW$ is Gaussian if $W$ is, for all $s, t \geq 0$ such that $s < t$, $X_t - X_s = c\left(W_{t/c^2} - W_{s/c^2}\right)$ is Gaussian with expectation $E\left[X_t - X_s\right] = c\, E\left[W_{t/c^2} - W_{s/c^2}\right] = 0$ and variance
\[ \mathrm{Var}\left[X_t - X_s\right] = c^2\, \mathrm{Var}\left[W_{t/c^2} - W_{s/c^2}\right] = c^2 \left(\frac{t}{c^2} - \frac{s}{c^2}\right) = t - s. \]

(MB4) For all $\omega \in \Omega$, the path $t \mapsto X_t(\omega) = c W_{t/c^2}(\omega)$ is continuous since $t \mapsto W_t(\omega)$ is.
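A Monte Carlo check of the scaling property (arbitrary $c$, $s$, $t$ below): the increment of $X_t = c W_{t/c^2}$ should have variance $t - s$:

```python
import numpy as np

rng = np.random.default_rng(5)
c, s, t, n = 2.0, 1.0, 3.0, 200_000  # arbitrary scaling constant and times

# X_t - X_s = c * (W_{t/c^2} - W_{s/c^2}); the inner increment has
# variance t/c^2 - s/c^2, so the scaled increment should have variance t - s.
dW = rng.normal(0.0, np.sqrt(t / c**2 - s / c**2), size=n)
dX = c * dW

print(dX.var())  # should be close to t - s = 2.0
```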

(iv) Let
\[ V_t = \begin{cases} 0 & \text{if } t = 0, \\ t W_{1/t} & \text{if } t > 0. \end{cases} \]

(MB1) $V_0 = 0$ by the definition of $V$.

(MB3) For all $s, t \geq 0$ such that $0 < s < t$,
\begin{align*}
V_t - V_s &= t W_{1/t} - s W_{1/s} \\
&= -s \left(W_{1/s} - W_{1/t}\right) + (t - s)\, W_{1/t}
\end{align*}
is a linear combination of two independent Gaussian random variables. Therefore $V_t - V_s$ is Gaussian with
\[ E\left[V_t - V_s\right] = E\left[t W_{1/t} - s W_{1/s}\right] = t\, E\left[W_{1/t}\right] - s\, E\left[W_{1/s}\right] = 0 \]
and
\begin{align*}
\mathrm{Var}\left[V_t - V_s\right] &= \mathrm{Var}\left[-s \left(W_{1/s} - W_{1/t}\right) + (t-s)\, W_{1/t}\right] \\
&= \mathrm{Var}\left[s \left(W_{1/s} - W_{1/t}\right)\right] + \mathrm{Var}\left[(t-s)\, W_{1/t}\right] \quad \text{since } W_{1/s} - W_{1/t} \text{ is independent of } W_{1/t} \\
&= s^2\, \mathrm{Var}\left[W_{1/s} - W_{1/t}\right] + (t-s)^2\, \mathrm{Var}\left[W_{1/t}\right] \\
&= s^2 \left(\frac{1}{s} - \frac{1}{t}\right) + (t-s)^2\, \frac{1}{t} \\
&= s - \frac{s^2}{t} + t - 2s + \frac{s^2}{t} \\
&= t - s.
\end{align*}
If $s = 0$, then $V_t - V_0 = V_t = t W_{1/t}$ is Gaussian with
\[ E\left[V_t\right] = E\left[t W_{1/t}\right] = t\, E\left[W_{1/t}\right] = 0 \]
and
\[ \mathrm{Var}\left[V_t\right] = \mathrm{Var}\left[t W_{1/t}\right] = t^2\, \mathrm{Var}\left[W_{1/t}\right] = t^2 \cdot \frac{1}{t} = t. \]

(MB2) It suffices to show that for all $0 \leq t_1 < t_2 \leq t_3 < t_4$, the covariance between $V_{t_2} - V_{t_1}$ and $V_{t_4} - V_{t_3}$ is zero, since these two random variables have a Gaussian distribution.

If $t_1 > 0$, then, because $0 < \frac{1}{t_4} < \frac{1}{t_3} \leq \frac{1}{t_2} < \frac{1}{t_1}$,
\begin{align*}
\mathrm{Cov}\left[V_{t_2} - V_{t_1},\, V_{t_4} - V_{t_3}\right] &= \mathrm{Cov}\left[t_2 W_{1/t_2} - t_1 W_{1/t_1},\; t_4 W_{1/t_4} - t_3 W_{1/t_3}\right] \\
&= t_2 t_4\, \mathrm{Cov}\left[W_{1/t_2}, W_{1/t_4}\right] - t_2 t_3\, \mathrm{Cov}\left[W_{1/t_2}, W_{1/t_3}\right] \\
&\quad - t_1 t_4\, \mathrm{Cov}\left[W_{1/t_1}, W_{1/t_4}\right] + t_1 t_3\, \mathrm{Cov}\left[W_{1/t_1}, W_{1/t_3}\right] \\
&= t_2 t_4\, \frac{1}{t_4} - t_2 t_3\, \frac{1}{t_3} - t_1 t_4\, \frac{1}{t_4} + t_1 t_3\, \frac{1}{t_3} \quad \text{since } \mathrm{Cov}\left[W_u, W_v\right] = \min(u, v) \\
&= t_2 - t_2 - t_1 + t_1 \\
&= 0.
\end{align*}

If $t_1 = 0$, then
\begin{align*}
\mathrm{Cov}\left[V_{t_2} - V_{t_1},\, V_{t_4} - V_{t_3}\right] &= \mathrm{Cov}\left[t_2 W_{1/t_2},\; t_4 W_{1/t_4} - t_3 W_{1/t_3}\right] \\
&= t_2 t_4\, \mathrm{Cov}\left[W_{1/t_2}, W_{1/t_4}\right] - t_2 t_3\, \mathrm{Cov}\left[W_{1/t_2}, W_{1/t_3}\right] \\
&= t_2 t_4\, \frac{1}{t_4} - t_2 t_3\, \frac{1}{t_3} \\
&= t_2 - t_2 \\
&= 0.
\end{align*}

(MB4) For all $\omega \in \Omega$, the path $t \mapsto V_t(\omega) = t W_{1/t}(\omega)$ is continuous for all $t > 0$, since the functions $t \mapsto W_{1/t}(\omega)$ and $t \mapsto t$ are continuous and so is their product. Since $\lim_{t \to 0} t W_{1/t}(\omega) = 0$ almost surely, the path $t \mapsto V_t(\omega)$ is also continuous at $t = 0$.
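The marginal variance used in (MB3) can also be illustrated by simulation: for fixed $t > 0$, the time-inverted variable $V_t = t W_{1/t}$ should have variance $t$ (the value of $t$ below is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
t, n = 0.25, 200_000  # arbitrary t > 0

# V_t = t * W_{1/t}; W_{1/t} ~ N(0, 1/t), so Var[V_t] = t^2 / t = t.
W_inv = rng.normal(0.0, np.sqrt(1.0 / t), size=n)
V_t = t * W_inv

print(V_t.var())  # should be close to t = 0.25
```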
