
Lecture 5: Stochastic Processes

Lecturer: Phạm Thị Hồng Thắm

Foundations of Mathematical Finance



Table of Contents

1 Stochastic Processes

2 Brownian Motion

3 Properties of Wiener Processes

4 Functions of Wiener Processes



Stochastic Processes



Stochastic Processes

In this chapter we start to discuss stock prices.


The movement of stock prices can be described as a stochastic process

{St}t∈T

where St represents the stock price at time t.
That is, we assume that the St form a family of random variables.
The set T is called the index set.
If T is a discrete set, such as the natural numbers N, we say {St} is in discrete time.
If T is an interval of the real line, for example [0, ∞), we say the process is in continuous time.



Random Walks

The first stochastic process we will consider is a random walk.


Let {Xn}n∈N be a stochastic process with index set N such that

Xn = Z1 + Z2 + ... + Zn

where Z1, ..., Zn are independent and identically distributed.


We can think of Zi as the length of the ith step of a walk on the real line.
Then Xn represents the position of the object after n steps.



Example: Discrete Random Walks

Suppose the Zi in the random walk have the following distribution:

P(Zi = 1) = 0.5, P(Zi = −1) = 0.5.

Hence, Xn takes values in the set Z, and the process is called a discrete random walk.
It is easy to deduce that

E(Zi ) = 0, Var(Zi ) = 1.



Discrete Random Walks

Proposition
Consider the random walk
Xn = Z1 + Z2 + ... + Zn

where the distribution of Zi is

P(Zi = 1) = 0.5, P(Zi = −1) = 0.5.

Then for n ≥ 1,
E(Xn) = 0, Var(Xn) = n.
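A one-line justification, using linearity of expectation and the independence of the steps (so that the variances add):

```latex
\mathbb{E}(X_n) = \sum_{i=1}^{n} \mathbb{E}(Z_i) = 0,
\qquad
\operatorname{Var}(X_n) = \sum_{i=1}^{n} \operatorname{Var}(Z_i) = n \cdot 1 = n.
```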



Example: Continuous Random Walk

Suppose the Zi in the random walk have the following distribution:

Zi ∼ N(0, σ²).

Hence, Xn takes values in the set R, and the process is called a continuous random walk.
By definition,

E(Zi) = 0, Var(Zi) = σ².



Continuous Random Walks

Proposition
Consider the random walk
Xn = Z1 + Z2 + ... + Zn

where the distribution of Zi is N(0, σ²). Then for n ≥ 1,

Xn ∼ N(0, nσ²).

In particular,
E(Xn) = 0, Var(Xn) = nσ².
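A minimal simulation sketch of this proposition; the parameter values (sigma, n, number of paths) are illustrative choices, not from the lecture:

```python
import numpy as np

# Simulate many Gaussian random walks X_n = Z_1 + ... + Z_n with Z_i ~ N(0, sigma^2)
# and compare the sample mean and variance of X_n with the stated 0 and n * sigma^2.
rng = np.random.default_rng(0)
sigma, n, n_paths = 0.5, 100, 50_000            # illustrative parameters

Z = rng.normal(0.0, sigma, size=(n_paths, n))   # steps Z_1, ..., Z_n for each path
X_n = Z.sum(axis=1)                             # X_n = Z_1 + ... + Z_n

print("sample mean of X_n:", X_n.mean())        # close to 0
print("sample var of X_n :", X_n.var())         # close to n * sigma^2 = 25
```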



Brownian Motion



Wiener Processes

Definition
The Wiener process is defined by:
W1) The starting value is zero with probability one,

P(W0 = 0) = 1.

W2) Non-overlapping increments Wt1 − Wt0, Wt2 − Wt1, ..., Wtn − Wtn−1 are
pairwise independent for

0 ≤ t0 ≤ t1 ≤ ... ≤ tn.

W3) The increments follow a Gaussian distribution with variance equal to the
difference of the time arguments,

Wt − Ws ∼ N(0, t − s) with 0 ≤ s < t.
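A minimal sketch of how Wiener paths can be simulated on a time grid from this definition; the horizon T and the step counts are illustrative choices:

```python
import numpy as np

# Build Wiener paths on [0, T]: start at zero (W1) and add independent Gaussian
# increments (W2) whose variance equals the length of each time step (W3).
rng = np.random.default_rng(1)
T, n_steps, n_paths = 1.0, 1_000, 20_000        # illustrative parameters
dt = T / n_steps

dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
W = np.hstack([np.zeros((n_paths, 1)), np.cumsum(dW, axis=1)])  # W_0 = 0

print("sample var of W_T:", W[:, -1].var())     # close to T = 1, since W_T ~ N(0, T)
```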



Proposition
Cov(Wt, Ws) = min(t, s).
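A short derivation for the case s ≤ t, using the independent increments of W2 together with W0 = 0:

```latex
\operatorname{Cov}(W_t, W_s)
  = \operatorname{Cov}\bigl(W_s + (W_t - W_s),\, W_s\bigr)
  = \operatorname{Var}(W_s) + \operatorname{Cov}(W_t - W_s,\, W_s)
  = s + 0 = \min(t, s).
```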



Brownian Motion

Definition
A stochastic process Bt = σWt for t ≥ 0, where σ is a positive constant, is
called a Brownian motion.



Properties of Wiener Processes



Pathwise Properties

Proposition
A Brownian motion is continuous everywhere but not differentiable
anywhere.



Martingale & Markov Properties
A stochastic process is called a Markov process if all the information from its
past that is relevant for its future behavior is contained in its present value.
The set of information about the past of the process available up to time t is
denoted by It.
Usually It is generated by the past values of the stochastic process Xt. Hence

It = σ{Xs : s ≤ t}.

A Markov process does not remember how it arrived at the present state.
The probability that the process takes on a certain value at time t + s depends
only on the value at time t ("present") and does not depend on the past
behavior:

P(Xt+s ≤ x | It) = P(Xt+s ≤ x | Xt).



Martingale & Markov Properties

A process is called a martingale if the present value is the best prediction for
the future.
By the best prediction for the future we mean

E(Xt+s | It),

which is the conditional expectation of Xt+s given the information up to time t.
Hence, the stochastic process Xt is a martingale if

E(Xt+s | It) = Xt.



Martingale & Markov Properties

Proposition
Let Bt be a Brownian motion. Then Bt is a martingale and Bt satisfies the
Markov property.
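A sketch of the martingale part: writing Bt+s = Bt + (Bt+s − Bt), the increment is independent of It and has mean zero, so

```latex
\mathbb{E}(B_{t+s} \mid I_t)
  = B_t + \mathbb{E}(B_{t+s} - B_t \mid I_t)
  = B_t + \mathbb{E}(B_{t+s} - B_t)
  = B_t.
```

The same decomposition gives the Markov property: the conditional distribution of Bt+s given It depends on the past only through the present value Bt.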



Scale Invariance

Proposition
For any σ > 0,

√σ Wt ∼ Wσt

in distribution.
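A quick check for a fixed time t: both sides are centered Gaussian random variables, so it suffices to compare variances:

```latex
\operatorname{Var}\bigl(\sqrt{\sigma}\, W_t\bigr)
  = \sigma \operatorname{Var}(W_t)
  = \sigma t
  = \operatorname{Var}(W_{\sigma t}).
```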



Functions of Wiener Processes



Functions of Wiener Processes

Let Wt be a Wiener process. This will be our basic building block to construct
other, more complex stochastic processes.
Other stochastic processes are functions of this Wt.



Brownian Motion

This is just a constant multiple of the Wiener process, for example Bt = σWt.
Clearly, Bt satisfies properties W1 and W2.
Property W3 needs to be modified in the obvious way:

Bt − Bs ∼ N(0, σ²(t − s))

for t > s.



Brownian Motion With Drift

A Brownian motion with drift has the following form

Xt = µt + σWt

where µ and σ are constants.


It is easy to check that

Xt ∼ N(µt, σ²t).
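The check is immediate: Xt is a linear transformation of the Gaussian variable Wt ∼ N(0, t), so Xt is Gaussian with

```latex
\mathbb{E}(X_t) = \mu t + \sigma\, \mathbb{E}(W_t) = \mu t,
\qquad
\operatorname{Var}(X_t) = \sigma^2 \operatorname{Var}(W_t) = \sigma^2 t.
```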



Geometric Brownian Motion

Definition
A stochastic process Xt is called a geometric Brownian motion if log(Xt ) is
a Brownian motion with drift. Equivalently,

log(Xt) = µt + σWt ⇐⇒ Xt = exp(µt + σWt).
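A minimal simulation sketch of geometric Brownian motion paths via this formula, reusing the Wiener-increment construction; µ, σ, the horizon and the grid are illustrative choices, not values from the lecture:

```python
import numpy as np

# Simulate geometric Brownian motion X_t = exp(mu * t + sigma * W_t) on a grid,
# starting from X_0 = exp(0) = 1.
rng = np.random.default_rng(2)
mu, sigma = 0.05, 0.2                            # illustrative drift and volatility
T, n_steps, n_paths = 1.0, 252, 10
dt = T / n_steps

dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
W = np.hstack([np.zeros((n_paths, 1)), np.cumsum(dW, axis=1)])  # Wiener paths
t = np.linspace(0.0, T, n_steps + 1)

X = np.exp(mu * t + sigma * W)                   # GBM paths, one per row
print(X[:, -1])                                  # terminal values X_T of each path
```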



Log-Normal Distribution

Definition
A random variable Y is said to follow a log-normal distribution with drift µ and
volatility σ if log(Y) has the normal distribution N(µ, σ²). We write

Y ∼ LN(µ, σ²).



Properties of Log-Normal Distribution

Proposition
If Y ∼ LN(µ, σ²) then

E(Y) = exp(µ + 0.5σ²),  Var(Y) = (exp(σ²) − 1) exp(2µ + σ²).

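A sketch of where these formulas come from: writing log(Y) ∼ N(µ, σ²), the moment generating function of the normal distribution gives E(Y^k) = E(e^{k log Y}) = exp(kµ + k²σ²/2), so

```latex
\mathbb{E}(Y) = e^{\mu + \sigma^2/2},
\qquad
\mathbb{E}(Y^2) = e^{2\mu + 2\sigma^2},
\qquad
\operatorname{Var}(Y) = \mathbb{E}(Y^2) - \mathbb{E}(Y)^2
  = \bigl(e^{\sigma^2} - 1\bigr)\, e^{2\mu + \sigma^2}.
```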
