
2 Stochastic Processes

Stochastic processes are, roughly speaking, nothing but random variables evolving over time; for this reason we need the whole machinery presented in the previous chapter. However, since we now have an additional dimension to take care of, namely time, we need to adjust our probability space a little. This is what is coming up in the next section. We will then introduce the most common processes you will deal with over the rest of the year.

2.1 Some introductory definitions

Definition 1 (Filtration) Let Ω be a nonempty set. Let T be a fixed positive number, and assume that for each $t \in [0, T]$ there is a σ-algebra $\mathcal{F}_t$. Assume further that $\mathcal{F}_s \subset \mathcal{F}_t$ for all $0 \le s < t \le T$. Then we call the collection of σ-algebras $(\mathcal{F}_t)_{0 \le t \le T}$ a filtration, and $(\Omega, \mathcal{F}, P, (\mathcal{F}_t))$ a filtered probability space.

We can consider the filtration $\mathcal{F}_t$ as the set of information available at time t. In other words, we can consider $(\mathcal{F}_t)_{t \ge 0}$ as describing the flow of information over time, where we suppose that we do not lose information as time passes (this is why we say $\mathcal{F}_s \subset \mathcal{F}_t$ for $s < t$).

Example 1 Consider Example 1.3.3. The sets $\mathcal{F}^{(1)}$ and $\mathcal{F}^{(2)}$ contain the information learned by observing respectively the first one and the first two movements of the stock price. Alternatively, you can consider these two sets as containing the information gained after 1 period of time and 2 periods of time have elapsed, respectively. If so, then we can index those σ-algebras by time, and $\{\mathcal{F}_0, \mathcal{F}_1, \mathcal{F}_2\}$ is an example of a filtration.

Hence, a filtration tells us the information we will have at future times; more precisely, when we get to time t, we will know for each set in $\mathcal{F}_t$ whether the true ω lies in that set.

Definition 2 (Stochastic process) A stochastic process is a family $X = (X_t : t \ge 0)$ of random variables defined on the filtered probability space $(\Omega, \mathcal{F}, P, \mathcal{F}_t)$.

Example 2 (Random walk) 1. A (simple) symmetric random walk is a stochastic process S defined as $S_n = \sum_{i=1}^{n} X_i$, where $\{X_i\}_{i \in \mathbb{N}}$ is a sequence of independent and identically distributed (i.i.d.) random variables such that
$$X_i = \begin{cases} 1 & \text{with probability } 1/2 \\ -1 & \text{with probability } 1/2. \end{cases}$$

2. The evolution of the stock price described in Example 1.7 is a stochastic process defined as
$$S_n = S_0 \, 2^{\sum_{i=1}^{n} X_i}, \qquad S_0 = 4,$$
where $\{X_i\}_{i \in \mathbb{N}}$ is a sequence of independent and identically distributed (i.i.d.) random variables such that
$$X_i = \begin{cases} 1 & \text{with probability } p \\ -1 & \text{with probability } q = 1 - p. \end{cases}$$

© Laura Ballotta - Do not reproduce without permission.
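The stock-price process above is easy to simulate directly. The following sketch (illustrative code, not part of the notes) samples one path of $S_n = S_0 \, 2^{\sum X_i}$ with the symmetric choice $p = 1/2$; each period the price either doubles or halves.

```python
import random

random.seed(0)

def stock_path(n, p=0.5, s0=4):
    """Sample a path of S_n = s0 * 2**(X_1 + ... + X_n) with X_i = +/-1.

    s0 = 4 matches the example in the notes; p is the up-probability.
    """
    total = 0
    path = [s0]
    for _ in range(n):
        total += 1 if random.random() < p else -1
        path.append(s0 * 2 ** total)
    return path

path = stock_path(3)
print(path)  # each step either doubles or halves the current price
```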

We say that X is adapted to $\mathcal{F}_t$ if $X_t$ is $\mathcal{F}_t$-measurable for all $t \ge 0$.

We use $X_t$ as shorthand notation for $X(t, \omega)$. Note that for each fixed $\omega \in \Omega$, $X(\cdot, \omega)$ denotes the path of the stochastic process X.

Definition 3 (Natural filtration) Let $(\Omega, \mathcal{F}, P, \mathcal{F}_t)$ be a filtered probability space and X be a stochastic process adapted to the filtration $(\mathcal{F}_t)_{t \ge 0}$. The natural filtration of X is defined by the set of σ-algebras $\mathcal{F}_t = \sigma(X_s : 0 \le s \le t)$, $0 \le t < \infty$.

Example 3 The natural filtration of the process S defined in Example 2.2 is given by the sequence $\{\mathcal{F}_0, \mathcal{F}_1, \mathcal{F}_2, \ldots\}$ discussed in Example 2.1.

2.2 Classes of processes

2.2.1 Martingale

A stochastic process $X = (X_t : t \ge 0)$ is a martingale relative to $(\mathcal{F}, P)$ if:

1. X is adapted and $E|X_t| < \infty$ for all $t \ge 0$;

2. $E[X_t \mid \mathcal{F}_s] = X_s$ for all $s \le t$.

A property of martingales that comes as a consequence of the definition is the following:

• $E[X_t - X_s \mid \mathcal{F}_s] = 0$ for all $s \le t$.

The martingale properties show that a martingale is a random process whose future variations are completely unpredictable given the current information set. For this reason, the best forecast of the change in X over an arbitrary interval is zero. A martingale represents a fair game: given the knowledge we have, on average the return produced by the bet is what we invested in it.

Example 4 Let Y be an integrable random variable. Then the stochastic process $X_t = E[Y \mid \mathcal{F}_t]$, $t \ge 0$, is a martingale relative to $(\mathcal{F}, P)$. In fact, $E|X_t| < \infty$ by definition of conditional expectation. Also, for all $s \le t$,
$$E[X_t \mid \mathcal{F}_s] = E[E(Y \mid \mathcal{F}_t) \mid \mathcal{F}_s] = E(Y \mid \mathcal{F}_s) = X_s,$$
where the last-but-one equality follows from the tower property.

Example 5 A symmetric random walk is a martingale (very easy to check!).
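The claim can also be confirmed by a quick Monte Carlo sanity check (an illustrative sketch; the chosen times and sample size are arbitrary): since $E[S_t - S_s \mid \mathcal{F}_s] = 0$, the plain average of $S_{10} - S_5$ over many independent paths should be close to 0.

```python
import random

random.seed(0)

def increment(steps):
    """S_t - S_s for the symmetric walk: a sum of `steps` i.i.d. +/-1 moves."""
    return sum(random.choice([1, -1]) for _ in range(steps))

# S_10 - S_5 involves 5 fresh steps; its sample mean should be near 0.
m = 20000
avg = sum(increment(5) for _ in range(m)) / m
print(avg)
```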


Example 6 In a simple discrete-time model for the price of a share, the change in price at time t, $X_t$, is assumed to be independent of anything that has happened before time t and to have distribution
$$X_t = \begin{cases} 1 & \text{with probability } p \\ -1 & \text{with probability } q \\ 0 & \text{with probability } r = 1 - p - q. \end{cases}$$
Let $S_0 = n \in \mathbb{N}$ be the original price of the share, let $S_m = S_0 + \sum_{t=1}^{m} X_t$ be the price after m periods, and define
$$Y_m = \left(\frac{q}{p}\right)^{S_m}.$$

Then $Y_m$ is a martingale. To check, proceed as follows: as $p, q > 0$ and $Y_m \ge 0$,
$$E|Y_m| = E\left[\left(\frac{q}{p}\right)^{S_m}\right] = E\left[\left(\frac{q}{p}\right)^{S_0 + \sum_{t=1}^{m} X_t}\right] = \left(\frac{q}{p}\right)^n \left[E\left(\frac{q}{p}\right)^{X_t}\right]^m = \left(\frac{q}{p}\right)^n \left[\left(\frac{q}{p}\right)^{1} p + \left(\frac{q}{p}\right)^{-1} q + \left(\frac{q}{p}\right)^{0} r\right]^m = \left(\frac{q}{p}\right)^n < \infty,$$
since the bracketed term equals $q + p + r = 1$.

Further,
$$E[Y_m \mid \mathcal{F}_{m-1}] = E\left[\left(\frac{q}{p}\right)^{S_m} \,\Big|\, \mathcal{F}_{m-1}\right] = E\left[\left(\frac{q}{p}\right)^{S_{m-1} + X_m} \,\Big|\, \mathcal{F}_{m-1}\right] = Y_{m-1}\, E\left[\left(\frac{q}{p}\right)^{X_m} \,\Big|\, \mathcal{F}_{m-1}\right] = Y_{m-1}\, E\left[\left(\frac{q}{p}\right)^{X_m}\right] = Y_{m-1}.$$
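The constant-mean property $E[Y_m] = (q/p)^n$ implied by the martingale calculation above can be checked by simulation. The sketch below uses purely illustrative values $p = 0.4$, $q = 0.5$, $n = 5$ (these numbers are assumptions, not from the notes):

```python
import random

random.seed(1)
p, q, n = 0.4, 0.5, 5  # illustrative parameters; r = 1 - p - q
r = 1 - p - q

def y(m):
    """One draw of Y_m = (q/p)**S_m for the three-point price model."""
    s = n  # S_0 = n
    for _ in range(m):
        u = random.random()
        s += 1 if u < p else (-1 if u < p + q else 0)
    return (q / p) ** s

# E[Y_m] should equal (q/p)**n for every m, because Y is a martingale.
trials = 50000
est = sum(y(8) for _ in range(trials)) / trials
print(est, (q / p) ** n)  # the two values should be close
```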

Further remark: a martingale is always defined with respect to some information set and with respect to some probability measure. If we change the information content and/or the probabilities associated with the process, the process under consideration may cease to be a martingale. The opposite is also true: given a process which does not behave like a martingale, we may be able to modify the relevant probability measure and convert the process into a martingale. This will be illustrated later on in the course when we talk about the martingale problem and the Girsanov theorem.


Exercise 1 Let $(\mathcal{F}_t)_{t \ge 0}$ be an increasing family of σ-algebras. Show that for every random variable h with $E|h| < \infty$, the process $M_t := E[h \mid \mathcal{F}_t]$ is an $\mathcal{F}_t$-martingale for $t \ge 0$.

Exercise 2 Define $v_t = \mathrm{Var}(M_t)$, where $\{M_t : t \ge 0\}$ is a zero-mean martingale. Show that $v_t$ is non-decreasing.

2.2.2 Super/sub martingales

Consider the usual filtered probability space. A stochastic process X is called a

• supermartingale if $X_t \in \mathcal{F}_t$, $E|X_t| < \infty$ and $E(X_t \mid \mathcal{F}_s) \le X_s$ for all $0 \le s \le t < \infty$;

• submartingale if $X_t \in \mathcal{F}_t$, $E|X_t| < \infty$ and $E(X_t \mid \mathcal{F}_s) \ge X_s$ for all $0 \le s \le t < \infty$.

Exercise 3 Let S be a supermartingale. Show that $E(S_0) \ge E(S_T)$.

Exercise 4 Let S be a non-negative supermartingale such that $E(S_0) \le E(S_T)$. Show that S is a martingale.

2.2.3 Stopping time

Consider the usual filtered probability space. A stopping time with respect to the filtration $(\mathcal{F}_t)_{t \ge 0}$ is a random variable $\tau : \Omega \to [0, \infty)$ such that $\{\tau \le t\} := \{\omega \in \Omega : \tau(\omega) \le t\} \in \mathcal{F}_t$ for all $t \ge 0$.

The measurability property means that the decision whether to stop a stochastic process at time $t \in [0, \infty)$ depends on the information up to time t only.

Example 7 Let $\tau_1$ and $\tau_2$ be two stopping times. Then $\tau_1 \wedge \tau_2 = \min\{\tau_1, \tau_2\}$ is a stopping time. In fact, by definition $\{\tau_1 \le t\} \in \mathcal{F}_t$ and $\{\tau_2 \le t\} \in \mathcal{F}_t$. Hence:
$$\{\tau_1 \wedge \tau_2 \le t\} = \{\tau_1 \le t, \text{ or } \tau_2 \le t, \text{ or both}\} = \{\tau_1 \le t\} \cup \{\tau_2 \le t\} \in \mathcal{F}_t$$
by definition of a σ-algebra (it is closed under unions).

Example 8 Let $S_m$ be a random walk, i.e. a process $\{S_m : m = 0, 1, \ldots\}$ such that $S_m = S_{m-1} + \varepsilon_m$, where the $\varepsilon_m$ are independent and identically distributed random variables. Consider an interval $[a, b] \subset \mathbb{R}$. Then
$$\tau = \inf\{m \mid S_m \in [a, b]\},$$
i.e. the time of the first entry of S into $[a, b]$, is a stopping time with respect to the natural filtration of S.
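A first-entry time of this kind is easy to compute path by path. The sketch below (illustrative code with an arbitrary interval, not from the notes) makes the stopping-time property visible: the decision to stop at step m uses only the path up to step m.

```python
import random

random.seed(2)

def first_entry_time(a, b, s0=0, max_steps=10000):
    """tau = inf{m : S_m in [a, b]} for one symmetric random walk path.

    Returns None if the walk has not entered [a, b] within the horizon.
    """
    s = s0
    for m in range(max_steps + 1):
        if a <= s <= b:  # the decision at step m uses only S_0, ..., S_m
            return m
        s += random.choice([1, -1])
    return None

print(first_entry_time(3, 5))   # first hitting time of [3, 5] starting at 0
print(first_entry_time(0, 1))   # 0: the walk starts inside the interval
```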

Example 9 Suppose $S_t$ is the stock price at time t. Let $\tau = \sup\{t \mid S_t \ge 100\}$, i.e. the latest date at which $S_t \ge 100$. Then τ is not a stopping time: to decide at time t whether τ has already occurred, we would need to know whether the price will reach 100 again at some later date, and this information is not contained in $\mathcal{F}_t$.

Exercise 5 Consider the random time τ at which X first achieves its maximum on the interval $[0, T]$. Is τ a stopping time with respect to the flow of information obtained from X?


2.2.4 Local martingale

Let $(\Omega, \mathcal{F}, P, \mathcal{F}_t)$ be a filtered probability space and X be a stochastic process such that $X_0 = 0$. If there is a non-decreasing sequence $\{\tau_n : n \in \mathbb{N}\}$ of stopping times with
$$P\left(\lim_{n \to \infty} \tau_n = \infty\right) = 1$$
such that the stopped process $X^{\tau_n}_t = X_{\tau_n \wedge t}$ is a martingale for each n, i.e. $E[X_{\tau_n \wedge t} \mid \mathcal{F}_s] = X_{\tau_n \wedge s}$, $s \le t$, then we say that X is a local martingale.

2.2.5 Gaussian process

Consider a time partition of $[0, \infty)$, say $\{t_1, t_2, \ldots, t_n\}$; collect the values of the process X at each time point, so you get the random vector $(X_{t_1}, X_{t_2}, \ldots, X_{t_n})$. This vector has some distribution, say $F(t_1, t_2, \ldots, t_n)$. Now consider all possible time partitions of $[0, \infty)$ and repeat the job. You get the class of all such distributions, which is called the class of all finite-dimensional distributions of X.

Definition 4 A process X is Gaussian if all its finite-dimensional distributions are Gaussian; such a process can be specified by:

1. a measurable function $\mu = \mu_t$ with $E(X_t) = \mu_t$, the mean function;

2. a non-negative definite function $\sigma(s, t) = \mathrm{cov}(X_s, X_t)$, the covariance function.
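To make the definition concrete, here is a sketch that samples the finite-dimensional vector $(X_{t_1}, \ldots, X_{t_n})$ for one assumed specification, $\mu_t = 0$ and $\sigma(s, t) = e^{-|s-t|}$ (an arbitrary non-negative definite choice, used only for illustration), via a Cholesky factorisation of the covariance matrix:

```python
import math
import random

random.seed(3)

def mu(t):
    return 0.0          # assumed mean function (illustrative)

def sigma(s, t):
    return math.exp(-abs(s - t))   # assumed covariance function (illustrative)

def cholesky(C):
    """Lower-triangular L with L L^T = C (standard Cholesky factorisation)."""
    n = len(C)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = math.sqrt(C[i][i] - s) if i == j else (C[i][j] - s) / L[j][j]
    return L

def sample(times):
    """One draw of (X_{t_1}, ..., X_{t_n}): X = mu + L z with z standard normal."""
    C = [[sigma(s, t) for t in times] for s in times]
    L = cholesky(C)
    z = [random.gauss(0, 1) for _ in times]
    return [mu(t) + sum(L[i][k] * z[k] for k in range(i + 1))
            for i, t in enumerate(times)]

print(sample([0.0, 0.5, 1.0, 2.0]))
```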

2.2.6 Markov process

Definition 5 A process X is Markov if for each t, each $A \in \sigma(X_s : s > t)$ and $B \in \sigma(X_s : s < t)$:
$$P(A \mid X_t, B) = P(A \mid X_t) = P(A \mid \mathcal{F}_t),$$
where $\mathcal{F}_t = \sigma(X_s : s \le t)$.

In other words, if you know where you are at time t, how you got there does not matter as far as predicting the future is concerned. Equivalently, past and future are conditionally independent given the present.

Any process with independent increments is Markov. In fact, for $B \in \sigma(X_\tau : \tau \le s)$,
$$P(X_t \in E \mid X_s = x, B) = P(X_t - X_s + x \in E \mid X_s = x, B) = P(X_t - X_s + x \in E \mid X_s = x),$$
where the last equality holds because the increment $X_t - X_s$ is independent of $\sigma(X_\tau : \tau \le s)$.

We distinguish Markov processes according to the nature of the state space, i.e. the set of values that the random variables are capable of taking, and according to whether we observe the process discretely or continuously in time. The classification works as represented in the following table.

State space \ Time       | Discrete time   | Continuous time
Discrete state space     | Markov chain    | Markov jump process
Continuous state space   | Markov process  | Markov process

Definition 6 A Markov chain is a sequence of random variables $X_{t_1}, X_{t_2}, \ldots, X_{t_n}, \ldots$ with the following property:
$$P(X_n = j \mid X_0 = i_0, X_1 = i_1, \ldots, X_m = i) = P(X_n = j \mid X_m = i) = p_{ij}^{(m,n)}.$$
$p_{ij}^{(m,n)}$ is called the transition probability (as it denotes the probability of a transition from state i at time m to state j at time n).
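For a time-homogeneous chain, multi-step transition probabilities follow from the one-step matrix by matrix powers, $p^{(0,n)} = P^n$ (the Chapman-Kolmogorov equations). A small sketch with an arbitrary two-state matrix (the numbers are illustrative, not from the notes):

```python
def mat_mul(A, B):
    """Plain matrix product of two lists-of-lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

P = [[0.9, 0.1],
     [0.4, 0.6]]   # P[i][j] = one-step probability of moving from i to j

Pn = [[1.0, 0.0], [0.0, 1.0]]   # identity = 0-step transition matrix
for _ in range(50):
    Pn = mat_mul(Pn, P)          # Pn = P**50: 50-step transition probabilities

# For this matrix the rows converge to the stationary distribution (0.8, 0.2),
# regardless of the starting state.
print([round(x, 3) for x in Pn[0]])
```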

2.2.7 Point processes and the Poisson process

Consider a random series of events occurring in time. These events may be, for example, the emission of radioactive particles, the arrival times of telephone calls at an exchange, stock market crashes or devaluations.

Define S as the state space (the underlying space in which the points sit). A Poisson process can then be defined via a countable subset N of S; in particular, let the random variable $T_n$, $n \in \mathbb{N}$, be the time at which the $n$th event occurs; then $N = \{T_1, T_2, T_3, \ldots\}$. Since the sequence $\{T_n\}_{n \in \mathbb{N}}$ is random, N itself is random.

Generally a more accurate description of N can be obtained using a count function $N(A)$, which counts the number of events occurring in a prespecified set $A \subseteq S$. N represents a point process, i.e. an adapted and increasing process whose jumps are equal to 1 (i.e. the jump process ΔN takes only the values 0 and 1). Set $A = (0, t]$ and write $N(A) = N_t$ for simplicity.

A Poisson process is a special kind of point process: the inter-arrival times between two consecutive events are exponentially distributed with rate λ. More precisely:

Definition 7 (Poisson process) A Poisson process on $(\Omega, \mathcal{F}, P, \mathcal{F}_t)$ is an adapted point process N such that

1. for any $0 \le s < t < \infty$, $N_t - N_s$ is independent of $\mathcal{F}_s$, i.e. N has independent increments;

2. $N_t$ has a Poisson distribution with parameter $\nu_t$, that is
$$P[N_t = n] = \frac{e^{-\nu_t} (\nu_t)^n}{n!},$$
and it is continuous in probability.


Figure 1: The Poisson process and the Poisson counter.

Figure 2: The compound Poisson process.


Note that $\nu_t$ is a non-decreasing function and is generally given in terms of a rate or intensity (rate if S has dimension 1, intensity for multidimensional S). This is a positive function λ on S such that
$$\nu_t = \int_0^t \lambda_\tau \, d\tau.$$
If λ is a constant, $\nu_t = \lambda t$. In this case, we have a time-homogeneous Poisson process, which has the additional property

3. for any $s < t$ and $u < v$ such that $t - s = v - u$, the distribution of $N_t - N_s$ is the same as that of $N_v - N_u$, i.e. N has stationary increments.

The idea of the Poisson process is represented in Figure 1.

Further properties of the Poisson process:

MGF You go through the calculations in a very similar fashion to the case of the Poisson random variable:
$$M_N(k; t) = E\left[e^{kN(t)}\right] = \sum_{n=0}^{\infty} e^{kn} \frac{e^{-\lambda t} (\lambda t)^n}{n!} = e^{-\lambda t} \sum_{n=0}^{\infty} \frac{\left(e^k \lambda t\right)^n}{n!} = e^{\lambda t \left(e^k - 1\right)}.$$
Note that you can differentiate $M_N(k; t)$ with respect to k to obtain the mean and the variance of the process.

Mean So, let's differentiate:
$$\frac{\partial}{\partial k} M_N(k; t)\bigg|_{k=0} = \lambda t \, e^{k + \lambda t (e^k - 1)}\bigg|_{k=0} = \lambda t.$$

Variance And differentiate again:
$$\frac{\partial^2}{\partial k^2} M_N(k; t)\bigg|_{k=0} = \lambda t \, e^{k + \lambda t (e^k - 1)} \left(1 + \lambda t e^k\right)\bigg|_{k=0} = \lambda t + (\lambda t)^2,$$
which implies that
$$\mathrm{Var}(N_t) = E[N_t^2] - (E[N_t])^2 = \lambda t.$$
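These two results are easy to confirm by simulation. The sketch below (with assumed, purely illustrative parameters) generates $N_t$ from exponential inter-arrival times, which is exactly the construction mentioned above, and compares the sample mean and variance with $\lambda t$:

```python
import random

random.seed(4)
lam, t = 2.0, 3.0  # illustrative rate and horizon

def n_t(lam, t):
    """One draw of N_t: count exponential(lam) inter-arrival times up to t."""
    arrival, count = random.expovariate(lam), 0
    while arrival <= t:
        count += 1
        arrival += random.expovariate(lam)
    return count

draws = [n_t(lam, t) for _ in range(40000)]
mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / len(draws)
print(mean, var)  # both should be close to lam * t = 6.0
```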

As just seen, the Poisson process counts the number of points in a certain subset of S. Often each point may be labelled with some quantity (the size of a market crash, for example), itself random, say X. This gives what is called a marked Poisson process or compound Poisson process
$$\bar{N}_t = \sum_{k=1}^{N_t} X_k,$$
where $\{X_k\}_{k \in \mathbb{N}}$ is a sequence of independent and identically distributed random variables. A possible representation of the compound Poisson process is given in Figure 2.

Also in this case, you can calculate the MGF, from which you can extract the mean and the variance. Assume that $EX = \mu$ and $\mathrm{Var}(X) = \sigma^2$.

MGF In this case, the conditional expectation will be of great help:
$$M_{\bar{N}}(h; t) = E\left[e^{h \bar{N}_t}\right] = E\left[E\left(e^{h \sum_{k=1}^{N(t)} X_k} \,\Big|\, N(t)\right)\right] = E\left[\prod_{k=1}^{N(t)} E\left(e^{h X_k}\right)\right] = E\left[\left(M_X(h)\right)^{N(t)}\right] = e^{\lambda t \left(M_X(h) - 1\right)}.$$

Mean Just for the sake of doing something different, instead of differentiating the MGF, we can use the cumulant generating function
$$K_{\bar{N}}(h; t) = \ln M_{\bar{N}}(h; t) = \lambda t \left(M_X(h) - 1\right).$$

Then
$$E\left[\bar{N}_t\right] = \frac{\partial}{\partial h} K_{\bar{N}}(h; t)\bigg|_{h=0} = \lambda t \, \frac{\partial}{\partial h} M_X(h)\bigg|_{h=0} = \lambda \mu t.$$

Variance Similarly,
$$\mathrm{Var}\left(\bar{N}_t\right) = \frac{\partial^2}{\partial h^2} K_{\bar{N}}(h; t)\bigg|_{h=0} = \lambda t \, \frac{\partial^2}{\partial h^2} M_X(h)\bigg|_{h=0} = \lambda \left(\mu^2 + \sigma^2\right) t.$$
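As with the plain Poisson process, the formulas $E[\bar{N}_t] = \lambda \mu t$ and $\mathrm{Var}(\bar{N}_t) = \lambda(\mu^2 + \sigma^2)t$ can be checked by simulation. The sketch below uses normally distributed marks; all parameter values are assumed, purely illustrative choices.

```python
import random

random.seed(5)
lam, t = 1.5, 2.0       # illustrative rate and horizon
mu, sig = 0.5, 0.8      # mean and standard deviation of the marks X_k

def compound(lam, t):
    """One draw of the compound Poisson sum X_1 + ... + X_{N_t}."""
    n, arrival = 0, random.expovariate(lam)
    while arrival <= t:
        n += 1
        arrival += random.expovariate(lam)
    return sum(random.gauss(mu, sig) for _ in range(n))

draws = [compound(lam, t) for _ in range(40000)]
mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / len(draws)
# Theory: mean = lam*mu*t = 1.5, variance = lam*(mu**2 + sig**2)*t = 2.67
print(mean, var)
```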

Exercise 6
1. Calculate the characteristic function of the Poisson process and of the compound Poisson process.
2. Use the previous results and the properties of the characteristic function to derive the mean and the variance of the Poisson process.
3. Repeat the same exercise for the compound Poisson process.

Exercise 7 Let N be a Poisson process with rate λ and $\mathcal{F}_t$ the associated filtration.
a) Calculate the mean of the process N and its variance.
b) Let $M_N$ be the moment generating function of the process N, i.e. $M_N(k) = E\left[e^{kN(t)}\right]$ for $k \in \mathbb{R}$. Show that
$$M_N(k) = e^{\lambda t \left(e^k - 1\right)}.$$
c) Write down the conditional distribution of $N(t+s) - N(t)$ given $\mathcal{F}_t$, where $s > 0$, and use your answer to find
$$E\left[\theta^{N(t+s)} \mid \mathcal{F}_t\right].$$
d) Find a process of the form $M(t) = \eta(t)\, \theta^{N(t)}$ which is a martingale.
e) Consider a compound Poisson process. Find an expression for its mean, variance and moment generating function.


2.2.8 Other classes of processes

A class of processes that is becoming more and more important in finance is the so-called class of Lévy processes.

Definition 8 An adapted, càdlàg (continu à droite, limites à gauche) process $L := \{L_t : t \ge 0\}$ with $L_0 = 0$ is a Lévy process if

1. L has increments independent of the past, i.e. $L_t - L_s \perp \mathcal{F}_s$ for all $0 \le s < t < \infty$;

2. L has stationary increments, i.e. $L_t - L_s \stackrel{D}{=} L_{t-s}$;

3. L is continuous in probability, i.e. $\lim_{s \to t} L_s = L_t$ in probability.

Lévy processes can be thought of as a mixture of a jump process, like for example the Poisson process (but not only), and a diffusion (Gaussian) process called Brownian motion or Wiener process. Lévy processes actually represent a generalization of both the Poisson process and Brownian motion, since the distribution followed by the increments is not specified; we will study Lévy processes in more detail next term. Brownian motion is instead the subject of the next unit, and indeed of the remainder of this module.

We conclude this section by briefly mentioning that the most general class of stochastic processes for which rules of calculus have been developed so far is the class of semimartingales. Again, we will encounter these processes in Term 2.
