
Poisson Process, Brownian Motion & Martingales

The document defines and describes five concepts from the theory of stochastic processes: 1) the Poisson process, which counts random events over time and has independent and stationary increments; 2) Brownian motion, a continuous-time stochastic process with independent, normally distributed increments; 3) the martingale, a stochastic process in which the conditional expectation of a future value, given the past, equals the present value; 4) the random variable, a function that assigns a numerical outcome to each point of the sample space and is characterized by its distribution function; 5) the continuous-time Markov chain, a stochastic process that moves between states independently of the past, governed by a transition matrix and exponential holding times.

Uploaded by

Dimpy Tyagi

Quiz 1

Describe:

1. Poisson Process:

Let λ > 0 be fixed. The counting process {N(t), t ∈ [0, ∞)} is called a Poisson process with rate λ if all of the
following conditions hold:

• N(0) = 0;
• N(t) has independent increments;
• the number of arrivals in any interval of length τ > 0 has a Poisson(λτ) distribution.

An equivalent definition of the Poisson process: let λ > 0 be fixed. The counting process {N(t), t ∈ [0, ∞)} is
called a Poisson process with rate λ if all of the following conditions hold:

• N(0) = 0;
• N(t) has independent and stationary increments;
• P(N(Δ) = 0) = 1 − λΔ + o(Δ), P(N(Δ) = 1) = λΔ + o(Δ), P(N(Δ) ≥ 2) = o(Δ).

If N(t) is a Poisson process with rate λ, then the inter-arrival times X1, X2, ⋯ are independent and

Xi ∼ Exponential (λ), for i=1,2,3,⋯.

Remember that if X is exponential with parameter λ > 0, then X is a memoryless random
variable, that is,

P(X > x + a | X > a) = P(X > x), for a, x ≥ 0.

Thinking of the Poisson process, the memoryless property of the inter-arrival times is consistent with
the independent increments property of the Poisson process: both imply that the numbers of arrivals
in non-overlapping intervals are independent.
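The connection between the two definitions can be seen numerically. The following is a minimal Python sketch (the function name and parameter values are illustrative, not from the notes): it builds a Poisson process on [0, t_max] by summing Exponential(λ) inter-arrival times and checks that the resulting count N(t_max) has mean close to λ·t_max.

```python
import random

def simulate_poisson_arrivals(lam, t_max, rng):
    """Arrival times of a rate-lam Poisson process on [0, t_max],
    built by summing i.i.d. Exponential(lam) inter-arrival times."""
    arrivals = []
    t = rng.expovariate(lam)
    while t <= t_max:
        arrivals.append(t)
        t += rng.expovariate(lam)
    return arrivals

# The count N(t_max) should have mean roughly lam * t_max.
lam, t_max, runs = 2.0, 10.0, 2000
rng = random.Random(1)
counts = [len(simulate_poisson_arrivals(lam, t_max, rng)) for _ in range(runs)]
mean_count = sum(counts) / runs
print(mean_count)  # should be close to lam * t_max = 20
```

Averaging over many runs, the sample mean of N(10) settles near 20, matching the Poisson(λτ) condition of the first definition.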

2. Brownian Motion:
• Brownian motion is a stochastic process {X(t), t ≥ 0} with the following properties.

1. X(0) = 0, unless stated otherwise.

2. For any 0 ≤ t0 < t1 < · · · < tn, the random variables X(tk) − X(tk−1) for 1 ≤ k ≤ n are independent.

3. For 0 ≤ s < t, X(t) − X(s) is normally distributed with mean µ(t − s) and variance σ²(t − s), where µ is a real number and σ > 0.


Such a process is called a (µ, σ) Brownian motion with drift µ and variance parameter σ².

• The existence and uniqueness of such a process is guaranteed by Wiener’s theorem.

• Although a Brownian motion path is a continuous function of t with probability one, it is, with
probability one, nowhere differentiable.

• The (0, 1) Brownian motion is also called the Wiener process.
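Property 3 gives a direct way to simulate the process. Here is a minimal Python sketch (function name and parameters are illustrative): it discretizes a (µ, σ) Brownian motion by drawing independent Gaussian increments with mean µ·dt and variance σ²·dt, then checks that E[X(t)] ≈ µt over many sample paths.

```python
import random

def brownian_path(mu, sigma, t_max, n_steps, rng):
    """Discretized (mu, sigma) Brownian motion on [0, t_max]: X(0) = 0 and
    independent increments X(t+dt) - X(t) ~ Normal(mu*dt, sigma^2*dt)."""
    dt = t_max / n_steps
    x, path = 0.0, [0.0]
    for _ in range(n_steps):
        x += rng.gauss(mu * dt, sigma * dt ** 0.5)
        path.append(x)
    return path

# E[X(t)] = mu * t: average the endpoints of many paths.
rng = random.Random(1)
mu, sigma, t_max = 0.5, 1.0, 4.0
ends = [brownian_path(mu, sigma, t_max, 200, rng)[-1] for _ in range(3000)]
mean_end = sum(ends) / len(ends)
print(mean_end)  # should be close to mu * t_max = 2.0
```

With µ = 0 and σ = 1 the same sketch produces the standard Wiener process mentioned above.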


3. Martingale:
In full generality, a stochastic process Y : T × Ω → S taking values in a Banach
space S is a martingale with respect to a filtration Σ∗ and probability measure P if

• Σ∗ is a filtration of the underlying probability space (Ω, Σ, P);

• Y is adapted to the filtration Σ∗, i.e., for each t in the index set T, the random variable Yt is a
Σt-measurable function;

• for each t, Yt lies in the space L¹(Ω, Σt, P; S), i.e., E_P(|Yt|) < +∞;

• for all s and t with s < t and all F ∈ Σs, E_P([Yt − Ys] χF) = 0,

where χF denotes the indicator function of the event F. In Grimmett and Stirzaker's Probability and
Random Processes, this last condition is written E(Yt | Σs) = Ys, using a general form of conditional
expectation.

It is important to note that the property of being a martingale involves both the filtration and the
probability measure (with respect to which the expectations are taken). It is possible that Y could be
a martingale with respect to one measure but not another one; the Girsanov theorem offers a way
to find a measure with respect to which an Itō process is a martingale.
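The simplest example of a martingale is a symmetric random walk under the fair-coin measure. The following Python sketch (the event F chosen here is just one illustrative member of Σs) empirically checks the defining condition E_P([Yt − Ys] χF) = 0 with F = {Ys > 0}, an event that is measurable at time s.

```python
import random

# Y_n = sum of n fair +/-1 steps is a martingale under the fair-coin measure.
# Check E[(Y_t - Y_s) * 1_F] = 0 for the Sigma_s-measurable event F = {Y_s > 0}.
rng = random.Random(2)
s, t, runs = 10, 25, 100_000
acc = 0.0
for _ in range(runs):
    steps = [rng.choice((-1, 1)) for _ in range(t)]
    y_s = sum(steps[:s])          # value of the walk at time s
    y_t = y_s + sum(steps[s:])    # value of the walk at time t
    if y_s > 0:                   # indicator of the event F in Sigma_s
        acc += y_t - y_s
print(acc / runs)  # should be close to 0
```

If the coin were biased, the same walk would fail this test under the new measure, which is exactly the sense in which the martingale property depends on the probability measure and not only on the process.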

4. Random Variable:
A random variable is a function that assigns a real value to each outcome in the sample space S (a numerical
outcome of an experiment):

P(X ∈ A) = P(X⁻¹(A)), where A is a set of real numbers and

X⁻¹(A) is the event consisting of all points s ∈ S such that X(s) ∈ A.

1. The distribution function F of the random variable X is

F(x) = P(X ≤ x) = P(X ∈ (−∞, x]), where x is a real number.

2. A random variable is said to be discrete when its set of possible values is countable. For
discrete random variables,

F(x) = ∑ P(X = y), summing over y ≤ x.

3. A random variable is said to be continuous if there is a probability density function f(x) such
that F(x) is the integral of f up to x. The joint distribution function F of random variables X and Y is
F(x, y) = P(X ≤ x, Y ≤ y), and X and Y are independent if and only if

F(x, y) = Fx(x) Fy(y) for all x and y.
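The discrete case can be made concrete with a short Python sketch (the fair six-sided die is an illustrative choice, not from the notes): the distribution function F(x) is just the running sum of the probability mass function over values y ≤ x.

```python
from fractions import Fraction

# Discrete random variable X = outcome of a fair six-sided die.
# Its distribution function is F(x) = sum of P(X = y) over y <= x.
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

def cdf(x):
    """Distribution function F(x) = P(X <= x)."""
    return sum(p for y, p in pmf.items() if y <= x)

print(cdf(3))  # P(X <= 3) = 1/2
print(cdf(0))  # 0: no possible value is <= 0
print(cdf(6))  # 1: every possible value is <= 6
```

F is a step function here, jumping by P(X = y) at each possible value y, which is exactly the countable-support case described in item 2.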


5. Continuous-Time Markov Chain:
A stochastic process {X(t) : t ≥ 0} with discrete state space S is called a continuous-time Markov chain
(CTMC) if for all t ≥ 0, s ≥ 0, i ∈ S, j ∈ S,

P(X(s + t) = j | X(s) = i, {X(u) : 0 ≤ u < s}) = P(X(s + t) = j | X(s) = i) = Pij(t),

where Pij(t) is the probability that the chain will be in state j, t time units from now, given that it is in
state i now. For each t ≥ 0 there is a transition matrix P(t) = (Pij(t)), with P(0) = I, the
identity matrix.

A CTMC makes transitions from state to state, independently of the past, according to a discrete-time
Markov chain, but once it enters a state it remains there, independently of the past, for an
exponentially distributed amount of time before changing state again. Thus a CTMC can be
specified by a transition matrix P = (Pij), describing how the chain changes state at
transition epochs, together with a set of holding-time rates {ai : i ∈ S}. Every time state i is
visited, the chain spends, on average, E(Hi) = 1/ai units of time there before moving to another
state.
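That two-part description (embedded jump chain plus exponential holding times) translates directly into a simulator. Below is a minimal Python sketch on a toy three-state chain (the matrix P, the rates, and all names are illustrative assumptions, not from the notes): in each visited state i, the chain holds for an Exponential(ai) time, then jumps according to the row Pij of the embedded chain.

```python
import random

P = {                        # embedded discrete-time jump chain: (target, prob) pairs
    0: [(1, 1.0)],           # from state 0, always jump to state 1
    1: [(0, 0.5), (2, 0.5)], # from state 1, jump to 0 or 2 with equal probability
    2: [(1, 1.0)],           # from state 2, always jump back to state 1
}
rates = {0: 1.0, 1: 2.0, 2: 0.5}   # holding-time rates a_i; E[H_i] = 1/a_i

def simulate_ctmc(start, t_max, rng):
    """Return the (jump_time, state) sequence of the CTMC up to time t_max."""
    t, state, history = 0.0, start, [(0.0, start)]
    while True:
        t += rng.expovariate(rates[state])       # hold an Exp(a_i) amount of time
        if t > t_max:
            return history
        targets, weights = zip(*P[state])
        state = rng.choices(targets, weights)[0] # jump per the embedded chain
        history.append((t, state))

hist = simulate_ctmc(0, 20.0, random.Random(0))
print(hist[:3])
```

Averaging the observed holding times in state 2 over a long run would recover E(H2) = 1/a2 = 2, matching the holding-time description above.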
