Outline
Random process
Why study random processes?
Course outline
• Review of probability
• Introduction to random processes
• Introduction to some special random processes
• Poisson process
• Markov chains
• Random walk
• Brownian motion
• Stochastic calculus
• Itô integral
• Itô's formula for a function of Brownian motion and of an Itô process
• Solving stochastic differential equations
Plan
1 Probability space
Probability space
2 Random variables
Random variables
Simulation
Expectation and Variance
3 Random vectors
Probability models
Probability measure on a finite sample space
Ω = {x1, . . . , xn}
• P(A) = ∑_{xi ∈ A} p(xi)
Example 1
Additive rule
P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
where A ∪ B is the event that at least one event occurs and A ∩ B is the event that both events occur.
If A and B are disjoint,
P(A ∪ B) = P(A) + P(B)
Complement rule
The complement of A is Ac = Ω − A, the event containing all outcomes that are not in A:
P(Ac) = 1 − P(A)
Conditional probability
P(A|B) = P(A ∩ B) / P(B)
Example
Roll a fair die twice. What is the probability that the first roll is a 2 given that the sum of the rolls is 7?
Solution
• Sample space Ω = {(i, j) : 1 ≤ i, j ≤ 6}
• A: the first roll is a 2
• B: the sum of the rolls is 7
• Need to find P(A|B)
• B = {(1, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, 1)}
• A = {(2, 1), (2, 2), (2, 3), (2, 4), (2, 5), (2, 6)}
• AB = {(2, 5)}
P(A|B) = P(AB) / P(B) = (1/36) / (6/36) = 1/6
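A quick Monte Carlo check of this answer (a sketch; the function name and trial count are our choices):

```python
import random

def estimate_p_first_is_2_given_sum_7(trials=200_000, seed=0):
    """Monte Carlo estimate of P(first roll = 2 | sum of rolls = 7)."""
    rng = random.Random(seed)
    in_b = in_ab = 0
    for _ in range(trials):
        i, j = rng.randint(1, 6), rng.randint(1, 6)
        if i + j == 7:            # event B: the sum is 7
            in_b += 1
            in_ab += (i == 2)     # event A inside B: the first roll is 2
    return in_ab / in_b

print(estimate_p_first_is_2_given_sum_7())  # close to 1/6 ≈ 0.1667
```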
Practice
Roll a fair die twice. What is the probability that the first roll is a 2 given that the second roll is even?
Does the result of the first roll affect the second?
Independence
Multiplication rule
1 P(AB) = P(B)P(A|B)
2 General case
P(A1 A2 · · · Ak) = P(A1)P(A2|A1)P(A3|A1 A2) · · · P(Ak|A1 · · · Ak−1)
Law of total probability
Let A1, ..., Ak be a partition of the sample space:
• Ai ∩ Aj = ∅ for all i ≠ j (mutually exclusive (disjoint))
• ∪i Ai = Ω
Then, for any event B, we have
P(B) = ∑_{i=1}^{k} P(B ∩ Ai) = ∑_{i=1}^{k} P(B|Ai)P(Ai).
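The formula can be sanity-checked on the dice example by partitioning on the value of the first roll (a sketch; the variable names are ours):

```python
from fractions import Fraction

# Law of total probability: P(B) = sum_i P(B | A_i) P(A_i),
# where A_i = "first roll shows i" (i = 1..6) partitions the sample space.
# Event B: the sum of two fair die rolls is 7.
p_Ai = Fraction(1, 6)          # P(A_i) for each i
p_B_given_Ai = Fraction(1, 6)  # P(B | A_i): the second roll must equal 7 - i
p_B = sum(p_B_given_Ai * p_Ai for _ in range(6))
print(p_B)  # 1/6
```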
Random variables
Definition
A random variable is a function on the sample space:
X : Ω → R
ω ↦ X(ω)
Example - Binomial Asset Pricing Model
u: up factor, d: down factor
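The slide's figure is not in the extracted text; assuming the standard dynamics (the price is multiplied by u on each head and by d on each tail, consistent with the numbers used later in the deck), the model can be sketched by enumerating toss sequences:

```python
from itertools import product

def binomial_prices(s0, u, d, n):
    """Map each coin-toss sequence of length n to the resulting stock price."""
    prices = {}
    for toss in product("HT", repeat=n):
        s = s0
        for t in toss:
            s *= u if t == "H" else d   # up factor on H, down factor on T
        prices["".join(toss)] = s
    return prices

print(binomial_prices(4, 2, 0.5, 2))
# {'HH': 16, 'HT': 4.0, 'TH': 4.0, 'TT': 1.0}
```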
Definition (Cumulative distribution function (cdf))
The probability that the value of X does not exceed a given value:
F(x) = P(X ≤ x)
Properties of cdf
• Increasing
• lim_{x→−∞} F(x) = 0 and lim_{x→∞} F(x) = 1
• Has left limits
• Right-continuous
Distribution of a discrete random variable
Range(X): countable set x1, x2, ...
Definition (Probability mass function (pmf))
The pmf of X is the function pX given by
pX(xi) = P(X = xi)
which satisfies
1 0 ≤ pX(xi) ≤ 1 for all i
2 ∑_i pX(xi) = 1
Properties
(1) P(a ≤ X ≤ b) = ∑_{a ≤ xi ≤ b} pX(xi)   (2) cdf F(x) = ∑_{xi ≤ x} pX(xi)
Bernoulli distribution
P(X = 0) = 1 − p
P(X = 1) = p
Denote X ∼ Ber(p)
Example - Bernoulli distribution
x          0            1
P(X = x)   P(T) = 1/2   P(H) = 1/2
Binomial distribution
Example - Binomial distribution
Example - Payoff of European Call Option
• European call option: confers the right to buy the stock at maturity or expiration time T = 2 for strike price K = 14 dollars. It is worth
  • S2 − K if S2 − K > 0
  • 0 otherwise
• Value (payoff) of the option at maturity:
C2 = (S2 − K)+ = max(S2 − K, 0), where x+ = max(x, 0)
• Stock price: binomial model with S0 = 4, d = 1/2, u = 2, p = p(H) = 1/2, q = p(T) = 1 − p = 1/2
• Find the probability distribution of the payoff C2 of the European call option.
Solution
pmf of S2:
x           1     4     16
P(S2 = x)   1/4   1/2   1/4
pmf of C2:
x           0     2
P(C2 = x)   3/4   1/4
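The pmf of C2 can be reproduced by enumerating the four toss sequences (a sketch with our variable names):

```python
from collections import Counter
from fractions import Fraction

# Payoff pmf of the call C2 = max(S2 - K, 0) in the two-period
# binomial model: S0 = 4, u = 2, d = 1/2, p(H) = p(T) = 1/2.
s0, u, d, K = 4, Fraction(2), Fraction(1, 2), 14
pmf = Counter()
for a in (u, d):
    for b in (u, d):
        s2 = s0 * a * b
        pmf[max(s2 - K, 0)] += Fraction(1, 4)  # each toss sequence has prob 1/4

print(dict(pmf))  # payoff 2 with probability 1/4, payoff 0 with probability 3/4
```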
Example - European Put Option
• European put option: confers the right to sell the stock at maturity or expiration time T = 2 for strike price K = 3 dollars. It is worth
  • K − S2 if K − S2 > 0
  • 0 otherwise
• Value (payoff) of the option at maturity:
P2 = max(K − S2, 0) = (K − S2)+
Solution
pmf of S2:
x           1     4     16
P(S2 = x)   1/4   1/2   1/4
pmf of P2:
x           0     2
P(P2 = x)   3/4   1/4
Practice
Find the cdf of the stock price S2, the European call option C2, and the European put option P2 in the previous example.
Distribution of a continuous random variable
Range(X): uncountable
Definition (Probability density function (pdf))
The pdf of X is a function f that satisfies
1 f(x) ≥ 0 for all x
2 P(−∞ < X < ∞) = ∫_{−∞}^{∞} f(x) dx = 1
Properties
• P(X = a) = P(a ≤ X ≤ a) = ∫_a^a f(x) dx = 0
• P(a ≤ X ≤ b) = ∫_a^b f(x) dx
• P(a ≤ X ≤ b) = F(b) − F(a)
• cdf F(a) = P(X ≤ a) = ∫_{−∞}^{a} f(x) dx
Probability as an Area
P(a − ε/2 ≤ X ≤ a + ε/2) = ∫_{a−ε/2}^{a+ε/2} f(x) dx ≈ εf(a)
f(a) is a measure of how likely it is that the random variable will be near a.
Continuous uniform distribution
Exponential distribution with parameter λ
• pdf
f(x) = 0 for x < 0,   f(x) = λe^{−λx} for x ≥ 0
• cdf
F(x) = 0 for x < 0,   F(x) = 1 − e^{−λx} for x ≥ 0
Normal distribution
Definition
A continuous RV X is said to be normally distributed with parameters µ and σ² if its pdf is
f(x) = (1 / (σ√(2π))) e^{−(x−µ)²/(2σ²)},   −∞ < x < ∞
Denote X ∼ N(µ, σ²).
Properties
1 E(X) = µ
2 Var(X) = σ²
Simulate random number
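The slide's content is not in the extracted text; one standard technique is inverse-transform sampling, sketched here for the exponential distribution using its cdf F(x) = 1 − e^{−λx} (the function name is ours):

```python
import math
import random

def sample_exponential(lam, rng):
    """Inverse-transform sampling: if U ~ Uniform(0, 1), then
    F^{-1}(U) = -ln(1 - U) / lam has the Exponential(lam) distribution."""
    u = rng.random()
    return -math.log(1.0 - u) / lam

rng = random.Random(42)
samples = [sample_exponential(2.0, rng) for _ in range(100_000)]
print(sum(samples) / len(samples))  # sample mean, close to 1/lam = 0.5
```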
Practice
Definition (Expectation of random variable X)
E(X) = ∑_{k=1}^{∞} xk P(X = xk), if X is discrete with countable values
E(X) = ∫_{−∞}^{∞} x fX(x) dx, if X is continuous with pdf fX
Definition (Variance of random variable X)
Property
Some properties of expectation and variance
Example
Find the expectation and variance of the stock price S2, the European call option C2, and the European put option P2 in the previous example.
Joint distribution of two discrete random variables
pX,Y(x, y) = P(X = x, Y = y)
Properties
For constants a < b and c < d,
P(a ≤ X ≤ b, c ≤ Y ≤ d) = ∑_{x=a}^{b} ∑_{y=c}^{d} P(X = x, Y = y)
Marginal pmf
PX(X = x) = ∑_{y=−∞}^{∞} P(X = x, Y = y)
and
PY(Y = y) = ∑_{x=−∞}^{∞} P(X = x, Y = y)
Example
Consider binomial asset pricing model
• S0 = 4, u = 2, d = 1/2
• p(H) = 1/3, p(T ) = 2/3
Probability as a volume
P((X, Y) ∈ R) = ∬_R fX,Y(x, y) dx dy
Definition (Marginal pdf)
The marginal pdfs of X and Y are given by
fX(x) = ∫_{−∞}^{∞} fX,Y(x, y) dy
and
fY(y) = ∫_{−∞}^{∞} fX,Y(x, y) dx
Also,
fX,Y(x, y) = ∂²FX,Y(x, y) / ∂x∂y
Independence of random variables
• X and Y are independent if and only if
FX,Y(x, y) = FX(x)FY(y) for all x, y
• If X and Y are discrete RVs then X and Y are independent if and only if
P(X = x, Y = y) = P(X = x)P(Y = y) for all x, y
• If X and Y are continuous RVs then X and Y are independent if and only if
fX,Y(x, y) = fX(x)fY(y) for all x, y
Definition (Expectation of function of random vector)
E[g(X, Y)] = ∑_x ∑_y g(x, y) P(X = x, Y = y), if X, Y are discrete
E[g(X, Y)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) fX,Y(x, y) dx dy, if X, Y are continuous
Example
E[XY] = ∑_x ∑_y xy P(X = x, Y = y), if X, Y are discrete
E[XY] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xy fX,Y(x, y) dx dy, if X, Y are continuous
Covariance and correlation coefficient
cor(X, Y) = cov(X, Y) / (σX σY)
where σX, σY are the standard deviations of X and Y, and
−1 ≤ cor(X, Y) ≤ 1
The correlation coefficient measures how strong the linear relationship between X and Y is.
Bivariate normal distribution
Let X1 ∼ N(µ1, σ1²) and X2 ∼ N(µ2, σ2²) be two normal random variables with covariance σ12. Then the random vector X = (X1, X2)^T has a bivariate normal distribution if the joint pdf is given by
f(x1, x2) = (1 / (2π det(Σ)^{1/2})) e^{−(1/2)(x−µ)^T Σ^{−1} (x−µ)}
where
• µ = (µ1, µ2)^T
• Σ = ( σ1²  σ12 ; σ12  σ2² ) is the variance-covariance matrix of X
Properties
• Cov(X, Y ) = Cov(Y, X)
• Cov(X, X) = V ar(X)
• Cov(aX, Y ) = aCov(X, Y )
• Cov(X + Y, Z) = Cov(X, Z) + Cov(Y, Z)
• If X and Y are independent then Cov(X, Y ) = 0
Variance of Sum
X1, . . . , Xn: RVs
Var(∑_{i=1}^{n} Xi) = ∑_{i=1}^{n} Var(Xi) + ∑_{i≠j} Cov(Xi, Xj)
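The formula can be checked numerically for n = 2, i.e. Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y); the correlated sample below is our illustrative choice:

```python
import random

def var(v):
    """Sample variance (divisor n - 1)."""
    m = sum(v) / len(v)
    return sum((x - m) ** 2 for x in v) / (len(v) - 1)

def cov(v, w):
    """Sample covariance (divisor n - 1)."""
    mv, mw = sum(v) / len(v), sum(w) / len(w)
    return sum((a - mv) * (b - mw) for a, b in zip(v, w)) / (len(v) - 1)

rng = random.Random(0)
xs = [rng.gauss(0, 1) for _ in range(100_000)]
ys = [x + rng.gauss(0, 0.5) for x in xs]   # Y deliberately correlated with X

lhs = var([a + b for a, b in zip(xs, ys)])
rhs = var(xs) + var(ys) + 2 * cov(xs, ys)
print(abs(lhs - rhs) < 1e-6)  # the identity holds for sample moments: True
```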
Sum of normal distributions
• If (X, Y) has a multivariate normal distribution, X ∼ N(µX, σX²), Y ∼ N(µY, σY²) and Cov(X, Y) = σXY, then
X + Y ∼ N(µX + µY, σX² + σY² + 2σXY)
• If X ∼ N(µX, σX²), Y ∼ N(µY, σY²) and X and Y are independent, then
X + Y ∼ N(µX + µY, σX² + σY²)
Example
Solution
Conditioning a RV on an event
pX|A(x) = P(X = x|A) = P((X = x) ∩ A) / P(A), if P(A) > 0
Example
Let X be the roll of a fair die and let A be the event that the roll is an even number. Then
pX|A(1) = P(X = 1 and roll is even) / P(roll is even) = 0
Example
Let X be the roll of a fair die and let B be the event that the roll is an even number. Then
pX|B(2) = P(X = 2 and roll is even) / P(roll is even) = 1/3
Conditioning a discrete RV on another
• 2 RVs X and Y
• given Y = y with P(Y = y) > 0
• conditional pmf of X:
pX|Y(x|y) = P(X = x, Y = y) / P(Y = y)
For each y, we view the joint pmf along the slice Y = y and renormalize so that
∑_x pX|Y(x|y) = 1
Example
Consider a binomial asset pricing model with S0 = 4, d = 1/2, u = 2, p = 2/3 and q = 1/3. Find the conditional pmf of S2 given S1 = 2.
Solution
x                  1     4     8
pS2|S1=2(x|2)      1/3   2/3   0
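The conditional pmf can be recomputed by enumerating toss sequences and renormalizing the slice S1 = 2 (a sketch; the variable names are ours):

```python
from fractions import Fraction
from itertools import product

# Conditional pmf of S2 given S1 = 2 in the binomial model
# S0 = 4, u = 2, d = 1/2, p(H) = 2/3, p(T) = 1/3.
s0, u, d = 4, Fraction(2), Fraction(1, 2)
p = {"H": Fraction(2, 3), "T": Fraction(1, 3)}

joint = {}  # (s1, s2) -> probability
for t1, t2 in product("HT", repeat=2):
    s1 = s0 * (u if t1 == "H" else d)
    s2 = s1 * (u if t2 == "H" else d)
    joint[(s1, s2)] = joint.get((s1, s2), 0) + p[t1] * p[t2]

# Renormalize the slice S1 = 2 by P(S1 = 2).
p_s1_2 = sum(pr for (s1, _), pr in joint.items() if s1 == 2)
cond = {s2: pr / p_s1_2 for (s1, s2), pr in joint.items() if s1 == 2}
print(cond)  # S2 = 4 with probability 2/3, S2 = 1 with probability 1/3
```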
Practice
Conditional distributions
• If X and Y are jointly distributed discrete random variables, then the conditional probability mass function of Y given X = x is
P(Y = y|X = x) = P(X = x, Y = y) / P(X = x), defined when P(X = x) > 0.
• For continuous random variables X and Y, the conditional density function of Y given X = x is
fY|X(y|x) = f(x, y) / fX(x),
which gives the likelihood that Y takes values near y given that X takes values near x.
Example
Let X1 and X2 be jointly normal random variables with parameters µ1, σ1, µ2, σ2, and ρ. The joint pdf is given by
f(x1, x2) = (1 / (2πσ1σ2√(1 − ρ²))) exp{ −(1 / (2(1 − ρ²))) [ (x1 − µ1)²/σ1² + (x2 − µ2)²/σ2² − 2ρ(x1 − µ1)(x2 − µ2)/(σ1σ2) ] }
and
fX2|X1(x2|x1) = f(x1, x2) / fX1(x1) = ...
Another way to find the conditional distribution of X2 given X1 = x1
Using the construction
X1 = µ1 + σ1 Z1
X2 = µ2 + σ2(ρZ1 + √(1 − ρ²) Z2)
Given X1 = x1, we have
Z1 = (x1 − µ1)/σ1
Then
X2 = µ2 + σ2(ρ(x1 − µ1)/σ1 + √(1 − ρ²) Z2) = µ2 + ρσ2 (x1 − µ1)/σ1 + σ2√(1 − ρ²) Z2
Since Z2 and Z1 are independent, knowing Z1 does not provide any information about Z2. It implies that given X1 = x1, X2 is a linear function of Z2, thus it is normal with mean
µ2 + ρσ2 (x1 − µ1)/σ1
and variance
σ2²(1 − ρ²)
So
X2|(X1 = x1) ∼ N(µ2 + ρσ2 (x1 − µ1)/σ1, σ2²(1 − ρ²))
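The construction above can be checked by simulation: sample (X1, X2) via Z1 and Z2, then inspect X2 on a thin slice around a fixed x1 (the parameter values below are our choices):

```python
import math
import random

# Sample (X1, X2) via X1 = mu1 + s1*Z1, X2 = mu2 + s2*(rho*Z1 + sqrt(1-rho^2)*Z2),
# then condition empirically on X1 being near x1_target.
mu1, s1, mu2, s2, rho = 0.0, 1.0, 1.0, 2.0, 0.8
x1_target = 0.5
rng = random.Random(1)

near = []
for _ in range(200_000):
    z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
    x1 = mu1 + s1 * z1
    x2 = mu2 + s2 * (rho * z1 + math.sqrt(1 - rho**2) * z2)
    if abs(x1 - x1_target) < 0.05:   # thin slice around x1_target
        near.append(x2)

emp_mean = sum(near) / len(near)
theo_mean = mu2 + rho * s2 * (x1_target - mu1) / s1   # conditional mean = 1.8
print(round(emp_mean, 2), theo_mean)
```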
Conditional expectation of Y |X = x
• E(Y |X = x) = ∑_y y P(Y = y|X = x), if Y is discrete
• E(Y |X = x) = ∫_{−∞}^{∞} y fY|X(y|x) dy, if Y is continuous
• E(Y |X = x) is a function of x, i.e., the result depends on the value of x
Example
Suppose that W1 ∼ N(0, 1) and W2 ∼ N(0, 1) are the log-returns of the first and the second year of stock A. Suppose W1 and W2 are independent. The cumulative log-return of this stock is
B1 = W1
and
B2 = W1 + W2.
Given that B1 = 1, find the conditional expectation of B2.
Important note
σ-algebra: record of information
Example
Some important σ-algebras of Ω, the sample space when tossing a coin three times:
1 F0 = {∅, Ω}: the trivial σ-algebra; it contains no information. Knowing whether the outcome ω of the three tosses is in ∅ and whether it is in Ω tells you nothing about ω.
2 F1 = {∅, Ω, AH, AT}, where
AH = {HHH, HHT, HTH, HTT} = {H on first toss}
AT = {THH, THT, TTH, TTT} = {T on first toss}
F1: information of the first toss, or "information up to time 1". For example, you are told that the first toss is H and no more.
Example (continued)
3 F2: the σ-algebra generated by AHH, AHT, ATH, ATT, where
AHH = {HHH, HHT} = {HH on the first two tosses}
AHT = {HTH, HTT} = {HT on the first two tosses}
ATH = {THH, THT} = {TH on the first two tosses}
ATT = {TTH, TTT} = {TT on the first two tosses}
F2: information of the first two tosses, or "information up to time 2"
4 F3: the set of all subsets of Ω, giving "full information" about the outcome of all three tosses
F-measurable
Definition
A random variable X is called F-measurable if σ(X) ⊂ F.
Meaning: the information in F is enough to determine the value of the random variable X(ω), even though it may not be enough to determine the outcome ω of the random experiment.
Example
Consider a binomial asset pricing model and let F2 be the σ-algebra generated by the information of the first two tosses.
The asset prices S1 and S2 are F2-measurable but S3 is not F2-measurable.
Conditional expectation of X given a σ-algebra F
The conditional expectation E(X|F) is a random variable which satisfies:
• F-measurability
Meaning: the estimate E(X|F) of X is based on the information in F.
• The partial-averaging property
∫_A E(X|F) dP = ∫_A X dP for all A ∈ F
E(X|F) is indeed an estimate of X: it gives the same averages as X over all the sets in F.
If F has many sets, which provide a fine resolution of the uncertainty inherent in ω, then this partial-averaging property over the "small" sets in F says that E(X|F) is a good estimator of X.
Conditional expectation given a random variable
Definition
Example
Consider a binomial asset pricing model with S0 = 4, u = 1/d = 2, p(H) = 2/3 and p(T) = 1/3. Find E(S2|S1).
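Since E(S2|S1) is a function of S1, it can be computed outcome by outcome (a sketch; the function and variable names are ours):

```python
from fractions import Fraction

# E(S2 | S1) in the binomial model S0 = 4, u = 2, d = 1/2,
# p(H) = 2/3, p(T) = 1/3: one value per possible outcome of S1.
u, d = Fraction(2), Fraction(1, 2)
pH, pT = Fraction(2, 3), Fraction(1, 3)

def cond_exp_s2(s1):
    """E(S2 | S1 = s1) = pH * (u * s1) + pT * (d * s1)."""
    return pH * u * s1 + pT * d * s1

for s1 in (8, 2):
    print(s1, cond_exp_s2(s1))
# E(S2 | S1 = 8) = 12 and E(S2 | S1 = 2) = 3, i.e. E(S2 | S1) = (3/2) S1
```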
Properties of conditional expectation
1 Linearity
E(aY + bZ|X) = aE(Y |X) + bE(Z|X)
2 Taking out what is known (a typical approach to verify the Markov and martingale properties)
E(f(X)Y |X) = f(X)E(Y |X)
3 Iterated conditioning: for σ-algebras G ⊂ H,
E(E(Z|H)|G) = E(Z|G)
Solution
• Given S1 = 8, (S2, S3) takes the pair values (16, 32), (16, 8), (4, 8), (4, 2). So
E(S2 + S3|S1 = 8) = (2/3)²(16 + 32) + (2/3)(1/3)(16 + 8) + (1/3)(2/3)(4 + 8) + (1/3)²(4 + 2) = 30
• Hence E(S2 + S3|S1 = 8) = E(S2|S1 = 8) + E(S3|S1 = 8) = 30
• Similarly E(S2 + S3|S1 = 2) = E(S2|S1 = 2) + E(S3|S1 = 2) = 7.5
• Regardless of the outcome of S1, we have E(S2 + S3|S1) = E(S2|S1) + E(S3|S1)
Example - Take out what is known
Compare
E(S1 S2 |S1 )
and
S1 E(S2 |S1 )
Example - Iterated conditioning
Compare
E(S3 |S1 )
and
E(E(S3 |(S1 , S2 ))|S1 )
Example - Independence
Compare
E(S2/S1 | S1)
and
E(S2/S1)
Practice
Suppose that W1 ∼ N(0, 1) and W2 ∼ N(0, 1) are the log-returns of the first and the second year of stock A. Suppose W1 and W2 are independent. The cumulative log-return of this stock is
B1 = W1
and
B2 = W1 + W2.
Find E(B2|B1).