Arnthor Axelsson
FRE6083  Quantitative Methods in Finance
February 5, 2014
Problem R2.6. Suppose five fair coins are tossed. Let E be the event that all coins land
heads. Define the random variable $I_E$:
\[
I_E = \begin{cases} 1, & \text{if } E \text{ occurs} \\ 0, & \text{if } E^c \text{ occurs} \end{cases}
\]
For what outcome in the original sample space does $I_E$ equal 1? What is $P\{I_E = 1\}$?
Solution: For sample space S, containing all possible outcomes of the five coin tosses,
the event E, the specific outcome that all five tosses land heads, is given by
\[
E = \{(H, H, H, H, H)\}
\]
where H represents heads and $P(\{H\}) = P(\{T\}) = \frac{1}{2}$. Thus $I_E = 1$ exactly when E
occurs. The binomial distribution gives $P\{I_E = 1\}$, the probability that all five coins
land heads:
\[
P\{I_E = 1\} = P\{(H, H, H, H, H)\}
= \binom{n}{i} p^i (1 - p)^{n - i}
= \binom{5}{5} \left(\tfrac{1}{2}\right)^5 \left(1 - \tfrac{1}{2}\right)^0
= \left(\tfrac{1}{2}\right)^5 = \frac{1}{32}.
\]
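As a sanity check, the 32 equally likely outcomes can be enumerated directly; a small
Python sketch:

```python
from itertools import product

# Enumerate the 2^5 = 32 equally likely outcomes of five fair coin tosses.
outcomes = list(product("HT", repeat=5))

# The indicator I_E equals 1 only for the single outcome (H, H, H, H, H).
favorable = [w for w in outcomes if all(c == "H" for c in w)]

p = len(favorable) / len(outcomes)
print(len(outcomes), len(favorable), p)  # 32 1 0.03125
```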
Problem R2.7. Suppose a coin having probability 0.7 of coming up heads is tossed three
times. Let X denote the number of heads that appear in the three tosses. Determine the
probability mass function of X.
Solution: The probability mass function is that of a binomial random variable with
parameters (n, p) = (3, 0.7):
\[
p(i) = \binom{3}{i} (0.7)^i (0.3)^{3 - i}, \qquad i = 0, 1, 2, 3
\]
thus,
\[
p(i) = \begin{cases}
0.027, & i = 0 \\
0.189, & i = 1 \\
0.441, & i = 2 \\
0.343, & i = 3
\end{cases}
\]
meaning that the most likely outcome is two heads and one tail, with a probability of
44.1%. Note that, in accordance with the binomial theorem, $\sum_{i=0}^{3} p(i) = 1$.
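The four values above can be reproduced in a few lines of Python:

```python
from math import comb

n, p = 3, 0.7

# Binomial PMF: p(i) = C(n, i) * p^i * (1 - p)^(n - i)
pmf = {i: comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1)}

for i, prob in pmf.items():
    print(i, round(prob, 3))  # 0.027, 0.189, 0.441, 0.343

# The probabilities sum to 1, as required of any PMF.
print(round(sum(pmf.values()), 10))
```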
Problem R2.8. Suppose the distribution function of X is given by
\[
F(b) = \begin{cases}
0, & b < 0 \\
\frac{1}{2}, & 0 \le b < 1 \\
1, & 1 \le b < \infty
\end{cases}
\]
What is the probability mass function of X?
Solution: For a discrete random variable, $F(b) = \sum_{x_i \le b} p(x_i)$, so the PMF at a
jump point of the CDF is simply the size of the jump: the value of the CDF at that point
minus its value on the preceding interval,
\[
p(x_i) = F(x_i) - F(x_i^-).
\]
Thus we have
\[
p(b) = \begin{cases}
\frac{1}{2}, & b = 0 \\
\frac{1}{2}, & b = 1
\end{cases}
\]
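The jump-extraction step can be checked numerically; here is a small sketch (the helper
name `pmf_from_cdf` and the epsilon-based left limit are my own choices) that recovers the
PMF of this problem from its CDF:

```python
def pmf_from_cdf(cdf, points, eps=1e-9):
    """Recover a discrete PMF as the jump F(x) - F(x-) of the CDF
    at each candidate point, keeping only the points that jump."""
    return {x: cdf(x) - cdf(x - eps) for x in points if cdf(x) - cdf(x - eps) > 0}

# CDF from Problem R2.8
def F(b):
    if b < 0:
        return 0.0
    if b < 1:
        return 0.5
    return 1.0

print(pmf_from_cdf(F, [0, 1]))  # {0: 0.5, 1: 0.5}
```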
Problem R2.9. If the distribution function of X is given by
\[
F(b) = \begin{cases}
0, & b < 0 \\
\frac{1}{2}, & 0 \le b < 1 \\
\frac{3}{5}, & 1 \le b < 2 \\
\frac{4}{5}, & 2 \le b < 3 \\
\frac{9}{10}, & 3 \le b < 3.5 \\
1, & b \ge 3.5
\end{cases}
\]
Calculate the probability mass function of X.
Solution: Using the same jump-size reasoning as in Problem R2.8, we have
\[
p(b) = \begin{cases}
\frac{1}{2}, & b = 0 \\
\frac{1}{10}, & b = 1 \\
\frac{1}{5}, & b = 2 \\
\frac{1}{10}, & b = 3 \\
\frac{1}{10}, & b = 3.5
\end{cases}
\]
Problem R2.31. Compare the Poisson approximation with the correct binomial probability
for the following cases:
(a) P{X = 2} when n = 8, p = 0.1.
(b) P{X = 9} when n = 10, p = 0.95.
(c) P{X = 0} when n = 10, p = 0.1.
(d) P{X = 4} when n = 9, p = 0.2.
Solution: The binomial distribution gives the probability as
\[
P\{X = i\} = \binom{n}{i} p^i (1 - p)^{n - i}
\]
for the binomial random variable X with parameters (n, p). The Poisson approximation
introduces the parameter $\lambda = np$ to approximate the probability as
\[
P\{X = i\} \approx e^{-\lambda} \frac{\lambda^i}{i!}
\]
(a) Binomial: $P\{X = 2\} = \binom{8}{2} (0.1)^2 (0.9)^6 = 0.149$;
    Poisson: $P\{X = 2\} \approx e^{-0.8} \frac{0.8^2}{2!} = 0.144$
(b) Binomial: $P\{X = 9\} = \binom{10}{9} (0.95)^9 (0.05)^1 = 0.315$;
    Poisson: $P\{X = 9\} \approx e^{-9.5} \frac{9.5^9}{9!} = 0.130$
(c) Binomial: $P\{X = 0\} = \binom{10}{0} (0.1)^0 (0.9)^{10} = 0.349$;
    Poisson: $P\{X = 0\} \approx e^{-1} \frac{1^0}{0!} = 0.368$
(d) Binomial: $P\{X = 4\} = \binom{9}{4} (0.2)^4 (0.8)^5 = 0.066$;
    Poisson: $P\{X = 4\} \approx e^{-1.8} \frac{1.8^4}{4!} = 0.072$
These numbers show that the Poisson values are close to the binomial probabilities when p
is small, as in cases (a), (c) and (d), and poor when p is large, as in case (b); a small p
together with a moderately large n is precisely the regime assumed in deriving the Poisson
approximation.
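The four comparisons can be reproduced with a short script:

```python
from math import comb, exp, factorial

def binom_pmf(i, n, p):
    # Exact binomial probability C(n, i) * p^i * (1 - p)^(n - i)
    return comb(n, i) * p**i * (1 - p)**(n - i)

def poisson_pmf(i, lam):
    # Poisson approximation with lam = n * p
    return exp(-lam) * lam**i / factorial(i)

cases = [(2, 8, 0.1), (9, 10, 0.95), (0, 10, 0.1), (4, 9, 0.2)]
for i, n, p in cases:
    b, q = binom_pmf(i, n, p), poisson_pmf(i, n * p)
    print(f"i={i}, n={n}, p={p}: binomial={b:.3f}, Poisson={q:.3f}")
```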
Problem R3.64. A and B roll a pair of dice in turn, with A rolling first. A's objective is
to obtain a sum of 6, and B's is to obtain a sum of 7. The game ends when either player
reaches his or her objective, and that player is declared the winner.
(a) Find the probability that A is the winner.
(b) Find the expected number of rolls of the dice.
(c) Find the variance of the number of rolls of the dice.
Solution: Both players, A and B, have the same sample space
\[
S_A = S_B = \{(i, j) : i, j \in \{1, \dots, 6\}\}
\]
from which the events of A rolling a sum of 6 and B rolling a sum of 7 are defined as
\[
E_A = \{(1, 5), (2, 4), (3, 3), (4, 2), (5, 1)\} \quad \text{and} \quad
E_B = \{(1, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, 1)\}
\]
Thus, defining the probability of A winning on the first throw as $P(A_1)$, the probability
of B winning on his first throw as $P(B_1)$, the probability of A winning as $P(A_w)$ and
the number of rolls until game over as $R$, we have, as the size of the sample space in
either case is 36:
(a) Conditional on A missing the first throw, B wins with probability $\frac{1}{6}$ and
otherwise the game restarts, so $P(A_w \mid \bar{A}_1) = \frac{5}{6} P(A_w)$. Hence
\[
P(A_w) = P(A_w \mid A_1)P(A_1) + P(A_w \mid \bar{A}_1)P(\bar{A}_1)
= 1 \cdot \frac{5}{36} + \frac{5}{6} P(A_w) \cdot \frac{31}{36}
\quad \Leftrightarrow \quad
P(A_w) = \frac{5/36}{1 - 155/216} = \frac{30}{61}.
\]
(b) Conditioning on the outcome of the first round,
\[
E[R] = E[R \mid A_1]P(A_1) + E[R \mid \bar{A}_1 B_1]P(\bar{A}_1 B_1)
+ E[R \mid \bar{A}_1 \bar{B}_1]P(\bar{A}_1 \bar{B}_1)
\]
\[
= 1 \cdot \frac{5}{36} + 2 \cdot \frac{31}{36} \cdot \frac{1}{6}
+ (2 + E[R]) \cdot \frac{31}{36} \cdot \frac{5}{6}
\quad \Leftrightarrow \quad E[R] = \frac{402}{61}.
\]
(c) Condition on the outcome of the first round (A wins on roll 1, B wins on roll 2, or
the game restarts). Note that
\[
Var(R \mid A_1) = 0, \qquad E[R \mid A_1] = 1
\]
\[
Var(R \mid \bar{A}_1 B_1) = 0, \qquad E[R \mid \bar{A}_1 B_1] = 2
\]
\[
Var(R \mid \bar{A}_1 \bar{B}_1) = Var(R), \qquad
E[R \mid \bar{A}_1 \bar{B}_1] = 2 + E[R] = \frac{524}{61} \quad \text{(from part (b))}
\]
so that
\[
E[Var(R \mid \text{first round})] = P(\bar{A}_1)P(\bar{B}_1) Var(R) = \frac{155}{216} Var(R)
\]
Thus we compute
\[
Var(E[R \mid \text{first round}])
= 1^2 \cdot P(A_1) + 2^2 \cdot P(\bar{A}_1)P(B_1)
+ \left(\frac{524}{61}\right)^2 P(\bar{A}_1)P(\bar{B}_1) - (E[R])^2
\]
\[
= 1^2 \cdot \frac{5}{36} + 2^2 \cdot \frac{31}{216}
+ \left(\frac{524}{61}\right)^2 \cdot \frac{155}{216}
- \left(\frac{402}{61}\right)^2 \approx 10.234
\]
Now we can solve using the conditional variance formula,
\[
Var(R) = E[Var(R \mid \text{first round})] + Var(E[R \mid \text{first round}])
\approx \frac{155}{216} Var(R) + 10.234
\quad \Leftrightarrow \quad Var(R) \approx 36.24.
\]
Problem adapted from Probability Essentials. Let $X_1, \dots, X_n$ be independent and
exponentially distributed random variables with parameter $\lambda$. Consider the sum
\[
Y_n = \sum_{i=1}^{n} X_i.
\]
Show that $Y_n$ has a gamma distribution with parameters $(n, \lambda)$.
Solution: A gamma distribution with parameters $(n, \lambda)$ has probability density
function
\[
f(t; n, \lambda) = \frac{\lambda e^{-\lambda t} (\lambda t)^{n-1}}{\Gamma(n)}
= \frac{\lambda e^{-\lambda t} (\lambda t)^{n-1}}{(n-1)!},
\qquad n \in \mathbb{N},\ t \ge 0
\]
which shows that an exponentially distributed random variable is the special case of the
gamma distribution with n = 1:
\[
f(t; 1, \lambda) = \lambda e^{-\lambda t},
\]
so $Y_1 = X_1$ is gamma distributed with parameters $(1, \lambda)$.
Now we can solve by induction. Assume that for some $n \in \mathbb{N}$ the sum
$Y_n = \sum_{i=1}^{n} X_i$ of independent, exponentially distributed random variables
$X_1, \dots, X_n$ with parameter $\lambda$ has probability density function
\[
f_{Y_n}(t) = \frac{\lambda e^{-\lambda t} (\lambda t)^{n-1}}{(n-1)!}, \qquad t \ge 0.
\]
We want to show that the sum $Y_{n+1} = Y_n + X_{n+1}$, where $X_{n+1}$ is independent of
$Y_n$ and exponentially distributed with the same parameter, has probability density
function
\[
f_{Y_{n+1}}(t) = \frac{\lambda e^{-\lambda t} (\lambda t)^{n}}{n!}, \qquad t \ge 0.
\]
By the convolution formula for the density of a sum of independent random variables,
\[
f_{Y_{n+1}}(t) = \int_{-\infty}^{\infty} f_{Y_n}(z)\, f_{X_{n+1}}(t - z)\, dz
= \int_{0}^{t} \frac{\lambda e^{-\lambda z} (\lambda z)^{n-1}}{(n-1)!}\,
\lambda e^{-\lambda (t - z)}\, dz
\]
\[
= \lambda e^{-\lambda t} \int_{0}^{t} \frac{\lambda (\lambda z)^{n-1}}{(n-1)!}\, dz
= \frac{\lambda e^{-\lambda t} (\lambda t)^{n}}{n!}
\]
which shows that the sum $Y_{n+1}$ of $n + 1$ exponentially distributed, independent
random variables is a gamma distributed random variable with parameters $(n + 1, \lambda)$.
In plainer language, an exponentially distributed random variable gives the waiting time
until the first event in a Poisson process; the waiting time until the n-th event is then
the sum $Y_n$, which we have shown to be gamma distributed.
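The result can also be checked empirically; the sketch below (the parameter choices
lam = 2 and n = 5 are arbitrary) simulates sums of exponentials and compares the sample
mean and variance to the Gamma(n, λ) values $n/\lambda = 2.5$ and $n/\lambda^2 = 1.25$:

```python
import random

random.seed(0)

lam, n, trials = 2.0, 5, 200_000

# Each sample of Y_n is a sum of n independent Exp(lam) draws; by the
# result above it should be Gamma(n, lam) distributed.
samples = [sum(random.expovariate(lam) for _ in range(n)) for _ in range(trials)]

mean = sum(samples) / trials
var = sum((y - mean) ** 2 for y in samples) / trials
print(round(mean, 2), round(var, 2))  # close to 2.5 and 1.25
```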