
Simulation exercise sheet 1

The exercises marked with ∗ are not part of the curriculum; they form
supplementary material.

1. Let U be uniformly distributed on the interval [8, 12]. Write down its
density function and calculate its expectation and variance.
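
A minimal Python sketch (standard library only) that checks the analytic
answers by simulation:

    import random

    # Monte Carlo check: the density of U is 1/4 on [8, 12],
    # E U = (8 + 12)/2 = 10 and Var U = (12 - 8)**2 / 12 = 4/3.
    N = 10**6
    samples = [random.uniform(8, 12) for _ in range(N)]
    mean = sum(samples) / N
    var = sum((u - mean) ** 2 for u in samples) / N
    print(mean, var)   # should be close to 10 and 1.333...
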
2. If U1, . . . , UN are i.i.d. uniformly distributed on [0, 1], then how can you
transform them into i.i.d. uniformly distributed random variables on
[a, b]?
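
A sketch of the affine transformation U ↦ a + (b − a)U (the endpoint values
below are a hypothetical example):

    import random

    a, b = 8.0, 12.0                             # example endpoints (hypothetical choice)
    N = 5
    U = [random.random() for _ in range(N)]      # i.i.d. uniform on [0, 1]
    V = [a + (b - a) * u for u in U]             # i.i.d. uniform on [a, b]
    print(V)
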
3. Consider f(x) = xe^x. The numerical integrals ∫_[0,1] f(x) dx and
∫_[2,3] f(x) dx can be written in the form of an expectation E g(V),
where V is a suitable random variable and g is a suitable function.
How would you choose g and V in the two cases above? How would
you evaluate these integrals in a Monte Carlo way?
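
One possible choice is to take V uniform on the integration interval; a
Python sketch of the resulting Monte Carlo estimator:

    import math
    import random

    def mc_integral(a, b, n=10**6):
        """Monte Carlo estimate of the integral of x*exp(x) over [a, b]:
        V is uniform on [a, b] (density 1/(b - a)) and g(v) = (b - a) * v * exp(v),
        so E g(V) equals the integral."""
        total = 0.0
        for _ in range(n):
            v = random.uniform(a, b)
            total += (b - a) * v * math.exp(v)
        return total / n

    print(mc_integral(0, 1))   # exact value: 1
    print(mc_integral(2, 3))   # exact value: 2*e**3 - e**2
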
4. We have i.i.d. random numbers U1, . . . , UN, uniformly distributed on
[0, 1]. We want to generate from them V1, . . . , VN, i.i.d. random vari-
ables with density function γV(x) = 2x for x ∈ [0, 1] and γV(x) = 0 for
x ∉ [0, 1]. How can we achieve this?
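
A sketch of the inverse transform solution (the c.d.f. of V is x² on [0, 1],
so V = √U works):

    import math
    import random

    N = 10**5
    U = [random.random() for _ in range(N)]
    # If F_V(x) = x**2 on [0, 1], then F_V^{-1}(u) = sqrt(u),
    # so sqrt(U_i) has density 2x on [0, 1].
    V = [math.sqrt(u) for u in U]
    print(sum(V) / N)   # should be close to E V = 2/3
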
5. Let U be uniform on [0, 1]. Find a function L such that L(U) has the
so-called double exponential distribution, i.e. it has density

    γ(u) = (1/2) e^{−|u|}, u ∈ R.

(Hint: check that the c.d.f. corresponding to γ is F(x) = e^x/2, x ≤ 0;
F(x) = 1 − e^{−x}/2, x > 0, and conclude that L(y) := ln(2y), y ∈ (0, 1/2);
L(y) := − ln(2 − 2y), y ∈ (1/2, 1), the inverse of F, will do.)
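
A minimal Python sketch of the hinted inverse-transform solution:

    import math
    import random

    def L(y):
        """Inverse of the double exponential c.d.f., as in the hint."""
        if y < 0.5:
            return math.log(2 * y)
        return -math.log(2 - 2 * y)

    N = 10**5
    # random.random() lies in [0, 1); the event y == 0.0 has negligible
    # probability and is ignored here.
    X = [L(random.random()) for _ in range(N)]
    print(sum(X) / N)   # should be close to 0, since the density is symmetric
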
6. We have i.i.d. random numbers W1, . . . , WN with density γW(x) =
(3/7)x², x ∈ [1, 2]. Using this sample, let us generate i.i.d. random
variables U1, . . . , UN which are uniformly distributed on [0, 1]. (Hint:
we have fabricated W1 from U1 so far. Now we need to go the other
way round.)
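
A sketch of the reverse direction: the c.d.f. of W is FW(x) = (x³ − 1)/7
on [1, 2], so FW(Wi) is uniform on [0, 1].

    import random

    N = 10**5

    # Fabricate a sample W with density (3/7) x**2 on [1, 2] by inverse transform:
    # F_W(x) = (x**3 - 1) / 7, so F_W^{-1}(u) = (7u + 1)**(1/3).
    W = [(7 * random.random() + 1) ** (1 / 3) for _ in range(N)]

    # Going the other way round: F_W(W_i) is uniform on [0, 1].
    U = [(w ** 3 - 1) / 7 for w in W]
    print(min(U), max(U), sum(U) / N)   # the mean should be close to 1/2
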
7. We have i.i.d. random numbers U1, . . . , UN at our disposal, uni-
formly distributed on [0, 10]. We need i.i.d. random variables R1, . . . , RN
such that P(R1 = −1) = 0.4, P(R1 = 0) = 0.3 and P(R1 = 1) = 0.3.
How can we generate them? (Hint: for any subinterval J of [0, 10],
P(U1 ∈ J) equals the length of J divided by 10.)
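
One possible solution partitions [0, 10] into intervals of lengths 4, 3 and 3;
a Python sketch:

    import random

    def make_R(u):
        """Map a uniform draw on [0, 10] to {-1, 0, 1} with the required probabilities."""
        if u < 4:        # P(U < 4) = 4/10 = 0.4
            return -1
        if u < 7:        # P(4 <= U < 7) = 3/10 = 0.3
            return 0
        return 1         # P(U >= 7) = 3/10 = 0.3

    N = 10**5
    R = [make_R(random.uniform(0, 10)) for _ in range(N)]
    print(R.count(-1) / N, R.count(0) / N, R.count(1) / N)
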
8. Calculate the moment generating function of the exponential distri-
bution with parameter λ > 0, i.e.

    M(θ) := E e^{θX}, θ ∈ R,

where X ∼ Exp(λ). (The result is λ/(λ − θ) for θ < λ and infinity other-
wise.)
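
A worked line for θ < λ (using the exponential density λe^{−λx} on [0, ∞)):

    M(θ) = E e^{θX} = ∫_0^∞ e^{θx} λe^{−λx} dx = λ ∫_0^∞ e^{−(λ−θ)x} dx = λ/(λ − θ),

and for θ ≥ λ the integrand does not decay, so the integral is infinite.
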

9. ∗ This is a supplementary exercise, for those interested in program-
ming. Test the Box-Muller method: generate standard Gaussian ran-
dom variables G1 , . . . , G10000 using this method (and a computer). Then
draw a histogram: take each interval [k/100, (k + 1)/100), for k ∈
{−300, −299, . . . , 298, 299}, and draw a rectangle over it which has an
area of
(number of i such that Gi ∈ [k/100, (k + 1)/100)) / 10000.
The histogram should have a shape that is similar to the standard
Gaussian density for x ∈ [−3, 3), i.e. similar to
φ(x) = (1/√(2π)) e^{−x²/2}.

Try to check this.
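
A minimal sketch of the experiment, assuming numpy and matplotlib are
available (plain random and a hand-rolled histogram would do just as well):

    import numpy as np
    import matplotlib.pyplot as plt

    # Box-Muller: two independent uniforms on (0, 1) yield two independent
    # standard Gaussians.
    n = 10000
    U1 = np.random.random(n // 2)
    U2 = np.random.random(n // 2)
    R = np.sqrt(-2.0 * np.log(1.0 - U1))   # 1 - U1 lies in (0, 1], avoiding log(0)
    G = np.concatenate([R * np.cos(2 * np.pi * U2), R * np.sin(2 * np.pi * U2)])

    # Histogram over [-3, 3) with bin width 1/100; each rectangle has area
    # (number of i with G_i in the bin) / 10000, as in the exercise.
    edges = np.arange(-3.0, 3.001, 0.01)
    counts, _ = np.histogram(G, bins=edges)
    plt.bar(edges[:-1], counts / (n * 0.01), width=0.01, align="edge")

    # Overlay the standard Gaussian density for comparison.
    x = np.linspace(-3.0, 3.0, 601)
    plt.plot(x, np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi))
    plt.show()
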
10. We possess a device which, based on principles of physics, is able to
produce random variables Z1 , . . . , ZN that are independent and have
exponential distribution with parameter 2.
We need independent and identically distributed random variables
Y1 , . . . , YN such that their common law has density

γ(y) = y²/9, y ∈ [0, 3], and γ(y) = 0, y ∉ [0, 3].

How can we generate such Y1 , . . . , YN using Z1 , . . . , ZN ?



Calculate E √Y1 analytically. Also write down the Monte Carlo esti-
mate for evaluating E √Y1 using the random numbers Y1, . . . , YN.
Due to some error, our device produces Z̃1 , . . . , Z̃N that have exponen-
tial distribution with parameter 4 instead of 2. Not knowing this, we
continue to generate Ỹ1 , . . . , ỸN from Z̃1 , . . . , Z̃N in the same way as
we generated Y1 , . . . , YN from Z1 , . . . , ZN .
What is the density of the law of Ỹ1 ?
(Hint: it is easier to determine the cumulative distribution function of
Ỹ1 first and then differentiate it to get the density.)
Using this density, evaluate the difference E √Ỹ1 − E √Y1 (this shows
the bias introduced in the calculation of E √Y1 by the error of the de-
vice).
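
A sketch of the first part in Python: 1 − e^{−2Zi} is uniform on [0, 1] and
the c.d.f. of Y1 is y³/27 on [0, 3], which suggests Yi = 3(1 − e^{−2Zi})^{1/3}.

    import math
    import random

    N = 10**5

    # Device output: Z_i i.i.d. exponential with parameter 2.
    Z = [random.expovariate(2.0) for _ in range(N)]

    # U_i = 1 - exp(-2 Z_i) is uniform on [0, 1]; the c.d.f. of Y is G(y) = y**3 / 27
    # on [0, 3], so Y_i = G^{-1}(U_i) = 3 * U_i**(1/3) has density y**2 / 9 on [0, 3].
    Y = [3.0 * (1.0 - math.exp(-2.0 * z)) ** (1 / 3) for z in Z]

    # Monte Carlo estimate of E sqrt(Y_1); the analytic value is 6*sqrt(3)/7 ≈ 1.485.
    estimate = sum(math.sqrt(y) for y in Y) / N
    print(estimate)
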
