
Simulation of Random Variables

In the following, we denote by U a random variable uniformly distributed on [0, 1].
We assume first that we know how to simulate U (see Appendix).

(a) Simulation of Bernoulli distribution of parameter p

Fix a number p ∈ (0, 1). Consider the random variable



X = 1(p,1)(U) =
    0 if U ≤ p,
    1 if U > p.

Question: What is the distribution of X? Draw the CDF of X.

Answer: Let us compute the CDF (cumulative distribution function) of X:


Since X ∈ {0, 1}, if x < 0, then FX (x) = 0 and if x ≥ 1, then FX (x) = 1.
Now, let x ∈ [0, 1). One has

FX (x) = P(X ≤ x) = P(1(p,1) (U ) ≤ x) = P(U ≤ p) = p,

where the last equality comes from the fact that for U uniform on [0, 1], its CDF satisfies:

∀x ∈ [0, 1], FU (x) = x.

Conclusion: It follows that P(X = 0) = p and P(X = 1) = 1 − p, so X ∼ B(1 − p) is a
Bernoulli random variable of parameter 1 − p. To simulate a Bernoulli distribution of
parameter p exactly, it suffices to replace p by 1 − p above, i.e. to take X = 1(1−p,1)(U).

−→ This process simulates coin tossing (heads or tails).
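
A minimal Python sketch of this construction (the function name simulate_bernoulli and the use of random.random() for U are our own illustrative choices, not part of the text; random.random() draws from [0, 1) rather than [0, 1], which makes no difference in distribution):

    import random

    def simulate_bernoulli(p):
        # Return 1(p,1)(U) for U uniform on [0, 1); this equals 1 with probability 1 - p.
        u = random.random()
        return 1 if u > p else 0

    # Example: estimate P(X = 1) for p = 0.3 (should be close to 1 - p = 0.7).
    samples = [simulate_bernoulli(0.3) for _ in range(10_000)]
    print(sum(samples) / len(samples))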

(b) Simulation of uniform distribution on {1, 2, 3, 4, 5, 6}

Consider the random variable



X = Σ_{i=1}^{6} i · 1((i−1)/6, i/6)(U) =
    1 if 0 < U < 1/6,
    2 if 1/6 < U < 2/6,
    ...
    6 if 5/6 < U < 1.

Question: What is the distribution of X? Draw the CDF of X.

Answer: Since X ∈ {1, 2, . . . , 6}, if k ∉ {1, 2, . . . , 6}, then P(X = k) = 0. Now, let
k ∈ {1, 2, . . . , 6}; then

P(X = k) = P((k − 1)/6 < U < k/6) = 1/6.

Conclusion: We conclude that X ∼ Unif{1, 2, 3, 4, 5, 6} is a uniform distribution


on {1, 2, 3, 4, 5, 6}.

−→ This process simulates the roll of a fair die.
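
A possible Python sketch of this die simulation (names are ours; since the boundary points i/6 have probability zero, the strict inequalities cause no difficulty):

    import random

    def simulate_die():
        # Return i when (i-1)/6 < U < i/6, for U uniform on [0, 1).
        u = random.random()
        return int(6 * u) + 1    # floor(6U) + 1 lands in {1, ..., 6}; boundaries have probability 0

    rolls = [simulate_die() for _ in range(12)]
    print(rolls)                 # twelve values in {1, ..., 6}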

(c) Simulation of distribution with bijective CDF

Let Y be a random variable such that FY is invertible (i.e. FY−1 exists). Consider the
random variable
X = FY−1 (U ).
Question: What is the distribution of X?

Answer: Let x ∈ R. One has

P(X ≤ x) = P(FY−1 (U ) ≤ x) = P(U ≤ FY (x)) = FY (x).

Conclusion: We conclude that FX = FY , hence X and Y have the same distribution.

• Example: Simulation of Cauchy distribution

Let Y have a standard Cauchy distribution, that is,

fY (x) = 1/(π(1 + x²)),   x ∈ R.
The CDF of Y is

FY (x) = (1/π) ∫_{−∞}^{x} 1/(1 + t²) dt = (1/π)(arctan(x) + π/2).
One has

FY (x) = u ⇐⇒ x = tan(π(u − 1/2)).

Conclusion: Consider X = tan(π(U − 1/2)); then X has a Cauchy distribution.
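
As an illustration, a small Python sketch of this inverse-CDF construction for the Cauchy distribution (the function name is our own choice):

    import math
    import random

    def simulate_cauchy():
        # Return tan(pi * (U - 1/2)) for U uniform on [0, 1); standard Cauchy by the computation above.
        u = random.random()
        return math.tan(math.pi * (u - 0.5))

    print([round(simulate_cauchy(), 3) for _ in range(5)])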

(d) Simulation of arbitrary distribution (from a uniform on [0, 1])

Let Y be a random variable. Let us define the generalized inverse of FY by

FY−1 (u) = inf{x ∈ R : FY (x) ≥ u},   u ∈ [0, 1].


Consider the random variable
X = FY−1 (U ).
Question: What is the distribution of X?

Answer:
Lemma: For x ∈ R and u ∈ [0, 1], one has

FY (x) ≥ u ⇐⇒ x ≥ FY−1 (u).


Proof: Exercise.

Let x ∈ R. From the Lemma above, one has

P(X ≤ x) = P(FY−1 (U ) ≤ x) = P(U ≤ FY (x)) = FY (x).

Conclusion: We conclude that FX = FY , hence X and Y have the same distribution.

Consequence: To simulate an arbitrary random variable with CDF F, perform
the following algorithm (a short code sketch is given below):

1. −→ Compute F−1.
2. −→ Simulate U uniform on [0, 1].
3. −→ Output X = F−1(U).
From the above analysis, X is a random variable with CDF F.
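
A hedged Python sketch of this algorithm for a discrete target distribution, where the generalized inverse amounts to scanning the cumulative probabilities; the example values, probabilities and function name are our own illustration, not taken from the text:

    import random
    from itertools import accumulate

    def inverse_transform_discrete(values, probs):
        # Return the smallest value whose cumulative probability reaches U, i.e. F^{-1}(U).
        cdf = list(accumulate(probs))       # F evaluated at values[0], values[1], ...
        u = random.random()                 # U uniform on [0, 1)
        for v, f in zip(values, cdf):
            if f >= u:
                return v
        return values[-1]                   # guard against floating-point rounding in the cumulative sums

    # Example: P(X = 1) = 0.2, P(X = 2) = 0.5, P(X = 3) = 0.3.
    sample = [inverse_transform_discrete([1, 2, 3], [0.2, 0.5, 0.3]) for _ in range(10_000)]
    print({v: sample.count(v) / len(sample) for v in (1, 2, 3)})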

Appendix. How to simulate a uniform random variable on [0, 1]?

It is impossible in practice to simulate truly random numbers in [0, 1], as one
would need to manipulate infinitely many digits.
In practice, we use pseudo-random numbers. Most random number generators
start with an initial value X0, called the seed, and, given positive integers a, c and m,
recursively compute, for n ≥ 0,

Xn+1 = (aXn + c) modulo m.

Thus each Xn is one of 0, 1, . . . , m − 1, and the quantity Xn/m is taken as an
approximation to a uniform random variable on [0, 1]. It can be shown that, subject
to suitable choices of a, c and m, the preceding gives rise to a sequence of numbers that
looks as if it were generated from independent random variables uniform on [0, 1].
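
A small Python sketch of such a linear congruential generator; the particular constants a, c and m below are one common textbook choice, not values prescribed by the text:

    def lcg(seed, a=1103515245, c=12345, m=2**31, n=5):
        # Generate n pseudo-random numbers in [0, 1) via X_{k+1} = (a * X_k + c) mod m.
        x = seed
        out = []
        for _ in range(n):
            x = (a * x + c) % m
            out.append(x / m)    # X_k / m approximates a uniform draw on [0, 1]
        return out

    print(lcg(seed=42))          # the same seed always reproduces the same sequence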
