Monte Carlo Methods

Stéphane Paltani

Outline

- What are Monte-Carlo methods?
- Generation of random variables
- Markov chains Monte-Carlo
- Error estimation
- Numerical integration
- Optimization
Caveat

The presentation aims at being user-focused and at presenting usable recipes.

Do not expect a fully mathematically rigorous description!
General concepts

1. Determine the statistical properties of possible inputs
2. Generate many sets of possible inputs which follow these properties
3. Perform a deterministic calculation with these sets

The error on the results typically decreases as 1/√N
Numerical integration

Most problems can be solved by integration.

Given any arbitrary probability distribution, and provided one is able to sample the distribution properly with a random-number generator, one can:

- determine its moments (mean, variance, ...)
- determine confidence intervals, i.e. P(x > α) = ∫_α^∞ f(x) dx
- determine the composition of distributions, i.e. given P(x), find P(h(x)), h(x) = x²; cos(x) − sin(x); ...

Note that these are all integrals!
Optimization problems

Numerical solutions to optimization problems incur the risk of getting stuck in local minima.
- Radiation transfer is Google-wise the main ...
Probability estimation

What is the probability to obtain either 3, 6 or 9 heads if one tosses a coin ten times?

- Binomial probability:
  P = B(3; 10, 0.5) + B(6; 10, 0.5) + B(9; 10, 0.5) ≈ 0.33
- Monte-Carlo simulation: ...
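The Monte-Carlo alternative to the binomial calculation can be sketched as follows (a minimal illustration; the function and parameter names are not from the slides):

```python
import random

def mc_probability(n_trials=100_000, seed=1):
    """Estimate P(3, 6 or 9 heads in 10 fair coin tosses) by direct simulation."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        heads = sum(rng.random() < 0.5 for _ in range(10))
        if heads in (3, 6, 9):
            hits += 1
    return hits / n_trials

p = mc_probability()
print(p)  # close to the binomial value ~0.33
```

With 10⁵ trials the estimate agrees with the exact binomial sum to a couple of decimal places.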
Error estimation

Assuming N random variables xi ∼ N(0, σ), i = 1, ..., N, the uncertainty on the mean is:

σx̄ = σ/√N

The Monte-Carlo way:

1. Draw a set of N random variables yi ∼ N(0, σ), i = 1, ..., N
2. Calculate the sample mean ȳ = N⁻¹ ∑ᵢ yᵢ
3. Repeat 1.–2. many times and study the distribution of ȳ
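The Monte-Carlo recipe for the scatter of the sample mean can be sketched as (illustrative parameter values):

```python
import random, statistics

def mc_sigma_of_mean(n=100, sigma=2.0, n_rep=2000, seed=1):
    """Monte-Carlo estimate of the scatter of the sample mean of N(0, sigma)."""
    rng = random.Random(seed)
    means = []
    for _ in range(n_rep):
        ys = [rng.gauss(0.0, sigma) for _ in range(n)]  # step 1: draw N variables
        means.append(sum(ys) / n)                       # step 2: sample mean
    return statistics.pstdev(means)                     # step 3: spread of the means

s = mc_sigma_of_mean()
print(s)  # close to the analytic sigma/sqrt(N) = 0.2
```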
Random number generators

- We want to draw many random variables Ni ∼ U(0, 1), i = 1, ... which satisfy (or approximate sufficiently well) all randomness properties, so that f(Ni, Nj, ...) ∀i, j, ... also has the right properties
- Many random number generators are based on a congruential recurrence relation of the form Ni+1 = (a · Ni) mod m

Common deficiencies, and their fixes:

1. Multiplication by a doesn't span the whole range of values, i.e. if Ni = 10⁻⁶, Ni+1 ≤ 0.016, failing a simple statistical test
   - Swap consecutive output values: generate a few values (∼32), and at each new call pick one at random. This is RAN1
2. The period m = 2³¹ − 1 might be too short
   - Add the outcome of two RAN1 generators with (slightly) different m's (and a's). The period is the least common multiple of m1 and m2, ∼2·10¹⁸. This is RAN2
3. The generator is too slow
   - Use in C the inline recurrence Ni+1 = 1664525 · Ni + 1013904223 using unsigned long. Patch the bits into a real number (machine dependent). This is RANQD2
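A sketch of such a congruential recurrence, using the RANQD2 constants quoted above (the division-based scaling to [0, 1) is illustrative; for real work, use GSL as recommended below):

```python
def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """Linear congruential generator N_{i+1} = (a*N_i + c) mod m, yielding floats in [0, 1)."""
    state = seed
    while True:
        state = (a * state + c) % m
        yield state / m

gen = lcg(seed=42)
samples = [next(gen) for _ in range(5)]
print(samples)
```

The same seed always reproduces the same sequence, which is essential for debugging Monte-Carlo codes.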
NR: Numerical Recipes
GSL: GNU Scientific Library

Library   Generator   Relative speed   Period
NR        RAN0        1.0              ∼2³¹

Always use GSL! See the GSL doc for the many more algorithms available
Transformation method

1. Given a distribution p(y), the cumulative distribution function (CDF) of p(y) is F(y) = ∫₀^y p(w) dw
2. We want to draw y uniformly in the shaded area, i.e. uniformly over F(y); by construction 0 ≤ F(y) ≤ 1
3. We draw x ∼ U(0, 1) and find y so that x = F(y)
4. Therefore y(x) = F⁻¹(x), x ∼ U(0, 1)
Transformation method

For an exponential distribution p(y) = λ e^(−λy):

F(y) = 1 − e^(−λy) = x  ⟹  y(x) = −(1/λ) ln(1 − x)

Note: this is equivalent to

y(x) = −(1/λ) ln(x),  x ∼ U(0, 1)

since 1 − x is also ∼ U(0, 1)
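The exponential inversion above, as a minimal sketch:

```python
import random, math

def sample_exponential(lam, rng):
    """Inverse-transform sampling: y = -ln(1 - x)/lam with x ~ U(0, 1)."""
    return -math.log(1.0 - rng.random()) / lam

rng = random.Random(7)
ys = [sample_exponential(2.0, rng) for _ in range(100_000)]
mean_y = sum(ys) / len(ys)
print(mean_y)  # close to 1/lam = 0.5
```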
Transformation method

To draw a point in a homogeneous sphere of radius R:

1. φ can be drawn uniformly from U(0, 2π)
2. θ has a sine distribution p(θ) = sin(θ)/2, θ ∈ [0; π[
   Transformation: θ = arccos(1 − 2x)
3. Each radius shell has a volume f(r) ∼ r² dr, so r ∝ R · x^(1/3)
4. Alternatively, draw a point at random on the surface of a sphere, (x, y, z)/√(x² + y² + z²), with x, y, z ∼ N(0, 1)
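Steps 1–3 can be sketched as follows (a minimal illustration of the transformation method; names are not from the slides):

```python
import random, math

def point_in_sphere(R, rng):
    """Uniform point in a sphere of radius R via the transformation method."""
    phi = rng.uniform(0.0, 2.0 * math.pi)         # uniform azimuth
    theta = math.acos(1.0 - 2.0 * rng.random())   # p(theta) = sin(theta)/2
    r = R * rng.random() ** (1.0 / 3.0)           # shell volume ~ r^2 dr
    return (r * math.sin(theta) * math.cos(phi),
            r * math.sin(theta) * math.sin(phi),
            r * math.cos(theta))

rng = random.Random(3)
pts = [point_in_sphere(1.0, rng) for _ in range(10_000)]
inside = all(x*x + y*y + z*z <= 1.0 for x, y, z in pts)
print(inside)  # True
```

A quick check of uniformity: the fraction of points with r < R/2 should be (1/2)³ = 0.125.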
Rejection method

If the CDF of p(x) is difficult to estimate (and you can forget about inversion), the rejection method can be used:

1. Find a comparison function f(x) that can be sampled, so that f(x) ≥ p(x), ∀x
2. Draw a random deviate x₀ from f(x)
3. Draw a uniform random deviate y₀ from U(0, f(x₀))
4. If y₀ < p(x₀), accept x₀, otherwise discard it
5. Repeat 2.–4. until you have enough values

The rejection method can be very inefficient if f(x) is very different from p(x)
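A minimal sketch of steps 1–5 using the simplest possible comparison function, a constant envelope (any samplable f(x) ≥ p(x) would do; the target density here is illustrative):

```python
import random

def rejection_sample(p, f_max, a, b, rng):
    """Rejection sampling of p(x) on [a, b] under a constant envelope f(x) = f_max."""
    while True:
        x0 = rng.uniform(a, b)        # step 2: draw x0 from the comparison function
        y0 = rng.uniform(0.0, f_max)  # step 3: draw y0 ~ U(0, f(x0))
        if y0 < p(x0):                # step 4: accept with probability p(x0)/f(x0)
            return x0

rng = random.Random(11)
p = lambda x: 2.0 * x                 # target density on [0, 1]
xs = [rejection_sample(p, 2.0, 0.0, 1.0, rng) for _ in range(50_000)]
m = sum(xs) / len(xs)
print(m)  # close to the exact mean 2/3
```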
Rejection method

The Poisson distribution is discrete: P(n; α) = αⁿ e^(−α) / n!

Make it continuous:

P(x; α) = α^[x] e^(−α) / [x]!

A Lorentzian f(x) ∝ 1/((x − α)² + c²) is a good comparison function
Rejection method

A simpler way to draw a point in a homogeneous sphere: draw x, y, z ∼ U(−R, R) and reject the point if x² + y² + z² > R²

GNU Scientific Library implements (not exhaustive!):

- Gaussian
- Correlated bivariate Gaussian
- Exponential
- Binomial
- Poisson
Quasi-random sequences

All sets of points fill "randomly" the area [0; 1] × [0; 1]. The left and center images are "sub-random" and fill the area more uniformly.

These sequences are also called low-discrepancy sequences.

They can be used to replace the RNG when x ∼ U(a, b) is needed.
Quasi-random sequences

- Sobol's sequence: count in binary, but using Gray code, and put a radix in front: 0.1, 0.11, 0.01, 0.011, 0.001, 0.101, 0.111, ...
  This can be generalized to N dimensions. This is the red set of points.
- Halton's sequence: H(i) is constructed the following way: take i expressed in a (small) prime-number base b (say b = 3), e.g. i = 17 ≡ 122 (base 3); the digits are then mirrored about the radix point, giving H(17) = 0.221 (base 3) = 25/27
- Niederreiter's sequence
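The digit-mirroring construction behind Halton's sequence can be sketched as a radical-inverse function (this is the standard van der Corput/Halton construction; the code itself is illustrative):

```python
def halton(i, b):
    """Radical-inverse value of index i in base b: mirror the base-b digits of i
    about the radix point."""
    x, f = 0.0, 1.0 / b
    while i > 0:
        x += (i % b) * f  # append the next digit after the radix point
        i //= b
        f /= b
    return x

print(halton(17, 3))  # 17 = 122 (base 3), mirrored: 0.221 (base 3) = 25/27
```

A d-dimensional Halton point uses a different prime base per coordinate.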
Markov chains

A set of random variables xi, i = 1, ... is a Markov chain if:

P(xi+1 = x | x1, ..., xi) = P(xi+1 = x | xi)
Markov chains

Examples:

- State machine
- White noise: xi+1 = εi, εi ∼ N(µ, σ)
- AR[1] process: xi+1 = ϕ · xi + εi, εi ∼ N(µ, σ), |ϕ| < 1
- Google's PageRank
Markov chains

- Homogeneity: A Markov chain is homogeneous if P(xi+1 = x | xi) = P(xj+1 = x | xj), ∀i, j
- Ergodicity: A Markov chain is ergodic if any state can be reached from any other state (and the mean recurrence times are finite).
  Ergodicity means that all possible states will be reached at some point
- Reversibility: A Markov chain is reversible if there exists a distribution Π(x) such that:

  Π(α) P(xi+1 = β | xi = α) = Π(β) P(xi+1 = α | xi = β),  ∀i, α, β
Markov chains

If a Markov chain is reversible, Π(x) is its equilibrium distribution: integrating the reversibility condition over β gives

∫ Π(β) P(xi+1 = α | xi = β) dβ = Π(α) ∫ P(xi+1 = β | xi = α) dβ = Π(α)

i.e. a chain started in Π stays in Π.
Markov chains Monte-Carlo

Can we build a Markov chain that has pretty much any requested equilibrium distribution? YES!! The answer is Markov chain Monte-Carlo (MCMC).

- MCMC is one of the ten best algorithms ever! (along with FFT, QR decomposition, Quicksort, ...)
- MCMC uses a homogeneous, ergodic and reversible Markov chain to generate consistent samples drawn from the target distribution

Assuming a distribution f(x) that we want to sample (the Metropolis recipe):

1. Choose a proposal distribution p(x) and an initial value x1
2. Propose the next point x̂ = xi + εi, εi ∼ p(x)
3. Accept xi+1 = x̂ with probability min(1, f(x̂)/f(xi)); otherwise set xi+1 = xi
4. Repeat from 2.

Random-walk updates, i.e. the proposal distribution is constant, x̂ = xi + εi, εi ∼ p(x):

- Gaussian update: p(x) = N(0, σ)
- Uniform update: p(x) = U(−υ, υ)
- Student's t update
- Lévy-flight update: p(x) is heavy-tailed (power law)
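A random-walk Metropolis sketch with a Gaussian update (log densities are used for numerical stability; the target, a standard normal, is purely illustrative):

```python
import random, math

def metropolis(log_f, x1, sigma, n, seed=0):
    """Random-walk Metropolis: Gaussian update p(x) = N(0, sigma).
    log_f is the log of the (unnormalized) target density f(x)."""
    rng = random.Random(seed)
    x, lf = x1, log_f(x1)
    chain = []
    for _ in range(n):
        xhat = x + rng.gauss(0.0, sigma)          # propose
        lf_hat = log_f(xhat)
        if math.log(rng.random()) < lf_hat - lf:  # accept with prob min(1, f(xhat)/f(x))
            x, lf = xhat, lf_hat
        chain.append(x)
    return chain

# Target: standard normal, f(x) ∝ exp(-x^2/2)
chain = metropolis(lambda x: -0.5 * x * x, x1=0.0, sigma=1.0, n=100_000)
burn = chain[1000:]                               # discard the burn-in phase
mean_x = sum(burn) / len(burn)
print(mean_x)  # close to 0
```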
Markov chains Monte-Carlo

Other updates:

- Independence sampler: x̂ ∼ Y, i.e. do not use xi! This is equivalent to the rejection method
- Langevin update: x̂ = xi + εi + (σ²/2) ∇ log f(xi). Uses the gradient of the target to drift towards high-density regions

Practical advice:

- Check the rejection ratio. Values between 30 and 70% are conventionally accepted
- Discard the burn-in phase. The autocorrelation function is a standard way to check if the initial value has become irrelevant or not
- The width of the proposal distribution (e.g. σ for a Gaussian update or υ for a uniform update) should be tuned during the burn-in phase to set the rejection fraction in the right bracket
- Reflection can be used when an edge of f(x) is reached, e.g. set x̂ = |xi + εi| if xi must remain positive
- Be careful with multimodal distributions. Not all modes may be sampled
Metropolis-Hastings MCMC

- A symmetric proposal distribution might not be optimal
- Boundary effects: less time is spent close to boundaries, which might not be well sampled

The Metropolis-Hastings algorithm allows asymmetric proposals:

- The proposed update is x̂ ∼ Q(x; xi)
- The probability A to accept the next point is modified so that instead of:

  A = min(1, f(x̂)/f(xi))

  the probability is corrected by the Hastings ratio:

  A = min(1, [f(x̂) Q(xi; x̂)] / [f(xi) Q(x̂; xi)])
Error distribution

The distribution of a statistic (here a correlation coefficient r) can be built by Monte-Carlo:

1. Draw N couples (xi, yi) ∼ (U(0, 1), U(0, 1)), rejecting those for which yi < xi − 0.5
2. Calculate the correlation coefficient r
3. Repeat 1.–2. many times
4. Study the distribution. In this case the null hypothesis has a ∼10% probability
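Steps 1–3 can be sketched as follows (the sample size and repetition count are illustrative):

```python
import random

def pearson_r(xs, ys):
    """Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

def null_r_distribution(n_points=20, n_rep=5000, seed=5):
    """Distribution of r for points drawn uniformly in the region y >= x - 0.5."""
    rng = random.Random(seed)
    rs = []
    for _ in range(n_rep):
        xs, ys = [], []
        while len(xs) < n_points:
            x, y = rng.random(), rng.random()
            if y >= x - 0.5:          # keep only the allowed region (step 1)
                xs.append(x); ys.append(y)
        rs.append(pearson_r(xs, ys))  # step 2
    return rs                         # step 3: the distribution of r

rs = null_r_distribution()
print(min(rs), max(rs))
```

The observed r is then compared against this simulated distribution (step 4).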
Statistical significance

Example: assessing the significance of a Kolmogorov-Smirnov fit by simulation:

1. Find the parameter σobserved that minimizes QKS,obs, and the corresponding QKS,obs,min
2. Draw N points from a Gaussian with rms σobserved
3. Find the minimum of QKS,sim as a function of a trial σ
4. Repeat 2.–3. many times, and check the distribution of QKS,sim,min against QKS,obs,min. In this case, 20% of the QKS,sim,min exceed QKS,obs,min
Error propagation

- One measures the parallax π of a given object; the measurement error is Gaussian, i.e. the true parallax Π and π are related by π = Π + ε, ε ∼ N(0, σ)
- In the limit σ ≪ π, we have the error propagation on the distance D = 1/π:

  ∆D = |∂D/∂π| ∆π = ∆π/π² = σ/π²
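The Monte-Carlo version of this error propagation simply pushes many simulated measurements through D = 1/π (the numerical values here are illustrative, not from the slides):

```python
import random, statistics

def mc_distance_error(pi_obs=10.0, sigma=0.5, n_rep=100_000, seed=2):
    """Monte-Carlo error propagation for D = 1/pi with Gaussian parallax errors."""
    rng = random.Random(seed)
    ds = [1.0 / (pi_obs + rng.gauss(0.0, sigma)) for _ in range(n_rep)]
    return statistics.fmean(ds), statistics.pstdev(ds)

mean_d, std_d = mc_distance_error()
print(std_d)  # close to the first-order estimate sigma/pi^2 = 0.005
```

Unlike the analytic formula, the Monte-Carlo distribution also reveals the asymmetry of D = 1/π when σ/π is not small.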
Bootstrap

What if we have no idea about the underlying distribution?

- Generate many pseudo-samples by random, Monte-Carlo, direct extraction of data points from the observed sample itself, with replacement, resulting in some repeated data points and some missing data points

We generate a similar histogram with the data only:

1. Draw N indices (integers) αi from [1; N]
2. Calculate the correlation coefficient r from the new sets xαi and yαi, i = 1, ..., N
3. Build and study the distribution of r by repeating 1.–2. many times
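The index-resampling recipe above can be sketched as follows (the test data set is illustrative):

```python
import random

def pearson_r(xs, ys):
    """Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

def bootstrap_r(xs, ys, n_rep=2000, seed=9):
    """Bootstrap distribution of r: draw N indices with replacement, recompute r."""
    rng = random.Random(seed)
    n = len(xs)
    rs = []
    for _ in range(n_rep):
        idx = [rng.randrange(n) for _ in range(n)]   # step 1
        rs.append(pearson_r([xs[i] for i in idx],
                            [ys[i] for i in idx]))   # step 2
    return rs                                        # step 3

rng = random.Random(0)
xs = [i / 29.0 for i in range(30)]
ys = [x + rng.gauss(0.0, 0.1) for x in xs]           # strongly correlated data
rs = bootstrap_r(xs, ys)
print(sum(rs) / len(rs))  # scatter of r around the observed value
```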
Bootstrap

If the statistics of interest is discrete (i.e., can have only few values), the bootstrap distribution will be quite bad.

One can add a smoothing random variable ∼ N(0, σ²), σ = 1/√N, to all the picked values of a bootstrap test.
Monte-Carlo algorithm

Conventional, grid-based evaluation of ∫V f dV in M dimensions requires ∼(1/h)^M steps.

Given a volume V and a function f (with ⟨w⟩ being the average of w over V):

∫V f dV ≈ V⟨f⟩ ± V √( (⟨f²⟩ − ⟨f⟩²) / N )

This is the basic theorem of Monte-Carlo integration:

1. Draw N points at random within V
2. Calculate ⟨f⟩ = (1/N) ∑ᵢ fᵢ (and ⟨f²⟩)
3. Multiply ⟨f⟩ by V to obtain the integral
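The basic theorem translates directly into code (a one-dimensional sketch; the test integrand is illustrative):

```python
import random, math

def mc_integrate(f, a, b, n, seed=4):
    """Basic Monte-Carlo integration of f over [a, b], returning (value, error)."""
    rng = random.Random(seed)
    vol = b - a
    fs = [f(rng.uniform(a, b)) for _ in range(n)]  # step 1
    mean_f = sum(fs) / n                           # step 2: <f>
    mean_f2 = sum(v * v for v in fs) / n           #         <f^2>
    err = vol * math.sqrt((mean_f2 - mean_f ** 2) / n)
    return vol * mean_f, err                       # step 3

val, err = mc_integrate(math.sin, 0.0, math.pi, 100_000)
print(val, err)  # close to the exact value 2
```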
Monte-Carlo algorithm

If it is not possible or not easy to sample uniformly the volume V, one has to find a volume W containing V, which can be sampled and whose volume is known.

The error is then W √( (⟨f²⟩ − ⟨f⟩²) / N ), with f extended by zero over W:

f(w) = f(w), if w ∈ V
f(w) = 0,    if not
Monte-Carlo algorithm

The error term depends on the variance of f within V:

V √( (⟨f²⟩ − ⟨f⟩²) / N )

A change of variable can reduce the variance, e.g. for a sharply varying integrand e^(5z):

s = (1/5) e^(5z)  →  ds = e^(5z) dz

⟹ ∫x ∫y ∫_{z=−1}^{1} e^(5z) dz = ∫x ∫y ∫_{s=e^(−5)/5}^{e^5/5} ds
Stratified sampling

If F̂ is the estimator of F = (1/V) ∫V f dV:

Var(F̂) = Var(f)/N ≡ σ²/N

If one cuts V in two equal volumes V1 and V2, drawing Ni points in Vi, the new estimator F̂′ has variance:

Var(F̂′) = (1/4) (σ1²/N1 + σ2²/N2)

The minimum is reached for Ni/N = σi/(σ1 + σ2):

⟹ Var(F̂′) = (σ1 + σ2)²/(4N)

One can therefore cut V into many separate pieces, estimate the rms in each of them, and draw the points in each subvolume proportionally to the rms of f
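A one-dimensional sketch of this rms-proportional allocation (the pilot-run size and test integrand are illustrative choices, not from the slides):

```python
import random, math

def stratified_mean(f, n, strata, seed=8):
    """Stratified estimate of (1/V) ∫ f over [0, 1]: equal strata, points ∝ stratum rms."""
    rng = random.Random(seed)
    width = 1.0 / strata
    # Pilot run: estimate the rms of f in each stratum
    sigmas = []
    for k in range(strata):
        fs = [f(rng.uniform(k * width, (k + 1) * width)) for _ in range(50)]
        m = sum(fs) / len(fs)
        sigmas.append(math.sqrt(sum((v - m) ** 2 for v in fs) / len(fs)) + 1e-12)
    total = sum(sigmas)
    # Main run: allocate N_k ∝ sigma_k, combine the stratum means
    est = 0.0
    for k, s in enumerate(sigmas):
        nk = max(1, round(n * s / total))
        fs = [f(rng.uniform(k * width, (k + 1) * width)) for _ in range(nk)]
        est += (sum(fs) / nk) * width
    return est

est = stratified_mean(lambda x: x * x, 10_000, strata=10)
print(est)  # close to the exact mean 1/3
```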
Stratified sampling

When V has many dimensions d, stratified sampling needs to cut the volume in K^d subvolumes.

The MISER algorithm uses a recursive stratified sampling. It cuts the volume in two parts one dimension at a time:

1. For each dimension i, cut the current volume in two, ...
Importance sampling

Instead of sampling uniformly, we use a distribution p with ∫V p dV = 1:

⟹ ∫V f dV = ∫V (f/p) p dV = ⟨f/p⟩ ± √( (⟨f²/p²⟩ − ⟨f/p⟩²) / N )

where the averages are now taken over points drawn from p. The optimal choice is:

p = |f| / ∫V |f| dV

The VEGAS algorithm implements importance sampling with a separable distribution in d dimensions:

p ∝ g1(x1) · g2(x2) · g3(x3) · ... · gd(xd)

1. Choose arbitrary gi(xi) (e.g., constant) ...
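A one-dimensional sketch of the ⟨f/p⟩ estimator, reusing the sharply peaked e^(5x) example from the previous section (here p ∝ e^(5x), sampled exactly by inversion; the target ∫₀¹ x e^(5x) dx is an illustrative choice):

```python
import random, math

def importance_estimate(n=100_000, seed=6):
    """Estimate ∫_0^1 x e^(5x) dx drawing x from p(x) = e^(5x) / norm,
    which matches the sharply peaked part of the integrand."""
    rng = random.Random(seed)
    norm = (math.exp(5.0) - 1.0) / 5.0        # ∫_0^1 e^(5x) dx
    total = total2 = 0.0
    for _ in range(n):
        u = rng.random()
        x = math.log(1.0 + u * (math.exp(5.0) - 1.0)) / 5.0  # inverse-CDF sample of p
        w = x * norm                          # weight f(x)/p(x)
        total += w
        total2 += w * w
    mean = total / n
    err = math.sqrt((total2 / n - mean * mean) / n)
    return mean, err

val, err = importance_estimate()
print(val, err)  # close to the exact (4 e^5 + 1)/25 ≈ 23.79
```

Because p follows the peak of f, the weights f/p vary slowly and the error is far smaller than with uniform sampling.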
Stochastic optimization

Minimization of complex functions in many dimensions often shows multiple minima.
Swarm intelligence

Numerous agents explore the landscape with simple rules.
Genetic algorithms

Use Darwinian evolution of a gene pool to find the fittest genes:

- Each chromosome represents a candidate solution to the problem at hand, represented as a series of genes
- A fitness function can be calculated for each chromosome
- At each generation:
  1. Genes mutate, ...

Example: you want to build the best possible average spectrum with ..., knowing the sample contains 50% interlopers (galaxies with ...)
Simulated annealing

In nature, crystals (e.g., ice) can form over large areas, thus minimizing internal energy.

Simulated annealing requires:

1. A configuration of all possible states S, i.e. the space of solutions
2. An energy function E(S) to minimize
3. "Options" for the next state after the current state, similar to updates in Metropolis algorithms
4. A temperature T which is related to the probability to accept an increase in energy:

   S′ is accepted if ς ∼ U(0, 1) < exp(−∆E/kT)

Remarks:

- {S} and E are the problem to solve, i.e. we want Smin | E(Smin) < E(S), ∀S ≠ Smin
- The options are a tricky point, as a completely random move in many dimensions might be very inefficient in case of narrow valleys. The downhill simplex (amoeba) algorithm can be tried
- The acceptance condition makes it similar to the Metropolis algorithm
- There are also several possibilities for the cooling schedule, for instance T → (1 − ε)T at each step
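The ingredients above can be combined in a short sketch (a continuous 1-d state space, Gaussian options, and the multimodal test function are illustrative choices, not from the slides):

```python
import random, math

def simulated_annealing(energy, x0, step=0.5, t0=1.0, eps=0.001, n=20_000, seed=13):
    """Simulated annealing with a random-walk option and T -> (1 - eps) T cooling."""
    rng = random.Random(seed)
    x, e, t = x0, energy(x0), t0
    best_x, best_e = x, e
    for _ in range(n):
        x_new = x + rng.gauss(0.0, step)      # option: propose the next state
        e_new = energy(x_new)
        # Accept downhill moves always, uphill moves with probability exp(-dE/T)
        if e_new < e or rng.random() < math.exp(-(e_new - e) / t):
            x, e = x_new, e_new
            if e < best_e:
                best_x, best_e = x, e
        t *= 1.0 - eps                        # cooling schedule
    return best_x, best_e

# Multimodal test function with global minimum E = 0 at x = 0
f = lambda x: x * x + 2.0 * math.sin(5.0 * x) ** 2
x_min, e_min = simulated_annealing(f, x0=4.0)
print(x_min, e_min)  # near the global minimum at x = 0
```

Early, at high T, the chain hops freely between local minima; as T falls it settles, and the best state ever visited is kept.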