Monte Carlo Methods and Statistical Computing: My Personal Experience
Debasis Kundu
Outline
1 Preface
2 A Brief History
3 Application
4 Major Ingredients
5 What we can do?
6 Statistical Computation
7 Stories Untold
Limitations:
Advantages:
Journals:
Books:
Examples
Examples (Contd.)
Suppose we want to find the maximum or minimum of the function

f(x1, ..., xk), where a1 ≤ x1 ≤ b1, ..., ak ≤ xk ≤ bk.

Or suppose we want to analyze the non-linear model

y(x1, ..., xk) = f(x1, ..., xk, θ) + e.
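The box-constrained optimization problem above can be attacked by the simplest Monte Carlo device: evaluate f at uniformly random points in the box and keep the best. A minimal sketch (the function and variable names are illustrative, not from the talk):

```python
import random

def mc_minimize(f, bounds, n_samples=100_000, seed=0):
    """Minimize f over the box [a1,b1] x ... x [ak,bk] by uniform random search."""
    rng = random.Random(seed)
    best_x, best_val = None, float("inf")
    for _ in range(n_samples):
        # Draw one candidate point uniformly from the box.
        x = [rng.uniform(a, b) for (a, b) in bounds]
        val = f(x)
        if val < best_val:
            best_x, best_val = x, val
    return best_x, best_val

# Example: minimize (x1-1)^2 + (x2+2)^2 on [-5,5]^2; the optimum is at (1, -2).
x_star, v = mc_minimize(lambda x: (x[0] - 1) ** 2 + (x[1] + 2) ** 2,
                        [(-5, 5), (-5, 5)])
```

Random search is crude but makes no smoothness assumptions on f, which is exactly why Monte Carlo methods are attractive for such problems.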
Background
Discrete distributions:
1. Uniform.
2. Binomial.
3. Geometric.
4. Poisson.
Continuous distributions:
1. Uniform.
2. Exponential.
3. Normal.
4. Gamma.
5. Log-concave probability density function.
Inverse transform method: X = F⁻¹(U), where U is Uniform(0, 1).
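The inverse transform identity X = F⁻¹(U) can be sketched for the exponential distribution, where F⁻¹(u) = −ln(1 − u)/λ has a closed form (the function name below is illustrative):

```python
import math
import random

def exp_inverse_transform(lam, n, seed=0):
    """Generate n Exponential(lam) variates via X = F^{-1}(U) = -ln(1-U)/lam."""
    rng = random.Random(seed)
    # 1 - U lies in (0, 1], so the logarithm is always defined.
    return [-math.log(1.0 - rng.random()) / lam for _ in range(n)]

sample = exp_inverse_transform(lam=2.0, n=100_000)
mean = sum(sample) / len(sample)   # should be close to 1/lam = 0.5
```

The same recipe works for any distribution whose F⁻¹ is computable, which is why it is listed first among the basic generation methods.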
Acceptance-rejection method: choose a density g and a constant c such that f(x) ≤ c g(x).
Theorem:
1. The random variable generated by this method has density function f(x).
2. The number of iterations of the algorithm that are needed is a geometric random variable with mean c.
Example 1:
Suppose we want to generate from
Take
g(x) = 1, 0 < x < 1, and c = 135/64.
Example 3:
Suppose we want to generate from

f(x) = (2/√(2π)) e^(−x²/2), 0 < x < ∞.

Take

g(x) = e^(−x), 0 < x < ∞,

and

c = √(2e/π).
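This half-normal example can be coded directly. With the exponential envelope g(x) = e^(−x) and c = √(2e/π), the acceptance probability simplifies to f(y)/(c g(y)) = exp(−(y − 1)²/2). A minimal sketch (function name illustrative):

```python
import math
import random

def halfnormal_rejection(n, seed=0):
    """Sample f(x) = sqrt(2/pi) exp(-x^2/2), x > 0, via acceptance-rejection
    with envelope g(x) = exp(-x) and c = sqrt(2e/pi)."""
    rng = random.Random(seed)
    out = []
    while len(out) < n:
        # Y ~ Exponential(1), drawn from the envelope g by inverse transform.
        y = -math.log(1.0 - rng.random())
        # Accept with probability f(y) / (c g(y)) = exp(-(y - 1)^2 / 2).
        if rng.random() <= math.exp(-0.5 * (y - 1.0) ** 2):
            out.append(y)
    return out

sample = halfnormal_rejection(100_000)
```

By the theorem above, each accepted draw costs on average c = √(2e/π) ≈ 1.32 exponential proposals.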
Y = Xb + e
Example
Y = f (X, θ) + e
Here f is a known function and the vector X is also known, but the parameter vector θ is unknown. The problem is to estimate θ based on a sample of size n.
Example
Important Issues
MLE
Newton-Raphson Method
Assuming f(θ) is sufficiently smooth, we want to solve

∂f(θ)/∂θ = 0.

The standard approach is the Newton-Raphson method. Using a Taylor series expansion, the iteration can be easily obtained:

θ^(k+1) = θ^(k) − [∂²f(θ^(k))/(∂θ∂θᵀ)]⁻¹ ∂f(θ^(k))/∂θ.
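The iteration is easiest to see in the scalar case, where the Hessian inverse is just division by the second derivative. A minimal sketch, with an illustrative test function f(θ) = e^θ − 2θ whose stationary point is θ = ln 2:

```python
import math

def newton_raphson(grad, hess, theta0, tol=1e-10, max_iter=100):
    """Scalar Newton-Raphson: solve grad(theta) = 0.
    Iterates theta <- theta - grad(theta) / hess(theta)."""
    theta = theta0
    for _ in range(max_iter):
        step = grad(theta) / hess(theta)
        theta -= step
        if abs(step) < tol:   # stop when the update is negligible
            break
    return theta

# f(theta) = exp(theta) - 2*theta: grad = e^theta - 2, hess = e^theta,
# so the stationary point is theta = ln 2.
root = newton_raphson(lambda t: math.exp(t) - 2.0,
                      lambda t: math.exp(t),
                      theta0=0.0)
```

Near the solution the convergence is quadratic, but the method can diverge from a poor starting value, which is one motivation for the alternatives (EM, etc.) discussed next.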
EM Algorithm
Copula Method
Non-linear regression
y(t) = Σ_{k=1}^{p} [A_k cos(ω_k t) + B_k sin(ω_k t)] + e(t)
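A useful observation about this sinusoidal model: if the frequencies ω_k are known, the model is linear in the amplitudes A_k, B_k, so they can be recovered by ordinary least squares. A minimal sketch for p = 1 with a hypothetical frequency ω = 0.7 (all names and values here are illustrative, not from the talk):

```python
import math
import random

random.seed(0)
n, omega, sigma = 500, 0.7, 0.5     # one sinusoid (p = 1), frequency assumed known
A, B = 2.0, -1.0                    # true amplitudes to recover
y = [A * math.cos(omega * t) + B * math.sin(omega * t)
     + random.gauss(0, sigma) for t in range(n)]

# With omega known, the model is linear in (A, B): solve the 2x2 normal equations.
c = [math.cos(omega * t) for t in range(n)]
s = [math.sin(omega * t) for t in range(n)]
Scc = sum(ci * ci for ci in c)
Sss = sum(si * si for si in s)
Scs = sum(ci * si for ci, si in zip(c, s))
Scy = sum(ci * yi for ci, yi in zip(c, y))
Ssy = sum(si * yi for si, yi in zip(s, y))
det = Scc * Sss - Scs * Scs
A_hat = (Sss * Scy - Scs * Ssy) / det
B_hat = (Scc * Ssy - Scs * Scy) / det
```

The genuinely hard (non-linear) part of the problem is estimating the frequencies ω_k themselves, which is what makes this a non-linear regression example.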
Thank You