Constrained and Unconstrained Optimization
Carlos Hurtado
Department of Economics
University of Illinois at Urbana-Champaign
hrtdmrt2@illinois.edu
On the Agenda
1 Numerical Optimization
2 Minimization of Scalar Function
3 Golden Search
4 Newton’s Method
5 Polytope Method
6 Newton’s Method Reloaded
7 Quasi-Newton Methods
8 Non-linear Least-Squares
9 Constrained Optimization
Numerical Optimization
The generic problem is to solve
$$\min_{x} f(x)$$
or
$$\min_{x} f(x) \quad \text{s.t.} \quad x \in B$$
Minimization of Scalar Function
Bracketing Method
- We selected the new point using the midpoint between the extremes, but what is the best location for the new point d?

[Figure: points a, b, d, c on the line, with d placed between b and c]

- One possibility is to minimize the size of the next search interval.
- The next search interval will run either from a to d or from b to c.
- The proportion of the left interval is
$$w = \frac{b-a}{c-a}$$
Golden Search
- The proportion of the new segment will be
$$1 - w = \frac{c-b}{c-a}$$
or
$$w + z = \frac{d-a}{c-a}$$
- In mathematics, the golden ratio is $\varphi = \frac{1+\sqrt{5}}{2}$.
- This goes back to Pythagoras.
- Notice that $1 - \frac{1}{\varphi} = \frac{3-\sqrt{5}}{2}$.
- The Golden Search algorithm uses the golden ratio to set the new point (using a weighted average).
- This reduces the bracketing interval by about 40% at each iteration.
- The performance is independent of the function that is being minimized.
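
A minimal pure-Python sketch of the golden-section step described above (the test function and interval are illustrative assumptions):

import math

def golden_search(f, a, c, tol=1e-8):
    """Minimize a unimodal f on [a, c] by golden-section search."""
    invphi = (math.sqrt(5) - 1) / 2            # 1/phi, about 0.618
    b = c - invphi * (c - a)                   # interior point near a
    d = a + invphi * (c - a)                   # interior point near c
    while abs(c - a) > tol:
        if f(b) < f(d):                        # minimum lies in [a, d]
            c, d = d, b
            b = c - invphi * (c - a)
        else:                                  # minimum lies in [b, c]
            a, b = b, d
            d = a + invphi * (c - a)
    return (a + c) / 2

print(golden_search(lambda x: (x - 2) * x * (x + 2)**2, 0.0, 2.0))  # ~1.2808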
Golden Search
[Figure: plot of y = x(x − 2)(x + 2)² for x ∈ [−2, 2]]
- We can use the minimize_scalar function from the scipy.optimize module:

>>> def f(x):
...     return (x - 2) * x * (x + 2)**2
>>> from scipy.optimize import minimize_scalar
>>> opt_res = minimize_scalar(f)
>>> print(opt_res.x)
1.28077640403
>>> opt_res = minimize_scalar(f, method='golden')
>>> print(opt_res.x)
1.28077640147
>>> opt_res = minimize_scalar(f, bounds=(-3, -1), method='bounded')
>>> print(opt_res.x)
-2.0000002026
Newton’s Method
- Let us assume that the function $f(x): \mathbb{R} \to \mathbb{R}$ is infinitely differentiable.
- We would like to find $x^*$ such that $f(x^*) \leq f(x)$ for all $x \in \mathbb{R}$.
- Idea: use a Taylor approximation of the function f(x).
- The polynomial approximation of order two around a is:
$$p(x) = f(a) + f'(a)(x-a) + \frac{1}{2} f''(a)(x-a)^2$$
- Minimizing p(x) instead of f(x), the first-order condition $p'(x) = f'(a) + f''(a)(x-a) = 0$ gives
$$x = a - \frac{f'(a)}{f''(a)}$$
- Newton's method starts with a given $x_1$.
- To compute the next candidate to minimize the function we use
$$x_{n+1} = x_n - \frac{f'(x_n)}{f''(x_n)}$$
- Do this until
$$|x_{n+1} - x_n| < \varepsilon \quad \text{and} \quad |f'(x_{n+1})| < \varepsilon$$
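
A minimal sketch of this iteration in Python (the starting point and the hand-derived derivatives of f(x) = (x − 2)x(x + 2)² from above are illustrative assumptions):

def newton_minimize(fprime, fpprime, x, eps=1e-10, max_iter=100):
    """1-D Newton's method for minimization: iterate x - f'(x)/f''(x)."""
    for _ in range(max_iter):
        x_new = x - fprime(x) / fpprime(x)
        # stop when both the step size and the derivative are small
        if abs(x_new - x) < eps and abs(fprime(x_new)) < eps:
            return x_new
        x = x_new
    return x

fp = lambda x: 4*x**3 + 6*x**2 - 8*x - 8     # f'(x), derived by hand
fpp = lambda x: 12*x**2 + 12*x - 8           # f''(x)
print(newton_minimize(fp, fpp, 1.0))         # ~1.2808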
Newton’s Method
- A quick detour: root finding.
- Consider the problem of finding zeros of p(x).
- Assume that you know a point a where p(a) is positive and a point b where p(b) is negative.
- If p(x) is continuous between a and b, we can approximate it as:
$$p(x) \simeq p(a) + (x-a)p'(a)$$
- The idea is the same as before: Newton's method also works for finding roots, now iterating $x_{n+1} = x_n - p(x_n)/p'(x_n)$.
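
scipy implements this root-finding iteration; a minimal sketch (the polynomial here is an illustrative assumption):

from scipy.optimize import newton

p = lambda x: x**3 - 2*x - 5                 # illustrative polynomial
pprime = lambda x: 3*x**2 - 2
print(newton(p, x0=2.0, fprime=pprime))      # ~2.0945514815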
Newton's Method

[Figure: Newton's method applied to a function with a discontinuous derivative]
Polytope Method
- We now want to solve $\min_x f(x)$, where $f: \mathbb{R}^n \to \mathbb{R}$.
- We start with the points x1, x2 and x3, such that
$$f(x_1) \geq f(x_2) \geq f(x_3)$$
- Using the midpoint between x2 and x3, we reflect x1 to the point y1.
- Check if f(y1) < f(x1).
- If true, you have a new polytope.
- If not, try reflecting x2; if that also fails, try x3.
- If nothing works, shrink the polytope toward x3.
- Stop when the size of the polytope is smaller than ε (see the scipy sketch after the figures below).
- Let us consider the following function (the Rosenbrock function):
$$f(x_0, x_1) = (1 - x_0)^2 + 100(x_1 - x_0^2)^2$$
[Figure: surface plots of f(x₀, x₁) for x₀ ∈ [−2, 2] and x₁ ∈ [−1, 4], with f ranging from 0 to 3000]
[Figure: contour view of f near the minimum, for x ∈ [−1, 1.5] and y ∈ [−0.5, 1.5]]
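
A minimal sketch of minimizing this function with scipy's Nelder-Mead (polytope/simplex) implementation; the starting point is an illustrative assumption:

from scipy.optimize import minimize

def f(x):
    # Rosenbrock function from the slides
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

res = minimize(f, x0=[-1.0, 1.0], method='Nelder-Mead')
print(res.x)   # ~[1., 1.], the global minimum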
Newton’s Method
- In higher dimensions, the polynomial approximation of order two around a is:
$$p(x) = f(a) + \nabla f(a)(x-a) + \frac{1}{2}(x-a)' H(a)(x-a)$$
where $x' = (x_1, \cdots, x_n)$.
- The Hessian matrix H(x) is a square matrix of second-order partial derivatives that describes the local curvature of a function of many variables:
$$H(x) = \begin{bmatrix}
\frac{\partial^2 f(x)}{\partial x_1^2} & \frac{\partial^2 f(x)}{\partial x_1 \partial x_2} & \cdots & \frac{\partial^2 f(x)}{\partial x_1 \partial x_n} \\
\frac{\partial^2 f(x)}{\partial x_2 \partial x_1} & \frac{\partial^2 f(x)}{\partial x_2^2} & \cdots & \frac{\partial^2 f(x)}{\partial x_2 \partial x_n} \\
\vdots & \vdots & \ddots & \vdots \\
\frac{\partial^2 f(x)}{\partial x_n \partial x_1} & \frac{\partial^2 f(x)}{\partial x_n \partial x_2} & \cdots & \frac{\partial^2 f(x)}{\partial x_n^2}
\end{bmatrix}$$
- The Newton step s is
$$s = H(x^k)^{-1} \nabla f(x^k)$$
- Rather than inverting the Hessian, we can obtain s by solving the equivalent linear system
$$H(x^k)\, s = \nabla f(x^k)$$
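
A minimal numpy sketch of one such step (the quadratic objective is an illustrative assumption; for a quadratic, Newton converges in a single step):

import numpy as np

def newton_step(grad, hess, x):
    """One multidimensional Newton step: solve H(x) s = grad(x), then move."""
    s = np.linalg.solve(hess(x), grad(x))    # avoids forming H(x)^{-1}
    return x - s

# Illustrative quadratic f(x) = x'Ax/2 - b'x, minimized where A x = b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b
hess = lambda x: A
print(newton_step(grad, hess, np.zeros(2)))  # [0.2, 0.4] = A^{-1} b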
Quasi-Newton Methods
- One of the methods that requires the fewest function calls (and is therefore often very fast) is the Newton-Conjugate-Gradient (NCG) method.
- In Python:

>>> from scipy.optimize import fmin_ncg
>>> opt3 = fmin_ncg(f, x0=[10, 10], fprime=gradient)
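
Here f and gradient must be supplied by the user; a runnable sketch using the Rosenbrock function from before, with its hand-derived analytic gradient:

import numpy as np
from scipy.optimize import fmin_ncg

def f(x):
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def gradient(x):
    # analytic gradient of the Rosenbrock function
    df0 = -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2)
    df1 = 200 * (x[1] - x[0]**2)
    return np.array([df0, df1])

opt3 = fmin_ncg(f, x0=[10, 10], fprime=gradient)
print(opt3)   # ~[1., 1.]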
Non-linear Least-Squares
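
A non-linear least-squares problem minimizes a sum of squared residuals, $\min_\theta \sum_i r_i(\theta)^2$. A minimal sketch with scipy.optimize.least_squares (the exponential model and synthetic data are illustrative assumptions):

import numpy as np
from scipy.optimize import least_squares

# Synthetic data from y = a * exp(b * t) with a = 2, b = 0.5 (assumed)
t = np.linspace(0, 2, 20)
y = 2.0 * np.exp(0.5 * t)

def residuals(theta):
    a, b = theta
    return a * np.exp(b * t) - y     # r_i(theta); squared and summed internally

fit = least_squares(residuals, x0=[1.0, 1.0])
print(fit.x)   # ~[2., 0.5]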
Constrained Optimization
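
For problems of the form $\min_x f(x)$ s.t. $x \in B$ from the opening slide, scipy's minimize accepts bounds and constraints. A minimal sketch with method='SLSQP' (the objective and constraint are illustrative assumptions):

from scipy.optimize import minimize

def f(x):
    return (x[0] - 1)**2 + (x[1] - 2.5)**2   # illustrative objective

# Feasible set B: x0 + x1 <= 2 with both coordinates non-negative
cons = ({'type': 'ineq', 'fun': lambda x: 2 - x[0] - x[1]},)
bnds = ((0, None), (0, None))

res = minimize(f, x0=[0.0, 0.0], method='SLSQP', bounds=bnds, constraints=cons)
print(res.x)   # ~[0.25, 1.75], on the boundary of B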