
Optimization methods (MFE)

Review session

Elena Perazzi

EPFL

Fall 2019

Lecture 1 – Unconstrained optimization
Taylor's theorem (1-dim and n-dim)
  - Concept of gradient and Hessian
Unconstrained optimization:
  - First-order necessary conditions
  - Second-order sufficient conditions
  - 1-dimensional and n-dimensional case
Newton method for root finding (1-dim and n-dim)
Newton method for min finding (1-dim and n-dim)
Speed of convergence of a sequence
Secant method
  - The quasi-Newton method implements an n-dimensional secant method
Comparing the secant and Newton methods (see the sketch after this list)
  - Faster convergence for Newton
  - No need to compute derivatives for the secant method
Steepest descent (or gradient descent) method (n-dim)
Simplex method (derivative-free)
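
A minimal Python sketch of the two root-finding updates, to make the comparison concrete; the test function, starting points, and tolerance are illustrative, not from the lecture. For min finding, the same Newton update is applied to f' instead of f.

```python
# Newton vs. secant for 1-dim root finding (illustrative sketch).

def newton(f, fprime, x0, tol=1e-10, max_iter=50):
    """Newton update x_{k+1} = x_k - f(x_k)/f'(x_k): quadratic convergence, needs f'."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

def secant(f, x0, x1, tol=1e-10, max_iter=50):
    """Secant update: replaces f' by the slope through the last two iterates; no derivatives needed."""
    for _ in range(max_iter):
        slope = (f(x1) - f(x0)) / (x1 - x0)
        x0, x1 = x1, x1 - f(x1) / slope
        if abs(x1 - x0) < tol:
            break
    return x1

# Example: root of f(x) = x^3 - 2, i.e. the cube root of 2
f = lambda x: x**3 - 2
fp = lambda x: 3 * x**2
print(newton(f, fp, x0=1.5))      # converges in a handful of iterations (quadratic)
print(secant(f, x0=1.0, x1=2.0))  # a few more iterations (superlinear), but only uses f
```
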
Lecture 2 – Optimization with equality constraints

Lagrange theorem
Cookbook recipe: the Lagrangean and its FOCs (see the sketch after this list)
Constraint qualification: at the optimum, the Jacobian of the constraints must have maximum rank
Intuition: to find a constrained minimum of f, set df = 0 in the directions in which the constraint is preserved.
Interpretation of Lagrange multipliers
Second-order sufficient conditions
  - Conditions on the Lagrangean: positive/negative definiteness of d²L for (dx, dy) s.t. dg = 0
  - Conditions on the bordered Hessian
Newton method with equality constraints: NOT EXAM MATERIAL
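
A minimal symbolic sketch of the cookbook recipe, assuming SymPy is available; the objective, constraint, and numbers are an illustrative toy example, not taken from the lecture.

```python
# Cookbook recipe on a toy problem: maximize f(x, y) = x*y subject to g(x, y) = x + y - 10 = 0.
import sympy as sp

x, y, lam = sp.symbols('x y lambda', real=True)
f = x * y                    # objective
g = x + y - 10               # constraint, written as g(x, y) = 0

L = f - lam * g              # the Lagrangean
focs = [sp.diff(L, v) for v in (x, y, lam)]    # first-order conditions dL = 0
print(sp.solve(focs, [x, y, lam], dict=True))  # -> x = y = 5, lambda = 5
# The multiplier lambda = 5 measures how the optimal value reacts to relaxing the constraint.
```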

Lecture 3 – Optimization with inequality constraints

Penalty function method (also for equality constraints; see the sketch after this list)
Barrier method (only for inequality constraints)
Kuhn-Tucker theorem (necessary conditions) and intuition
Complementary slackness conditions
Kuhn-Tucker sufficient conditions (convexity properties of objective
function and constraints)
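
A minimal sketch of the quadratic penalty method, assuming NumPy and SciPy are available; the toy problem and the penalty schedule are illustrative, not from the lecture.

```python
# Quadratic penalty method on a toy problem:
# minimize f(x) = x1^2 + x2^2  subject to  x1 + x2 >= 1  (constrained minimizer: (0.5, 0.5)).
import numpy as np
from scipy.optimize import minimize

def f(x):
    return x[0]**2 + x[1]**2

def violation(x):
    # constraint written as g(x) = 1 - x1 - x2 <= 0; only its positive part is penalized
    return max(0.0, 1.0 - x[0] - x[1])

x = np.array([0.0, 0.0])                    # start at the unconstrained minimizer
for mu in [1.0, 10.0, 100.0, 1000.0]:       # increase the penalty weight gradually
    penalized = lambda z, mu=mu: f(z) + mu * violation(z)**2
    x = minimize(penalized, x, method='Nelder-Mead').x
print(x)                                    # approaches (0.5, 0.5) as mu grows
```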

Lecture 4 – Dynamic optimization

Finite-horizon: solving the Euler equations starting from the final time T and iterating backward
Infinite-horizon
  - Euler equations and transversality condition
  - Solution by forward iteration of Euler equations
  - Dynamic programming methods
Existence and uniqueness of solution: Contraction mapping theorem,
Blackwell sufficient conditions.
Solution algorithm by value function iteration
Example: optimal growth model (see the sketch below)
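
A minimal sketch of value function iteration for the optimal growth model, assuming log utility and full depreciation so that the known closed-form policy k' = αβ·k^α can be used as a check; the parameter values and grid are illustrative, not from the lecture.

```python
# Value function iteration for the optimal growth model (log utility, full depreciation).
import numpy as np

alpha, beta = 0.3, 0.95                     # capital share, discount factor (illustrative)
k_grid = np.linspace(0.05, 0.5, 200)        # grid for the capital stock
V = np.zeros(len(k_grid))                   # initial guess for the value function

# Consumption implied by each (k today, k' tomorrow) pair: c = k^alpha - k'
c = k_grid[:, None]**alpha - k_grid[None, :]
u = np.where(c > 0, np.log(np.maximum(c, 1e-12)), -np.inf)

for _ in range(2000):
    TV = np.max(u + beta * V[None, :], axis=1)    # apply the Bellman operator
    if np.max(np.abs(TV - V)) < 1e-8:             # contraction: sup-norm convergence
        V = TV
        break
    V = TV

policy = k_grid[np.argmax(u + beta * V[None, :], axis=1)]
# Closed-form policy for this special case: k' = alpha * beta * k^alpha
print(np.max(np.abs(policy - alpha * beta * k_grid**alpha)))   # small, grid-induced error
```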

Lecture 5 – Dynamic optimization

Solution methods by linearization around the steady state: NOT EXAM MATERIAL
Some examples of stochastic problems:
  - Optimal portfolio choice with CRRA utility
  - Unemployed worker problem (see the sketch below)
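
A minimal sketch of the unemployed worker (job search) problem solved by value function iteration; the uniform wage distribution, benefit, and discount factor are illustrative, not from the lecture.

```python
# Unemployed worker problem: each period, accept the current wage offer w forever,
# or reject, receive benefit c, and draw a new offer next period.
import numpy as np

beta, c = 0.95, 25.0                       # discount factor, unemployment benefit (illustrative)
wages = np.linspace(10, 60, 51)            # possible wage offers
p = np.full(len(wages), 1.0 / len(wages))  # uniform offer distribution

V = np.zeros(len(wages))                   # V(w): value of holding offer w
for _ in range(2000):
    accept = wages / (1 - beta)            # accepting pays w every period forever
    reject = c + beta * p @ V              # rejecting pays c plus a fresh draw tomorrow
    TV = np.maximum(accept, reject)
    if np.max(np.abs(TV - V)) < 1e-8:
        V = TV
        break
    V = TV

reservation_wage = (1 - beta) * (c + beta * p @ V)
print(reservation_wage)                    # accept any offer at or above this wage
```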

