
Optimization Algorithms in MATLAB

Maria G. Villarreal
ISE Department, The Ohio State University

February 03, 2011

Outline
Problem Description
Optimization Problems that can be Solved in MATLAB
Optimization Toolbox Solvers
Nonlinear Optimization
Multiobjective Optimization

Problem Description
Objective:
Determine the values of the controllable process variables (factors) that improve the output of a process (or system).

Facts:
We have a computer simulator (an input/output black box) that represents the output of a process (or system). The simulation program takes a very long time to run.

Procedure:
Evaluate a (usually small) set of input combinations (a DOE) in the computer code and obtain an output value for each one. Construct a mathematical model relating inputs and outputs, which is easier and faster to evaluate than the actual computer code. Use this model (metamodel) and, via an optimization algorithm, obtain the values of the controllable variables (inputs/factors) that optimize a particular output (or outputs).
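The procedure above can be illustrated end to end with a small sketch (Python is used here so the example is self-contained and runnable; the `simulator`, the one-dimensional DOE, and the quadratic metamodel are all hypothetical stand-ins for the expensive black-box code and the real surrogate model):

```python
def simulator(x):
    # hypothetical stand-in for the long-running computer code
    return (x - 2.0) ** 2 + 3.0

# 1. Evaluate a small design of experiments (DOE) in the "simulator"
doe = [0.0, 1.0, 4.0]
ys = [simulator(x) for x in doe]

# 2. Fit a quadratic metamodel y = a*x^2 + b*x + c through the three points
#    by solving the 3x3 Vandermonde system with Cramer's rule
def det3(m):
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

x0, x1, x2 = doe
y0, y1, y2 = ys
d = det3([[x0 * x0, x0, 1.0], [x1 * x1, x1, 1.0], [x2 * x2, x2, 1.0]])
a = det3([[y0, x0, 1.0], [y1, x1, 1.0], [y2, x2, 1.0]]) / d
b = det3([[x0 * x0, y0, 1.0], [x1 * x1, y1, 1.0], [x2 * x2, y2, 1.0]]) / d

# 3. Optimize the cheap metamodel instead of the simulator: x* = -b / (2a)
x_star = -b / (2.0 * a)
```

Because the stand-in simulator happens to be quadratic, the metamodel is exact and the optimizer recovers the true minimizer x* = 2; with a real black box the metamodel optimum is only an approximation, usually refined by further simulator runs.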

Optimization Problems that can be Solved in MATLAB (Optimization Toolbox)


Constrained and unconstrained; continuous and discrete
Linear, quadratic, binary integer, nonlinear, and multiobjective problems

Optimization Toolbox Solvers
Minimizers
This group of solvers attempts to find a local minimum of the objective function near a starting point x0.

If you have a Global Optimization Toolbox license, use the GlobalSearch or MultiStart solvers. These solvers automatically generate random start points within bounds.

Optimization Toolbox Solvers
Multiobjective minimizers
This group of solvers attempts to either minimize the maximum value of a set of functions (fminimax), or to find a location where a collection of functions is below some prespecified values (fgoalattain).

Nonlinear Optimization

Rastrigin's function

[Global Optimization Toolbox Guide]
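The surface plot of Rastrigin's function from the toolbox guide is not reproduced here. As a minimal sketch, the function itself, a standard multimodal test problem with many local minima and a single global minimum f(0, ..., 0) = 0, can be written as (Python, for a self-contained runnable illustration):

```python
import math

def rastrigin(x):
    # f(x) = 10 n + sum_i (x_i^2 - 10 cos(2 pi x_i)); global minimum 0 at the origin
    return 10.0 * len(x) + sum(xi * xi - 10.0 * math.cos(2.0 * math.pi * xi)
                               for xi in x)
```

The many local minima (e.g. near every integer lattice point) are what make gradient-based local solvers start-point dependent, motivating the GlobalSearch/MultiStart solvers mentioned earlier.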

General Search Algorithm
Move from point x(k) to x(k+1) = x(k) + Δx(k), where Δx(k) = αk d(k), such that f(x(k+1)) < f(x(k)). Here d(k) is a desirable search direction and αk is called the step size. To find Δx(k) we need to solve two subproblems: one to find d(k) and one for αk. These iterative procedures (techniques) are often called direction methods.

General Steps
1. Estimate a reasonable initial point x(0). Set k = 0 (iteration counter).
2. Compute a search direction d(k).
3. Check convergence.
4. Calculate the step size αk in the direction d(k).
5. Update x(k+1) = x(k) + αk d(k).
6. Go to step 2.
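The steps above can be sketched as follows (Python, for illustration; here the direction is the steepest-descent choice d(k) = -∇f, and for simplicity the step size is a fixed constant rather than the αk computed in step 4):

```python
def steepest_descent(grad, x0, alpha=0.1, tol=1e-8, max_iter=10000):
    x = list(x0)                          # step 1: initial point, k = 0
    for _ in range(max_iter):
        g = grad(x)                       # step 2: direction d(k) = -g
        if sum(gi * gi for gi in g) ** 0.5 < tol:
            break                         # step 3: convergence check
        # steps 4-5: fixed step size, update x(k+1) = x(k) + alpha * d(k)
        x = [xi - alpha * gi for xi, gi in zip(x, g)]
    return x                              # step 6 is the loop itself

# usage: minimize f(x, y) = (x - 1)^2 + 2*y^2, gradient (2(x - 1), 4y)
x_min = steepest_descent(lambda v: [2 * (v[0] - 1), 4 * v[1]], [5.0, 5.0])
```

A fixed step only converges if it is small enough for the problem's curvature; that is exactly why the later slides treat the step-size calculation as its own subproblem.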

Algorithms to Compute the Search Direction (d)


Steepest Descent Method (gradient method)
Conjugate Gradient Method
Newton's Method (uses second-order partial derivative information)
Quasi-Newton Methods (approximate the Hessian matrix and its inverse using first-order derivative information)
  DFP Method (approximates the inverse of the Hessian)
  BFGS Method (approximates the Hessian matrix)

Steepest Descent Method
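The slide's graphical example is not reproduced here. In the steepest descent method the direction is d(k) = -∇f(x(k)); as a minimal sketch (Python, for illustration), for a quadratic f(x) = ½xᵀQx - bᵀx the exact line-search step has the closed form αk = gᵀg / gᵀQg:

```python
def steepest_descent_quadratic(Q, b, x0, tol=1e-10, max_iter=1000):
    # minimize f(x) = 0.5*x'Qx - b'x (Q symmetric positive definite);
    # gradient g = Qx - b, exact line-search step alpha = g'g / g'Qg
    x = list(x0)
    n = len(x)
    for _ in range(max_iter):
        g = [sum(Q[i][j] * x[j] for j in range(n)) - b[i] for i in range(n)]
        gg = sum(gi * gi for gi in g)
        if gg ** 0.5 < tol:
            break
        Qg = [sum(Q[i][j] * g[j] for j in range(n)) for i in range(n)]
        alpha = gg / sum(g[i] * Qg[i] for i in range(n))
        x = [x[i] - alpha * g[i] for i in range(n)]
    return x

# the minimum solves Qx = b, i.e. (1, 1) for this Q and b
x_opt = steepest_descent_quadratic([[2.0, 0.0], [0.0, 4.0]],
                                   [2.0, 4.0], [5.0, 5.0])
```

Even with exact line search, steepest descent zigzags and converges only linearly on ill-conditioned problems, which motivates the Newton and quasi-Newton directions on the next slides.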


Newton's Method


1. Estimate a starting point x(0). Set k = 0.
2. Calculate c(k) (the gradient of f(x) at x(k)).
3. Check convergence (if ||c(k)|| < ε, stop).
4. Calculate the Hessian matrix at x(k), H(k).
5. Calculate the search direction as d(k) = -[H(k)]^(-1) c(k). Use another algorithm to compute αk.
6. Update x(k+1) = x(k) + αk d(k).
7. Set k = k + 1 and go to step 2.
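A one-dimensional sketch of these steps (Python, for illustration; the step size αk is simply taken as 1 here, the classical full Newton step, rather than computed by a separate algorithm as in step 5):

```python
def newton_1d(grad, hess, x0, eps=1e-10, max_iter=100):
    x = x0                       # step 1: starting point, k = 0
    for _ in range(max_iter):
        c = grad(x)              # step 2: gradient c(k)
        if abs(c) < eps:         # step 3: convergence check
            break
        H = hess(x)              # step 4: Hessian (in 1-D, the second derivative)
        d = -c / H               # step 5: direction d(k) = -H(k)^-1 c(k)
        x = x + d                # step 6: update with alpha_k = 1
    return x                     # step 7 is the loop itself

# minimize f(x) = x^4 - 3x^3 + 2: f'(x) = 4x^3 - 9x^2, f''(x) = 12x^2 - 18x;
# starting from x = 3, Newton converges to the local minimizer x = 9/4
x_min = newton_1d(lambda x: 4 * x**3 - 9 * x**2,
                  lambda x: 12 * x**2 - 18 * x,
                  3.0)
```

Note that the same f started near x = 0 would illustrate the drawbacks listed below: there f'' is zero or negative, so the full Newton step is undefined or not a descent direction.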

Drawbacks:
Needs to calculate second-order derivatives.
H needs to be positive definite to assure a descent direction.
H may be singular at some point.



Quasi-Newton Method: BFGS Method (Used in MATLAB)

[Arora, J. (2004), Introduction to Optimum Design, 2nd Ed., page 327]
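The BFGS update formulas themselves are given in the Arora reference above and are not reproduced here. As a minimal sketch of the quasi-Newton idea, approximating curvature from first-order information only, here is its one-dimensional form (Python, for illustration), where the second derivative is replaced by a secant estimate built from successive gradients; BFGS and DFP generalize this secant condition to Hessian matrices:

```python
def secant_quasi_newton(grad, x0, x1, eps=1e-10, max_iter=100):
    # 1-D quasi-Newton: H is a secant approximation to f'' built from
    # first derivatives only; no second-order derivatives are evaluated
    c0, c1 = grad(x0), grad(x1)
    for _ in range(max_iter):
        if abs(c1) < eps:
            break
        H = (c1 - c0) / (x1 - x0)        # secant approximation to f''
        x0, c0 = x1, c1
        x1 = x1 - c1 / H                  # quasi-Newton step d = -c/H
        c1 = grad(x1)
    return x1

# minimize f(x) = (x - 2)^2 + 3 using only its first derivative f'(x) = 2(x - 2)
x_min = secant_quasi_newton(lambda x: 2.0 * (x - 2.0), 0.0, 1.0)
```

On this quadratic the secant estimate of f'' is exact, so the method lands on x = 2 in a single step, mirroring how BFGS builds an increasingly accurate Hessian approximation as iterations accumulate.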


Methods to Calculate the Step Size (assuming d is known)


Equal Interval Search
Golden Section Search
Polynomial Interpolation
Inexact (Inaccurate) Line Search
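Of the methods listed, golden section search is the easiest to sketch (Python, for illustration): it shrinks a bracketing interval [a, b] around the minimizer of the one-dimensional function φ(α) = f(x + α d) by the golden ratio each iteration, reusing one interior point per step:

```python
import math

def golden_section(phi, a, b, tol=1e-8):
    # assumes phi is unimodal on [a, b]
    invphi = (math.sqrt(5.0) - 1.0) / 2.0   # 1/golden ratio, about 0.618
    c = b - invphi * (b - a)                 # lower interior point
    d = a + invphi * (b - a)                 # upper interior point
    while b - a > tol:
        if phi(c) < phi(d):
            b, d = d, c                      # minimum lies in [a, d]
            c = b - invphi * (b - a)
        else:
            a, c = c, d                      # minimum lies in [c, b]
            d = a + invphi * (b - a)
    return (a + b) / 2.0

# hypothetical step-size search along a direction: phi(alpha) = f(x + alpha*d)
alpha_star = golden_section(lambda t: (t - 0.7) ** 2, 0.0, 2.0)
```

Each iteration shrinks the interval by a factor of about 0.618 using a single new function evaluation, which is why it is a standard choice when derivatives of φ are unavailable.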


Optimization Toolbox for Nonlinear Optimization
Solvers:
fmincon (constrained nonlinear minimization)
Trust-region-reflective (default)
Allows only bounds or linear equality constraints, but not both.

Active-set (solves the Karush-Kuhn-Tucker (KKT) equations and uses a quasi-Newton method to approximate the Hessian matrix)
Interior-point
Sequential Quadratic Programming (SQP)

fminunc (unconstrained nonlinear minimization)


Large-scale problem: trust-region method based on the interior-reflective Newton method.
Medium-scale: BFGS quasi-Newton method with a cubic line search procedure.

fminsearch (unconstrained multivariable optimization, nonsmooth functions)


Nelder-Mead simplex (a derivative-free method)


Multiobjective Optimization
Solvers:
fminimax (minimizes the maximum value of a set of functions)
fgoalattain (finds a location where a collection of functions is below some prespecified values)


Choosing a Solver (Optimization Decision Table)

[Optimization Toolbox Guide]

