
Unconstrained Optimization Methods
Amirkabir University of Technology
Dr. Madadi
Unconstrained minimization
• A practical design problem is rarely unconstrained; still, unconstrained minimization is worth studying because:
• The constraints do not have a significant influence in certain design problems.
• Some of the powerful and robust methods for solving constrained minimization problems require the use of unconstrained minimization techniques.
• The study of unconstrained minimization techniques provides the basic understanding necessary for the study of constrained minimization methods.
• Unconstrained minimization methods can be used to solve certain complex engineering analysis problems.

Find $\mathbf{X} = \{x_1, x_2, \ldots, x_n\}^{T}$ which minimizes $f(\mathbf{X})$.
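As a concrete illustration of this problem statement, here is a minimal Python sketch using scipy.optimize.minimize; the Rosenbrock objective and the starting point are assumptions chosen for illustration, not part of the slides.

```python
# A minimal sketch of "find X which minimizes f(X)".
import numpy as np
from scipy.optimize import minimize

def f(x):
    # Rosenbrock function (assumed test objective): minimum f = 0 at X* = (1, 1)
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

x0 = np.array([-1.2, 1.0])        # initial trial solution
result = minimize(f, x0, method="BFGS")
print(result.x, result.fun)       # approximately [1. 1.] and 0.0
```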
Classification of Unconstrained Minimization Methods
• Direct search methods require only the objective function values.
• Also called non-gradient methods or zeroth-order methods.
• Most suitable for simple problems involving a relatively small number of variables.
• Generally less efficient than the descent methods.
Classification of Unconstrained Minimization Methods
• Descent techniques require, in addition to the function values, the first and in some cases the second derivatives of the objective function.
• Also called gradient methods.
• Generally more efficient than the direct search methods.
• First-order methods: require the first derivatives of the function.
• Second-order methods: require the first and second derivatives of the function (the two step types are contrasted in the sketch below).
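To make the distinction concrete, the sketch below contrasts a single first-order (steepest descent) step with a single second-order (Newton) step; the quadratic test function and the step length are assumptions for illustration.

```python
import numpy as np

def grad(x):
    # gradient of the assumed quadratic f(X) = x1^2 + 10*x2^2
    return np.array([2.0 * x[0], 20.0 * x[1]])

def hess(x):
    # Hessian of the same quadratic (constant)
    return np.array([[2.0,  0.0],
                     [0.0, 20.0]])

x = np.array([1.0, 1.0])
# first-order step: uses the gradient only (step length 0.02 is an assumption)
x_first = x - 0.02 * grad(x)
# second-order (Newton) step: also uses the Hessian
x_second = x - np.linalg.solve(hess(x), grad(x))
print(x_first, x_second)   # [0.96 0.6] vs [0. 0.]: Newton reaches the minimum in one step
```

On a quadratic the Newton step lands exactly on the minimum, which is why second-order methods are generally more efficient when second derivatives are available.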
General Approach
• All the methods are iterative: they start from an initial trial solution $\mathbf{X}_1$ and proceed toward the minimum point in a sequential manner, typically via $\mathbf{X}_{i+1} = \mathbf{X}_i + \lambda_i \mathbf{S}_i$, where $\lambda_i$ is the step length and $\mathbf{S}_i$ the search direction.
• The methods differ from one another only in how they generate the new point $\mathbf{X}_{i+1}$ (from $\mathbf{X}_i$) and in how they test the point for optimality; a minimal sketch of this template follows.
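A minimal sketch of this iterative template, assuming a steepest descent direction, a fixed step length, and a gradient-norm optimality test (all illustrative choices, not prescribed by the slides):

```python
import numpy as np

def grad_f(x):
    # gradient of the assumed test objective f(X) = (x1-3)^2 + (x2+1)^2
    return np.array([2.0 * (x[0] - 3.0), 2.0 * (x[1] + 1.0)])

x = np.zeros(2)                      # initial trial solution X_1
lam = 0.1                            # step length lambda_i, fixed here
for i in range(1000):
    g = grad_f(x)
    if np.linalg.norm(g) < 1e-8:     # test the point for optimality
        break
    x = x - lam * g                  # X_{i+1} = X_i + lambda_i * S_i, with S_i = -grad f
print(i, x)                          # converges to the minimum (3, -1)
```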
Rate of Convergence
• Let the sequence of iterates $\{\mathbf{X}_i\}$ converge to the solution $\mathbf{X}^*$. The method has convergence of order $p$ if

$$\lim_{i \to \infty} \frac{\|\mathbf{X}_{i+1} - \mathbf{X}^*\|}{\|\mathbf{X}_i - \mathbf{X}^*\|^{p}} = \beta, \qquad \beta \ge 0,\ p \ge 1$$

• If $p = 1$ and $0 \le \beta \le 1$, the method is said to be linearly convergent (corresponds to slow convergence). If $p = 2$, the method is said to be quadratically convergent (corresponds to fast convergence). An optimization method is said to have superlinear convergence (corresponds to fast convergence) if

$$\lim_{i \to \infty} \frac{\|\mathbf{X}_{i+1} - \mathbf{X}^*\|}{\|\mathbf{X}_i - \mathbf{X}^*\|} \to 0$$
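These definitions can be checked empirically. The sketch below estimates $p$ from the error sequence of an assumed quadratically convergent iteration (Newton's method for a square root), using the log-ratio of successive errors.

```python
import numpy as np

x_star = 1.0
xs = [2.0]
for _ in range(4):
    xs.append(0.5 * (xs[-1] + 1.0 / xs[-1]))   # Newton step for sqrt(1)

e = [abs(x - x_star) for x in xs]               # errors ||X_i - X*||
for i in range(len(e) - 2):
    # log-ratio estimate of the order: p ~ log(e_{i+2}/e_{i+1}) / log(e_{i+1}/e_i)
    p = np.log(e[i + 2] / e[i + 1]) / np.log(e[i + 1] / e[i])
    print(f"estimated order p = {p:.2f}")        # tends toward 2 (quadratic)
```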
Scaling of Design Variables
• The rate of convergence of most unconstrained minimization methods can be improved by scaling the design variables.
• Condition number of a matrix: $\mathrm{cond}(\mathbf{A}) = \|\mathbf{A}\|\,\|\mathbf{A}^{-1}\| \ge 1$.
• A large condition number means round-off errors are strongly amplified.
• Well conditioned: $\mathrm{cond}(\mathbf{A})$ close to 1.
• Ill conditioned: $\mathrm{cond}(\mathbf{A})$ much larger than 1. A sketch of the effect of scaling follows.
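A minimal sketch of the idea, assuming an ill-conditioned quadratic objective $f(\mathbf{X}) = \tfrac{1}{2}\mathbf{X}^{T}\mathbf{A}\mathbf{X}$: rescaling the design variables so the Hessian becomes the identity drives the condition number to 1.

```python
import numpy as np

A = np.array([[1.0,   0.0],
              [0.0, 100.0]])          # Hessian of the assumed quadratic
print(np.linalg.cond(A))              # 100.0 -> ill conditioned

# rescale the design variables, X = R Y, so the Hessian in Y becomes the identity
R = np.diag(1.0 / np.sqrt(np.diag(A)))
print(np.linalg.cond(R.T @ A @ R))    # 1.0 -> well conditioned
```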
Direct search methods
• Random search methods (a sketch of the random jumping method follows the list):
  - Random jumping method
  - Random walk method
  - Random walk method with direction exploitation
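A minimal sketch of the random jumping method, under assumed variable bounds and an assumed test objective: sample candidate points uniformly inside the bounds and keep the best one found.

```python
import numpy as np

def f(x):
    # assumed test objective with minimum at (3, -1)
    return (x[0] - 3.0)**2 + (x[1] + 1.0)**2

rng = np.random.default_rng(0)
lower = np.array([-5.0, -5.0])        # assumed lower bounds
upper = np.array([ 5.0,  5.0])        # assumed upper bounds

best_x, best_f = None, np.inf
for _ in range(10_000):               # number of random jumps
    x = rng.uniform(lower, upper)     # random point inside the bounds
    if f(x) < best_f:
        best_x, best_f = x, f(x)
print(best_x, best_f)                 # near (3, -1), f near 0
```

The random walk method differs only in that each candidate is generated as a step of prescribed length in a random direction from the current best point, rather than sampled over the whole region.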
Random search methods
• Work even if the objective function is discontinuous and non-differentiable at some of the points.
• Can find the global minimum when the objective function possesses several relative minima.
• Useful when other methods fail due to local difficulties such as sharply varying functions and shallow regions.
• Can be used in the early stages of optimization to detect the region where the global minimum is likely to be found.
Grid search method
• Requires a large number of function evaluations, so it is practical only for a small number of design variables.
• Can find a good starting point for other, more efficient methods; a sketch follows.
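A minimal sketch of grid search under assumed bounds; the best grid point can then seed a more efficient method.

```python
import numpy as np

def f(x1, x2):
    # assumed test objective with minimum at (3, -1)
    return (x1 - 3.0)**2 + (x2 + 1.0)**2

grid1 = np.linspace(-5.0, 5.0, 21)    # 21 points per variable
grid2 = np.linspace(-5.0, 5.0, 21)    # 21^2 = 441 evaluations in total
X1, X2 = np.meshgrid(grid1, grid2)
F = f(X1, X2)                         # evaluate f over the whole grid

i, j = np.unravel_index(np.argmin(F), F.shape)
print(X1[i, j], X2[i, j], F[i, j])    # (3.0, -1.0), the best grid point
```

Note how the evaluation count grows as (points per variable)^n, which is why the method is limited to a small number of design variables.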
