Dr Yasser El-shaer
Upon completion of this chapter, you will be able to:
• Define local and global minima (maxima) for unconstrained and constrained
optimization problems
• Write optimality conditions for unconstrained problems
• Write optimality conditions for constrained problems
• Check optimality of a given point for unconstrained and constrained problems
• Solve first-order optimality conditions for candidate minimum points
• Check convexity of a function and the design optimization problem
• Use Lagrange multipliers to study changes to the optimum value of the cost function
due to variations in a constraint
Classification of optimization methods
OPTIMALITY CONDITIONS
Basic concept of optimality: Functions of a Single Variable
A stationary point may be (a) a minimum, (b) a maximum, or (c) an inflection point
DEFINITIONS OF GLOBAL AND LOCAL MINIMA
• A global minimum point is the one where there are no other feasible points with better cost function
values.
• A local minimum point is the one where there are no other feasible points “in the vicinity” with better cost
function values.
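Formally (a standard statement added here for reference): a point x* in the feasible set S is a global minimum of f if

$$f(\mathbf{x}^*) \le f(\mathbf{x}) \quad \text{for all } \mathbf{x} \in S,$$

and a local minimum if there exists a δ > 0 such that the same inequality holds for all feasible x with ||x − x*|| < δ.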
Existence of a Minimum
• Necessary condition:
Consider a function f(x) of a single variable defined on the interval a < x < b. For a point x* ∈ (a, b) to minimize f(x), x* must be a stationary point of f; that is, the first derivative of f(x) with respect to x must vanish at x = x*: f′(x*) = 0.
• Sufficient condition:
For the same function f(x), suppose f′(x*) = 0. Then f(x*) is a local minimum value of f(x) if f″(x*) > 0, or a local maximum value if f″(x*) < 0.
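As a quick illustration (an assumed example, not taken from the slides), let f(x) = x² − 4x + 5 on 0 < x < 4. Then

$$f'(x) = 2x - 4 = 0 \;\Rightarrow\; x^* = 2, \qquad f''(x^*) = 2 > 0,$$

so x* = 2 is a local (in fact global) minimum with f(x*) = 1.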
Functions of a Single Variable
Basic concept of optimality: Functions of Multiple Variables
• Necessary condition:
The gradient of the function f(x) must vanish at x = x*; that is, x* must be a stationary point with ∇f(x*) = 0.
• Sufficient condition:
For the same function f(x), let ∇f(x*) = 0. Then f(x*) is a local minimum value of f(x) if the Hessian matrix of f at x* is positive definite.
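In symbols, the first-order necessary condition and the second-order sufficient condition for a function of several variables read

$$\nabla f(\mathbf{x}^*) = \mathbf{0}, \qquad \mathbf{H}(\mathbf{x}^*) = \left[ \frac{\partial^2 f}{\partial x_i \, \partial x_j}(\mathbf{x}^*) \right] \text{ positive definite} \;\Rightarrow\; \mathbf{x}^* \text{ is a local minimum point.}$$

As an assumed illustration, f(x1, x2) = x1² + x2² − 2x1 has ∇f = (2x1 − 2, 2x2)^T, which vanishes at x* = (1, 0); the Hessian H = 2I is positive definite, so x* is a local minimum.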
• Many different matrices P define the same quadratic form x^T P x; all of them are asymmetric except one, the symmetric matrix A associated with the quadratic form.
• The symmetric matrix A is obtained from any asymmetric matrix P by dividing each pair of off-diagonal coefficients equally between the two off-diagonal terms:
aij = aji = (pij + pji)/2, i.e., A = (P + P^T)/2, which leaves the value of the quadratic form unchanged.
DETERMINATION OF THE FORM OF A MATRIX
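A minimal MATLAB sketch of this determination (the matrix P below is an assumed example, not taken from the slides): form the symmetric part of P and classify the quadratic form from the eigenvalues.

```matlab
% Assumed example matrix P for the quadratic form x'*P*x
P = [2 4 0; 0 3 2; 2 0 5];
A = (P + P')/2;     % symmetric matrix giving the same quadratic form
e = eig(A);         % eigenvalues of A determine the form of the matrix
if all(e > 0)
    disp('positive definite')
elseif all(e >= 0)
    disp('positive semidefinite')
elseif all(e < 0)
    disp('negative definite')
elseif all(e <= 0)
    disp('negative semidefinite')
else
    disp('indefinite')
end
```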
Stationary point nature summary
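In brief (a reconstructed summary, since the original table is not reproduced in the text):
• Single variable, f′(x*) = 0: f″(x*) > 0 gives a local minimum; f″(x*) < 0 gives a local maximum; f″(x*) = 0 means higher-order derivatives must be examined (possible inflection point).
• Several variables, ∇f(x*) = 0: a positive definite Hessian gives a local minimum; a negative definite Hessian gives a local maximum; an indefinite Hessian gives a saddle point; a semidefinite Hessian leaves the test inconclusive.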
Gradient Vector: Partial Derivatives of a Function
• Geometrically, the gradient vector is normal to the tangent plane at the point
x*, as shown in Figure for a function of three variables. Also, it points in the
direction of maximum increase in the function.
• At a minimum or maximum, the first derivatives of the function with respect to all of the variables are zero; such a point is again called a stationary point.
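For a function of n variables, the gradient vector collects these first partial derivatives:

$$\nabla f(\mathbf{x}) = \left[ \frac{\partial f}{\partial x_1}, \frac{\partial f}{\partial x_2}, \ldots, \frac{\partial f}{\partial x_n} \right]^T,$$

and the stationarity condition is ∇f(x*) = 0.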
Example 3: MINIMUM-COST CYLINDRICAL TANK DESIGN
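A typical formulation of this classic problem (sketched here with assumed notation and symbols, since the slide's data is not reproduced in the text): choose the radius R and height H of a closed-ended cylindrical tank so that the material cost, proportional to the surface area, is minimized while a required volume V is enclosed:

$$\text{minimize } f(R, H) = c\,(2\pi R^2 + 2\pi R H) \quad \text{subject to } \pi R^2 H = V, \quad R > 0, \; H > 0.$$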
• The core idea is to look for points where the contour lines of f and g are tangent to each other.
• This is the same as finding points where the gradient vectors of f and g are parallel to each other.
• Using contour maps
• Reasoning about this problem becomes easier if we
visualize it not with a graph, but with its contour lines.
• When the contour lines of the two functions f and g are tangent, their gradient vectors are parallel.
Lagrange wrote down a special new function that takes in all the same input variables as f and g, along with a new variable λ:
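In symbols (a standard statement; sign conventions differ between texts), for minimizing f(x) subject to g(x) = c the Lagrangian is

$$\mathcal{L}(\mathbf{x}, \lambda) = f(\mathbf{x}) - \lambda\,[\,g(\mathbf{x}) - c\,],$$

and setting all of its partial derivatives to zero gives

$$\nabla f(\mathbf{x}) = \lambda\,\nabla g(\mathbf{x}), \qquad g(\mathbf{x}) = c,$$

which is exactly the condition that the gradients of f and g are parallel at a point satisfying the constraint.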
Example
• Consider the following inputs:
Solution:
One solution can be obtained by setting s to zero to satisfy the condition 2us = 0 in Eq. (g). When s = 0, the inequality constraint is active, and x1, x2, and u are then solved from the remaining three equations, (d) through (f), which are linear in these variables; this gives x1* = x2* = 1, u* = 1, s = 0. This is a stationary point of L, so it is a candidate minimum point.
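For reference, with a single inequality constraint g(x1, x2) ≤ 0 converted to the equality g + s² = 0 by means of the slack variable s, the Lagrangian and the conditions referred to above as Eqs. (d) through (g) have the general form (the specific f and g of this example are not reproduced here):

$$L(x_1, x_2, u, s) = f(x_1, x_2) + u\,[\,g(x_1, x_2) + s^2\,],$$

$$\frac{\partial L}{\partial x_1} = 0, \quad \frac{\partial L}{\partial x_2} = 0, \quad \frac{\partial L}{\partial u} = g + s^2 = 0, \quad \frac{\partial L}{\partial s} = 2us = 0, \quad u \ge 0.$$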
Note from Figure that it is actually a minimum point,
since any move away from x* either violates the
constraint or increases the cost function.
• These gradients are along the same line but in opposite directions, as shown in Figure. Observe also that any small move from point C either increases the cost function or takes the design into the infeasible region; that is, the cost function cannot be reduced any further without violating the constraint. Thus, point (1, 1) is indeed a local minimum point.
• This geometrical condition is called the sufficient condition for a local
minimum point.
• It turns out that the necessary condition u ≥ 0 ensures that the gradients of the cost and the constraint functions point in opposite directions. This way f cannot be reduced any further by stepping in the negative gradient direction of the cost function without violating the constraint. That is, any further reduction in the cost function requires leaving the feasible region at the candidate minimum point. This can be observed in Figure.
• The necessary conditions for the equality- and inequality-constrained problem written in the standard form can be summed up in what are commonly known as the Karush-Kuhn-Tucker (KKT) conditions.
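For the standard form, minimize f(x) subject to h_i(x) = 0, i = 1, ..., p, and g_j(x) ≤ 0, j = 1, ..., m, the KKT first-order necessary conditions can be written as

$$\nabla f(\mathbf{x}^*) + \sum_{i=1}^{p} v_i^* \, \nabla h_i(\mathbf{x}^*) + \sum_{j=1}^{m} u_j^* \, \nabla g_j(\mathbf{x}^*) = \mathbf{0},$$

$$h_i(\mathbf{x}^*) = 0, \qquad g_j(\mathbf{x}^*) \le 0, \qquad u_j^* \ge 0, \qquad u_j^* \, g_j(\mathbf{x}^*) = 0,$$

where v_i* and u_j* are the Lagrange multipliers of the equality and inequality constraints, respectively.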
Solution to optimality conditions using MATLAB
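A minimal MATLAB sketch of this step, assuming the Symbolic Math Toolbox and an assumed example problem chosen so that its candidate point matches the values quoted earlier (x1* = x2* = 1, u* = 1); the slides' actual example and code may differ.

```matlab
% Assumed example (not necessarily the slides' problem):
%   minimize   f = (x1 - 1.5)^2 + (x2 - 1.5)^2
%   subject to g = x1 + x2 - 2 <= 0
% Requires the Symbolic Math Toolbox.
syms x1 x2 u s real
f = (x1 - 1.5)^2 + (x2 - 1.5)^2;   % cost function
g = x1 + x2 - 2;                    % inequality constraint, g <= 0
L = f + u*(g + s^2);                % Lagrangian with slack variable s

% First-order optimality conditions: all partial derivatives of L vanish
eqs = [diff(L, x1) == 0, diff(L, x2) == 0, ...
       diff(L, u)  == 0, diff(L, s)  == 0];
sol = solve(eqs, [x1, x2, u, s], 'Real', true);

% Display the candidate point(s); the KKT sign condition u >= 0 must
% still be checked for each candidate.
[sol.x1, sol.x2, sol.u, sol.s]
```

For this assumed problem the only real stationary point of L is x1 = x2 = 1, u = 1, s = 0, which satisfies u ≥ 0 and is therefore the candidate minimum point.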