
Engineering Optimization

Concepts and Applications

Fred van Keulen


Matthijs Langelaar
CLA H21.1
A.vanKeulen@tudelft.nl

Engineering Optimization – Concepts and Applications


Contents
● Constrained Optimization: Optimality conditions recap

● Constrained Optimization: Algorithms

● Linear programming



Inequality constrained problems
● Consider a problem with only inequality constraints:

  min_x f(x),   x ∈ ℝⁿ
  s.t.  g_i(x) ≤ 0,   i = 1 … m

● At the optimum, only the active constraints matter:

  min_x f(x),   x ∈ ℝⁿ
  s.t.  g_j(x) = 0,   j = 1 … k ≤ m

  (Figure: contours of f with constraints g₁, g₂, g₃ in the (x₁, x₂)-plane)

● Optimality conditions are similar to the equality constrained case


Inequality constraints
● First order optimality, with Lagrangian  L(x, μ) = f(x) + μᵀ g(x):

  ∂f/∂x + μᵀ ∂g/∂x = 0ᵀ,   μ ≥ 0ᵀ

● Consider a feasible local variation δx around a boundary optimum (g = 0):

  ∂f/∂x δx = −μᵀ ∂g/∂x δx

  with ∂f/∂x δx ≥ 0 (boundary optimum) and ∂g/∂x δx ≤ 0 (feasible perturbation)
Optimality condition
● Multipliers must be non-negative:

  ∂f/∂x δx = −μᵀ ∂g/∂x δx ≥ 0

  with ∂f/∂x δx ≥ 0 (boundary optimum) and ∂g/∂x δx ≤ 0 (feasible
  perturbation), this requires μ ≥ 0:

  ∇f + Σ_{j=1}^{k≤m} μ_j ∇g_j = 0   ⇔   −∇f = Σ_j μ_j ∇g_j

● Interpretation: the negative gradient −∇f (descent direction) lies in
  the cone spanned by the positive constraint gradients
  (Figure: −∇f between ∇g₁ and ∇g₂ at the intersection of both constraints)
Optimality condition (2)

  ∇f = −Σ_j μ_j ∇g_j

● Feasible direction s (stays in the feasible cone of the active
  constraints):

  ∇g_jᵀ s ≤ 0,   j = 1 … k ≤ m

● Descent direction s:

  ∇fᵀ s < 0   ⇔   Σ_j μ_j ∇g_jᵀ s > 0

  which is impossible when μ_j ≥ 0 and ∇g_jᵀ s ≤ 0 for all active j

● Equivalent interpretation: no descent direction exists within the
  cone of feasible directions
  (Figure: feasible cone between the constraint boundaries g₁ and g₂,
  with −∇f inside the cone of the constraint gradients)
Karush-Kuhn-Tucker conditions
● First order optimality conditions for the constrained problem:

  min_x f(x)
  s.t.  g(x) ≤ 0
        h(x) = 0

  Lagrangian:  L = f(x) + μᵀ g(x) + λᵀ h(x)

  Stationarity:     ∂L/∂x = ∂f/∂x + Σ_i μ_i ∂g_i/∂x + Σ_i λ_i ∂h_i/∂x = 0ᵀ
  Feasibility:      g ≤ 0,  h = 0
  Complementarity:  μ ≥ 0,  μ_k g_k = 0   (λ unrestricted in sign)
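The KKT conditions above can be checked numerically at a candidate point. A minimal sketch, using a hypothetical example problem (not from the slides): minimize x₁² + x₂² subject to h = x₁ + x₂ − 2 = 0 and g = −x₁ ≤ 0, with candidate x* = (1, 1), λ = −2, μ = 0.

```python
import numpy as np

# Hypothetical example problem: min x1^2 + x2^2
# s.t. h(x) = x1 + x2 - 2 = 0 and g(x) = -x1 <= 0.
f = lambda x: x[0]**2 + x[1]**2
g = lambda x: -x[0]            # inequality constraint, g <= 0
h = lambda x: x[0] + x[1] - 2  # equality constraint, h = 0

def grad(fun, x, eps=1e-6):
    """Central finite-difference gradient."""
    x = np.asarray(x, dtype=float)
    return np.array([(fun(x + eps*e) - fun(x - eps*e)) / (2*eps)
                     for e in np.eye(len(x))])

def kkt_residuals(x, mu, lam):
    """Stationarity, feasibility and complementarity residuals."""
    stat = grad(f, x) + mu*grad(g, x) + lam*grad(h, x)
    feas = max(abs(h(x)), max(0.0, g(x)))
    compl = abs(mu * g(x))
    return np.linalg.norm(stat), feas, compl

# Candidate optimum x* = (1, 1) with lam = -2, mu = 0 (inactive g)
stat, feas, compl = kkt_residuals([1.0, 1.0], mu=0.0, lam=-2.0)
print(stat, feas, compl)  # all residuals close to zero
```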


Sufficiency
● The KKT conditions are necessary conditions for local constrained
  minima

● For sufficiency, consider the second-order condition based on the
  active constraints:

  δxᵀ (∂²L/∂x²) δx > 0

  for all δx in the tangent subspace of h and the active g

● Special case: convex objective & convex feasible region:
  the KKT conditions are sufficient for global optimality
Significance of multipliers
● Consider the case where the optimization problem depends on a
  parameter a:

  min_x f(x; a)
  s.t.  h(x; a) = 0

  Lagrangian:  L(x, λ; a) = f(x; a) + λᵀ h(x; a)

  KKT:  ∂L/∂x = ∂f/∂x + λᵀ ∂h/∂x = 0ᵀ   ⇒   ∂f/∂x = −λᵀ ∂h/∂x

  Looking for:  df/da = ∂f/∂a + ∂f/∂x · dx/da
Significance of multipliers (3)
● Lagrange multipliers describe the sensitivity of the objective to
  changes in the constraints:

  df/da = ∂f/∂a + λᵀ ∂h/∂a

● Similar equations can be derived for multiple constraints and for
  inequality constraints

● Multipliers give the “price of raising the constraint”

● Note: this makes it logical that, at an optimum, the multipliers of
  inequality constraints must be non-negative!
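The sensitivity interpretation can be verified on a hypothetical 1-D example (not from the slides): f(x; a) = (x − 1)² with h(x; a) = x − a = 0. Feasibility forces x* = a, so f*(a) = (a − 1)² and df*/da = 2(a − 1); the multiplier formula should reproduce this.

```python
# Hypothetical 1-D illustration: f(x; a) = (x - 1)^2, h(x; a) = x - a = 0.
a = 3.0
x_star = a                        # feasibility forces x = a
lam = -2.0 * (x_star - 1.0)       # stationarity: 2(x - 1) + lam * dh/dx = 0

# Sensitivity via the multiplier: df/da = del f/del a + lam * dh/da,
# with del f/del a = 0 and dh/da = -1
df_da_multiplier = 0.0 + lam * (-1.0)

# Sensitivity via finite differences on the optimal value f*(a)
eps = 1e-6
f_star = lambda a: (a - 1.0)**2
df_da_fd = (f_star(a + eps) - f_star(a - eps)) / (2 * eps)

print(df_da_multiplier, df_da_fd)  # both 4.0: the multiplier prices the constraint
```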
Contents
● Constrained Optimization: Optimality Criteria

● Constrained Optimization: Algorithms

● Linear Programming



Constrained optimization methods
● Approaches:

– Transformation methods (penalty / barrier functions)


plus unconstrained optimization algorithms
– Random methods / Simplex-like methods

– Feasible direction methods

– Reduced gradient methods

– Approximation methods (SLP, SQP)

● Note, constrained problems can also have interior optima!



Augmented Lagrangian method
● Recall the penalty method:

  f̃ = f + p (max(0, g))²

  (Figure: penalized function f̃ for p = 10)

● Disadvantages:

  – High penalty factor needed for accurate results
  – High penalty factor causes ill-conditioning, slow convergence


Augmented Lagrangian method
● Basic idea:

  – Add a penalty term to the Lagrangian
  – Use estimates and updates of the multipliers

  L_A = f + Σ_i λ_i h_i + p Σ_i h_i²

● Also possible for inequality constraints

● The multiplier update rules determine convergence:

  λ_i^(k+1) = λ_i^(k) + 2 p h_i(x^(k))

● Exact convergence for moderate values of p
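The loop above can be sketched in a few lines. A minimal implementation, on a hypothetical equality-constrained problem (not from the slides): minimize f = x₁ + x₂ subject to h = x₁² + x₂² − 2 = 0, whose solution is x* = (−1, −1) with λ* = 0.5.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical example: min f = x1 + x2 s.t. h = x1^2 + x2^2 - 2 = 0
f = lambda x: x[0] + x[1]
h = lambda x: x[0]**2 + x[1]**2 - 2.0

def augmented_lagrangian(x, lam, p):
    return f(x) + lam * h(x) + p * h(x)**2

lam, p = 0.0, 10.0           # fixed, moderate penalty factor
x = np.array([-1.5, -0.5])   # starting point
for _ in range(30):
    # Inner loop: unconstrained minimization of the augmented Lagrangian
    x = minimize(augmented_lagrangian, x, args=(lam, p)).x
    # Outer loop: multiplier update  lam <- lam + 2 p h(x)
    lam = lam + 2.0 * p * h(x)

print(x, lam)  # x close to (-1, -1), lam close to 0.5
```

Note that, in line with the slide, the penalty factor stays moderate and fixed; the multiplier update does the work.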


Contents
● Constrained Optimization: Optimality Criteria

● Constrained Optimization: Algorithms

– Augmented Lagrangian

– Feasible directions methods

– Reduced gradient methods

– Approximation methods

– SQP

● Linear Programming



Feasible direction methods
● Moving along the boundary

– Rosen’s gradient projection method

– Zoutendijk’s method of feasible directions

● Basic idea:

– move along the steepest descent direction −∇f until constraints
  are encountered
– step direction obtained by projecting the steepest descent
  direction onto the tangent plane
– repeat until a KKT point is found


1. Gradient projection method
● Iterations follow the constraint boundary h = 0
  (Figure: iterates moving along the constraint surface in
  (x₁, x₂, x₃)-space, driven by −∇f)

● For nonlinear constraints, mapping back to the constraint surface is
  needed, in the normal space

● For simplicity, consider a linearly constrained problem:

  min_x f(x)
  s.t.  h(x) = Ax − b = 0
Gradient projection method (2)
● Recall:

  – Tangent space:  {s : As = 0}
  – Normal space:   {z : z = Aᵀp}   (rows of A = constraint gradients ∂h/∂x)

● Projection: decompose −∇f into a tangent and a normal component:

  −∇f = s + z = s + Aᵀp

  Premultiply by A:  −A∇f = As + AAᵀp = AAᵀp   ⇒   p = −(AAᵀ)⁻¹ A∇f

  s = −∇f − Aᵀp = −∇f + Aᵀ(AAᵀ)⁻¹ A∇f
Gradient projection method (3)
● Search direction in the tangent space:

  s = −∇f + Aᵀ(AAᵀ)⁻¹A ∇f = −[I − Aᵀ(AAᵀ)⁻¹A] ∇f = −P ∇f

  with projection matrix  P = I − Aᵀ(AAᵀ)⁻¹A

● Nonlinear case: with A = ∂h/∂x,

  P = I − (∂h/∂x)ᵀ [(∂h/∂x)(∂h/∂x)ᵀ]⁻¹ (∂h/∂x)

● Correction in the normal space:  s′ = (∂h/∂x)ᵀ c
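The projection step can be illustrated numerically. A sketch with hypothetical data (not from the slides): one linear constraint h(x) = x₁ + x₂ − 2 = 0, so A = [1, 1], and an objective gradient ∇f = (1, 3).

```python
import numpy as np

# Hypothetical data: A for h(x) = x1 + x2 - 2 = 0, and a gradient of f
A = np.array([[1.0, 1.0]])
grad_f = np.array([1.0, 3.0])

# Projection matrix P = I - A^T (A A^T)^(-1) A
P = np.eye(2) - A.T @ np.linalg.inv(A @ A.T) @ A

s = -P @ grad_f          # search direction in the tangent plane
print(s)                 # [1, -1]
print(A @ s)             # ~0: s lies in the tangent space (As = 0)
print(grad_f @ s)        # negative: s is a descent direction
```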


Correction to constraint boundary
● Correction in the normal subspace, e.g. using Newton iterations
  (tangent step s_k from x_k to x′_{k+1}, then correction
  s′_k = (∂h/∂x)ᵀ c_k back towards h = 0):

  First order Taylor approximation:

  h(x′_{k+1} + s′_k) ≈ h(x′_{k+1}) + (∂h/∂x) s′_k = 0

  h(x′_{k+1}) + (∂h/∂x)(∂h/∂x)ᵀ c_k = 0
  ⇒  c_k = −[(∂h/∂x)(∂h/∂x)ᵀ]⁻¹ h(x′_{k+1})

  Iterations:  x_{k+1} = x′_{k+1} − (∂h/∂x)ᵀ [(∂h/∂x)(∂h/∂x)ᵀ]⁻¹ h(x′_{k+1})
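The Newton correction can be sketched on a hypothetical nonlinear constraint (not from the slides): the unit circle h(x) = x₁² + x₂² − 1 = 0, starting from a point off the surface.

```python
import numpy as np

# Hypothetical constraint: h(x) = x1^2 + x2^2 - 1 = 0 (unit circle)
h = lambda x: np.array([x[0]**2 + x[1]**2 - 1.0])
dh = lambda x: np.array([[2*x[0], 2*x[1]]])   # Jacobian dh/dx (1 x 2)

x = np.array([1.3, 0.9])   # point off the constraint surface
for _ in range(10):
    J = dh(x)
    # c = -(J J^T)^(-1) h(x); correction s' = J^T c in the normal space
    c = -np.linalg.solve(J @ J.T, h(x))
    x = x + J.T @ c

print(x, h(x))  # x mapped back onto the circle: |h(x)| ~ 0
```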
Practical aspects
● How to deal with inequality constraints?

● Use active set strategy:

– Keep set of active inequality constraints

– Treat these as equality constraints

– Update the set regularly (heuristic rules)


● In gradient projection method, if s = 0:

– Check the multipliers: this could be a KKT point
– If any μᵢ < 0, that constraint is inactive and can be removed
  from the active set
Slack variables
● Alternative way of dealing with inequality constraints: slack
  variables y:

  min_x f(x)                    min_{x,y} f(x)
  s.t.  g(x) ≤ 0        ⇒       s.t.  g(x) + y = 0
        h(x) = 0                      h(x) = 0
        x̲ ≤ x ≤ x̄                     x̲ ≤ x ≤ x̄
                                      0 ≤ y ≤ (large number)

● Disadvantages: all constraints are considered all the time, and the
  number of design variables increases
2. Zoutendijk’s feasible directions
● Basic idea:
  – move along the steepest descent direction −∇f until constraints
    are encountered
  – at the constraint surface, solve a subproblem to find a
    descending feasible direction
  – repeat until a KKT point is found

● Subproblem:

  min_{s,β} β
  s.t.  ∇fᵀ s ≤ β            (descending:  ∇fᵀ s < 0)
        θ_i ∇g_iᵀ s ≤ β       (feasible:  ∇g_iᵀ s ≤ 0, active constraints i)
        −1 ≤ s_i ≤ 1

  with push-off factors θ_i > 0
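The direction-finding subproblem is a small LP. A sketch with hypothetical gradients at the current boundary point (θ = 1 for the single active constraint), solved with `scipy.optimize.linprog`:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical gradients at the current (boundary) point
grad_f = np.array([1.0, 0.0])   # objective gradient
grad_g = np.array([0.0, 1.0])   # gradient of the single active constraint

# Variables (s1, s2, beta); minimize beta
# s.t. grad_f . s - beta <= 0   (descending)
#      grad_g . s - beta <= 0   (feasible, push-off factor theta = 1)
#      -1 <= s_i <= 1
c = np.array([0.0, 0.0, 1.0])
A_ub = np.array([[grad_f[0], grad_f[1], -1.0],
                 [grad_g[0], grad_g[1], -1.0]])
b_ub = np.zeros(2)
res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(-1, 1), (-1, 1), (None, None)])

s, beta = res.x[:2], res.x[2]
print(s, beta)  # beta < 0: a feasible descent direction s exists
```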
Zoutendijk’s method
● The subproblem is linear, so it can be solved efficiently

● Determine active set before solving subproblem!

● When β = 0: a KKT point has been found

● Method needs feasible starting point.



Contents
● Constrained Optimization: Optimality Criteria

● Constrained Optimization: Algorithms

– Augmented Lagrangian

– Feasible directions methods

– Reduced gradient methods

– Approximation methods

– SQP

● Linear Programming



Reduced gradient methods
● Basic idea:

  – Choose a set of n − m decision variables d; the remaining m state
    variables s follow from the m constraints
  – Use the reduced gradient in an unconstrained gradient-based method

● Recall the reduced gradient:

  df/dd = ∂f/∂d − ∂f/∂s (∂h/∂s)⁻¹ ∂h/∂d

● The state variables s can be determined from:

  δs = −(∂h/∂s)⁻¹ (∂h/∂d) δd    (iteratively for nonlinear constraints)
Reduced gradient method
● Nonlinear constraints: Newton iterations to return to the constraint
  surface (i.e. to determine s):

  h(d*, s*) = 0 ≈ h(d*, s) + (∂h/∂s) δs

  δs = −(∂h/∂s)⁻¹ h(d*, s)    (repeat until convergence)

● Variants using 2nd order information also exist

● Drawback: selection of the decision variables
  (but some procedures exist)
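A minimal reduced-gradient sketch with hypothetical data (not from the slides): f = x₁² + x₂² with one linear constraint h = x₁ + 2x₂ − 2 = 0, taking d = x₁ as the decision variable and s = x₂ as the state variable. The constrained optimum is (x₁, x₂) = (0.4, 0.8).

```python
# Hypothetical problem: f = x1^2 + x2^2, h = x1 + 2*x2 - 2 = 0
# Decision variable d = x1, state variable s = x2
dh_dd, dh_ds = 1.0, 2.0                   # dh/dd and dh/ds

def reduced_gradient(d):
    s = (2.0 - d) / 2.0                   # solve h(d, s) = 0 for the state
    df_dd, df_ds = 2.0*d, 2.0*s           # partial derivatives of f
    # df/dd = del f/del d - del f/del s (dh/ds)^(-1) dh/dd
    return df_dd - df_ds * (1.0 / dh_ds) * dh_dd, s

# Steepest descent on the decision variable only
d = 2.0
for _ in range(200):
    g_red, s = reduced_gradient(d)
    d = d - 0.2 * g_red

g_red, s = reduced_gradient(d)
print(d, s)  # converges to the constrained optimum (0.4, 0.8)
```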


Contents
● Constrained Optimization: Optimality Criteria

● Constrained Optimization: Algorithms

– Augmented Lagrangian

– Feasible directions methods

– Reduced gradient methods

– Approximation methods

– SQP

● Linear Programming



Approximation methods
● SLP: Sequential Linear Programming

● Solves a series of linear approximate problems

● Efficient methods for linearly constrained problems are available:

  min_x f(x)                 min_x f(x_k) + ∂f/∂x|_{x_k} (x − x_k)
  s.t.  g(x) ≤ 0      ⇒      s.t.  g(x_k) + ∂g/∂x|_{x_k} (x − x_k) ≤ 0
        h(x) = 0                   h(x_k) + ∂h/∂x|_{x_k} (x − x_k) = 0
SLP 1-D illustration
● SLP iterations approach a convex feasible domain from the outside:

  min_x f = 2x
  s.t.  g(x) = 3(x − 1)(x − 4) ≤ 0

  dg/dx = 6x − 15

  Linearization at x = 0:    g̃₁ = 12 − 15x ≤ 0              →  x = 0.8
  Linearization at x = 0.8:  g̃₂ = 1.92 − 10.2(x − 0.8) ≤ 0  →  x ≈ 0.988
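A sketch of this 1-D example: since minimizing f = 2x pushes x down, each linearized LP is solved by making the linearized constraint active, which reproduces the iterates 0.8, 0.988, … → 1.

```python
# 1-D SLP example from the slide: min 2x s.t. g(x) = 3(x-1)(x-4) <= 0,
# feasible domain 1 <= x <= 4, infeasible start x = 0.
g = lambda x: 3.0 * (x - 1.0) * (x - 4.0)
dg = lambda x: 6.0 * x - 15.0

x = 0.0
iterates = [x]
for _ in range(30):
    # Linearized constraint g(xk) + dg(xk) (x - xk) <= 0 is active at
    # the LP optimum (f = 2x pushes x down): x = xk - g(xk) / dg(xk)
    x = x - g(x) / dg(x)
    iterates.append(x)

print(iterates[1], iterates[2], iterates[-1])  # 0.8, 0.988..., -> 1.0
```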


SLP points of attention
● Solves LP problem in every cycle: efficient only when
analysis cost is relatively high
● Tendency to diverge

– Solution: trust region (move limits)

  (Figure: diverging SLP iterates in the (x₁, x₂)-plane, contained by
  move limits)


SLP points of attention (2)
● An infeasible starting point can result in an unsolvable LP problem

  – Solution: relax the constraints in the first cycles:

  min_x f(x_k) + ∂f/∂x|_{x_k} (x − x_k)
  s.t.  g(x_k) + ∂g/∂x|_{x_k} (x − x_k) ≤ 0

  becomes

  min_{x,γ} f(x_k) + ∂f/∂x|_{x_k} (x − x_k) + β_k γ
  s.t.  g(x_k) + ∂g/∂x|_{x_k} (x − x_k) ≤ γ,   0 ≤ γ ≤ 1

  with β_k sufficiently large to force the solution into the feasible
  region


SLP points of attention (3)
● Cycling can occur when the optimum lies on a curved constraint

  – Solution: move limit reduction strategy

  (Figure: iterates cycling between two points on a curved constraint
  in the (x₁, x₂)-plane)
Method of Moving Asymptotes
● First order method, by Svanberg (1987)

● Builds a convex approximate problem, approximating responses using:

  ỹ(x) = r + Σ_{i=1}^n ( P_i / (U_i − x_i) + Q_i / (x_i − L_i) )

  with moving asymptotes  L_i < x_i < U_i

● The approximate problem is solved efficiently

● Popular method in topology optimization
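A 1-D sketch of how P, Q and r are chosen so the approximation matches the response value and slope at the current point (the standard choice in Svanberg's 1987 paper: only one of P, Q is nonzero, depending on the sign of the derivative). The response y(x) = 1/x + x here is hypothetical.

```python
# Hypothetical response y(x) = 1/x + x, approximated around x0 by the
# MMA form r + P/(U - x) + Q/(x - L)
y = lambda x: 1.0/x + x
dy = lambda x: -1.0/x**2 + 1.0

x0, L, U = 0.5, 0.0, 2.0       # current point and moving asymptotes
slope = dy(x0)
if slope > 0:
    P, Q = (U - x0)**2 * slope, 0.0
else:
    P, Q = 0.0, -(x0 - L)**2 * slope
r = y(x0) - P/(U - x0) - Q/(x0 - L)

y_mma = lambda x: r + P/(U - x) + Q/(x - L)   # convex approximation

print(y_mma(x0), y(x0))   # identical: the approximation matches y at x0
```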


Sequential Approximate Optimization
● Zeroth order method:

1. Determine initial trust region

2. Generate sampling points (design of experiments)

3. Build response surface (e.g. Least Squares, Kriging, …)

4. Optimize approximate problem

5. Check convergence, update trust region, repeat from 2


● Many variants!

● See also Lecture 4



Sequential Approximate Optimization
● Good approach for expensive models

● The response surface dampens noise

● Versatile

  (Figure: response surface built on a trust region within the design
  domain; the optimum of the approximate problem vs. a sub-optimal
  point)


Contents
● Constrained Optimization: Optimality Criteria

● Constrained Optimization: Algorithms

– Augmented Lagrangian

– Feasible directions methods

– Reduced gradient methods

– Approximation methods

– SQP

● Linear Programming



SQP
● SQP: Sequential Quadratic Programming

● Newton method to solve the KKT conditions:

  min_x f(x)
  s.t.  h(x) = 0

  KKT points:  ∂L/∂x = ∂f/∂x + λᵀ ∂h/∂x = 0ᵀ

  Newton:  [ ∂²L/∂x²    ∂²L/∂x∂λ ] [ Δx ]      [ ∂L/∂xᵀ ]
           [ ∂²L/∂λ∂x   ∂²L/∂λ²  ] [ Δλ ]  = − [ ∂L/∂λᵀ ]
SQP (2)
● Writing out the Newton system, with W = ∂²f/∂x² + Σ_i λ_i ∂²h_i/∂x²
  and A = ∂h/∂x:

  [ ∂²L/∂x²    ∂²L/∂x∂λ ] [ Δx ]      [ ∂L/∂xᵀ ]
  [ ∂²L/∂λ∂x   ∂²L/∂λ²  ] [ Δλ ]  = − [ ∂L/∂λᵀ ]

  becomes

  [ W   Aᵀ ] [ Δx ]      [ ∇f + Aᵀλ ]
  [ A   0  ] [ Δλ ]  = − [    h     ]
SQP (3)
  [ W   Aᵀ ] [ Δx ]      [ ∇f + Aᵀλ ]       λ_{k+1} = λ_k + Δλ_k
  [ A   0  ] [ Δλ ]  = − [    h     ]       x_{k+1} = x_k + s_k

  Eliminating λ_k gives:

  [ W_k   A_kᵀ ] [ s_k     ]      [ ∇f_k ]       ∇f_k + W_k s_k + A_kᵀ λ_{k+1} = 0
  [ A_k   0    ] [ λ_{k+1} ]  = − [ h_k  ]       A_k s_k + h_k = 0

● Note: these are exactly the KKT conditions of the quadratic
  subproblem for finding the search direction s_k:

  min_{s_k} q(s_k) = ∇f_kᵀ s_k + ½ s_kᵀ W_k s_k
  s.t.  A_k s_k + h_k = 0


Quadratic subproblem
● A quadratic subproblem with linear constraints can be solved
  efficiently:

  – General case:   min_x ½ xᵀQx + cᵀx
                    s.t.  Ax + b = 0

  – KKT condition:  [ Q   Aᵀ ] [ x ]      [ c ]
                    [ A   0  ] [ λ ]  = − [ b ]

  – Solution:       [ x ]      [ Q   Aᵀ ]⁻¹ [ c ]
                    [ λ ]  = − [ A   0  ]   [ b ]
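The KKT system above is just one linear solve. A sketch with hypothetical data (not from the slides): min x₁² + x₂² (Q = 2I, c = 0) subject to x₁ + x₂ − 2 = 0 (A = [1, 1], b = −2), whose solution is x = (1, 1), λ = −2.

```python
import numpy as np

# Hypothetical QP data: min x1^2 + x2^2 s.t. x1 + x2 - 2 = 0
Q = 2.0 * np.eye(2)
c = np.zeros(2)
A = np.array([[1.0, 1.0]])
b = np.array([-2.0])

# Assemble and solve the KKT system [Q A^T; A 0] [x; lam] = -[c; b]
K = np.block([[Q, A.T],
              [A, np.zeros((1, 1))]])
sol = np.linalg.solve(K, -np.concatenate([c, b]))
x, lam = sol[:2], sol[2]

print(x, lam)  # x = [1, 1], lam = -2
```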
Basic SQP algorithm

1. Choose an initial point x₀ and initial multiplier estimates λ₀
2. Set up the matrices for the QP subproblem
3. Solve the QP subproblem → s_k, λ_{k+1}
4. Set x_{k+1} = x_k + s_k
5. Check convergence criteria: if converged, finished; otherwise
   repeat from 2
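The steps above can be sketched for an equality-constrained problem. The example is hypothetical (not from the slides): min x₁² + x₂² subject to h = x₁ + x₂² − 1 = 0, with analytical optimum x* = (0.5, √0.5), λ* = −1; the exact Hessian of the Lagrangian is used instead of a quasi-Newton update.

```python
import numpy as np

# Hypothetical problem: min x1^2 + x2^2 s.t. h = x1 + x2^2 - 1 = 0
grad_f = lambda x: np.array([2*x[0], 2*x[1]])
h = lambda x: np.array([x[0] + x[1]**2 - 1.0])
A = lambda x: np.array([[1.0, 2*x[1]]])          # dh/dx

def W(x, lam):
    """Hessian of the Lagrangian: d2f/dx2 + lam * d2h/dx2."""
    return np.array([[2.0, 0.0], [0.0, 2.0 + 2.0*lam]])

x, lam = np.array([0.6, 0.9]), 0.0
for _ in range(20):
    Ak, Wk = A(x), W(x, lam)
    # QP subproblem KKT system: [W A^T; A 0][s; lam+] = -[grad_f; h]
    K = np.block([[Wk, Ak.T], [Ak, np.zeros((1, 1))]])
    sol = np.linalg.solve(K, -np.concatenate([grad_f(x), h(x)]))
    x, lam = x + sol[:2], sol[2]

print(x, lam)  # -> (0.5, 0.7071...), lam -> -1
```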


SQP refinements
● For convergence of the Newton method, ∂²L/∂x² must be positive
  definite

  – A line search along s_k improves robustness

● To avoid computing Hessian information for ∂²L/∂x², quasi-Newton
  approaches (DFP, BFGS) can be used (these also ensure positive
  definiteness)

● For dealing with inequality constraints, various active set
  strategies exist


Comparison

Method                     AugLag   Zoutendijk   GRG   SQP
Feasible starting point?   No       Yes          Yes   No
Nonlinear constraints?     Yes      Yes          Yes   Yes
Equality constraints?      Yes      Hard         Yes   Yes
Uses active set?           Yes      Yes          No    Yes
Iterates feasible?         No       Yes          No    No
Derivatives needed?        Yes      Yes          Yes   Yes

SQP is generally seen as the best general-purpose method for
constrained problems.
Contents
● Constrained Optimization: Optimality Criteria

● Constrained Optimization: Algorithms

● Linear programming



Linear programming problem
● Linear objective and constraint functions:

  min_x c₁x₁ + c₂x₂ + … + c_n x_n = cᵀx
  s.t.  h_i = a_iᵀ x − b_i = 0,    i = 1 … m₁
        g_j = a_jᵀ x − b_j ≤ 0,    j = m₁+1 … m₂

● In matrix form:

  min_x cᵀx
  s.t.  h = A₁x − b₁ = 0           A₁ = [ a₁ᵀ; a₂ᵀ; …; a_{m₁}ᵀ ],
        g = A₂x − b₂ ≤ 0           b₁ = [ b₁; b₂; …; b_{m₁} ]
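A small LP in this form can be solved directly with `scipy.optimize.linprog`. The data here is hypothetical (not from the slides): min −x₁ − 2x₂ subject to x₁ + x₂ ≤ 4, x₁ ≤ 3, x ≥ 0, whose optimum sits at the vertex (0, 4).

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical LP: min -x1 - 2*x2 s.t. x1 + x2 <= 4, x1 <= 3, x >= 0
c = np.array([-1.0, -2.0])
A2 = np.array([[1.0, 1.0],
               [1.0, 0.0]])
b2 = np.array([4.0, 3.0])

res = linprog(c, A_ub=A2, b_ub=b2, bounds=[(0, None), (0, None)])
print(res.x, res.fun)  # optimum x = (0, 4) at a vertex of the polyhedron, f = -8
```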
Feasible domain
● Each linear constraint divides the design space into two half-spaces;
  the feasible one is convex:

  g₁ = a₁ᵀx − b₁ ≤ 0   →   feasible half-space X₁

● Feasible domain = intersection of the convex half-spaces:

  X = X₁ ∩ X₂ ∩ … ∩ X_m

● Result: X is a convex polyhedron
Global optimality
● Convex objective function on a convex feasible domain:
  a KKT point is a global optimum

  min_x cᵀx
  s.t.  h = A₁x − b₁ = 0
        g = A₂x − b₂ ≤ 0

● KKT conditions, with  L = cᵀx + λᵀ(A₁x − b₁) + μᵀ(A₂x − b₂):

  A₁x − b₁ = 0,   A₂x − b₂ ≤ 0
  cᵀ + λᵀA₁ + μᵀA₂ = 0ᵀ
  μᵀ(A₂x − b₂) = 0
  μ ≥ 0   (λ unrestricted in sign)
