Recap / overview

- Optimization problem
  - Negative null form
  - Special topics
- Model
  - Definition
  - Sensitivity analysis
  - Checking
  - Topology optimization
- Solution methods
  - Unconstrained problems
    - Optimality criteria
    - Optimization algorithms
  - Constrained problems
    - Optimality criteria
    - Optimization algorithms
Conditions for a minimum of f at a stationary point x*:

  Necessary (first order):   ∇f(x*) = 0
  Sufficient (second order): ∇f(x*) = 0 and H(x*) positive definite

Nature of a stationary point x* from the definiteness of the Hessian H:

  Definiteness of H    | Nature of x*
  ---------------------|--------------
  Positive definite    | Minimum
  Positive semi-def.   | Valley
  Indefinite           | Saddle point
  Negative semi-def.   | Ridge
  Negative definite    | Maximum

Complex eigenvalues? No: H is symmetric, so all its eigenvalues are real.
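The classification above can be sketched in code. This is an illustrative helper (the function name and tolerance are not from the slides), restricted to 2x2 Hessians so the eigenvalues can be computed in closed form:

```python
# Classify a stationary point by the eigenvalues of a symmetric 2x2 Hessian.
# Illustrative sketch; helper name and tolerance `eps` are assumptions.

def classify_stationary_point(H):
    """H: symmetric 2x2 Hessian as nested lists. Returns the nature of x*."""
    a, b, c = H[0][0], H[0][1], H[1][1]
    # Eigenvalues of [[a, b], [b, c]] are real because the matrix is symmetric:
    mean = (a + c) / 2
    radius = (((a - c) / 2) ** 2 + b ** 2) ** 0.5
    lam1, lam2 = mean - radius, mean + radius   # lam1 <= lam2
    eps = 1e-12
    if lam1 > eps and lam2 > eps:
        return "minimum"          # positive definite
    if lam1 < -eps and lam2 < -eps:
        return "maximum"          # negative definite
    if lam1 < -eps and lam2 > eps:
        return "saddle point"     # indefinite
    return "valley or ridge"      # at least one (near-)zero eigenvalue

print(classify_stationary_point([[2, 0], [0, 3]]))   # minimum
print(classify_stationary_point([[2, 3], [3, 2]]))   # saddle point (eigenvalues -1, 5)
```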
Question: what is the nature of the stationary point θ1 = θ2 = 0 when

  F = 6,  l = 2,  k1 = 10,  k2 = 9.5

[Figure: two-bar mechanism with rotational springs k1 and k2, bar length l,
vertical load F causing a top displacement dz.]

Vertical displacement of the load:

  dz = l − l cos θ1 cos θ2

Potential energy:

  P(θ1, θ2) = ½ k1 θ1² + ½ k2 θ2² − F dz

Stationarity conditions:

  ∂P/∂θ1 = k1 θ1 − F l sin θ1 cos θ2 = 0
  ∂P/∂θ2 = k2 θ2 − F l sin θ2 cos θ1 = 0

are satisfied by θ1 = θ2 = 0. The Hessian is

  H = [ k1 − F l cos θ1 cos θ2     F l sin θ1 sin θ2       ]
      [ F l sin θ1 sin θ2          k2 − F l cos θ1 cos θ2  ]

which at θ1 = θ2 = 0 reduces to diag(k1 − F l, k2 − F l). The point is a
minimum (stable equilibrium) only while both diagonal terms are positive:

  Fcrit = min(k1, k2) / l = min(10, 9.5) / 2 = 4.75

Here F = 6 > k2/l = 4.75 and also F > k1/l = 5, so H is negative definite:
the stationary point is a maximum of the potential energy and the
undeformed configuration is unstable.
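The critical-load result can be checked numerically. A minimal sketch, assuming the values from the example (k1 = 10, k2 = 9.5, l = 2) and the diagonal Hessian at θ1 = θ2 = 0; the helper name is not from the slides:

```python
# Numerical check of the two-spring stability example: at theta1 = theta2 = 0
# the Hessian of the potential energy is diag(k1 - F*l, k2 - F*l).

k1, k2, l = 10.0, 9.5, 2.0

def hessian_at_zero(F):
    """Diagonal entries of the Hessian at the undeformed configuration."""
    return (k1 - F * l, k2 - F * l)

F_crit = min(k1, k2) / l
print(F_crit)                 # 4.75

# Below the critical load both eigenvalues are positive -> stable minimum:
print(hessian_at_zero(4.0))   # (2.0, 1.5)
# At F = 6 (as in the example) both are negative -> unstable maximum:
print(hessian_at_zero(6.0))   # (-2.0, -2.5)
```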
Engineering Optimization Concepts and Applications
Unconstrained optimization algorithms

Single-variable methods:
- 0th order (involving only f)
- 1st order (involving f and f′)
- 2nd order (involving f, f′ and f″)
Example:

  f(x1, x2) = x1² + x1 e^(x2) + x2² e^(x1)

Stationary points: ∇f = 0:

  ∂f/∂x1 = 2 x1 + e^(x2) + x2² e^(x1) = 0
  ∂f/∂x2 = x1 e^(x2) + 2 x2 e^(x1) = 0

This nonlinear system has no closed-form solution: stationary points must
be found numerically.
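Even when the system cannot be solved by hand, f and its gradient are easy to evaluate numerically. A small sketch for the example function f(x1, x2) = x1² + x1 e^(x2) + x2² e^(x1), using a central finite difference to verify the gradient expressions:

```python
import math

# Evaluate the example objective and its analytic gradient, then verify the
# gradient with a central finite difference (illustrative check, not from
# the slides).

def f(x1, x2):
    return x1 ** 2 + x1 * math.exp(x2) + x2 ** 2 * math.exp(x1)

def grad_f(x1, x2):
    g1 = 2 * x1 + math.exp(x2) + x2 ** 2 * math.exp(x1)
    g2 = x1 * math.exp(x2) + 2 * x2 * math.exp(x1)
    return g1, g2

x1, x2, h = 0.3, -0.7, 1e-6
fd1 = (f(x1 + h, x2) - f(x1 - h, x2)) / (2 * h)
fd2 = (f(x1, x2 + h) - f(x1, x2 - h)) / (2 * h)
g1, g2 = grad_f(x1, x2)
print(abs(fd1 - g1) < 1e-6, abs(fd2 - g2) < 1e-6)  # True True
```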
Strengths (0th-order methods):
- No derivatives needed
- Work also for non-differentiable and discontinuous functions

Weaknesses:
- (Usually) less efficient than higher-order methods
Setting:

  min f(x)   s.t.   x_l ≤ x ≤ x_u

Iterative process: the optimizer proposes a design x, the model evaluates
f(x), and the loop repeats until a termination criterion is met.
Termination criteria

Stop optimization iterations when:
- Solution is sufficiently accurate (check optimality criteria)
- Progress becomes too slow:

    ‖x_k − x_{k−1}‖ ≤ ε_x,   or   |f(x_k) − f(x_{k−1})| ≤ ε_f
Brute-force approach

Simple approach: exhaustive search: evaluate f at n points evenly spaced
over the starting interval of size L0 and keep the interval around the
best point.

  Final interval size:  Ln = 2 L0 / (n + 1)
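A minimal sketch of exhaustive search illustrating the interval-size formula (the helper name and test function are assumptions, not from the slides):

```python
# Exhaustive search on [a, b]: n evenly spaced interior points; the minimum
# then lies in the two subintervals around the best point, so the final
# interval has size Ln = 2*L0/(n+1).

def exhaustive_search(f, a, b, n):
    """Return the reduced interval containing the minimum."""
    h = (b - a) / (n + 1)
    xs = [a + (i + 1) * h for i in range(n)]
    best = min(range(n), key=lambda i: f(xs[i]))
    return xs[best] - h, xs[best] + h

f = lambda x: (x - 1.37) ** 2
lo, hi = exhaustive_search(f, 0.0, 4.0, 99)
print(hi - lo)          # ≈ 2*4/(99+1) = 0.08
print(lo < 1.37 < hi)   # True
```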
General approach:

1. Find an interval [a0, b0] that contains the minimum (bracketing)
2. Iteratively reduce the size of the interval [ak, bk] (sectioning)
3. Approximate the minimum by the minimum of a simple interpolation
   function

Bracketing with growing steps from a starting point x1:

  x2 = x1 + Δ,  x3 = x2 + Δ,  x4 = x3 + 2Δ,  ...

until f increases, which brackets the minimum.
Unimodality

Bracketing and sectioning methods work best for unimodal functions:
a unimodal function consists of exactly one monotonically decreasing part
followed by one monotonically increasing part (for a minimum).
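The bracketing step can be sketched as follows, assuming a unimodal f; the growing-step strategy (doubling) and helper name are illustrative choices, not prescribed by the slides:

```python
# Bracketing sketch: walk downhill with a growing step until f increases;
# for a unimodal function the minimum then lies between the outer points.

def bracket(f, x1, delta):
    x2 = x1 + delta
    if f(x2) > f(x1):            # wrong direction: search the other way
        x1, x2, delta = x2, x1, -delta
    while True:
        delta *= 2               # expand the step each iteration
        x3 = x2 + delta
        if f(x3) > f(x2):        # f increased: minimum bracketed
            return (x1, x3) if x1 < x3 else (x3, x1)
        x1, x2 = x2, x3

f = lambda x: (x - 3.2) ** 2
a, b = bracket(f, 0.0, 0.5)
print(a < 3.2 < b)   # True
```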
Dichotomous search

Conceptually simple idea: evaluate f at two points a small distance δ
apart around the center of the interval, with δ << L0. Comparing the two
function values shows in which half (plus δ/2) the minimum lies:

  L1 = L0 / 2 + δ / 2

After m of these paired-evaluation steps:

  Lm = L0 / 2^m + δ (1 − 1/2^m)
Neglecting δ, the ideal interval reduction after m steps is

  Lm_ideal = L0 / 2^m

so e.g. m = 10 steps reduce the interval to L10 = L0 / 1024, and reaching
a target interval Lm ideally requires m = log2(L0 / Lm) steps.

[Figure: interval reduction Lm / L0 versus number of steps m; the actual
dichotomous reduction levels off at δ, while the ideal curve keeps
decreasing as 1/2^m.]
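The method above can be sketched directly (helper name and test function are assumptions):

```python
# Dichotomous search sketch: each step evaluates f twice, delta apart around
# the interval center, and keeps the half that must contain the minimum.

def dichotomous_search(f, a, b, delta, steps):
    for _ in range(steps):
        mid = (a + b) / 2
        x1, x2 = mid - delta / 2, mid + delta / 2
        if f(x1) < f(x2):
            b = x2          # minimum lies left of x2
        else:
            a = x1          # minimum lies right of x1
    return a, b

f = lambda x: (x - 1.0) ** 2
a, b = dichotomous_search(f, 0.0, 4.0, 1e-4, 10)
print(a < 1.0 < b)   # True
print(b - a)         # ≈ L0/2**10 + delta ≈ 0.004
```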
Sectioning - Fibonacci (Fibonacci, 1180?-1250?)

Situation: the minimum is bracketed between x1 and x3, with interior
points x2 and x4. Optimal sectioning reuses one interior point per
iteration. Working backwards from the final interval IN:

  IN-1 = 2 IN
  IN-2 = 3 IN
  IN-3 = 5 IN
  IN-4 = 8 IN
  IN-5 = 13 IN

i.e. the interval sizes follow the Fibonacci numbers:

  I_{N−j} = F_{j+1} I_N,   F_j = Fibonacci number

since I_{N−j−2} = I_{N−j−1} + I_{N−j}, or equivalently I_k = I_{k+1} + I_{k+2}.

The ratio F_{N−1} / F_N of successive intervals converges to the golden
ratio: setting I2 = τ I1 and I3 = τ I2 in I1 = I2 + I3 gives

  τ² + τ − 1 = 0   →   τ_{1,2} = (−1 ± √5) / 2

  τ = (√5 − 1) / 2 ≈ 0.618034

Final interval:  I_N = I_1 / F_N
Golden section method: use the constant reduction ratio τ in every step:

  I2 = τ I1,  I3 = τ I2 = τ² I1,  ...
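Golden-section search can be sketched as follows; each iteration shrinks the bracket by the factor τ and reuses one interior point (helper name and test function are assumptions):

```python
import math

# Golden-section search: constant interval reduction tau = (sqrt(5) - 1) / 2,
# reusing one interior function value per iteration.

TAU = (math.sqrt(5) - 1) / 2   # 0.618034...

def golden_section(f, a, b, steps):
    x1 = b - TAU * (b - a)
    x2 = a + TAU * (b - a)
    f1, f2 = f(x1), f(x2)
    for _ in range(steps):
        if f1 < f2:                 # minimum in [a, x2]: reuse x1
            b, x2, f2 = x2, x1, f1
            x1 = b - TAU * (b - a)
            f1 = f(x1)
        else:                       # minimum in [x1, b]: reuse x2
            a, x1, f1 = x1, x2, f2
            x2 = a + TAU * (b - a)
            f2 = f(x2)
    return a, b

f = lambda x: (x - 2.0) ** 2 + 1.0
a, b = golden_section(f, 0.0, 5.0, 20)
print(a < 2.0 < b)   # True
```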
[Figure: interval reduction Lm / L0 versus number of function evaluations,
comparing ideal dichotomous interval reduction, golden section, and
Fibonacci sectioning.]

Example: reduction to 2% of the original interval requires N function
evaluations:

  Method          | N
  ----------------|----
  Dichotomous     | 12
  Golden section  | 9
  Fibonacci       | 8
  (Exhaustive     | 99)
Quadratic interpolation

Three points of the bracket define an interpolating quadratic function:

  f̃(x) = a x² + b x + c

The new point is evaluated at the minimum of this parabola:

  f̃′ = 2 a x + b = 0   →   x_new = −b / (2a)

For a minimum: a > 0 is required. The new point x_new and the best of the
old points then form the next bracket [a_{i+1}, b_{i+1}].
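One quadratic-interpolation step can be sketched as follows (helper name and the divided-difference construction of a and b are illustrative choices):

```python
# One quadratic-interpolation step: fit a parabola a*x^2 + b*x + c through
# three points and jump to its minimum x_new = -b / (2a).

def quadratic_step(f, x1, x2, x3):
    f1, f2, f3 = f(x1), f(x2), f(x3)
    # Coefficients a and b via Newton divided differences:
    d1 = (f2 - f1) / (x2 - x1)
    d2 = (f3 - f2) / (x3 - x2)
    a = (d2 - d1) / (x3 - x1)
    b = d1 - a * (x1 + x2)
    assert a > 0, "parabola must open upward for a minimum"
    return -b / (2 * a)

# For an exactly quadratic f the step lands on the true minimum at once:
f = lambda x: 3 * (x - 1.5) ** 2 + 2
print(quadratic_step(f, 0.0, 1.0, 4.0))  # 1.5
```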
Unconstrained optimization algorithms

Single-variable methods:
- 0th order (involving only f)
- 1st order (involving f and f′)
- 2nd order (involving f, f′ and f″)
Cubic interpolation

Similar to quadratic interpolation, but uses 2 points plus derivative
information at both:

  f̃(x) = a x³ + b x² + c x + d

The new point is the local minimum of the cubic inside [a_i, b_i].
Bisection method

Optimality condition: the minimum is at a stationary point, so search for
a root of f′(x) = 0 by repeatedly halving an interval over which f′
changes sign.
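A minimal sketch of bisection applied to f′ (helper name and tolerance are assumptions):

```python
# Bisection on f'(x) = 0: halve an interval over which f' changes sign.

def bisection(df, a, b, tol=1e-8):
    assert df(a) * df(b) < 0, "f' must change sign on [a, b]"
    while b - a > tol:
        m = (a + b) / 2
        if df(a) * df(m) <= 0:   # sign change in left half
            b = m
        else:                    # sign change in right half
            a = m
    return (a + b) / 2

# f(x) = (x - 2)^2  ->  f'(x) = 2 (x - 2), stationary point at x = 2
df = lambda x: 2 * (x - 2.0)
print(bisection(df, 0.0, 5.0))   # ≈ 2.0
```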
Secant method

Also based on root finding of f′, but uses linear interpolation through
the two most recent derivative values instead of interval halving.
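A sketch of the secant update on f′ (helper name, iteration count, and test function are assumptions):

```python
# Secant method on f'(x) = 0: the slope through the last two iterates
# replaces the second derivative of Newton's method.

def secant(df, x0, x1, iterations=20):
    for _ in range(iterations):
        d0, d1 = df(x0), df(x1)
        if d1 == d0:             # converged (or flat): stop
            break
        x0, x1 = x1, x1 - d1 * (x1 - x0) / (d1 - d0)
    return x1

# f(x) = x^4 - 3x  ->  f'(x) = 4x^3 - 3, stationary at x = (3/4)^(1/3)
df = lambda x: 4 * x ** 3 - 3
print(secant(df, 0.5, 1.5))   # ≈ 0.9086
```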
Unconstrained optimization algorithms

Single-variable methods:
- 0th order (involving only f)
- 1st order (involving f and f′)
- 2nd order (involving f, f′ and f″)
Newton's method

Again based on root finding of f′, using a linear approximation of f′
(equivalently, a local quadratic approximation of f):

  x_{k+1} = x_k + h_k = x_k − f′(x_k) / f″(x_k)
Newton's method has the best convergence of all these methods near the
solution... unless it diverges: far from the minimum, or where f″ ≤ 0,
the iteration can move away from the solution.

[Figure: convergent Newton iteration x_k → x_{k+1} → x_{k+2} on f′, and a
divergent case.]
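The Newton update can be sketched as follows; note the absence of any safeguards against the divergence discussed above (helper name and test function are assumptions):

```python
# Newton's method for 1-D minimization: x_{k+1} = x_k - f'(x_k) / f''(x_k).
# No safeguards: started too far away, or where f'' <= 0, it may diverge.

def newton(df, ddf, x, iterations=20):
    for _ in range(iterations):
        x = x - df(x) / ddf(x)
    return x

# f(x) = x^4 - 3x: f'(x) = 4x^3 - 3, f''(x) = 12x^2
df = lambda x: 4 * x ** 3 - 3
ddf = lambda x: 12 * x ** 2
print(newton(df, ddf, 1.0))   # ≈ 0.9086, the stationary point (3/4)^(1/3)
```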
Summary of single-variable methods:

  0th order: quadratic interpolation
  1st order: cubic interpolation, bisection method, secant method
  2nd order: Newton method

In practice, additional tricks are needed to deal with:
- Multimodality
- Strong fluctuations
- Round-off errors
- Divergence