
8. OPTIMISATION
8.1 Introduction
• Example: Cantilever Beam

• Optimisation problem:
Design a minimum-weight cantilever beam (I-beam profile) that meets the failure and deflection criteria.
Design variables: b, h, t
Design parameters: E, ρ, S_y, L, F
Objective function: ρ·A·L
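As a minimal sketch (not the slide's exact formulation), the objective function can be coded directly. The I-beam geometry below, with flange width b, overall depth h, and uniform thickness t, is an illustrative assumption, as are the steel values for ρ and L:

```python
# Minimal sketch of the objective function: mass = rho * A * L.
# Assumed (hypothetical) geometry: symmetric I-beam with flange width b,
# overall depth h, and uniform thickness t for both flanges and the web.

def i_beam_area(b, h, t):
    """Cross-sectional area A of the assumed I-beam profile [m^2]."""
    flanges = 2 * b * t        # top and bottom flanges
    web = (h - 2 * t) * t      # web between the two flanges
    return flanges + web

def beam_mass(b, h, t, rho=7850.0, L=2.0):
    """Objective function: rho * A * L [kg]; rho in kg/m^3, L in m."""
    return rho * i_beam_area(b, h, t) * L

# An optimiser would vary (b, h, t) subject to the failure and
# deflection constraints; here we only evaluate one candidate design:
print(beam_mass(b=0.10, h=0.20, t=0.01))   # ~59.7 kg for this candidate
```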

• Application


[Flow chart: numerical optimisation iterations lead to a design solution that considers the manufacturing aspect.]

• Essentially, like root location, the optimisation process seeks a particular point on a function.
• The difference:
  • Root location finds a value of x at which the function is zero:

    $f(x) = 0$

  • Optimisation finds a value of x at which the function takes a minimum or maximum value:

    $f'(x) = 0$
• Why minimise/maximise?
  • Minimising weight of material
  • Minimising cost
  • Maximising performance
  • …

8.2 Classification
• The presence of constraints:
  • Constrained optimisation
  • Unconstrained optimisation

• Number of design variables (Fig. PT 4.4):
  • One-dimensional
  • Multi-dimensional

• Nature of the objective function & constraints:
  • Linear programming (objective function and constraints are linear)
  • Quadratic programming (objective function is quadratic and constraints are linear)
  • Non-linear programming (objective function and constraints are nonlinear)

8.3 One Dimensional
• General flow chart:

[Flow chart: start from an initial guess; check the error; if converged (Y), the optimum has been found; if not (N), find the next guess, select the next point for evaluation, and repeat.]
• Golden-Section Search

  $\dfrac{l_2}{l_1} = \dfrac{l_1}{l_1 + l_2} \;\Rightarrow\; R = \dfrac{\sqrt{5} - 1}{2} = 0.61803\ldots$

Recall: Bisection & False-position

• Quadratic Interpolation
  • The curve around the extremum is approximated by a quadratic through three points, f2(x0, x1, x2, f(x0), f(x1), f(x2)).
  • Find the maximum of f2 to obtain x3.
  • Repeat the process with the new data point, (x3, f(x3)), …

• Golden-Section algorithm (see the sketch below):
  • Choose initial guesses, xu and xl.
  • Compute the golden-section distance and the two interior points:

    $d = \dfrac{\sqrt{5} - 1}{2}\,(x_u - x_l)$
    $x_1 = x_l + d$
    $x_2 = x_u - d$

  • If f(x1) > f(x2) → xl,new = x2
  • If f(x1) < f(x2) → xu,new = x1
  • Repeat the above three steps until convergence (x1 → x2, or xl → xu).
  • Learn Ex. 13.1
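A minimal Python sketch of this algorithm, written for maximisation to match the update rule above; the function name is an assumption, and the test function is the one from the textbook's Ex. 13.1:

```python
import math

def golden_section_max(f, xl, xu, tol=1e-6, max_iter=100):
    """Golden-section search for a maximum of f on [xl, xu]."""
    R = (math.sqrt(5) - 1) / 2            # golden ratio, 0.61803...
    for _ in range(max_iter):
        d = R * (xu - xl)                 # golden-section distance
        x1, x2 = xl + d, xu - d           # the two interior points
        if f(x1) > f(x2):
            xl = x2                       # maximum lies in [x2, xu]
        else:
            xu = x1                       # maximum lies in [xl, x1]
        if abs(xu - xl) < tol:            # convergence: xl -> xu
            break
    return (xl + xu) / 2

# f(x) = 2 sin(x) - x^2/10 on [0, 4], as in Ex. 13.1
f = lambda x: 2 * math.sin(x) - x**2 / 10
print(golden_section_max(f, 0.0, 4.0))    # ~1.4276
```

For clarity this sketch re-evaluates both interior points every iteration; the classic formulation reuses one function value per step, which is the method's main economy.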
• Golden Section
  • Selection of the next point (Ex. 13.1)
  • Stopping criteria

• Quadratic Interpolation:
  • Take three initial guesses, x0, x1, x2.
  • Fit a quadratic through the three points.
  • Find the maximum/minimum by differentiating the quadratic, resulting in:

    $x_3 = \dfrac{f(x_0)\,(x_1^2 - x_2^2) + f(x_1)\,(x_2^2 - x_0^2) + f(x_2)\,(x_0^2 - x_1^2)}{2 f(x_0)\,(x_1 - x_2) + 2 f(x_1)\,(x_2 - x_0) + 2 f(x_2)\,(x_0 - x_1)}$

  • Learn Ex. 13.2

• Quadratic Interpolation: selection of the next point (see the sketch below)
  • Sequential:

    $x_{0,\text{new}} = x_{1,\text{old}}$
    $x_{1,\text{new}} = x_{2,\text{old}}$
    $x_{2,\text{new}} = x_3$

  • Or, keep a bracketing set, as in the golden-section method (Ex. 13.2).
• Newton Method
  • Recall the Newton-Raphson method for obtaining the root of an equation.
  • Here, only the target changes: solve f'(x) = 0 instead of f(x) = 0.

    $x_{i+1} = x_i - \dfrac{f'(x_i)}{f''(x_i)}$

  • The function f must be twice differentiable, with a non-zero f'' (see the sketch below).
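A minimal sketch using analytical derivatives of the same test function; the function name and starting point are illustrative assumptions:

```python
import math

def newton_opt(x, df, d2f, tol=1e-6, max_iter=50):
    """Newton's method applied to f'(x) = 0: x_{i+1} = x_i - f'(x_i)/f''(x_i)."""
    for _ in range(max_iter):
        x_new = x - df(x) / d2f(x)     # requires a non-zero second derivative
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# f(x) = 2 sin(x) - x^2/10, so f'(x) = 2 cos(x) - x/5 and f''(x) = -2 sin(x) - 1/5
df = lambda x: 2 * math.cos(x) - x / 5
d2f = lambda x: -2 * math.sin(x) - 0.2
print(newton_opt(2.5, df, d2f))        # ~1.4276; f'' < 0 there, so a maximum
```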

Do textbook Prob. 13.6 and 13.8.

8.4 Multidimensional, Unconstrained
• Most practical optimisation problems are multidimensional.

• Classification:
  • Non-gradient (direct) methods, e.g. (pseudo-)random search, univariate & pattern search
  • Gradient-based methods, e.g. gradient & Hessian, steepest ascent/descent

• Pseudo-random search (see the sketch below):
  • For each design variable, compute

    $x = x_l + (x_u - x_l)\, r$

    where r is a computer-generated random number between 0 and 1.
  • Then evaluate f(x1, x2, …) at each sample.
  • Locate the maximum/minimum among the sampled values.
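A minimal sketch in two dimensions; the function name, the bounds, and the test function f(x, y) = y − x − 2x² − 2xy − y², which has its maximum f = 1.25 at (−1, 1.5), are illustrative assumptions:

```python
import random

def random_search(f, bounds, n_samples=10000, seed=1):
    """Pseudo-random search: sample uniformly within the bounds, keep the best."""
    rng = random.Random(seed)
    best_x, best_f = None, float("-inf")
    for _ in range(n_samples):
        # x = xl + (xu - xl) * r for each design variable
        x = [xl + (xu - xl) * rng.random() for (xl, xu) in bounds]
        fx = f(x)
        if fx > best_f:                  # maximisation: keep the largest value
            best_x, best_f = x, fx
    return best_x, best_f

# Illustrative test function with maximum f = 1.25 at (-1, 1.5)
f = lambda v: v[1] - v[0] - 2*v[0]**2 - 2*v[0]*v[1] - v[1]**2
print(random_search(f, bounds=[(-2, 2), (1, 3)]))
```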

• Univariate search (see the sketch below):
  • Set an initial guess.
  • Perform a 1-D search in one variable while fixing the other variables.
  • Repeat for each variable, one at a time.
  • Repeat the cycle until the maximum/minimum is reached.
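A minimal sketch that reuses a 1-D golden-section routine for each coordinate search; the per-variable search interval (±2 around the current point) is an illustrative assumption:

```python
import math

def golden_1d_max(g, a, b, tol=1e-8):
    """1-D golden-section maximisation of g on [a, b] (as sketched earlier)."""
    R = (math.sqrt(5) - 1) / 2
    while abs(b - a) > tol:
        d = R * (b - a)
        x1, x2 = a + d, b - d
        if g(x1) > g(x2):
            a = x2
        else:
            b = x1
    return (a + b) / 2

def univariate_max(f, x, radius=2.0, cycles=20):
    """Univariate search: optimise one variable at a time, holding the rest fixed."""
    x = list(x)
    for _ in range(cycles):
        for i in range(len(x)):
            # 1-D search in variable i over an assumed interval around x[i]
            g = lambda xi, i=i: f(x[:i] + [xi] + x[i + 1:])
            x[i] = golden_1d_max(g, x[i] - radius, x[i] + radius)
    return x, f(x)

f = lambda v: v[1] - v[0] - 2*v[0]**2 - 2*v[0]*v[1] - v[1]**2
print(univariate_max(f, [0.0, 0.0]))   # approaches (-1, 1.5), f = 1.25
```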

• Gradient, Hessian (2-D):
  • The directional derivative, i.e. the gradient, points along the steepest ascent/descent (slope), steering each numerical step towards the maximum/minimum.

• Gradient, Hessian (2-D):
  • Whether a stationary point is a maximum or a minimum can be checked with the Hessian matrix test:

    • If $|H| > 0$ and $\partial^2 f / \partial x^2 > 0$ → f has a local minimum
    • If $|H| > 0$ and $\partial^2 f / \partial x^2 < 0$ → f has a local maximum
    • If $|H| < 0$ → f has a saddle point

  • The gradient and Hessian can be evaluated analytically or numerically, as in the sketch below.
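A minimal sketch of the numerical route, using central differences for the second derivatives; the step size h and the test function are illustrative assumptions:

```python
def hessian_test_2d(f, x, y, h=1e-4):
    """Classify a stationary point of f(x, y) using central differences."""
    fxx = (f(x + h, y) - 2*f(x, y) + f(x - h, y)) / h**2
    fyy = (f(x, y + h) - 2*f(x, y) + f(x, y - h)) / h**2
    fxy = (f(x + h, y + h) - f(x + h, y - h)
           - f(x - h, y + h) + f(x - h, y - h)) / (4 * h**2)
    detH = fxx * fyy - fxy**2           # determinant of the 2-D Hessian
    if detH > 0 and fxx > 0:
        return "local minimum"
    if detH > 0 and fxx < 0:
        return "local maximum"
    if detH < 0:
        return "saddle point"
    return "inconclusive (|H| = 0)"

f = lambda x, y: y - x - 2*x**2 - 2*x*y - y**2
print(hessian_test_2d(f, -1.0, 1.5))    # "local maximum" at the stationary point
```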

• Gradient, Hessian (multi-dimensional):

  $\nabla f(x) = \left[\dfrac{\partial f}{\partial x_1}\;\; \dfrac{\partial f}{\partial x_2}\;\; \cdots\;\; \dfrac{\partial f}{\partial x_n}\right]^T$

  And how does the Hessian generalise to n dimensions?

• Steepest Ascent/Descent algorithm (see the sketch below):
  • Set an initial guess.
  • Find the gradient, $\nabla f(x)$.
  • Check for the existence of a maximum/minimum/saddle point from the Hessian matrix determinant, $|H|$.
  • Formulate the one-dimensional function along the gradient direction:

    $g(h) = f\left(x_1 + \dfrac{\partial f}{\partial x_1} h,\; x_2 + \dfrac{\partial f}{\partial x_2} h,\; \ldots,\; x_n + \dfrac{\partial f}{\partial x_n} h\right)$

  • Find the optimum step size h by solving $g'(h) = 0$.
  • Take the step:

    $x_{i+1} = x_i + \nabla f(x_i)\, h$

  • Calculate the error check.
  • Repeat the process with the corrected 'initial guess'.
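A minimal sketch of steepest ascent; instead of solving g'(h) = 0 analytically, it maximises g(h) with a 1-D golden-section search over an assumed step interval [0, 1], and computes the gradient by central differences. All names and tolerances are illustrative assumptions:

```python
import math

def grad(f, x, eps=1e-6):
    """Central-difference gradient of f at the point x."""
    g = []
    for i in range(len(x)):
        xp, xm = x[:], x[:]
        xp[i] += eps
        xm[i] -= eps
        g.append((f(xp) - f(xm)) / (2 * eps))
    return g

def line_max(g1d, a=0.0, b=1.0, tol=1e-8):
    """Golden-section maximisation of g1d(h) on [a, b]."""
    R = (math.sqrt(5) - 1) / 2
    while abs(b - a) > tol:
        d = R * (b - a)
        h1, h2 = a + d, b - d
        if g1d(h1) > g1d(h2):
            a = h2
        else:
            b = h1
    return (a + b) / 2

def steepest_ascent(f, x, tol=1e-6, max_iter=100):
    for _ in range(max_iter):
        gv = grad(f, x)
        if math.sqrt(sum(c*c for c in gv)) < tol:   # error check on ||grad f||
            break
        # g(h) = f(x + grad(f) h): pick the best step size h numerically
        g1d = lambda h: f([xi + gi * h for xi, gi in zip(x, gv)])
        h = line_max(g1d)
        x = [xi + gi * h for xi, gi in zip(x, gv)]  # x_{i+1} = x_i + grad(f) h
    return x, f(x)

f = lambda v: v[1] - v[0] - 2*v[0]**2 - 2*v[0]*v[1] - v[1]**2
print(steepest_ascent(f, [-1.0, 1.0]))   # converges to (-1, 1.5), f = 1.25
```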


* Learn Ex. 14.3 and 14.4


* Try Prob. 14.5 and 14.8
