
Introduction to Optimization

Anjela Govan
North Carolina State University
SAMSI NDHS Undergraduate workshop 2006
What is Optimization?
Optimization is the mathematical discipline
which is concerned with finding the maxima
and minima of functions, possibly subject to
constraints.
Where would we use optimization?
Architecture
Nutrition
Electrical circuits
Economics
Transportation
etc.

What do we optimize?
A real function of n variables
f(x_1, x_2, ..., x_n)
with or without constraints
Unconstrained optimization






min f(x, y) = 2x^2 + y^2
Optimization with constraints
min f(x, y) = 2x^2 + y^2 subject to x > 0
or
min f(x, y) = 2x^2 + y^2 subject to 2 < x < 5, y > 1
or
min f(x, y) = 2x^2 + y^2 subject to x + y = 2
Let's Optimize
Suppose we want to find the minimum of the
function
Review max-min for R^2

What is special about a local max or a local min of a function f(x)?
At a local max or a local min, f'(x) = 0
f''(x) > 0 if local min
f''(x) < 0 if local max


Review max-min for R^3
Second Derivative Test
Local min, local max, saddle point
Gradient of f is the vector (df/dx, df/dy, df/dz)
It points in the direction of fastest increase of f
Global min/max vs. local min/max
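For a concrete check, take the unconstrained example from earlier, f(x, y) = 2x^2 + y^2: its gradient (4x, 2y) vanishes only at (0, 0), where f_xx = 4 > 0, f_yy = 2, f_xy = 0, so f_xx f_yy - f_xy^2 = 8 > 0 and the second derivative test gives a local minimum; since f >= 0 with f(0, 0) = 0, it is also the global minimum.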

Gradient Descent Method Examples
Minimize function
f(x, y) = 0.5(αx^2 + y^2), -11 ≤ x, y ≤ 11
Minimize function
f(x, y) = cos(x)cos(y), -4 ≤ x, y ≤ 4
Gradient Descent Method Examples
Use function gd(alpha,x0)
Does gd.m converge to a local min? Is there a
difference if α > 0 vs. α < 0?
How many iterations does it take to converge to a
local min? How do starting points x0 affect
number of iterations?
Use function gd2(x0)
Does gd2.m converge to a local min?
How do starting points x0 affect number of
iterations and the location of a local minimum?
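The workshop files gd.m and gd2.m are not reproduced in these slides. As a rough sketch of what a fixed-step gradient descent with the gd(alpha, x0) signature could look like for f(x, y) = 0.5(alpha x^2 + y^2) (the step size, tolerance, and iteration cap are illustrative choices, not the workshop's):

% Sketch only: fixed-step gradient descent on f(x,y) = 0.5*(alpha*x^2 + y^2).
function [x, iter] = gd(alpha, x0)
    step  = 0.1;               % fixed step size (illustrative)
    tol   = 1e-6;              % stop once the gradient is this small
    maxit = 10000;             % safety cap on the number of iterations
    x = x0(:);                 % current point as a column vector [x; y]
    for iter = 1:maxit
        g = [alpha*x(1); x(2)];    % gradient of 0.5*(alpha*x^2 + y^2)
        if norm(g) < tol
            break;                 % close enough to a stationary point
        end
        x = x - step*g;            % step against the gradient
    end
end

With a sketch like this, alpha > 0 gives a strictly convex quadratic and the iterates head toward (0, 0), while alpha < 0 makes f unbounded below in the x direction, so the x-coordinate grows at every step instead of converging. gd2.m presumably applies the same idea to f(x, y) = cos(x)cos(y), whose gradient vanishes at many points, so the starting point x0 decides which local minimum, if any, the iteration settles into.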


How good are the optimization methods?
Starting point
Convergence to global min/max.
Classes of nice optimization problems
Example: f(x, y) = 0.5(αx^2 + y^2), α > 0
Every local min is a global min.


Other optimization methods
Non-smooth, non-differentiable surfaces
cannot compute the gradient of f
cannot use the Gradient Method
Nelder-Mead Method
Others
Convex Hull
A set C is convex if, for any two points x and y in C, every point on the line segment connecting x and y is also in C.

The convex hull for a
set of points X is the
minimal convex set
containing X.
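For example, the convex hull of three points that do not lie on one line is the filled triangle with those points as its corners.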

Simplex
A simplex, or n-simplex, is the convex hull of a set of (n + 1) points. A simplex is an n-dimensional analogue of a triangle.
Example:
a 1-simplex is a line segment
a 2-simplex is a triangle
a 3-simplex is a tetrahedron
a 4-simplex is a pentatope


Nelder-Mead Method
n = number of variables; the method uses n + 1 points
form a simplex using these points (their convex hull)
move in a direction away from the worst of these points: reflect, expand, contract, shrink

Example:
2 variables: 3 points, simplex is a triangle
3 variables: 4 points, simplex is a tetrahedron
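MATLAB's fminsearch implements a Nelder-Mead simplex search, so the earlier examples can be minimized without any gradient. A small illustration on f(x, y) = cos(x)cos(y); the starting point is an arbitrary choice:

% Nelder-Mead via fminsearch; v holds the pair (x, y).
f = @(v) cos(v(1))*cos(v(2));        % the earlier gradient-free example
v0 = [1; 2];                          % arbitrary starting point
[v_min, f_min] = fminsearch(f, v0)    % a nearby local minimizer and its value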
Nelder-Mead Method: reflect, expand
Nelder-Mead Method: reflect, contract
A tour of Matlab: Snapshots from the minimization
[Figure slides: snapshots after 0, 1, 2, 3, 7, 12, and 30 steps (converged)]
fminsearch function
parameters: q = [C, K]
cost function:
cost = Σ_{i=1}^{N} ( y(t_i, [C, K]) - y_tilde_i )^2
Minimize the cost function:
[q, cost] = fminsearch(@cost_beam, q0, [], time, y_tilde)
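cost_beam.m itself is not shown in these slides. A rough sketch of a least-squares cost of this shape could look like the following; here y_model is a placeholder for the beam response y(t, [C, K]), which the slides do not give, and the anonymous-function call at the end is an alternative to the extra-argument style above:

% Illustrative sketch only: y_model stands in for the beam response y(t,[C,K]).
function c = cost_beam(q, time, y_tilde)
    C = q(1);
    K = q(2);
    resid = y_model(time, C, K) - y_tilde;   % residuals y(t_i,[C,K]) - y_tilde_i
    c = sum(resid.^2);                       % sum of squared residuals
end

% Equivalent call using an anonymous function to pass the data:
% [q, cost] = fminsearch(@(q) cost_beam(q, time, y_tilde), q0);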
Our optimization problem
In our problem:
Our function: cost = Σ_{i=1}^{N} ( y(t_i, [C, K]) - y_tilde_i )^2
The cost function lives in R^3: its graph is a surface over the (C, K) plane
2 parameters C and K, so n = 2
The simplex is a triangle


Done!
