
Non-Linear Programming - Unconstrained Optimization

Direct Search Methods and the Univariate Search Method

MRUDUL PATEL
M.E. STRUCTURE, SEM – 1

Introduction

Real-world problems often lead to nonlinear optimization problems with complicated objective functions. When an analytical solution is not practical, a search algorithm is a possible way to solve them.

Flowchart for Search Algorithm

1. Start with a trial solution Xi; set i = 1.
2. Compute the objective function f(Xi).
3. Generate a new solution Xi+1.
4. Compute the objective function f(Xi+1).
5. Check for convergence: if not converged, set i = i + 1 and return to step 3; if converged, stop with the optimal solution Xopt = Xi (a code sketch of this loop follows).
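A minimal Python rendering of this loop is sketched below; the generate_new_solution rule is a placeholder (each direct search method defines its own way of proposing the next trial point), and the convergence test on the change in the objective value is just one simple choice.

    def iterative_search(f, x0, generate_new_solution, tol=1e-8, max_iter=10000):
        """Generic search loop from the flowchart: evaluate f at the trial
        solution, generate a new solution and check for convergence."""
        x, fx = x0, f(x0)                      # trial solution Xi and f(Xi)
        for _ in range(max_iter):
            x_new = generate_new_solution(x)   # generate new solution Xi+1
            fx_new = f(x_new)                  # compute f(Xi+1)
            if abs(fx_new - fx) < tol:         # convergence check
                return x_new, fx_new           # optimal solution Xopt
            x, fx = x_new, fx_new              # i = i + 1 and repeat
        return x, fx

Each direct search method in the following slides can be read as a particular choice of generate_new_solution together with its own convergence rule.
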
Classification of Search Algorithm

Search algorithms fall into two groups:

Direct Search Algorithm
• Considers only the objective function.
• Partial derivatives of the objective function are not considered.
• Also called a nongradient or zeroth-order method.

Indirect Search Algorithm
• Partial derivatives of the objective function are considered.
• Also called a descent method.

Direct Search Algorithm

• Random Search Method
• Grid Search Method
• Univariate Method
• Pattern Directions
• Rosenbrock’s Method
• Simplex Method

Direct Search Algorithm (Contd..)

Random Jump Method

Random Jump Method ( Contd.. )

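Assuming the standard formulation of the random jump method (candidate points are drawn uniformly at random inside the variable bounds and the best one found is kept), a minimal Python sketch follows; the bounds and the number of trials are illustrative choices, not part of the method.

    import numpy as np

    def random_jump(f, lower, upper, n_trials=1000, seed=0):
        """Sample points uniformly at random inside the box [lower, upper]
        and keep the point with the best (lowest) objective value."""
        rng = np.random.default_rng(seed)
        lower = np.asarray(lower, dtype=float)
        upper = np.asarray(upper, dtype=float)
        best_x, best_f = None, float("inf")
        for _ in range(n_trials):
            x = lower + rng.random(len(lower)) * (upper - lower)  # random point
            fx = f(x)
            if fx < best_f:
                best_x, best_f = x, fx
        return best_x, best_f
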
Random Walk Method

Random Walk Method ( Contd.. )
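Assuming the usual description of the random walk method (step from the current point along a randomly chosen unit direction, accept the step only if it improves the objective, and reduce the step length after repeated failures), a minimal Python sketch:

    import numpy as np

    def random_walk(f, x0, step=1.0, shrink=0.5, tol=1e-6, max_fail=100, seed=0):
        """Random walk search: step along random unit directions, keep only
        improving steps, halve the step length after max_fail consecutive
        failures, and stop once the step length falls below tol."""
        rng = np.random.default_rng(seed)
        x = np.asarray(x0, dtype=float)
        fx = f(x)
        while step > tol:
            failures = 0
            while failures < max_fail:
                u = rng.standard_normal(len(x))
                u /= np.linalg.norm(u)          # random unit direction
                trial = x + step * u
                f_trial = f(trial)
                if f_trial < fx:
                    x, fx = trial, f_trial      # accept the improving step
                    failures = 0
                else:
                    failures += 1
            step *= shrink                      # too many failures: reduce step
        return x, fx
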
Random Walk Method with Direction Exploitation

Random Walk Method with Direction Exploitation ( Contd.. )

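Assuming the usual direction-exploitation variant (once a random direction that improves the objective is found, the step length along that direction is itself optimized instead of being kept fixed), a minimal sketch; the short list of candidate step multiples stands in for a proper one-dimensional line search.

    import numpy as np

    def random_walk_with_direction_exploitation(f, x0, step=1.0, shrink=0.5,
                                                tol=1e-6, max_fail=100, seed=0):
        """Like the plain random walk, but once an improving random direction
        is found, the step length along it is chosen from a set of candidate
        multiples (a stand-in for a proper one-dimensional minimization)."""
        rng = np.random.default_rng(seed)
        x = np.asarray(x0, dtype=float)
        fx = f(x)
        multiples = (0.25, 0.5, 1.0, 2.0, 4.0)   # candidate step multiples
        while step > tol:
            failures = 0
            while failures < max_fail:
                u = rng.standard_normal(len(x))
                u /= np.linalg.norm(u)           # random unit direction
                if f(x + step * u) < fx:
                    # exploit the improving direction: pick the best step length
                    s = min((m * step for m in multiples),
                            key=lambda s: f(x + s * u))
                    x = x + s * u
                    fx = f(x)
                    failures = 0
                else:
                    failures += 1
            step *= shrink                       # too many failures: reduce step
        return x, fx
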
Grid Search Method

• The methodology involves setting up a grid over the decision space (a code sketch follows this list).
• Objective function values are evaluated at the grid points.
• The grid point with the best objective function value is taken as the optimum solution.
• Major drawback: the number of grid points increases exponentially with the number of decision variables.
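A minimal Python sketch of the grid search idea for a problem with box bounds; the bounds, the number of divisions per variable and the test function are illustrative choices, not part of the method.

    import itertools
    import numpy as np

    def grid_search(f, lower, upper, divisions=11):
        """Evaluate f at every point of a regular grid over the box
        [lower, upper] and return the best grid point found."""
        axes = [np.linspace(lo, hi, divisions) for lo, hi in zip(lower, upper)]
        best_x, best_f = None, float("inf")
        for point in itertools.product(*axes):   # divisions ** n_variables points
            fx = f(np.array(point))
            if fx < best_f:
                best_x, best_f = np.array(point), fx
        return best_x, best_f

    # Illustration: minimize f(x, y) = (x - 1)^2 + (y - 2)^2 over [-5, 5] x [-5, 5]
    x_opt, f_opt = grid_search(lambda x: (x[0] - 1) ** 2 + (x[1] - 2) ** 2,
                               lower=[-5, -5], upper=[5, 5])

With divisions = 11 and two variables the grid already contains 121 points; the count grows as divisions ** n, which is exactly the drawback noted above.
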

Random Search Method VS Grid Search Method ( Contd.. )

Univariate Search Method

• Generates trial solutions for one decision variable at a time, keeping all the others fixed.
• The best value of each decision variable is obtained in turn while the others are held constant.
• The whole process is repeated iteratively until convergence (a code sketch follows this list).
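A minimal Python sketch of this one-variable-at-a-time idea; the crude step-halving routine used for the one-dimensional minimization along each axis is only illustrative, and any univariate minimizer could be substituted.

    import numpy as np

    def line_search_1d(f, x, direction, step=1.0, shrink=0.5, tol=1e-6):
        """Crude one-dimensional minimization of f along +/- direction from x,
        halving the step whenever neither sign gives an improvement."""
        fx = f(x)
        while step > tol:
            for trial in (x + step * direction, x - step * direction):
                f_trial = f(trial)
                if f_trial < fx:
                    x, fx = trial, f_trial
                    break
            else:
                step *= shrink                # no improvement in either sign
        return x

    def univariate_search(f, x0, n_cycles=50, tol=1e-8):
        """Univariate (cyclic coordinate) search: minimize along one
        coordinate axis at a time while the other variables stay fixed,
        and repeat the cycle until the point stops moving."""
        x = np.asarray(x0, dtype=float)
        for _ in range(n_cycles):
            x_previous = x.copy()
            for i in range(len(x)):
                direction = np.zeros(len(x))
                direction[i] = 1.0            # search along the i-th axis
                x = line_search_1d(f, x, direction)
            if np.linalg.norm(x - x_previous) < tol:   # convergence check
                break
        return x, f(x)
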

Univariate Search Method (Contd..)

Example ( Univariate Search Method )

Example ( Contd.. )

Pattern Direction
• In the univariate method, the search directions are the same as the coordinate axis directions, which makes convergence slow.
• In pattern-direction methods, the search is performed not along the coordinate directions but along a direction pointing towards the best solution.
• Popular methods: Hooke and Jeeves’ method and Powell’s method.
• Hooke and Jeeves’ method: uses an exploratory move and a pattern move (a code sketch follows this list).
• Powell’s method: uses conjugate directions.
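A minimal Python sketch of the exploratory-move and pattern-move idea behind Hooke and Jeeves’ method, assuming a fixed-step exploratory move and simple step halving when exploration fails; practical implementations add refinements such as a line search along the pattern direction.

    import numpy as np

    def hooke_jeeves(f, x0, step=1.0, shrink=0.5, tol=1e-6):
        """Hooke and Jeeves' pattern search (sketch): exploratory moves around
        a base point, then pattern moves along the improving direction; the
        step is halved whenever exploration fails to improve the objective."""

        def explore(point, step):
            # Exploratory move: perturb each variable by +/- step and keep
            # any change that improves the objective.
            x, fx = point.copy(), f(point)
            for i in range(len(x)):
                for delta in (step, -step):
                    trial = x.copy()
                    trial[i] += delta
                    f_trial = f(trial)
                    if f_trial < fx:
                        x, fx = trial, f_trial
                        break
            return x, fx

        base = np.asarray(x0, dtype=float)
        f_base = f(base)
        while step > tol:
            x_new, f_new = explore(base, step)          # exploratory move
            if f_new < f_base:
                # Pattern moves: keep jumping along the improving direction
                # for as long as exploring around the pattern point helps.
                while f_new < f_base:
                    pattern = x_new + (x_new - base)
                    base, f_base = x_new, f_new
                    x_new, f_new = explore(pattern, step)
            else:
                step *= shrink      # exploration failed: reduce the step size
        return base, f_base
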

Other Direct Search Methods

Rosenbrock’s Method of Rotating Coordinates:
A modified version of Hooke and Jeeves’ method.
The coordinate system is rotated so that the first axis always points along the locally estimated direction of the best solution, and the remaining axes are made mutually orthogonal and normal to the first.

Simplex Method:
A conventional direct search algorithm.
It works on a geometric figure (a simplex) in N-dimensional space formed by a set of N+1 points; the best solution found so far lies at one of its vertices.
The method compares the objective function values at the N+1 vertices and moves the simplex towards the optimum point iteratively.
Movements of the simplex: reflection, contraction and expansion.
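This kind of simplex search is available off the shelf; for example, SciPy’s Nelder-Mead implementation performs this sort of reflection, contraction and expansion of a simplex. The quadratic test function below is only an illustration.

    import numpy as np
    from scipy.optimize import minimize

    # Illustrative objective: f(x, y) = (x - 1)^2 + 5 * (y + 2)^2
    def f(x):
        return (x[0] - 1.0) ** 2 + 5.0 * (x[1] + 2.0) ** 2

    x0 = np.array([0.0, 0.0])          # the initial simplex is built around x0
    result = minimize(f, x0, method="Nelder-Mead",
                      options={"xatol": 1e-8, "fatol": 1e-8})
    print(result.x, result.fun)        # approaches the minimum at (1, -2)

Here xatol and fatol are the acceptable changes in the solution and in the objective value between iterations.
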
Summary

• Nonlinear optimization problems with complicated objective functions or constraints can be solved by search algorithms.
• Care should be taken that the search does not get stuck at a local optimum.
• Search algorithms are not directly applicable to constrained optimization.
• Penalty functions can be used to convert a constrained optimization problem into an unconstrained one (a small illustration follows).
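As a small illustration of the penalty-function idea, one common choice (though not the only one) is a quadratic exterior penalty: constraint violations are squared, weighted by a penalty parameter and added to the objective, after which any of the unconstrained search methods above can be applied. The function names and parameter values below are illustrative.

    def penalized_objective(f, inequality_constraints, r):
        """Quadratic exterior penalty: phi(x) = f(x) + r * sum(max(0, g(x))**2)
        for constraints written in the form g(x) <= 0."""
        def phi(x):
            violation = sum(max(0.0, g(x)) ** 2 for g in inequality_constraints)
            return f(x) + r * violation
        return phi

    # Illustration: minimize f(x) = x^2 subject to x >= 1, i.e. g(x) = 1 - x <= 0.
    phi = penalized_objective(lambda x: x ** 2, [lambda x: 1.0 - x], r=100.0)
    # phi can now be handed to any of the unconstrained search methods above.
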

Thank You

