
STRUCTURAL OPTIMISATION

2021/2022

Assignment: Nonlinear Programming


N2

A written report should be handed in containing a short presentation of the problem, results, discussion, and source code. Remember to answer all questions and comment on your results. You are not allowed to copy solutions or computer codes from others.

1. An engineering design problem is formulated as:


    min   f(x) = x1 + x2
     x
    s.t.  g1(x) = -x1 - x2 ≤ 0
          g2(x) = -2 + x1 - 2 x2 ≤ 0
          g3(x) = 8 - 6 x1 + x1^2 - x2 ≤ 0

a- Sketch the constraints in the two-dimensional design space. Draw the constraint gradients and
identify the feasible domain F.
b- Plot the contours of the objective function (draw 4 contours). Draw the gradient of the
objective function, and indicate the minimising direction of this function by an arrow.
c- Locate the optimum solution on the graph. Determine which constraints are active and
find the optimum solution x1* and x2* numerically.
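
For parts a and b, one possible Matlab sketch for drawing the constraint boundaries and the objective contours is given below; the grid limits, colours and number of grid points are arbitrary choices, and the constraint and objective gradients can be added with quiver:

[x1, x2] = meshgrid(linspace(-1, 8, 400), linspace(-2, 8, 400));
f  = x1 + x2;                        % objective
g1 = -x1 - x2;
g2 = -2 + x1 - 2*x2;
g3 = 8 - 6*x1 + x1.^2 - x2;
figure; hold on
contour(x1, x2, g1, [0 0], 'r')      % boundary g1 = 0
contour(x1, x2, g2, [0 0], 'g')      % boundary g2 = 0
contour(x1, x2, g3, [0 0], 'b')      % boundary g3 = 0
contour(x1, x2, f, 4, 'k--')         % four contours of the objective
xlabel('x_1'); ylabel('x_2')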

2. Write a Matlab function that solves the nonlinear one-dimensional problem

    min   f(x),
    x∈R

using Newton's method. Test your code by solving the minimisation problem of
chapter 2.1, page 18.
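
A minimal sketch of such a one-dimensional Newton solver is shown below, assuming the objective function returns its value, first and second derivative as in the specifications at the end of this assignment; the function name, tolerance and iteration cap are illustrative choices only:

function [x, nit, firstorderopt] = newton1d(x0, fhandle)
% One-dimensional Newton iteration for min f(x): x <- x - f'(x)/f''(x)
tol = 1e-8;                          % stopping tolerance (arbitrary)
x   = x0;
for nit = 1:100                      % iteration cap (arbitrary)
    [f, g, H] = feval(fhandle, x);   % value, first and second derivative
    if abs(g) < tol, break, end      % first-order optimality reached
    x = x - g/H;                     % Newton step
end
[f, g] = feval(fhandle, x);
firstorderopt = abs(g);              % |f'(x)| at the final iterate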

3. Write a Matlab function that solves the nonlinear multidimensional problem

    min   f(x),
    x∈R^n

using the Fletcher-Reeves conjugate gradient algorithm, based on the code developed in
question 2.
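
One possible structure for such a solver is sketched below; a simple backtracking (Armijo) line search is used here purely for illustration, in place of the one-dimensional Newton line search developed in question 2, and the function name, tolerance and iteration cap are arbitrary:

function [x, nit, firstorderopt] = fletcher_reeves(x0, fhandle)
% Fletcher-Reeves conjugate gradient with a backtracking line search
x = x0;
[f, g] = feval(fhandle, x);
d = -g;                                   % start along steepest descent
tol = 1e-6;
for nit = 1:5000
    if norm(g) < tol, break, end          % first-order optimality reached
    if g'*d >= 0, d = -g; end             % restart if d is not a descent direction
    alpha = 1;                            % backtracking (Armijo) line search
    while feval(fhandle, x + alpha*d) > f + 1e-4*alpha*(g'*d)
        alpha = alpha/2;
    end
    x = x + alpha*d;
    [f, gnew] = feval(fhandle, x);
    beta = (gnew'*gnew)/(g'*g);           % Fletcher-Reeves update
    d = -gnew + beta*d;
    g = gnew;
end
firstorderopt = norm(g);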

4. Test your Fletcher-Reeves conjugate gradient solver on the Rosenbrock banana function

    f(x1, x2) = 100 (x2 - x1^2)^2 + (1 - x1)^2.

a- Visualize the function using the Matlab plot routine ezmeshc for -2 ≤ x1 ≤ 2, -2 ≤ x2 ≤ 2.
b- Which point is the global minimum of f?
c- Start your code from x = (-1, 0)^T and from x = (0, 1)^T and compare the number of
iterations as well as the number of function and gradient evaluations.

d- Compare with fminunc from Matlab's Optimization Toolbox. In particular, compare the
number of iterations as well as the number of function and gradient evaluations. Use the
following syntax to call fminunc:

options = optimset('GradObj','on','Hessian','on','LargeScale','on');
[x,fval,exitflag,output] = fminunc(@rosenbrock,x0,options);
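
Since the options above set 'GradObj' and 'Hessian' to 'on', the file rosenbrock.m has to return the gradient and the Hessian as well. A possible sketch, following the nargout pattern given in the specifications below, is:

function [f, g, H] = rosenbrock(x)
% Rosenbrock banana function, its gradient and its Hessian
f = 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
if nargout > 1                            % gradient requested
    g = [-400*x(1)*(x(2) - x(1)^2) - 2*(1 - x(1));
          200*(x(2) - x(1)^2)];
end
if nargout > 2                            % Hessian requested ('Hessian','on')
    H = [1200*x(1)^2 - 400*x(2) + 2, -400*x(1);
         -400*x(1),                   200];
end

Part a can then be visualised with, for example, ezmeshc(@(x1,x2) 100*(x2 - x1.^2).^2 + (1 - x1).^2, [-2 2 -2 2]).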

5. Write a Matlab function that solves the nonlinear constrained optimisation problem

    min   f(x)
    x∈R^n
    s.t.  gj(x) ≤ 0,

using an exterior method algorithm. Test your algorithm by solving the problems of page 21
and page 42 of chapter 2.2.
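
A minimal sketch of one way such an exterior (penalty) method could be organised is given below. It assumes the constraints are supplied through a separate handle ghandle returning the vector of gj(x) values, which is not part of the solver interface specified below, and it calls the built-in fminsearch for the unconstrained subproblems purely for brevity; in the assignment these should instead be solved with your own solver from question 3. The penalty parameters and tolerances are arbitrary choices:

function [x, nit] = exterior_penalty(x0, fhandle, ghandle)
% Exterior penalty method: minimise f(x) + r*sum(max(0, g(x)).^2)
% for an increasing sequence of penalty parameters r
x = x0;
r = 1;                                    % initial penalty parameter
for nit = 1:20                            % cap on outer iterations
    phi = @(y) feval(fhandle, y) + r*sum(max(0, feval(ghandle, y)).^2);
    x = fminsearch(phi, x);               % unconstrained subproblem
    if all(feval(ghandle, x) <= 1e-6), break, end   % (almost) feasible
    r = 10*r;                             % stiffen the penalty
end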

Specifications, hints:

Your solver should call an external function to evaluate the function value and the gradient.
To avoid unnecessary calculations, you can use the built-in Matlab function nargout:

function [f,g,H] = rosenbrock(x)
f = ...          % Compute the objective function value at x
if nargout > 1   % rosenbrock called with at least two output arguments
    g = ...      % Gradient of the function evaluated at x
end
if nargout > 2   % rosenbrock called with three output arguments
    H = ...      % Hessian of the function evaluated at x
end

The name of your function (rosenbrock in the example above) needs to be passed to your
solver. A way to achieve this in Matlab is to use function handles. Assume that your solver is
called myfun. When calling myfun, use the syntax myfun(x0, @rosenbrock) which tells
myfun to use the function rosenbrock to evaluate the function.

A header for the myfun solver can look like

function [x, nit, firstorderopt] = myfun(x0, fhandle)

where the input arguments are x0, a column vector of dimension n containing the initial guess
for the algorithm, and fhandle, a function handle to the function that should be minimized.
Use the command feval when you need to evaluate the function, the gradient, or the Hessian
inside myfun: when evaluating only the function value, write f = feval(fhandle,x), whereas
when evaluating the function value and the gradient, write [f, g]= feval(fhandle,x). The output
arguments to myfun are x, a column vector containing the solution; nit, the number of
iterations; and firstorderopt, the norm of the gradient, ‖∇f‖, at the final step.
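
As a small illustration of these conventions (assuming rosenbrock.m and your solver myfun.m are both on the Matlab path):

fhandle = @rosenbrock;                 % function handle passed to the solver
x0      = [-1; 0];                     % column vector of dimension n = 2
f          = feval(fhandle, x0);       % function value only
[f, g]     = feval(fhandle, x0);       % function value and gradient
[f, g, H]  = feval(fhandle, x0);       % value, gradient and Hessian
[x, nit, firstorderopt] = myfun(x0, @rosenbrock);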
