2021/2022
2. Write a Matlab function that solves the nonlinear one-dimensional problem

   min_{x ∈ R} f(x),

using the Newton method. Test your code on the minimisation problem of chapter 2.1, page 18.
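For a one-dimensional minimisation, the Newton iteration is x_{k+1} = x_k - f'(x_k)/f''(x_k), stopped when |f'(x_k)| is small. A minimal sketch (the function name newton1d, the tolerance, and the iteration cap are illustrative choices, not prescribed by the exercise):

```matlab
% One-dimensional Newton method for min f(x).
% fhandle must return [f, df, d2f]: value, first and second derivative.
function [x, nit] = newton1d(x0, fhandle, tol, maxit)
    x = x0;
    for nit = 1:maxit
        [~, df, d2f] = feval(fhandle, x);
        if abs(df) < tol        % first-order optimality reached
            return;
        end
        x = x - df / d2f;       % Newton step
    end
end
```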
d- Compare with fminunc from Matlab's Optimization Toolbox. In particular, compare the
number of iterations as well as the number of function and gradient evaluations. Use the
following syntax to call fminunc:
options = optimset('GradObj','on','Hessian','on','LargeScale','on');
[x,fval,exitflag,output] = fminunc(@rosenbrock,x0,options);
5. Write a Matlab function that solves the nonlinear constrained optimisation problem

   min_{x ∈ R^n} f(x)
   s.t. g_j(x) ≤ 0,  j = 1, …, m,

using an exterior method. Test your algorithm on the problems of page 21 and page 42 of chapter 2.2.
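One common exterior method is the quadratic penalty: replace the constrained problem by minimising P(x, ρ) = f(x) + ρ · Σ_j max(0, g_j(x))², for an increasing sequence of penalty parameters ρ. A rough sketch, assuming an unconstrained inner solver such as fminsearch (the function names, the stopping tolerance, and the ρ-update rule are illustrative assumptions):

```matlab
% Exterior (quadratic) penalty method for min f(x) s.t. g_j(x) <= 0.
% fhandle returns f(x); ghandle returns the vector of constraint values g(x).
function x = exterior_penalty(x0, fhandle, ghandle)
    rho = 1;
    x = x0;
    for k = 1:20
        % Penalised objective: infeasibility is punished quadratically.
        P = @(z) feval(fhandle, z) + rho * sum(max(0, feval(ghandle, z)).^2);
        x = fminsearch(P, x);              % inner unconstrained minimisation
        if all(feval(ghandle, x) <= 1e-6)  % (approximately) feasible: stop
            return;
        end
        rho = 10 * rho;                    % tighten the penalty
    end
end
```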
Specifications, hints:
Your solver should call an external function to evaluate the function value and the gradient.
To avoid unnecessary calculations, you can use the built-in Matlab function nargout:
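For example, the Rosenbrock function can skip the gradient and Hessian computations when the caller requests only the function value; this is one way to use nargout (the formulas below are the standard two-dimensional Rosenbrock function, shown here for illustration):

```matlab
function [f, g, H] = rosenbrock(x)
    % f(x) = 100*(x2 - x1^2)^2 + (1 - x1)^2
    f = 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
    if nargout > 1   % gradient requested by the caller
        g = [-400*x(1)*(x(2) - x(1)^2) - 2*(1 - x(1));
              200*(x(2) - x(1)^2)];
    end
    if nargout > 2   % Hessian requested by the caller
        H = [1200*x(1)^2 - 400*x(2) + 2, -400*x(1);
             -400*x(1),                   200];
    end
end
```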
The name of your function (rosenbrock in the example above) needs to be passed to your
solver. One way to achieve this in Matlab is to use function handles. Assume that your solver is
called myfun. When calling myfun, use the syntax myfun(x0, @rosenbrock), which tells
myfun to use the function rosenbrock to evaluate the objective.
Here the input arguments are x0, a column vector of dimension n containing the initial guess
for the algorithm, and fhandle, a function handle to the function to be minimised.
Use the command feval when you need to evaluate the function, the gradient, or the Hessian
inside myfun: when evaluating only the function value, write f = feval(fhandle,x), whereas
when evaluating both the function value and the gradient, write [f, g] = feval(fhandle,x). The output
arguments of myfun are x, a column vector containing the solution; nit, the number of
iterations; and firstorderopt, the norm of the gradient ∇f at the final iterate.
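Putting the interface together, a solver skeleton matching these specifications might look like the following (the tolerance and iteration cap are illustrative choices, not part of the specification):

```matlab
function [x, nit, firstorderopt] = myfun(x0, fhandle)
    tol = 1e-8; maxit = 200;
    x = x0;
    for nit = 0:maxit
        [f, g, H] = feval(fhandle, x);   % value, gradient, Hessian
        firstorderopt = norm(g);
        if firstorderopt < tol           % ||grad f|| small enough: done
            return;
        end
        x = x - H \ g;                   % Newton step: solve H*d = g
    end
end
```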