
Problem 5)

In optimization, assume we want to minimize the Rosenbrock function. The first-order necessary
condition that can lead us to the minimum is
∇f(x) = 0
If we take the Rosenbrock function,
f(x1, x2) = 100(x2 − x1^2)^2 + (1 − x1)^2
and compute its gradient, we obtain:

∇f(x) = [ 2x1 − 400x1x2 + 400x1^3 − 2 ;
          200(x2 − x1^2) ]
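
This gradient can also be obtained symbolically; a minimal sketch, assuming the Symbolic Math Toolbox is available (f_rb and g are illustrative names, not from the original code):

syms x1 x2
f_rb = 100*(x2 - x1^2)^2 + (1 - x1)^2;   % Rosenbrock function
g = gradient(f_rb, [x1; x2])             % should match the gradient above, possibly in unexpanded form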
Then we need to set the gradient equal to zero and solve for x = [x1; x2], noting that the
gradient, not the Rosenbrock function itself, is the function whose roots we are finding.
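
In this particular case the stationary point can also be found by hand, which gives a useful check on the numerical result below: the second component of the gradient gives 200(x2 − x1^2) = 0, so x2 = x1^2; substituting into the first component gives 2x1 − 400x1·x1^2 + 400x1^3 − 2 = 2x1 − 2 = 0, so x1 = 1 and therefore x2 = 1. Hence the only stationary point is x = [1; 1].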

Let us use the Newton-Raphson method, which updates the iterate by x_(k+1) = x_k − J(x_k)^(−1) f(x_k), where J is the Jacobian of f (here, the Hessian of the Rosenbrock function). Starting from x0 = [3; 3], we use the following code in MATLAB:
syms x y

f = [2*x - 400*x*y + 400*x^3 - 2;      % gradient of the Rosenbrock function (the system we find roots of)
     200*(y - x^2)];

joc = jacobian(f, [x y]);              % Jacobian of the gradient (the Hessian of the Rosenbrock function)

x_0 = [3; 3];

i = 1; ear = 1; xr(:, 1) = x_0;

while (ear > 0.001)                    % Newton-Raphson method
    n(i) = i;
    J = double(subs(joc, [x y], xr(:, i).'));              % Jacobian at the current iterate
    F = double(subs(f,   [x y], xr(:, i).'));              % gradient at the current iterate
    xr(:, i+1) = xr(:, i) - J\F;                            % Newton step
    ear = norm(xr(:, i+1) - xr(:, i)) / norm(xr(:, i+1));   % relative approximate error
    i = i + 1;
end
As a result, we reach the minimum of the function in 5 iterations, as shown in the table below:


 n      x1          x2           εa,n                     f1(xr,n)       f2(xr,n)
 1      2.99833     8.99001      0.632069131638011           3.96402        0.00544222
 2      0.989065   -3.0589       3.79967231902582         1597.18        -807.43
 3      0.989079    0.978276     2.90204003756126           -0.0213402     -0.000253648
 4      0.999997    0.999875     0.0171142411768257          0.0475939     -0.0238
 5      1           1            8.84137998278840e-05        0              0

(xr,n = [x1; x2] is the n-th iterate; f1 and f2 are the components of the gradient f evaluated at xr,n; εa,n is the relative approximate error.)
Even though we applied only the first-order necessary condition, we know that the minimum of the
Rosenbrock function is at [1; 1], and this is exactly what we obtained after 5 iterations of the
Newton-Raphson method.
Note that the necessary condition does not always lead to a minimum; it might lead to a
maximum or a saddle point.
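To confirm that [1; 1] is indeed a minimum rather than a maximum or a saddle point, we can check the second-order sufficient condition: the Hessian of the Rosenbrock function, which is exactly the Jacobian joc of the gradient computed in the code above, should be positive definite at the stationary point. A minimal sketch, reusing x, y, and joc from the code above:

H = double(subs(joc, [x y], [1 1]))   % Hessian at [1; 1]; works out to [802 -400; -400 200]
eig(H)                                % both eigenvalues are positive, so [1; 1] is a local minimum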
