In optimization, suppose we want to minimize the Rosenbrock function. The first-order necessary condition that any minimizer must satisfy is
∇f(x) = 0
For the Rosenbrock function
f(x1, x2) = 100(x2 − x1²)² + (1 − x1)²
computing the partial derivatives gives the gradient
∇f(x1, x2) = [ −400·x1·(x2 − x1²) − 2(1 − x1) ; 200(x2 − x1²) ]
Let us apply the Newton–Raphson method, which iterates
x_{k+1} = x_k − H(x_k)⁻¹ ∇f(x_k)
starting from x0 = [3; 3], using the following MATLAB code (Symbolic Math Toolbox):

syms x y
f = 100*(y - x^2)^2 + (1 - x)^2;     % Rosenbrock function
g = gradient(f, [x y]);              % gradient: [-400*x*(y-x^2) - 2*(1-x); 200*(y-x^2)]
joc = jacobian(g, [x y]);            % Hessian matrix H(x)
x_0 = [3; 3];
for i = 1:20
    n(i) = i;                        % record the iteration number
    x_0 = x_0 - double(subs(joc, [x y], x_0.')) \ double(subs(g, [x y], x_0.'));
end
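The same Newton iteration can be sketched numerically in Python for readers without the Symbolic Math Toolbox. This is an equivalent hand-coded version, not the original script: the gradient and Hessian are entered analytically from the formulas above, and the function names (rosenbrock_grad, rosenbrock_hess) are our own.

```python
import numpy as np

def rosenbrock_grad(x1, x2):
    # Gradient of f(x1, x2) = 100*(x2 - x1^2)^2 + (1 - x1)^2
    return np.array([-400.0 * x1 * (x2 - x1**2) - 2.0 * (1.0 - x1),
                     200.0 * (x2 - x1**2)])

def rosenbrock_hess(x1, x2):
    # Hessian (matrix of second partial derivatives)
    return np.array([[1200.0 * x1**2 - 400.0 * x2 + 2.0, -400.0 * x1],
                     [-400.0 * x1, 200.0]])

x = np.array([3.0, 3.0])          # starting point x0 = [3; 3]
for k in range(50):
    g = rosenbrock_grad(*x)
    if np.linalg.norm(g) < 1e-10: # stop once the gradient is (numerically) zero
        break
    # Newton step: solve H * step = grad instead of forming the inverse
    x = x - np.linalg.solve(rosenbrock_hess(*x), g)

print(x)  # converges to the minimizer [1, 1]
```

Solving the linear system with `np.linalg.solve` rather than inverting the Hessian is the standard, numerically safer way to take the Newton step.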