Jaanav Mathavan
EE22B167
October 25, 2023
1 Implementation Details
The code uses if-else conditional statements to check whether the objective is a 1D or a 2D function.
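The dispatch described above might be sketched as follows; the function name `optimize`, the central-difference gradients, and the hyperparameter defaults are illustrative assumptions, not the actual assignment code.

```python
import inspect

def optimize(f, start, lr=0.1, steps=200, h=1e-6):
    """Dispatch on arity: a one-parameter objective takes the 1D branch,
    a two-parameter objective takes the 2D branch."""
    n_args = len(inspect.signature(f).parameters)
    if n_args == 1:
        x = start
        for _ in range(steps):
            grad = (f(x + h) - f(x - h)) / (2 * h)  # central difference
            x -= lr * grad
        return x
    else:
        x, y = start
        for _ in range(steps):
            gx = (f(x + h, y) - f(x - h, y)) / (2 * h)
            gy = (f(x, y + h) - f(x, y - h)) / (2 * h)
            x, y = x - lr * gx, y - lr * gy
        return x, y
```

Counting parameters with `inspect.signature` is one way to avoid asking the caller to state the dimensionality explicitly; an explicit flag argument would work just as well.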
f1: f(x) = x^2 + 3x + 8
This function is very simple: it has a single minimum, so the starting point matters little, and gradient descent clearly converges to the minimum point.
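As a minimal sketch of this case, the update below uses the hand-derived gradient f'(x) = 2x + 3; the learning rate and iteration count are illustrative choices, not the assignment's actual settings.

```python
def descend_f1(x0, lr=0.1, steps=100):
    """Plain gradient descent on f1(x) = x^2 + 3x + 8."""
    x = x0
    for _ in range(steps):
        x -= lr * (2 * x + 3)  # analytic gradient f'(x) = 2x + 3
    return x

# Any starting point converges to the unique minimum at x = -1.5:
print(descend_f1(10.0))   # ~ -1.5
print(descend_f1(-50.0))  # ~ -1.5
```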
Figure 1
Figure 2
f4: f(x, y) = e^(−(x−y)) · sin(y)
This is a 2D function, which introduces a new challenge: saddle points. The function clearly has a minimum, but depending on the starting point we can end up in one of two places: a saddle point at the center, or the minimum. Otherwise this is a routine gradient descent algorithm.
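The starting-point dependence can be seen on a toy example; the sketch below uses the classic saddle f(x, y) = x^2 − y^2 rather than f4 itself, with analytic partial derivatives and illustrative hyperparameters.

```python
def step(x, y, lr=0.1):
    # Partials of f(x, y) = x^2 - y^2: df/dx = 2x, df/dy = -2y
    return x - lr * (2 * x), y - lr * (-2 * y)

def run(x, y, steps=100):
    for _ in range(steps):
        x, y = step(x, y)
    return x, y

# Starting exactly on the y = 0 axis, the iterate stalls at the saddle (0, 0);
# any perturbation in y moves it away from the saddle instead.
print(run(1.0, 0.0))
print(run(1.0, 0.1))
```

The gradient vanishes at a saddle just as it does at a minimum, so plain gradient descent cannot tell them apart; only the starting point decides which one it reaches.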
Figure 3
f5: f(x) = cos(x)^4 − sin(x)^3 − 4 sin(x)^2 + cos(x) + 1
Something interesting here is that there are multiple minima. My algorithm's capacity to find the global minimum depends heavily on the starting point: the iterate gets stuck in local minima, which can be overcome by techniques that allow exploration, such as simulated annealing.
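A cheap alternative to annealing is random restarts: run plain descent from several starting points and keep the best endpoint. The sketch below is illustrative, not the assignment's code; the learning rate, step count, restart count, and the use of numerical gradients are all assumptions.

```python
import math, random

def f5(x):
    return math.cos(x)**4 - math.sin(x)**3 - 4 * math.sin(x)**2 + math.cos(x) + 1

def descend(x, lr=0.01, steps=2000, h=1e-6):
    """Plain gradient descent with a central-difference gradient."""
    for _ in range(steps):
        g = (f5(x + h) - f5(x - h)) / (2 * h)
        x -= lr * g
    return x

# Each restart falls into whichever basin contains its starting point;
# keeping the best endpoint over several restarts recovers the global minimum
# with high probability.
random.seed(0)
starts = [random.uniform(0, 2 * math.pi) for _ in range(10)]
best = min((descend(s) for s in starts), key=f5)
```

Unlike annealing, restarts never escape a basin mid-run; they simply sample enough basins that the global one is likely among them.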
Figure 4
3 Conclusion
In this assignment, we successfully implemented gradient descent-based optimization for several mathematical functions. The code handles both one-dimensional and two-dimensional objectives, providing flexibility for different optimization tasks, and the provided visualizations help in understanding the optimization process and the convergence to the minimum.
The code and the associated functions are valuable tools for solving optimization problems in different domains, and the flexibility to adapt to various functions and ranges makes this a versatile solution.