
1. Name some linear regression problems and explain one of them in depth.

Consider the following training set of m=4 training examples:

x y
1 0.5
2 1
4 2
0 0

Consider the linear regression model hθ(x)=θ0+θ1x. What are the values of θ0 and
θ1 that you would expect to obtain upon running gradient descent on this model?
(Linear regression will be able to fit this data perfectly.)

θ0=0.5,θ1=0

θ0=0.5,θ1=0.5

θ0=1,θ1=0.5

θ0=0,θ1=0.5

θ0=1,θ1=1

θ0=0, θ1=0.5. Since linear regression fits this data perfectly, J(θ0,θ1)=0, so y = hθ(x) = θ0 + θ1x holds exactly for every training example. Pick any two rows of the table and solve the resulting pair of equations for θ0 and θ1.
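The two-point algebra above can be sketched in a few lines of Python (the helper name `solve_line` is made up for this illustration):

```python
# Assuming a perfect fit (J(theta) = 0), y = theta0 + theta1 * x holds exactly,
# so any two rows of the table determine the parameters.
def solve_line(p1, p2):
    """Return (theta0, theta1) of the line through p1 = (x1, y1) and p2 = (x2, y2)."""
    (x1, y1), (x2, y2) = p1, p2
    theta1 = (y2 - y1) / (x2 - x1)  # slope: rise over run
    theta0 = y1 - theta1 * x1       # intercept: substitute the slope back in
    return theta0, theta1

theta0, theta1 = solve_line((0, 0), (2, 1))  # rows (0, 0) and (2, 1) of the table
print(theta0, theta1)  # 0.0 0.5
```

Any other pair of rows gives the same answer, since all four points lie on one line.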

2. What does the gradient mean in a linear equation?


• The gradient is the steepness and direction of a line as read from left to right.
• The gradient, or slope, can be found by determining the ratio of the rise
(vertical change) to the run (horizontal change) between two points on the line,
or by using a linear equation in slope-intercept form (y = mx + b).
• A positive gradient indicates the line is heading up.
• A negative gradient indicates the line is heading down.
• The gradient of a horizontal line is zero.
• The gradient of a vertical line is undefined.
In machine learning, the gradient is the derivative of a function that has more
than one input variable. Known as the slope of a function in mathematical terms,
the gradient measures the change in error with respect to a change in each of
the weights.
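As a small illustration of a gradient for a multi-input function, the sketch below approximates each partial derivative numerically with central differences; the function, the helper name `numerical_gradient`, and the evaluation point are all made up for this example:

```python
# Numerical gradient of a function of several inputs, via central differences.
def numerical_gradient(f, point, eps=1e-6):
    grad = []
    for i in range(len(point)):
        plus = list(point)
        minus = list(point)
        plus[i] += eps   # nudge one coordinate up
        minus[i] -= eps  # and down
        grad.append((f(plus) - f(minus)) / (2 * eps))
    return grad

# For f(w) = w0^2 + 3*w1 the true gradient at (1, 0) is (2, 3).
g = numerical_gradient(lambda w: w[0] ** 2 + 3 * w[1], [1.0, 0.0])
```

For a straight line y = mx + b, the same procedure simply recovers the slope m at every point.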
- What does gradient descent mean?

Gradient descent is a first-order iterative optimization algorithm for finding
a local minimum of a differentiable function. The idea is to take repeated
steps in the opposite direction of the gradient (or approximate gradient) of
the function at the current point, because this is the direction of steepest
descent. Conversely, stepping in the direction of the gradient leads toward a
local maximum of the function; that procedure is known as gradient ascent. In
machine learning, gradient descent is used to find the values of a function's
parameters (coefficients) that minimize a cost function as far as possible.
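The repeated steps against the gradient can be sketched for the linear regression model from question 1. This is a minimal illustration with a mean-squared-error cost; the learning rate and iteration count are arbitrary choices, not values from the text:

```python
# Gradient descent on h(x) = theta0 + theta1 * x for the four training
# examples from question 1, minimizing J = (1/2m) * sum((h(x) - y)^2).
xs = [1, 2, 4, 0]
ys = [0.5, 1, 2, 0]
m = len(xs)

theta0, theta1 = 0.0, 0.0
alpha = 0.1  # learning rate (assumed value)
for _ in range(5000):
    # Partial derivatives of J with respect to theta0 and theta1.
    d0 = sum(theta0 + theta1 * x - y for x, y in zip(xs, ys)) / m
    d1 = sum((theta0 + theta1 * x - y) * x for x, y in zip(xs, ys)) / m
    # Step in the opposite direction of the gradient.
    theta0 -= alpha * d0
    theta1 -= alpha * d1

# theta0 and theta1 converge toward 0 and 0.5, matching question 1.
```

Because the data lie exactly on a line, the cost reaches zero at the same parameters found by the algebraic solution.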

3. What is the coefficient in linear regression?

Regression coefficients are estimates of the unknown population parameters and
describe the relationship between a predictor variable and the response. In
linear regression, coefficients are the values that multiply the predictor
values.

The sign of each coefficient indicates the direction of the relationship between
a predictor variable and the response variable.

• A positive sign indicates that as the predictor variable increases, the
response variable also increases.
• A negative sign indicates that as the predictor variable increases, the
response variable decreases.

The coefficient value represents the mean change in the response given a
one-unit change in the predictor.
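As an illustration of how a coefficient is estimated and read, here is a pure-Python least-squares sketch for a single predictor, fit to the table from question 1; the helper name `fit_simple` is made up for this example:

```python
# Closed-form least squares for one predictor: the fitted slope is the
# coefficient, i.e. the mean change in y per one-unit change in x.
def fit_simple(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return intercept, slope

intercept, slope = fit_simple([1, 2, 4, 0], [0.5, 1, 2, 0])
# slope = 0.5: each one-unit increase in x raises the predicted response by 0.5,
# and the positive sign shows the response rises as the predictor rises.
```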

Link to Google Colab:

https://colab.research.google.com/drive/1oVpZTkYwwlxE5QsctEfX8-MKHGSUal6T?usp=sharing
