
MACHINE LEARNING

BSIS 7th 1M
WEEK – 2 Final
BY

SANA KIRAN
The Islamia University of Bahawalpur
Rahim Yar Khan
Introduction

• What is Linear Regression?

• Mathematical example of linear regression.

• Pros and cons of LR.
What is Linear Regression?

• Linear regression predicts the relationship between two variables by assuming a linear connection between the independent and dependent variables.

• It seeks the optimal line that minimizes the sum of squared differences between the predicted and actual values.
Simple Linear Regression
• In a simple linear regression, there is one independent variable and one
dependent variable. The model estimates the slope and intercept of the line of
best fit, which represents the relationship between the variables.
• The slope represents the change in the dependent variable for each unit change
in the independent variable, while the intercept represents the predicted value of
the dependent variable when the independent variable is zero.
• Linear regression shows the linear relationship between the independent (predictor) variable on the X-axis and the dependent (output) variable on the Y-axis.

• If there is a single input variable X (the independent variable), the model is called simple linear regression.
[Figure: scatter plot of data points; the blue line is the best-fit straight line.]
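The meaning of the slope and intercept described above can be illustrated with a small sketch. The hours-studied scenario and the coefficient values here are made up for illustration, not taken from the slides:

```python
# Hypothetical fitted model: hours studied (X) -> exam score (Y).
intercept = 40.0   # predicted score when hours studied is zero
slope = 5.0        # score increase for each additional hour studied

def predict(hours):
    """Predicted value of the dependent variable for a given X."""
    return intercept + slope * hours

print(predict(0))               # 40.0 -> the intercept
print(predict(3))               # 55.0
print(predict(4) - predict(3))  # 5.0 -> the slope: change in Y per unit change in X
```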
Simple Regression Calculation
• To calculate the best-fit line, linear regression uses the traditional slope-intercept form given below:

• Yi = α0 + α1Xi

• where Yi = dependent variable, α0 = constant/intercept, α1 = slope/coefficient, Xi = independent variable.

• This algorithm explains the linear relationship between the dependent (output) variable Y and the independent (predictor) variable X using the straight line Y = α0 + α1X.
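The coefficients α0 and α1 can be estimated by ordinary least squares. A minimal sketch using the closed-form formulas (the data points are made up so the answer is easy to check by hand):

```python
# Ordinary least squares for Yi = a0 + a1*Xi, using the closed-form solution:
#   a1 = cov(X, Y) / var(X)
#   a0 = mean(Y) - a1 * mean(X)
xs = [1, 2, 3, 4, 5]
ys = [2, 4, 6, 8, 10]  # perfectly linear: Y = 0 + 2*X

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

a1 = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
     sum((x - mean_x) ** 2 for x in xs)
a0 = mean_y - a1 * mean_x

print(a0, a1)  # 0.0 2.0
```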
The goal of the linear regression algorithm is to find the best values for α0 and α1, which give the best-fit line. The best-fit line is the line with the least error, meaning the error between the predicted and actual values should be minimal.
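The "least error" criterion is typically measured as the mean squared error (MSE) between predicted and actual values. A sketch comparing the true line with a slightly wrong one (the data and candidate coefficients are made up):

```python
def mse(a0, a1, xs, ys):
    """Mean squared error of the line Y = a0 + a1*X on the given data."""
    return sum((y - (a0 + a1 * x)) ** 2 for x, y in zip(xs, ys)) / len(xs)

xs = [1, 2, 3, 4]
ys = [3, 5, 7, 9]  # generated by Y = 1 + 2*X

# The true coefficients give zero error; a wrong line gives a larger MSE.
print(mse(1.0, 2.0, xs, ys))  # 0.0
print(mse(0.0, 2.5, xs, ys))  # 0.375
```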
Pros of Linear Regression

• Simple model: The linear regression model is the simplest equation with which the relationship between one or more predictor variables and the predicted variable can be expressed.

• Computationally efficient: The modeling speed of linear regression is fast, as it does not require complicated calculations, and it runs predictions quickly even when the amount of data is large.

• Interpretability of the output: The ability of linear regression to determine the relative influence of one or more predictor variables on the predicted value, when the predictors are independent of each other, is one of the key reasons for its popularity. The model derived with this method can express what change in a predictor variable causes what change in the predicted or target variable.
Cons of Linear Regression

• Overly simplistic: The linear regression model is often too simplistic to capture real-world complexity.

• Linearity assumption: Linear regression makes the strong assumption that the predictor (independent) and predicted (dependent) variables are linearly related, which may not be the case.

• Severely affected by outliers: Outliers can have a large effect on the output, as the best-fit line tries to minimize the MSE for the outlier points as well, resulting in a model that fails to capture the information in the data.
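The outlier sensitivity can be demonstrated by fitting the same least-squares line with and without a single extreme point. The data here are made up for illustration:

```python
def fit(xs, ys):
    """Ordinary least squares: returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a1 = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
         sum((x - mx) ** 2 for x in xs)
    return my - a1 * mx, a1

xs = [1, 2, 3, 4, 5]
ys = [2, 4, 6, 8, 10]        # clean data: Y = 2*X, so slope = 2
print(fit(xs, ys))           # (0.0, 2.0)

xs_out = xs + [6]
ys_out = ys + [60]           # one extreme outlier
print(fit(xs_out, ys_out))   # slope pulled far above 2 by a single point
```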
