
Multiple Linear Regression

Pgm 4:
a) Implement a multiple linear regression model for the dataset [Predictors: Math, Reading; Response variable: Writing] using the gradient descent method. Find the R^2 score for the fitted model.

Procedure: Multiple linear regression is linear regression in which the input has multiple features (variables).
Gradient descent is an optimization algorithm; we will minimize our cost function using the gradient descent algorithm.
Step 1: Initialize β0, β1, ..., βn with some value. In this case we initialize them all to 0.
Step 2: Iteratively apply the update below until the parameters converge.

βj := βj − α (hβ(x) − y) xj
In Step 2 we change the values of βj in the direction that reduces the cost function, and the gradient gives that direction, so eventually we reach the minimum of the cost function. But we do not want to change the values of βj too drastically, because we might overshoot the minimum; that is why we need a learning rate.

As you can see, the initial cost is huge. We now reduce the cost iteration by iteration using gradient descent.
Hypothesis: hβ(x) = β0 + β1x1 + β2x2 + ... + βnxn
Loss: (hβ(x) − y)
Gradient: (hβ(x) − y) xj
Gradient descent update: βj := βj − α (hβ(x) − y) xj
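
A minimal sketch of this procedure in Python is given below. The file name student_scores.csv is an assumption (part a) only names the Math, Reading and Writing columns), and the gradient here is averaged over all samples, a common batch form of the per-sample update written above.

import numpy as np
import pandas as pd

# Hypothetical file name; columns Math, Reading (predictors) and Writing (response).
data = pd.read_csv("student_scores.csv")
X = data[["Math", "Reading"]].to_numpy(dtype=float)
y = data["Writing"].to_numpy(dtype=float)

# Prepend a column of ones so beta[0] plays the role of the intercept beta_0.
X = np.hstack([np.ones((X.shape[0], 1)), X])

alpha = 0.0001                  # learning rate
iterations = 10000
m, n = X.shape
beta = np.zeros(n)              # Step 1: initialize beta_0, ..., beta_n to 0

for _ in range(iterations):     # Step 2: iterate until (approximate) convergence
    error = X @ beta - y                # (h_beta(x) - y) for every sample
    gradient = X.T @ error / m          # average of (h_beta(x) - y) * x_j
    beta = beta - alpha * gradient      # beta_j := beta_j - alpha * gradient_j

# R^2 score: 1 - SS_res / SS_tot
y_pred = X @ beta
ss_res = np.sum((y - y_pred) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
print("Coefficients:", beta)
print("R^2 score:", 1 - ss_res / ss_tot)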

Alpha: 0.0001
Iterations: 10000
R^2 score: 0.909722327306

b) Implement a multiple linear regression model for the housing_prices.csv dataset [Predictors: Area, Floor, Room; Response variable: Price] using the scikit-learn library. Find the model parameters and the R^2 score for the fitted model.
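
A minimal sketch using scikit-learn is shown below, assuming housing_prices.csv contains columns named exactly Area, Floor, Room and Price; here b0 is the fitted intercept and b1, b2, b3 are the coefficients.

import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

data = pd.read_csv("housing_prices.csv")
X = data[["Area", "Floor", "Room"]]
y = data["Price"]

model = LinearRegression()      # fits an intercept (b0) by default
model.fit(X, y)

print("b0 (intercept):", model.intercept_)
print("b1, b2, b3:", model.coef_)
print("R^2 score:", r2_score(y, model.predict(X)))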

Model parameters: b0 = 0 or -3106.4127920034116, b1 = 4.68576316, b2 = 71.78274093, b3 = 1894.45529322

R^2 score: 0.9098901726717316
