
ML Activity 2

Akshat Srivastava
2006205
1. Linear Regression is a statistical method used to model the
relationship between a dependent variable (also known as the
response or target variable) and one or more independent variables
(also known as predictors or features). It is called linear regression
because the relationship between the variables is assumed to be
linear, i.e., a straight line can represent the trend in the data.
2. In nonlinear regression analysis, the relationship between the
dependent and independent variables is not assumed to be linear.
Instead, it can have a curved or nonlinear pattern. Nonlinear
regression analysis requires a more complex modeling technique, and
it may involve fitting the data to a specific function, such as a
quadratic or exponential function.
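As a small sketch of the quadratic case (the data values and the
fitOnFeature helper are hypothetical): one common trick is to transform
the input, e.g. use x² as the feature, and then apply ordinary least
squares to the transformed data.

```javascript
// Fit y = slope * feature + intercept by ordinary least squares.
// Passing x^2 as the feature fits a quadratic shape with linear tools.
function fitOnFeature(features, y) {
  const n = features.length;
  let sumX = 0, sumY = 0, sumXY = 0, sumXX = 0;
  for (let i = 0; i < n; i++) {
    sumX += features[i];
    sumY += y[i];
    sumXY += features[i] * y[i];
    sumXX += features[i] * features[i];
  }
  const slope = (n * sumXY - sumX * sumY) / (n * sumXX - sumX * sumX);
  const intercept = (sumY - slope * sumX) / n;
  return { slope, intercept };
}

const xs = [1, 2, 3, 4];
const ys = [2, 5, 10, 17]; // exactly y = x^2 + 1
const fit = fitOnFeature(xs.map(v => v * v), ys);
console.log(fit); // { slope: 1, intercept: 1 }
```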
3. The error in a Linear Regression model is calculated by finding the
difference between the predicted values and the actual values of the
dependent variable. The most common method used to calculate the
error is the residual sum of squares (RSS), which is the sum of the
squared differences between the predicted and actual values.
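A minimal illustration of RSS in JavaScript (the sample arrays are
made up for demonstration):

```javascript
// Residual sum of squares: the sum of squared differences between
// actual and predicted values of the dependent variable.
function residualSumOfSquares(actual, predicted) {
  let rss = 0;
  for (let i = 0; i < actual.length; i++) {
    const residual = actual[i] - predicted[i];
    rss += residual * residual;
  }
  return rss;
}

const observed = [3, 5, 7];
const fitted = [2.5, 5.5, 6.5];
console.log(residualSumOfSquares(observed, fitted)); // 0.75
```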
4. Mean Absolute Error (MAE) and Mean Squared Error (MSE) are both
measures of the error between the predicted and actual values of the
dependent variable in a Linear Regression model. MAE is the average
absolute difference between the predicted and actual values, while
MSE is the average squared difference between the predicted and
actual values. MSE penalizes larger errors more than MAE and is
therefore more sensitive to outliers.
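A short sketch showing the difference (hypothetical data chosen so
that one large error dominates MSE far more than MAE):

```javascript
// Average absolute difference between predicted and actual values.
function meanAbsoluteError(actual, predicted) {
  let sum = 0;
  for (let i = 0; i < actual.length; i++) {
    sum += Math.abs(actual[i] - predicted[i]);
  }
  return sum / actual.length;
}

// Average squared difference; squaring amplifies large errors.
function meanSquaredError(actual, predicted) {
  let sum = 0;
  for (let i = 0; i < actual.length; i++) {
    sum += (actual[i] - predicted[i]) ** 2;
  }
  return sum / actual.length;
}

const target = [1, 2, 3, 10];
const guess = [1, 2, 3, 4]; // one outlier error of 6
console.log(meanAbsoluteError(target, guess)); // 1.5
console.log(meanSquaredError(target, guess));  // 9
```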
5. The assumptions of Linear Regression include linearity, independence,
homoscedasticity, normality, and absence of multicollinearity. Linearity
assumes that the relationship between the dependent and
independent variables is linear. Independence assumes that the
observations are independent of each other. Homoscedasticity
assumes that the variance of the errors is constant across all levels of
the independent variables. Normality assumes that the errors follow a
normal distribution. Absence of multicollinearity assumes that there is
no high correlation between the independent variables.
6. RMSE and MAE are both measures of the error between the predicted
and actual values, but RMSE is often preferred over MAE because it
penalizes larger errors more heavily and is therefore more sensitive
to outliers. RMSE also has the same unit as the dependent variable,
which makes it easier to interpret.
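As a quick sketch, RMSE is simply the square root of MSE, which is
what restores the original units (the sample values are hypothetical):

```javascript
// Root mean squared error: square root of the average squared error,
// so the result has the same unit as the dependent variable.
function rootMeanSquaredError(actual, predicted) {
  let sum = 0;
  for (let i = 0; i < actual.length; i++) {
    sum += (actual[i] - predicted[i]) ** 2;
  }
  return Math.sqrt(sum / actual.length);
}

console.log(rootMeanSquaredError([1, 2, 3, 10], [1, 2, 3, 4])); // 3
```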
7. Normalization and Standardization are two methods used to scale the
independent variables in a Linear Regression model. Normalization
scales the variables to a range between 0 and 1, while Standardization
scales the variables to have a mean of 0 and a standard deviation of 1.
Normalization is often preferred when the distribution of the variables
is unknown, while Standardization is often preferred when the variables
are approximately normally distributed.
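Both scalings can be sketched in a few lines of JavaScript (the input
array is hypothetical):

```javascript
// Min-max normalization: rescales values into the range [0, 1].
function normalize(values) {
  const min = Math.min(...values);
  const max = Math.max(...values);
  return values.map(v => (v - min) / (max - min));
}

// Standardization (z-score): mean 0 and standard deviation 1.
function standardize(values) {
  const mean = values.reduce((a, b) => a + b, 0) / values.length;
  const variance =
    values.reduce((a, b) => a + (b - mean) ** 2, 0) / values.length;
  const std = Math.sqrt(variance);
  return values.map(v => (v - mean) / std);
}

console.log(normalize([10, 20, 30]));   // [0, 0.5, 1]
console.log(standardize([10, 20, 30])); // roughly [-1.22, 0, 1.22]
```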
8. Here is an example of how to implement a Linear Regression function
in JavaScript. In order to implement one, you would typically follow
these steps:
● Load or generate the data that you want to use for the regression
analysis. The data should include both the independent variables and
the dependent variable that you want to predict.
● Split the data into training and testing sets. The training set is used to
train the regression model, while the testing set is used to evaluate the
performance of the model.
● Choose a regression algorithm that is appropriate for your data and
problem. There are several regression algorithms to choose from,
including linear regression, logistic regression, and polynomial
regression.
● Train the regression model on the training data. This involves finding
the optimal values of the regression coefficients that minimize the
error between the predicted and actual values.
● Evaluate the performance of the model on the testing data. This

involves calculating the error between the predicted and actual values
and using performance metrics such as R-squared, mean squared
error (MSE), and root mean squared error (RMSE).
● Once you are satisfied with the performance of the model, you can
use it to make predictions on new data.
● The exact implementation of a Linear Regression function in
JavaScript may vary depending on the specific algorithm and libraries
that you are using, but the general steps outlined above should be
followed.

function linearRegression(x, y) {
  let sumX = 0;
  let sumY = 0;
  let sumXY = 0;
  let sumXX = 0;
  const n = x.length;

  for (let i = 0; i < n; i++) {
    sumX += x[i];
    sumY += y[i];
    sumXY += x[i] * y[i];
    sumXX += x[i] * x[i];
  }

  // Ordinary least squares estimates for y = slope * x + intercept
  const slope = (n * sumXY - sumX * sumY) / (n * sumXX - sumX * sumX);
  const intercept = (sumY - slope * sumX) / n;

  return { slope, intercept };
}

let x = [1, 2, 3, 4, 5];
let y = [2, 4, 5, 4, 6];

let result = linearRegression(x, y);
console.log(result); // { slope: 0.8, intercept: 1.8 }
