
The Assumptions Underlying the Method of Least Squares (CLRM)

Objective of OLS Estimation

- Estimation of the parameters
- Inference about the population parameters

Understanding the generation of Yi

The PRF suggests that Y depends on X and u

- How X and u are generated is crucial
- Hence, certain assumptions about the generation of Y are needed



1. The regression model is linear in parameters

Yi = β1 + β2Xi + ui
Yi = β1 + β2Xi + β3Xi² + ui

Both models are linear in the parameters, even though the second is nonlinear in the variable X.
2. X values are fixed in repeated sampling

3. The mean value of the disturbance term ui for a given value of X is zero: E(ui | Xi) = 0

- The distribution of the error terms for a given X is symmetric around zero
- It implies that the u's corresponding to a given X are purely random disturbances
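A simulated check of this assumption (toy data, not from the source): drawing many disturbances at each of several X values, the conditional mean of u is approximately zero everywhere.

```python
import numpy as np

# 100 hypothetical X values, 10000 disturbance draws at each one.
rng = np.random.default_rng(1)
u = rng.normal(0.0, 1.0, size=(100, 10000))

row_means = u.mean(axis=1)      # conditional mean of u at each X
print(np.abs(row_means).max())  # every conditional mean is near 0
```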

4. Equal variance of ui - Homoscedasticity

- The variance of ui corresponding to each Xi remains the same
- The distributions of Y corresponding to each X have the same variance
- That is, Var(ui | Xi) = σ²
- It implies that all values of X are equally reliable

Heteroscedasticity

- That is, the variance of ui varies with changes in X: Var(ui | Xi) = σi²
- The distributions of Y corresponding to each X do not have the same variance
- Here the Y values corresponding to X3 are less reliable than those corresponding to X1
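A toy simulation contrasting the two cases (the variance pattern is assumed for illustration): under homoscedasticity the error variance is the same at low and high X; under heteroscedasticity it grows with X.

```python
import numpy as np

rng = np.random.default_rng(2)
X = np.linspace(1.0, 10.0, 1000)

u_homo = rng.normal(0.0, 1.0, X.size)  # constant variance
u_hetero = rng.normal(0.0, 0.3 * X)    # std deviation grows with X

# Compare the error variance in the lower and upper halves of X.
lo, hi = X < 5.0, X >= 5.0
print(u_homo[lo].var(), u_homo[hi].var())      # roughly equal
print(u_hetero[lo].var(), u_hetero[hi].var())  # upper half much larger
```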



5. No autocorrelation between ui and uj

- For any two X values Xi and Xj (i ≠ j), the correlation between ui and uj is zero: Cov(ui, uj) = 0
- Autocorrelation can be positive, negative, or zero
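An illustrative check with simulated errors (the AR(1) coefficient 0.8 is an assumption for the demo): i.i.d. disturbances have sample lag-1 correlation near zero, while AR(1) errors are positively autocorrelated.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5000

u_iid = rng.normal(0.0, 1.0, n)  # satisfies the no-autocorrelation assumption
u_ar = np.empty(n)               # violates it: u_t = 0.8*u_{t-1} + e_t
u_ar[0] = rng.normal()
for t in range(1, n):
    u_ar[t] = 0.8 * u_ar[t - 1] + rng.normal()

def lag1_corr(u):
    # Sample correlation between consecutive disturbances.
    return np.corrcoef(u[:-1], u[1:])[0, 1]

print(lag1_corr(u_iid))  # near 0: zero autocorrelation
print(lag1_corr(u_ar))   # near 0.8: positive autocorrelation
```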



6. Zero covariance between ui and Xi

- No correlation between ui and Xi: Cov(ui, Xi) = 0
- It implies that the disturbance ui and the explanatory variable X are uncorrelated
- If they are correlated, the individual effects of X and u on Y cannot be separated
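A toy illustration of why the effects cannot be separated (all numbers are assumed): when Cov(u, X) ≠ 0, OLS attributes part of u's effect to X, biasing the slope estimate.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 10000
X = rng.normal(0.0, 1.0, n)
u_clean = rng.normal(0.0, 1.0, n)  # Cov(u, X) = 0
u_corr = u_clean + 0.5 * X         # Cov(u, X) = 0.5: assumption violated

def ols_slope(x, y):
    # OLS slope for a bivariate regression: Cov(x, y) / Var(x).
    return np.cov(x, y)[0, 1] / np.var(x, ddof=1)

slope_ok = ols_slope(X, 1.0 + 2.0 * X + u_clean)
slope_bad = ols_slope(X, 1.0 + 2.0 * X + u_corr)
print(slope_ok)   # close to the true slope 2
print(slope_bad)  # biased toward 2.5: X's effect is contaminated by u
```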

7. The number of observations must be greater than the number of parameters to be estimated

- Otherwise the model is overfitted
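A sketch of the overfitting problem with made-up numbers: with as many parameters as observations, the model fits the sample exactly, "explaining" even pure noise and leaving no degrees of freedom for inference.

```python
import numpy as np

rng = np.random.default_rng(5)

# 3 parameters (constant, X, X^2) but only 3 observations.
X = np.array([1.0, 2.0, 3.0])
Y = rng.normal(0.0, 1.0, 3)                 # pure noise as the "data"
A = np.column_stack([np.ones(3), X, X**2])

b = np.linalg.solve(A, Y)       # exact solution exists
print(np.abs(Y - A @ b).max())  # ~0: every residual vanishes
```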

8. Variability in the X (explanatory) values - the variance of Xi must be a finite positive number

- If there is no variation in X, nothing about Y can be explained
- The variables must vary for the relationship to be estimated


[Figure: scatter plots contrasting data where X varies against data where X takes a single value]
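The point can be seen directly from the slope formula (toy example): the OLS slope is Cov(X, Y) / Var(X), so when every sampled X is identical the denominator is zero and the slope is undefined.

```python
import numpy as np

X_const = np.full(50, 3.0)          # no variation in X
X_vary = np.linspace(0.0, 9.8, 50)  # X varies

print(np.var(X_const))  # 0.0 -> slope b2 cannot be computed
print(np.var(X_vary))   # positive and finite -> slope is estimable
```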

9. No mis-specification of the model - no specification bias

- The model must use the appropriate functional form
- Otherwise specification bias arises

10. No multicollinearity

Multicollinearity refers to a linear relationship among the explanatory variables.

- There must be no perfect linear relationship between the explanatory variables

Assume a three-variable regression model:

Yi = b1 + b2X1 + b3X2 + ui

Then the following should NOT be the case:

X2 = a1 + a2X1

where X1 and X2 are the explanatory variables in the regression model.

If there is perfect multicollinearity, substitution gives

Yi = b1 + b2X1 + b3(a1 + a2X1) + ui = (b1 + b3a1) + (b2 + b3a2)X1 + ui

so the separate effects b2 and b3 cannot be estimated.
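A short demonstration with hypothetical data: when X2 is an exact linear function of X1, the design matrix [1, X1, X2] is rank-deficient, so OLS cannot separate b2 from b3.

```python
import numpy as np

X1 = np.linspace(0.0, 9.0, 10)
X2 = 1.0 + 2.0 * X1  # perfect linear function of X1 (assumed a1=1, a2=2)
A = np.column_stack([np.ones(10), X1, X2])

print(np.linalg.matrix_rank(A))  # 2, not 3: one column is redundant
```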
