
ISLAMIA COLLEGE OF

SCIENCE AND COMMERCE


NAME: ARSHEEN MUBIN
ROLL NO.: 21870002
COURSE: B.A. ECONOMICS HONOURS
SEMESTER: 4th
SUBJECT: ECONOMETRICS
TOPIC: CLASSICAL LINEAR REGRESSION MODEL AND ITS ASSUMPTIONS
SUBMITTED TO: UMAR SIR
Econometrics
The term 'Econometrics' was coined by Ragnar Frisch, one of the founders of the Econometric Society. It deals with the measurement of economic relationships.
According to Koutsoyiannis, "Econometrics is a social science that is a combination of economic theory, mathematics and statistics for the purpose of providing numerical values for the parameters of economic relationships and thereby verifying economic theories."
CLASSICAL LINEAR REGRESSION MODEL
In regression analysis, our objective is not only to obtain the estimates β̂1 and β̂2 but also to draw inferences about the true β1 and β2. For this we must not only specify the functional form of the model, but also make certain assumptions about the manner in which the Yi are generated.
The PRF: Yi = β1 + β2 Xi + ui
It shows that Yi depends on both Xi and ui. Therefore, unless we are specific about how Xi and ui are generated, there is no way we can make any statistical inferences about the Yi and also, as we shall see, about β1 and β2. Thus, the assumptions made about the Xi variables and the error term are extremely critical to the valid interpretation of the regression estimates.
This Gaussian, standard, or classical linear regression model (CLRM) is the cornerstone of most econometric theory.

ASSUMPTIONS OF CLRM
ASSUMPTION 1
 Linear Regression Model: The regression model is linear in the parameters, though it may or may not be linear in the variables. That is, the regression model is as shown in the equation:
Yi = β1 + β2 Xi + ui
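As an illustration of the model above, the following sketch simulates data from the PRF and recovers the parameters by ordinary least squares. The "true" values β1 = 2 and β2 = 0.5 are illustrative choices, not taken from the text.

```python
import numpy as np

# Illustrative sketch: simulate Yi = β1 + β2*Xi + ui with chosen
# true parameters and recover them with the usual OLS formulas.
rng = np.random.default_rng(0)
beta1, beta2 = 2.0, 0.5
n = 500
X = rng.uniform(0, 10, n)            # regressor values
u = rng.normal(0, 1, n)              # disturbance term
Y = beta1 + beta2 * X + u            # the population regression function

# OLS slope: sample cov(X, Y) / var(X); intercept from the means
b2_hat = np.cov(X, Y, bias=True)[0, 1] / np.var(X)
b1_hat = Y.mean() - b2_hat * X.mean()
print(b1_hat, b2_hat)                # close to 2 and 0.5
```

With 500 observations the estimates land close to the true parameters, which is the sense in which β̂1 and β̂2 are used to draw inferences about β1 and β2.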
ASSUMPTION 2

 Fixed X Values or X Values Independent of the Error Term: Values taken by the regressor X may be considered fixed in repeated samples (the case of the fixed regressor). Technically, X is assumed to be non-stochastic, that is, fixed.

ASSUMPTION 3

 Zero Mean Value of the Disturbance ui: Given the value of Xi, the mean, or expected, value of the random disturbance term ui is zero. Symbolically, E(ui | Xi) = 0.
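A small sketch of what E(ui | Xi) = 0 means in practice: if the disturbances are generated independently of X, their average is roughly zero within any slice of the X values. The draws below are illustrative.

```python
import numpy as np

# Sketch: disturbances generated independently of X average to
# (approximately) zero both where X is small and where X is large.
rng = np.random.default_rng(1)
X = rng.uniform(0, 10, 100_000)
u = rng.normal(0, 1, 100_000)        # u drawn without reference to X

low = u[X < 5].mean()                # mean of u where X is small
high = u[X >= 5].mean()              # mean of u where X is large
print(low, high)                     # both near 0
```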
ASSUMPTION 4

 Homoscedasticity or Constant Variance of ui: The variance of the error, or disturbance, term is the same regardless of the value of X. Symbolically,
var(ui) = E[ui − E(ui | Xi)]² = σ²
where var stands for variance.
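To make the assumption concrete, this sketch contrasts homoscedastic errors with a heteroscedastic violation in which the error spread grows with X. The data-generating choices are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.uniform(0, 10, 200_000)

u_homo = rng.normal(0, 1, X.size)        # var(u) = 1 for every X
u_heter = rng.normal(0, 1, X.size) * X   # var(u) grows with X (violation)

# Compare the error variance for small vs large X values
print(u_homo[X < 5].var(), u_homo[X >= 5].var())    # both near 1
print(u_heter[X < 5].var(), u_heter[X >= 5].var())  # very different
```

Under homoscedasticity the two printed variances agree; under the violation the variance for large X is several times the variance for small X.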
ASSUMPTION 5
 No Autocorrelation between the Disturbances: Given any two X values, Xi and Xj (i ≠ j), the correlation between the corresponding disturbances ui and uj (i ≠ j) is zero. In short, the observations are sampled independently. Symbolically,
cov(ui, uj | Xi, Xj) = 0
cov(ui, uj) = 0, if X is non-stochastic
where i and j are two different observations and cov means covariance.
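The assumption can be illustrated by comparing the first-order sample autocorrelation of independent draws with that of AR(1) errors, a standard example of the violation. The AR(1) coefficient 0.8 is an illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50_000
u_iid = rng.normal(0, 1, n)          # independently sampled disturbances

# AR(1) disturbances u_t = 0.8*u_{t-1} + e_t: a violation of the assumption
e = rng.normal(0, 1, n)
u_ar = np.empty(n)
u_ar[0] = e[0]
for t in range(1, n):
    u_ar[t] = 0.8 * u_ar[t - 1] + e[t]

def lag1_corr(u):
    # correlation between each disturbance and the one before it
    return np.corrcoef(u[:-1], u[1:])[0, 1]

print(lag1_corr(u_iid))   # near 0
print(lag1_corr(u_ar))    # near 0.8
```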
ASSUMPTION 6

 The Number of Observations n Must Be Greater than the Number of Parameters to Be Estimated: Alternatively, the number of observations must be greater than the number of explanatory variables. From a single observation there is no way to estimate the two unknowns, β1 and β2.
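The single-observation case can be seen through the rank of the design matrix [1, Xi]: with one row its rank is 1, so the two parameters β1 and β2 cannot both be identified. The numeric values are illustrative.

```python
import numpy as np

# One observation: the design matrix has rank 1 < 2 parameters,
# so β1 and β2 cannot both be estimated.
X1 = np.array([[1.0, 3.0]])
print(np.linalg.matrix_rank(X1))     # 1

# Two observations with distinct X values: rank 2, exactly determined.
X2 = np.array([[1.0, 3.0], [1.0, 7.0]])
print(np.linalg.matrix_rank(X2))     # 2
```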
ASSUMPTION 7

 The Nature of X Variables: The X values in a given sample must not all be the same. Technically, var(X) must be a positive number. Furthermore, there can be no outliers in the values of the X variable, that is, values that are very large in relation to the rest of the observations.
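Why var(X) must be positive: the OLS slope is cov(X, Y)/var(X), so if every Xi is identical the denominator is zero and the slope is undefined. The values below are illustrative.

```python
import numpy as np

# All X values identical -> var(X) = 0 -> the slope
# b2 = cov(X, Y) / var(X) cannot be computed.
X = np.full(10, 4.0)
Y = np.array([1, 2, 1, 3, 2, 2, 1, 3, 2, 2], dtype=float)
print(np.var(X))                     # 0.0, so the slope is undefined
```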
ASSUMPTION 8

 No Specification Bias: The regression model is correctly specified. Leaving out important explanatory variables, including unnecessary variables, or choosing the wrong functional form of the relationship between the Y and X variables are some examples of specification error.
ASSUMPTION 9
 Zero Covariance between ui and Xi: The disturbance term u and the explanatory variable(s) X are uncorrelated. Otherwise, if Xi and ui are correlated, it is not possible to assess their individual effects on Y: if X and u are positively correlated, then when X increases u also increases, and vice versa, so it is difficult to isolate the influences of X and u on Y.
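The consequence of violating this assumption can be sketched numerically: if u is built to move with X, OLS attributes part of u's influence to X and overstates the slope. The true slope 0.5 and the correlation strength 0.6 are illustrative choices.

```python
import numpy as np

# Violation sketch: u is positively correlated with X, so the OLS
# slope picks up both the effect of X and part of the effect of u.
rng = np.random.default_rng(4)
n = 100_000
X = rng.normal(0, 1, n)
u = 0.6 * X + rng.normal(0, 1, n)    # cov(X, u) > 0 (violation)
Y = 2.0 + 0.5 * X + u                # true slope is 0.5

b2_hat = np.cov(X, Y, bias=True)[0, 1] / np.var(X)
print(b2_hat)                        # near 1.1, not 0.5
```

The estimate converges to 0.5 + 0.6 = 1.1, showing that the separate influences of X and u on Y cannot be isolated when they are correlated.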
ASSUMPTION 10

 There Is No Perfect Multicollinearity: That is, there is no perfect linear relationship among the explanatory variables. (This assumption applies in the multiple regression model.)
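A sketch of what perfect multicollinearity does to the estimation: if one regressor is an exact linear function of another, the matrix X'X is singular, so the OLS normal equations have no unique solution. The construction X2 = 2·X1 is an illustrative example.

```python
import numpy as np

# Perfect multicollinearity: the third column is exactly twice the
# second, so X'X is rank-deficient and cannot be inverted.
rng = np.random.default_rng(5)
x1 = rng.normal(0, 1, 100)
X = np.column_stack([np.ones(100), x1, 2.0 * x1])

XtX = X.T @ X
print(np.linalg.matrix_rank(XtX))    # 2 < 3 columns -> singular
```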
