
VIOLATION OF OLS ASSUMPTIONS: AUTOCORRELATION

AUTOCORRELATION - NATURE
 OLS assumption: given any two X values, Xi and Xj (i ≠ j), the correlation between any two disturbances ui and uj (i ≠ j) is zero. Symbolically,
cov(ui, uj) = E{[ui − E(ui)][uj − E(uj)]}
= E[(ui − 0)(uj − 0)]
= E[uiuj] = 0
where i and j are two different observations and where cov means covariance.
 If E[uiuj] ≠ 0, there is autocorrelation (serial correlation, to be exact)
 Example: the disruption caused by a strike this quarter may very well affect output next
quarter
 The consumption expenditure of one family may very well prompt another family to increase
its consumption expenditure if it wants to keep up with the Joneses.
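The nonzero covariance between successive disturbances can be illustrated with a short simulation. This is a minimal sketch using numpy; the value ρ = 0.7 and the sample size are illustrative assumptions, not from the text.

```python
import numpy as np

# Hedged sketch: simulate AR(1) disturbances u_t = rho*u_{t-1} + eps_t
# (rho = 0.7 and n are made-up values for illustration).
rng = np.random.default_rng(0)
rho, n = 0.7, 100_000
eps = rng.normal(size=n)
u = np.zeros(n)
for t in range(1, n):
    u[t] = rho * u[t - 1] + eps[t]

# Sample correlation between u_t and u_{t-1}: clearly nonzero,
# so cov(ui, uj) = 0 fails and the OLS assumption is violated.
r = np.corrcoef(u[1:], u[:-1])[0, 1]
print(round(r, 2))
```

The sample correlation comes out close to ρ, confirming that each disturbance carries information about the next one.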
REASONS FOR AUTOCORRELATION
1. Inertia:
 A salient feature of most economic time series is inertia, or sluggishness.
 Time series such as GNP, price indexes, production, employment, and unemployment exhibit
(business) cycles. There is a “momentum” built into them.
2. Specification Bias: Excluded Variables Case

3. Specification Bias: Incorrect Functional Form


Suppose the “true’’ or correct model in a cost-output study is as follows:
Marginal Ci = β1 + β2Qi + β3Qi² + ui
but we fit the following model:
Marginal Ci = α1 + α2Qi + vi
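Fitting the wrong functional form leaves the omitted curvature in the error term, which then looks serially correlated. A minimal sketch of this, with made-up coefficient values (β1 = 10, β2 = 2, β3 = 0.5) purely for illustration:

```python
import numpy as np

# Hedged sketch: true cost curve is quadratic, MC = 10 + 2*Q + 0.5*Q^2 + u
# (the coefficients are illustrative assumptions, not from the text).
rng = np.random.default_rng(1)
Q = np.linspace(1, 10, 200)
u = rng.normal(scale=2.0, size=Q.size)
MC = 10 + 2 * Q + 0.5 * Q**2 + u

# Fit the misspecified linear model MC = a1 + a2*Q + v by OLS.
X = np.column_stack([np.ones_like(Q), Q])
alpha = np.linalg.lstsq(X, MC, rcond=None)[0]
v = MC - X @ alpha  # residuals absorb the omitted Q^2 term

# With the data ordered by Q, the systematic leftover curvature makes
# successive residuals strongly correlated.
r = np.corrcoef(v[1:], v[:-1])[0, 1]
print(round(r, 2))
```

The lag-1 residual correlation is substantial even though the underlying eps draws are independent: the "autocorrelation" here is purely an artifact of the wrong functional form.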
4. Cobweb Phenomenon:
 Many agricultural commodities reflect the so-called cobweb phenomenon
where supply reacts to price with a lag of one time period because supply
decisions take time to implement (the gestation period)
 At the beginning of this year’s planting of crops, farmers are influenced by the
price prevailing last year, so that their supply function is
Supplyt = β1 + β2 Pt-1 + ut

5. “Manipulation’’ of Data:
 Extrapolation and interpolation of data can lead to autocorrelation
 Census data for 2010 and 2020: interpolation of data for 2016 (say) or extrapolation of data for
2022 (say)
CONSEQUENCES OF AUTOCORRELATION

1. Estimated coefficients (β hat) remain unbiased and consistent


2. Standard errors of coefficients (s.e.(β hat)) are biased (inference is incorrect)
 Serial correlation typically makes OLS underestimate the standard errors of the
coefficients; therefore we find t-scores that are incorrectly too high
These are the same consequences as under heteroskedasticity
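The underestimation of standard errors can be seen in a small Monte Carlo experiment. This is a sketch under assumed settings (ρ = 0.8, n = 100, a slowly moving regressor, 2000 replications), none of which come from the text:

```python
import numpy as np

# Hedged Monte Carlo sketch: with positively autocorrelated errors and a
# persistent regressor, the usual OLS formula understates the true
# sampling variability of the slope estimate.
rng = np.random.default_rng(2)
rho, n, reps = 0.8, 100, 2000
x = np.cumsum(rng.normal(size=n))        # slowly moving (trending) regressor
X = np.column_stack([np.ones(n), x])
XtX_inv = np.linalg.inv(X.T @ X)

slopes, reported_se = [], []
for _ in range(reps):
    eps = rng.normal(size=n)
    u = np.zeros(n)
    for t in range(1, n):
        u[t] = rho * u[t - 1] + eps[t]   # AR(1) disturbances
    y = 1.0 + 2.0 * x + u                # true slope is 2
    b = XtX_inv @ X.T @ y
    e = y - X @ b
    s2 = e @ e / (n - 2)                 # conventional OLS error variance
    slopes.append(b[1])
    reported_se.append(np.sqrt(s2 * XtX_inv[1, 1]))

true_sd = np.std(slopes)                 # actual spread of the slope estimates
avg_se = float(np.mean(reported_se))     # what OLS claims on average
print(avg_se < true_sd)                  # OLS understates the spread
```

Because the reported standard errors are too small, the resulting t-scores are too large, which is exactly the inference problem described above.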
DETECTION OF AUTOCORRELATION
Some basics about the order of autocorrelation
Consider Yt = β1 + β2Xt + ut
No autocorrelation if E[utut+s] = 0 for all s ≠ 0
1. First-order autocorrelation (first-order autoregressive, AR(1)):
ut = ρut-1 + εt, −1 < ρ < 1, where εt is a stochastic disturbance term
 positive serial correlation: ρ is positive
 negative serial correlation: ρ is negative
 no serial correlation: ρ is zero
 positive autocorrelation very common in time series data
 e.g.: a shock to GDP persists for more than one period
2. Second- and higher-order autocorrelation:
ut = ρ1ut-1 + ρ2ut-2 + εt => AR(2)

ut = ρ1ut-1 + ρ2ut-2 + … + ρput-p + εt => AR(p)
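A simple way to gauge first-order autocorrelation in practice is to regress the OLS residuals on their own lag. A sketch with simulated data (the true ρ = 0.6, sample size, and coefficients are illustrative assumptions):

```python
import numpy as np

# Hedged sketch: estimate rho of an AR(1) error process by regressing the
# OLS residuals e_t on e_{t-1} (true rho = 0.6 is an assumed value).
rng = np.random.default_rng(3)
n, rho = 500, 0.6
x = rng.normal(size=n)
eps = rng.normal(size=n)
u = np.zeros(n)
for t in range(1, n):
    u[t] = rho * u[t - 1] + eps[t]
y = 1.0 + 2.0 * x + u

# OLS residuals from the regression of y on x.
X = np.column_stack([np.ones(n), x])
e = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]

# rho_hat from e_t = rho*e_{t-1} + error (no intercept).
rho_hat = (e[1:] @ e[:-1]) / (e[:-1] @ e[:-1])
print(round(rho_hat, 2))
```

The estimate lands close to the true ρ; higher-order patterns (AR(2), AR(p)) can be probed the same way by adding further lags of the residuals.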


Durbin–Watson d test
 Used to determine if there is a first-order serial correlation by examining the residuals of the
equation
Assumptions (criteria for using this test):
 The regression includes the intercept
 If autocorrelation is present, it is of AR(1) type:
ut = ρut-1 + ε
 The regression does not include a lagged dependent
variable
DETECTION OF AUTOCORRELATION (DURBIN–WATSON d TEST)
Testing Steps
Step 1: Formulation of hypotheses
Null hypothesis, H0: ρ = 0
Alternative hypothesis, H1: ρ > 0
Step 2: Test statistic
Under H0, the DW test statistic is
d = Σt=2..n (et − et-1)² / Σt=1..n et²
where the et are the OLS residuals.
Step 3: Critical values of d
Write down the critical values dL and dU for the chosen level of significance.
Step 4: Decision
For positive autocorrelation (H1: ρ > 0):
I. If d < dL, reject H0 and accept H1
II. If dU < d < (4 − dU), accept H0
III. If dL ≤ d ≤ dU, the test is inconclusive
For negative autocorrelation (H1: ρ < 0):
I. If d > (4 − dL), reject H0 and accept H1
II. If dU < d < (4 − dU), accept H0
III. If (4 − dU) ≤ d ≤ (4 − dL), the test is inconclusive
WHAT CAN BE DONE IN CASE OF AUTOCORRELATION

1. Inclusion of important variables in the regression

2. Transformation of the regression model


3. Newey–West (HAC) robust standard errors correct not only for heteroskedasticity but also for
serial correlation (the White robust standard errors handle only heteroskedasticity)
 First make sure the specification of the model is correct; only then try to correct
for the form of the error term!
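Remedy 2 (transformation of the regression model) can be sketched via quasi-differencing: if ut = ρut-1 + εt, then yt − ρyt-1 = β1(1 − ρ) + β2(xt − ρxt-1) + εt has serially uncorrelated errors. In this sketch ρ is taken as known; in practice it is estimated from the residuals (as in the Cochrane–Orcutt procedure), and all numeric values are illustrative assumptions.

```python
import numpy as np

# Hedged sketch: quasi-differencing removes AR(1) autocorrelation
# (rho = 0.7 and the coefficients are assumed, not estimated).
rng = np.random.default_rng(5)
n, rho = 400, 0.7
x = rng.normal(size=n)
eps = rng.normal(size=n)
u = np.zeros(n)
for t in range(1, n):
    u[t] = rho * u[t - 1] + eps[t]
y = 1.0 + 2.0 * x + u                      # true beta1 = 1, beta2 = 2

# Quasi-differenced variables (first observation dropped for simplicity).
y_star = y[1:] - rho * y[:-1]
x_star = x[1:] - rho * x[:-1]
X = np.column_stack([np.ones(n - 1) * (1 - rho), x_star])
b = np.linalg.lstsq(X, y_star, rcond=None)[0]

# The transformed residuals should show (almost) no lag-1 correlation.
e = y_star - X @ b
r = np.corrcoef(e[1:], e[:-1])[0, 1]
print(round(b[1], 2), abs(r) < 0.15)
```

The slope estimate stays close to the true value while the residual autocorrelation essentially disappears, which is the point of the transformation.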