
Gujarati (2003): Chapter 12

Definition of Autocorrelation
We assumed of the CLRM's errors that Cov(u_i, u_j) = 0 for i ≠ j, i.e. the errors are not serially correlated, or there is no autocorrelation.
This is essentially the same as saying there is no pattern in the errors.

Obviously we never have the actual u's, so we use their sample counterpart, the residuals (the û's).

If there are patterns in the residuals from a model, we say that
they are autocorrelated.

Some stereotypical patterns we may find in the residuals are
given on the next 3 slides.

Positive Autocorrelation
[Figure: plots of û_t against û_{t-1} and of û_t against time, showing a positive, cyclical pattern in the residuals]
Negative Autocorrelation
[Figure: plots of û_t against û_{t-1} and of û_t against time, showing an alternating (negative) pattern in the residuals]
No Pattern in Residuals: No Autocorrelation
[Figure: plots of û_t against û_{t-1} and of û_t against time, showing no discernible pattern in the residuals]
Definition: First-Order Autocorrelation, AR(1)
Suppose Cov(u_t, u_s) = E(u_t u_s) ≠ 0 for t ≠ s in the model
    Y_t = β_1 + β_2 X_2t + u_t,   t = 1, ..., T
and that
    u_t = ρ u_{t-1} + v_t,   where −1 < ρ < 1  (ρ: rho)
with v_t ~ iid(0, σ_v²) (white noise), i.e.
    E(v_t) = 0,  Var(v_t) = σ_v²,  Cov(v_t, v_s) = 0 for t ≠ s.
This scheme is called first-order autocorrelation and is denoted AR(1).
Autoregressive: u_t is explained by itself lagged one period.
ρ (rho): the first-order autocorrelation coefficient, or coefficient of autocovariance.
Autocorrelation AR(1):
    Cov(u_t, u_{t-1}) > 0  =>  0 < ρ < 1: positive AR(1)
    Cov(u_t, u_{t-1}) < 0  =>  −1 < ρ < 0: negative AR(1)
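As a minimal sketch (not from Gujarati), the AR(1) error scheme can be simulated directly; the function name, sample size, and values of ρ below are illustrative assumptions.

```python
import numpy as np

def simulate_ar1_errors(T=200, rho=0.8, sigma_v=1.0, seed=0):
    """Simulate u_t = rho * u_{t-1} + v_t with v_t ~ iid N(0, sigma_v^2)."""
    rng = np.random.default_rng(seed)
    v = rng.normal(0.0, sigma_v, size=T)   # white-noise innovations
    u = np.zeros(T)
    for t in range(1, T):
        u[t] = rho * u[t - 1] + v[t]       # first-order autoregressive errors
    return u

u_pos = simulate_ar1_errors(rho=0.8)    # 0 < rho < 1: positive AR(1)
u_neg = simulate_ar1_errors(rho=-0.8)   # -1 < rho < 0: negative AR(1)

# The sample first-order autocorrelation should be near the chosen rho
print(np.corrcoef(u_pos[1:], u_pos[:-1])[0, 1])
print(np.corrcoef(u_neg[1:], u_neg[:-1])[0, 1])
```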

If u_t = ρ_1 u_{t-1} + v_t, it is AR(1), first-order autoregressive.
If u_t = ρ_1 u_{t-1} + ρ_2 u_{t-2} + v_t, it is AR(2), second-order autoregressive.
If u_t = ρ_1 u_{t-1} + ρ_2 u_{t-2} + ... + ρ_n u_{t-n} + v_t, it is AR(n), nth-order autoregressive (higher-order autocorrelation), with −1 < ρ < 1.
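A sketch of the general nth-order scheme, under illustrative coefficient values chosen to keep the simulated process well behaved:

```python
import numpy as np

def simulate_ar_errors(rhos, T=200, sigma_v=1.0, seed=0):
    """Simulate u_t = rho_1 u_{t-1} + ... + rho_n u_{t-n} + v_t."""
    rng = np.random.default_rng(seed)
    n = len(rhos)
    u = np.zeros(T + n)                       # n pre-sample zeros as start-up values
    v = rng.normal(0.0, sigma_v, size=T + n)  # white-noise innovations
    for t in range(n, T + n):
        u[t] = sum(r * u[t - j - 1] for j, r in enumerate(rhos)) + v[t]
    return u[n:]

u_ar2 = simulate_ar_errors([0.5, 0.3])        # AR(2): rho_1 = 0.5, rho_2 = 0.3
```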
Example: a consumption function with annual data,
    Consumption_t = β_1 + β_2 Income_t + error_t
where the error term represents other factors that affect consumption.

Year   Consumption   Income   Error
1973   230           320      u_1973
...    ...           ...      ...
1995   558           714      u_1995
1996   699           822      u_1996
1997   881           907      u_1997
1998   925           1003     u_1998
1999   984           1174     u_1999
2000   1072          1246     u_2000

Example of serial correlation: the current year's tax rate may be determined by the previous year's rate,
    TaxRate_2000 = ρ TaxRate_1999 + v_2000,   with v_t ~ iid(0, σ_v²),
which has exactly the AR(1) form u_t = ρ u_{t-1} + v_t.
The consequences of serial correlation (the same as those of heteroscedasticity):
1. The estimated coefficients are still unbiased: E(β̂_k) = β_k.
2. The variances of the β̂_k are no longer the smallest, so the OLS estimators are not BLUE.
3. The standard errors of the estimated coefficients, se(β̂_k), will be biased; therefore t and F tests are not valid.
Therefore, when the regression has AR(1) errors, the estimators are not BLUE, and t and F tests are invalid.
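A small Monte Carlo sketch of points 1 and 3 (illustrative, not from the text; all parameter values are assumptions): with AR(1) errors and an autocorrelated regressor, the OLS slope is still unbiased on average, but the usual OLS standard error understates the true sampling variability, so the reported t statistics are misleading.

```python
import numpy as np

rng = np.random.default_rng(1)
T, beta1, beta2, rho = 100, 1.0, 2.0, 0.8

# Autocorrelated regressor, held fixed across replications (typical of time-series data)
x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.8 * x[t - 1] + rng.normal()

slopes, ols_ses = [], []
for _ in range(2000):
    u = np.zeros(T)
    for t in range(1, T):
        u[t] = rho * u[t - 1] + rng.normal()   # AR(1) errors
    y = beta1 + beta2 * x + u
    X = np.column_stack([np.ones(T), x])
    b = np.linalg.lstsq(X, y, rcond=None)[0]   # OLS estimates
    resid = y - X @ b
    s2 = resid @ resid / (T - 2)
    cov_usual = s2 * np.linalg.inv(X.T @ X)    # the usual OLS covariance formula
    slopes.append(b[1])
    ols_ses.append(np.sqrt(cov_usual[1, 1]))

print("mean of slope estimates:", np.mean(slopes))   # close to beta2 = 2: still unbiased
print("sd of slope estimates:  ", np.std(slopes))    # the true sampling variability
print("mean reported OLS se:   ", np.mean(ols_ses))  # noticeably smaller here: t tests misleading
```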
Detecting Autocorrelation: The Durbin-Watson Test

The Durbin-Watson (DW) test is a test for first-order autocorrelation, i.e. it assumes that the relationship is between an error and the previous one:
    u_t = ρ u_{t-1} + v_t    (1)
where v_t ~ N(0, σ_v²).
The DW test statistic actually tests
    H_0: ρ = 0   against   H_1: ρ ≠ 0.
The test statistic is calculated as
    DW = Σ_{t=2}^{T} (û_t − û_{t-1})² / Σ_{t=1}^{T} û_t²
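A sketch of computing the DW statistic directly from a residual series; statsmodels ships a durbin_watson helper that implements the same formula. The simulated series below is an illustrative assumption.

```python
import numpy as np
from statsmodels.stats.stattools import durbin_watson

def dw_statistic(resid):
    """DW = sum_{t=2..T} (u_t - u_{t-1})^2 / sum_{t=1..T} u_t^2."""
    resid = np.asarray(resid)
    return np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)

# Quick check on a positively autocorrelated series: DW should be well below 2
rng = np.random.default_rng(0)
u = np.zeros(200)
for t in range(1, 200):
    u[t] = 0.8 * u[t - 1] + rng.normal()

print(dw_statistic(u))       # hand-rolled formula above
print(durbin_watson(u))      # statsmodels' implementation of the same statistic
```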
Detecting Autocorrelation: The Durbin-Watson Test (cont.)

We can also write
    DW ≈ 2(1 − ρ̂)    (2)
where ρ̂ is the estimated correlation coefficient. Since ρ̂ is a correlation, −1 ≤ ρ̂ ≤ 1. Rearranging (2) for DW gives 0 ≤ DW ≤ 4.

If ρ̂ = 0, then DW = 2. So, roughly speaking, do not reject the null hypothesis if DW is near 2, i.e. there is little evidence of autocorrelation.

Unfortunately, DW has two critical values, an upper critical value (d_U) and a lower critical value (d_L), and there is also an intermediate region where we can neither reject nor not reject H_0.

Durbin-Watson Test (cont.)
    d = 2(1 − ρ̂)   ==>   ρ̂ = 1 − d/2
Since −1 ≤ ρ̂ ≤ 1, this implies 0 ≤ d ≤ 4.
    DW = Σ_{t=2}^{T} (û_t − û_{t-1})² / Σ_{t=1}^{T} û_t²  ≈  2(1 − ρ̂)
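The back-of-the-envelope estimate ρ̂ ≈ 1 − d/2 as a one-liner (illustrative sketch):

```python
def rho_from_dw(d):
    """Approximate first-order autocorrelation implied by a DW value: rho_hat = 1 - d/2."""
    return 1.0 - d / 2.0

print(rho_from_dw(2.0))       # 0.0     -> little evidence of autocorrelation
print(rho_from_dw(0.122904))  # ~0.9385 -> as in the Gujarati Table 12.4 example below
```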
The Durbin-Watson Test
[Figure: Durbin-Watson decision regions (reject, inconclusive, do not reject) on the 0-to-4 scale]
Conditions which Must be Fulfilled for DW to be a Valid Test
1. Constant term in regression
2. Regressors are non-stochastic
3. No lags of dependent variable

Ex: How to detect autocorrelation?
Gujarati (2003), Table 12.4, p. 460.
Run the OLS regression û_t = ρ û_{t-1} + v_t and check the t-value of the coefficient (the estimated coefficient here is 0.914245).
From the OLS regression result, d (or DW*) = 0.122904 ≈ 0.1229, so
    ρ̂ ≈ 1 − d/2 = 1 − 0.122904/2 ≈ 0.9385.
Check the DW statistic table (at the 5% level of significance, k = 1, n = 40):
    d_L = 1.246,  d_U = 1.344
Durbin-Watson autocorrelation test:
    H_0: no autocorrelation (ρ = 0)
    H_1: autocorrelation exists (ρ ≠ 0, or ρ > 0: positive autocorrelation)
Since DW* = 0.1229 < d_L = 1.246, the statistic falls in the rejection region: reject H_0 and conclude that positive autocorrelation exists.
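A hedged sketch of this workflow in Python with statsmodels: fit OLS, compute DW from the residuals, and compare with the tabulated critical values (d_L = 1.246, d_U = 1.344 for k = 1, n = 40 at 5%). The data below are a hypothetical stand-in; Gujarati's Table 12.4 series is not reproduced here.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

# Hypothetical stand-in data (n = 40); replace y and x with the actual Table 12.4 series
rng = np.random.default_rng(0)
x = np.cumsum(rng.normal(1.0, 0.5, size=40))   # trending regressor (illustrative)
u = np.zeros(40)
for t in range(1, 40):
    u[t] = 0.9 * u[t - 1] + rng.normal()       # strongly positive AR(1) errors
y = 10 + 1.5 * x + u

ols = sm.OLS(y, sm.add_constant(x)).fit()
dw = durbin_watson(ols.resid)

d_L, d_U = 1.246, 1.344                        # 5% critical values for k = 1, n = 40
if dw < d_L:
    print(f"DW = {dw:.4f} < d_L = {d_L}: reject H0 -> positive autocorrelation")
elif dw <= d_U:
    print(f"DW = {dw:.4f} in [d_L, d_U]: test inconclusive")
else:
    print(f"DW = {dw:.4f} > d_U = {d_U}: do not reject H0 (no positive autocorrelation)")
```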
Example 2:
    UM̂_t = 23.1 − 0.078 CAP_t − 0.146 CAP_{t-1} + 0.043 T_t
           (15.6)   (2.0)        (3.7)            (10.3)
    R² = 0.78,  F = 78.9,  σ̂_u = 0.677,  RSS = 29.3,  DW = 0.23,  n = 68
(i) k = 3 (number of independent variables)
(ii) n = 68; significance levels α = 0.05 and α = 0.01
(iii) At α = 0.05: d_L = 1.525, d_U = 1.703
      At α = 0.01: d_L = 1.372, d_U = 1.546
Since DW = 0.23 < d_L at both significance levels, reject H_0: positive autocorrelation exists.
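Applying the same decision rule to Example 2's numbers, as a minimal sketch (function name is illustrative):

```python
def dw_decision(dw, d_L, d_U):
    """One-sided DW test for positive first-order autocorrelation."""
    if dw < d_L:
        return "reject H0: positive autocorrelation"
    if dw > d_U:
        return "do not reject H0"
    return "inconclusive"

# Example 2: DW = 0.23, k = 3, n = 68
print(dw_decision(0.23, d_L=1.525, d_U=1.703))   # 5% level -> reject H0
print(dw_decision(0.23, d_L=1.372, d_U=1.546))   # 1% level -> reject H0
```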