
Lecture 11

Heteroskedasticity:
Nature and Detection

Aims and Learning Objectives

By the end of this session students should be able to:

• Explain the nature of heteroskedasticity

• Understand the causes and consequences of heteroskedasticity

• Perform tests to determine whether a regression model has heteroskedastic errors

11.1 Nature of Heteroskedasticity

Heteroskedasticity is a systematic pattern in the errors where the variances of the errors are not constant.

Ordinary least squares assumes that all observations are equally reliable.

Regression Model
Yi = 1 + 2Xi + Ui

Homoskedasticity: Var(Ui) =  2
Or E(Ui2) =  2

Heteroskedasticity: Var(Ui) = i 2
Or E(Ui2) = i 2
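To make the two definitions concrete, here is a minimal simulation sketch (not part of the original slides); the error scale 0.5·Xi used for the heteroskedastic case is an arbitrary assumption, chosen only so that the error variance grows with income.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
X = rng.uniform(1, 10, size=n)        # explanatory variable, e.g. income
beta1, beta2 = 2.0, 0.8               # illustrative parameter values

# Homoskedastic errors: Var(Ui) = sigma^2 for every observation
U_homo = rng.normal(loc=0.0, scale=1.0, size=n)

# Heteroskedastic errors: Var(Ui) = sigma_i^2, here assumed to grow with X
U_hetero = rng.normal(loc=0.0, scale=0.5 * X)

Y_homo = beta1 + beta2 * X + U_homo
Y_hetero = beta1 + beta2 * X + U_hetero

# Error variance in the lower and upper halves of X: roughly equal in the
# homoskedastic case, sharply different in the heteroskedastic case
lo, hi = X < np.median(X), X >= np.median(X)
print("homoskedastic  :", U_homo[lo].var().round(2), U_homo[hi].var().round(2))
print("heteroskedastic:", U_hetero[lo].var().round(2), U_hetero[hi].var().round(2))
```

In the homoskedastic run the two printed variances are close; in the heteroskedastic run the upper half of the income range shows a much larger error variance.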
[Figure: Homoskedastic pattern of errors: scatter of Consumption (Yi) against Income (Xi) with a constant spread about the regression line]
[Figure: The Homoskedastic Case: the conditional distributions f(Yi) at incomes X1, X2, X3, X4 all have the same variance]
[Figure: Heteroskedastic pattern of errors: scatter of Consumption (Yi) against Income (Xi) whose spread widens as income increases]
[Figure: The Heteroskedastic Case: conditional distributions f(Yi) at incomes X1, X2, X3; the variance is small for poor people and large for rich people]
11.2 Causes of Heteroskedasticity

Common Causes (p.389)

Direct:
• Learning effects
• etc.

Indirect:
• Omitted variables
• Outliers
• etc.
11.3 Consequences of Heteroskedasticity

1. Ordinary least squares estimators are still linear and unbiased.

2. Ordinary least squares estimators are not efficient.

3. The usual formulas give incorrect standard errors for the least squares estimators.

4. Confidence intervals and hypothesis tests based on the usual standard errors are wrong.
Yi = β̂1 + β̂2Xi + ei

Heteroskedasticity: Var(ei) = σi²

Formula for the ordinary least squares variance (homoskedastic disturbances):

    Var(β̂2) = σ² / Σ xi²

Formula for the ordinary least squares variance (heteroskedastic disturbances):

    Var(β̂2) = Σ xi²σi² / (Σ xi²)²

where xi = Xi − X̄ (deviations from the mean).
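The second formula follows in one step from writing the OLS slope as a weighted sum of the errors; this short derivation is added here for completeness and is not on the original slide:

```latex
% OLS slope as the true slope plus a weighted sum of the errors
\hat{\beta}_2 = \beta_2 + \sum_i w_i U_i,
\qquad w_i = \frac{x_i}{\sum_j x_j^{2}}, \qquad x_i = X_i - \bar{X}

% With independent errors and \operatorname{Var}(U_i) = \sigma_i^2:
\operatorname{Var}\bigl(\hat{\beta}_2\bigr)
  = \sum_i w_i^{2}\,\sigma_i^{2}
  = \frac{\sum_i x_i^{2}\sigma_i^{2}}{\bigl(\sum_i x_i^{2}\bigr)^{2}}

% Only when \sigma_i^2 = \sigma^2 for all i does this reduce to \sigma^2 / \sum_i x_i^2
```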

Therefore, when errors are heteroskedastic, ordinary least squares estimators are inefficient (i.e. not “best”).
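To illustrate consequences 3 and 4 in practice, the sketch below (not part of the original slides) compares the conventional standard error with the White heteroskedasticity-robust one on simulated data using statsmodels; the error scale 0.5·X is an assumed form chosen only to create heteroskedasticity.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
X = rng.uniform(1, 10, size=n)
U = rng.normal(scale=0.5 * X)          # heteroskedastic errors (assumed form)
Y = 2.0 + 0.8 * X + U

exog = sm.add_constant(X)              # regressor matrix [1, X]
conventional = sm.OLS(Y, exog).fit()             # usual homoskedastic formula
robust = sm.OLS(Y, exog).fit(cov_type="HC1")     # White heteroskedasticity-robust formula

print("conventional SE of slope:", round(conventional.bse[1], 4))
print("robust SE of slope      :", round(robust.bse[1], 4))
```

The gap between the two standard errors is exactly what makes confidence intervals and t-tests built from the usual formula unreliable when the errors are heteroskedastic.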
11.4 Detecting Heteroskedasticity

Yi = β̂1 + β̂2X2i + β̂3X3i + ei

ei²: the squared residuals provide proxies for Ui²

Preliminary Analysis

• Data: heteroskedasticity often occurs in cross-sectional data (exceptions: panel data)

• Graphical examination of residuals: plot ei or ei² against each explanatory variable or against predicted Y; see Fig. 11.8 and 11.9, pp. 402-3, and the sketch below
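A graphical check along these lines might look as follows; this is a sketch on simulated data (the variable names X2, X3 and the error form 0.4·X2 are assumptions for illustration only):

```python
import numpy as np
import matplotlib.pyplot as plt
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 300
X2 = rng.uniform(1, 10, size=n)
X3 = rng.uniform(0, 5, size=n)
Y = 1.0 + 0.5 * X2 + 0.3 * X3 + rng.normal(scale=0.4 * X2)   # assumed error form

exog = sm.add_constant(np.column_stack([X2, X3]))
fit = sm.OLS(Y, exog).fit()
e = fit.resid                           # residuals e_i, proxies for U_i

# Plot squared residuals against each regressor and against predicted Y
fig, axes = plt.subplots(1, 3, figsize=(12, 3))
axes[0].scatter(X2, e**2, s=8); axes[0].set_xlabel("X2"); axes[0].set_ylabel("e^2")
axes[1].scatter(X3, e**2, s=8); axes[1].set_xlabel("X3")
axes[2].scatter(fit.fittedvalues, e**2, s=8); axes[2].set_xlabel("predicted Y")
plt.tight_layout()
plt.show()
```

A systematic widening of ei² against one regressor (or against the fitted values) is the visual signature of heteroskedasticity.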
Residual Plots

Plot residuals against one variable at a time, after sorting the data by that variable, to try to find a heteroskedastic pattern in the data. See Fig. 11.8 and 11.9, pp. 402-3.
[Figure: residual plot of ei against Xi, scattered about the zero line]
Formal Tests for Heteroskedasticity
(see p.403 onwards)

White’s Test

1. Estimate Ŷi = β̂1 + β̂2X2i + β̂3X3i and obtain the residuals ei

2. Run the following auxiliary regression:

   ei² = A0 + A1X2i + A2X3i + A3X2i² + A4X3i² + A5X2iX3i + Vi

3. Calculate the White test statistic from the auxiliary regression:

   nR² ~ χ²(d.f.)

4. Obtain the critical value from the χ² distribution (d.f. = number of explanatory variables in the auxiliary regression, excluding the constant)

5. Decision rule: if the test statistic > critical χ² value, reject the null hypothesis of no heteroskedasticity
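These five steps can be reproduced directly in Python; the following sketch uses simulated data (the data-generating process is an assumption for illustration only). statsmodels also packages the same procedure as het_white in statsmodels.stats.diagnostic.

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(3)
n = 400
X2 = rng.uniform(1, 10, size=n)
X3 = rng.uniform(0, 5, size=n)
Y = 1.0 + 0.5 * X2 + 0.3 * X3 + rng.normal(scale=0.4 * X2)   # heteroskedastic by construction

# Step 1: estimate the original regression and keep the residuals
exog = sm.add_constant(np.column_stack([X2, X3]))
e = sm.OLS(Y, exog).fit().resid

# Step 2: auxiliary regression of e^2 on levels, squares and the cross product
aux_exog = sm.add_constant(np.column_stack([X2, X3, X2**2, X3**2, X2 * X3]))
aux = sm.OLS(e**2, aux_exog).fit()

# Steps 3-5: nR^2 statistic, chi-square critical value, decision rule
lm_stat = n * aux.rsquared
df = aux_exog.shape[1] - 1            # explanatory variables excluding the constant
crit = stats.chi2.ppf(0.95, df)
print(f"White statistic nR^2 = {lm_stat:.2f}, 5% chi2({df}) critical value = {crit:.2f}")
print("Reject H0 of no heteroskedasticity" if lm_stat > crit else "Do not reject H0")
```

Dropping the cross product term X2i·X3i from the auxiliary regression gives the "no cross terms" variant mentioned on the next slide.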
Eviews Session
• EViews provides a menu for running White's test: from the estimation window, choose View/Residual test/White Heteroscedasticity.

• To test for pure heteroskedasticity, use the version with no cross product terms.

• To test for heteroskedasticity and specification bias, use the version with cross product terms.

• See p.414.

Summary

In this lecture we have:

1. Analysed the theoretical causes and consequences of heteroskedasticity

2. Outlined a number of tests which can be used to detect the presence of heteroskedastic errors
