
Econometrics

Problem Set 2 - 18/2/2022

• Exercise 1 To assess the goodness of fit of a regression model, one
usually computes the coefficient of determination, also known as the
(centered) R^2. It is obtained by writing the sampling variance of y as a
function of the residuals, and is defined as
R^2 = 1 - (Σ_{i=1}^n e_i^2) / (Σ_{i=1}^n (y_i - ȳ)^2).

Prove that

Σ_{i=1}^n (y_i - ȳ)^2 = Σ_{i=1}^n (ŷ_i - ȳ)^2 + Σ_{i=1}^n e_i^2

where ȳ is the sample average of y, e_i are the residuals and ŷ_i are the
predicted values. (Hint: the decomposition relies on the regression
including a constant, so that the residuals sum to zero and are orthogonal
to the predicted values.)
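As a sanity check on this decomposition, here is a minimal numerical sketch
using numpy on simulated data; the data-generating process and all variable
names are illustrative assumptions, not part of the exercise.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)

# OLS with a constant: X = [1, x]
X = np.column_stack([np.ones(n), x])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ b
e = y - y_hat

tss = np.sum((y - y.mean()) ** 2)      # total sum of squares
ess = np.sum((y_hat - y.mean()) ** 2)  # explained sum of squares
rss = np.sum(e ** 2)                   # residual sum of squares

print(np.isclose(tss, ess + rss))      # True: the decomposition holds
print(1 - rss / tss)                   # the centered R^2
```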
• Exercise 2
Consider the following partitioned regression

y = x_1 β_1 + X_2 β_2 + ε    (1)

where x_1 is a vector containing the first non-constant regressor, β_1 is a
scalar, X_2 is the matrix of the last K - 1 regressors (possibly including
the constant) and β_2 is the vector of the remaining K - 1 parameters.
In this exercise you are required to prove that

Var(b_1 | X) = σ^2 / (Var(x_1)(1 - R_j^2))

where Var(x_1) is the sample variance of x_1 and R_j^2 is the coefficient of
determination of the following regression

x_1 = X_2 γ + u    (2)

This result suggests that the precision of the estimator depends on the
variance of the regressor and is inversely related to the strength of the
linear association between x_1 and the columns of X_2.

i Use the Frisch-Waugh-Lovell theorem to compute the estimator b_1
of β_1 for eq. (1). (Hint: instead of using ỹ, use y.)
ii Compute the sampling error of b_1 and evaluate Var(b_1 | X) as a
function of x_1 and M_2
iii Noting that x_1' M_2 x_1 is a scalar, write Var(b_1 | X) as a function
of the squared residuals of (2) to prove that
Var(b_1 | X) = σ^2 / (Var(x_1)(1 - R_j^2)) (see the numerical sketch after
this list)

iv Compare the previous result with Var(b_1 | X) from the simple regression
y = α + β_1 x_1 + u
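The following sketch verifies the identity behind items i-iii numerically,
reading the exercise's Var(x_1) as the sum of squared deviations
SST_1 = Σ_i (x_{1i} - x̄_1)^2 (if Var(x_1) is instead defined as SST_1/n, an
extra n appears in the denominator). The simulated design is an illustrative
assumption.

```python
import numpy as np

rng = np.random.default_rng(1)
n, sigma2 = 200, 1.0
x2 = rng.normal(size=n)
x1 = 0.8 * x2 + rng.normal(size=n)      # x1 correlated with x2

X2 = np.column_stack([np.ones(n), x2])  # X_2 includes the constant
M2 = np.eye(n) - X2 @ np.linalg.inv(X2.T @ X2) @ X2.T

# From FWL (items i-ii): Var(b1 | X) = sigma^2 / (x1' M2 x1)
var_b1 = sigma2 / (x1 @ M2 @ x1)

# Item iii: x1' M2 x1 = SST_1 * (1 - Rj^2), with u the residuals of (2)
u = M2 @ x1
sst1 = np.sum((x1 - x1.mean()) ** 2)    # sum of squared deviations of x1
rj2 = 1 - np.sum(u ** 2) / sst1         # centered R^2 of regression (2)
print(np.isclose(var_b1, sigma2 / (sst1 * (1 - rj2))))  # True
```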

• Exercise 3

1 Consider the partitioned regression

y = X_1 β_1 + X_2 β_2 + ε

and define P_1 = X_1 (X_1' X_1)^{-1} X_1' and M_1 = I - P_1. Also let
ỹ = M_1 y and X̃_2 = M_1 X_2. Prove that the residuals from the regression
of ỹ on X̃_2 are numerically equal to e = y - X_1 b_1 - X_2 b_2 = y - Xb
(a numerical check is sketched after this list).
(Hint: note that y = X_1 b_1 + X_2 b_2 + e and premultiply it by M_1.)
2 Prove that for a general regression model, y'y = ŷ'ŷ + e'e, where
ŷ = Xb. Apply this result to the model ỹ = X̃_2 β_2 + ε_2 to prove
that

ỹ'ỹ - e'e = ỹ' X_2 (X_2' M_1 X_2)^{-1} X_2' ỹ
3 Consider the following sequence of regressions and for each of them
compute the sum of squared residuals (SSR):
(a) Regress ỹ on X_1
(b) Regress ỹ on X̃_2
(c) Regress ỹ on X_1 and X_2
(d) Regress ỹ on X_2
Prove that for the model in (a), SSR = ỹ'ỹ (what is the implication
of this result?); for the model in (b), SSR = e'e; for (c), SSR = e'e
(use the FWL theorem); finally, get some data and verify that in
regression (d), SSR ≠ e'e.
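The sketch below checks items 1-3 on simulated data: the FWL residual
equality, the sum-of-squares identity, and the SSRs of regressions (a)-(d).
The design (with X_1 containing the constant) and all names are illustrative
assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 150
X1 = np.column_stack([np.ones(n), rng.normal(size=n)])  # X_1 has the constant
X2 = rng.normal(size=(n, 2))
y = X1 @ np.array([1.0, 0.5]) + X2 @ np.array([2.0, -1.0]) + rng.normal(size=n)

M1 = np.eye(n) - X1 @ np.linalg.inv(X1.T @ X1) @ X1.T
y_t, X2_t = M1 @ y, M1 @ X2  # y tilde and X_2 tilde

def resid(y, X):
    """OLS residuals from regressing y on X."""
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ b

def ssr(y, X):
    """Sum of squared residuals from regressing y on X."""
    r = resid(y, X)
    return r @ r

# Item 1: residuals of y tilde on X_2 tilde equal the full-regression e
e = resid(y, np.column_stack([X1, X2]))
print(np.allclose(resid(y_t, X2_t), e))                        # True

# Item 2: y_t'y_t - e'e = y_t' X2 (X2' M1 X2)^{-1} X2' y_t
lhs = y_t @ y_t - e @ e
rhs = y_t @ X2 @ np.linalg.solve(X2.T @ M1 @ X2, X2.T @ y_t)
print(np.isclose(lhs, rhs))                                    # True

# Item 3: SSRs of regressions (a)-(d)
print(np.isclose(ssr(y_t, X1), y_t @ y_t))                     # (a) SSR = y_t'y_t
print(np.isclose(ssr(y_t, X2_t), e @ e))                       # (b) SSR = e'e
print(np.isclose(ssr(y_t, np.column_stack([X1, X2])), e @ e))  # (c) SSR = e'e
print(np.isclose(ssr(y_t, X2), e @ e))                         # (d) generally False
```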

• Exercise 4 Complete the sketch of the proof begun in class to show
that the F test computed through the Wald principle is equivalent to its
likelihood-ratio counterpart, based on the comparison between

y = Xβ + ε
y = Xβ + v,  subject to Rβ = r
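As a numerical companion, the sketch below computes the F statistic both in
its Wald form and in its SSR-comparison (likelihood-ratio) form for a single
linear restriction, and shows they coincide exactly. The data-generating
process and the restriction tested are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(4)
n, k = 120, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
y = X @ np.array([1.0, 0.5, -0.3]) + rng.normal(size=n)

# Illustrative hypothesis H0: beta_2 = beta_3, i.e. R beta = r
R = np.array([[0.0, 1.0, -1.0]])
r = np.array([0.0])
q = R.shape[0]                          # number of restrictions

XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ X.T @ y                   # unrestricted OLS
e = y - X @ b
s2 = (e @ e) / (n - k)

# Wald form of the F statistic
d = R @ b - r
F_wald = (d @ np.linalg.solve(R @ XtX_inv @ R.T, d)) / (q * s2)

# Restricted LS, then the SSR-comparison (likelihood-ratio) form
b_r = b - XtX_inv @ R.T @ np.linalg.solve(R @ XtX_inv @ R.T, d)
e_r = y - X @ b_r
F_lr = ((e_r @ e_r - e @ e) / q) / s2

print(np.isclose(F_wald, F_lr))         # True: the two forms coincide
```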
