
Econometric Theory and Methods


Solution to Exercise 15.11

15.11 Consider the classical linear regression model

\[
y_t = \beta_1 + \beta_2 x_{t2} + \beta_3 x_{t3} + u_t, \quad u_t \sim \mathrm{NID}(0, \sigma^2),
\]

where $x_{t2}$ and $x_{t3}$ are exogenous variables, and there are $n$ observations. Write down the contribution to the loglikelihood made by the $t$th observation. Then calculate the matrix $M(\hat{\theta})$, of which the typical element is expression (15.36) evaluated at the ML estimates. How many columns does this matrix have? What is a typical element of each of the columns? Explain how to compute an information matrix test for this model using the OPG regression (15.25). How many regressors does the test regression have? What test statistic would you use, and how many degrees of freedom does it have? What types of misspecification is this test sensitive to?

For this model, the contribution to the loglikelihood for observation $t$ is

\[
\ell_t(\beta, \sigma) = -\frac{1}{2}\log(2\pi) - \log\sigma - \frac{1}{2\sigma^2}\,(y_t - \beta_1 - \beta_2 x_{t2} - \beta_3 x_{t3})^2.
\]

The first derivatives of $\ell_t(\beta, \sigma)$ are

\[
\begin{aligned}
\frac{\partial \ell_t}{\partial \beta_1} &= \frac{1}{\sigma^2}\,(y_t - \beta_1 - \beta_2 x_{t2} - \beta_3 x_{t3}),\\
\frac{\partial \ell_t}{\partial \beta_2} &= \frac{1}{\sigma^2}\,(y_t - \beta_1 - \beta_2 x_{t2} - \beta_3 x_{t3})\,x_{t2},\\
\frac{\partial \ell_t}{\partial \beta_3} &= \frac{1}{\sigma^2}\,(y_t - \beta_1 - \beta_2 x_{t2} - \beta_3 x_{t3})\,x_{t3}, \quad\text{and}\\
\frac{\partial \ell_t}{\partial \sigma} &= -\frac{1}{\sigma} + \frac{1}{\sigma^3}\,(y_t - \beta_1 - \beta_2 x_{t2} - \beta_3 x_{t3})^2.
\end{aligned}
\]  (S15.21)

Let $x_{t1} = 1$ for all $t$. Then the second derivatives of $\ell_t(\beta, \sigma)$ are

\[
\begin{aligned}
\frac{\partial^2 \ell_t}{\partial \beta_i\,\partial \beta_j} &= -\frac{1}{\sigma^2}\,x_{ti} x_{tj}, \quad\text{for } i, j = 1, \ldots, 3,\ i \le j,\\
\frac{\partial^2 \ell_t}{\partial \beta_i\,\partial \sigma} &= -\frac{2}{\sigma^3}\,(y_t - \beta_1 - \beta_2 x_{t2} - \beta_3 x_{t3})\,x_{ti}, \quad\text{for } i = 1, \ldots, 3, \quad\text{and}\\
\frac{\partial^2 \ell_t}{\partial \sigma^2} &= \frac{1}{\sigma^2} - \frac{3}{\sigma^4}\,(y_t - \beta_1 - \beta_2 x_{t2} - \beta_3 x_{t3})^2.
\end{aligned}
\]

The matrix $\hat{M}$ has $\frac{1}{2}k(k+1)$ columns, where $k$ is the number of parameters. Since $k = 4$ here, the matrix $\hat{M}$ has 10 columns, although, as we will see,


one of these must be dropped because it is perfectly collinear with one of the columns of $\hat{G}$.

Let $\hat{u}_t$ denote the $t$th OLS residual, and let $\hat{\sigma}$ denote the ML estimate of $\sigma$. Then the six columns of $\hat{M}$ that correspond to $\beta_i$ and $\beta_j$, for $i, j = 1, \ldots, 3$, $i \le j$, have typical elements

\[
\frac{1}{\hat{\sigma}^4}\,\hat{u}_t^2\,x_{ti} x_{tj} - \frac{1}{\hat{\sigma}^2}\,x_{ti} x_{tj}.
\]  (S15.22)

The three columns of $\hat{M}$ that correspond to $\beta_i$ and $\sigma$, for $i = 1, \ldots, 3$, have typical elements

\[
\frac{1}{\hat{\sigma}^5}\,\hat{u}_t^3\,x_{ti} - \frac{3}{\hat{\sigma}^3}\,\hat{u}_t\,x_{ti}.
\]  (S15.23)

Finally, the column of $\hat{M}$ that corresponds to $\sigma$ alone has typical element

\[
\frac{2}{\hat{\sigma}^2} - \frac{5}{\hat{\sigma}^4}\,\hat{u}_t^2 + \frac{1}{\hat{\sigma}^6}\,\hat{u}_t^4.
\]  (S15.24)
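As a numerical check on these formulas (not part of the original solution), the following NumPy sketch builds the score matrix $\hat{G}$ from (S15.21) and all ten columns of $\hat{M}$ from (S15.22)–(S15.24) on simulated data. The sample size, parameter values, random seed, and variable names are illustrative assumptions.

```python
import numpy as np
from itertools import combinations_with_replacement

# Simulated data for y_t = b1 + b2*x_t2 + b3*x_t3 + u_t (illustrative values).
rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
y = X @ np.array([1.0, 0.5, -0.3]) + rng.normal(size=n)

# ML estimates: OLS coefficients and sigma_hat^2 = SSR/n (no df correction).
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
u = y - X @ beta_hat
s = np.sqrt(u @ u / n)  # sigma_hat

# G_hat: row t holds the four first derivatives in (S15.21).
G = np.column_stack([u[:, None] * X / s**2, -1 / s + u**2 / s**3])

# M_hat: six (beta_i, beta_j) columns (S15.22), three (beta_i, sigma)
# columns (S15.23), and one sigma column (S15.24).
cols = [u**2 * X[:, i] * X[:, j] / s**4 - X[:, i] * X[:, j] / s**2
        for i, j in combinations_with_replacement(range(3), 2)]
cols += [u**3 * X[:, i] / s**5 - 3 * u * X[:, i] / s**3 for i in range(3)]
cols.append(2 / s**2 - 5 * u**2 / s**4 + u**4 / s**6)
M = np.column_stack(cols)

# The score columns sum to (numerically) zero at the ML estimates,
# and M_hat has k(k+1)/2 = 10 columns for k = 4.
print(np.abs(G.sum(axis=0)).max(), M.shape)
```

The first-order conditions make each column of $\hat{G}$ sum to zero at the ML estimates, which is a useful sanity check on the score formulas before running the test regression.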

To set up the OPG testing regression (15.25), we evaluate all the variables at the ML estimates $\hat{\beta}$ and $\hat{\sigma}$. For the matrix $G(\hat{\theta})$, we use regressors of which the typical elements are the expressions on the right-hand sides of equations (S15.21). These are the same as the regressors of the OPG regression (15.28) that corresponds to the linear regression model. For the matrix $M(\hat{\theta})$, we use all but one of the columns of the matrix $\hat{M}$. We do not use all of the columns because, as noted above, one of the columns of $\hat{M}$ is collinear with one of the columns of $G(\hat{\theta})$. In fact, the column of $\hat{M}$ that corresponds to $\beta_1$ alone can be written, using expression (S15.22), as

\[
\frac{1}{\hat{\sigma}^4}\,\hat{u}_t^2 - \frac{1}{\hat{\sigma}^2}.
\]  (S15.25)

This is perfectly collinear with the column of $\hat{G}$ that corresponds to $\sigma$, which, from (S15.21), has typical element

\[
-\frac{1}{\hat{\sigma}} + \frac{1}{\hat{\sigma}^3}\,\hat{u}_t^2.
\]

It is evident that $1/\hat{\sigma}$ times this expression is equal to expression (S15.25). Thus there are only 9 test regressors in the OPG testing regression, and the test statistic, of which the simplest form is just the explained sum of squares from the testing regression, has 9 degrees of freedom; under correct specification, it is asymptotically distributed as $\chi^2(9)$.

Asymptotically, we would expect $n^{-1}$ times the inner products of the nine columns that have typical elements given by expressions (S15.22) through (S15.24) with a vector of 1s to converge to the expectation of each of these expressions when we replace $\hat{u}_t$ by $u_t$ and $\hat{\sigma}$ by $\sigma$. Therefore, finding these

Copyright © 2003, Russell Davidson and James G. MacKinnon
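The full procedure can be sketched numerically as follows (again on simulated data with hypothetical names, not the authors' code): drop the $(\beta_1, \beta_1)$ column of $\hat{M}$, regress a vector of 1s on $[\hat{G}, \hat{M}]$, and take the explained sum of squares as the statistic, to be compared with $\chi^2(9)$ critical values.

```python
import numpy as np
from itertools import combinations_with_replacement

# Simulated data satisfying the null hypothesis (illustrative values).
rng = np.random.default_rng(1)
n = 1000
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
y = X @ np.array([1.0, 0.5, -0.3]) + rng.normal(size=n)
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
u = y - X @ beta_hat
s = np.sqrt(u @ u / n)  # ML estimate of sigma

# Scores (S15.21) and the nine usable columns of M_hat: the
# (beta_1, beta_1) column is skipped because it is collinear with
# the sigma column of G_hat.
G = np.column_stack([u[:, None] * X / s**2, -1 / s + u**2 / s**3])
cols = [u**2 * X[:, i] * X[:, j] / s**4 - X[:, i] * X[:, j] / s**2
        for i, j in combinations_with_replacement(range(3), 2)
        if (i, j) != (0, 0)]
cols += [u**3 * X[:, i] / s**5 - 3 * u * X[:, i] / s**3 for i in range(3)]
cols.append(2 / s**2 - 5 * u**2 / s**4 + u**4 / s**6)
M = np.column_stack(cols)

# Sanity check: 1/sigma_hat times the sigma column of G_hat reproduces
# the dropped (beta_1, beta_1) column, as shown in the text.
assert np.allclose(G[:, 3] / s, u**2 / s**4 - 1 / s**2)

# OPG artificial regression: regress a vector of 1s on [G_hat, M_hat].
# The statistic is the explained sum of squares; since the regressand
# is a vector of 1s, ESS = n - SSR.  Compare with chi^2(9).
Z = np.column_stack([G, M])
coef, *_ = np.linalg.lstsq(Z, np.ones(n), rcond=None)
ess = n - np.sum((np.ones(n) - Z @ coef) ** 2)
print(round(ess, 2))
```

Because the four score columns of $\hat{G}$ each have zero inner product with the vector of 1s at the ML estimates, only the nine $\hat{M}$ columns contribute to the explained sum of squares, which is why the statistic has 9 degrees of freedom.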