2. Simple versus Multiple Regression Coefficients. This is based on Baltagi (1987b). Consider the multiple regression

$Y_i = \alpha + \beta_2 X_{2i} + \beta_3 X_{3i} + u_i, \qquad i = 1, 2, \ldots, n,$

along with the following auxiliary regressions:

$X_{2i} = a + b X_{3i} + \nu_{2i}$
$X_{3i} = c + d X_{2i} + \nu_{3i}$

In Section 3, we showed that $\hat{\beta}_2$, the OLS estimate of $\beta_2$, can be interpreted as the coefficient of a simple regression of $Y_i$ on the OLS residuals $\hat{\nu}_{2i}$. A similar interpretation can be given to $\hat{\beta}_3$. Kennedy (1981, p. 416) claims that $\hat{\beta}_2$ is not necessarily the same as $\hat{\delta}_2$, the OLS estimate of $\delta_2$ obtained from the regression of $Y_i$ on $\hat{\nu}_{2i}$, $\hat{\nu}_{3i}$, and a constant, $Y_i = \gamma + \delta_2 \hat{\nu}_{2i} + \delta_3 \hat{\nu}_{3i} + w_i$. Prove this claim by finding a relationship between the $\hat{\delta}$'s and the $\hat{\beta}$'s. (A numerical check of the claim is sketched in the first code example below.)

15. Consider the simple regression with no constant: $Y_i = \beta X_i + u_i$, $i = 1, 2, \ldots, n$, where $u_i \sim \mathrm{IID}(0, \sigma^2)$ independent of $X_i$. Theil (1971) showed that among all linear estimators in $Y_i$, the minimum mean square estimator for $\beta$, i.e., that which minimizes $E(\tilde{\beta} - \beta)^2$, is given by

$\tilde{\beta} = \beta^2 \sum_{i=1}^{n} X_i Y_i \Big/ \Big( \beta^2 \sum_{i=1}^{n} X_i^2 + \sigma^2 \Big).$

(a) Show that $E(\tilde{\beta}) = \beta/(1 + c)$, where $c = \sigma^2 \big/ \beta^2 \sum_{i=1}^{n} X_i^2 > 0$.
(b) Conclude that $\mathrm{Bias}(\tilde{\beta}) = E(\tilde{\beta}) - \beta = -[c/(1+c)]\beta$. Note that this bias is positive (negative) when $\beta$ is negative (positive). This also means that $\tilde{\beta}$ is biased towards zero.
(c) Show that $\mathrm{MSE}(\tilde{\beta}) = E(\tilde{\beta} - \beta)^2 = \sigma^2 \big/ \big[ \sum_{i=1}^{n} X_i^2 + (\sigma^2/\beta^2) \big]$. Conclude that it is smaller than $\mathrm{MSE}(\hat{\beta}_{OLS}) = \sigma^2 \big/ \sum_{i=1}^{n} X_i^2$. (A Monte Carlo check of parts (a)-(c) is sketched in the second code example below.)

2. For the simple linear regression with heteroskedasticity of the form $E(u_i^2) = \sigma_i^2 = \sigma^2 x_i^b$, show that $E\big[ s^2 / \sum_{i=1}^{n} x_i^2 \big]$ understates the variance of $\hat{\beta}_{OLS}$, which is $\sum_{i=1}^{n} x_i^2 \sigma_i^2 \big/ \big( \sum_{i=1}^{n} x_i^2 \big)^2$, where $b > 0$. (The third code example below sketches a simulation of this understatement.)
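The following Python sketch checks Kennedy's claim numerically. The data-generating process, sample size, and coefficient values are illustrative assumptions, not part of the problem; the point is only that regressing $Y$ on $\hat{\nu}_2$ alone reproduces $\hat{\beta}_2$, while the multiple regression of $Y$ on $\hat{\nu}_2$, $\hat{\nu}_3$, and a constant generally yields $\hat{\delta}_2 \neq \hat{\beta}_2$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Illustrative data-generating process (all values are assumptions).
X2 = rng.normal(size=n)
X3 = 0.6 * X2 + rng.normal(size=n)      # correlated with X2 on purpose
y = 1.0 + 2.0 * X2 - 1.5 * X3 + rng.normal(size=n)

def resid(dep, regressor):
    """OLS residuals from regressing dep on a constant and one regressor."""
    Z = np.column_stack([np.ones(len(regressor)), regressor])
    coef, *_ = np.linalg.lstsq(Z, dep, rcond=None)
    return dep - Z @ coef

# Auxiliary regressions: nu2 and nu3 are the residuals nu2-hat and nu3-hat.
nu2 = resid(X2, X3)
nu3 = resid(X3, X2)

# Multiple regression of y on a constant, X2, X3: gives beta2-hat, beta3-hat.
beta_hat, *_ = np.linalg.lstsq(np.column_stack([np.ones(n), X2, X3]), y, rcond=None)

# Simple regression of y on nu2 alone reproduces beta2-hat (the Section 3 result).
b2_simple = (nu2 @ y) / (nu2 @ nu2)

# Regression of y on a constant, nu2, nu3: gives delta2-hat, delta3-hat.
delta_hat, *_ = np.linalg.lstsq(np.column_stack([np.ones(n), nu2, nu3]), y, rcond=None)

print("beta2-hat     :", beta_hat[1])
print("y on nu2 alone:", b2_simple)     # matches beta2-hat
print("delta2-hat    :", delta_hat[1])  # differs from beta2-hat in general
```

The discrepancy arises because $\hat{\nu}_2$ and $\hat{\nu}_3$ are residuals from two different auxiliary regressions and are correlated with each other, so the multiple-regression coefficient on $\hat{\nu}_2$ no longer coincides with the simple-regression one.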
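A minimal Monte Carlo sketch of parts (a)-(c). The values of $n$, $\beta$, $\sigma$, and the regressor design are illustrative assumptions, not from the text; note that $\tilde{\beta}$ is not a feasible estimator, since it uses the true $\beta$ and $\sigma^2$.

```python
import numpy as np

rng = np.random.default_rng(1)
n, beta, sigma = 10, 1.0, 2.0            # illustrative values (assumptions)
X = rng.uniform(0.0, 1.0, size=n)        # fixed regressor draws, no constant
Sxx = np.sum(X**2)

reps = 20_000
Y = beta * X + rng.normal(0.0, sigma, size=(reps, n))  # reps Monte Carlo samples
Sxy = Y @ X                                            # sum X_i Y_i, per sample

beta_ols = Sxy / Sxx
# Theil's minimum-MSE linear estimator; infeasible, as it uses beta and sigma^2.
beta_mmse = beta**2 * Sxy / (beta**2 * Sxx + sigma**2)

c = sigma**2 / (beta**2 * Sxx)
print("mean(beta_mmse):", beta_mmse.mean(), " theory beta/(1+c):", beta / (1 + c))
print("bias:", beta_mmse.mean() - beta, " theory -[c/(1+c)]beta:", -c / (1 + c) * beta)
print("MSE(beta_mmse):", np.mean((beta_mmse - beta)**2),
      " theory:", sigma**2 / (Sxx + sigma**2 / beta**2))
print("MSE(beta_ols):", np.mean((beta_ols - beta)**2),
      " theory sigma^2/Sxx:", sigma**2 / Sxx)
```

With $\beta > 0$ the simulated mean of $\tilde{\beta}$ falls below $\beta$ (shrinkage towards zero), and its MSE comes in below that of $\hat{\beta}_{OLS}$, matching the formulas in (a)-(c).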
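A simulation sketch of the understatement in the heteroskedasticity problem. All design values here are assumptions for illustration: the regressor is taken positive and the power $b = 2$ is applied to its level $X_i$ (so that $X_i^b$ is well defined), while $x_i = X_i - \bar{X}$ denotes deviations from the mean as in the variance formula.

```python
import numpy as np

rng = np.random.default_rng(2)
n, alpha, beta, sigma2, b = 50, 1.0, 2.0, 1.0, 2.0  # illustrative assumptions
X = rng.uniform(1.0, 5.0, size=n)    # positive regressor, held fixed across reps
x = X - X.mean()                     # deviations from the mean
Sxx = np.sum(x**2)
sig2_i = sigma2 * X**b               # heteroskedastic variances, rising in X

# True sampling variance of the OLS slope under heteroskedasticity.
true_var = np.sum(x**2 * sig2_i) / Sxx**2

reps = 20_000
U = rng.normal(0.0, np.sqrt(sig2_i), size=(reps, n))
Y = alpha + beta * X + U
Ydev = Y - Y.mean(axis=1, keepdims=True)
b_hat = (Ydev @ x) / Sxx                    # OLS slope, one per replication
e = Ydev - np.outer(b_hat, x)               # OLS residuals
s2 = np.sum(e**2, axis=1) / (n - 2)
naive = s2 / Sxx                            # conventional OLS variance estimate

print("E[s^2 / sum x_i^2] (simulated):", naive.mean())
print("true var(beta_ols-hat)        :", true_var)  # larger: naive understates
```

The understatement appears because $\sigma_i^2$ rises with the regressor when $b > 0$, so the true variance weights the large-variance observations by their large $x_i^2$, while $s^2$ averages the $\sigma_i^2$ roughly uniformly.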
