Question 1
Consider the simultaneous equations model
$$y_{1i} = \alpha + \beta y_{2i} + u_i,$$
$$y_{2i} = y_{1i} + z_i,$$
where $z_i \sim \text{iid}(\mu_z, \sigma_z^2)$ and $u_i \sim \text{iid}(0, \sigma_u^2)$. Assume $\beta \neq 1$. Furthermore, the OLS estimator of $\beta$ equals
$$\hat{\beta} = \frac{\sum_{i=1}^{N} (y_{2i} - \bar{y}_2)(y_{1i} - \bar{y}_1)}{\sum_{i=1}^{N} (y_{2i} - \bar{y}_2)^2}.$$
a. Derive the reduced form for $y_{2i}$.
b. Show that $\operatorname{plim} \hat{\beta} = \beta + \dfrac{(1 - \beta)\,\sigma_u^2}{\sigma_z^2 + \sigma_u^2}$.
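A brief sketch of the reasoning behind parts a and b, under the setup above and assuming in addition that $z_i$ and $u_i$ are independent (which the plim expression requires): substituting the first equation into the second gives
$$y_{2i} = \alpha + \beta y_{2i} + u_i + z_i \quad\Longrightarrow\quad y_{2i} = \frac{\alpha + z_i + u_i}{1 - \beta},$$
which is the reduced form and shows why $\beta \neq 1$ is needed. From the reduced form, $\operatorname{Cov}(y_{2i}, u_i) = \sigma_u^2/(1 - \beta)$ and $\operatorname{Var}(y_{2i}) = (\sigma_z^2 + \sigma_u^2)/(1 - \beta)^2$, so
$$\operatorname{plim} \hat{\beta} = \frac{\operatorname{Cov}(y_{2i}, y_{1i})}{\operatorname{Var}(y_{2i})} = \beta + \frac{\operatorname{Cov}(y_{2i}, u_i)}{\operatorname{Var}(y_{2i})} = \beta + \frac{(1 - \beta)\,\sigma_u^2}{\sigma_z^2 + \sigma_u^2}.$$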
Question 2
Consider the linear regression model $y_i = x_i'\beta + u_i$, where $x_i'$ is a $1 \times K$ vector of explanatory variables and $i = 1, \ldots, N$. Assume $(y_i, x_i)$ is iid and $u_i \mid x_i \sim \text{iid}(0, \sigma^2)$. Stacking the observations we have $y = X\beta + u$, and the OLS estimator of $\beta$ equals $\hat{\beta} = (X'X)^{-1} X'y$.
a. Show that
$$N^{1/2}(\hat{\beta} - \beta) = \left(N^{-1} X'X\right)^{-1} N^{-1/2} \sum_{i=1}^{N} x_i u_i.$$
b. Derive the limiting distribution of
$$N^{-1/2} \sum_{i=1}^{N} x_i u_i = N^{1/2}\,\frac{1}{N} \sum_{i=1}^{N} x_i u_i,$$
and show that its asymptotic variance equals $\sigma^2 E\left[x_i x_i'\right]$.
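A sketch of the algebra for part a and the argument for part b, under the assumptions stated above: since $\hat{\beta} = \beta + (X'X)^{-1} X'u$, we have
$$N^{1/2}(\hat{\beta} - \beta) = \left(N^{-1} X'X\right)^{-1} N^{-1/2} X'u = \left(N^{-1} X'X\right)^{-1} N^{-1/2} \sum_{i=1}^{N} x_i u_i.$$
For part b, $E[x_i u_i] = E[x_i\, E(u_i \mid x_i)] = 0$ and $\operatorname{Var}(x_i u_i) = E[u_i^2 x_i x_i'] = \sigma^2 E[x_i x_i']$, so a central limit theorem for iid sequences applied to $N^{-1/2} \sum_{i=1}^{N} x_i u_i$ yields a normal limit with mean zero and variance $\sigma^2 E[x_i x_i']$.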
Question 3
Consider the binary variable yi with density
$$f(y_i \mid x_i; \beta) = \begin{cases} p_i & \text{if } y_i = 1, \\ 1 - p_i & \text{if } y_i = 0, \end{cases}$$
where $p_i = \dfrac{\exp(x_i'\beta)}{1 + \exp(x_i'\beta)}$, the logit model. Assume the model is correctly specified and let $\beta_0$ denote the true value of $\beta$. Suppose we have a random sample $(y_i, x_i)$, $i = 1, \ldots, N$, and we want to estimate $\beta$ by Maximum Likelihood. The log-likelihood function is
$$L_N(\beta) = \sum_{i=1}^{N} \left[ y_i \ln p_i + (1 - y_i) \ln(1 - p_i) \right].$$
a. Derive that the score vector is
$$\frac{\partial L_N(\beta)}{\partial \beta} = \sum_{i=1}^{N} (y_i - p_i)\, x_i.$$
b. Derive that $E\left[\dfrac{\partial L_N(\beta_0)}{\partial \beta}\right] = 0$.
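A sketch of both derivations under the logit specification above: since $\ln p_i = x_i'\beta - \ln\left(1 + \exp(x_i'\beta)\right)$ and $\ln(1 - p_i) = -\ln\left(1 + \exp(x_i'\beta)\right)$, differentiating gives $\partial \ln p_i / \partial \beta = (1 - p_i)\, x_i$ and $\partial \ln(1 - p_i) / \partial \beta = -p_i\, x_i$, so
$$\frac{\partial L_N(\beta)}{\partial \beta} = \sum_{i=1}^{N} \left[ y_i (1 - p_i) - (1 - y_i)\, p_i \right] x_i = \sum_{i=1}^{N} (y_i - p_i)\, x_i.$$
For part b, correct specification means $E(y_i \mid x_i) = p_i$ when evaluated at $\beta_0$, so by the law of iterated expectations $E\left[(y_i - p_i)\, x_i\right] = E\left[x_i\, E(y_i - p_i \mid x_i)\right] = 0$ at $\beta = \beta_0$.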