

4-8. We know that Σe_i² = Σ(Y_i − Ŷ_i)² = Σ(Y_i − β̂₀ − β̂₁X_i)². To find the minimum, differentiate Σe_i² with respect to β̂₀ and β̂₁ and set each derivative equal to zero (these are the "normal equations"):

∂(Σe_i²)/∂β̂₀ = −2Σ(Y_i − β̂₀ − β̂₁X_i) = 0

or ΣY_i = N(β̂₀) + β̂₁(ΣX_i)

∂(Σe_i²)/∂β̂₁ = −2Σ[(Y_i − β̂₀ − β̂₁X_i)X_i] = 0

or ΣY_iX_i = β̂₀(ΣX_i) + β̂₁(ΣX_i²)
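The normal equations are just a 2×2 linear system in β̂₀ and β̂₁. As a rough illustration (not part of the text's solution, and using made-up data), the sketch below solves that system numerically:

```python
# A minimal sketch, assuming hypothetical data: solve the two normal
# equations for beta0_hat and beta1_hat as a 2x2 linear system.
import numpy as np

X = np.array([2.0, 4.0, 5.0, 7.0, 9.0])   # hypothetical X_i
Y = np.array([3.0, 6.0, 6.5, 9.0, 11.0])  # hypothetical Y_i
N = len(X)

# [ N       sum(X)   ] [beta0_hat]   [ sum(Y)   ]
# [ sum(X)  sum(X^2) ] [beta1_hat] = [ sum(X*Y) ]
A = np.array([[N, X.sum()],
              [X.sum(), (X ** 2).sum()]])
b = np.array([Y.sum(), (X * Y).sum()])

beta0_hat, beta1_hat = np.linalg.solve(A, b)
print(beta0_hat, beta1_hat)
```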

Solve the two equations simultaneously and rearrange:

β̂₁ = [N(ΣY_iX_i) − ΣY_iΣX_i]/[N(ΣX_i²) − (ΣX_i)²] = Σx_iy_i/Σx_i²

where x_i = (X_i − X̄) and y_i = (Y_i − Ȳ), and

β̂₀ = [ΣX_i²ΣY_i − ΣX_iΣX_iY_i]/[N(ΣX_i²) − (ΣX_i)²] = Ȳ − β̂₁X̄
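To see that the two forms of β̂₁ agree, a small sketch (same hypothetical data as above) computes the coefficient both in raw-sum form and in deviation form, and β̂₀ from Ȳ − β̂₁X̄:

```python
# A minimal sketch, assuming the same hypothetical data as above:
# closed-form least-squares estimates in raw-sum and deviation form.
import numpy as np

X = np.array([2.0, 4.0, 5.0, 7.0, 9.0])
Y = np.array([3.0, 6.0, 6.5, 9.0, 11.0])
N = len(X)

# Raw-sum form: [N*sum(XY) - sum(Y)*sum(X)] / [N*sum(X^2) - (sum(X))^2]
beta1_raw = (N * (X * Y).sum() - Y.sum() * X.sum()) / (N * (X ** 2).sum() - X.sum() ** 2)

# Deviation form: sum(x*y) / sum(x^2), with x = X - Xbar, y = Y - Ybar
x, y = X - X.mean(), Y - Y.mean()
beta1_dev = (x * y).sum() / (x ** 2).sum()

beta0_hat = Y.mean() - beta1_dev * X.mean()
print(beta1_raw, beta1_dev, beta0_hat)   # the two beta1 expressions coincide
```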

To prove linearity:

β̂₁ = Σx_iy_i/Σx_i² = Σx_i(Y_i − Ȳ)/Σx_i²
    = Σx_iY_i/Σx_i² − Σx_iȲ/Σx_i²
    = Σx_iY_i/Σx_i² − ȲΣx_i/Σx_i²
    = Σx_iY_i/Σx_i², since Σx_i = 0
    = Σk_iY_i, where k_i = x_i/Σx_i²

Thus β̂₁ is a linear function of the Y_i, which is how a linear estimator is defined. It is also a linear function of the βs and ε, which is the basic interpretation of linearity:

β̂₁ = β₀Σk_i + β₁Σk_iX_i + Σk_iε_i

Similarly, β̂₀ = Ȳ − β̂₁X̄, where Ȳ = β̂₀ + β̂₁X̄, which is also a linear equation.
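The linearity claim is easy to check numerically: with the same hypothetical data, β̂₁ computed as the weighted sum Σk_iY_i matches the ratio Σx_iy_i/Σx_i² exactly.

```python
# A minimal sketch, assuming the same hypothetical data: beta1_hat as a
# weighted sum of the Y_i with weights k_i = x_i / sum(x_i^2).
import numpy as np

X = np.array([2.0, 4.0, 5.0, 7.0, 9.0])
Y = np.array([3.0, 6.0, 6.5, 9.0, 11.0])

x, y = X - X.mean(), Y - Y.mean()
k = x / (x ** 2).sum()                        # least-squares weights k_i

beta1_from_weights = (k * Y).sum()            # sum(k_i * Y_i)
beta1_from_ratio = (x * y).sum() / (x ** 2).sum()
print(beta1_from_weights, beta1_from_ratio)   # identical up to rounding
```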

To prove unbiasedness:

β̂₁ = Σk_iY_i = Σk_i(β₀ + β₁X_i + ε_i)
    = β₀Σk_i + β₁Σk_iX_i + Σk_iε_i

Since k_i = x_i/Σx_i² = (X_i − X̄)/Σ(X_i − X̄)², it follows that Σk_i = 0, Σk_i² = 1/Σx_i², and Σk_ix_i = Σk_iX_i = 1.

So β̂₁ = β₁ + Σk_iε_i, and given the assumptions about ε, E(β̂₁) = β₁ + Σk_iE(ε_i) = β₁, proving that β̂₁ is unbiased.
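A quick Monte Carlo check of unbiasedness (a sketch with assumed true values β₀ = 1, β₁ = 2, σ = 1, fixed X_i, and classical errors): the average of β̂₁ over many simulated samples should be close to β₁, and the weight identities used above hold for the fixed X_i.

```python
# A minimal sketch, assuming beta0 = 1, beta1 = 2, sigma = 1, and fixed X_i:
# simulate Y_i = beta0 + beta1*X_i + eps_i many times and average beta1_hat.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([2.0, 4.0, 5.0, 7.0, 9.0])
x = X - X.mean()
k = x / (x ** 2).sum()

# Weight identities from the proof (up to floating-point rounding):
print(k.sum())                                # sum(k_i)     = 0
print((k * X).sum())                          # sum(k_i X_i) = 1
print((k ** 2).sum(), 1 / (x ** 2).sum())     # sum(k_i^2)   = 1 / sum(x_i^2)

beta0, beta1, sigma = 1.0, 2.0, 1.0
estimates = [(k * (beta0 + beta1 * X + rng.normal(0.0, sigma, X.size))).sum()
             for _ in range(10_000)]
print(np.mean(estimates))                     # close to beta1 = 2
```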


To prove minimum variance (of all linear unbiased estimators): β̂₁ = Σk_iY_i, where k_i = x_i/Σx_i² = (X_i − X̄)/Σ(X_i − X̄)². β̂₁ is a weighted average of the Y_i, and the k_i are the weights. To write an expression for any linear estimator, substitute w_i for the k_i, which are also weights but not necessarily equal to the k_i:

β₁* = Σw_iY_i, so E(β₁*) = Σw_iE(Y_i) = Σw_i(β₀ + β₁X_i) = β₀Σw_i + β₁Σw_iX_i

In order for β₁* to be unbiased, Σw_i = 0 and Σw_iX_i = 1. The variance of β₁*:

VAR(β₁*) = VAR(Σw_iY_i) = Σw_i²VAR(Y_i) = σ²Σw_i²   [since VAR(Y_i) = VAR(ε_i) = σ²]
         = σ²Σ(w_i − x_i/Σx_i² + x_i/Σx_i²)²
         = σ²Σ(w_i − x_i/Σx_i²)² + σ²Σx_i²/(Σx_i²)² + 2σ²Σ(w_i − x_i/Σx_i²)(x_i/Σx_i²)
         = σ²Σ(w_i − x_i/Σx_i²)² + σ²/(Σx_i²)

(the cross-product term is zero because Σw_ix_i = Σw_iX_i = 1 under the unbiasedness conditions, so Σ(w_i − x_i/Σx_i²)x_i = 1 − 1 = 0). The last term in this equation is a constant, so the variance of β₁* can be minimized only by manipulating the first term. The first term is minimized only by letting w_i = x_i/Σx_i², and then:

VAR(β₁*) = σ²/Σx_i² = VAR(β̂₁)

When the least-squares weights, k_i, equal the w_i, the variance of the linear estimator β₁* is equal to the variance of the least-squares estimator, β̂₁. When they are not equal, VAR(β₁*) > VAR(β̂₁). Q.E.D. (A numerical check of this result appears in the sketch after 4-9 below.)

4-9. (a) This possibly could violate Assumption III, but it's likely that the firm is so small that no simultaneity is involved. We'll cover simultaneous equations in Chapter 14.

(b) Holding constant the other independent variables, the store will sell 134.4 more frozen yogurts per fortnight if it places an ad. If we ignore long-run effects, this means that the owner should place the ad as long as the cost of the ad is less than the increase in profits brought about by selling 134.4 more frozen yogurts.

(c) The result doesn't disprove the owner's expectation. School is not in session during the prime yogurt-eating summer months, so the variable might be picking up the summertime increased demand for frozen yogurt from nonstudents.

(d) Answers will vary wildly, so perhaps it's best just to make sure that all suggested variables are time-series for 2-week periods. For students who have read Chapters 1–4 only, the best answer would be any variable that measures the existence of, prices of, or advertising of local competition. Students who have read Chapter 6 might reasonably be expected to try to find a variable whose expected omitted-variable bias on the coefficient of C is negative. Examples include the number of rainy days in the period or the number of college students returning home for vacation in the period.
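Returning to the minimum-variance result in 4-8, a minimal numerical sketch (same hypothetical X_i as above, with σ² = 1 assumed): build alternative weights w_i that still satisfy Σw_i = 0 and Σw_iX_i = 1 but differ from the k_i, and compare σ²Σw_i² with σ²Σk_i² = σ²/Σx_i².

```python
# A minimal sketch, assuming sigma^2 = 1 and the same hypothetical X_i:
# any linear unbiased estimator beta1* = sum(w_i Y_i) with w_i != k_i
# has a strictly larger variance than the least-squares estimator.
import numpy as np

X = np.array([2.0, 4.0, 5.0, 7.0, 9.0])
x = X - X.mean()
k = x / (x ** 2).sum()
sigma2 = 1.0

# Perturb k_i by a vector d_i with sum(d_i) = 0 and sum(d_i x_i) = 0,
# so the unbiasedness conditions sum(w_i) = 0 and sum(w_i X_i) = 1 still hold.
d = np.array([1.0, -1.0, 0.0, 0.0, 0.0])
d = d - d.mean()
d = d - (d @ x) / (x @ x) * x
w = k + 0.05 * d

print(w.sum(), (w * X).sum())            # ~0 and ~1: beta1* is still unbiased
print(sigma2 * (k ** 2).sum())           # VAR(beta1_hat) = sigma^2 / sum(x_i^2)
print(sigma2 * (w ** 2).sum())           # VAR(beta1*) is strictly larger
```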