
Econometrics 1

Lecture 3
BLUE Properties of OLS Estimators

Unbiasedness of Estimates-1

Claim: $E(\hat\beta_1) = \beta_1$.

Proof. The OLS estimators are $\hat\beta_1 = \bar y - \hat\beta_2 \bar x$ and $\hat\beta_2 = \sum_i w_i y_i$, where

$$w_i = \frac{x_i - \bar x}{\sum_i (x_i - \bar x)^2}.$$

Substituting gives

$$\hat\beta_1 = \frac{1}{n}\sum_i y_i - \bar x \sum_i w_i y_i = \sum_i \Big(\frac{1}{n} - \bar x w_i\Big) y_i.$$

Taking expectations and using $E(y_i) = \beta_1 + \beta_2 x_i$,

$$E(\hat\beta_1) = \sum_i \Big(\frac{1}{n} - \bar x w_i\Big)(\beta_1 + \beta_2 x_i) = \beta_1 - \beta_1 \bar x \sum_i w_i + \frac{\beta_2}{n}\sum_i x_i - \beta_2 \bar x \sum_i w_i x_i = \beta_1,$$

since $\sum_i w_i = 0$ and $\sum_i w_i x_i = 1$.
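The two identities that close the proof, $\sum_i w_i = 0$ and $\sum_i w_i x_i = 1$, can be checked numerically. A minimal sketch in plain Python, using the x values from the Shazam example later in this lecture:

```python
def weights(x):
    """OLS weights w_i = (x_i - xbar) / S_xx."""
    n = len(x)
    xbar = sum(x) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    return [(xi - xbar) / sxx for xi in x]

# x values taken from the Shazam example in this lecture
x = [5.0, 8.0, 10.0, 12.0, 14.0, 17.0, 20.0, 25.0]
w = weights(x)

print(round(sum(w), 12))                                 # sum of w_i -> 0.0
print(round(sum(wi * xi for wi, xi in zip(w, x)), 12))   # sum of w_i*x_i -> 1.0
```

Both identities hold for any x series with nonzero variation, which is what makes the cross terms in the expectation drop out.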
Unbiasedness of Estimates-2

Claim: $E(\hat\beta_2) = \beta_2$.

Proof. Since

$$\hat\beta_2 = \frac{\sum_i (x_i - \bar x)(y_i - \bar y)}{\sum_i (x_i - \bar x)^2} = \sum_i w_i y_i = \sum_i w_i (\beta_1 + \beta_2 x_i + e_i),$$

taking expectations gives

$$E(\hat\beta_2) = \beta_1 \sum_i w_i + \beta_2 \sum_i w_i x_i + \sum_i w_i E(e_i) = \beta_2,$$

again using $\sum_i w_i = 0$, $\sum_i w_i x_i = 1$, and $E(e_i) = 0$.
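Unbiasedness of both estimators can also be illustrated by simulation: average the OLS estimates over many samples drawn from a known model and compare with the true coefficients. A Monte Carlo sketch, where the true values $\beta_1 = -2$, $\beta_2 = 1$, $\sigma = 1$ are illustrative choices, not from the lecture:

```python
import random

random.seed(0)
beta1, beta2, sigma = -2.0, 1.0, 1.0   # illustrative true parameters
x = [5.0, 8.0, 10.0, 12.0, 14.0, 17.0, 20.0, 25.0]
n, reps = len(x), 20000
xbar = sum(x) / n
sxx = sum((xi - xbar) ** 2 for xi in x)

b1_sum = b2_sum = 0.0
for _ in range(reps):
    # draw a fresh sample from the true model y = beta1 + beta2*x + e
    y = [beta1 + beta2 * xi + random.gauss(0.0, sigma) for xi in x]
    ybar = sum(y) / n
    b2 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
    b1 = ybar - b2 * xbar
    b1_sum += b1
    b2_sum += b2

print(round(b1_sum / reps, 2), round(b2_sum / reps, 2))  # close to (-2.0, 1.0)
```

The averages of $\hat\beta_1$ and $\hat\beta_2$ settle near the true values as the number of replications grows, as the algebraic proofs predict.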
Variance of Estimator-1

Variance of the estimated parameters:

$$\operatorname{var}(\hat\beta_1) = \sigma^2\,\frac{\sum_i x_i^2}{n \sum_i (x_i - \bar x)^2}, \qquad \operatorname{var}(\hat\beta_2) = \frac{\sigma^2}{\sum_i (x_i - \bar x)^2}.$$

Proof for $\hat\beta_1$. From $\hat\beta_1 = \sum_i \big(\tfrac{1}{n} - \bar x w_i\big) y_i$ and $\operatorname{var}(y_i) = \sigma^2$,

$$\operatorname{var}(\hat\beta_1) = \sum_i \Big(\frac{1}{n} - \bar x w_i\Big)^2 \operatorname{var}(y_i) = \sigma^2 \sum_i \Big(\frac{1}{n^2} - \frac{2\bar x w_i}{n} + \bar x^2 w_i^2\Big) = \sigma^2 \Big(\frac{1}{n} + \frac{\bar x^2}{\sum_i (x_i - \bar x)^2}\Big),$$

using $\sum_i w_i = 0$ and $\sum_i w_i^2 = 1/\sum_i (x_i - \bar x)^2$. The last expression equals $\sigma^2 \sum_i x_i^2 / \big(n \sum_i (x_i - \bar x)^2\big)$ because $\sum_i x_i^2 = \sum_i (x_i - \bar x)^2 + n \bar x^2$.
Variance-2

$$\operatorname{var}(\hat\beta_2) = \frac{\sigma^2}{\sum_i (x_i - \bar x)^2}$$

Proof. It was proved above that $\hat\beta_2 = \sum_i w_i y_i = \beta_2 + \sum_i w_i e_i$, so

$$\operatorname{var}(\hat\beta_2) = E\big(\hat\beta_2 - E(\hat\beta_2)\big)^2 = E\Big(\sum_i w_i e_i\Big)^2 = \sigma^2 \sum_i w_i^2 = \frac{\sigma^2}{\sum_i (x_i - \bar x)^2},$$

since the $e_i$ are uncorrelated with $E(e_i^2) = \sigma^2$.

The covariance of the two estimators is

$$\operatorname{cov}(\hat\beta_1, \hat\beta_2) = E\big[(\hat\beta_1 - E(\hat\beta_1))(\hat\beta_2 - E(\hat\beta_2))\big] = -\bar x \operatorname{var}(\hat\beta_2) = \frac{-\bar x\,\sigma^2}{\sum_i (x_i - \bar x)^2},$$

where use is made of the fact that

$$\hat\beta_1 - E(\hat\beta_1) = (\bar y - E(\bar y)) - \bar x\,(\hat\beta_2 - \beta_2), \qquad \operatorname{cov}(\bar y, \hat\beta_2) = \frac{\sigma^2}{n}\sum_i w_i = 0.$$
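These three formulas can be checked against simulated sampling distributions. A Monte Carlo sketch, with the true parameters again illustrative choices:

```python
import random

random.seed(1)
beta1, beta2, sigma = -2.0, 1.0, 1.0   # illustrative true parameters
x = [5.0, 8.0, 10.0, 12.0, 14.0, 17.0, 20.0, 25.0]
n, reps = len(x), 40000
xbar = sum(x) / n
sxx = sum((xi - xbar) ** 2 for xi in x)

b1s, b2s = [], []
for _ in range(reps):
    y = [beta1 + beta2 * xi + random.gauss(0.0, sigma) for xi in x]
    ybar = sum(y) / n
    b2 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
    b1s.append(ybar - b2 * xbar)
    b2s.append(b2)

m1, m2 = sum(b1s) / reps, sum(b2s) / reps
var_b1 = sum((b - m1) ** 2 for b in b1s) / reps
var_b2 = sum((b - m2) ** 2 for b in b2s) / reps
cov_12 = sum((a - m1) * (b - m2) for a, b in zip(b1s, b2s)) / reps

# theoretical values from the formulas above
th2 = sigma ** 2 / sxx
th1 = sigma ** 2 * sum(xi ** 2 for xi in x) / (n * sxx)
thc = -xbar * sigma ** 2 / sxx

print(round(var_b2, 4), round(th2, 4))   # second value (theory) is 0.0033
print(round(var_b1, 3), round(th1, 3))   # second value (theory) is 0.761
print(round(cov_12, 4), round(thc, 4))   # second value (theory) is -0.0458
```

The empirical variances and covariance of the simulated estimates land close to the theoretical values; note the covariance is negative here because $\bar x > 0$.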
How is the variance of the estimators affected by the variance of $Y_i$?

Claim: $\operatorname{var}(Y_i) = \sigma^2$.

Proof.

$$\operatorname{var}(Y_i) = E\big(y_i - E(y_i)\big)^2 = E\big[(\beta_1 + \beta_2 x_i + e_i) - (\beta_1 + \beta_2 x_i)\big]^2 = E(e_i^2) = \sigma^2.$$

The parameter variances above are therefore proportional to $\sigma^2$, the common variance of the errors and of $Y_i$.
A small Shazam program

sample 1 8
read y constant x
4 1 5
6 1 8
7 1 10
8 1 12
11 1 14
15 1 17
18 1 20
22 1 25
stat y
ols y x /pcov=bb
stop

Shazam Output an Example
Results from Shazam

|_sample 1 8
|_read y constant x
3 VARIABLES AND 8 OBSERVATIONS STARTING AT OBS 1

|_stat y
NAME N MEAN ST. DEV VARIANCE MINIMUM MAXIMUM
Y 8 11.375 6.3682 40.554 4.0000 22.000

|_ols y x / pcov

REQUIRED MEMORY IS PAR= 1 CURRENT PAR= 2000


OLS ESTIMATION
8 OBSERVATIONS DEPENDENT VARIABLE= Y
...NOTE..SAMPLE RANGE SET TO: 1, 8

R-SQUARE = 0.9807 R-SQUARE ADJUSTED = 0.9775


VARIANCE OF THE ESTIMATE-SIGMA**2 = 0.91402
STANDARD ERROR OF THE ESTIMATE-SIGMA = 0.95604
SUM OF SQUARED ERRORS-SSE= 5.4841
MEAN OF DEPENDENT VARIABLE = 11.375
LOG OF THE LIKELIHOOD FUNCTION = -9.84116

VARIABLE ESTIMATED STANDARD T-RATIO PARTIAL STANDARDIZED ELASTICITY


NAME COEFFICIENT ERROR 6 DF P-VALUE CORR. COEFFICIENT AT MEANS
X 0.95873 0.5493E-01 17.45 0.000 0.990 0.9903 1.1694
CONSTANT -1.9274 0.8338 -2.312 0.060 -0.686 0.0000 -0.1694

VARIANCE-COVARIANCE MATRIX OF COEFFICIENTS


X 0.30178E-02
CONSTANT -0.41872E-01 0.69523
X CONSTANT
|_stop
TYPE COMMAND

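The Shazam output above can be reproduced in plain Python from the OLS formulas of this lecture, using the same 8 observations. Expected values, taken from the output: slope 0.95873, intercept -1.9274, SIGMA**2 = 0.91402, SSE = 5.4841, R-SQUARE = 0.9807.

```python
x = [5.0, 8.0, 10.0, 12.0, 14.0, 17.0, 20.0, 25.0]
y = [4.0, 6.0, 7.0, 8.0, 11.0, 15.0, 18.0, 22.0]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))

slope = sxy / sxx                     # beta2-hat
intercept = ybar - slope * xbar       # beta1-hat
resid = [yi - intercept - slope * xi for xi, yi in zip(x, y)]
sse = sum(e ** 2 for e in resid)
sigma2 = sse / (n - 2)                # VARIANCE OF THE ESTIMATE-SIGMA**2
tss = sum((yi - ybar) ** 2 for yi in y)
r2 = 1.0 - sse / tss

print(round(slope, 5), round(intercept, 4))   # -> 0.95873 -1.9274
print(round(sigma2, 5), round(sse, 4))        # -> 0.91402 5.4841
print(round(r2, 4))                           # -> 0.9807
print(round(sigma2 / sxx, 7))                 # var(slope) -> 0.0030178, matches 0.30178E-02
```

Every number matches the Shazam listing, which confirms the variance formulas feeding the variance-covariance matrix.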
Coefficient of determination

The coefficient of determination is a measure in regression analysis of the explanatory power of the independent variables (regressors) in explaining the variation in the dependent variable (regressand). The total variation in the dependent variable can be decomposed as follows:

$$\sum_i (y_i - \bar y)^2 = \sum_i (\hat y_i - \bar y)^2 + \sum_i e_i^2 + 2\sum_i (\hat y_i - \bar y)\,e_i = \sum_i (\hat y_i - \bar y)^2 + \sum_i e_i^2,$$

since the cross-product term vanishes. For $T$ observations and $K$ explanatory variables:

[Total variation] = [Explained variation] + [Residual variation]
df:     T - 1     =        K - 1          +        T - K
Coefficient of determination

Dividing the decomposition by the total variation:

$$1 = \frac{\sum_i (\hat y_i - \bar y)^2}{\sum_i (y_i - \bar y)^2} + \frac{\sum_i e_i^2}{\sum_i (y_i - \bar y)^2} = R^2 + (1 - R^2),$$

so

$$R^2 = \frac{\sum_i (\hat y_i - \bar y)^2}{\sum_i (y_i - \bar y)^2}; \qquad 0 \le R^2 \le 1.$$

Total variation = explained variation + unexplained variation.

In the Shazam example above: R-SQUARE = 0.9807, R-SQUARE ADJUSTED = 0.9775.
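The decomposition and the two equivalent expressions for $R^2$ can be checked numerically with the 8 observations from the Shazam example:

```python
x = [5.0, 8.0, 10.0, 12.0, 14.0, 17.0, 20.0, 25.0]
y = [4.0, 6.0, 7.0, 8.0, 11.0, 15.0, 18.0, 22.0]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
slope = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
intercept = ybar - slope * xbar
yhat = [intercept + slope * xi for xi in x]   # fitted values

tss = sum((yi - ybar) ** 2 for yi in y)               # total variation
ess = sum((fi - ybar) ** 2 for fi in yhat)            # explained variation
rss = sum((yi - fi) ** 2 for yi, fi in zip(y, yhat))  # residual variation

print(round(abs(tss - (ess + rss)), 9))                 # decomposition holds -> 0.0
print(round(ess / tss, 4), round(1.0 - rss / tss, 4))   # both -> 0.9807
```

The explained share of the variation and $1 - \text{RSS}/\text{TSS}$ agree, and both match the R-SQUARE reported by Shazam.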
Comparison of the OLS and Another Alternative Estimator

Efficiency of parameters in weighted least squares.

a) $b_w$ can be written as a linear function of the random variables $Y_1$, $Y_2$, and $Y_3$:
$$b_w = \sum_i a_i Y_i, \quad \text{where the linear weights } a_i = \tfrac{1}{2}, \tfrac{1}{3}, \tfrac{1}{6} \text{ are applied to } Y_1, Y_2, Y_3.$$
Therefore $b_w$ is a linear estimator.

b) $b_w$ is unbiased because
$$E(b_w) = \tfrac{1}{2}E(Y_1) + \tfrac{1}{3}E(Y_2) + \tfrac{1}{6}E(Y_3) = \big(\tfrac{1}{2} + \tfrac{1}{3} + \tfrac{1}{6}\big)\mu = \mu.$$

c) Given that $Y_1$, $Y_2$, and $Y_3$ have the $N(\mu, \sigma^2)$ distribution, the variance of $b_w$ can be computed as
$$\operatorname{var}(b_w) = \tfrac{1}{4}\operatorname{var}(Y_1) + \tfrac{1}{9}\operatorname{var}(Y_2) + \tfrac{1}{36}\operatorname{var}(Y_3) = \big(\tfrac{1}{4} + \tfrac{1}{9} + \tfrac{1}{36}\big)\sigma^2 = \tfrac{7}{18}\sigma^2.$$
Comparison of the OLS and Another Alternative Estimator-2

d) The variance of the OLS estimator (the sample mean of the three observations) is $\operatorname{var}(b) = \dfrac{\sigma^2}{3}$.

e) $b_w$ would be a good estimator if its variance were less than that of $b$, i.e. $\operatorname{var}(b_w) < \operatorname{var}(b)$. It would be a bad estimator if $\operatorname{var}(b_w) > \operatorname{var}(b)$.

f) If $\sigma^2 = 9$, then $\operatorname{var}(b_w) = \tfrac{7}{18} \cdot 9 = 3.5$ while $\operatorname{var}(b) = \tfrac{9}{3} = 3$. Therefore the least squares estimator $b$ has smaller variance than the weighted least squares estimator $b_w$.
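The comparison in part f) can be illustrated by simulation: draw $Y_1, Y_2, Y_3$ from $N(\mu, 9)$ many times and compare the empirical variances of the two estimators. The mean $\mu = 10$ is an arbitrary choice for illustration:

```python
import random

random.seed(2)
mu, sigma2 = 10.0, 9.0     # mu is illustrative; sigma^2 = 9 as in part f)
sigma = sigma2 ** 0.5
reps = 200000

bw_vals, b_vals = [], []
for _ in range(reps):
    y1, y2, y3 = (random.gauss(mu, sigma) for _ in range(3))
    bw_vals.append(y1 / 2 + y2 / 3 + y3 / 6)   # weighted estimator b_w
    b_vals.append((y1 + y2 + y3) / 3)          # OLS estimator: the sample mean

def var(v):
    m = sum(v) / len(v)
    return sum((x - m) ** 2 for x in v) / len(v)

print(round(var(bw_vals), 1), round(var(b_vals), 1))   # close to 3.5 and 3.0
```

The simulated variances settle near the theoretical 3.5 and 3, confirming that among linear unbiased estimators the equal-weight (least squares) estimator is the more efficient one.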
