
OLS versus MLE Example

Here is the data:

  Y      X
  0.8    1
  2.1    2
  3.94   3
  4.6    4
  5.1    5
  6.5    6
  6.9    7
  8.2    8
  9.3    9
 10.1   10

[Figure: scatter plot of Y versus X (both axes 0 to 12), showing an approximately linear increasing relationship.]
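For reference, a minimal SAS data step that would create this data set (the name "data" matches the one used in the PROC calls below; the step is a sketch and does not appear in the original slides):

data data;
  input y x;        * read the ten (y, x) pairs;
  datalines;
0.8 1
2.1 2
3.94 3
4.6 4
5.1 5
6.5 6
6.9 7
8.2 8
9.3 9
10.1 10
;
run;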
OLS versus MLE Example
Here is the SAS code:
OLS fitting using PROC GLM:

proc glm data=data;   /* call the procedure for data set "data" */
  model y = x;        /* regression model: y = b0 + b1*x */
run;

MLE fitting using PROC GENMOD:

proc genmod data=data;       /* call the procedure for data set "data" */
  model y = x /              /* model: y = b0 + b1*x */
    dist=normal              /* assume normally distributed errors */
    link=identity;           /* use an identity link (the link function describes the
                                relationship between y and the linear portion of the model) */
run;
OLS versus MLE Example
Output from PROC GLM:

  Source            DF   SS        Mean Sq
  Model              1   81.2051   81.2051
  Error              8    1.0533    0.1316
  Corrected Total    9   82.2584

  R-Square   Root MSE
  0.987195   0.362857

  Parameter   Estimate   SE
  Intercept   0.297333   0.2478
  X           0.992121   0.0399

Output from PROC GENMOD:

  Criterion         DF   Value     Value/DF
  Deviance           8    1.0533     0.1317
  Scaled Deviance    8   10.0000     1.2500
  Log Likelihood         -2.9362

  Parameter   DF   Estimate   SE
  Intercept    1     0.2973   0.2217
  X            1     0.9921   0.0357
  Scale        1     0.3245   0.0726
MLE 2 = SS/n = 0.10533 Scale =  = (2)2
Ln(L) = -n/2*ln(2e2) Note that SE estimates differ; MLE variance
estimates are biased at low sample sizes
= -5ln(17.07947*0.10533) = -2.9361 n( MLE SE ) 2
OLS SE 
df

K = 3 (intercept, x, ) K = 3 (intercept, x, scale)
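As a quick check of that SE relationship using the output above (plain arithmetic on the reported numbers, not additional SAS output):

  Intercept: 0.2217 × √(10/8) = 0.2478
  X:         0.0357 × √(10/8) = 0.0399

Both match the PROC GLM standard errors, confirming that the difference between the two columns comes down to dividing the error sum of squares by n (MLE) versus by df (OLS).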
