CHP 3 PDF
On Error Term More
If
Yi = Ŷi + ûi
then
ûi = Yi − Ŷi
and, substituting Ŷi = β̂1 + β̂2Xi,
ûi = Yi − β̂1 − β̂2Xi
On Error Term More
We need to choose the SRF so that the error terms are as small as possible; that is,
∑ûi = ∑(Yi − Ŷi)
should be as small as possible.
On Error Term More
Therefore, we need a criterion for minimizing the disturbances in the SRF. The least-squares criterion minimizes the sum of the squared residuals:
∑ûi² = ∑(Yi − Ŷi)²
     = ∑(Yi − β̂1 − β̂2Xi)²
Thus,
∑ûi² = f(β̂1, β̂2)
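The criterion above can be sketched numerically. The following minimal Python illustration (data and candidate coefficients are made up, not from the text) treats the sum of squared residuals as a function of the chosen intercept and slope:

```python
# Made-up sample for illustration only
X = [1.0, 2.0, 3.0, 4.0, 5.0]
Y = [2.1, 3.9, 6.2, 8.1, 9.8]

def ssr(b1, b2):
    """Sum of squared residuals: sum((Yi - b1 - b2*Xi)^2)."""
    return sum((y - b1 - b2 * x) ** 2 for x, y in zip(X, Y))

# SSR is a function of the candidate coefficients b1 and b2;
# the least-squares criterion picks the pair with the smallest SSR.
print(ssr(0.0, 2.0))
print(ssr(0.1, 1.95))
```

Evaluating ssr for different candidate pairs makes the point of the slide concrete: each choice of (β̂1, β̂2) gives a different total squared error, and OLS selects the minimizing pair.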
Example of the Least Squares Criterion
Which model is better, and why? The sum of the squared error disturbances of the second model is lower, so the second model is better.
Regression Equation
β̂2 = (n∑XiYi − ∑Xi∑Yi) / (n∑Xi² − (∑Xi)²)
   = ∑(Xi − X̄)(Yi − Ȳ) / ∑(Xi − X̄)²
   = ∑xiyi / ∑xi²
β̂1 = (∑Xi²∑Yi − ∑Xi∑XiYi) / (n∑Xi² − (∑Xi)²)
   = Ȳ − β̂2X̄
where X̄ and Ȳ are the sample means of X and Y, and xi = Xi − X̄, yi = Yi − Ȳ denote deviations from those means.
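As a sketch on a made-up sample, both forms of the OLS estimators can be computed and checked against each other:

```python
# Made-up sample for illustration only
X = [1.0, 2.0, 3.0, 4.0, 5.0]
Y = [2.1, 3.9, 6.2, 8.1, 9.8]
n = len(X)

sum_x, sum_y = sum(X), sum(Y)
sum_xy = sum(x * y for x, y in zip(X, Y))
sum_x2 = sum(x * x for x in X)

# Slope: (n*sum(XY) - sum(X)*sum(Y)) / (n*sum(X^2) - (sum X)^2)
beta2 = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)

# Intercept: Ybar - beta2*Xbar
xbar, ybar = sum_x / n, sum_y / n
beta1 = ybar - beta2 * xbar

# Equivalent deviation form: sum(xi*yi) / sum(xi^2)
beta2_dev = (sum((x - xbar) * (y - ybar) for x, y in zip(X, Y))
             / sum((x - xbar) ** 2 for x in X))
assert abs(beta2 - beta2_dev) < 1e-12  # both forms agree
print(beta1, beta2)
```

The raw-sum form and the deviation form are algebraically identical; the deviation form is usually easier to compute and to read.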
The Classical Linear Regression Model (CLRM):
The Assumptions Underlying The Method of Least Squares
Assumption 3. Zero Mean Value of Disturbance ui
E(ui | Xi) = 0
Assumption 4. Homoscedasticity or Equal Variance of ui
var(ui | Xi) = σ²
Assumption 5. No Autocorrelation between the Disturbances
If
PRF: Yt = β1 + β2Xt + ut
and ut and ut−1 are correlated, then Yt depends not only on Xt but also on ut−1.
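A hypothetical simulation (parameters invented for illustration) shows what correlated disturbances look like: AR(1) errors ut = ρut−1 + et give a sample correlation between ut and its own lag near ρ:

```python
import random

random.seed(0)
rho = 0.8                      # assumed autocorrelation coefficient
u = [0.0]
for _ in range(500):
    # AR(1) disturbance: u_t depends on u_{t-1} plus fresh noise
    u.append(rho * u[-1] + random.gauss(0.0, 1.0))

# Sample correlation between u_t and its own lag u_{t-1}
cur, lag = u[1:], u[:-1]
mc = sum(cur) / len(cur)
ml = sum(lag) / len(lag)
cov = sum((a - mc) * (b - ml) for a, b in zip(cur, lag))
var_c = sum((a - mc) ** 2 for a in cur)
var_l = sum((b - ml) ** 2 for b in lag)
corr = cov / (var_c * var_l) ** 0.5
print(corr)  # close to rho for a long series
```

Under Assumption 5 this correlation would be zero; a value near ρ is exactly the dependence the assumption rules out.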
Autocorrelation in Graphs (figure)
Assumption 6. Zero Covariance between ui and Xi.
Assumption 7. The Number of Observations n Must Be Greater than the Number of Parameters to Be Estimated
Assumption 8. Variability in the X Values
Assumption 9. The Regression Model Is Correctly Specified
Assumption 10. There is No Perfect Multicollinearity
Yt = β0 + β1X1t + β2X2t + ⋯ + βnXnt + ut
See Peter Kennedy, "Ballentine: A Graphical Aid for Econometrics", Australian Economic Papers, Vol. 20, 1981, pp. 414–416. The name Ballentine is derived from the emblem of the well-known Ballantine beer with its circles.
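To see why perfect multicollinearity is ruled out, here is a minimal sketch (made-up data): when one regressor is an exact multiple of another, the normal equations for the two slopes become singular, so the coefficients are not uniquely determined:

```python
# Made-up regressors; X2 is an exact linear function of X1
X1 = [1.0, 2.0, 3.0, 4.0]
X2 = [2.0 * x for x in X1]   # perfect collinearity

n = len(X1)
m1, m2 = sum(X1) / n, sum(X2) / n
s11 = sum((a - m1) ** 2 for a in X1)
s22 = sum((b - m2) ** 2 for b in X2)
s12 = sum((a - m1) * (b - m2) for a, b in zip(X1, X2))

# Determinant of the deviation-form cross-product matrix;
# zero means the OLS system has no unique solution.
det = s11 * s22 - s12 ** 2
print(det)
```

With less-than-perfect collinearity the determinant is small but nonzero, and the slopes exist but are imprecisely estimated.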
Coefficient of Determination, r2
TSS = ESS + RSS
where:
TSS = total sum of squares
ESS = explained sum of squares
RSS = residual sum of squares
Dividing both sides by TSS:
1 = ESS/TSS + RSS/TSS
  = ∑(Ŷi − Ȳ)²/∑(Yi − Ȳ)² + ∑ûi²/∑(Yi − Ȳ)²
On r² More:
r² indicates the explained part of the regression model; therefore,
r² = ESS/TSS
and
r² = ∑(Ŷi − Ȳ)²/∑(Yi − Ȳ)² = ESS/TSS
Alternatively,
r² = 1 − ∑ûi²/∑(Yi − Ȳ)²
   = 1 − RSS/TSS
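These identities can be verified numerically; a short sketch on a made-up sample, with the fit computed via the OLS formulas from earlier in the chapter:

```python
# Made-up sample for illustration only
X = [1.0, 2.0, 3.0, 4.0, 5.0]
Y = [2.1, 3.9, 6.2, 8.1, 9.8]
n = len(X)
xbar, ybar = sum(X) / n, sum(Y) / n

# OLS fit in deviation form
beta2 = (sum((x - xbar) * (y - ybar) for x, y in zip(X, Y))
         / sum((x - xbar) ** 2 for x in X))
beta1 = ybar - beta2 * xbar
Yhat = [beta1 + beta2 * x for x in X]

TSS = sum((y - ybar) ** 2 for y in Y)                 # total sum of squares
ESS = sum((yh - ybar) ** 2 for yh in Yhat)            # explained sum of squares
RSS = sum((y - yh) ** 2 for y, yh in zip(Y, Yhat))    # residual sum of squares

r2 = ESS / TSS
assert abs(TSS - (ESS + RSS)) < 1e-9     # TSS = ESS + RSS
assert abs(r2 - (1 - RSS / TSS)) < 1e-12 # both r^2 formulas agree
print(r2)
```

Both routes to r², ESS/TSS and 1 − RSS/TSS, give the same number because TSS = ESS + RSS holds exactly for an OLS fit with an intercept.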
Coefficient of Determination
HW #1:
Problem 3.20 (Chapter 3), data for 1982 to 2001