Chapter 1: Simple Linear Regression


The model
- Independent (explanatory) variable: plotted on the X axis. Dependent variable: plotted on the Y axis.
- Actual value: $Y_i = \beta_1 + \beta_2 X_i + u_i$, where $u_i$ is the disturbance (error).
- Fitted value: $\hat{Y}_i = b_1 + b_2 X_i$, with residual $e_i = Y_i - \hat{Y}_i$.
- Disturbance: the difference between the true (population) relationship and the actual value.
- Residual: the difference between the actual and the fitted value.

Assumptions for Model A
1. The model is linear in parameters and correctly specified.
2. There is some variation in the regressor: $var(X) \neq 0$.
3. The disturbance term has zero expectation: $E(u_i) = 0$.
4. The disturbance term is homoscedastic: $\sigma_u^2 = E((u_i - \mu_u)^2) = E(u_i^2)$ is the same for all observations.
5. The values of the disturbance term have independent distributions: $cov(u_i, u_j) = E[(u_i - \mu_u)(u_j - \mu_u)] = E(u_i)E(u_j) = 0$ for $i \neq j$, so one observation's disturbance does not influence another's.
6. The disturbance term has a normal distribution (justified by the central limit theorem).

Least squares criterion
- Choose $b_1, b_2$ to minimise $RSS = \sum e_i^2 = \sum (Y_i - \hat{Y}_i)^2$; the residuals are squared so that positive and negative deviations do not cancel out.
- OLS minimises the vertical distance between the observed data and the fitted line.
- Slope: $b_2 = \sum (X_i - \bar{X})(Y_i - \bar{Y}) / \sum (X_i - \bar{X})^2$.
- Intercept: $b_1 = \bar{Y} - b_2 \bar{X}$.
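A minimal numpy sketch of the least-squares formulas above, on simulated data (the parameter values, sample size and seed are illustrative assumptions, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: true beta1, beta2 are assumptions of this sketch
n = 100
beta1, beta2 = 2.0, 0.5
X = rng.uniform(0, 10, n)
u = rng.normal(0, 1, n)          # disturbance: E(u) = 0, homoscedastic
Y = beta1 + beta2 * X + u        # actual values

# OLS coefficients from the least-squares formulas
b2 = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
b1 = Y.mean() - b2 * X.mean()

Y_hat = b1 + b2 * X              # fitted values
e = Y - Y_hat                    # residuals
RSS = np.sum(e ** 2)
print(b1, b2, RSS)
```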

Goodness of fit
- $R^2 = ESS/TSS = 1 - RSS/TSS$.
- Interpretation: $R^2$ (expressed as a percentage) of the total variation in Y can be explained by variation in X.
- Adjusted $R^2$ corrects for degrees of freedom: $\bar{R}^2 = 1 - (1 - R^2)(n - 1)/(n - k)$.
- Principle of parsimony: choose the smaller model unless the bigger model adds real explanatory power; adding variables never lowers the unadjusted $R^2$, but it can lower $\bar{R}^2$.
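A short sketch (again on made-up simulated data) checking the $R^2$ and $\bar{R}^2$ formulas against statsmodels' built-in attributes:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n, k = 100, 2                       # k = number of parameters incl. intercept
X = rng.uniform(0, 10, n)
Y = 2.0 + 0.5 * X + rng.normal(0, 1, n)

res = sm.OLS(Y, sm.add_constant(X)).fit()
TSS = np.sum((Y - Y.mean()) ** 2)
RSS = np.sum(res.resid ** 2)
R2 = 1 - RSS / TSS
R2_adj = 1 - (1 - R2) * (n - 1) / (n - k)

# Should match statsmodels' own attributes
print(R2, res.rsquared)
print(R2_adj, res.rsquared_adj)
```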
Chapter 2: Properties of the Regression Coefficients and Hypothesis Testing

Types of regression models
- Model A: cross-sectional data with nonstochastic (fixed, exogenous) regressors — the strong condition.
- Model B: cross-sectional data with stochastic regressors — the weaker condition.

Properties of estimators
- Unbiased: the average value of the sample estimate equals the true population value, $E(b_2) = \beta_2$; bias $= E(\hat{\beta}) - \beta$.
- Efficient: the smallest sampling variance among comparable unbiased estimators.
- Consistent: the estimate converges to the true value as the sample size grows.

Precision of the slope coefficient
- $var(b_2) = \sigma_u^2 / (n \, MSD(X))$, where $MSD(X) = \frac{1}{n}\sum (X_i - \bar{X})^2$ is the mean square deviation of X about its sample mean.
- Precision therefore improves with more observations, a smaller disturbance variance, and more variation in the regressor.
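A small Monte Carlo sketch (all numbers are illustrative assumptions) showing both properties at once: the average of the estimated slopes approaches $\beta_2$, and their sampling variance approaches $\sigma_u^2 / (n \, MSD(X))$:

```python
import numpy as np

rng = np.random.default_rng(2)
n, beta1, beta2, sigma_u = 50, 2.0, 0.5, 1.0
X = rng.uniform(0, 10, n)            # fixed regressors (Model A)
msd_X = np.mean((X - X.mean()) ** 2)

slopes = []
for _ in range(10_000):              # many repeated samples
    Y = beta1 + beta2 * X + rng.normal(0, sigma_u, n)
    b2 = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
    slopes.append(b2)

print(np.mean(slopes))               # ~ 0.5 -> E(b2) = beta2 (unbiased)
print(np.var(slopes))                # ~ sigma_u**2 / (n * msd_X)
print(sigma_u ** 2 / (n * msd_X))
```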

Standard errors
- Unbiased estimator of the disturbance variance: $s_u^2 = RSS/(n - k)$.
- Estimated standard error of the slope: $se(b_2) = \sqrt{s_u^2 / (n \, MSD(X))}$.

Gauss-Markov theorem (BLUE)
- If the classical assumptions are satisfied, the OLS estimators are the Best Linear Unbiased Estimators: they have lower variance than any other linear unbiased estimator.
- It is still possible for a non-linear unbiased estimator to have lower variance than OLS.
- An estimator that is not BLUE can still be accurate: if its bias is small, a reduction in sampling variance can offset the bias.

t-tests
- $t = (b_2 - \beta_2^0)/se(b_2)$ for $H_0: \beta_2 = \beta_2^0$; reject $H_0$ if $|t| > t_{crit}$ with $n - k$ degrees of freedom.
- Misinterpretations of the t-test: statistical significance is not theoretical validity; significance is not economic validity; and a t-test does not tell you the empirical importance (size) of an effect.
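A sketch of the t-test mechanics with statsmodels (data simulated; the null value $\beta_2^0 = 0$ is the usual default and is chosen here only for illustration):

```python
import numpy as np
from scipy import stats
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 60
X = rng.uniform(0, 10, n)
Y = 2.0 + 0.5 * X + rng.normal(0, 1, n)

res = sm.OLS(Y, sm.add_constant(X)).fit()
b2, se_b2 = res.params[1], res.bse[1]

# t statistic for H0: beta2 = 0
t = (b2 - 0) / se_b2
p = 2 * stats.t.sf(abs(t), df=res.df_resid)
print(t, res.tvalues[1])   # identical
print(p, res.pvalues[1])
```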
Chapter 3: Multiple Regression

- Model: $Y_i = \beta_1 + \beta_2 X_{2i} + \beta_3 X_{3i} + u_i$.
- Fitted plane: $\hat{Y}_i = b_1 + b_2 X_{2i} + b_3 X_{3i}$.
- Residuals: $e_i = Y_i - \hat{Y}_i = Y_i - b_1 - b_2 X_{2i} - b_3 X_{3i}$.
- When interpreting a coefficient, remember to say "holding the other variables constant".
- Normal equations: minimising RSS gives $\sum e_i = 0$ and $\sum e_i X_{ji} = 0$ for each regressor; solving them yields the regression coefficients.

Measures of variation
- TSS (total) = ESS (explained) + RSS (residual); ESS is the variation due to the X's.

F test of overall explanatory power
- $H_0: \beta_2 = \dots = \beta_k = 0$ (the regression has no explanatory power); $H_1$: at least one coefficient is non-zero. This is a compound hypothesis, so an F test is used to see whether it can be rejected.
- $F(k-1, n-k) = \frac{ESS/(k-1)}{RSS/(n-k)}$; if ESS is large relative to RSS, the F statistic will be large and $H_0$ is rejected.
- For a single coefficient, F-stat $= (t\text{-stat})^2$.

Marginal contribution test (joint explanatory power)
- Tests the joint explanatory power of a group of variables, e.g. $H_0: \beta_3 = \beta_4 = 0$ against $H_1$: at least one of them is non-zero.
- Estimate the restricted model (RSS$_1$) and the unrestricted model (RSS$_2$) and compute $F = \frac{(RSS_1 - RSS_2)/\text{cost in df}}{RSS_2/\text{df remaining}}$, where df remaining $= n - k$.
- Rejecting $H_0$ means the added variables jointly have explanatory power.

Precision of multiple regression coefficients
- $var(b_2) = \frac{\sigma_u^2}{n \, MSD(X_2)} \cdot \frac{1}{1 - r_{X_2 X_3}^2}$: the first factor is the variance of the slope coefficient from the simple regression case; the second grows with the correlation between the explanatory variables.
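An illustrative three-regressor sketch (simulated data and coefficients are assumptions of this example): the overall F test, then the joint test of a subset via the restricted/unrestricted RSS comparison, which statsmodels performs with compare_f_test:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 200
X2 = rng.normal(0, 1, n)
X3 = rng.normal(0, 1, n)
X4 = rng.normal(0, 1, n)
Y = 1.0 + 0.8 * X2 + 0.3 * X3 + 0.0 * X4 + rng.normal(0, 1, n)

X_unres = sm.add_constant(np.column_stack([X2, X3, X4]))
unres = sm.OLS(Y, X_unres).fit()
print(unres.fvalue, unres.f_pvalue)   # overall F test: H0 beta2=beta3=beta4=0

# Joint test H0: beta3 = beta4 = 0 via restricted vs unrestricted RSS
restr = sm.OLS(Y, sm.add_constant(X2)).fit()
f, p, df_diff = unres.compare_f_test(restr)
print(f, p, df_diff)
```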
Chapter 4: Multicollinearity

- Perfect multicollinearity: an exact linear relationship among the regressors in the sample; the OLS estimators are undefined.
- Imperfect multicollinearity: high (but not exact) correlation between the explanatory variables; the estimators remain unbiased, but their precision suffers.

Detection: Variance Inflation Factors (VIFs)
1. Run an OLS regression of each explanatory variable on the other explanatory variables.
2. Calculate $VIF_j = 1/(1 - R_j^2)$ from each auxiliary regression.
3. Look for large VIF values: they signal that a regressor is well explained by the others.
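A minimal VIF sketch using statsmodels' variance_inflation_factor helper (the near-collinear data are constructed for illustration):

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(5)
n = 200
X2 = rng.normal(0, 1, n)
X3 = X2 + rng.normal(0, 0.1, n)      # nearly collinear with X2 by construction

exog = sm.add_constant(np.column_stack([X2, X3]))
for j in (1, 2):                     # skip the constant in column 0
    print(f"VIF for regressor {j}:", variance_inflation_factor(exog, j))
```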

Consequences of multicollinearity
1. Estimates are unbiased, and OLS remains BLUE (none of the classical assumptions is violated).
2. Variances and standard errors of the coefficients increase, so it is harder to distinguish the effect of each explanatory variable.
3. Computed t-scores fall, so coefficients may appear statistically insignificant.
4. Estimates are sensitive to changes in specification: adding or dropping variables can change the coefficients drastically.
5. The overall fit of the equation and the estimated coefficients of the non-multicollinear variables are largely unaffected.

Alleviating multicollinearity: direct methods
1. Increase the number of observations.
2. Reduce the variance of the disturbance term, e.g. by including relevant omitted variables.
3. Increase $MSD(X_2)$, the variation in the affected regressor.
4. Reduce the correlation $r_{X_2 X_3}$ between the regressors.
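A sketch of consequence 2 (all numbers are illustrative): as two regressors become more collinear, the slope estimates stay roughly unbiased while their standard errors blow up:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 200
for noise in (1.0, 0.1, 0.01):       # smaller noise -> stronger collinearity
    X2 = rng.normal(0, 1, n)
    X3 = X2 + rng.normal(0, noise, n)
    Y = 1.0 + 0.5 * X2 + 0.5 * X3 + rng.normal(0, 1, n)
    res = sm.OLS(Y, sm.add_constant(np.column_stack([X2, X3]))).fit()
    print(noise, res.params[1:], res.bse[1:])   # estimates vs standard errors
```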
Alleviating multicollinearity: indirect methods
1. Combine the correlated variables into a single variable.
2. Drop some of the correlated variables (with the danger of omitted-variable bias).
3. Impose a theoretical restriction on the coefficients.

F-test of a linear restriction
- Substitute the restriction into the model to obtain the restricted specification; for example, under $H_0: \beta_3 + \beta_4 = 1$ in $Y = \beta_1 + \beta_2 X_2 + \beta_3 X_3 + \beta_4 X_4 + u$, substituting $\beta_4 = 1 - \beta_3$ gives $Y - X_4 = \beta_1 + \beta_2 X_2 + \beta_3 (X_3 - X_4) + u$.
- Estimate both models and compare residual sums of squares: $F(1, n-k) = \frac{(RSS_R - RSS_U)/1}{RSS_U/(n-k)}$.
- If F exceeds the critical value, the restriction is rejected.
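A sketch of the restriction test, using the example restriction above (the data and parameter values are simulated so that $H_0$ is true, purely for illustration):

```python
import numpy as np
from scipy import stats
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 200
X2, X3, X4 = rng.normal(0, 1, (3, n))
Y = 1.0 + 0.5 * X2 + 0.7 * X3 + 0.3 * X4 + rng.normal(0, 1, n)  # beta3+beta4 = 1

unres = sm.OLS(Y, sm.add_constant(np.column_stack([X2, X3, X4]))).fit()

# Restricted model under H0: beta3 + beta4 = 1 -> regress Y - X4 on X2, (X3 - X4)
restr = sm.OLS(Y - X4, sm.add_constant(np.column_stack([X2, X3 - X4]))).fit()

k = 4                                  # parameters in the unrestricted model
F = ((restr.ssr - unres.ssr) / 1) / (unres.ssr / (n - k))
p = stats.f.sf(F, 1, n - k)
print(F, p)                            # H0 holds here, so F should be small
```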

then do to stalet-critical
Chapter 5:Transformation ofvariables 2000 values of
X2, marginal effectwill
why?
y B, =
+
z B3(X3
+
+
B 189 X4
+
1
+
be different
3. If stat coefficiento fye
for is

zn xzs
=

r3z + 10g 4
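A double-log estimation sketch (the Engel-curve parameters and the lognormal disturbance are assumptions of this example); the fitted slope is the estimated elasticity:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
n = 200
X = rng.uniform(1, 10, n)
v = np.exp(rng.normal(0, 0.2, n))          # multiplicative disturbance v = e^u
Y = 3.0 * X ** 0.6 * v                     # Engel-curve form, beta2 = 0.6

res = sm.OLS(np.log(Y), sm.add_constant(np.log(X))).fit()
print(res.params)   # [~log(3.0), ~0.6]; the slope is the estimated elasticity
```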
Interactive explanatory variables
- Model: $Y = \beta_1 + \beta_2 X_2 + \beta_3 X_3 + \beta_4 X_2 X_3 + u$.
- Rewriting as $Y = \beta_1 + (\beta_2 + \beta_4 X_3) X_2 + \beta_3 X_3 + u$ (or symmetrically $Y = \beta_1 + \beta_2 X_2 + (\beta_3 + \beta_4 X_2) X_3 + u$) shows that the marginal effect of each variable depends on the level of the other.
- $\beta_2$ is the marginal effect of $X_2$ on Y when $X_3 = 0$. If $X_3 = 0$ lies outside the range of the sample, using $b_2$ as an estimate of the marginal effect of $X_2$ should be treated with caution.

Quadratic regression
- Model: $Y = \beta_1 + \beta_2 X_2 + \beta_3 X_2^2 + u$, with marginal effect $dY/dX_2 = \beta_2 + 2\beta_3 X_2$.
- $\beta_2$ is the effect of a unit change in $X_2$ on Y only for the special case $X_2 = 0$; for non-zero values of $X_2$ the marginal effect will be different.

Ramsey RESET test (functional misspecification)
1. Run the OLS regression and save the fitted values $\hat{Y}$.
2. Add $\hat{Y}^2$ to the regression specification: it should pick up quadratic and interactive non-linearity, if present.
3. If the t statistic for the coefficient of $\hat{Y}^2$ is significant, this indicates that some kind of non-linearity is present. The test provides a simple indicator of evidence of non-linearity, not its form.
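A hand-rolled sketch of the RESET steps (the true quadratic data-generating process is an assumption chosen so the test has something to detect):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(9)
n = 200
X2 = rng.uniform(0, 5, n)
Y = 1.0 + 0.5 * X2 + 0.4 * X2 ** 2 + rng.normal(0, 1, n)  # true model is quadratic

# Step 1: fit the (misspecified) linear model and save fitted values
lin = sm.OLS(Y, sm.add_constant(X2)).fit()
y_hat = lin.fittedvalues

# Step 2: add y_hat^2 as an extra regressor
aug = sm.OLS(Y, sm.add_constant(np.column_stack([X2, y_hat ** 2]))).fit()

# Step 3: a significant t statistic on y_hat^2 signals non-linearity
print(aug.tvalues[2], aug.pvalues[2])
```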