
1: Ax as a sum of $N^2$ terms
Suppose $A$ is an $N \times N$ matrix.
We can write $A$ as
$$A = \begin{bmatrix} a_{11} & 0 \\ 0 & 0 \end{bmatrix} + \begin{bmatrix} 0 & a_{12} \\ 0 & 0 \end{bmatrix} + \begin{bmatrix} 0 & 0 \\ a_{21} & 0 \end{bmatrix} + \begin{bmatrix} 0 & 0 \\ 0 & a_{22} \end{bmatrix}$$
This allows us to write
$$Ax = \begin{pmatrix} a_{11} x_1 \\ 0 \end{pmatrix} + \begin{pmatrix} a_{12} x_2 \\ 0 \end{pmatrix} + \begin{pmatrix} 0 \\ a_{21} x_1 \end{pmatrix} + \begin{pmatrix} 0 \\ a_{22} x_2 \end{pmatrix} = \begin{pmatrix} a_{11} x_1 + a_{12} x_2 \\ a_{21} x_1 + a_{22} x_2 \end{pmatrix}$$
The product $Ax$ is a vector where each element of $Ax$ has $N$ terms.
Q: Law of Large Numbers and scaling by ???
We can also write the quadratic product
$$x'Ax = x' \begin{bmatrix} a_{11} & 0 \\ 0 & 0 \end{bmatrix} x + x' \begin{bmatrix} 0 & a_{12} \\ 0 & 0 \end{bmatrix} x + x' \begin{bmatrix} 0 & 0 \\ a_{21} & 0 \end{bmatrix} x + x' \begin{bmatrix} 0 & 0 \\ 0 & a_{22} \end{bmatrix} x$$
The first term is
$$x' \begin{bmatrix} a_{11} & 0 \\ 0 & 0 \end{bmatrix} x = (x_1, x_2) \begin{pmatrix} a_{11} x_1 \\ 0 \end{pmatrix} = a_{11} x_1^2$$

The product $x'Ax$ is a scalar.
The product $x'Ax$ is a sum of $N^2$ terms.
Q1: Scale by $N^2$ ???
Q2: Geometric sum is finite ???
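A quick numerical check of both decompositions; this is a minimal sketch (not from the notes), using a random $A$ and $x$:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 4
A = rng.normal(size=(N, N))
x = rng.normal(size=N)

# Ax as a sum of N^2 vectors, one per entry a_ij
Ax_sum = sum(
    A[i, j] * x[j] * np.eye(N)[:, i]   # a_ij * x_j times the i-th unit vector
    for i in range(N) for j in range(N)
)
print(np.allclose(A @ x, Ax_sum))      # True

# x'Ax as a sum of N^2 scalars a_ij * x_i * x_j
quad_sum = sum(A[i, j] * x[i] * x[j] for i in range(N) for j in range(N))
print(np.isclose(x @ A @ x, quad_sum)) # True
```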

2: Invertible matrix
Our OLS estimator is $\hat{\beta} = [X'X]^{-1} X'y$.
We are allowed to take an inverse when $X$ has full column rank:
no column is a linear combination of the other columns.
E.g., with 20 cigarettes per pack, we cannot use both cigs and packs as regressors.
$$X = \begin{bmatrix} 1 & 1 & 0 & 0 \\ 1 & 0 & 1 & 0 \\ 1 & 0 & 0 & 1 \end{bmatrix} \quad \text{and} \quad x = (1, -1, -1, -1)' \quad \Rightarrow \quad Xx = 0$$
We found an $x \neq 0$ s.t. $Xx = 0$.
Therefore, $X$ is not full rank.
Q: Estimate the model $y_t = x_t \beta + t \gamma + u_t$
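A quick check of this rank deficiency with numpy (a sketch, not part of the notes):

```python
import numpy as np

# The dummy-variable-trap matrix from above: intercept plus three
# exhaustive category dummies.
X = np.array([[1, 1, 0, 0],
              [1, 0, 1, 0],
              [1, 0, 0, 1]])
x = np.array([1, -1, -1, -1])

print(X @ x)                      # [0 0 0]: a nonzero x with Xx = 0
print(np.linalg.matrix_rank(X))   # 3 < 4 columns, so X'X is singular
```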

3: Projection Matrix
Projection matrix
$$P(X) = X[X'X]^{-1}X'$$
Properties
$$P(X)X = X[X'X]^{-1}X'X = X$$
$$P(X)P(X) = P(X)$$
The residual matrix
$$M(X) \equiv I - P(X)$$
Properties
$$P(X)M(X) = P(X)[I - P(X)] = P(X) - P(X) = 0$$
$$P(X) + M(X) = I$$
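A numerical check of these properties (a sketch, not from the notes), using a random full-column-rank $X$:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(10, 3))

P = X @ np.linalg.inv(X.T @ X) @ X.T   # projection matrix P(X)
M = np.eye(10) - P                     # residual matrix M(X)

print(np.allclose(P @ X, X))           # P(X)X = X
print(np.allclose(P @ P, P))           # idempotent
print(np.allclose(P @ M, 0))           # P(X)M(X) = 0
print(np.allclose(P + M, np.eye(10)))  # P(X) + M(X) = I
```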

For the $N \times 1$ vector $J_N = (1, 1, \ldots, 1)'$, what is $P(J_N)$?
$$J_N' J_N = N, \quad [J_N' J_N]^{-1} = \frac{1}{N}$$
$$P(J_N) = J_N [J_N' J_N]^{-1} J_N' = \frac{1}{N} J_N J_N' = \begin{bmatrix} \frac{1}{N} & \cdots & \frac{1}{N} \\ \vdots & \ddots & \vdots \\ \frac{1}{N} & \cdots & \frac{1}{N} \end{bmatrix}$$
So $P(J_N)$ is an $N \times N$ matrix of $\frac{1}{N}$'s.
Q: What does $P(J_N)x$ look like ???

What is $M(J_N)$?
$$M(J_N) = I - P(J_N) = I - \frac{1}{N} \begin{bmatrix} 1 & \cdots & 1 \\ \vdots & \ddots & \vdots \\ 1 & \cdots & 1 \end{bmatrix}$$
For $N = 3$:
$$M(J_3) = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} - \begin{bmatrix} \frac{1}{3} & \frac{1}{3} & \frac{1}{3} \\ \frac{1}{3} & \frac{1}{3} & \frac{1}{3} \\ \frac{1}{3} & \frac{1}{3} & \frac{1}{3} \end{bmatrix} = \begin{bmatrix} \frac{2}{3} & -\frac{1}{3} & -\frac{1}{3} \\ -\frac{1}{3} & \frac{2}{3} & -\frac{1}{3} \\ -\frac{1}{3} & -\frac{1}{3} & \frac{2}{3} \end{bmatrix}$$
So $M(J_N)$ demeans a vector: $M(J_3)x = x - \bar{x}$.
Q: Is there any $x$ s.t. $M(J_N)x = 0$ ???
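A small sketch (not from the notes) showing $P(J_N)$ averaging and $M(J_N)$ demeaning in action:

```python
import numpy as np

N = 3
J = np.ones((N, 1))
P = J @ np.linalg.inv(J.T @ J) @ J.T   # P(J_N) = (1/N) * ones((N, N))
M = np.eye(N) - P                      # M(J_N)

x = np.array([1.0, 2.0, 6.0])
print(P @ x)           # [3. 3. 3.]: every element replaced by x-bar
print(M @ x)           # [-2. -1.  3.]: x minus its mean
print(M @ np.ones(N))  # [0. 0. 0.]: constant vectors are annihilated
```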

4: Kronecker Product
The Kronecker product $A \otimes B$ takes $M_1 \times N_1$ and $M_2 \times N_2$ matrices and makes an $M_1 M_2 \times N_1 N_2$ matrix.
For $M_1 = N_1 = 2$:
$$A \otimes B = \begin{bmatrix} a_{11} B & a_{12} B \\ a_{21} B & a_{22} B \end{bmatrix}$$
Suppose $A = I$ and $B = P(J_N)$; what is $I \otimes P(J_N)$ ???
$$I \otimes P(J_N) = \begin{bmatrix} P(J_N) & 0 \\ 0 & P(J_N) \end{bmatrix}$$
Q: What does this do for $(x_1, x_2, x_3, x_4, x_5, x_6)$ ???
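A sketch (assumed example, not from the notes): with $N = 3$, $I_2 \otimes P(J_3)$ acts block by block on a 6-vector:

```python
import numpy as np

N = 3
P = np.ones((N, N)) / N        # P(J_3)
K = np.kron(np.eye(2), P)      # I_2 kron P(J_3): block-diagonal, 6 x 6

x = np.array([1.0, 2.0, 6.0, 10.0, 20.0, 60.0])
print(K @ x)   # [ 3.  3.  3. 30. 30. 30.]: each block replaced by its own mean
```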

5: Expectation and variance
For the random variable $x$ with $E[x] = \mu$ and $Var[x] = \sigma^2$,
the affine transformation satisfies
$$E[a + bx] = a + b\mu$$
$$Var[a + bx] = E[(a + bx - E[a + bx])^2] = E[(a + bx - a - b\mu)^2] = E[(b(x - \mu))^2] = b^2 \sigma^2$$
Q: If $z \sim N(0, \sigma^2)$, what is the distribution of $a + bz$ ???
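A Monte Carlo sketch (not from the notes) of the two affine rules; the parameter values are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma, a, b = 1.0, 2.0, 3.0, -0.5
x = rng.normal(mu, sigma, size=1_000_000)

y = a + b * x
print(y.mean(), a + b * mu)       # both ~ 2.5: E[a+bx] = a + b*mu
print(y.var(), b**2 * sigma**2)   # both ~ 1.0: Var[a+bx] = b^2 * sigma^2
```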

For a vector $A$ and a matrix $B$, and a vector random variable $x$ with $E[x] = \mu$ and $Var[x] = \Sigma$, we have
$$E[A + Bx] = A + B\mu$$
$$Var[A + Bx] = B \Sigma B'$$
Q: Suppose $\Sigma = B_1 B_1'$ and $B = B_1^{-1}$, what is $Var[A + Bx]$ ???
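A sketch (assumed example) of the vector rule; with $\Sigma = B_1 B_1'$ and $B = B_1^{-1}$, the transformed covariance comes out as the identity:

```python
import numpy as np

rng = np.random.default_rng(3)
B1 = np.array([[2.0, 0.0], [1.0, 1.0]])
Sigma = B1 @ B1.T              # Sigma = B1 B1'
B = np.linalg.inv(B1)          # B = B1^{-1}

x = rng.multivariate_normal(mean=[0, 0], cov=Sigma, size=1_000_000)
z = x @ B.T                    # each row is B x
print(np.cov(z.T))             # ~ identity: B Sigma B' = I
```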

6: FWL Theorem
The Frisch-Waugh-Lovell theorem states that for $y = \beta_1 x_1 + \beta_2 x_2 + u$,
$$\begin{pmatrix} \hat{\beta}_1 \\ \hat{\beta}_2 \end{pmatrix} = [X'X]^{-1} X'y$$
$$\hat{\beta}_2 = [x_2' M(x_1) x_2]^{-1} x_2' M(x_1) y$$
This is also written as
$$\tilde{x}_2 \equiv M(x_1) x_2, \quad \tilde{y} \equiv M(x_1) y$$
$$\hat{\beta}_2 = [\tilde{x}_2' \tilde{x}_2]^{-1} \tilde{x}_2' \tilde{y}$$
Q: What if $x_1 = J_N$ ???
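A numerical sketch (not from the notes) of the theorem, taking $x_1 = J_N$ so that $M(x_1)$ is the demeaning matrix from section 3; the data-generating values are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200
x1 = np.ones((n, 1))                 # x1 = J_N: the intercept
x2 = rng.normal(size=(n, 1))
y = 1.0 + 2.0 * x2[:, 0] + rng.normal(size=n)

# Full regression of y on (x1, x2)
X = np.hstack([x1, x2])
beta_full = np.linalg.solve(X.T @ X, X.T @ y)

# FWL: regress M(x1)-residualized y on M(x1)-residualized x2
M1 = np.eye(n) - x1 @ np.linalg.inv(x1.T @ x1) @ x1.T
x2t, yt = M1 @ x2, M1 @ y            # here M(J_N) just demeans
beta2 = np.linalg.solve(x2t.T @ x2t, x2t.T @ yt)

print(beta_full[1], beta2[0])        # the same beta2_hat both ways
```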
