
Covariance Stationary Time Series

Stochastic Process: sequence of rvs ordered by time


{Y_t} = {. . . , Y_{-1}, Y_0, Y_1, . . .}

Defn: {Y_t} is covariance stationary if

E[Y_t] = μ for all t
cov(Y_t, Y_{t-j}) = E[(Y_t - μ)(Y_{t-j} - μ)] = γ_j for all t and any j

Remarks

γ_j = jth lag autocovariance; γ_0 = var(Y_t)
ρ_j = γ_j / γ_0 = jth lag autocorrelation

Example: Independent White Noise (IWN(0, σ²))

Y_t = ε_t,  ε_t ~ iid (0, σ²)
E[Y_t] = 0, var(Y_t) = σ², γ_j = 0 for j ≠ 0

Example: Gaussian White Noise (GWN(0, σ²))

Y_t = ε_t,  ε_t ~ iid N(0, σ²)

Example: White Noise (WN(0, σ²))

Y_t = ε_t
E[ε_t] = 0, var(ε_t) = σ², cov(ε_t, ε_{t-j}) = 0 for j ≠ 0
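Numerical illustration (a Python sketch added alongside these notes; it is not part of the original text, and the seed and sample size are arbitrary): simulate GWN(0, σ²) and check that the sample mean, variance, and low-order sample autocorrelations are close to 0, σ², and 0.

import numpy as np

rng = np.random.default_rng(0)
sigma2, T = 2.0, 10_000
y = rng.normal(loc=0.0, scale=np.sqrt(sigma2), size=T)   # GWN(0, sigma^2)

def sample_autocorr(x, j):
    """Sample jth lag autocorrelation rho_j = gamma_j / gamma_0."""
    x = x - x.mean()
    gamma_0 = np.mean(x * x)
    gamma_j = np.mean(x[j:] * x[:-j]) if j > 0 else gamma_0
    return gamma_j / gamma_0

print("sample mean    :", y.mean())    # approx 0
print("sample variance:", y.var())     # approx sigma^2 = 2
print("rho_1, rho_2   :", sample_autocorr(y, 1), sample_autocorr(y, 2))  # approx 0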

Nonstationary Processes

Example: Deterministically trending process

Y_t = β_0 + β_1 t + ε_t,  ε_t ~ WN(0, σ²)

E[Y_t] = β_0 + β_1 t depends on t

Note: A simple detrending transformation yields a stationary process:

X_t = Y_t - β_0 - β_1 t = ε_t

Example: Random Walk

Y_t = Y_{t-1} + ε_t,  ε_t ~ WN(0, σ²),  Y_0 is fixed

Y_t = Y_0 + Σ_{j=1}^t ε_j  ⇒  var(Y_t) = σ² t depends on t

Note: A simple detrending transformation (first differencing) yields a stationary process:

ΔY_t = Y_t - Y_{t-1} = ε_t

Wold's Decomposition Theorem

Any covariance stationary time series {Y_t} can be represented in the form

Y_t = μ + Σ_{j=0}^∞ ψ_j ε_{t-j},  ε_t ~ WN(0, σ²)
ψ_0 = 1,  Σ_{j=0}^∞ ψ_j² < ∞

Properties:

E[Y_t] = μ

γ_0 = var(Y_t) = σ² Σ_{j=0}^∞ ψ_j²

γ_j = E[(Y_t - μ)(Y_{t-j} - μ)]
    = E[(ε_t + ψ_1 ε_{t-1} + ... + ψ_j ε_{t-j} + ...)(ε_{t-j} + ψ_1 ε_{t-j-1} + ...)]
    = σ²(ψ_j + ψ_{j+1} ψ_1 + ψ_{j+2} ψ_2 + ...)
    = σ² Σ_{k=0}^∞ ψ_k ψ_{k+j}
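Simulation sketch (illustrative only; the number of paths, horizon, and seed are arbitrary choices): generate many random-walk paths, confirm that the variance across paths at date t grows roughly like σ² t, and that first differencing recovers white-noise increments.

import numpy as np

rng = np.random.default_rng(1)
sigma2, T, n_paths = 1.0, 200, 5_000
eps = rng.normal(0.0, np.sqrt(sigma2), size=(n_paths, T))
Y = np.cumsum(eps, axis=1)                         # random walk with Y_0 = 0

t = 100
print("var(Y_t) at t = 100:", Y[:, t - 1].var())   # approx sigma^2 * t = 100
dY = np.diff(Y, axis=1)                            # Delta Y_t = eps_t
print("var(Delta Y_t)     :", dY.var())            # approx sigma^2 = 1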

Autoregressive Moving Average (ARMA) Models (Box-Jenkins 1976)

Idea: Approximate the Wold form of a stationary time series by parsimonious parametric models (stochastic difference equations).

ARMA(p,q) model:

Y_t - μ = φ_1(Y_{t-1} - μ) + ... + φ_p(Y_{t-p} - μ)
          + ε_t + θ_1 ε_{t-1} + ... + θ_q ε_{t-q}

ε_t ~ WN(0, σ²)

Lag operator notation:

φ(L)(Y_t - μ) = θ(L) ε_t

φ(L) = 1 - φ_1 L - ... - φ_p L^p
θ(L) = 1 + θ_1 L + ... + θ_q L^q

Stochastic difference equation:

φ(L) X_t = w_t
X_t = Y_t - μ,  w_t = θ(L) ε_t

ARMA(1,0) Model (1st order SDE)

Y_t = φ Y_{t-1} + ε_t,  ε_t ~ WN(0, σ²)

Solution by recursive substitution:

Y_t = φ^{t+1} Y_{-1} + φ^t ε_0 + ... + φ ε_{t-1} + ε_t
    = φ^{t+1} Y_{-1} + Σ_{i=0}^t φ^i ε_{t-i}
    = φ^{t+1} Y_{-1} + Σ_{i=0}^t ψ_i ε_{t-i},  ψ_i = φ^i

Alternatively, solving forward j periods from time t:

Y_{t+j} = φ^{j+1} Y_{t-1} + φ^j ε_t + ... + φ ε_{t+j-1} + ε_{t+j}
        = φ^{j+1} Y_{t-1} + Σ_{i=0}^j φ^i ε_{t+j-i}
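To make the recursive-substitution formula concrete, here is an illustrative Python check (not part of the original notes; φ, σ, the horizon, and the starting value are arbitrary): it simulates the AR(1) by direct recursion and verifies the closed form Y_t = φ^{t+1} Y_{-1} + Σ_{i=0}^t φ^i ε_{t-i}.

import numpy as np

rng = np.random.default_rng(2)
phi, sigma, T = 0.8, 1.0, 50
Y_init = 3.0                                 # plays the role of Y_{-1}
eps = rng.normal(0.0, sigma, size=T + 1)     # eps_0, ..., eps_T

# Direct recursion: Y_t = phi * Y_{t-1} + eps_t
Y = np.empty(T + 1)
prev = Y_init
for t in range(T + 1):
    Y[t] = phi * prev + eps[t]
    prev = Y[t]

# Closed form from recursive substitution
closed = phi ** (T + 1) * Y_init + sum(phi ** i * eps[T - i] for i in range(T + 1))
print(np.isclose(Y[T], closed))              # True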

Dynamic Multiplier:

∂Y_{t+j} / ∂ε_t = ∂Y_j / ∂ε_0 = φ^j = ψ_j

Impulse Response Function (IRF)

Plot ψ_j vs. j

Cumulative impact (up to horizon j):

Σ_{i=0}^j ψ_i

Long-run cumulative impact:

Σ_{j=0}^∞ ψ_j = ψ(1) = ψ(L) evaluated at L = 1

Stability and Stationarity Conditions

If |φ| < 1 then

lim_{j→∞} φ^j = lim_{j→∞} ψ_j = 0

and the stationary solution (Wold form) for the AR(1) becomes
Y_t = Σ_{j=0}^∞ φ^j ε_{t-j} = Σ_{j=0}^∞ ψ_j ε_{t-j},  ψ_j = φ^j

This is a stable (non-explosive) solution. Note that

ψ(1) = Σ_{j=0}^∞ φ^j = 1/(1 - φ) < ∞

If φ = 1 then

Y_t = Y_0 + Σ_{j=1}^t ε_j,  ψ_j = 1 for all j,  ψ(1) = ∞

which is not stationary or stable.
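Quick numerical check (illustrative sketch; φ = 0.75 is an arbitrary choice): for a stationary AR(1) the IRF is ψ_j = φ^j, and the cumulative impacts converge to the long-run value ψ(1) = 1/(1 - φ).

import numpy as np

phi = 0.75
j = np.arange(0, 41)
psi = phi ** j                        # IRF: psi_j = phi^j
cumulative = np.cumsum(psi)           # cumulative impact up to horizon j

print("psi_0, psi_1, psi_5    :", psi[0], psi[1], psi[5])
print("cumulative impact, j=40:", cumulative[-1])
print("long-run 1/(1 - phi)   :", 1.0 / (1.0 - phi))   # = 4.0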

AR(1) in Lag Operator Notation

(1 - φL) Y_t = ε_t

If |φ| < 1 then

(1 - φL)^{-1} = Σ_{j=0}^∞ φ^j L^j = 1 + φL + φ²L² + ...

such that

(1 - φL)^{-1}(1 - φL) = 1

Trick to find the Wold form:

Y_t = (1 - φL)^{-1}(1 - φL) Y_t = (1 - φL)^{-1} ε_t
    = Σ_{j=0}^∞ φ^j L^j ε_t
    = Σ_{j=0}^∞ φ^j ε_{t-j}
    = Σ_{j=0}^∞ ψ_j ε_{t-j},  ψ_j = φ^j
Moments of Stationary AR(1)

Mean-adjusted form:

Y_t - μ = φ(Y_{t-1} - μ) + ε_t,  ε_t ~ WN(0, σ²),  |φ| < 1

Regression form:

Y_t = c + φ Y_{t-1} + ε_t,  c = μ(1 - φ)

Trick for calculating moments: use the stationarity properties

E[Y_t] = E[Y_{t-j}] = μ for all j
cov(Y_t, Y_{t-j}) = cov(Y_{t-k}, Y_{t-k-j}) = γ_j for all k, j

Mean of AR(1):

E[Y_t] = c + φ E[Y_{t-1}] + E[ε_t]
       = c + φ E[Y_t]

⇒ E[Y_t] = μ = c / (1 - φ)

Variance of AR(1)

γ_0 = var(Y_t) = E[(Y_t - μ)²] = E[(φ(Y_{t-1} - μ) + ε_t)²]
    = φ² E[(Y_{t-1} - μ)²] + 2φ E[(Y_{t-1} - μ) ε_t] + E[ε_t²]
    = φ² γ_0 + σ²  (by stationarity)

⇒ γ_0 = σ² / (1 - φ²)

Note: From the Wold representation

γ_0 = var( Σ_{j=0}^∞ φ^j ε_{t-j} ) = σ² Σ_{j=0}^∞ φ^{2j} = σ² / (1 - φ²)

Autocovariances and Autocorrelations

Trick: multiply (Y_t - μ) by (Y_{t-j} - μ) and take expectations

γ_j = E[(Y_t - μ)(Y_{t-j} - μ)]
    = φ E[(Y_{t-1} - μ)(Y_{t-j} - μ)] + E[ε_t (Y_{t-j} - μ)]
    = φ γ_{j-1}  (by stationarity)

⇒ γ_j = φ^j γ_0 = φ^j σ² / (1 - φ²)

Autocorrelations:

ρ_j = γ_j / γ_0 = φ^j γ_0 / γ_0 = φ^j = ψ_j

Note: for the AR(1), ρ_j = ψ_j. However, this is not true for general ARMA processes.

Autocorrelation Function (ACF): plot ρ_j vs. j
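Simulation check of these moment formulas (illustrative Python sketch; φ, σ², the sample size, and the seed are arbitrary): the sample variance and sample ACF of a simulated stationary AR(1) should be close to γ_0 = σ²/(1 - φ²) and ρ_j = φ^j.

import numpy as np

rng = np.random.default_rng(3)
phi, sigma2, T = 0.7, 1.0, 100_000
eps = rng.normal(0.0, np.sqrt(sigma2), size=T)

y = np.empty(T)
y[0] = rng.normal(0.0, np.sqrt(sigma2 / (1 - phi ** 2)))   # start from the stationary distribution
for t in range(1, T):
    y[t] = phi * y[t - 1] + eps[t]

y = y - y.mean()
gamma0_hat = np.mean(y * y)
rho_hat = [np.mean(y[j:] * y[:-j]) / gamma0_hat for j in (1, 2, 3)]

print("gamma_0  sample:", gamma0_hat, " theory:", sigma2 / (1 - phi ** 2))
print("rho_1..3 sample:", np.round(rho_hat, 3), " theory:", [phi, phi ** 2, phi ** 3])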


Half-Life of AR(1): the lag at which the IRF decreases by one half:

ψ_j = φ^j = 0.5
j ln(φ) = ln(0.5)
j = ln(0.5) / ln(φ)

The half-life is a measure of the speed of mean reversion.

φ        half-life
0.99     68.97
0.9       6.58
0.75      2.41
0.5       1.00
0.25      0.50

Table 1: Half-lives for AR(1)

Application: Half-Life of Real Exchange Rates

The real exchange rate is defined as

z_t = s_t - p_t + p_t*

s_t  = log nominal exchange rate
p_t  = log of domestic price level
p_t* = log of foreign price level

Purchasing power parity (PPP) suggests that z_t should be stationary.
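The half-life formula is easy to check numerically; the sketch below (illustrative; the 0.97 "monthly real exchange rate" coefficient is a hypothetical value, not an estimate from the notes) reproduces Table 1.

import numpy as np

def half_life(phi):
    """Half-life of an AR(1): j = ln(0.5) / ln(phi)."""
    return np.log(0.5) / np.log(phi)

for phi in (0.99, 0.9, 0.75, 0.5, 0.25):
    print(phi, round(half_life(phi), 2))      # 68.97, 6.58, 2.41, 1.0, 0.5

# Hypothetical monthly AR(1) coefficient for a real exchange rate
print("phi = 0.97 -> half-life of about", round(half_life(0.97), 1), "months")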

ARMA(p,0) Model

Mean-adjusted form:

Y_t - μ = φ_1(Y_{t-1} - μ) + ... + φ_p(Y_{t-p} - μ) + ε_t
ε_t ~ WN(0, σ²)
E[Y_t] = μ

Lag operator notation:

φ(L)(Y_t - μ) = ε_t
φ(L) = 1 - φ_1 L - ... - φ_p L^p

Unobserved Components representation:

Y_t = μ + X_t
φ(L) X_t = ε_t

Regression Model formulation:

Y_t = c + φ_1 Y_{t-1} + ... + φ_p Y_{t-p} + ε_t
φ(L) Y_t = c + ε_t,  c = μ φ(1)

Stability and Stationarity Conditions

Trick: Write the pth order SDE as a 1st order vector SDE (companion form):

[ X_t       ]   [ φ_1  φ_2  ...  φ_{p-1}  φ_p ] [ X_{t-1} ]   [ ε_t ]
[ X_{t-1}   ]   [ 1    0    ...  0        0   ] [ X_{t-2} ]   [ 0   ]
[ X_{t-2}   ] = [ 0    1    ...  0        0   ] [ X_{t-3} ] + [ 0   ]
[ ...       ]   [ ...  ...       ...      ... ] [ ...     ]   [ ... ]
[ X_{t-p+1} ]   [ 0    0    ...  1        0   ] [ X_{t-p} ]   [ 0   ]

or

ξ_t  =  F  ξ_{t-1}  +  v_t
(p×1)  (p×p)(p×1)     (p×1)

Use insights from the AR(1) to study the behavior of the VAR(1):

ξ_{t+j} = F^{j+1} ξ_{t-1} + F^j v_t + ... + F v_{t+j-1} + v_{t+j}

F^j = F · F ... F  (j times)

Intuition: Stability and stationarity requires

lim_{j→∞} F^j = 0

so that the initial value has no impact on the eventual level of the series.
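Illustrative Python sketch (the helper names companion and is_stationary are ad hoc, not from the notes): build the companion matrix F for a given coefficient vector (φ_1, ..., φ_p) and check the condition that all eigenvalues of F have modulus less than one.

import numpy as np

def companion(phis):
    """Companion matrix F of an AR(p) with coefficients phi_1, ..., phi_p."""
    p = len(phis)
    F = np.zeros((p, p))
    F[0, :] = phis
    F[1:, :-1] = np.eye(p - 1)
    return F

def is_stationary(phis):
    """True if all eigenvalues of F lie strictly inside the unit circle."""
    return bool(np.all(np.abs(np.linalg.eigvals(companion(phis))) < 1.0))

print(is_stationary([0.6, 0.2]))   # True  (eigenvalues 0.84 and -0.24)
print(is_stationary([0.5, 0.5]))   # False (unit root: phi_1 + phi_2 = 1)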

Result: The ARMA(p,0) model is covariance stationary and has Wold representation

Y_t = μ + Σ_{j=0}^∞ ψ_j ε_{t-j},  ψ_0 = 1

with ψ_j = (1,1) element of F^j, provided all of the eigenvalues of F have modulus less than 1.

Example: AR(2)

X_t = φ_1 X_{t-1} + φ_2 X_{t-2} + ε_t

or

[ X_t     ]   [ φ_1  φ_2 ] [ X_{t-1} ]   [ ε_t ]
[ X_{t-1} ] = [ 1    0   ] [ X_{t-2} ] + [ 0   ]

Iterating j periods out gives

[ X_{t+j}   ]   [ φ_1  φ_2 ]^{j+1} [ X_{t-1} ]   [ φ_1  φ_2 ]^j [ ε_t ]         [ φ_1  φ_2 ] [ ε_{t+j-1} ]   [ ε_{t+j} ]
[ X_{t+j-1} ] = [ 1    0   ]       [ X_{t-2} ] + [ 1    0   ]   [ 0   ] + ... + [ 1    0   ] [ 0         ] + [ 0       ]

The first row gives X_{t+j}:

X_{t+j} = [ f_11^{(j+1)} X_{t-1} + f_12^{(j+1)} X_{t-2} ] + f_11^{(j)} ε_t + ... + f_11^{(1)} ε_{t+j-1} + ε_{t+j}

f_11^{(j)} = (1,1) element of F^j

Note:

F² = [ φ_1  φ_2 ] [ φ_1  φ_2 ]   [ φ_1² + φ_2   φ_1 φ_2 ]
     [ 1    0   ] [ 1    0   ] = [ φ_1          φ_2     ]

Finding Eigenvalues

λ is an eigenvalue of F and x is an eigenvector if

F x = λ x  ⇒  (F - λ I_p) x = 0
⇒  F - λ I_p is singular  ⇒  det(F - λ I_p) = 0

Example: AR(2)

det(F - λ I_2) = det [ φ_1 - λ   φ_2 ]
                     [ 1         -λ  ]
               = λ² - φ_1 λ - φ_2

The eigenvalues of F solve the reverse characteristic equation

λ² - φ_1 λ - φ_2 = 0

Using the quadratic formula, the roots satisfy

λ_i = ( φ_1 ± √(φ_1² + 4 φ_2) ) / 2,  i = 1, 2

These roots may be real or complex. Complex roots induce periodic behavior in Y_t. Recall, if λ_i is complex then

λ_i = a ± b i,  i = √(-1)
R = √(a² + b²) = modulus
a = R cos(θ),  b = R sin(θ)

To see why |λ_i| < 1 implies lim_{j→∞} F^j = 0, consider the AR(2) with real-valued eigenvalues. By the spectral decomposition theorem

F = T Λ T^{-1},  T^{-1} T = I_2

Λ = [ λ_1  0   ]    T = [ t_11  t_12 ]    T^{-1} = [ t^{11}  t^{12} ]
    [ 0    λ_2 ],       [ t_21  t_22 ],            [ t^{21}  t^{22} ]

Then

F^j = (T Λ T^{-1})(T Λ T^{-1}) ... (T Λ T^{-1}) = T Λ^j T^{-1}

and

lim_{j→∞} F^j = T ( lim_{j→∞} Λ^j ) T^{-1} = 0

provided |λ_1| < 1 and |λ_2| < 1.

Note:

F^j = T Λ^j T^{-1} = [ t_11  t_12 ] [ λ_1^j  0     ] [ t^{11}  t^{12} ]
                     [ t_21  t_22 ] [ 0      λ_2^j ] [ t^{21}  t^{22} ]

so that

f_11^{(j)} = (t_11 t^{11}) λ_1^j + (t_12 t^{21}) λ_2^j = c_1 λ_1^j + c_2 λ_2^j = ψ_j

where c_1 = t_11 t^{11}, c_2 = t_12 t^{21}, and c_1 + c_2 = 1. Then

lim_{j→∞} ψ_j = lim_{j→∞} ( c_1 λ_1^j + c_2 λ_2^j ) = 0

Examples of AR(2) Processes

Y_t = 0.6 Y_{t-1} + 0.2 Y_{t-2} + ε_t
φ_1 + φ_2 = 0.8 < 1

F = [ 0.6  0.2 ]
    [ 1    0   ]

The eigenvalues are found using

λ_i = ( φ_1 ± √(φ_1² + 4 φ_2) ) / 2

λ_1 = ( 0.6 + √((0.6)² + 4(0.2)) ) / 2 = 0.84
λ_2 = ( 0.6 - √((0.6)² + 4(0.2)) ) / 2 = -0.24

Then

ψ_j = c_1 (0.84)^j + c_2 (-0.24)^j

Now consider

Y_t = 0.5 Y_{t-1} - 0.8 Y_{t-2} + ε_t
φ_1 + φ_2 = -0.3 < 1

F = [ 0.5  -0.8 ]
    [ 1     0   ]

Note:

φ_1² + 4 φ_2 = (0.5)² - 4(0.8) = -2.95  ⇒  complex eigenvalues

Then

λ_i = a ± b i,  i = √(-1)
a = φ_1 / 2,  b = √( -(φ_1² + 4 φ_2) ) / 2

Here

a = 0.5 / 2 = 0.25,  b = √2.95 / 2 = 0.86
λ_i = 0.25 ± 0.86 i

modulus = R = √(a² + b²) = √((0.25)² + (0.86)²) = 0.895

Polar co-ordinate representation:

λ = a + b i  with  a = R cos(θ), b = R sin(θ)
  = R cos(θ) + R sin(θ) i = R e^{iθ}

The frequency θ satisfies

cos(θ) = a / R  ⇒  θ = cos^{-1}(a / R)

period = 2π / θ

Here,

R = 0.895
θ = cos^{-1}(0.25 / 0.895) = 1.29
period = 2π / 1.29 = 4.9

Note: the period is the length of time required for the process to repeat a full cycle.

Note: The IRF has the form

ψ_j = c_1 λ_1^j + c_2 λ_2^j,

which for complex eigenvalues behaves like R^j [cos(θ j) + sin(θ j)]: damped cycles with period 2π/θ when R < 1.
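Both examples can be verified numerically; the following illustrative sketch recovers the eigenvalues, their moduli, and (in the complex case) the frequency and period.

import numpy as np

for phi1, phi2 in [(0.6, 0.2), (0.5, -0.8)]:
    F = np.array([[phi1, phi2], [1.0, 0.0]])
    lam = np.linalg.eigvals(F)
    print("phi1, phi2 =", phi1, phi2)
    print("  eigenvalues:", np.round(lam, 3))
    print("  moduli     :", np.round(np.abs(lam), 3))
    if np.iscomplex(lam[0]):
        R = np.abs(lam[0])                      # modulus of the conjugate pair
        theta = np.arccos(lam[0].real / R)      # frequency
        print("  frequency  :", round(theta, 2), " period:", round(2 * np.pi / theta, 1))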

Stationarity Conditions on the Lag Polynomial φ(L)

Consider the AR(2) model in lag operator notation

(1 - φ_1 L - φ_2 L²) X_t = φ(L) X_t = ε_t

For any variable z, consider the characteristic equation

φ(z) = 1 - φ_1 z - φ_2 z² = 0

By the fundamental theorem of algebra

1 - φ_1 z - φ_2 z² = (1 - λ_1 z)(1 - λ_2 z)

so that

z_1 = 1/λ_1,  z_2 = 1/λ_2

are the roots of the characteristic equation. The values λ_1 and λ_2 are the eigenvalues of F.

Note: If φ_1 + φ_2 = 1 then φ(z = 1) = 1 - (φ_1 + φ_2) = 0 and z = 1 is a root of φ(z) = 0.

Result: The inverses of the roots of the characteristic equation

φ(z) = 1 - φ_1 z - φ_2 z² - ... - φ_p z^p = 0

are the eigenvalues of the companion matrix F. Therefore, the AR(p) model is stable and stationary provided the roots of φ(z) = 0 have modulus greater than unity (roots lie outside the complex unit circle).

Remarks:

1. The reverse characteristic equation for the AR(p) is

λ^p - φ_1 λ^{p-1} - φ_2 λ^{p-2} - ... - φ_{p-1} λ - φ_p = 0

This is the same polynomial equation used to find the eigenvalues of F.

2. If the AR(p) is stationary, then

φ(L) = 1 - φ_1 L - ... - φ_p L^p = (1 - λ_1 L)(1 - λ_2 L) ... (1 - λ_p L)

where |λ_i| < 1. Suppose all of the λ_i are real. Then

(1 - λ_i L)^{-1} = Σ_{j=0}^∞ λ_i^j L^j

and

1/φ(L) = (1 - λ_1 L)^{-1} ... (1 - λ_p L)^{-1}
       = ( Σ_{j=0}^∞ λ_1^j L^j )( Σ_{j=0}^∞ λ_2^j L^j ) ... ( Σ_{j=0}^∞ λ_p^j L^j )

The Wold solution for X_t may be found using

X_t = φ(L)^{-1} ε_t = ( Σ_{j=0}^∞ λ_1^j L^j )( Σ_{j=0}^∞ λ_2^j L^j ) ... ( Σ_{j=0}^∞ λ_p^j L^j ) ε_t

3. A simple algorithm exists to determine the Wold form. To illustrate, consider the AR(2) model. By definition

φ(L)^{-1} = (1 - φ_1 L - φ_2 L²)^{-1} = ψ(L),  ψ(L) = Σ_{j=0}^∞ ψ_j L^j

or

1 = φ(L) ψ(L)
1 = (1 - φ_1 L - φ_2 L²)(1 + ψ_1 L + ψ_2 L² + ...)

Collecting coefficients of powers of L gives

1 = 1 + (ψ_1 - φ_1) L + (ψ_2 - φ_1 ψ_1 - φ_2) L² + ...

Since the coefficients on all positive powers of L must equal zero, it follows that

ψ_1 = φ_1
ψ_2 = φ_1 ψ_1 + φ_2
ψ_3 = φ_1 ψ_2 + φ_2 ψ_1
...
ψ_j = φ_1 ψ_{j-1} + φ_2 ψ_{j-2}

Moments of Stationary AR(p) Model

Y_t - μ = φ_1(Y_{t-1} - μ) + ... + φ_p(Y_{t-p} - μ) + ε_t
ε_t ~ WN(0, σ²)

or

Y_t = c + φ_1 Y_{t-1} + ... + φ_p Y_{t-p} + ε_t
c = μ(1 - φ),  φ = φ_1 + φ_2 + ... + φ_p

Note: if φ = 1 then φ(1) = 1 - φ = 0 and z = 1 is a root of φ(z) = 0. In this case we say that the AR(p) process has a unit root and the process is nonstationary.

Straightforward algebra shows that

E[Y_t] = μ
γ_0 = φ_1 γ_1 + φ_2 γ_2 + ... + φ_p γ_p + σ²
γ_j = φ_1 γ_{j-1} + φ_2 γ_{j-2} + ... + φ_p γ_{j-p}
ρ_j = φ_1 ρ_{j-1} + φ_2 ρ_{j-2} + ... + φ_p ρ_{j-p}

The recursive equations for ρ_j are called the Yule-Walker equations.

Result: (γ_0, γ_1, . . . , γ_{p-1}) is determined from the first p elements of the first column of the (p² × p²) matrix

σ² [ I_{p²} - (F ⊗ F) ]^{-1}

where F is the state space companion matrix for the AR(p) model.
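These last two results are easy to verify for a particular AR(2); the sketch below (illustrative, with arbitrary parameter values) computes the Wold ψ-weights from the recursion ψ_j = φ_1 ψ_{j-1} + φ_2 ψ_{j-2}, obtains (γ_0, γ_1) from the first column of σ²[I_{p²} - (F ⊗ F)]^{-1}, and checks the Yule-Walker identity for γ_0.

import numpy as np

phi1, phi2, sigma2 = 0.6, 0.2, 1.0

# Wold psi-weights: psi_0 = 1, psi_1 = phi_1, psi_j = phi_1*psi_{j-1} + phi_2*psi_{j-2}
psi = [1.0, phi1]
for _ in range(20):
    psi.append(phi1 * psi[-1] + phi2 * psi[-2])

# Autocovariances from the companion-form result
F = np.array([[phi1, phi2], [1.0, 0.0]])
vec_Sigma = sigma2 * np.linalg.inv(np.eye(4) - np.kron(F, F))[:, 0]
gamma0, gamma1 = vec_Sigma[0], vec_Sigma[1]
gamma2 = phi1 * gamma1 + phi2 * gamma0        # Yule-Walker recursion for j = 2

print("psi_1, psi_2, psi_3      :", psi[1], psi[2], psi[3])
print("gamma_0, gamma_1, gamma_2:", gamma0, gamma1, gamma2)
print("gamma_0 identity holds   :",
      np.isclose(gamma0, phi1 * gamma1 + phi2 * gamma2 + sigma2))   # True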

ARMA(0,1) Process (MA(1) Process)

Y_t = μ + ε_t + θ ε_{t-1} = μ + θ(L) ε_t
θ(L) = 1 + θL,  ε_t ~ WN(0, σ²)

Moments:

E[Y_t] = μ

var(Y_t) = γ_0 = E[(Y_t - μ)²]
         = E[(ε_t + θ ε_{t-1})²]
         = σ²(1 + θ²)

γ_1 = E[(Y_t - μ)(Y_{t-1} - μ)]
    = E[(ε_t + θ ε_{t-1})(ε_{t-1} + θ ε_{t-2})]
    = θ σ²

ρ_1 = γ_1 / γ_0 = θ / (1 + θ²)

γ_j = 0,  j > 1

Remark: There is an identification problem for -0.5 < ρ_1 < 0.5: the values θ and 1/θ produce the same value of ρ_1. For example, θ = 0.5 and 1/θ = 2 both produce ρ_1 = 0.4.

Invertibility Condition: The MA(1) is invertible if |θ| < 1.

Result: Invertible MA models have infinite order AR representations:

(Y_t - μ) = (1 + θL) ε_t,  |θ| < 1
          = (1 - ψL) ε_t,  ψ = -θ

(1 - ψL)^{-1} (Y_t - μ) = ε_t

Σ_{j=0}^∞ (-θ)^j L^j (Y_t - μ) = ε_t

so that

ε_t = (Y_t - μ) - θ(Y_{t-1} - μ) + (-θ)² (Y_{t-2} - μ) + ...
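A brief numerical illustration (parameter values and sample size are arbitrary): the first-order autocorrelation ρ_1 = θ/(1 + θ²) is the same for θ and 1/θ, and a simulation reproduces γ_0 = σ²(1 + θ²) and ρ_1.

import numpy as np

def ma1_rho1(theta):
    """First-order autocorrelation of an MA(1): rho_1 = theta / (1 + theta^2)."""
    return theta / (1.0 + theta ** 2)

print(ma1_rho1(0.5), ma1_rho1(2.0))       # both 0.4 (identification problem)

rng = np.random.default_rng(4)
theta, sigma2, T = 0.5, 1.0, 200_000
eps = rng.normal(0.0, np.sqrt(sigma2), size=T + 1)
y = eps[1:] + theta * eps[:-1]            # Y_t - mu = eps_t + theta * eps_{t-1}
y = y - y.mean()
gamma0_hat = np.mean(y * y)
rho1_hat = np.mean(y[1:] * y[:-1]) / gamma0_hat

print("gamma_0 sample:", round(gamma0_hat, 3), " theory:", sigma2 * (1 + theta ** 2))
print("rho_1   sample:", round(rho1_hat, 3), " theory:", ma1_rho1(theta))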
