
Lecture 2: ARMA Models

Bus 41910, Autumn Quarter 2008, Mr. Ruey S. Tsay


Autoregressive Moving-Average (ARMA) models form a class of linear time series models
which are widely applicable and parsimonious in parameterization. By allowing the order
of an ARMA model to increase, one can approximate any linear time series model with
desirable accuracy. [This is similar to using rational polynomials to approximate a general
polynomial; see the impulse response function.]
In what follows, we assume that $\{a_t\}$ is a sequence of independent and identically distributed random variables with mean zero and finite variance $\sigma^2$. That is, $\{a_t\}$ is a white noise series. Sometimes, we use $\sigma^2_a$ to denote the variance of $a_t$.
A. Autoregressive (AR) Processes
1. AR(p) Model: $Z_t - \phi_1 Z_{t-1} - \cdots - \phi_p Z_{t-p} = c + a_t$ or $\phi(B)Z_t = c + a_t$, where $c$ is a constant. If $Z_t$ is stationary with mean $E(Z_t) = \mu$, then the model can be written as $\phi(B)(Z_t - \mu) = a_t$. The latter is the parameterization implemented in the R package.
2. Stationarity: All zeros of the polynomial $\phi(B)$ lie outside the unit circle. (Why?)
3. Moments: (Assume stationarity)

Mean: Taking expectation of both sides of the model equation, we have
\[
E(Z_t) - \phi_1 E(Z_{t-1}) - \cdots - \phi_p E(Z_{t-p}) = c + E(a_t),
\]
\[
\mu - \phi_1 \mu - \cdots - \phi_p \mu = c, \qquad
(1 - \phi_1 - \cdots - \phi_p)\mu = c, \qquad
\mu = \frac{c}{1 - \phi_1 - \cdots - \phi_p}.
\]
This relation between $c$ and $\mu$ is important for stationary time series.

Autocovariance function: For simplicity, assume $c = \mu = 0$. Multiplying both sides of the AR model by $Z_{t-\ell}$ and taking expectations, we have
\[
E[(Z_t - \phi_1 Z_{t-1} - \cdots - \phi_p Z_{t-p})Z_{t-\ell}] = E(Z_{t-\ell} a_t).
\]
For $\ell = 0$, $\gamma_0 - \phi_1 \gamma_1 - \cdots - \phi_p \gamma_p = \sigma^2$.
For $\ell > 0$, $\gamma_\ell - \phi_1 \gamma_{\ell-1} - \cdots - \phi_p \gamma_{\ell-p} = 0$.
Autocorrelation function (ACF): $\rho_k = \gamma_k/\gamma_0$. From the autocovariance function, we have
\[
\rho_\ell - \phi_1 \rho_{\ell-1} - \cdots - \phi_p \rho_{\ell-p} =
\begin{cases}
\sigma^2/\gamma_0 & \text{for } \ell = 0 \\
0 & \text{for } \ell > 0.
\end{cases}
\]
Since the ACFs satisfy the $p$-th order difference equation, namely $\phi(B)\rho_\ell = 0$ for $\ell > 0$, they decay exponentially to zero as $\ell \to \infty$.
4. Yule-Walker equation: Considering the above equations jointly for $\ell = 1, \ldots, p$, we have
\[
\begin{pmatrix} \rho_1 \\ \rho_2 \\ \vdots \\ \rho_p \end{pmatrix}
=
\begin{pmatrix}
1 & \rho_1 & \rho_2 & \cdots & \rho_{p-2} & \rho_{p-1} \\
\rho_1 & 1 & \rho_1 & \cdots & \rho_{p-3} & \rho_{p-2} \\
\vdots & \vdots & \vdots & & \vdots & \vdots \\
\rho_{p-1} & \rho_{p-2} & \rho_{p-3} & \cdots & \rho_1 & 1
\end{pmatrix}
\begin{pmatrix} \phi_1 \\ \phi_2 \\ \vdots \\ \phi_p \end{pmatrix},
\]
which is the $p$-th order Yule-Walker equation. For given $\rho_\ell$'s, the equation can be used to solve for the $\phi_i$'s. Of course, we can obtain the $\gamma_\ell$'s for given $\phi_i$'s and $\sigma^2$ via the moment generating function or the $\psi$-weight representation to be discussed shortly.
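As a numerical illustration (not part of the original notes), a minimal Python/NumPy sketch that solves the Yule-Walker system for the AR coefficients; the autocorrelation values used are made up for the example.

```python
import numpy as np

# Hypothetical autocorrelations rho_1, ..., rho_p (illustrative values only).
rho = np.array([0.6, 0.3])
p = len(rho)

# Correlation matrix with (i, j) entry rho_{|i-j|}, using rho_0 = 1.
acf = np.concatenate(([1.0], rho))
R = np.array([[acf[abs(i - j)] for j in range(p)] for i in range(p)])

# Solve the p-th order Yule-Walker system R * phi = rho for phi.
phi = np.linalg.solve(R, rho)
print(phi)
```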
5. MA representation:
\[
Z_t = \frac{1}{\phi(B)} a_t = \sum_{i=0}^{\infty} \psi_i a_{t-i},
\]
where the $\psi_i$'s are referred to as the $\psi$-weights of the model. These $\psi$-weights are also called the impulse response function and can be obtained from the $\phi_i$'s by equating the coefficients of $B^i$ in
\[
\frac{1}{\phi(B)} = 1 + \psi_1 B + \psi_2 B^2 + \cdots, \qquad
1 = (1 - \phi_1 B - \cdots - \phi_p B^p)(1 + \psi_1 B + \psi_2 B^2 + \cdots).
\]
Thus, we have
\[
\begin{aligned}
\psi_1 &= \phi_1 \\
\psi_2 &= \phi_1 \psi_1 + \phi_2 \\
\psi_3 &= \phi_1 \psi_2 + \phi_2 \psi_1 + \phi_3 \\
&\;\vdots \\
\psi_p &= \phi_1 \psi_{p-1} + \phi_2 \psi_{p-2} + \cdots + \phi_{p-1}\psi_1 + \phi_p \\
\psi_\ell &= \sum_{i=1}^{p} \phi_i \psi_{\ell-i} \quad \text{for } \ell > p.
\end{aligned}
\]
Again, the $\psi$-weights satisfy the difference equation $\phi(B)\psi_\ell = 0$ for $\ell > p$, so that they also decay exponentially to zero as $\ell$ goes to infinity.
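The recursion above is straightforward to implement. A minimal NumPy sketch (illustrative, with hypothetical AR(2) coefficients) that computes the first few $\psi$-weights and shows their decay:

```python
import numpy as np

def ar_psi_weights(phi, n):
    """psi-weights of an AR(p) model: psi_0 = 1, psi_k = sum_{i=1}^{min(k,p)} phi_i * psi_{k-i}."""
    p = len(phi)
    psi = np.zeros(n + 1)
    psi[0] = 1.0
    for k in range(1, n + 1):
        psi[k] = sum(phi[i - 1] * psi[k - i] for i in range(1, min(k, p) + 1))
    return psi

# Hypothetical AR(2) coefficients; the weights decay toward zero as expected.
print(ar_psi_weights([0.5, 0.3], 10))
```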
6. The moment generating function:
\[
\gamma(z) = \frac{\sigma^2}{\phi(z)\phi(z^{-1})}.
\]
7. A simple example: the AR(1) case
\[
Z_t - \phi_1 Z_{t-1} = c + a_t.
\]
Stationarity condition: $|\phi_1| < 1$.
Mean: $\mu = \dfrac{c}{1 - \phi_1}$.
ACF: $\rho_\ell = \phi_1^{\ell}$.
Yule-Walker equation: $\rho_1 = \phi_1 \rho_0 = \phi_1$.
MA representation:
\[
Z_t = \frac{c}{1 - \phi_1} + a_t + \phi_1 a_{t-1} + \phi_1^2 a_{t-2} + \phi_1^3 a_{t-3} + \cdots,
\]
so that the $\psi$-weights are $\psi_\ell = \phi_1^{\ell}$.
The variance of $Z_t$: $\mathrm{Var}(Z_t) = \dfrac{\sigma^2}{1 - \phi_1^2}$.
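A small simulation (a sketch, not from the notes, with hypothetical parameter values) makes these AR(1) results concrete: generate a long AR(1) series and compare the sample mean and lag-1 sample autocorrelation with $c/(1-\phi_1)$ and $\phi_1$.

```python
import numpy as np

rng = np.random.default_rng(0)
phi1, c, n = 0.7, 1.0, 100_000   # hypothetical parameter values
a = rng.normal(0.0, 1.0, n)

z = np.zeros(n)
for t in range(1, n):
    z[t] = c + phi1 * z[t - 1] + a[t]

zc = z - z.mean()
rho1 = np.sum(zc[1:] * zc[:-1]) / np.sum(zc ** 2)
print(z.mean(), c / (1 - phi1))   # sample mean vs. mu = c / (1 - phi_1)
print(rho1, phi1)                 # lag-1 sample ACF vs. phi_1
```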
8. An AR(2) model: $(1 - \phi_1 B - \phi_2 B^2)Z_t = c + a_t$.
Stationarity condition: Zeros of $\phi(B)$ are outside the unit circle.
Mean: $\mu = \dfrac{c}{1 - \phi_1 - \phi_2}$.
ACF: $\rho_0 = 1$, $\rho_1 = \phi_1/(1 - \phi_2)$, and $\rho_j = \phi_1 \rho_{j-1} + \phi_2 \rho_{j-2}$ for $j > 1$. Why?
Yule-Walker equation:
\[
\rho_1 = \phi_1 + \phi_2 \rho_1, \qquad \rho_2 = \phi_1 \rho_1 + \phi_2,
\]
or equivalently,
\[
\begin{pmatrix} 1 & \rho_1 \\ \rho_1 & 1 \end{pmatrix}
\begin{pmatrix} \phi_1 \\ \phi_2 \end{pmatrix}
=
\begin{pmatrix} \rho_1 \\ \rho_2 \end{pmatrix}.
\]
MA representation:
\[
Z_t = \frac{c}{1 - \phi_1 - \phi_2} + a_t + \psi_1 a_{t-1} + \psi_2 a_{t-2} + \psi_3 a_{t-3} + \cdots,
\]
where $1 + \psi_1 B + \psi_2 B^2 + \cdots = \dfrac{1}{1 - \phi_1 B - \phi_2 B^2}$.
The variance of $Z_t$ can be obtained from the moment equations as
\[
\mathrm{Var}(Z_t) = \frac{(1 - \phi_2)\,\sigma^2}{(1 + \phi_2)\,[(1 - \phi_2)^2 - \phi_1^2]}.
\]
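As a sketch (not part of the notes, with hypothetical coefficients), the AR(2) stationarity condition can be checked numerically from the roots of $\phi(B)$, and the theoretical ACF generated from the recursion $\rho_j = \phi_1\rho_{j-1} + \phi_2\rho_{j-2}$:

```python
import numpy as np

phi1, phi2 = 1.0, -0.25   # hypothetical AR(2) coefficients

# Stationarity: zeros of phi(B) = 1 - phi1*B - phi2*B^2 must lie outside the unit circle.
roots = np.roots([-phi2, -phi1, 1.0])   # coefficients in decreasing powers of B
print(np.abs(roots), np.all(np.abs(roots) > 1))

# Theoretical ACF from the recursion, starting from rho_0 = 1 and rho_1 = phi1/(1 - phi2).
rho = [1.0, phi1 / (1 - phi2)]
for j in range(2, 11):
    rho.append(phi1 * rho[-1] + phi2 * rho[-2])
print(np.round(rho, 4))
```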
B. Moving-Average (MA) Processes
1. MA(q) Model: $Z_t = c + a_t - \theta_1 a_{t-1} - \cdots - \theta_q a_{t-q}$ or $Z_t = c + \theta(B)a_t$.
2. Stationarity: Finite order MA models are stationary.
3. Mean: By taking expectation on both sides, we obtain
\[
E(Z_t) = c + E(a_t) - \theta_1 E(a_{t-1}) - \cdots - \theta_q E(a_{t-q}) = c.
\]
Thus, for MA models, the constant term $c$ is the mean of the process.
4. Invertibility: Can we write an MA model as an AR model? Consider
\[
\frac{1}{\theta(B)}(Z_t - \mu) = a_t.
\]
Let $\pi(B) = 1 - \pi_1 B - \pi_2 B^2 - \cdots = \dfrac{1}{\theta(B)}$. We call the $\pi$'s the $\pi$-weights of the process $Z_t$. Similar to $\dfrac{1}{\phi(B)}$, for the above AR representation to be meaningful we require that
\[
\sum_{i=1}^{\infty} \pi_i^2 < \infty.
\]
The necessary and sufficient condition for such a convergent $\pi$-weight sequence is that all the zeros of $\theta(B)$ lie outside the unit circle.
5. Autocovariance function: Again for simplicity, assume $\mu = 0$. Multiplying both sides by $Z_{t-\ell}$ and taking expectations, we have
\[
\gamma_\ell =
\begin{cases}
(1 + \theta_1^2 + \cdots + \theta_q^2)\,\sigma^2 & \text{for } \ell = 0 \\
(-\theta_\ell + \theta_1\theta_{\ell+1} + \cdots + \theta_{q-\ell}\theta_q)\,\sigma^2 & \text{for } \ell = 1, \ldots, q \\
0 & \text{for } \ell > q.
\end{cases}
\]
This shows that for an MA(q) process, the autocovariance function $\gamma_\ell$ is zero for $\ell > q$. This is a special feature of MA processes and it provides a convenient way to identify an MA model in practice.
6. Autocorrelation function: $\rho_\ell = \gamma_\ell/\gamma_0$. Again, from the autocovariance function we have
\[
\rho_\ell
\begin{cases}
\neq 0 & \text{for } \ell = q \\
= 0 & \text{for } \ell > q.
\end{cases}
\]
In other words, the ACF has only a finite number of non-zero lags. Thus, for an MA(q) model, $Z_t$ and $Z_{t-\ell}$ are uncorrelated provided that $\ell > q$. For this reason, MA models are referred to as short memory time series.
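A quick simulation sketch (not from the notes, with hypothetical coefficients) illustrating the cut-off property: the sample ACF of a simulated MA(2) series is clearly non-zero at lags 1 and 2 and close to zero beyond lag 2.

```python
import numpy as np

rng = np.random.default_rng(1)
theta1, theta2, n = 0.6, -0.3, 100_000   # hypothetical MA(2) coefficients
a = rng.normal(0.0, 1.0, n + 2)

# Z_t = a_t - theta1 * a_{t-1} - theta2 * a_{t-2}
z = a[2:] - theta1 * a[1:-1] - theta2 * a[:-2]
zc = z - z.mean()

def sample_acf(x, lag):
    return np.sum(x[lag:] * x[:-lag]) / np.sum(x ** 2)

# Sample ACF at lags 1-5: noticeable at lags 1 and 2, near zero afterwards.
print([round(sample_acf(zc, k), 3) for k in range(1, 6)])
```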
7. The moment generating function:
\[
\gamma(z) = \sigma^2\, \theta(z)\theta(z^{-1}).
\]
8. A simple example: the MA(1) case
\[
Z_t = c + a_t - \theta a_{t-1}.
\]
Invertibility condition: $|\theta| < 1$.
Mean: $E(Z_t) = \mu = c$.
ACF:
\[
\rho_\ell =
\begin{cases}
1 & \text{for } \ell = 0 \\
\dfrac{-\theta}{1 + \theta^2} & \text{for } \ell = 1 \\
0 & \text{for } \ell > 1.
\end{cases}
\]
The above result shows that $|\rho_1| \le 0.5$ for an MA(1) model.
Variance: $\gamma_0 = (1 + \theta^2)\sigma^2$.
Moment generating function:
\[
\gamma(z) = [(1 + \theta^2) - \theta(z + z^{-1})]\,\sigma^2.
\]
From the MGF, we have $\gamma_0 = (1 + \theta^2)\sigma^2$, $\gamma_1 = -\theta\sigma^2$, and $\gamma_j = 0$ for $j \ge 2$.
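A short numerical check (illustrative sketch) of the bound $|\rho_1| \le 0.5$: evaluate $\rho_1 = -\theta/(1+\theta^2)$ over a grid of $\theta$ values.

```python
import numpy as np

theta = np.linspace(-5.0, 5.0, 1001)
rho1 = -theta / (1.0 + theta ** 2)

# The extreme values are +/- 0.5, attained at theta = -1 and theta = 1.
print(rho1.min(), rho1.max())
print(np.abs(rho1).max() <= 0.5 + 1e-12)
```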
C. Mixed ARMA Processes
1. ARMA(p, q) Model: $Z_t - \phi_1 Z_{t-1} - \cdots - \phi_p Z_{t-p} = c + a_t - \theta_1 a_{t-1} - \cdots - \theta_q a_{t-q}$ or $\phi(B)Z_t = c + \theta(B)a_t$. Again, for a stationary process, we can rewrite the model as $\phi(B)(Z_t - \mu) = \theta(B)a_t$.
2. Stationarity: All zeros of $\phi(B)$ lie outside the unit circle.
3. Invertibility: All zeros of $\theta(B)$ lie outside the unit circle.
4. AR representation: $\pi(B)Z_t = \dfrac{c}{\theta(1)} + a_t$, where
\[
\pi(B) = \frac{\phi(B)}{\theta(B)}.
\]
The $\pi$-weight $\pi_i$ can be obtained by equating the coefficients of $B^i$ in $\pi(B)\theta(B) = \phi(B)$.
5. MA representation: $Z_t = \dfrac{c}{\phi(1)} + \psi(B)a_t$, where
\[
\psi(B) = \frac{\theta(B)}{\phi(B)}.
\]
Again, the $\psi$-weights can be obtained by equating coefficients.
Note that $\pi(B)\psi(B) = 1$ for all $B$ for an ARMA model. This identity has many applications.
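Equating coefficients in $\phi(B)\psi(B) = \theta(B)$ gives the recursion $\psi_0 = 1$ and $\psi_j = \sum_{i=1}^{\min(j,p)} \phi_i \psi_{j-i} - \theta_j$, with $\theta_j = 0$ for $j > q$. A minimal NumPy sketch (not part of the notes), checked against hypothetical ARMA(1,1) parameters:

```python
import numpy as np

def arma_psi_weights(phi, theta, n):
    """psi-weights from phi(B) psi(B) = theta(B):
       psi_0 = 1, psi_j = sum_{i=1}^{min(j,p)} phi_i * psi_{j-i} - theta_j (theta_j = 0 for j > q)."""
    psi = np.zeros(n + 1)
    psi[0] = 1.0
    for j in range(1, n + 1):
        acc = -theta[j - 1] if j <= len(theta) else 0.0
        for i in range(1, min(j, len(phi)) + 1):
            acc += phi[i - 1] * psi[j - i]
        psi[j] = acc
    return psi

# Hypothetical ARMA(1,1) with phi = 0.7, theta = 0.4:
# the weights should equal psi_i = 0.7**(i-1) * (0.7 - 0.4) for i >= 1.
print(arma_psi_weights([0.7], [0.4], 5))
```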
6. Moments: (Assume stationarity)

Mean: $\mu = \dfrac{c}{1 - \phi_1 - \cdots - \phi_p}$.

Autocovariance function: (Assume $\mu = 0$) Using the result
\[
E(Z_t a_{t-\ell}) =
\begin{cases}
\sigma^2 & \text{for } \ell = 0 \\
\psi_\ell \sigma^2 & \text{for } \ell > 0 \\
0 & \text{for } \ell < 0,
\end{cases}
\]
and the same technique as before, we have
\[
\gamma_\ell - \phi_1 \gamma_{\ell-1} - \cdots - \phi_p \gamma_{\ell-p} =
\begin{cases}
(1 - \theta_1\psi_1 - \cdots - \theta_q\psi_q)\,\sigma^2 & \text{for } \ell = 0 \\
-(\theta_\ell + \theta_{\ell+1}\psi_1 + \cdots + \theta_q\psi_{q-\ell})\,\sigma^2 & \text{for } \ell = 1, \ldots, q \\
0 & \text{for } \ell \ge q + 1,
\end{cases}
\]
where $\psi_0 = 1$ and $\theta_j = 0$ for $j > q$.
Autocorrelation function: $\rho_\ell$ satisfies
\[
\rho_\ell - \phi_1 \rho_{\ell-1} - \cdots - \phi_p \rho_{\ell-p} = 0 \quad \text{for } \ell > q.
\]
You may think of the ACF as satisfying the difference equation $\phi(B)\rho_\ell = 0$ for $\ell \ge q + 1$, with $\rho_1, \ldots, \rho_q$ as initial conditions.
7. Generalized Yule-Walker Equation: Considering the above equations of the ACF for $\ell = q+1, \ldots, q+p$, we have
\[
\begin{pmatrix} \rho_{q+1} \\ \rho_{q+2} \\ \vdots \\ \rho_{q+p} \end{pmatrix}
=
\begin{pmatrix}
\rho_q & \rho_{q-1} & \cdots & \rho_{q+2-p} & \rho_{q+1-p} \\
\rho_{q+1} & \rho_q & \cdots & \rho_{q+3-p} & \rho_{q+2-p} \\
\vdots & \vdots & & \vdots & \vdots \\
\rho_{q+p-1} & \rho_{q+p-2} & \cdots & \rho_{q+1} & \rho_q
\end{pmatrix}
\begin{pmatrix} \phi_1 \\ \phi_2 \\ \vdots \\ \phi_p \end{pmatrix},
\]
which is referred to as a $p$-th order generalized Yule-Walker equation for the ARMA(p, q) process. It can be used to solve for the $\phi_i$'s given the ACF $\rho_i$'s.
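A small solver sketch (not from the notes): build the $p \times p$ matrix with $(i, j)$ entry $\rho_{q+i-j}$ and solve for the $\phi_i$'s. As a sanity check, with $q = 0$ and the ACF of an AR(1), $\rho_k = \phi^k$, the solver should return $\phi$.

```python
import numpy as np

def generalized_yule_walker(rho, p, q):
    """Solve the p-th order generalized Yule-Walker system for (phi_1, ..., phi_p).
    rho[k] holds the autocorrelation at lag k (rho[0] = 1), available up to lag q + p."""
    acf = lambda k: rho[abs(k)]                     # rho_{-k} = rho_k
    lhs = np.array([rho[q + i] for i in range(1, p + 1)])
    R = np.array([[acf(q + i - j) for j in range(1, p + 1)] for i in range(1, p + 1)])
    return np.linalg.solve(R, lhs)

# Sanity check with a pure AR(1): rho_k = phi**k and q = 0.
phi_true = 0.8
rho = [phi_true ** k for k in range(6)]
print(generalized_yule_walker(rho, p=1, q=0))   # approximately [0.8]
```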
8. Moment generating function:
\[
\gamma(z) = \frac{\theta(z)\theta(z^{-1})}{\phi(z)\phi(z^{-1})}\,\sigma^2.
\]
9. A simple example: The ARMA(1,1) case
\[
Z_t - \phi Z_{t-1} = c + a_t - \theta a_{t-1}.
\]
Stationarity condition: $|\phi| < 1$.
Invertibility condition: $|\theta| < 1$.
Mean: $\mu = \dfrac{c}{1 - \phi}$.
Variance: $\gamma_0 = \dfrac{\sigma^2(1 + \theta^2 - 2\phi\theta)}{1 - \phi^2}$.
ACF:
\[
\rho_1 = \frac{(1 - \phi\theta)(\phi - \theta)}{1 + \theta^2 - 2\phi\theta}, \qquad
\rho_\ell = \phi\,\rho_{\ell-1} \quad \text{for } \ell > 1.
\]
10. AR representation:
\[
Z_t = \frac{c}{1 - \theta} + \pi_1 Z_{t-1} + \pi_2 Z_{t-2} + \cdots + a_t,
\]
where $\pi_i = \theta^{i-1}(\phi - \theta)$.
11. MA representation:
\[
Z_t = \frac{c}{1 - \phi} + a_t + \psi_1 a_{t-1} + \psi_2 a_{t-2} + \cdots,
\]
where $\psi_i = \phi^{i-1}(\phi - \theta)$.
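A brief numerical check (a sketch with hypothetical parameter values): the $\pi$- and $\psi$-weights above must satisfy $\pi(B)\psi(B) = 1$, which can be verified by convolving truncated coefficient sequences.

```python
import numpy as np

phi, theta, n = 0.7, 0.4, 50   # hypothetical ARMA(1,1) parameters
i = np.arange(1, n)

# psi(B) = 1 + psi_1 B + psi_2 B^2 + ...,  psi_i = phi**(i-1) * (phi - theta)
psi = np.concatenate(([1.0], phi ** (i - 1) * (phi - theta)))
# pi(B)  = 1 - pi_1 B - pi_2 B^2 - ...,    pi_i  = theta**(i-1) * (phi - theta)
pi_coef = np.concatenate(([1.0], -(theta ** (i - 1)) * (phi - theta)))

# Coefficients of pi(B) * psi(B); apart from truncation error they should be (1, 0, 0, ...).
product = np.convolve(pi_coef, psi)[:n]
print(np.round(product[:6], 6))
```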
1 Illustrative Examples

Some examples of ARMA models are given below with demonstration.
- U.S. quarterly growth rate of GDP.
- Daily returns of the US-EU exchange rate.
- Chemical concentration readings, Series A, of Box and Jenkins (1976).