
EM2511: Econometrics II Spring 2014

Lecture 3
Instructor: Prof. Chuang Date: April 22
Disclaimer: These notes have not been subjected to the usual scrutiny reserved for formal publications.
They may be distributed outside this class only with the permission of the Instructor.
Outline:
- MA process
- ARMA process
MA Process
MA(1)
If we consider a process X_t defined by
\[
x_t = \varepsilon_t - \theta\,\varepsilon_{t-1}, \tag{3.1}
\]
where ε_t ~ WN(0, σ_ε²) for all t, then X_t is a linear function of the present and previous disturbances. The process X_t is called a moving-average process of order one, denoted by MA(1). The autocovariances of Equation (3.1) are
\[
\gamma_0 = (1 + \theta^2)\,\sigma_\varepsilon^2; \tag{3.2}
\]
\[
\gamma_1 = -\theta\,\sigma_\varepsilon^2; \tag{3.3}
\]
\[
\gamma_2 = \gamma_3 = \cdots = 0, \tag{3.4}
\]
which give the autocorrelation coefficients as
\[
\rho_1 = \frac{-\theta}{1 + \theta^2}; \tag{3.5}
\]
\[
\rho_2 = \rho_3 = \cdots = 0. \tag{3.6}
\]
Proof: Since E(x_t) = 0 and the ε's are mutually uncorrelated with common variance σ_ε²,
\[
\gamma_0 = E(x_t^2) = E(\varepsilon_t - \theta\varepsilon_{t-1})^2 = \sigma_\varepsilon^2 + \theta^2\sigma_\varepsilon^2 = (1 + \theta^2)\,\sigma_\varepsilon^2,
\]
\[
\gamma_1 = E(x_t x_{t-1}) = E\left[(\varepsilon_t - \theta\varepsilon_{t-1})(\varepsilon_{t-1} - \theta\varepsilon_{t-2})\right] = -\theta\,\sigma_\varepsilon^2,
\]
and for k ≥ 2 the terms x_t and x_{t-k} share no common disturbance, so γ_k = 0. Dividing each autocovariance by γ_0 yields (3.5) and (3.6).
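As a numerical check (a sketch; the value θ = 0.3 anticipates the simulation example below), R's built-in ARMAacf() reproduces these theoretical autocorrelations:
ARMAacf(ma = -0.3, lag.max = 3)   # R's sign convention: ma = -0.3 corresponds to x_t = e_t - 0.3 e_{t-1}
## lag 1: -0.3 / (1 + 0.3^2) = -0.2752; lags 2 and 3: 0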
MA(1) and AR(∞)
The MA(1) process may be inverted to give ε_t as an infinite series in x_t, x_{t-1}, ..., namely,
\[
\varepsilon_t = x_t + \theta x_{t-1} + \theta^2 x_{t-2} + \cdots, \tag{3.7}
\]
that is,
\[
x_t = -\theta x_{t-1} - \theta^2 x_{t-2} - \cdots + \varepsilon_t. \tag{3.8}
\]
Because this is an AR(∞) series, the partial autocorrelations do not cut off but damp toward zero, while the autocorrelations are zero after the first lag. The patterns of a pure MA process are thus the converse of those of a pure AR process: the ACF of an MA process cuts off after the order of the process, and the PACF damps toward zero. Equation (3.8) makes sense only if |θ| < 1. This condition is known as the invertibility condition.
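As a small numerical illustration of the inversion (a sketch; θ = 0.3 and the truncation at 50 lags are arbitrary choices), the truncated series in Equation (3.7) recovers the disturbance ε_t from the observed x's:
set.seed(1)
theta <- 0.3
e <- rnorm(300)                                         # the true disturbances
x <- e - theta * c(0, head(e, -1))                      # x_t = e_t - theta * e_{t-1}
J <- 50                                                 # truncation point; theta^J is negligible
e_hat <- sum(theta^(0:(J - 1)) * x[300:(300 - J + 1)])  # e_t ~ sum_j theta^j x_{t-j}
c(true = e[300], recovered = e_hat)                     # the two values agree closely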
Figure 3.1: MA(1) Plot
Numerical Illustration
Simulate 200 observations from the MA(1) process
\[
x_t = \varepsilon_t - 0.3\,\varepsilon_{t-1}. \tag{3.9}
\]
set.seed(928)
x <- rep(NA, 200)
x[1] <- 0                                       # initialize; v[0] is unavailable
v <- rnorm(200, 0, 1)                           # white-noise disturbances
for (t in 2:200) x[t] <- v[t] - 0.3 * v[t - 1]  # x_t = v_t - 0.3 v_{t-1}
X1 <- ts(x)
ts.plot(X1, ylab = "x", lty = 6, lwd = 2, col = "blue4", main = "MA(1) Process")
windows()                                       # open a new plotting device (Windows only)
acf(X1, lag.max = 20, main = "Autocorrelations of MA(1)")
windows()
pacf(X1, lag.max = 20, main = "Partial Autocorrelations of MA(1)")
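Equivalently, the series can be simulated in one call with stats::arima.sim (a sketch; note that R writes the MA polynomial with a plus sign, so the θ = 0.3 in Equation (3.9) enters as ma = -0.3):
X2 <- arima.sim(model = list(ma = -0.3), n = 200)   # same MA(1) process as the loop above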
Fitting and Forecasting
ma1 <- arima(X1, order = c(0, 0, 1))   # MA(1) is order (p, d, q) = (0, 0, 1), not (1, 0, 0)
ma1
predict(ma1, n.ahead = 5)              # forecast 5 steps ahead from the fitted model, not the raw series
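A natural follow-up, sketched here as a suggestion rather than part of the original notes: test whether the residuals of the fitted model are white noise with a Ljung-Box test (fitdf = 1 accounts for the one estimated MA coefficient).
Box.test(residuals(ma1), lag = 10, type = "Ljung-Box", fitdf = 1)   # a large p-value is consistent with white-noise residuals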
Figure 3.2: Autocorrelations of MA(1)
Figure 3.3: Partial Autocorrelations of MA(1)
Identify the orders of AR and MA processes
Therefore, it is fairly easy to distinguish between pure AR and pure MA processes on the basis of the cutoff in the PACF or ACF. The expected theoretical patterns for the various processes are summarized in Table 3.1.

Table 3.1: Correlation patterns
Process   ACF                             PACF
AR(p)     Infinite: damps out             Finite: cuts off after lag p
MA(q)     Finite: cuts off after lag q    Infinite: damps out
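These theoretical patterns can be inspected directly with stats::ARMAacf (a sketch; the coefficients 0.7 and -0.3 are arbitrary illustrative values):
ARMAacf(ar = 0.7, lag.max = 8)                # AR(1) ACF: damps out geometrically
ARMAacf(ar = 0.7, lag.max = 8, pacf = TRUE)   # AR(1) PACF: cuts off after lag 1
ARMAacf(ma = -0.3, lag.max = 8)               # MA(1) ACF: cuts off after lag 1
ARMAacf(ma = -0.3, lag.max = 8, pacf = TRUE)  # MA(1) PACF: damps toward zero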
ARMA process
A compact form for flexible models.
ARMA(1,1)
It is natural to consider the combination of the AR(1) and MA(1) models as
\[
y_t = \delta + \alpha\,y_{t-1} + u_t - \theta\,u_{t-1}. \tag{3.10}
\]
This mixed model combines an AR(1) and an MA(1) on the RHS, and we refer to it as ARMA(1,1). The properties of the ARMA(1,1) process are:
- Stationarity: same as AR(1).
- Invertibility: same as MA(1).
- Mean: as AR(1), i.e. E(y_t) = δ/(1 − α).
- Variance: Var(y_t) = γ_0 = [(1 − 2αθ + θ²)/(1 − α²)] σ_u². This is where the AR(1) and ARMA(1,1) models differ.
- PACF: does not cut off at finite lags.
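A quick numerical check of the variance formula (a sketch; α = 0.5, θ = 0.3, and σ_u = 1 are arbitrary illustrative values):
alpha <- 0.5; theta <- 0.3
gamma0 <- (1 - 2 * alpha * theta + theta^2) / (1 - alpha^2)    # theoretical Var(y_t) with sigma_u = 1
y <- arima.sim(model = list(ar = alpha, ma = -theta), n = 1e5) # R's sign convention: ma = -theta
c(theoretical = gamma0, simulated = var(y))                    # the two should agree closely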
ARMA(p,q)
The general ARMA(p,q) process is
\[
A(L)\,(y_t - \mu) = B(L)\,u_t, \tag{3.11}
\]
where A(L) = 1 − α_1 L − α_2 L² − ⋯ − α_p L^p and B(L) = 1 − θ_1 L − θ_2 L² − ⋯ − θ_q L^q. Both p and q are positive.
We can rewrite Equation (3.11) as
\[
y_t - \mu = \sum_{i=1}^{p} \alpha_i\,(y_{t-i} - \mu) + u_t - \sum_{j=1}^{q} \theta_j\,u_{t-j}, \tag{3.12}
\]
with α_p ≠ 0 and θ_q ≠ 0. The ARMA(p,q) process is stationary if all the solutions of
\[
\alpha_1 z + \alpha_2 z^2 + \cdots + \alpha_p z^p = 1
\]
lie outside the unit circle |z| = 1; here z denotes a complex number.
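This is equivalent to all roots of A(z) = 0 lying outside the unit circle, which is easy to check numerically. A minimal sketch, assuming illustrative AR coefficients α_1 = 1.2 and α_2 = −0.35 (so A(z) = 1 − 1.2z + 0.35z²); stats::polyroot() takes the coefficients in increasing powers of z:
all(Mod(polyroot(c(1, -1.2, 0.35))) > 1)   # TRUE: both roots of A(z) = 0 lie outside the unit circle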
ARMA Modeling
There are four steps in ARMA modeling (an illustrative sketch in R follows the footnote below):
1. Check the series for stationarity; if it is not stationary, transform the series to induce stationarity.
2. Choose a few ARMA specifications for estimation.
3. Test in order to arrive at a preferred specification with white-noise residuals.¹
4. Evaluate the forecasting performance over a relevant time horizon from the preferred specification.

¹ For identifying an ARMA(p,q) process, one may adopt the extended ACF (EACF) developed by Tsay and Tiao (1984), "Consistent Estimates of Autoregressive Parameters and Extended Sample Autocorrelation Function for Stationary and Nonstationary ARMA Models", to determine the orders.
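A rough sketch of these four steps in R (illustrative only: the candidate orders, the simulated series X1 from the numerical illustration above, and AIC as the selection criterion are all assumptions, not prescriptions):
## Step 1: check stationarity by inspecting the plot and ACF; difference if needed, e.g. diff(X1)
## Step 2: estimate a few candidate specifications
fit_ar1    <- arima(X1, order = c(1, 0, 0))
fit_ma1    <- arima(X1, order = c(0, 0, 1))
fit_arma11 <- arima(X1, order = c(1, 0, 1))
sapply(list(AR1 = fit_ar1, MA1 = fit_ma1, ARMA11 = fit_arma11), AIC)  # smaller is better
## Step 3: white-noise residual check on the preferred specification
Box.test(residuals(fit_ma1), lag = 10, type = "Ljung-Box", fitdf = 1)
## Step 4: forecast over the relevant horizon
predict(fit_ma1, n.ahead = 5)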