Peter Bartlett
www.stat.berkeley.edu/~bartlett/courses/153fall2010
Last lecture:
1. ACF, sample ACF
2. Properties of the sample ACF
3. Convergence in mean square
Introduction to Time Series Analysis. Lecture 5.
Peter Bartlett
www.stat.berkeley.edu/~bartlett/courses/153fall2010
1. AR(1) as a linear process
2. Causality
3. Invertibility
4. AR(p) models
5. ARMA(p,q) models
AR(1) as a linear process
Let $\{X_t\}$ be the stationary solution to $X_t - \phi X_{t-1} = W_t$, where
$W_t \sim WN(0, \sigma^2)$.

If $|\phi| < 1$,
$$X_t = \sum_{j=0}^{\infty} \phi^j W_{t-j}$$
is the unique solution:
• This infinite sum converges in mean square, since $|\phi| < 1$ implies $\sum_j |\phi|^j < \infty$.
• It satisfies the AR(1) recurrence.
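As a quick numerical sanity check (a sketch, not part of the lecture), the truncated sum $\sum_{j=0}^{J} \phi^j W_{t-j}$ satisfies the AR(1) recurrence up to a truncation error of order $\phi^{J+1}$:

```python
import random

# Sketch: verify numerically that X_t = sum_{j=0}^{J} phi^j W_{t-j}
# satisfies X_t - phi X_{t-1} = W_t, up to the truncation error
# (negligible here, since |phi| < 1 and J is large).

random.seed(0)
phi = 0.7
J = 200
W = [random.gauss(0.0, 1.0) for _ in range(500)]

def X(t):
    """Truncated linear-process representation of the AR(1)."""
    return sum(phi ** j * W[t - j] for j in range(J + 1))

t = 300
residual = X(t) - phi * X(t - 1) - W[t]   # equals -phi**(J+1) * W[t-J-1]
```

The residual is of order $\phi^{J+1} \approx 10^{-31}$, so the recurrence holds to machine precision.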
AR(1) in terms of the backshift operator
We can write
$$X_t - \phi X_{t-1} = W_t
\quad\Leftrightarrow\quad \underbrace{(1 - \phi B)}_{\phi(B)}\, X_t = W_t
\quad\Leftrightarrow\quad \phi(B) X_t = W_t.$$
Recall that $B$ is the backshift operator: $B X_t = X_{t-1}$.
AR(1) in terms of the backshift operator
Also, we can write
$$X_t = \sum_{j=0}^{\infty} \phi^j W_{t-j}
\quad\Leftrightarrow\quad X_t = \underbrace{\sum_{j=0}^{\infty} \phi^j B^j}_{\pi(B)}\, W_t
\quad\Leftrightarrow\quad X_t = \pi(B) W_t.$$
AR(1) in terms of the backshift operator
With these definitions:
$$\pi(B) = \sum_{j=0}^{\infty} \phi^j B^j \qquad\text{and}\qquad \phi(B) = 1 - \phi B,$$
we can check that $\pi(B) = \phi(B)^{-1}$:
$$\pi(B)\phi(B) = \left(\sum_{j=0}^{\infty} \phi^j B^j\right)(1 - \phi B)
= \sum_{j=0}^{\infty} \phi^j B^j - \sum_{j=1}^{\infty} \phi^j B^j = 1.$$
Thus, $\phi(B) X_t = W_t \;\Rightarrow\; \pi(B)\phi(B) X_t = \pi(B) W_t \;\Leftrightarrow\; X_t = \pi(B) W_t$.
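The telescoping cancellation above can be checked numerically with truncated series (a sketch; the parameter values are illustrative):

```python
# Sketch: the identity pi(z) phi(z) = 1 checked with a truncated series:
# (sum_{j=0}^{N-1} phi^j z^j) * (1 - phi z) = 1 - phi^N z^N, which tends
# to 1 as N grows whenever |phi| < 1 and |z| <= 1.

phi, z, N = 0.8, 1.0, 50
partial = sum(phi ** j * z ** j for j in range(N))   # truncated pi(z)
product = partial * (1 - phi * z)
# product equals 1 - phi**N * z**N (up to rounding), already close to 1
```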
AR(1) in terms of the backshift operator
Notice that manipulating operators like $\phi(B)$, $\pi(B)$ is like manipulating
polynomials:
$$\frac{1}{1 - \phi z} = 1 + \phi z + \phi^2 z^2 + \phi^3 z^3 + \cdots,$$
provided $|\phi| < 1$ and $|z| \le 1$.
AR(1) and Causality
Let $X_t$ be the stationary solution to
$$X_t - \phi X_{t-1} = W_t,$$
where $W_t \sim WN(0, \sigma^2)$.

If $|\phi| < 1$,
$$X_t = \sum_{j=0}^{\infty} \phi^j W_{t-j}.$$
$\phi = 1$? $\phi = -1$? $|\phi| > 1$?
AR(1) and Causality
If $|\phi| > 1$, $\pi(B) W_t$ does not converge.
But we can rearrange
$$X_t = \phi X_{t-1} + W_t \qquad\text{as}\qquad
X_{t-1} = \frac{1}{\phi} X_t - \frac{1}{\phi} W_t,$$
and we can check that the unique stationary solution is
$$X_t = -\sum_{j=1}^{\infty} \phi^{-j} W_{t+j}.$$
But... $X_t$ depends on future values of $W_t$.
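That forward-looking solution can also be checked numerically (a sketch with an illustrative $\phi = 2$):

```python
import random

# Sketch: for |phi| > 1, check that the forward-looking truncated sum
# X_t = -sum_{j=1}^{J} phi^{-j} W_{t+j} satisfies X_t = phi X_{t-1} + W_t
# up to a truncation error of order phi^{-J}.

random.seed(1)
phi = 2.0
J = 60
W = [random.gauss(0.0, 1.0) for _ in range(400)]

def X(t):
    """Truncated stationary solution: depends on *future* noise terms."""
    return -sum(phi ** (-j) * W[t + j] for j in range(1, J + 1))

t = 100
residual = X(t) - phi * X(t - 1) - W[t]   # order phi**(-J), tiny here
```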
Causality
A linear process $\{X_t\}$ is causal (strictly, a causal function of $\{W_t\}$)
if there is a
$$\psi(B) = \psi_0 + \psi_1 B + \psi_2 B^2 + \cdots$$
with
$$\sum_{j=0}^{\infty} |\psi_j| < \infty$$
and $X_t = \psi(B) W_t$.
AR(1) and Causality
• Causality is a property of $\{X_t\}$ and $\{W_t\}$.
• Consider the AR(1) process defined by $\phi(B) X_t = W_t$ (with
$\phi(B) = 1 - \phi B$):
  $\phi(B) X_t = W_t$ is causal
  iff $|\phi| < 1$
  iff the root $z_1$ of the polynomial $\phi(z) = 1 - \phi z$ satisfies $|z_1| > 1$.
AR(1) and Causality
• Consider the AR(1) process $\phi(B) X_t = W_t$ (with $\phi(B) = 1 - \phi B$):
  If $|\phi| > 1$, we can define an equivalent causal model,
  $$X_t - \phi^{-1} X_{t-1} = \tilde{W}_t,$$
  where $\tilde{W}_t$ is a new white noise sequence.
AR(1) and Causality
• Is an MA(1) process causal?
MA(1) and Invertibility
Define
$$X_t = W_t + \theta W_{t-1} = (1 + \theta B) W_t.$$
If $|\theta| < 1$, we can write
$$(1 + \theta B)^{-1} X_t = W_t
\;\Leftrightarrow\; (1 - \theta B + \theta^2 B^2 - \theta^3 B^3 + \cdots) X_t = W_t
\;\Leftrightarrow\; \sum_{j=0}^{\infty} (-\theta)^j X_{t-j} = W_t.$$
That is, we can write $W_t$ as a causal function of $X_t$.
We say that this MA(1) is invertible.
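The inversion can be tried out numerically (a sketch; $\theta = 0.6$ and the truncation point $J$ are illustrative choices):

```python
import random

# Sketch: for an invertible MA(1) (|theta| < 1), recover the noise from
# the observations via the truncated inversion
#   W_t ~= sum_{j=0}^{J} (-theta)^j X_{t-j},
# with an error of order theta**(J+1).

random.seed(2)
theta = 0.6
n = 400
W = [random.gauss(0.0, 1.0) for _ in range(n)]
X = [0.0] + [W[t] + theta * W[t - 1] for t in range(1, n)]   # X[t] for t >= 1

J = 100
t = 300
W_hat = sum((-theta) ** j * X[t - j] for j in range(J + 1))
# W_hat approximates W[t] to within theta**(J+1) * |W[t-J-1]|
```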
MA(1) and Invertibility
$$X_t = W_t + \theta W_{t-1}.$$
If $|\theta| > 1$, the sum $\sum_{j=0}^{\infty} (-\theta)^j X_{t-j}$ diverges, but we can write
$$W_{t-1} = -\theta^{-1} W_t + \theta^{-1} X_t.$$
Just like the noncausal AR(1), we can show that
$$W_t = -\sum_{j=1}^{\infty} (-\theta)^{-j} X_{t+j}.$$
That is, we can write $W_t$ as a linear function of $X_t$, but it is not causal.
We say that this MA(1) is not invertible.
Invertibility
A linear process $\{X_t\}$ is invertible (strictly, an invertible function of
$\{W_t\}$) if there is a
$$\pi(B) = \pi_0 + \pi_1 B + \pi_2 B^2 + \cdots$$
with
$$\sum_{j=0}^{\infty} |\pi_j| < \infty$$
and $W_t = \pi(B) X_t$.
MA(1) and Invertibility
• Invertibility is a property of $\{X_t\}$ and $\{W_t\}$.
• Consider the MA(1) process defined by $X_t = \theta(B) W_t$ (with
$\theta(B) = 1 + \theta B$):
  $X_t = \theta(B) W_t$ is invertible
  iff $|\theta| < 1$
  iff the root $z_1$ of the polynomial $\theta(z) = 1 + \theta z$ satisfies $|z_1| > 1$.
MA(1) and Invertibility
• Consider the MA(1) process $X_t = \theta(B) W_t$ (with $\theta(B) = 1 + \theta B$):
  If $|\theta| > 1$, we can define an equivalent invertible model in terms of a
  new white noise sequence.
• Is an AR(1) process invertible?
AR(p): Autoregressive models of order p
An AR(p) process $\{X_t\}$ is a stationary process that satisfies
$$X_t - \phi_1 X_{t-1} - \cdots - \phi_p X_{t-p} = W_t,$$
where $\{W_t\} \sim WN(0, \sigma^2)$.
Equivalently, $\phi(B) X_t = W_t$,
where $\phi(B) = 1 - \phi_1 B - \cdots - \phi_p B^p$.
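The defining recursion makes AR(p) models easy to simulate. A minimal sketch for an AR(2), using the causal coefficients from the example later in the lecture and a burn-in period (an assumption of this sketch, to let the zero initial conditions wash out):

```python
import random

# Sketch: simulate an AR(2) directly from its defining recursion,
#   X_t = phi1 X_{t-1} + phi2 X_{t-2} + W_t,
# discarding a burn-in so the start-up transient (X = 0 initially) dies out.

random.seed(3)
phi1, phi2 = 0.5, -0.6      # phi(z) = 1 - 0.5 z + 0.6 z^2 (causal)
n, burn = 200, 200
x1 = x2 = 0.0               # X_{t-1} and X_{t-2}
series = []
for t in range(n + burn):
    x = phi1 * x1 + phi2 * x2 + random.gauss(0.0, 1.0)
    x2, x1 = x1, x
    if t >= burn:
        series.append(x)    # keep only the post-burn-in values
```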
AR(p): Constraints on φ
Recall: For $p = 1$ (AR(1)), $\phi(B) = 1 - \phi_1 B$.
This is an AR(1) model only if there is a stationary solution to
$\phi(B) X_t = W_t$, which is equivalent to $|\phi_1| \neq 1$.
This is equivalent to the following condition on $\phi(z) = 1 - \phi_1 z$:
$$\forall z \in \mathbb{R},\quad \phi(z) = 0 \;\Rightarrow\; z \neq \pm 1,$$
equivalently,
$$\forall z \in \mathbb{C},\quad \phi(z) = 0 \;\Rightarrow\; |z| \neq 1,$$
where $\mathbb{C}$ is the set of complex numbers.
AR(p): Constraints on φ
Stationarity: $\forall z \in \mathbb{C},\ \phi(z) = 0 \Rightarrow |z| \neq 1$,
where $\mathbb{C}$ is the set of complex numbers.
$\phi(z) = 1 - \phi_1 z$ has one root, at $z_1 = 1/\phi_1 \in \mathbb{R}$.
But the roots of a degree $p > 1$ polynomial might be complex.
For stationarity, we want the roots of $\phi(z)$ to avoid the unit circle,
$\{z \in \mathbb{C} : |z| = 1\}$.
AR(p): Stationarity and causality
Theorem: A (unique) stationary solution to $\phi(B) X_t = W_t$ exists iff
$$\phi(z) = 1 - \phi_1 z - \cdots - \phi_p z^p = 0 \;\Rightarrow\; |z| \neq 1.$$
This AR(p) process is causal iff
$$\phi(z) = 1 - \phi_1 z - \cdots - \phi_p z^p = 0 \;\Rightarrow\; |z| > 1.$$
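For a concrete $p = 2$ case the condition is easy to check by hand or by machine. A sketch for the AR(2) polynomial used later in the lecture, $\phi(z) = 1 - 0.5z + 0.6z^2$, whose roots are complex:

```python
import cmath

# Sketch: check the causality condition for phi(z) = 1 - 0.5 z + 0.6 z^2
# by computing its (complex) roots with the quadratic formula;
# the process is causal iff every root satisfies |z| > 1.

a, b, c = 0.6, -0.5, 1.0                 # phi(z) = c + b z + a z^2
disc = cmath.sqrt(b * b - 4 * a * c)     # negative discriminant: complex roots
roots = [(-b + disc) / (2 * a), (-b - disc) / (2 * a)]
causal = all(abs(z) > 1 for z in roots)
# The roots are complex conjugates with |z|^2 = c/a = 1/0.6 > 1, so causal is True.
```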
Recall: Causality
A linear process $\{X_t\}$ is causal (strictly, a causal function of $\{W_t\}$)
if there is a
$$\psi(B) = \psi_0 + \psi_1 B + \psi_2 B^2 + \cdots$$
with
$$\sum_{j=0}^{\infty} |\psi_j| < \infty$$
and $X_t = \psi(B) W_t$.
AR(p): Roots outside the unit circle implies causal (Details)
$$\forall z \in \mathbb{C},\quad |z| \le 1 \;\Rightarrow\; \phi(z) \neq 0$$
$$\Leftrightarrow\; \exists \{\psi_j\},\ \delta > 0:\ \forall |z| \le 1 + \delta,\quad
\frac{1}{\phi(z)} = \sum_{j=0}^{\infty} \psi_j z^j$$
$$\Rightarrow\; \forall |z| \le 1 + \delta,\quad |\psi_j z^j| \to 0,
\ \text{i.e. } \left(|\psi_j|^{1/j} |z|\right)^j \to 0$$
$$\Rightarrow\; \exists j_0:\ \forall j \ge j_0,\quad
|\psi_j|^{1/j} \le \frac{1}{1 + \delta/2}$$
$$\Rightarrow\; \sum_{j=0}^{\infty} |\psi_j| < \infty.$$
So if $|z| \le 1 \Rightarrow \phi(z) \neq 0$, then
$S_m = \sum_{j=0}^{m} \psi_j B^j W_t$ converges in mean square, so we have a
stationary, causal time series $X_t = \phi^{-1}(B) W_t$.
Calculating ψ for an AR(p): matching coefficients
Example: $X_t = \psi(B) W_t \;\Leftrightarrow\; (1 - 0.5B + 0.6B^2) X_t = W_t$,
so $1 = \psi(B)(1 - 0.5B + 0.6B^2)$
$$\Leftrightarrow\; 1 = (\psi_0 + \psi_1 B + \psi_2 B^2 + \cdots)(1 - 0.5B + 0.6B^2)$$
$$\Leftrightarrow\quad
\begin{aligned}
1 &= \psi_0,\\
0 &= \psi_1 - 0.5\,\psi_0,\\
0 &= \psi_2 - 0.5\,\psi_1 + 0.6\,\psi_0,\\
0 &= \psi_3 - 0.5\,\psi_2 + 0.6\,\psi_1,\\
&\;\;\vdots
\end{aligned}$$
Calculating ψ for an AR(p): example
$$\Leftrightarrow\quad 1 = \psi_0,\quad 0 = \psi_j\ (j < 0),\quad
0 = \psi_j - 0.5\,\psi_{j-1} + 0.6\,\psi_{j-2}$$
$$\Leftrightarrow\quad 1 = \psi_0,\quad 0 = \psi_j\ (j < 0),\quad
0 = \phi(B)\,\psi_j.$$
We can solve these linear difference equations in several ways:
• numerically, or
• by guessing the form of a solution and using an inductive proof, or
• by using the theory of linear difference equations.
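The numerical option is a few lines of code. A sketch (the helper name `ar_psi` is ours, not from the lecture) that iterates the difference equation for the example above:

```python
# Sketch: solve the difference equation numerically for the example
# (1 - 0.5B + 0.6B^2) X_t = W_t:
#   psi_0 = 1, psi_j = 0 for j < 0, psi_j = 0.5 psi_{j-1} - 0.6 psi_{j-2}.

def ar_psi(phi, n):
    """First n psi coefficients for phi(z) = 1 - phi[0] z - ... - phi[p-1] z^p."""
    psi = [1.0]
    for j in range(1, n):
        # psi_j = sum_k phi_k psi_{j-k}; terms with j - k < 0 are zero
        psi.append(sum(phi[k] * psi[j - 1 - k] for k in range(min(len(phi), j))))
    return psi

psi = ar_psi([0.5, -0.6], 6)   # phi_1 = 0.5, phi_2 = -0.6
# psi is approximately [1.0, 0.5, -0.35, -0.475, -0.0275, 0.27125]
```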
Calculating ψ for an AR(p): general case
$$\phi(B) X_t = W_t \;\Leftrightarrow\; X_t = \psi(B) W_t,
\quad\text{so}\quad 1 = \psi(B)\phi(B)$$
$$\Leftrightarrow\; 1 = (\psi_0 + \psi_1 B + \cdots)(1 - \phi_1 B - \cdots - \phi_p B^p)$$
$$\Leftrightarrow\quad
\begin{aligned}
1 &= \psi_0,\\
0 &= \psi_1 - \phi_1 \psi_0,\\
0 &= \psi_2 - \phi_1 \psi_1 - \phi_2 \psi_0,\\
&\;\;\vdots
\end{aligned}$$
$$\Leftrightarrow\quad 1 = \psi_0,\quad 0 = \psi_j\ (j < 0),\quad
0 = \phi(B)\,\psi_j.$$
ARMA(p,q): Autoregressive moving average models
An ARMA(p,q) process $\{X_t\}$ is a stationary process that satisfies
$$X_t - \phi_1 X_{t-1} - \cdots - \phi_p X_{t-p}
= W_t + \theta_1 W_{t-1} + \cdots + \theta_q W_{t-q},$$
where $\{W_t\} \sim WN(0, \sigma^2)$.
• AR(p) = ARMA(p, 0): $\theta(B) = 1$.
• MA(q) = ARMA(0, q): $\phi(B) = 1$.
ARMA processes
Can accurately approximate many stationary processes:
For any stationary process with autocovariance $\gamma$, and any $k > 0$, there
is an ARMA process $\{X_t\}$ for which
$$\gamma_X(h) = \gamma(h), \qquad h = 0, 1, \ldots, k.$$
ARMA(p,q): Autoregressive moving average models
An ARMA(p,q) process $\{X_t\}$ is a stationary process that satisfies
$$X_t - \phi_1 X_{t-1} - \cdots - \phi_p X_{t-p}
= W_t + \theta_1 W_{t-1} + \cdots + \theta_q W_{t-q},$$
where $\{W_t\} \sim WN(0, \sigma^2)$.
Usually, we insist that $\phi_p, \theta_q \neq 0$ and that the polynomials
$$\phi(z) = 1 - \phi_1 z - \cdots - \phi_p z^p, \qquad
\theta(z) = 1 + \theta_1 z + \cdots + \theta_q z^q$$
have no common factors. This implies it is not a lower-order ARMA model.
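To see why common factors matter, here is a small hypothetical example (ours, not from the lecture): an ARMA(1,1) whose AR and MA polynomials share the factor $1 - 0.5z$ collapses to white noise, which matching coefficients in $\psi(B)\phi(B) = \theta(B)$ makes visible.

```python
# Sketch (hypothetical example): X_t - 0.5 X_{t-1} = W_t - 0.5 W_{t-1}
# has phi(z) = theta(z) = 1 - 0.5 z, a common factor. Matching
# coefficients in psi(B) phi(B) = theta(B) gives psi_0 = 1 and
# psi_j = 0 for j >= 1, i.e. X_t = W_t: a lower-order ARMA (white noise).

def arma_psi(phi, theta, n):
    """First n psi coefficients, with phi(z) = 1 - sum_k phi[k-1] z^k
    and theta(z) = 1 + sum_k theta[k-1] z^k."""
    psi = [1.0]
    for j in range(1, n):
        s = theta[j - 1] if j - 1 < len(theta) else 0.0
        s += sum(phi[k] * psi[j - 1 - k] for k in range(min(len(phi), j)))
        psi.append(s)
    return psi

psi = arma_psi(phi=[0.5], theta=[-0.5], n=6)
# psi == [1.0, 0.0, 0.0, 0.0, 0.0, 0.0]
```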