ARIMA MODEL
(BOX – JENKINS METHODOLOGY)
• White noise:
✓E(ut) = μ (constant mean, usually μ = 0)
✓Var(ut) = σ² (constant variance)
✓Cov(ut, ut+k) = 0 for all k ≠ 0 (zero autocovariance at every non-zero lag)
Is white noise process stationary?
• A random series in EVIEWS behaves very much like the disturbance
term in a regression model.
=> Generate a random series in EVIEWS and observe its properties.
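The slides suggest doing this in EVIEWS; an equivalent sketch in Python with NumPy (a stand-in, not the tool the slides use) generates a white-noise series and checks the three properties:

```python
import numpy as np

# White noise: constant mean, constant variance, zero autocovariance.
rng = np.random.default_rng(0)
sigma = 1.0
u = rng.normal(0.0, sigma, size=10_000)

mean = u.mean()                        # should be close to 0
var = u.var()                          # should be close to sigma^2 = 1
r1 = np.corrcoef(u[:-1], u[1:])[0, 1]  # lag-1 autocorrelation, close to 0

print(round(mean, 2), round(var, 2), round(r1, 2))
```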
4.1.1 STATIONARITY OF TIME SERIES
A time series is stationary if (1) its mean is constant over time, (2) its
variance is constant over time, and (3) the covariance between Yt and Yt+k
depends only on the lag k, not on t.
=> Time series with trends, or with seasonality, are not stationary.
Which series is stationary?
NON-STATIONARY SERIES
• When any of the three conditions is not met, the series is non-stationary.
• Random Walk: Yt = Yt−1 + ut
• Is the random walk stationary?
• The random walk is an autoregressive model of order 1, AR(1).
• The coefficient on Yt−1 is 1 => unit root process.
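A quick simulation shows why the random walk is not stationary: its variance grows with t. (Python/NumPy is used here as a stand-in for EVIEWS.)

```python
import numpy as np

# Each row is one random-walk path Y_t = Y_{t-1} + u_t (cumulative sum of shocks).
rng = np.random.default_rng(1)
u = rng.normal(size=(2000, 400))
Y = u.cumsum(axis=1)

# Var(Y_t) = t * sigma^2 grows with t, violating the constant-variance condition.
var_early = Y[:, 49].var()    # variance across paths at t = 50  (theory: 50)
var_late = Y[:, 399].var()    # variance across paths at t = 400 (theory: 400)
print(round(var_early), round(var_late))
```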
• Random walk: Yt = Yt−1 + ut
• Random walk with drift: Yt = β1 + Yt−1 + ut
• Random walk with drift and trend: Yt = β1 + β2Tt + Yt−1 + ut
INTEGRATED SERIES
• One way to make a non-stationary time series stationary is to
compute the differences between consecutive observations –
differencing.
d(Yt) = ΔYt = Yt − Yt−1
d(Yt, 2) = ΔYt − ΔYt−1 = Yt − 2Yt−1 + Yt−2
• A random walk is not stationary but the 1st difference of a random
walk is stationary. (Prove that!)
• A series which is not stationary but whose 1st difference is stationary is an
integrated series of order 1, denoted I(1).
• A series which is not stationary but whose 2nd difference is stationary is an
integrated series of order 2, denoted I(2).
1st DIFFERENCE OF RANDOM WALK
• Random walk without drift:
Yt = Yt−1 + ut
ΔYt = ut
• Random walk with drift and trend:
Yt = β1 + β2Tt + Yt−1 + ut
ΔYt = β1 + β2Tt + ut
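A minimal sketch confirming that first-differencing a random walk recovers the white-noise shocks:

```python
import numpy as np

# Y_t = Y_{t-1} + u_t  =>  dY_t = Y_t - Y_{t-1} = u_t (white noise again).
rng = np.random.default_rng(2)
u = rng.normal(size=1000)
Y = u.cumsum()        # random walk built from the shocks
dY = np.diff(Y)       # first difference

print(np.allclose(dY, u[1:]))  # differencing recovers the shocks
```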
4.1.2 TEST FOR STATIONARITY
Unit Root Test
(Dickey – Fuller Unit Root Test)
• For series Yt:
Yt = ρYt−1 + ut
ΔYt = Yt − Yt−1 = ρYt−1 + ut − Yt−1 = (ρ − 1)Yt−1 + ut
ΔYt = δYt−1 + ut, where δ = ρ − 1
• If δ = 0 (ρ = 1) => the series is non-stationary
• Testing hypothesis:
H0: δ = 0 => the series is non-stationary
H1: δ < 0 => the series is stationary
=> Small p-value of the test indicates stationarity.
• Sample autocorrelation coefficient at lag k:
rk = cov(Yt, Yt−k) / var(Yt)
   = Σ_{t=k+1}^{n} (Yt − Ȳ)(Yt−k − Ȳ) / Σ_{t=1}^{n} (Yt − Ȳ)²
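The sample-ACF formula can be computed directly; a persistent series (an AR(1)-type process is used here for illustration) shows slowly decaying autocorrelations:

```python
import numpy as np

# r_k = sum_{t=k+1..n} (Y_t - Ybar)(Y_{t-k} - Ybar) / sum_{t=1..n} (Y_t - Ybar)^2
def sample_acf(y, k):
    dev = np.asarray(y, dtype=float) - np.mean(y)
    return (dev[k:] * dev[:-k]).sum() / (dev ** 2).sum()

rng = np.random.default_rng(4)
u = rng.normal(size=2000)
y = np.empty(2000)
y[0] = 0.0
for t in range(1, 2000):
    y[t] = 0.8 * y[t - 1] + u[t]   # persistent series: r_k decays like 0.8^k

print(round(sample_acf(y, 1), 2), round(sample_acf(y, 5), 2))
```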
Q-STATISTICS
• Q-statistics provides an overall test for significant
autocorrelation.
• Null hypothesis: All population autocorrelation coefficients up
to lag m are simultaneously equal to 0.
• H0: ρ1 = ρ2 = ⋯ = ρ𝑚 = 0
Q(m) = n(n + 2) Σ_{k=1}^{m} rk² / (n − k)
• Q follows χ²(m)
• P-value(Q) < α => Reject H0 => There is non-zero
autocorrelation within the first m lags
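The Q-statistic can be computed directly from its formula; for white noise, Q(m) should be an unremarkable draw from χ²(m) (so, well below the rejection region):

```python
import numpy as np

# Q(m) = n(n + 2) * sum_{k=1..m} r_k^2 / (n - k), compared with chi-square(m).
def ljung_box_q(y, m):
    y = np.asarray(y, dtype=float)
    n = len(y)
    dev = y - y.mean()
    denom = (dev ** 2).sum()
    return n * (n + 2) * sum(
        ((dev[k:] * dev[:-k]).sum() / denom) ** 2 / (n - k)
        for k in range(1, m + 1)
    )

rng = np.random.default_rng(5)
q_white = ljung_box_q(rng.normal(size=1000), 10)  # ~ chi2(10) under H0
print(round(q_white, 1))
```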
4.2 AUTOCORRELATION AND
PARTIAL AUTOCORRELATION
• Partial Autocorrelation Coefficient ρkk: the partial correlation
coefficient between the series Yt and its k-th lag Yt−k, after removing the
effect of the intermediate lags Yt−1, …, Yt−k+1.
✓“Partial correlation” between two variables is the amount of
correlation between them which is not explained by their
mutual correlations with a specified set of other variables.
✓The Partial Autocorrelation Function (PACF) is ρkk viewed as a function
of the lag k.
[Figure: a series decomposed into trend, seasonal, and random components]
CORRELOGRAM OF STATIONARY SERIES
CORRELOGRAM OF SERIES WITH TREND
CORRELOGRAM OF SERIES WITH SEASONALITY
CORRELOGRAM OF SERIES WITH TREND AND SEASONALITY
CORRELOGRAM OF RANDOM SERIES
4.3 AR MODEL
• Auto-Regressive Process
• In an autoregression model, we forecast the variable
of interest using a linear combination of past values of
the variable.
• AR(1) Model
Yt = ϕ0 + ϕ1Yt−1 + ut
➢ut satisfies OLS’s basic assumptions.
➢ϕ0 is the constant level of the series.
➢-1 < ϕ1 < 1 => stationary series
➢If |ϕ1 | > 1 => explosive series
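A short simulation of a stationary AR(1); with ϕ0 = 2 and ϕ1 = 0.5 (illustrative values), the theoretical mean is ϕ0/(1 − ϕ1) = 4:

```python
import numpy as np

# Stationary AR(1): Y_t = phi0 + phi1 * Y_{t-1} + u_t, with |phi1| < 1.
rng = np.random.default_rng(6)
phi0, phi1 = 2.0, 0.5
n = 20_000
u = rng.normal(size=n)
y = np.empty(n)
y[0] = phi0 / (1 - phi1)   # start at the theoretical mean
for t in range(1, n):
    y[t] = phi0 + phi1 * y[t - 1] + u[t]

# E(Y_t) = phi0 / (1 - phi1) = 4 for a stationary AR(1)
print(round(y.mean(), 1))
```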
• Auto-Regressive Process of order p
• AR(p): Yt = ϕ0 + ϕ1Yt−1 + ϕ2Yt−2 + ... + ϕpYt−p + ut
Yt = ϕ0 + Σ_{i=1}^{p} ϕiYt−i + ut
4.4 MA MODEL
• MA(1): Yt = μ + ut + θ1ut−1
• μ is the level (mean) of the process
• Yt depends on previous values of the errors (the error term at
time t − 1 for an MA(1) process)
Consider a MA(1) process: 𝑌𝑡 = μ + 𝑢𝑡 + θ1 𝑢𝑡−1
• E(𝑌𝑡 ) = ?
• var(𝑌𝑡 ) = ?
• cov(𝑌𝑡 , 𝑌𝑡−1 ) = E[(𝑢𝑡 + θ1 𝑢𝑡−1 )(𝑢𝑡−1 + θ1 𝑢𝑡−2 )] = θ1 σ2
=> r(𝑌𝑡 , 𝑌𝑡−1 ) ≠ 0 and does not depend on time t
• cov(𝑌𝑡 , 𝑌𝑡−2 ) = 0
=> r(𝑌𝑡 , 𝑌𝑡−2 ) = 0
...
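The MA(1) moments can be checked by simulation (μ = 5, θ1 = 0.6, σ = 1 are illustrative choices, not values from the slides):

```python
import numpy as np

# MA(1): Y_t = mu + u_t + theta1 * u_{t-1}
rng = np.random.default_rng(7)
mu, theta1, sigma = 5.0, 0.6, 1.0
u = rng.normal(0.0, sigma, size=100_000)
y = mu + u[1:] + theta1 * u[:-1]

mean_y = y.mean()                   # theory: mu = 5
var_y = y.var()                     # theory: (1 + theta1^2) * sigma^2 = 1.36
cov1 = np.cov(y[1:], y[:-1])[0, 1]  # theory: theta1 * sigma^2 = 0.6
cov2 = np.cov(y[2:], y[:-2])[0, 1]  # theory: 0 beyond lag 1
print(round(mean_y, 2), round(var_y, 2), round(cov1, 2), round(cov2, 2))
```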
• By repeated substitution, a stationary AR(1) process Yt = ϕ0 + ϕ1Yt−1 + ut
can be written as
Yt = ϕ0(1 + ϕ1 + ... + ϕ1^{q−1}) + ϕ1^q Yt−q + Σ_{i=1}^{q−1} ϕ1^i ut−i + ut
• As q → ∞, ϕ1^q → 0 (since |ϕ1| < 1), so
Yt = ϕ0/(1 − ϕ1) + Σ_{i=1}^{∞} ϕ1^i ut−i + ut = MA(∞)
4.4 MA MODEL
• Any MA(q) process is stationary (∀q)
• If a MA process can be written as an AR process => the
MA process is invertible.
• For example, with θ1 = 0.5:
Yt = ut + 0.5ut−1
Yt = ut + 0.5(Yt−1 − 0.5ut−2)
Yt = ut + 0.5Yt−1 − 0.5²Yt−2 + ... = ut + Σ_{j=1}^{∞} (−1)^{j+1}(0.5)^j Yt−j
=> Yt is an invertible process.
• The invertibility constraints are similar to the stationarity
constraints.
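Invertibility in practice: the shocks of an MA(1) with θ1 = 0.5 can be recovered recursively from the observed series via ut = Yt − 0.5ut−1. A minimal sketch (presample shock assumed zero):

```python
import numpy as np

# MA(1) with theta1 = 0.5: invert via u_t = Y_t - 0.5 * u_{t-1}.
rng = np.random.default_rng(8)
u = rng.normal(size=1000)
y = u.copy()
y[1:] += 0.5 * u[:-1]   # Y_t = u_t + 0.5 u_{t-1}, presample shock taken as 0

u_hat = np.empty_like(y)
u_hat[0] = y[0]
for t in range(1, len(y)):
    u_hat[t] = y[t] - 0.5 * u_hat[t - 1]

print(np.allclose(u_hat, u))  # the shocks are recovered
```

Because |θ1| < 1, any error in the assumed starting shock would die out geometrically, which is exactly what invertibility guarantees.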
4.4 MA MODEL
• How to determine q?
=> Use the autocorrelation plot ACF
• We can select the order q for an MA(q) model from the
ACF if the plot shows a sharp cut-off after lag q.
• If a gradual, geometrically declining ACF or a
sinusoidal ACF is observed, then q = 0.
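A sketch showing the sharp ACF cut-off of an MA(2) process (θ1 = 0.6 and θ2 = 0.3 are illustrative values):

```python
import numpy as np

# MA(2): Y_t = u_t + 0.6 u_{t-1} + 0.3 u_{t-2}; its ACF cuts off after lag 2.
rng = np.random.default_rng(9)
u = rng.normal(size=50_000)
y = u[2:] + 0.6 * u[1:-1] + 0.3 * u[:-2]

def acf(y, k):
    dev = y - y.mean()
    return (dev[k:] * dev[:-k]).sum() / (dev ** 2).sum()

print([round(acf(y, k), 2) for k in range(1, 5)])  # lags 3 and 4 near 0
```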
4.5 ARMA MODEL
• Most time series are mixed processes containing both AR
and MA terms.
• An ARMA (p,q) process:
Yt = C + ϕ1Yt−1 + ϕ2Yt−2 + ... + ϕpYt−p + θ1ut−1 + θ2ut−2 + ... + θqut−q + ut
Yt = C + Σ_{i=1}^{p} ϕiYt−i + Σ_{j=1}^{q} θjut−j + ut
[Figures: example correlograms (ACF/PACF patterns) for AR(p), MA(q), and ARMA(p, q) models]
4.6 ARIMA MODEL
I(d): integrated series of order d (the series is stationary after
taking d-th difference)
Example of ARIMA(p, 1, q):
ΔYt = C + ϕ1ΔYt−1 + ϕ2ΔYt−2 + ... + ϕpΔYt−p + θ1ut−1 + θ2ut−2 + ... + θqut−q + ut
ARIMA MODEL SELECTION CRITERIA
➢Example of a seasonal specification: ARIMA(1,0,2)×(0,0,1)₄, where the
subscript 4 is the seasonal period (quarterly data).