Time Series ARIMA Models
Ani Katchova
Time Series ARIMA Models
• Modeling relationships using data collected over time – prices, quantities, GDP, etc.
• Forecasting – predicting economic growth.
• A time series can be decomposed into trend, seasonal, cyclical, and irregular components.
White noise
• White noise describes the assumption that each element in a series is a random draw from a
population with zero mean and constant variance.
[Figure: simulated white noise series (white_noise) plotted against time t]
• Autoregressive (AR) and moving average (MA) models correct for violations of this white
noise assumption.
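A minimal Python sketch of simulating and checking white noise (assuming numpy is installed; the seed and sample size are arbitrary choices):

    import numpy as np

    rng = np.random.default_rng(42)           # fixed seed for reproducibility
    e = rng.normal(loc=0, scale=1, size=50)   # 50 independent draws: zero mean, constant variance
    print(e.mean(), e.var())                  # sample mean near 0, sample variance near 1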
Autoregressive (AR) models
• Autoregressive (AR) models are models in which the value of a variable in one period is
related to its values in previous periods.
• AR(p) is an autoregressive model with p lags: y_t = μ + Σ_{i=1}^p γ_i y_{t−i} + ε_t,
where μ is a constant and γ_p is the coefficient on the variable lagged p periods (time t−p).
• AR(1) is expressed as: y_t = μ + γ y_{t−1} + ε_t = μ + γ(L y_t) + ε_t, or (1 − γL) y_t = μ + ε_t,
where L is the lag operator.
[Figure: simulated AR(1) series ar_1a (coefficient 0.8) and ar_1b (coefficient −0.8) plotted against time t]
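A minimal Python sketch of simulating AR(1) series like those plotted above (numpy assumed; the function name and seed are illustrative):

    import numpy as np

    def simulate_ar1(gamma, n=50, mu=0.0, seed=0):
        """Simulate y_t = mu + gamma * y_{t-1} + e_t with standard normal errors."""
        rng = np.random.default_rng(seed)
        y = np.zeros(n)
        for t in range(1, n):
            y[t] = mu + gamma * y[t - 1] + rng.normal()
        return y

    ar_1a = simulate_ar1(0.8)    # positive coefficient: smooth, persistent series
    ar_1b = simulate_ar1(-0.8)   # negative coefficient: oscillating series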
Moving average (MA) models
• Moving average (MA) models account for the possibility of a relationship between a variable
and the residuals from previous periods.
• MA(q) is a moving average model with q lags: y_t = μ + ε_t + Σ_{i=1}^q θ_i ε_{t−i},
where θ_q is the coefficient on the error term lagged q periods (time t−q).
[Figure: simulated MA(1) series ma_1a (coefficient 0.7) and ma_1b (coefficient −0.7) plotted against time t]
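A matching sketch for MA(1) series (numpy assumed; names are illustrative):

    import numpy as np

    def simulate_ma1(theta, n=50, mu=0.0, seed=0):
        """Simulate y_t = mu + e_t + theta * e_{t-1} with standard normal errors."""
        rng = np.random.default_rng(seed)
        e = rng.normal(size=n + 1)          # one extra draw supplies the initial lagged error
        return mu + e[1:] + theta * e[:-1]

    ma_1a = simulate_ma1(0.7)
    ma_1b = simulate_ma1(-0.7)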
Autoregressive moving average (ARMA) models
• Autoregressive moving average (ARMA) models combine p autoregressive terms and q moving
average terms, and are denoted ARMA(p,q).
• ARMA(p,q) is expressed as: y_t = μ + Σ_{i=1}^p γ_i y_{t−i} + ε_t + Σ_{i=1}^q θ_i ε_{t−i}
[Figure: simulated ARMA(1,1) series arma_11a (γ = 0.8, θ = 0.7) and arma_11b (γ = −0.8, θ = −0.7) plotted against time t]
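ARMA series can also be simulated with statsmodels (assumed installed); note the sign convention: the AR polynomial is specified as (1 − γL), so γ enters with a flipped sign:

    import numpy as np
    from statsmodels.tsa.arima_process import ArmaProcess

    # Lag polynomials: AR side (1 - 0.8L), MA side (1 + 0.7L)
    arma_11a = ArmaProcess(ar=[1, -0.8], ma=[1, 0.7]).generate_sample(nsample=50)
    arma_11b = ArmaProcess(ar=[1, 0.8], ma=[1, -0.7]).generate_sample(nsample=50)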
Stationarity
• A time series is stationary if its mean, variance, and autocovariances do not change over time.
Detrending
• A variable can be detrended by regressing the variable on a time trend and obtaining the
residuals.
y_t = μ + βt + ε_t
[Figure: a trending series and the residuals from the detrending regression]
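A minimal detrending sketch in Python (numpy assumed; the simulated series and its parameters are placeholders):

    import numpy as np

    rng = np.random.default_rng(1)
    t = np.arange(100.0)
    y = 100 + 0.5 * t + rng.normal(scale=5, size=100)   # series with a linear trend

    X = np.column_stack([np.ones_like(t), t])           # regressors: constant and time trend
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)        # OLS estimates of mu and beta
    detrended = y - X @ coef                            # residuals = detrended series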
Differencing
• When a variable y_t is not stationary, a common solution is to use the differenced variable:
Δy_t = y_t − y_{t−1} for first-order differences.
• The variable y_t is integrated of order one, denoted I(1), if taking a first difference produces a
stationary process.
• ARIMA(p,d,q) denotes an ARMA model with p autoregressive lags, q moving average lags,
and differencing of order d.
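A differencing sketch (numpy assumed; the random walk is a placeholder I(1) series):

    import numpy as np

    rng = np.random.default_rng(2)
    y = np.cumsum(rng.normal(size=100))   # random walk: I(1), not stationary
    dy = np.diff(y)                       # first differences Δy_t = y_t − y_{t−1}, stationary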
Seasonality
• Seasonality is a particular type of autocorrelation pattern in which the series repeats every
“season” – monthly, quarterly, etc.
• For example, quarterly data may have the same pattern in the same quarter from one year to the
next.
• Seasonality must also be corrected before a time series model can be fitted.
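One common correction is seasonal differencing; a minimal pandas sketch (pandas assumed; the quarterly data are placeholders):

    import pandas as pd

    y = pd.Series(range(20))     # placeholder quarterly series
    seasonal_diff = y.diff(4)    # y_t − y_{t−4}: each quarter compared with the same quarter a year earlier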
Dickey-Fuller Test for Stationarity
Dickey-Fuller test
• Assume an AR(1) model. The model is non-stationary, or a unit root is present, if |ρ| = 1.
y_t = ρ y_{t−1} + e_t
y_t − y_{t−1} = ρ y_{t−1} − y_{t−1} + e_t
Δy_t = (ρ − 1) y_{t−1} + e_t = γ y_{t−1} + e_t
• We can estimate the above model and test for the significance of the γ coefficient.
o The null hypothesis γ = 0 (non-significant coefficient γ) means that y_t is not stationary.
Difference the variable and repeat the Dickey-Fuller test to see if the differenced variable
is stationary.
o The alternative hypothesis γ < 0 (significant coefficient γ) means that y_t is stationary.
Use the variable.
o Note that significance of γ means stationarity.
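The test regression can be run directly; a minimal sketch (statsmodels assumed; note that the t-statistic on γ must be compared with Dickey-Fuller critical values, not standard normal ones):

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(3)
    y = np.cumsum(rng.normal(size=200))    # random walk, so gamma should be near 0

    dy = np.diff(y)                        # Δy_t
    y_lag = y[:-1]                         # y_{t−1}
    res = sm.OLS(dy, y_lag).fit()          # Δy_t = γ y_{t−1} + e_t (no drift)
    print(res.params[0], res.tvalues[0])   # γ-hat and its t-statistic (use DF critical values)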
Augmented Dickey-Fuller test
• In addition to the model above, a drift μ, a time trend βt, and additional lags of the dependent
variable can be added:
Δy_t = μ + βt + γ* y_{t−1} + Σ_{i=1}^{p−1} φ_i Δy_{t−i} + e_t
• The augmented Dickey-Fuller test evaluates the null hypothesis that γ* = 0. The model will
be non-stationary if γ* = 0.
• Test the joint hypothesis that β = 0 and γ* = 0. Again, the model will be non-stationary, or will
have a unit root present, if γ* = 0.
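The augmented test with drift and trend is available in statsmodels (assumed installed):

    import numpy as np
    from statsmodels.tsa.stattools import adfuller

    rng = np.random.default_rng(4)
    y = np.cumsum(rng.normal(size=200))    # random walk: the null should not be rejected

    adf_stat, pvalue, usedlag, nobs, crit, icbest = adfuller(y, regression="ct")  # constant + trend
    print(adf_stat, pvalue)                # large p-value => non-stationary; difference y and retest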
Autocorrelation Function (ACF) and Partial Autocorrelation Function (PACF)
• ACF is the ratio of the autocovariance of y_t and y_{t−k} to the variance of the dependent
variable y_t:
ACF(k) = ρ_k = Cov(y_t, y_{t−k}) / Var(y_t)
• The autocorrelation function ACF(k) gives the gross correlation between y_t and y_{t−k}.
• For an AR(1) model, the ACF is ACF(k) = ρ_k = γ^k. We say that this function tails off.
• PACF is the simple correlation between y_t and y_{t−k} minus the part explained by the
intervening lags:
ρ*_k = Corr[y_t − E*(y_t | y_{t−1}, …, y_{t−k+1}), y_{t−k}]
where E*(y_t | y_{t−1}, …, y_{t−k+1}) is the minimum mean-squared-error predictor of y_t based on
y_{t−1}, …, y_{t−k+1}.
• For an AR(1) model, the PACF is γ for the first lag and then cuts off.
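Sample ACF and PACF values can be computed with statsmodels (assumed installed; the AR(1) data are simulated as an illustration):

    import numpy as np
    from statsmodels.tsa.stattools import acf, pacf

    rng = np.random.default_rng(5)
    y = np.zeros(200)
    for t in range(1, 200):                   # AR(1) with coefficient 0.8
        y[t] = 0.8 * y[t - 1] + rng.normal()

    print(acf(y, nlags=5))    # decays roughly like 0.8**k (tails off)
    print(pacf(y, nlags=5))   # spike at lag 1, near zero afterwards (cuts off)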
ACF and PACF properties
       AR(p)                  MA(q)                  ARMA(p,q)
ACF    Tails off              Cuts off after lag q   Tails off
PACF   Cuts off after lag p   Tails off              Tails off
[Figure: ACF and PACF of AR(1) with coefficient 0.8 (top) and −0.8 (bottom), lags 0–20, with 95% confidence bands]
[Figure: ACF and PACF of MA(1) with coefficient 0.7 (top) and −0.7 (bottom), lags 0–20, with 95% confidence bands]
[Figure: ACF and PACF of ARMA(1,1) with coefficients 0.8 and 0.7 (top) and −0.8 and −0.7 (bottom), lags 0–20, with 95% confidence bands]
ACF of non-stationary series – the ACF decays slowly and remains positive.
[Figure: ACF of a non-stationary series, lags 0–20, with 95% confidence bands]
ACF with seasonal lag (4) – the ACF shows spikes every 4 lags.
[Figure: ACF of a series with seasonality at lag 4, lags 0–30, with 95% confidence bands]
Diagnostics for ARIMA Models
Testing for white noise
Goodness of fit
• The Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC) are two
measures of goodness of fit. They measure the trade-off between model fit and model
complexity.
AIC = −2 ln(L) + 2k
BIC = −2 ln(L) + k ln(N)
where L is the value of the likelihood function evaluated at the parameter estimates, N is the
number of observations, and k is the number of estimated parameters.
• A lower AIC or BIC value indicates a better fit (a more parsimonious model).
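A sketch of comparing candidate specifications by AIC/BIC (statsmodels assumed; the AR(1) data and candidate orders are illustrative):

    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(6)
    y = np.zeros(200)
    for t in range(1, 200):
        y[t] = 0.8 * y[t - 1] + rng.normal()   # AR(1) data

    for p, q in [(1, 0), (2, 0), (1, 1)]:      # candidate ARMA orders
        fit = ARIMA(y, order=(p, 0, q)).fit()
        print(p, q, fit.aic, fit.bic)          # prefer the specification with the lowest AIC/BIC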
The Box-Jenkins Methodology for ARIMA Model Selection
Identification step
• Check stationarity (differencing the series if needed) and examine the ACF and PACF to select
candidate values of p, d, and q, as in the sketch below.
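A plotting sketch for this step (statsmodels and matplotlib assumed; the simulated AR(1) data are placeholders):

    import numpy as np
    import matplotlib.pyplot as plt
    from statsmodels.graphics.tsaplots import plot_acf, plot_pacf

    rng = np.random.default_rng(8)
    y = np.zeros(200)
    for t in range(1, 200):
        y[t] = 0.8 * y[t - 1] + rng.normal()    # stationary AR(1) data

    fig, axes = plt.subplots(1, 2, figsize=(10, 4))
    plot_acf(y, lags=20, ax=axes[0])            # tails off => AR behavior
    plot_pacf(y, lags=20, ax=axes[1])           # cuts off after lag 1 => p = 1
    plt.show()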
Estimation step
• If the model fits well, then the residuals from the model should resemble a white noise process.
o Check for normality by looking at a histogram of the residuals or by using a quantile-
quantile (Q-Q) plot.
o Check for independence by examining the ACF and PACF of the residuals, which should
look like white noise.
o The Ljung-Box (Box-Pierce) statistic tests the magnitude of the residual autocorrelations
as a group (see the sketch after this list).
o Examine goodness of fit using the Akaike Information Criterion (AIC) and Bayesian
Information Criterion (BIC). Use the most parsimonious model with the lowest AIC and/or BIC.
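A residual-diagnostics sketch (statsmodels assumed; the fitted model and lag choice are illustrative):

    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA
    from statsmodels.stats.diagnostic import acorr_ljungbox

    rng = np.random.default_rng(7)
    y = np.zeros(200)
    for t in range(1, 200):
        y[t] = 0.8 * y[t - 1] + rng.normal()    # AR(1) data

    res = ARIMA(y, order=(1, 0, 0)).fit()
    lb = acorr_ljungbox(res.resid, lags=[10])   # joint test on the first 10 residual autocorrelations
    print(lb)                                   # large p-value => residuals resemble white noise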