Topic 5
V. Forecasting
V.1 The Box-Jenkins approach
Having fitted an ARMA model to {x_1, x_2, ..., x_n}, we now wish to forecast future values of the process. Write x̂_n(k) for the forecast of X_{n+k} made at time n.

[Figure: the observed values x_1, x_2, ..., x_n plotted against time, with the unknown future value at time n+k marked "?"]
In the Box-Jenkins approach, x̂_n(k) is taken as E(X_{n+k} | X_1, ..., X_n), i.e. x̂_n(k) is the conditional expectation of the future value of the process, given the information currently available.
From result 2 in ST3053 (section A), we know that E(X_{n+k} | X_1, ..., X_n) minimises the mean square error E(X_{n+k} − h(X_1, ..., X_n))² over all functions h(X_1, ..., X_n).
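This minimisation property can be checked numerically. The sketch below (a Monte Carlo illustration, not part of the original notes) simulates an AR(1) process with parameter a = 0.7, for which E(X_{n+1} | X_n) = aX_n, and compares its mean square error against a naive "no-change" predictor h(X_n) = X_n; the model and all parameter values are illustrative assumptions.

```python
import random

# Illustrative check: for AR(1), X_t = a*X_{t-1} + e_t with Var(e_t) = 1,
# the conditional expectation E(X_{n+1} | X_n) = a*X_n should have a
# smaller mean square error than the "no-change" predictor h(X_n) = X_n.
random.seed(42)
a = 0.7
n_reps = 20000

mse_cond = 0.0   # predictor a*x_n (the conditional expectation)
mse_naive = 0.0  # predictor x_n

# burn in so x is approximately a draw from the stationary distribution
x = 0.0
for _ in range(200):
    x = a * x + random.gauss(0.0, 1.0)

for _ in range(n_reps):
    x_next = a * x + random.gauss(0.0, 1.0)
    mse_cond += (x_next - a * x) ** 2
    mse_naive += (x_next - x) ** 2
    x = x_next

mse_cond /= n_reps    # should be close to Var(e_t) = 1
mse_naive /= n_reps   # should be larger
```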
To compute x̂_n(k) from the fitted model equation for X_{n+k}:
• Replace the random variables X_1, ..., X_n by their observed values x_1, ..., x_n.
• Replace the random variables X_{n+1}, ..., X_{n+k−1} by their forecast values x̂_n(1), ..., x̂_n(k−1).
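The substitution steps above can be sketched in code. The following is a minimal illustration assuming a fitted zero-mean AR(2) model X_t = a_1 X_{t−1} + a_2 X_{t−2} + e_t; the coefficients and data are invented for the example, and future noise terms are replaced by their expectation, zero.

```python
# Minimal sketch of the recursive Box-Jenkins forecast for an assumed
# zero-mean AR(2): X_t = a1*X_{t-1} + a2*X_{t-2} + e_t.  Observed values
# stay as data; earlier forecasts stand in for unobserved future values.

def forecast(x, a1, a2, k):
    """Return [x_hat_n(1), ..., x_hat_n(k)] given observations x (oldest first)."""
    history = list(x)            # observed values x_1, ..., x_n
    forecasts = []
    for _ in range(k):
        # the last two entries are either observations or earlier forecasts
        x_hat = a1 * history[-1] + a2 * history[-2]
        forecasts.append(x_hat)
        history.append(x_hat)
    return forecasts

# invented data, purely for illustration
print(forecast([0.5, 1.2, 0.8, 1.0], a1=0.6, a2=0.2, k=3))
```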
The k-step forecast error is x_{n+k} − x̂_n(k). Its distribution is needed for confidence interval forecasts, which are more useful than a point estimate alone.
For stationary processes, it may be shown that x̂_n(k) → µ as k → ∞. Hence, the variance of the forecast error tends to E(X_{n+k} − µ)² = σ² as k → ∞, where σ² is the variance of the process.
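As a concrete illustration of this limit (the AR(1) case is our assumption; the claim in the text is general), the k-step forecast error variance of a stationary AR(1) has a closed form, and the sketch below checks that it approaches the process variance σ² = σ_e²/(1 − a²) as k grows.

```python
# For a stationary AR(1), X_t = a*X_{t-1} + e_t with Var(e_t) = sig2_e,
# the k-step forecast error is sum_{j=0}^{k-1} a^j e_{n+k-j}, so its
# variance is sig2_e*(1 - a^(2k))/(1 - a^2), which tends to the process
# variance sig2_e/(1 - a^2) as k -> infinity.

def forecast_error_var(a, sig2_e, k):
    return sig2_e * (1 - a ** (2 * k)) / (1 - a ** 2)

a, sig2_e = 0.8, 1.0                   # illustrative parameter values
process_var = sig2_e / (1 - a ** 2)    # sigma^2, the process variance
for k in (1, 5, 20):
    print(k, forecast_error_var(a, sig2_e, k), process_var)
```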
For example, if the second difference z_t = ∇²x_t follows a stationary AR(1) with mean µ and parameter α, then ẑ_n(2) = µ̂ + α̂(ẑ_n(1) − µ̂), and the two-step forecast is

x̂_n(2) = 2x̂_n(1) − x_n + ẑ_n(2) = 2x̂_n(1) − x_n + µ̂ + α̂(ẑ_n(1) − µ̂)
• The Box-Jenkins method requires a skilled operator in order to obtain reliable results.
• For cases where only a simple forecast is needed, exponential smoothing is much simpler (Holt, 1958).
A weighted combination of past values is used to predict future observations. For example, the one-step-ahead forecast is obtained by

x̂_n(1) = α( x_n + (1−α) x_{n−1} + (1−α)² x_{n−2} + ... )
or
x̂_n(1) = α Σ_{i=0}^{∞} (1−α)^i x_{n−i} = [α / (1 − (1−α)B)] x_n
• The sum of the weights is α Σ_{i=0}^{∞} (1−α)^i = α / (1 − (1−α)) = 1.
• Generally we use a value of α such that 0 < α < 1, so that there is less emphasis on historic values further back in time (usually 0.2 ≤ α ≤ 0.3).
• There is only one parameter, α, to control, usually estimated via least squares.
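The weighted sum above can be computed with the equivalent recursive update s_n = αx_n + (1−α)s_{n−1} (the two agree up to initialisation). A minimal sketch, with invented data and α = 0.25 from the usual 0.2–0.3 range:

```python
# Minimal sketch of simple exponential smoothing.  The recursion
# s_n = alpha*x_n + (1-alpha)*s_{n-1} unrolls to the weighted sum
# alpha * sum_i (1-alpha)^i * x_{n-i}, apart from the start-up value.

def exp_smooth_forecast(x, alpha):
    """One-step-ahead forecast x_hat_n(1) from observations x (oldest first)."""
    s = x[0]                                # initialise with the first observation
    for obs in x[1:]:
        s = alpha * obs + (1 - alpha) * s   # recursive update
    return s

data = [10.0, 12.0, 11.0, 13.0, 12.5]       # invented data
print(exp_smooth_forecast(data, alpha=0.25))
```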
[Figure: observations X_{n−1}, X_n, X_{n+1} marked on a time axis at n−1, n, n+1]
A linear filter is a transformation of a time series {x_t} (the input series) to create an output series {y_t} which satisfies:

y_t = Σ_{k=−∞}^{∞} a_k x_{t−k}.
The purpose of filtering is to modify the input series to meet particular objectives, or to display specific features of the data. For example, an important problem in the analysis of economic time series is the detection, isolation and removal of deterministic trends.
In practice, a filter {a_k : k ∈ ℤ} normally contains only a relatively small number of non-zero components.
Example: regular differencing. This is used to remove a linear trend. Here a_0 = 1, a_1 = −1, a_k = 0 otherwise. Hence y_t = x_t − x_{t−1}.
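A sketch of a general causal linear filter, stored as a map from lag k to coefficient a_k, with the differencing example above used as a check. The function name and data are our own, not the text's.

```python
# Apply a causal linear filter y_t = sum_k a_k * x_{t-k}, where the
# filter is given as a dict {k: a_k} of its non-zero components with
# all lags k >= 0.  y_t is defined only where every x_{t-k} exists.

def linear_filter(x, a):
    kmax = max(a)                 # earliest lag needed
    return [sum(ak * x[t - k] for k, ak in a.items())
            for t in range(kmax, len(x))]

diff = {0: 1.0, 1: -1.0}          # regular differencing: y_t = x_t - x_{t-1}
print(linear_filter([3.0, 5.0, 4.0, 7.0], diff))
```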
Example: if the input series is white noise and the filter takes the form {β_0 = 1, β_1, ..., β_q}, then the output series is MA(q), since

y_t = Σ_{k=0}^{q} β_k e_{t−k}.
If the input series, x, is AR(p), and the filter takes the form {α_0 = 1, −α_1, ..., −α_p}, then the output series is white noise:

y_t = x_t − Σ_{k=1}^{p} α_k x_{t−k} = e_t.
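This last claim can be checked numerically. The sketch below (with illustrative AR(2) parameters of our choosing) simulates a series from known noise and verifies that applying the filter {1, −α_1, −α_2} recovers that noise:

```python
import random

# Simulate an AR(2) series x_t = a1*x_{t-1} + a2*x_{t-2} + e_t from
# known noise e_t, then apply the filter {1, -a1, -a2} and confirm the
# output equals the noise (up to floating-point rounding).
random.seed(0)
a1, a2 = 0.5, -0.3                   # illustrative parameters
e = [random.gauss(0.0, 1.0) for _ in range(100)]

x = [0.0, 0.0]                       # two start-up values
for t in range(2, 100):
    x.append(a1 * x[-1] + a2 * x[-2] + e[t])

# y_t = x_t - a1*x_{t-1} - a2*x_{t-2} should equal e_t for t >= 2
y = [x[t] - a1 * x[t - 1] - a2 * x[t - 2] for t in range(2, 100)]
print(max(abs(y[i] - e[i + 2]) for i in range(len(y))))
```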