
Delft University of Technology
Faculty of Electrical Engineering, Mathematics and Computer Science
Mekelweg 4, 2628 CD Delft

Exam financial time-series (WI3411TU)


November 4, 2021, 18:30-20:30
Bringing your own notes, books, etc. is prohibited.
Unless stated otherwise, always add an explanation to your answer (no explanation implies no points).
In exercise 6 you will need to read some time-series data into R. The command for that is
given in the exercise, but has already been copied for you to the file wi3411tu-examfall21.R

1. Let $\{Z_t\}$ be a sequence of independent normally distributed random variables. Assume $E[Z_1] = 0$ and $\mathrm{Var}(Z_1) = \sigma^2$.

   (a) [2 pt]. Let $a$, $b$ and $c$ be constants. Suppose $X_t = a + bZ_t + cZ_{t-4}$, $t = 0, 1, \ldots$. Compute the autocorrelation function of $\{X_t\}$ at lag 4.
   (b) [2 pt]. Suppose $X_t = Z_t + Z_{t-1}^2 - 1$, $t = 0, 1, \ldots$. Compute the mean and autocorrelation function of $\{X_t\}$.

2. [3 pt]. Suppose $\{X_t\}$ is a mean-zero stationary time series. The best linear predictor for $X_{t+1}$, based on $X_t$ and $X_{t-2}$, is of the form $\alpha X_t + \delta X_{t-2}$. Show that $[\alpha, \delta]$ satisfies a matrix-vector equation of the form
   $$A \begin{pmatrix} \alpha \\ \delta \end{pmatrix} = \begin{pmatrix} c_1 \\ c_2 \end{pmatrix},$$
   where $A$ is a $2 \times 2$ matrix and $c_1, c_2 \in \mathbb{R}$. Derive both the matrix $A$ and the numbers $c_1$, $c_2$.

3. [2 pt]. Consider data x to which a time-series model is fitted using the sarima command. Part of the output is

Coefficients:
ar1 ar2 xmean
0.3143 0.2762 -0.3144
s.e. 0.0975 0.0978 0.2366

sigma^2 estimated as 0.9671: log likelihood = -140.4, aic = 288.81

Write down the equation for the fitted model. Coefficients can be rounded to two
decimals.

4. [3 pt]. Suppose $Z_t \sim \mathrm{WN}(0, \sigma^2)$ and $\psi > 0$. Suppose the time series $\{X_t\}$ satisfies the relation
   $$X_t = \psi X_{t-2} + Z_t,$$
   with $\psi$ such that a causal, stationary and invertible solution exists. Derive the best predictor for $X_{t+3}$ based on $\{X_t, X_{t-1}, \ldots, X_0\}$. Carefully justify all steps in your derivation.
5. (a) [1 pt]. Give the model equation(s) for an ARCH(1)-model. Be precise!
   (b) [1 pt]. What is the best one-step-ahead linear predictor for the ARCH(2)-model?
   (c) [1.5 pt]. State 3 stylised features of financial time-series that are incorporated in the ARCH(2)-model.
6. In this exercise you will analyse time-series data for 'coke' on the NYSE. The data can be read into R by issuing the command
   Y <- as.xts(read.zoo('KO.csv', header = TRUE))
   Logreturns can be obtained with the command
   LR <- diff(log(Y))[-1]

(a) [1 pt]. Is it reasonable to assume the data in Y are stationary?


(b) [2 pt]. Fit a mean-zero MA(1)-model to the logreturns. Write down the fitted
model. Save the residuals of the fitted model in a vector and call this vector z.
(c) [1 pt]. Can we consider the residuals to be white-noise?
(d) [1 pt]. Investigate the time-series z for ARCH-effects. To motivate your answer,
include a rough sketch of each autocorrelation function that you generate in your
investigation (including the first 5 lags of the ACF-plot suffices). Do you find any
ARCH-effects in z?
(e) [2 pt]. Fit a GARCH(1, 1) model to z, assuming that the IID sequence appearing
in the definition of the GARCH(1, 1)-model has (marginally) the standard normal
distribution.
Give the equations of the fitted model (you may neglect the parameter mu in the
list of fitted coefficients).
(f) [1.5 pt]. Check whether the model assumptions of the fitted GARCH model are
fulfilled, using the standardised residuals. How can the model be adjusted to give
an improved fit?

7. (a) [2 pt]. Construct a stationary ARMA time-series model for which the autocorrelation function $\rho(h)$ satisfies $\rho(h) = 0$ for all $h \in \{1, 2, \ldots\} \setminus \{2\}$, i.e. $\rho$ vanishes for all lags $\geq 1$, except for lag 2. Don't forget to verify stationarity.
   (b) [2 pt]. Verify whether the model you propose is invertible and/or causal.

List of R commands

mean average
sd standard deviation
var variance
diff differencing of a vector
acf autocorrelation function
Acf autocorrelation function (lag 0 omitted)
pacf partial autocorrelation function
tsdisplay plot time-series, ACF and PACF
Box.test Box-Pierce or Ljung-Box test
kurtosis excess kurtosis
qqnorm and qqline make normal probability plot
ARMAacf theoretical ACF for an ARMA process
arima.sim simulate from an ARMA model
sarima fit ARMA-model (use "$fit" for the fitted object)
sarima.for forecast, assuming an ARMA-model
garchSpec specify GARCH-model
garchSim simulate from GARCH-model
garchFit fit GARCH-model
predict forecasts for fitted GARCH-model
residuals(gfit) residuals of fitted GARCH-model in gfit
residuals(gfit, standardize=T) standardised residuals of fitted GARCH-model in gfit
volatility(gfit) volatility of fitted GARCH-model in gfit

An example of reading data from finance.yahoo.com (using the quantmod package)


library(quantmod)
# read the data
getSymbols("PHG", src = "yahoo")
# subset
PHG['2007-06::2008-01-12']
# OHLC chart
chartSeries(PHG, subset = "last 2 months", theme = chartTheme("white"))
Solutions

1. (a) Using bilinearity of the covariance:
       $$\mathrm{Cov}(X_t, X_{t-4}) = \mathrm{Cov}(a + bZ_t + cZ_{t-4},\ a + bZ_{t-4} + cZ_{t-8}) = bc\,\mathrm{Cov}(Z_{t-4}, Z_{t-4}) = bc\sigma^2.$$
       Since $\mathrm{Var}(X_t) = (b^2 + c^2)\sigma^2$, the lag-4 autocorrelation is $\rho_X(4) = bc/(b^2 + c^2)$ (checked numerically below).

   (b) Since $E[Z_{t-1}^2] = 1$ (taking $\sigma^2 = 1$), linearity of expectation gives $E[X_t] = 0$. Now
       $$\mathrm{Cov}(X_t, X_{t+1}) = \mathrm{Cov}\left(Z_t + Z_{t-1}^2 - 1,\ Z_{t+1} + Z_t^2 - 1\right) = \mathrm{Cov}(Z_t, Z_t^2) = E[Z_t^3] = 0.$$
       It is easily seen that $\rho_X(h) = 0$ for $h \geq 2$. Hence, $\{X_t\}$ is white noise.
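A quick numerical check of 1(a) (a sketch, not part of the exam): for $b \neq 0$ the process is, apart from the constant $a$, an MA(4) with coefficients $(0, 0, 0, c/b)$, so ARMAacf from the command list reproduces $\rho_X(4) = bc/(b^2 + c^2)$; the values $b = 2$, $c = 0.5$ below are arbitrary.

# check of 1(a): lag-4 autocorrelation of X_t = a + b*Z_t + c*Z_{t-4}
bb <- 2; cc <- 0.5                                    # arbitrary example values (b != 0)
ARMAacf(ma = c(0, 0, 0, cc / bb), lag.max = 4)["4"]   # theoretical rho(4)
bb * cc / (bb^2 + cc^2)                               # formula derived above; should agree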

2. $\alpha$ and $\delta$ follow from minimising
   $$\Psi(\alpha, \delta) := E\left[(X_{t+1} - \alpha X_t - \delta X_{t-2})^2\right].$$
   Setting the partial derivatives to zero gives the equations
   $$E\left[X_t (X_{t+1} - \alpha X_t - \delta X_{t-2})\right] = 0$$
   and
   $$E\left[X_{t-2} (X_{t+1} - \alpha X_t - \delta X_{t-2})\right] = 0.$$
   As $E[X_t] = 0$ and the process is stationary, this implies
   $$\gamma_X(1) - \alpha\gamma_X(0) - \delta\gamma_X(2) = 0,$$
   $$\gamma_X(3) - \alpha\gamma_X(2) - \delta\gamma_X(0) = 0.$$

   Hence
   $$\underbrace{\begin{pmatrix} \gamma_X(0) & \gamma_X(2) \\ \gamma_X(2) & \gamma_X(0) \end{pmatrix}}_{A} \begin{pmatrix} \alpha \\ \delta \end{pmatrix} = \begin{pmatrix} \gamma_X(1) \\ \gamma_X(3) \end{pmatrix},$$
   so $c_1 = \gamma_X(1)$ and $c_2 = \gamma_X(3)$.
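As a numerical illustration (a sketch, not part of the exam), the matrix $A$ and the vector $(c_1, c_2)$ can be estimated from sample autocovariances of a simulated series and the $2 \times 2$ system solved directly; the AR(1) simulation below is just an arbitrary stationary example.

set.seed(1)
x <- arima.sim(model = list(ar = 0.6), n = 1000)                  # arbitrary stationary series
g <- acf(x, type = "covariance", lag.max = 3, plot = FALSE)$acf   # gamma(0), ..., gamma(3)
A <- matrix(c(g[1], g[3], g[3], g[1]), nrow = 2)                  # entries gamma(0) and gamma(2)
b <- c(g[2], g[4])                                                # c1 = gamma(1), c2 = gamma(3)
solve(A, b)                                                       # estimated (alpha, delta)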
3. We get
   $$(X_t + 0.31) = 0.31(X_{t-1} + 0.31) + 0.28(X_{t-2} + 0.31) + Z_t,$$
   where $\{Z_t\} \sim \mathrm{IID}\ N(0, 0.97)$.
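For reference, output of the form shown in the question is produced when an AR(2) with mean term is fitted with the sarima command (package astsa); a sketch on simulated data, where the AR coefficients 0.3, 0.3 and the sample size are arbitrary:

library(astsa)
set.seed(2)
x <- arima.sim(model = list(ar = c(0.3, 0.3)), n = 200)   # hypothetical data
fit <- sarima(x, 2, 0, 0)   # AR(2) fit with mean term (xmean)
fit$fit                     # printing this gives a coefficient table as in the question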

4. $$E[X_{t+3} \mid \mathcal{F}_t] = E[\psi X_{t+1} + Z_{t+3} \mid \mathcal{F}_t] = \psi E[X_{t+1} \mid \mathcal{F}_t] + E[Z_{t+3} \mid \mathcal{F}_t].$$
   The final term equals $E[Z_{t+3}]$ by causality and independence. This in turn is zero. Hence $E[X_{t+3} \mid \mathcal{F}_t] = \psi E[X_{t+1} \mid \mathcal{F}_t]$. Now repeat the argument to get
   $$E[X_{t+3} \mid \mathcal{F}_t] = \psi E[\psi X_{t-1} + Z_{t+1} \mid \mathcal{F}_t] = \psi^2 X_{t-1}.$$
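A simulation sketch of this answer (the values $\psi = 0.5$ and $n = 500$ are arbitrary; $\phi_1$ is fixed at 0 in the fit so that the fitted model matches $X_t = \psi X_{t-2} + Z_t$):

set.seed(42)
psi <- 0.5
x <- arima.sim(model = list(ar = c(0, psi)), n = 500)     # X_t = psi * X_{t-2} + Z_t
n <- length(x)
fit <- arima(x, order = c(2, 0, 0), include.mean = FALSE,
             fixed = c(0, NA), transform.pars = FALSE)    # ar1 fixed at 0, ar2 estimated
predict(fit, n.ahead = 3)$pred[3]                         # 3-step-ahead forecast
coef(fit)["ar2"]^2 * x[n - 1]                             # psi_hat^2 * X_{n-1}: the same value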

5
5. (a) Let $\{Z_t\} \sim \mathrm{IID}(0, 1)$. Then $X_t = \sigma_t Z_t$ and $\sigma_t^2 = \omega + \alpha X_{t-1}^2$ (with $\omega > 0$ and $\alpha \geq 0$). If students include an intercept (so $X_t = \mu + \sigma_t Z_t$), this is also OK.
   (b) 0, as ARCH($p$) is white noise.
   (c) Heavy tails, volatility clustering, nonlinear dependence (illustrated by the simulation sketch below).
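A simulation sketch for (a) and (c) (illustration only; the parameter values $\omega = 10^{-6}$ and $\alpha = 0.3$ are arbitrary), using garchSpec/garchSim from the command list with $\beta = 0$ so that the GARCH specification reduces to ARCH(1):

library(fGarch)
library(fBasics)                                                      # for kurtosis
spec <- garchSpec(model = list(omega = 1e-6, alpha = 0.3, beta = 0))  # beta = 0: ARCH(1)
x <- as.numeric(garchSim(spec, n = 2000))
kurtosis(x)   # positive excess kurtosis: heavy tails
acf(x)        # roughly white noise, consistent with (b)
acf(x^2)      # significant autocorrelation in x^2: volatility clustering / nonlinear dependence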

6. (a) A time-series plot shows a trend and hence clear nonstationarity.
   (b) $LR_t = Z_t + \theta Z_{t-1}$, $\{Z_t\} \sim \mathrm{IID}\ N(0, \sigma^2)$, with $\hat{\theta} \approx -0.087$ and $\hat{\sigma} \approx 0.001396$.
   (c) Either yes (all (almost) within the dashed blue significance bands), or no (significant autocorrelation at lag 2). Motivation of the answer is important.
   (d) Significant autocorrelation for the squared logreturns, hence an ARCH effect. See Figure 1.
   (e) $Z_t = \mu + \sigma_t W_t$, $\{W_t\} \sim \mathrm{IID}(0, 1)$, $\sigma_t^2 = \omega + \alpha Z_{t-1}^2 + \beta \sigma_{t-1}^2$, with
       $$\hat{\mu} \approx 4.13 \cdot 10^{-4}, \quad \hat{\omega} \approx 3.89 \cdot 10^{-6}, \quad \hat{\alpha} \approx 9.17 \cdot 10^{-2}, \quad \hat{\beta} \approx 0.876.$$

   (f) The ACF of the squared standardised residuals looks OK. However, the standardised residuals do not seem to satisfy the assumed normality, so the model assumptions are violated. See Figure 2. Using innovations $\{W_t\}$ with heavier tails may improve the model, for example $t$-distributed random variables $W_t$ (see also the R sketch below).
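The analysis in (b)-(f) can be reproduced along the following lines (a sketch; the exact estimates depend on the KO.csv data handed out with the exam):

library(zoo); library(xts); library(astsa); library(fGarch)
Y  <- as.xts(read.zoo('KO.csv', header = TRUE))
LR <- diff(log(Y))[-1]                            # logreturns; plot(Y) for (a)
ma1 <- sarima(LR, 0, 0, 1, no.constant = TRUE)    # (b) mean-zero MA(1)
z <- residuals(ma1$fit)                           # residuals of the fitted model
acf(z); acf(z^2)                                  # (c), (d): white noise? ARCH effects?
gfit <- garchFit(~ garch(1, 1), data = as.numeric(z),
                 cond.dist = "norm", trace = FALSE)   # (e) GARCH(1,1), normal innovations
w <- residuals(gfit, standardize = TRUE)          # (f) standardised residuals
qqnorm(w); qqline(w); acf(w^2)                    # normality check and remaining ARCH effects
# for (f): refitting with cond.dist = "std" uses t-distributed innovations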

7. (a) A pure AR part will never work since the ACF then decays exponentially, so we need a pure MA-model. If $\{X_t\} \sim \mathrm{MA}(2)$ then all autocorrelations at lags $\geq 3$ are certainly zero. So consider
       $$X_t = Z_t + \theta_1 Z_{t-1} + \theta_2 Z_{t-2}, \qquad \{Z_t\} \sim \mathrm{WN}(0, \sigma^2).$$
       This process is causal and stationary. Then
       $$\gamma(1) = \mathrm{Cov}(Z_t + \theta_1 Z_{t-1} + \theta_2 Z_{t-2},\ Z_{t+1} + \theta_1 Z_t + \theta_2 Z_{t-1}) = (\theta_1 + \theta_1\theta_2)\sigma^2.$$
       So either (i) $\theta_1 = 0$ and $\theta_2 \neq 0$, or (ii) $\theta_2 = -1$ and $\theta_1 \in \mathbb{R}$ (a numerical check of case (i) follows after part (b)).


   (b) Causality is immediate. For invertibility, in case (i) the requirement is that $1 + \theta_2 B^2 = 0$ implies $|B| > 1$; this is equivalent to $|\theta_2| < 1$. In case (ii) the MA polynomial is $1 + \theta_1 B - B^2$, whose roots have product $-1$ in absolute value, so they cannot both lie outside the unit circle (for $\theta_1 = 0$ this reduces to $1 - B^2 = 0$, i.e. $B = \pm 1$); hence invertibility fails.
       Students need to come up with one option, not necessarily both.
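A quick check of case (i) with ARMAacf (a sketch; $\theta_2 = 0.5$ is an arbitrary value with $|\theta_2| < 1$):

theta2 <- 0.5
ARMAacf(ma = c(0, theta2), lag.max = 5)   # rho(2) = theta2 / (1 + theta2^2); all other lags >= 1 vanish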

[Figure: ACF plots of 'Series z' (top) and 'Series z^2' (bottom), lags 0 to 30.]

Figure 1: ACF plots for exercise 6d.

[Figure: normal QQ plot of the standardised residuals (sample vs. theoretical quantiles) and ACF of the squared standardized residuals, lags 0 to 30.]

Figure 2: Diagnostic plots for exercise 6f.
