
Department of Mathematics                         Exam Time Series

Vrije Universiteit                                June 3, 2009

When it says “derive” or “prove”, give a short but complete argument. When it says
“show” you may refer to theorems. The seven problems have equal weight.

1. Consider the equation Xt = Xt−1 − 3Xt−2 + 3Xt−3 + Zt for a given white noise process (Zt)
with mean zero.
a. Show that the equation has no stationary solution (Xt ).
b. Show that there exists a solution (Xt ) such that X0 = 0 and Xt − Xt−1 is stationary.
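As an illustrative aside (an editorial addition, not part of the exam), the roots of the autoregressive polynomial φ(z) = 1 − z + 3z² − 3z³ of this equation can be located numerically:

```python
import numpy as np

# phi(z) = 1 - z + 3z^2 - 3z^3 comes from rewriting the equation as
# X_t - X_{t-1} + 3 X_{t-2} - 3 X_{t-3} = Z_t.
# np.roots expects coefficients from the highest degree down.
roots = np.roots([-3.0, 3.0, -1.0, 1.0])
print(sorted(np.abs(roots)))  # one root has modulus exactly 1 (z = 1)
```

The root at z = 1 lies on the unit circle; the two remaining roots have modulus 1/√3.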

2. Let (Xt) be a stationary time series with Σ_{h=−∞}^{∞} |γX(h)| < ∞ and EXt = 0. Let λ ∈ (0, π) be
fixed, and let (λn) be a sequence of natural frequencies with λn → λ. Define, for given k,

   fˆn,k(λ) = (1/(2k + 1)) Σ_{j=−k}^{k} In(λn + 2πj/n),

where In is the periodogram of the series (Xt).


a. Give the definition of the periodogram In of Xt .
b. Calculate limn→∞ Efˆn,k (λ).
c. Formulate a theorem concerning the limit in distribution of the sequence fˆn,k (λ), and
derive an asymptotically correct confidence interval for fX (λ) based on fˆn,k (λ).
d. If this confidence interval were exact for every k ∈ N, which value(s) of k would you prefer,
considering the length of the interval? Comment (in at most three lines) on this result.
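For concreteness (an editorial sketch, not part of the exam; the 1/(2πn) normalisation of the periodogram is one common convention and is assumed here), the periodogram at the natural frequencies and the smoothed estimate fˆn,k can be computed with an FFT:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 256, 4
x = rng.standard_normal(n)  # a toy stationary series (white noise)

# periodogram I_n(lambda_j) = |sum_t X_t e^{-i lambda_j t}|^2 / (2 pi n)
# at the natural (Fourier) frequencies lambda_j = 2 pi j / n
I = np.abs(np.fft.fft(x)) ** 2 / (2 * np.pi * n)

# smoothed estimate: average I_n over the 2k+1 frequencies nearest lambda_j
j = n // 8                    # lambda_j = pi / 4, say
fhat = I[(j + np.arange(-k, k + 1)) % n].mean()
print(fhat)  # fluctuates around the true density 1/(2 pi) of white noise
```

A useful sanity check is the Parseval-type identity: the periodogram values over all natural frequencies sum to Σt X²t/(2π).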

3. Consider a sequence (Zt) of i.i.d. random variables with mean zero and variance 1, and numbers
α > 0 and θ ≥ 0. Suppose that θ < e^{−µ} for µ = E log Z²t, and let

   Xt = Zt √( α + α Σ_{j=1}^{∞} θ^j Z²_{t−1} Z²_{t−2} ⋯ Z²_{t−j} ).

a. Prove that the series Σ_{j=1}^{∞} θ^j Z²_{t−1} Z²_{t−2} ⋯ Z²_{t−j} converges almost surely, so that the vari-
ables Xt are well defined.
b. Prove that this series converges in mean if and only if θ < 1.
c. Prove that the time series (Xt ) is an ARCH(1) process relative to its own filtration.
d. Describe, in at most three lines, the meaning of the assertion that the process (Xt)
exhibits volatility clustering. Does the volatility clustering increase or decrease if θ is
made bigger?
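A simulation sketch (an editorial addition; the parameter values are arbitrary) illustrating the almost-sure convergence in part a: for θ below the critical value e^{−µ}, the partial sums of the series stabilise after only a few terms.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, theta = 1.0, 0.3  # arbitrary; theta < e^{-mu} holds, since for
                         # standard normal Z, mu = E log Z^2 ~ -1.27
Z2 = rng.standard_normal(400) ** 2              # Z_{t-1}^2, Z_{t-2}^2, ...
terms = theta ** np.arange(1, 401) * np.cumprod(Z2)
S = np.cumsum(terms)                            # partial sums of the series
X_t = np.sqrt(alpha + alpha * S[-1]) * rng.standard_normal()
print(S[99], S[399])  # the partial sums have long since stabilised
```

The j-th term decays geometrically in j (its logarithm drifts down at rate log θ + µ < 0), which is the mechanism behind the almost-sure convergence.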

4.
a. Formulate the projection theorem for minimizing the distance to a subspace of a Hilbert
space.
b. Explain how to use this theorem to find a best linear predictor of future values of a station-
ary time series (Xt ) using observed values X1 , . . . , Xn . Derive the prediction equations
for predicting Xn+10 .
c. Let (Xt ) be a stationary, causal AR(p) series. Derive the best linear predictor of Xn+1
given X1 , . . . , Xn , for n > p.
5. Let (Xt) be a stationary time series with spectral density fX, and let Yt = Σ_j ψj Xt−j for a
sequence (ψj) with Σ_j |ψj| < ∞.
a. Derive a formula for the spectral density of (Yt) in terms of the transfer function ψ(λ) =
Σ_j ψj e^{−ijλ}.
b. Derive the transfer function ψn for the filter Yt = n^{−1} Σ_{j=1}^{n} Xt−j.

c. Show that ψn(λ) → 0 for every λ ∈ (−π, π) \ {0} as n → ∞, and that |ψn(λ)| ≤ 1 for every λ.
d. Express var Yn+1 in terms of the spectral density of (Xt).
e. Use the three preceding parts to prove the weak law of large numbers: var X̄n → 0 as
n → ∞.
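The behaviour claimed in parts b and c is easy to see numerically (an editorial sketch, not part of the exam):

```python
import numpy as np

def psi_n(lam, n):
    # transfer function of the averaging filter Y_t = n^{-1} sum_{j=1}^n X_{t-j}
    return np.exp(-1j * np.arange(1, n + 1) * lam).sum() / n

lams = np.linspace(-np.pi, np.pi, 201)
print(max(abs(psi_n(l, 50)) for l in lams))        # never exceeds 1
print(abs(psi_n(1.0, 10)), abs(psi_n(1.0, 1000)))  # decays as n grows
```

At λ = 0 the transfer function equals 1 for every n, while away from 0 the geometric sum is bounded, so dividing by n sends it to 0.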

6. Consider the strictly stationary time series (Xt) satisfying Xt = σtZt for (Zt) an i.i.d. sequence
of standard normal variables, and (σt) a time series satisfying σ²t = 1 + φσ²t−1, for a number φ ∈
(0, 1). The filtrations generated by (Xt) and (Zt) are equal, and σ²t = E(X²t | Xt−1, Xt−2, . . .).

a. Prove that the conditional likelihood (“pseudo likelihood”) of (X1, . . . , Xn) given
X0, X−1, X−2, . . . takes the form

   Π_{t=1}^{n} (1/σt) ψ(Xt/σt),

for ψ the standard normal density.


b. Compute the partial derivative Ṁn(φ) of the log conditional likelihood with respect to φ, and
prove that this time series (in n) is a martingale relative to the filtration generated by
(Xt).
c. Comment, in at most five lines, on the significance of the result of b).
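A small sketch (an editorial addition; the initial value σ²₀ = 1/(1 − φ), the fixed point of the recursion, is an assumption made so that the path is well defined) evaluating the log pseudo likelihood of part a:

```python
import numpy as np

rng = np.random.default_rng(1)
n, phi = 200, 0.5

def sigma2_path(phi, n, s2_init):
    # sigma_t^2 = 1 + phi * sigma_{t-1}^2
    out, prev = np.empty(n), s2_init
    for t in range(n):
        prev = 1.0 + phi * prev
        out[t] = prev
    return out

s2 = sigma2_path(phi, n, 1.0 / (1.0 - phi))  # start at the fixed point
X = np.sqrt(s2) * rng.standard_normal(n)

# log of prod_t (1/sigma_t) psi(X_t / sigma_t), psi the standard normal density
loglik = np.sum(-0.5 * np.log(2 * np.pi * s2) - X ** 2 / (2 * s2))
print(loglik)
```

Started at the fixed point, the recursion keeps σ²t constant at 1/(1 − φ), so the pseudo likelihood reduces to an i.i.d. Gaussian likelihood, a convenient check on the formula.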

7.
a. Give the definition of an m-dependent time series.
b. Give the definition of an α-mixing time series.
c. Prove that an m-dependent time series is α-mixing.
d. State and prove a central limit theorem for m-dependent time series.
