
UNIVERSIDAD DEL PACIFICO

Academic Department of Finance


Quantitative Finance (1F0111)
Second Semester 2017
Instructor: F. Rosales, TA: J. Cardenas

Assessment 3
Instructions: the assessment contains five questions. Questions one, two and three must be answered,
but you can choose between answering question four or question five. Good luck!

1. Causality and Invertibility (3pts)


Determine which of the following ARMA processes are causal and which of them are invertible.
In each case W_t ∼ WN(0, σ²).
(a) X_t + 0.2X_{t-1} − 0.48X_{t-2} = W_t
(b) X_t + 1.9X_{t-1} + 0.88X_{t-2} = W_t + 0.2W_{t-1} + 0.7W_{t-2}
(c) X_t + 1.6X_{t-1} = W_t − 0.4W_{t-1} + 0.04W_{t-2}

Solution. Write the ARMA as φ(B)X_t = θ(B)W_t. The process {X_t} is causal if and only if
φ(z) ≠ 0 for each |z| ≤ 1, and invertible if and only if θ(z) ≠ 0 for each |z| ≤ 1.
(a) φ(z) = 1 + 0.2z − 0.48z² = 0 is solved by z_1 = 5/3 and z_2 = −5/4. Both roots lie
outside the unit circle, hence {X_t} is causal. θ(z) = 1 has no roots, hence {X_t} is invertible.
(b) φ(z) = 1 + 1.9z + 0.88z² = 0 is solved by z_1 = −10/11 and z_2 = −5/4. Since |z_1| < 1,
{X_t} is not causal. θ(z) = 1 + 0.2z + 0.7z² = 0 is solved by z_1 = (−1 − √69 i)/7 and
z_2 = (−1 + √69 i)/7. Since |z_1| = |z_2| = √70/7 > 1, {X_t} is invertible.
(c) φ(z) = 1 + 1.6z = 0 is solved by z = −5/8. Hence {X_t} is not causal. θ(z) = 1 − 0.4z +
0.04z² = 0 is solved by z_1 = z_2 = 5. Hence {X_t} is invertible.
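The root conditions above are easy to check numerically. A minimal sketch in Python with numpy (the helper name min_root_modulus and the model labels are ours, not part of the assessment):

```python
import numpy as np

def min_root_modulus(coeffs):
    """Smallest |z| among the roots of c0 + c1*z + c2*z^2 + ...;
    coefficients are passed lowest order first. Returns inf for a constant."""
    roots = np.roots(coeffs[::-1])  # np.roots expects highest order first
    return np.abs(roots).min() if len(roots) else np.inf

# (phi(z) coefficients, theta(z) coefficients) for each ARMA model
models = {
    "a": ([1, 0.2, -0.48], [1]),
    "b": ([1, 1.9, 0.88],  [1, 0.2, 0.7]),
    "c": ([1, 1.6],        [1, -0.4, 0.04]),
}
for name, (phi, theta) in models.items():
    causal = min_root_modulus(phi) > 1      # all AR roots outside the unit circle
    invertible = min_root_modulus(theta) > 1  # all MA roots outside the unit circle
    print(name, "causal" if causal else "not causal",
          "invertible" if invertible else "not invertible")
```

Running it reproduces the classification above: only (a) is causal, while all three are invertible.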

2. Forecast (7pts)

(a) Consider the sinusoid process


X_t = A cos(ωt) + B sin(ωt),

where ω ∈ (0, π) is constant and A, B are uncorrelated random variables with mean 0 and
variance σ². Show that X_n − X_n^{n−1} = 0 by arguing that X_n can be written as a linear
combination of its first two lags. Hint: remember that

sin(α − β) = sin(α) cos(β) − cos(α) sin(β)
cos(α − β) = cos(α) cos(β) + sin(α) sin(β)

(b) Consider the AR(2) process


X_t = φ1 X_{t-1} + φ2 X_{t-2} + W_t,   W_t ∼ WN(0, σ²).

i. Compute the asymptotic distribution of (φ̂1, φ̂2) for σ² = 1, φ1 = 1.5 and φ2 = −0.75; and
the corresponding confidence intervals. For the latter use α = 0.05 and n = 10³.
ii. What is the consequence of fitting an AR(2) model to an AR(1) process in terms of
the variance of φ̂1?

Solution. (a) Note that X_n can be written as a deterministic function of its lags. Using the
subtraction identities from the hint,

X_{n-1} = A cos((n−1)ω) + B sin((n−1)ω)
        = A {cos(nω) cos(ω) + sin(nω) sin(ω)} + B {sin(nω) cos(ω) − cos(nω) sin(ω)}
        = cos(ω) {A cos(nω) + B sin(nω)} + sin(ω) {A sin(nω) − B cos(nω)}
        = cos(ω) X_n + sin(ω) {A sin(nω) − B cos(nω)}.

Following the same reasoning,

X_{n-2} = cos(2ω) {A cos(nω) + B sin(nω)} + sin(2ω) {A sin(nω) − B cos(nω)}
        = cos(2ω) X_n + sin(2ω) {A sin(nω) − B cos(nω)}.

Note that

2 cos(ω) X_{n-1} = 2 cos²(ω) X_n + sin(2ω) {A sin(nω) − B cos(nω)},

so that subtracting the two previous displays,

2 cos(ω) X_{n-1} − X_{n-2} = {2 cos²(ω) − cos(2ω)} X_n = X_n,

since cos(2ω) = 2 cos²(ω) − 1. Hence the predictor X_n^{n−1} = 2 cos(ω) X_{n-1} − X_{n-2}
reproduces X_n exactly: X_n is perfectly forecastable using information at n − 1, and the
prediction error X_n − X_n^{n−1} is zero.
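The exact two-lag recursion can be verified numerically; a short sketch (the values of ω and the draw of (A, B) are arbitrary choices of ours):

```python
import numpy as np

# Check that X_n = 2*cos(omega)*X_{n-1} - X_{n-2} holds exactly for the sinusoid
rng = np.random.default_rng(1)
omega = 0.8
A, B = rng.normal(size=2)

t = np.arange(50)
X = A * np.cos(omega * t) + B * np.sin(omega * t)

# Rebuild X_2, ..., X_49 from the two preceding values only
recovered = 2 * np.cos(omega) * X[1:-1] - X[:-2]
print(np.allclose(recovered, X[2:]))  # True
```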


(b) For the AR(2) process we have
i. The asymptotic distribution follows

(φ̂1, φ̂2)ᵀ ≈ N( (φ1, φ2)ᵀ, (σ²/n) Γ2⁻¹ )   as n → ∞,

where Γ2 is the autocovariance matrix

Γ2 = ( γ(0)  γ(1) )
     ( γ(1)  γ(0) ),

and γ(h) can be computed from the expectations E[X_t X_{t−i}], i = 0, 1, 2, via

γ(0) − φ1 γ(1) − φ2 γ(2) = σ²
γ(1) − φ1 γ(0) − φ2 γ(1) = 0
γ(2) − φ1 γ(1) − φ2 γ(0) = 0,

with solution:

γ(0) = σ²(1 − φ2) / [(1 + φ2)((1 − φ2)² − φ1²)]
γ(1) = σ² φ1 / [(1 + φ2)((1 − φ2)² − φ1²)]
γ(2) = σ²(φ1² + φ2(1 − φ2)) / [(1 + φ2)((1 − φ2)² − φ1²)].

Inverting Γ2 and simplifying leads to the expression

(σ²/n) Γ2⁻¹ = (1/n) (  1 − φ2²        −φ1(1 + φ2) )  = (1/n) (  0.4375  −0.375  )
                    ( −φ1(1 + φ2)      1 − φ2²    )          ( −0.375   0.4375 )
ii. Suppose {X_t} is an AR(1) process and the sample size n is large. If we fit an AR(1) and
estimate φ1, we have

V[φ̂1] = (1 − φ1²)/n.

If instead we fit an AR(2) to this AR(1) process, the true φ2 = 0, so

V[φ̂1] = (1 − φ2²)/n = 1/n > (1 − φ1²)/n,

that is, the variance in the AR(2) fit for the estimation of φ1 is larger. For the
confidence intervals requested in part i we obtain

φ̂1 ± z_{1−α/2} (1/√n) (1 − φ2²)^{1/2} = φ̂1 ± 1.96 · 0.661438/(10√10) = φ̂1 ± 0.0409963
φ̂2 ± z_{1−α/2} (1/√n) (1 − φ2²)^{1/2} = φ̂2 ± 1.96 · 0.661438/(10√10) = φ̂2 ± 0.0409963,

since both diagonal entries of the asymptotic covariance matrix equal 0.4375/n.
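These numbers can be reproduced by solving the Yule–Walker system numerically; a sketch in Python with numpy (variable names are ours):

```python
import numpy as np

sigma2, phi1, phi2, n = 1.0, 1.5, -0.75, 10**3

# Yule-Walker equations for (gamma0, gamma1, gamma2):
#   gamma0 - phi1*gamma1 - phi2*gamma2 = sigma2
#   gamma1 - phi1*gamma0 - phi2*gamma1 = 0
#   gamma2 - phi1*gamma1 - phi2*gamma0 = 0
A = np.array([[1.0,   -phi1,       -phi2],
              [-phi1,  1.0 - phi2,  0.0],
              [-phi2, -phi1,        1.0]])
g0, g1, g2 = np.linalg.solve(A, [sigma2, 0.0, 0.0])

Gamma2 = np.array([[g0, g1], [g1, g0]])
acov = sigma2 * np.linalg.inv(Gamma2) / n  # asymptotic covariance of (phi1_hat, phi2_hat)
print(np.round(n * acov, 4))               # diagonal 0.4375, off-diagonal -0.375

# Half-width of the 95% confidence interval for each coefficient
half_width = 1.96 * np.sqrt(acov[0, 0])
print(round(half_width, 6))  # 0.040996
```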

3. PACF & Missing Data (5pts)


(a) Consider the MA(1) process

X_t = θ W_{t-1} + W_t,   W_t ∼ WN(0, σ²).

Use the projection theorem to find the value at lag 2 of the corresponding PACF, i.e. find
φ_{2,2} in X_3^2 = φ_{2,1} X_2 + φ_{2,2} X_1.
(b) Consider the stationary AR(1) process

X_t = φ X_{t-1} + W_t,   W_t ∼ WN(0, σ²).

Assume you observed X_1 and X_3, and you would like to estimate the missing value X_2. Find
the best linear predictor of X_2 given X_1 and X_3.

Solution. (a) By the projection theorem

Cov(X_3 − X_3^2, X_i) = 0,   i = 1, 2.

Hence

Cov(X_3 − X_3^2, X_1) = Cov(X_3 − φ_{2,1} X_2 − φ_{2,2} X_1, X_1)
                      = Cov(X_3, X_1) − φ_{2,1} Cov(X_2, X_1) − φ_{2,2} Cov(X_1, X_1)
                      = γ(2) − φ_{2,1} γ(1) − φ_{2,2} γ(0) = 0

and

Cov(X_3 − X_3^2, X_2) = Cov(X_3 − φ_{2,1} X_2 − φ_{2,2} X_1, X_2) = γ(1) − φ_{2,1} γ(0) − φ_{2,2} γ(1) = 0.

Since we have an MA(1) process, it has ACVF

γ(h) = σ²(1 + θ²),   h = 0
     = σ² θ,         |h| = 1
     = 0,            otherwise.

Thus, using γ(2) = 0, we have to solve the equations

φ_{2,1} γ(1) + φ_{2,2} γ(0) = 0
(1 − φ_{2,2}) γ(1) − φ_{2,1} γ(0) = 0.

Solving this system of equations we find

φ_{2,2} = −θ² / (θ⁴ + θ² + 1).
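The closed form can be cross-checked by solving the 2×2 projection system directly; a sketch with numpy (the function name and the test value θ = 0.7 are arbitrary choices of ours):

```python
import numpy as np

def pacf2_ma1(theta, sigma2=1.0):
    """Lag-2 partial autocorrelation of X_t = theta*W_{t-1} + W_t,
    obtained by solving the projection (normal) equations."""
    g0, g1, g2 = sigma2 * (1 + theta**2), sigma2 * theta, 0.0
    # [g0 g1; g1 g0] (phi21, phi22)^T = (g1, g2)^T
    Gamma = np.array([[g0, g1], [g1, g0]])
    phi21, phi22 = np.linalg.solve(Gamma, [g1, g2])
    return phi22

theta = 0.7
print(pacf2_ma1(theta), -theta**2 / (1 + theta**2 + theta**4))  # the two agree
```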

(b) We are interested in X_2^{1,3} = φ_{2,1} X_1 + φ_{2,3} X_3. Following the same idea as in
part (a), and dividing through by γ(0) so that everything is in terms of ρ(h) = φ^{|h|}, it
follows that

( 1    φ² ) ( φ_{2,1} )   ( φ )
( φ²   1  ) ( φ_{2,3} ) = ( φ ),   so   φ_{2,1} = φ_{2,3} = φ / (1 + φ²),

hence:

X_2^{1,3} = φ/(1 + φ²) (X_1 + X_3).
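A quick numerical cross-check of the interpolation weights (the value φ = 0.6 is an arbitrary choice of ours):

```python
import numpy as np

def interpolate_x2(x1, x3, phi):
    """Best linear predictor of the missing X2 given X1 and X3, stationary AR(1)."""
    return phi / (1 + phi**2) * (x1 + x3)

# Cross-check via the normal equations in terms of rho(h) = phi**|h|
phi = 0.6
R = np.array([[1.0, phi**2], [phi**2, 1.0]])  # correlations among (X1, X3)
b = np.array([phi, phi])                      # correlations of X2 with (X1, X3)
weights = np.linalg.solve(R, b)
print(weights, phi / (1 + phi**2))  # both weights equal phi/(1 + phi^2)
```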

4. Mathematical Proof (5pts)


(a) Consider an AR(2) process. Write a difference equation for the ACVF, parametrized only
with respect to φ1, φ2 and σ², and provide the steps to solve it.
(b) Consider a stationary AR(p) process. Argue that with probability 1 − α, the estimate
φ̂_p is in the ellipsoid

R_n : (φ̂_p − φ_p)ᵀ Γ_p (φ̂_p − φ_p) ≤ (σ²/n) χ²_{1−α}(p),

where χ²_{1−α}(p) is the (1 − α) quantile of the chi-squared distribution with p degrees of freedom.

Solution. (a) Multiplying the AR(2) equation by X_{t−h} and taking expectations, and
following the results from question 2 b, one arrives at the difference equation

γ_h = φ1 γ_{h−1} + φ2 γ_{h−2},   h ≥ 2,

with initial conditions

γ_0 = σ²(1 − φ2) / [(1 + φ2)((1 − φ2)² − φ1²)]
γ_1 = σ² φ1 / [(1 + φ2)((1 − φ2)² − φ1²)].

To solve it, substitute γ_h = m^h to obtain the characteristic equation m² − φ1 m − φ2 = 0,
with roots

m_{1,2} = (φ1 ± √(φ1² + 4φ2)) / 2.

The solution reads

γ_h = c_1 m_1^h + c_2 m_2^h,

where the constants are fixed by the initial conditions, c_1 + c_2 = γ_0 and
c_1 m_1 + c_2 m_2 = γ_1, giving

c_1 = (γ_1 − m_2 γ_0)/(m_1 − m_2),   c_2 = (m_1 γ_0 − γ_1)/(m_1 − m_2).

The ACF is then provided by ρ(h) = γ(h)/γ(0), with the values previously computed.
(b) Note that

√n (φ̂_p − φ_p) ≈ N(0, σ² Γ_p⁻¹),   as n → ∞,

and that with probability 1 − α, the j-th element of φ_p, φ_{p,j}, is in the interval

φ̂_{p,j} ± z_{1−α/2} (1/√n) (σ² Γ_p⁻¹)_{jj}^{1/2},

where z_{1−α/2} is the 1 − α/2 quantile of the standard normal. Moreover, notice that

V{ Γ_p^{1/2} (φ̂_p − φ_p) } = Γ_p^{1/2} V[φ̂_p] Γ_p^{1/2} = (σ²/n) I.

Thus,

v = (n/σ²)^{1/2} Γ_p^{1/2} (φ̂_p − φ_p) ≈ N(0, I)   and   vᵀv ≈ χ²(p),

so the event vᵀv ≤ χ²_{1−α}(p), which has probability 1 − α, is exactly the ellipsoid

(φ̂_p − φ_p)ᵀ Γ_p (φ̂_p − φ_p) ≤ (σ²/n) χ²_{1−α}(p),

which completes the argument.
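The ellipsoid membership test is a one-line quadratic-form comparison; a sketch with numpy and scipy (the function name and the made-up estimate φ̂ are ours; Γ_2 uses the values from question 2):

```python
import numpy as np
from scipy.stats import chi2

def in_confidence_ellipsoid(phi_hat, phi, Gamma_p, sigma2, n, alpha=0.05):
    """True if phi lies inside the (1 - alpha) ellipsoid around phi_hat."""
    d = np.asarray(phi_hat, dtype=float) - np.asarray(phi, dtype=float)
    p = d.size
    return float(d @ Gamma_p @ d) <= (sigma2 / n) * chi2.ppf(1 - alpha, df=p)

# AR(2) illustration with Gamma_2 from question 2; phi_hat is a hypothetical estimate
Gamma2 = np.array([[8.6154, 7.3846], [7.3846, 8.6154]])
print(in_confidence_ellipsoid([1.51, -0.74], [1.5, -0.75], Gamma2, 1.0, 1000))  # True
```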

5. Binary Options in Cryptocurrencies (5pts)


Download price data from:
- Ether: https://etherscan.io/chart/etherprice
- Bitcoin: https://www.coindesk.com/price
for the time span 01.01.2016 to 13.02.2017. In addition, divide the data into two parts:
- Training set: 01.01.2016 to 04.02.2017.
- Validation set: 05.02.2017 to 13.02.2017.
(a) Suppose you are only interested in forecasting the sign of the return of ether. What is the
best ARMA model you can build for this task? Fit the model and provide your success
ratio in the training set. You can use lags and transformations for the prices of bitcoin
and ether to improve your estimation.
(b) Fix the coefficients obtained in the previous question (training set) and compute your
success ratio in the validation set. Should you modify your model selection criteria?
Comment on the results.
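The exercise is open-ended, so the following is only a minimal sketch of how a sign-forecast success ratio could be computed: the synthetic AR(1) series stands in for the downloaded ether returns, and the least-squares AR(1) fit stands in for whatever ARMA model you select.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the ether return series (the real one comes from the
# etherscan.io download); an AR(1) with mild persistence, purely illustrative.
phi, n = 0.3, 400
r = np.zeros(n)
for t in range(1, n):
    r[t] = phi * r[t - 1] + rng.normal()

train, valid = r[:300], r[300:]

# Least-squares AR(1) fit on the training set (stand-in for the chosen ARMA model)
phi_hat = np.sum(train[1:] * train[:-1]) / np.sum(train[:-1] ** 2)

def success_ratio(series, coef):
    """Fraction of correctly forecast return signs, one step ahead."""
    pred = np.sign(coef * series[:-1])
    return float(np.mean(pred == np.sign(series[1:])))

print(success_ratio(train, phi_hat), success_ratio(valid, phi_hat))
```

Fixing the coefficient from the training set and re-evaluating on the validation set, as in part (b), is exactly the second call; with real data the drop (or not) of the out-of-sample ratio is what the question asks you to comment on.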
