
Communications in Statistics—Simulation and Computation®, 37: 1167–1182, 2008

Copyright © Taylor & Francis Group, LLC


ISSN: 0361-0918 print/1532-4141 online
DOI: 10.1080/03610910701877579

Time Series Analysis

Predictive Density Order Selection of Periodic AR Models

MOHAMED BENTARZI¹, HAFIDA GUERBYENNE¹, AND ROUKIA HEMIS²

¹Faculty of Mathematics, University of Science and Technology Houari Boumediene, Algiers, Algeria
²Faculty of Sciences, Ferhat Abbas University, Sétif, Algeria

This article considers the order selection problem of periodic autoregressive models. Our main goal is the adaptation of the Bayesian Predictive Density Criterion (PDC), established by Djurić and Kay (1992) for selecting the order of a stationary autoregressive model, to deal with the order identification problem of a periodic autoregressive model. The performance of the established criterion (P-PDC) is compared, via simulation studies, to the performances of some well-known existing criteria.

Keywords Bayesian approach; Noninformative prior densities; Periodic autoregressive model; Predictive density criterion.

Mathematics Subject Classification 62.

1. Introduction
Considerable interest has been given, over the last two decades, to the class of periodic linear autoregressive moving average models, PARMA_S(p_t, q_t) (cf. Adams and Goodwin, 1995; Anderson and Vecchia, 1993; Bai et al., 1988; Bentarzi and Aknouche, 2005; Bentarzi and Hallin, 1994; Herwartz, 1997; Osborn, 1991; Osborn and Smith, 1989; Pagano, 1978; Ula, 1990, 1993; Ula and Smadi, 1997; Vecchia, 1985; Vecchia and Ballerini, 1992, and others), as a promising alternative to the traditional class of time-invariant seasonal autoregressive integrated moving average time-series models, SARIMA, popularized through the well-known book of Box and Jenkins (1976). As a consequence of the recognition that many time series encountered in practice exhibit periodic autocorrelation structures, a feature that cannot be accounted for by the traditional seasonal models, the identification, estimation, periodic stationarity (causality), invertibility, and testing problems tied to autoregressive moving average models with periodic coefficients

Received May 16, 2007; Accepted December 19, 2007


Address correspondence to Mohamed Bentarzi, Université des Sciences et de la
Technologie, U. S. T. H. B., Algiers, Algeria; E-mail: mohamedbentarzi@yahoo.fr


were intensively studied by many authors. This article is mainly devoted to the identification problem of Periodic Autoregressive (P-AR) models, which are the most frequently used in a diversity of theoretical and practical studies in many fields such as transmission, telecommunications, signal processing, meteorology, financial markets, and many others.
The rest of this article is organized as follows. The basic notations, definitions, and regularity conditions tied to the underlying periodic autoregressive process are presented in the next section. In Sec. 3, we extend the classical Predictive Density Criterion (PDC), derived by Djurić and Kay (1992) for selecting the order of a stationary autoregressive model, to deal with the order identification of a periodic autoregressive model. Moreover, we review some other criteria: the Akaike Information Criterion (AIC) (Akaike, 1974; Shibata, 1976), the Consistent Information Criterion (CIC) (Ciftcioglu, 1994), and the Minimum Description Length (MDL) (Rissanen, 1978; Schwarz, 1978). Section 4 is devoted to several simulation studies assessing the performance of the established periodic P-PDC criterion and comparing it with the performances of the aforementioned criteria.

2. Definitions and Regularity Conditions


A univariate, second-order, periodically correlated process with period S, {y_t, t ∈ ℤ}, is said to have a periodic autoregressive representation, with period S and order k_t, if it is a solution of a linear stochastic difference equation of the form:

    y_t + φ_{t,1} y_{t−1} + ⋯ + φ_{t,k_t} y_{t−k_t} = ε_t,   t ∈ ℤ,        (2.1)

where {ε_t} is an uncorrelated process, with zero mean and variance σ_t², which is uncorrelated with the past of the underlying process {y_t, t ∈ ℤ}, i.e., Cov(ε_s, y_t) = 0 for t < s. Moreover, the time-dependent autoregressive coefficients φ_{t,j}, j = 1, …, k_t, the variance σ_t², and the order k_t are periodic in time with period S, i.e., φ_{t+rS,j} = φ_{t,j} for j = 1, …, k_t, σ_{t+rS}² = σ_t², and k_{t+rS} = k_t, ∀ t, r ∈ ℤ. In this article, we confine ourselves to the particular case where the order k_t is time invariant and equal to k = max_{i ∈ {1,2,…,S}} k_i. Putting t = i + τS, i = 1, 2, …, S and τ ∈ ℤ, the model (2.1) can be written in the equivalent form

    y_{i+τS} + Σ_{j=1}^{k} φ_{i,j} y_{i−j+τS} = ε_{i+τS}.        (2.2)
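As an illustration, model (2.2) can be simulated season by season. The sketch below (Python; the coefficients, the 0-based season convention s = t mod S, and the burn-in length are our own choices, not values from the article) draws ε_t ∼ N(0, σ_s²) and solves the difference equation for y_t:

```python
import numpy as np

def simulate_par(phi, sigma2, n, burn=200, seed=0):
    """Simulate model (2.2): y_t + sum_{j=1}^k phi_{s,j} y_{t-j} = eps_t,
    eps_t ~ N(0, sigma2[s]), with season s = t mod S (0-based here,
    whereas the article indexes seasons 1..S)."""
    rng = np.random.default_rng(seed)
    S, k = phi.shape
    y = np.zeros(n + burn)
    for t in range(k, n + burn):
        s = t % S
        eps = rng.normal(0.0, np.sqrt(sigma2[s]))
        # solve the difference equation for y_t
        y[t] = eps - phi[s] @ y[t - k:t][::-1]
    return y[burn:]

# hypothetical periodically stationary PAR(1) with period S = 4
phi = np.array([[0.5], [-0.3], [0.2], [0.4]])
sigma2 = np.array([1.0, 2.0, 1.0, 0.5])
y = simulate_par(phi, sigma2, n=400)
```

The burn-in discards the arbitrary zero initialization so that the retained sample is close to the periodically stationary regime.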

  
Let φ^{(k)} = (φ_1^{(k)′}, φ_2^{(k)′}, …, φ_S^{(k)′})′ be the kS × 1 column vector of the autoregressive parameters of the periodic model (2.2), where the vector φ_i^{(k)} = (φ_{i,1}^{(k)}, φ_{i,2}^{(k)}, …, φ_{i,k}^{(k)})′, with φ_{i,j}^{(k)} denoting the jth coefficient of the periodic autoregressive model of order k for the season i, i = 1, 2, …, S, j = 1, 2, …, k. Consider the (k+1)S × 1 vector of the unknown parameters of the model, θ^{(k)} = (φ^{(k)′}, σ′)′, where σ = (σ_1, σ_2, …, σ_S)′. We consider the following assumptions, under which we derive our extended predictive density criterion.

A1. The distribution of the innovation process ε_t is supposed to be Gaussian with zero mean and periodic variance σ_t², i.e., ε_t ∼ N(0, σ_t²).

A2. The probability distribution of the parameter vector θ^{(k)}, given k, is given by the family of noninformative prior joint densities (Box and Tiao, 1992):

    π(φ^{(k)}, σ / k) ∝ ∏_{i=1}^{S} (1/σ_i),   φ^{(k)} ∈ ℝ^{Sk}, σ ∈ ℝ_+^{*S}, and k ∈ ℕ.

A3. The autoregressive parameters φ^{(k)} satisfy the necessary and sufficient condition for the periodic autoregressive process to be periodically stationary (causal). Using the so-called "order span lumping" approach, which considers a lumping of the periodic process over a span of the order k (cf. Bentarzi, 1998; Bentarzi and Hallin, 1994; Ula and Smadi, 1997), we can obtain a necessary and sufficient condition for a periodic P-AR model to be causal. Indeed, letting Y_T = (y_{kT}, y_{kT−1}, …, y_{kT−k+1})′ and ε_T = (ε_{kT}, ε_{kT−1}, …, ε_{kT−k+1})′, where k = max_{1 ≤ i ≤ S} k_i, one can rewrite (2.1) as a periodic k-variate VAR model:

    Φ_{T,0} Y_T + Φ_{T,1} Y_{T−1} = ε_T,        (2.3)

where the coefficient matrices Φ_{T,0} and Φ_{T,1},

    [Φ_{T,0}]_{ij} = { 1 if i = j;  0 if i > j;  φ_{kT−i+1, j−i} if i < j },        (2.4a)

and

    [Φ_{T,1}]_{ij} = { 0 if i < j;  φ_{kT−i+1, k+j−i} if i ≥ j },        (2.4b)

are periodic with period ν, such that νk is the least common multiple of k and S. From the representation (2.3), which is equivalent to the periodic model (2.1), one can easily verify that a necessary and sufficient condition for the model (2.3), and hence for the model (2.1), to be causal is that the eigenvalues of the matrix (see Bentarzi and Hallin, 1994; Ula and Smadi, 1997)

    Ω = Φ_{ν,0}^{−1} Φ_{ν,1} Φ_{ν−1,0}^{−1} Φ_{ν−1,1} ⋯ Φ_{2,0}^{−1} Φ_{2,1} Φ_{1,0}^{−1} Φ_{1,1}

are all less than 1 in modulus.
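This eigenvalue condition is easy to check numerically. The sketch below (Python; the helper names and the mapping of the 1-based indices of (2.4a)–(2.4b) onto 0-based arrays are ours) builds the lumped coefficient matrices and tests the eigenvalues of Ω:

```python
import numpy as np
from math import lcm

def lumped_matrices(phi, S):
    """Coefficient matrices Phi_{T,0}, Phi_{T,1} of the lumped k-variate
    VAR (2.3), built from (2.4a)-(2.4b); phi[i-1, j-1] holds phi_{i,j}
    for season i = 1..S."""
    k = phi.shape[1]
    nu = lcm(k, S) // k              # period of the lumped representation
    def coef(t, j):                  # phi_{t,j}, S-periodic in t (1-based)
        return phi[(t - 1) % S, j - 1]
    P0, P1 = [], []
    for T in range(1, nu + 1):
        A, B = np.eye(k), np.zeros((k, k))
        for i in range(1, k + 1):
            for j in range(1, k + 1):
                if i < j:
                    A[i - 1, j - 1] = coef(k * T - i + 1, j - i)
                else:                # i >= j
                    B[i - 1, j - 1] = coef(k * T - i + 1, k + j - i)
        P0.append(A)
        P1.append(B)
    return P0, P1

def is_periodically_causal(phi, S):
    """Eigenvalue condition on Omega = prod_{T = nu..1} Phi_{T,0}^{-1} Phi_{T,1}."""
    P0, P1 = lumped_matrices(phi, S)
    Omega = np.eye(phi.shape[1])
    for A, B in zip(P0[::-1], P1[::-1]):   # T = nu down to 1
        Omega = Omega @ np.linalg.solve(A, B)
    return bool(np.all(np.abs(np.linalg.eigvals(Omega)) < 1))

# PAR(1), S = 2: y_t = a_s y_{t-1} + eps_t, i.e., phi_{s,1} = -a_s
print(is_periodically_causal(np.array([[-2.0], [-0.3]]), S=2))  # True
```

For a PAR(1) with period 2 the condition reduces to the classical requirement |a_1 a_2| < 1 on the product of the seasonal coefficients, so a season with |a_s| > 1 can still be periodically causal, as in the example above.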

3. Periodic Predictive Density Criterion

3.1. Calculation of the Predictive Density for a Periodic Autoregressive Process

Consider the notations

    y_{k+j}(m, S) = (y_{k+j}, y_{k+j+S}, …, y_{k+j+mS})′,

    ỹ_{k+j,τ} = (y_{k+j+τS−1}, y_{k+j+τS−2}, …, y_{k+j+τS−k})′,


    V_{k+j}(m) = I − H_{k+j}(m) W_{k+j}(m)^{−1} H_{k+j}(m)′,

    W_{k+j}(m) = H_{k+j}(m)′ H_{k+j}(m),

where

    H_{k+j}(m) = (ỹ_{k+j,0}, ỹ_{k+j,1}, …, ỹ_{k+j,m})′

                 ⎡ y_{k+j−1}     y_{k+j−2}     ⋯  y_j      ⎤
               = ⎢ y_{k+j+S−1}   y_{k+j+S−2}   ⋯  y_{j+S}  ⎥
                 ⎢     ⋮             ⋮              ⋮      ⎥
                 ⎣ y_{k+j+mS−1}  y_{k+j+mS−2}  ⋯  y_{j+mS} ⎦

and where 1 ≤ j ≤ S, τ = 0, 1, …, m, and m = ⌊(t−k)/S⌋ denotes the largest integer less than or equal to (t−k)/S. Finally, let y_1^{t−1} denote the data y_1, …, y_{t−1}. We can, under the three assumptions A1, A2, and A3, state the following result, which extends the result of Djurić and Kay (1992) to deal with a P-AR model.
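In these notations, W_{k+j}(m) is the Gram matrix of the lagged-regressor matrix H_{k+j}(m), and V_{k+j}(m) is the projector onto the orthogonal complement of its column space. A small numerical sketch (Python; the 0-based indexing of the data array, with y[0] holding y_1, is our convention):

```python
import numpy as np

def H_matrix(y, k, j, m, S):
    """H_{k+j}(m): row tau holds (y_{k+j+tau*S-1}, ..., y_{j+tau*S}) for
    tau = 0..m; the data array is 0-based, y[0] = y_1."""
    return np.array([[y[k + j + tau * S - 2 - l] for l in range(k)]
                     for tau in range(m + 1)])

def W_and_V(y, k, j, m, S):
    """W_{k+j}(m) = H'H and V_{k+j}(m) = I - H W^{-1} H'."""
    H = H_matrix(y, k, j, m, S)
    W = H.T @ H
    V = np.eye(m + 1) - H @ np.linalg.inv(W) @ H.T
    return W, V

# quick check on synthetic data
rng = np.random.default_rng(1)
y = rng.normal(size=40)
W, V = W_and_V(y, k=2, j=1, m=5, S=4)
```

As a projector, V_{k+j}(m) is symmetric, idempotent, and annihilates the columns of H_{k+j}(m), which is easy to verify numerically.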

Proposition 3.1. Under the assumptions A1, A2, and A3, the predictive density of a periodically correlated process y_t, solution of the S-periodic model (2.1), is given, for t − k = J_0 + mS, 0 ≤ J_0 ≤ S − 1, by:

(1) For k = 0, i.e., when the process {y_t, t ∈ ℤ} is a periodic white noise process, the predictive density is given as follows:

(a) For J_0 = 0 and m ≥ 2,

    f(y_t / y_1^{t−1}, k = 0) = [Γ(m/2) / (√π Γ((m−1)/2))] × [y_S(m−2, S)′ y_S(m−2, S)]^{(m−1)/2} / [y_S(m−1, S)′ y_S(m−1, S)]^{m/2}.        (3.1)

(b) For 1 ≤ J_0 ≤ S − 1, k = 0, and m ≥ 1, we have

    f(y_t / y_1^{t−1}, k = 0) = [Γ((m+1)/2) / (√π Γ(m/2))] × [y_{J_0}(m−1, S)′ y_{J_0}(m−1, S)]^{m/2} / [y_{J_0}(m, S)′ y_{J_0}(m, S)]^{(m+1)/2}.        (3.2)

(2) For k ∈ ℕ*:

(a) For J_0 = 0 and m ≥ k + 2,

    f(y_t / y_1^{t−1}, k > 0) = (1/√π) [Γ((m−k)/2) / Γ((m−k−1)/2)] × [|W_{k+S}(m−2)|^{1/2} / |W_{k+S}(m−1)|^{1/2}]
        × [y_{k+S}(m−2, S)′ V_{k+S}(m−2) y_{k+S}(m−2, S)]^{(m−k−1)/2} / [y_{k+S}(m−1, S)′ V_{k+S}(m−1) y_{k+S}(m−1, S)]^{(m−k)/2}.        (3.3)

(b) For 1 ≤ J_0 ≤ S − 1 and m ≥ k + 1,

    f(y_t / y_1^{t−1}, k > 0) = (1/√π) [Γ((m−k+1)/2) / Γ((m−k)/2)] × [|W_{k+J_0}(m−1)|^{1/2} / |W_{k+J_0}(m)|^{1/2}]
        × [y_{k+J_0}(m−1, S)′ V_{k+J_0}(m−1) y_{k+J_0}(m−1, S)]^{(m−k)/2} / [y_{k+J_0}(m, S)′ V_{k+J_0}(m) y_{k+J_0}(m, S)]^{(m−k+1)/2}.        (3.4)

Proof. See Appendix 1.

It is worth noting that the results given by this proposition reduce, in the stationary case (i.e., S = 1), to the predictive density f(y_t / y_1^{t−1}, k) given in Djurić and Kay (1992).
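To see concretely what this gives in the simplest case: under our reading of (3.1), for k = 0 and J_0 = 0 the next season-S observation follows a Student-type law built from the sum of squares a of its m − 1 same-season predecessors, proportional to a^{(m−1)/2}/(a + y²)^{m/2}. A numerical sketch of this reading (Python; the function name and test data are ours), including a check that the density integrates to 1:

```python
import numpy as np
from math import gamma, pi, sqrt

def f_k0(y_new, past_same_season):
    """Order-0 predictive density in the Student-type form of (3.1):
    with the m - 1 earlier same-season observations collected in
    past_same_season, f(y) = c * a^{(m-1)/2} / (a + y^2)^{m/2}."""
    a = float(np.sum(np.asarray(past_same_season) ** 2))
    m = len(past_same_season) + 1
    c = gamma(m / 2) / (sqrt(pi) * gamma((m - 1) / 2))
    return c * a ** ((m - 1) / 2) / (a + np.asarray(y_new) ** 2) ** (m / 2)

# sanity check: the density integrates to ~1 over the new observation
grid = np.linspace(-60.0, 60.0, 200001)
vals = f_k0(grid, [1.0, -2.0, 0.5, 1.5, -1.0])
mass = float(vals.sum() * (grid[1] - grid[0]))
```

The normalizing constant Γ(m/2)/(√π Γ((m−1)/2)) is exactly the one that makes this scaled Student density integrate to one, which the Riemann sum above confirms numerically.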

3.2. Computation of the Periodic PDC

Following Djurić and Kay (1992), the optimal order estimate of an autoregressive model is given by the integer k which minimizes the quantity

    PDC(k) = − Σ_{t=2}^{N} ln f(y_t / y_1^{t−1}, k);

the order estimate is then given by

    k̂_P = arg min_{0 ≤ k ≤ K} { − Σ_{t=2}^{N} ln f(y_t / y_1^{t−1}, k) },

where N is the size of the observed time series generated by a periodically stationary process satisfying the underlying periodic autoregressive model of order k, and K is a sufficiently large positive integer. For the applicability of the predictive density criterion, all candidate orders must be compared on the same number of observations. To solve this question, we adopt the same computation procedure used by Djurić and Kay (1992) in the stationary case. Thus, for order k = 0, we set

    DK_0 = − Σ_{t=2S}^{N} ln f(y_t / y_1^{t−1}, k = 0),

and for k = 1, we set

    DK_1 = − Σ_{t=4S}^{N} ln f(y_t / y_1^{t−1}, k = 1).

This last expression cannot be directly compared to DK_0. Indeed, DK_0 and DK_1 are computed from two different numbers of observations (N − 2S + 1 and N − 4S + 1, respectively). However, we have the appropriate form

    DK_1 = − Σ_{t=2S}^{4S−1} ln f(y_t / y_1^{t−1}, k = 0) − Σ_{t=4S}^{N} ln f(y_t / y_1^{t−1}, k = 1),

and for k = 2, we have:

    DK_2 = − Σ_{t=2S}^{4S−1} ln f(y_t / y_1^{t−1}, k = 0) − Σ_{t=4S}^{6S−1} ln f(y_t / y_1^{t−1}, k = 1) − Σ_{t=6S}^{N} ln f(y_t / y_1^{t−1}, k = 2).

In the same manner, we can express recursively the quantities DK_k, for k > 2, as follows:

    DK_k = − Σ_{r=0}^{k−1} Σ_{t=2(r+1)S}^{2(r+2)S−1} ln f(y_t / y_1^{t−1}, r) − Σ_{t=2(k+1)S}^{N} ln f(y_t / y_1^{t−1}, k),

all of which are computed on the basis of the same number of observations, N − 2S + 1. Hence, they are adequate for comparison. The order estimate is then given by the value of k which minimizes DK_k, i.e., k̂_P = arg min_{k ∈ {0,1,2,…,K}} DK_k.
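The bookkeeping above can be sketched as follows (Python; logf stands for a user-supplied log predictive density, hypothetical here, and the synthetic example at the end only exercises the index ranges):

```python
import numpy as np

def pdc_order(logf, N, S, K):
    """Assemble DK_0, ..., DK_K so that each one sums the same
    N - 2S + 1 log predictive density terms; logf(t, k) stands for
    ln f(y_t | y_1^{t-1}, k) with 1-based t."""
    DK = []
    for k in range(K + 1):
        total = 0.0
        for r in range(k):  # lower-order terms borrowed from DK_r
            for t in range(2 * (r + 1) * S, 2 * (r + 2) * S):
                total -= logf(t, r)
        for t in range(2 * (k + 1) * S, N + 1):  # order-k terms
            total -= logf(t, k)
        DK.append(total)
    return int(np.argmin(DK)), DK

# synthetic log-density favouring order 1, just to exercise the ranges
khat, DK = pdc_order(lambda t, k: -abs(k - 1), N=100, S=4, K=3)
```

With N = 100 and S = 4, every DK_k above accumulates exactly N − 2S + 1 = 93 terms, so the candidate orders are compared on the same information.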

4. Simulation Studies and Comments

4.1. Some Classical Order Selection Criteria and their Adaptation to the Periodic Case

In this section, we are interested, on the one hand, in studying, via Monte Carlo experiments, the performance of the periodic PDC order selection criterion and, on the other hand, in comparing the performance of the PDC to the performances of some well-known classical criteria such as the Akaike Information Criterion (AIC) (Akaike, 1974; Shibata, 1976), the Consistent Information Criterion (CIC) (Ciftcioglu, 1994), and the Minimum Description Length (MDL) (cf. Rissanen, 1978; Schwarz, 1978). We recall that, in the stationary case, the essence of these criteria is to find the argument which minimizes, with respect to k, the quantities

    AIC(k) = ln σ̂_k² + 2k/N,   CIC(k) = ln σ̂_k² + 2k ln(N)/N,   and   MDL(k) = ln σ̂_k² + k ln(N)/N.

These criteria can be adapted to the periodic case as follows (for more details, one can see Franses, 1996, p. 105):

    k̂_P = arg min_{k ∈ {0,1,2,…,K}} P-AIC(k),   where P-AIC(k) = Σ_{i=1}^{S} ln σ̂_i^{2(k)} + 2kS/M,

    k̂_P = arg min_{k ∈ {0,1,2,…,K}} P-CIC(k),   where P-CIC(k) = Σ_{i=1}^{S} ln σ̂_i^{2(k)} + 2kS ln(M)/M,   and

    k̂_P = arg min_{k ∈ {0,1,2,…,K}} P-MDL(k),   where P-MDL(k) = Σ_{i=1}^{S} ln σ̂_i^{2(k)} + kS ln(M)/M,

where M is such that the sample size N = MS, and where σ̂_i^{2(k)} is the ordinary least squares estimate of the white noise variance of the periodic autoregressive process of order k for the season i, i = 1, 2, …, S.
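These three periodic criteria are straightforward to compute once the seasonal residual variances are available. The sketch below (Python) evaluates them on a candidate grid; the per-season conditional least squares fit and the 0-based season convention are our own minimal choices, whereas the article obtains the parameter estimates with Boshnakov's recursive algorithm:

```python
import numpy as np

def periodic_criteria(y, S, K):
    """Evaluate P-AIC, P-CIC, and P-MDL for orders k = 0..K, with the
    seasonal residual variances obtained by a per-season conditional
    least squares fit (season s = t mod S, a 0-based convention)."""
    N = len(y)
    M = N // S
    crit = {"P-AIC": [], "P-CIC": [], "P-MDL": []}
    for k in range(K + 1):
        logsum = 0.0
        for s in range(S):
            ts = [t for t in range(k, N) if t % S == s]
            Y = np.array([y[t] for t in ts])
            if k == 0:
                resid = Y  # zero-mean periodic white noise fit
            else:
                X = np.array([y[t - k:t][::-1] for t in ts])
                beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
                resid = Y - X @ beta
            logsum += np.log(np.mean(resid ** 2))
        pen = k * S / M
        crit["P-AIC"].append(logsum + 2 * pen)
        crit["P-CIC"].append(logsum + 2 * pen * np.log(M))
        crit["P-MDL"].append(logsum + pen * np.log(M))
    return {name: int(np.argmin(v)) for name, v in crit.items()}, crit

# strongly dependent AR(1) data: the fit should improve sharply at k = 1
rng = np.random.default_rng(0)
e = rng.normal(size=400)
y = np.empty(400)
y[0] = e[0]
for t in range(1, 400):
    y[t] = 0.9 * y[t - 1] + e[t]
orders, crit = periodic_criteria(y, S=4, K=3)
```

The three criteria share the same goodness-of-fit term and differ only in the penalty, so their disagreements are confined to how aggressively larger orders are discouraged.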
The simulation studies of the performances of these criteria were assessed
on many different time series generated from periodic autoregressive processes of

Table A2.4
i        1      2      3      4
φ_{i1}   1.7    1.95   2.2    1.5
φ_{i2}   −.7    −.95   −1.2   −1
σ_i²     4      9      16     9

Table A2.7 (i = 1, …, 7)
φ_{i1}:  −2 12 −3 17 1 24 −1 4
φ_{i2}:  35 − 27 54 −7 − 24 20 − 45
σ_i²:    1 2 9 16 9 16 25

different orders (PAR(2), PAR(3)) and periods (4 and 7) for a variety of sample sizes (80, 100, 120, 140, and 190). For each model, we consider 10,000 Monte Carlo replications. The autoregressive parameters and variances of each of the four considered periodic autoregressive data-generating processes are given, for each period, in Tables A2.4, A2.7, A3.4, and A3.7, respectively. The obtained selection order frequencies are reported, respectively, in Tables B2.4.1–B2.4.3, B2.7.1–B2.7.3, B3.4.1–B3.4.3, and Fig. 1.

Data generating processes:

PAR(2): y_t + φ_{t,1} y_{t−1} + φ_{t,2} y_{t−2} = ε_t, where ε_t ∼ N(0, σ_t²), t ∈ ℤ;

PAR(3): y_t + φ_{t,1} y_{t−1} + φ_{t,2} y_{t−2} + φ_{t,3} y_{t−3} = ε_t, where ε_t ∼ N(0, σ_t²).

Table A3.4 (i = 1, …, 4)
φ_{i1}:  4 17 7 8
φ_{i2}:  5 5 − 15 1 06
φ_{i3}:  −9 − 72 6 −9
σ_i²:    1 16 2 9

Table A3.7 (i = 1, …, 7)
φ_{i1}:  43 5 −3 45 −3 25 −7
φ_{i2}:  14 −4 6 −1 7 − 34 65
φ_{i3}:  6 12 −8 5 −1 5 8 11
σ_i²:    1 2 9 16 9 16 25

Table B2.4.1
k = 2, S = 4, N = 80
p/C       AIC      CIC      MDL      PDC
0 and 1   0.00     0.00     0.00     0.00
2         22.72    76.57    38.93    94.50
3         64.12    23.37    58.71    4.35
4         6.55     0.03     1.69     0.77
5         3.31     0.01     0.36     0.24
6         1.50     0.00     0.07     0.10
7         1.80     0.02     0.24     0.04

Table B2.4.2
k = 2, S = 4, N = 100
p/C       AIC      CIC      MDL      PDC
0 and 1   0.00     0.00     0.00     0.00
2         23.11    73.61    39.33    96.23
3         66.25    26.39    59.60    3.31
4         6.71     0.00     0.91     0.36
5         2.15     0.00     0.13     0.08
6         0.95     0.00     0.02     0.02
7         0.83     0.00     0.01     0.00

Table B2.4.3
k = 2, S = 4, N = 120
p/C       AIC      CIC      MDL      PDC
0 and 1   0.00     0.00     0.00     0.00
2         23.06    72.10    40.17    97.37
3         68.02    27.89    59.33    2.26
4         5.78     0.00     0.45     0.27
5         1.89     0.00     0.03     0.09
6         0.71     0.01     0.02     0.00
7         0.54     0.00     0.00     0.01

4.2. Simulation Results and Comments

In this section, the performances of the AIC, CIC, MDL, and PDC criteria are studied and compared using simulations. Indeed, we present the results of 10,000 Monte Carlo replications for each aforementioned model. The main goal of this simulation study is to see, on the one hand, the performance of each of the well-known criteria AIC, CIC, and MDL when used in the periodic case and, on the other hand, to compare, in this situation, their performances to the performance of the adapted PDC criterion. The autoregressive parameters are chosen to satisfy the causality

Table B2.7.1
k = 2, S = 7, N = 120
p/C       AIC      CIC      MDL      PDC
0 and 1   0.07     25.96    0.74     3.98
2         74.97    74.02    95.56    92.35
3         10.26    0.02     2.89     2.97
4         4.05     0.00     0.52     0.59
5         3.11     0.00     0.15     0.09
6         2.85     0.00     0.11     0.01
7         4.69     0.00     0.03     0.01

Table B2.7.2
k = 2, S = 7, N = 140
p/C       AIC      CIC      MDL      PDC
0 and 1   0.01     15.73    0.30     1.64
2         81.44    84.26    97.71    96.08
3         9.20     0.01     1.75     2.04
4         3.79     0.00     0.20     0.20
5         1.91     0.00     0.02     0.04
6         1.68     0.00     0.02     0.00
7         1.97     0.00     0.00     0.00

Table B2.7.3
k = 2, S = 7, N = 160
p/C       AIC      CIC      MDL      PDC
0 and 1   0.00     11.99    0.12     0.63
2         83.75    88.00    98.62    97.99
3         8.73     0.01     1.14     1.27
4         3.10     0.00     0.11     0.10
5         1.78     0.00     0.01     0.01
6         1.34     0.00     0.00     0.00
7         1.30     0.00     0.00     0.00

condition. To see the behavior of these criteria when at least one characteristic root is close to the causality boundary, we have chosen, in the first model and in many others not reported here, the parameters such that one of the roots of the characteristic equation is, in modulus, in the neighbourhood of 1. As we shall see, in this last case the performances of AIC, CIC, and MDL are seriously affected, whereas the PDC's performance is not altered, because this last criterion does not require the parameter estimates. The parameter estimates, for the AIC, CIC, and MDL criteria, were obtained using Boshnakov's recursive

Table B3.4.1
k = 3, S = 4, N = 80
p/C       AIC      CIC      MDL      PDC
0 and 1   0.00     1.41     0.02     0.02
2         0.43     0.60     0.35     0.00
3         60.03    95.37    83.59    92.66
4         15.91    1.25     8.64     5.12
5         9.56     0.59     3.13     1.35
6         5.53     0.47     1.57     0.53
7         8.54     0.31     2.70     0.32

Table B3.4.2
k = 3, S = 4, N = 100
p/C       AIC      CIC      MDL      PDC
0 and 1   0.00     0.20     0.00     0.00
2         0.10     0.04     0.07     0.00
3         65.42    98.33    89.93    95.90
4         15.74    0.67     6.25     3.26
5         8.81     0.37     1.76     0.61
6         4.34     0.18     0.77     0.14
7         5.59     0.21     1.22     0.09

Table B3.4.3
k = 3, S = 4, N = 120
p/C       AIC      CIC      MDL      PDC
0 and 1   0.00     0.00     0.00     0.00
2         0.01     0.01     0.01     0.00
3         69.86    99.25    93.29    96.98
4         14.47    0.26     4.41     2.51
5         8.40     0.19     1.12     0.38
6         3.60     0.13     0.35     0.13
7         3.66     0.16     0.82     0.00

algorithm (Boshnakov, 1996). The programs are written in the Matlab 7.0 environment. First, we comment on the simulation results obtained using data generated from the periodic autoregressive AR_4(3) and AR_7(2) models.
From Tables B3.4.1–B3.4.3 (respectively, Tables B2.7.1–B2.7.3), one can easily note that all the behavioral properties of the criteria AIC, CIC, MDL, and PDC observed in the stationary case are also met in our periodic case. Thus, the AIC tends to overestimate the order of the model even for large sample sizes. In contrast, the criteria CIC, MDL, and PDC do not significantly overestimate, as it

Figure 1. Frequencies of the true order of the model A3.7 identified by the four criteria.

is well known in the stationary case, the true order. Moreover, for these criteria, one can clearly see that the identification percentages of the true order increase when the sample size increases; thus the consistency property satisfied by these criteria in the stationary case is also met in our periodic case. Also, from Tables B3.4.1–B3.4.3, we can see that the criteria CIC and PDC slightly outperform the MDL criterion, but this advantage disappears quickly when the sample size becomes sufficiently large. The simulation results assessed on time series generated from a periodic autoregressive process having at least one root in absolute value near 1 show, as we can note from Tables B2.4.1–B2.4.3, that the PDC criterion clearly outperforms all the other criteria. This is due to the fact that the estimation procedure is not stable in this case, and hence the AIC, MDL, and CIC estimated percentages are altered, whereas the PDC criterion, since it does not use the parameter estimates, is safe from this undesirable behavior.

Finally, for the model A3.7, where the number of parameters is rather large, the simulation results are presented in Fig. 1. It is worth noting that, for small sample sizes, the PDC and CIC criteria always give the best results. Moreover, when the sample sizes are rather large, the frequencies of the choice of the true order by the PDC and CIC are very close and strongly consistent.

Appendix 1
Proof of Proposition 3.1. Using the Bayesian approach, we obtain the predictive density f(y_t / y_1^{t−1}, k) of the periodically correlated autoregressive process y_t, where y_1^{t−1} denotes the data y_1, …, y_{t−1}. Supposing that t − k = J_0 + mS, where 0 ≤ J_0 ≤ S − 1 and m ∈ ℕ, the predictive density can be written in the form:

(a) For J_0 = 0 and k > 0, we have:

    f(y_t / y_1^{t−1}, k > 0) = C (2π)^{−(t−k)/2} ∫∫ ( ∏_{j=1}^{S} σ_{k+j}^{−1} ) ( ∏_{r=1}^{mS} σ_{k+r}^{−1} )
        × exp{ − Σ_{r=1}^{mS} (1/(2σ_{k+r}²)) ( y_{k+r} + Σ_{j=1}^{k} φ_{k+r,j}^{(k)} y_{k+r−j} )² } dφ^{(k)} dσ.

Letting r = i + τS, i = 1, 2, …, S and τ = 0, 1, …, m − 1, we can rewrite the last expression in the form:

    f(y_t / y_1^{t−1}, k > 0) = C (2π)^{−(t−k)/2} ∫∫ ∏_{i=1}^{S} σ_{k+i}^{−(m+1)}
        × exp{ − Σ_{i=1}^{S} (1/(2σ_{k+i}²)) Σ_{τ=0}^{m−1} ( y_{k+i+τS} + Σ_{j=1}^{k} φ_{k+i,j}^{(k)} y_{k+i+τS−j} )² } dφ^{(k)} dσ.

Handling these integrals, we obtain the predictive density, which is given, for J_0 = 0, m > k, and k ∈ ℕ*, by:

    f(y_t / y_1^{t−1}, k > 0) = C (2π)^{−S(m−k)/2} × 2^{S(m−k−2)/2} × Γ((m−k)/2)^{S}
        × ∏_{j=1}^{S} |W_{k+j}(m−1)|^{−1/2}
        × ∏_{j=1}^{S} [ y_{k+j}(m−1, S)′ V_{k+j}(m−1) y_{k+j}(m−1, S) ]^{−(m−k)/2},        (A.1)

where the constant C will be calculated later.


(b) For 1 ≤ J_0 ≤ S − 1 and k > 0, the predictive density is given by:

    f(y_t / y_1^{t−1}, k > 0) = C (2π)^{−(t−k)/2} ∫∫ ( ∏_{j=1}^{S} σ_{k+j}^{−(m+1)} ) ( ∏_{j=1}^{J_0} σ_{k+j}^{−1} )
        × exp{ − Σ_{r=1}^{mS} (1/(2σ_{k+r}²)) ( y_{k+r} + Σ_{i=1}^{k} φ_{k+r,i}^{(k)} y_{k+r−i} )²
               − Σ_{r=1}^{J_0} (1/(2σ_{k+r}²)) ( y_{k+r+mS} + Σ_{i=1}^{k} φ_{k+r,i}^{(k)} y_{k+r−i+mS} )² } dφ^{(k)} dσ.

These integrals can be written, for r = j + τS, j = 1, 2, …, S and τ = 0, 1, …, m, in the form (where φ̂_{k+j}^{(k)}(m) = W_{k+j}(m)^{−1} H_{k+j}(m)′ y_{k+j}(m, S) denotes the least squares value completing the square):

    f(y_t / y_1^{t−1}, k > 0) = C (2π)^{−(t−k)/2} ∫∫ ( ∏_{j=1}^{S} σ_{k+j}^{−(m+1)} ) ( ∏_{j=1}^{J_0} σ_{k+j}^{−1} )
        × exp{ − Σ_{j=1+J_0}^{S} y_{k+j}(m−1, S)′ V_{k+j}(m−1) y_{k+j}(m−1, S) / (2σ_{k+j}²) }
        × exp{ − Σ_{j=1}^{J_0} y_{k+j}(m, S)′ V_{k+j}(m) y_{k+j}(m, S) / (2σ_{k+j}²) }
        × exp{ − Σ_{j=1+J_0}^{S} ( φ_{k+j}^{(k)} − φ̂_{k+j}^{(k)}(m−1) )′ W_{k+j}(m−1) ( φ_{k+j}^{(k)} − φ̂_{k+j}^{(k)}(m−1) ) / (2σ_{k+j}²) }
        × exp{ − Σ_{j=1}^{J_0} ( φ_{k+j}^{(k)} − φ̂_{k+j}^{(k)}(m) )′ W_{k+j}(m) ( φ_{k+j}^{(k)} − φ̂_{k+j}^{(k)}(m) ) / (2σ_{k+j}²) } dφ^{(k)} dσ.

Handling these integrals, we find, for 1 ≤ J_0 ≤ S − 1:

    f(y_t / y_1^{t−1}, k > 0) = (C/π^{S/2}) 2^{−[S(m−k)+J_0]/2} Γ((m−k+1)/2)^{J_0} Γ((m−k)/2)^{S−J_0}
        × ∏_{j=1+J_0}^{S} |W_{k+j}(m−1)|^{−1/2} ∏_{j=1}^{J_0} |W_{k+j}(m)|^{−1/2}
        × ∏_{j=1+J_0}^{S} [ y_{k+j}(m−1, S)′ V_{k+j}(m−1) y_{k+j}(m−1, S) ]^{−(m−k)/2}
        × ∏_{j=1}^{J_0} [ y_{k+j}(m, S)′ V_{k+j}(m) y_{k+j}(m, S) ]^{−(m−k+1)/2}.        (A.2)

Calculation of the constant C. The a posteriori density of the parameters (φ^{(k)}, σ) can be expressed in the form

    π(φ^{(k)}, σ / y_1^{t−1}, k > 0) = C π(φ^{(k)}, σ) f(y_{k+1}^{t−1} / y_1^{k}, φ^{(k)}, σ, k > 0).

Letting t − 1 − k = J_0 − 1 + mS and integrating, we can find:

    C^{−1} = ∫∫ π(φ^{(k)}, σ) × f(y_{k+1}^{t−1} / y_1^{k}, φ^{(k)}, σ, k > 0) dφ^{(k)} dσ

           = (2π)^{−(J_0−1+mS)/2} ∫∫ ( ∏_{j=1}^{S} σ_{k+j}^{−1} ) ( ∏_{r=1}^{J_0−1+mS} σ_{k+r}^{−1} )
             × exp{ − Σ_{r=1}^{J_0−1+mS} ( y_{k+r} + Σ_{j=1}^{k} φ_{k+r,j}^{(k)} y_{k+r−j} )² / (2σ_{k+r}²) } dφ^{(k)} dσ.

(a) For J_0 = 0 and k > 0, this integral can be written, by putting r = j + τS, j = 1, 2, …, S and τ = 0, 1, …, m − 1, in the form:

    C^{−1} = (2π)^{−(t−k−1)/2} ∫∫ ( ∏_{j=1}^{S−1} σ_{k+j}^{−(m+1)} ) σ_{k+S}^{−m}
        × exp{ − y_{k+S}(m−2, S)′ V_{k+S}(m−2) y_{k+S}(m−2, S) / (2σ_{k+S}²) }
        × exp{ − Σ_{j=1}^{S−1} y_{k+j}(m−1, S)′ V_{k+j}(m−1) y_{k+j}(m−1, S) / (2σ_{k+j}²) }
        × ∏_{j=1}^{S−1} exp{ − ( φ_{k+j}^{(k)} − φ̂_{k+j}^{(k)}(m−1) )′ W_{k+j}(m−1) ( φ_{k+j}^{(k)} − φ̂_{k+j}^{(k)}(m−1) ) / (2σ_{k+j}²) }
        × exp{ − ( φ_{k+S}^{(k)} − φ̂_{k+S}^{(k)}(m−2) )′ W_{k+S}(m−2) ( φ_{k+S}^{(k)} − φ̂_{k+S}^{(k)}(m−2) ) / (2σ_{k+S}²) } dφ^{(k)} dσ.

Integrating with respect to φ_{k+j}^{(k)} and then with respect to σ_{k+j}, j = 1, 2, …, S, we obtain, for J_0 = 0:

    C = π^{[S(m−k)−1]/2} × 2 Γ((m−k)/2)^{−(S−1)} Γ((m−k−1)/2)^{−1} |W_{k+S}(m−2)|^{1/2}
        × ∏_{j=1}^{S−1} |W_{k+j}(m−1)|^{1/2}
        × ∏_{j=1}^{S−1} [ y_{k+j}(m−1, S)′ V_{k+j}(m−1) y_{k+j}(m−1, S) ]^{(m−k)/2}
        × [ y_{k+S}(m−1, S)′ V_{k+S}(m−1) y_{k+S}(m−1, S) ]^{(m−k−1)/2}.

Replacing the constant C by its value in (A.1), we obtain the predictive density given by (3.3).
(b) For J_0 = 1 and k > 0, let r = j + τS, j = 1, 2, …, S and τ = 0, 1, …, m − 1; then this integral can be expressed in the form:

    C^{−1} = (2π)^{−(t−k−1)/2} ∏_{j=1}^{S} ∫∫ σ_{k+j}^{−(m+1)}
        × exp{ − y_{k+j}(m−1, S)′ V_{k+j}(m−1) y_{k+j}(m−1, S) / (2σ_{k+j}²) }
        × exp{ − ( φ_{k+j}^{(k)} − φ̂_{k+j}^{(k)}(m−1) )′ W_{k+j}(m−1) ( φ_{k+j}^{(k)} − φ̂_{k+j}^{(k)}(m−1) ) / (2σ_{k+j}²) } dφ_{k+j}^{(k)} dσ_{k+j}.

Hence, the constant C is given by:

    C = π^{S(m−k)/2} × 2 Γ((m−k)/2)^{−S} ∏_{j=1}^{S} |W_{k+j}(m−1)|^{1/2}
        × ∏_{j=1}^{S} [ y_{k+j}(m−1, S)′ V_{k+j}(m−1) y_{k+j}(m−1, S) ]^{(m−k)/2}.

Replacing in (A.2) the constant C by its value, we obtain the predictive density, for J_0 = 1, j = 1, 2, …, S, and k ∈ ℕ*, given by (3.4).

For 2 ≤ J_0 ≤ S − 1, we have, in the same manner, after some algebraic manipulations:

    C = 2 π^{[S(m−k)+J_0−1]/2} Γ((m−k)/2)^{−(S−J_0+1)} Γ((m−k+1)/2)^{−(J_0−1)}
        × ∏_{j=J_0}^{S} |W_{k+j}(m−1)|^{1/2} × ∏_{j=1}^{J_0−1} |W_{k+j}(m)|^{1/2}
        × ∏_{j=1}^{J_0−1} [ y_{k+j}(m, S)′ V_{k+j}(m) y_{k+j}(m, S) ]^{(m−k+1)/2}
        × ∏_{j=J_0}^{S} [ y_{k+j}(m−1, S)′ V_{k+j}(m−1) y_{k+j}(m−1, S) ]^{(m−k)/2}.

Replacing C by its value in (A.2), we obtain the predictive density given by (3.4). The obtainment of the predictive density for k = 0 and 0 ≤ J_0 ≤ S − 1 is straightforward.

Acknowledgments
The authors would like to express their most sincere thanks and grateful acknowledgment to Professor N. Balakrishnan, Editor-in-Chief of Communications in Statistics—Simulation and Computation, and to the anonymous referee for their fruitful suggestions and constructive remarks. The first author also thanks his colleague F. Hamdi for his help with the programming task.

References
Adams, G. J., Goodwin, G. C. (1995). Parameter estimation for periodic ARMA models.
Journal of Time Series Analysis 16:127–145.
Akaike, H. (1974). A new look at the statistical model identification. IEEE Transactions on
Automatic Control AC-19:716–723.
Anderson, P. L., Vecchia, A. V. (1993). Asymptotic results for periodic autoregressive
moving-average processes. Journal of Time Series Analysis 14:1–18.
Bai, Z. D., Subramanyam, K., Zhao, L. C. (1988). On determination of the order of an
autoregressive model. Journal of Multivariate Analysis 27:40–52.
Bentarzi, M. (1998). Model-building problem of periodically correlated m-variate moving
average processes. Journal of Multivariate Analysis 66:1–21.
Bentarzi, M., Hallin, M. (1994). On the invertibility of periodic moving average models.
Journal of Time Series Analysis 15:263–268.
Bentarzi, M., Aknouche, A. (2005). Calculation of the Fisher information matrix for periodic
ARMA models. Communications in Statistics Theory and Methods 34:891–903.
Boshnakov, G. N. (1996). Recursive computation of the parameters of periodic
autoregressive moving-average processes. Journal of Time Series Analysis 17:333–349.
Box, G. E. P., Jenkins, G. M. (1976). Time Series Analysis, Forecasting and Control. Rev. ed.
San Francisco: Holden-Day.
Box, G. E. P., Tiao, G. C. (1992). Bayesian Inference in Statistical Analysis. Wiley-Interscience.

Ciftcioglu, Ö., Hoogenboom, J. E., Dam, H. V. (1994). A consistent estimator for the model
order of an autoregressive process. IEEE Transactions on Signal Processing 42:1471–1476.
Djurić, P. M., Kay, S. M. (1992). Order selection of autoregressive models. IEEE
Transactions on Signal Processing 40:2829–2833.
Franses, P. H. (1996). Periodicity and Stochastic Trends in Economic Time Series. Oxford:
Oxford University Press.
Herwartz, H. (1997). Performance of periodic error correction models in forecasting
consumption data. International Journal of Forecasting 13:421–431.
Osborn, D. R. (1991). The implications of periodically varying coefficients for seasonal time
series processes. Journal of Econometrics 48:373–384.
Osborn, D. R., Smith, J. P. (1989). The performance of periodic autoregressive models
in forecasting seasonal U.K. consumption. Journal of Business and Economic Statistics
7:117–127.
Pagano, M. (1978). On periodic and multiple autoregression. Annals of Statistics 6:1310–1317.
Rissanen, J. (1978). Modeling by shortest data description. Automatica 14:465–478.
Schwarz, G. (1978). Estimating the dimension of a model. Annals of Statistics 6:461–464.
Shibata, R. (1976). Selection of the order of an autoregressive model by Akaike’s
information criterion. Biometrika 63:117–126.
Ula, T. A. (1990). Periodic covariance stationarity of multivariate periodic autoregressive
moving average processes. Water Resource Research 26:855–861.
Ula, T. A. (1993). Forecasting of multivariate periodic autoregressive moving average
processes. Journal of Time Series Analysis 14:645–657.
Ula, T. A., Smadi, A. A. (1997). Periodic stationary conditions for periodic autoregressive
moving average processes as eigenvalues problems. Water Resource Research
33:1929–1934.
Vecchia, A. V. (1985). Periodic autoregressive-moving average (PARMA) modeling with
application to water resources. Water Resource Bulletin 21:721–730.
Vecchia, A. V., Ballerini, R. (1992). Testing for periodic autocorrelations in seasonal time
series data. Biometrika 78:53–63.
