The Stationary Bootstrap

Author(s): Dimitris N. Politis and Joseph P. Romano
Source: Journal of the American Statistical Association, Vol. 89, No. 428 (Dec., 1994), pp. 1303-1313
Published by: American Statistical Association


The Stationary Bootstrap

Dimitris N. POLITIS and Joseph P. ROMANO*

This article introduces a resampling procedure called the stationary bootstrap as a means of calculating standard errors of estimators and constructing confidence regions for parameters based on weakly dependent stationary observations. Previously, a technique based on resampling blocks of consecutive observations was introduced to construct confidence intervals for a parameter of the m-dimensional joint distribution of m consecutive observations, where m is fixed. This procedure has been generalized by constructing a "blocks of blocks" resampling scheme that yields asymptotically valid procedures even for a multivariate parameter of the whole (i.e., infinite-dimensional) joint distribution of the stationary sequence of observations. These methods share the construction of resampled blocks of observations to form a pseudo-time series, so that the statistic of interest may be recalculated from the resampled data set. But in the context of applying this method to stationary data, it is natural to require the resampled pseudo-time series to be stationary (conditional on the original data) as well. Although the aforementioned procedures lack this property, the stationary procedure developed here is indeed stationary and possesses other desirable properties: it is based on resampling blocks of random length, where the length of each block has a geometric distribution. Fundamental consistency and weak convergence properties of the stationary resampling scheme are developed.

KEY WORDS: Approximate confidence limit; Time series.

1. INTRODUCTION

The bootstrap of Efron (1979) has proven to be a powerful nonparametric tool for approximating the sampling distribution and variance of complicated statistics based on iid observations. Suppose that {X_n, n in Z} is a strictly stationary, weakly dependent time series, and that μ is a parameter of the whole (i.e., infinite-dimensional) joint distribution of {X_n}; for example, μ might be the mean of the process, the spectral distribution function, or just some functional of the m-dimensional marginal distribution. Given data X_1, ..., X_N, the goal is to make inferences about μ. Typically, we are interested in constructing a confidence region for μ based on some estimator T_N = T_N(X_1, ..., X_N). To this end, we are led to considering a "root" (an approximate pivot) R_N = R_N(X_1, ..., X_N; μ), a statistic or quantity of interest that may depend on the data as well as on μ. For example, R_N might be of the form R_N = T_N − μ, or possibly a studentized version. The idea is that if the true sampling distribution of R_N were known, probability statements about R_N could be inverted to yield confidence statements about μ. The bootstrap approximates this unknown sampling distribution: conditional on the original data X_1, ..., X_N, a pseudo-time series X*_1, ..., X*_N is generated by an appropriate resampling scheme, and the root is recalculated from the resampled data; the resampling procedure is repeated to build up an approximation to the sampling distribution of R_N(X_1, ..., X_N; μ).

Recently, Künsch (1989) and Liu and Singh (1992) have independently introduced nonparametric versions of the bootstrap and jackknife that are applicable to weakly dependent stationary observations. Their technique, based on resampling whole blocks of consecutive observations, was introduced to construct asymptotically valid confidence intervals for a parameter of the m-dimensional joint distribution of m consecutive observations, where m is fixed. Their scheme was generalized by Politis and Romano (1992a, 1992b), who resample "blocks of blocks" of observations to obtain asymptotically valid procedures even for multivariate parameters of the whole (i.e., infinite-dimensional) joint distribution of the sequence.

In this article we introduce a new resampling method, called the stationary bootstrap, that is also generally applicable for stationary, weakly dependent time series. In contrast to the aforementioned block resampling techniques, the pseudo-time series generated by the stationary bootstrap is itself a stationary time series: conditional on the original data, the resampling scheme is actually stationary. As will be seen, this is achieved by resampling blocks of random length, where the length of each block has a geometric distribution.

In Section 2 the actual construction of the stationary bootstrap is presented, and comparisons are made with the block resampling method of Künsch (1989) and Liu and Singh (1992). Some theoretical properties of the method are investigated in Section 3 in the case of the mean. In Section 4 it is shown how the theory may be extended beyond the case of the mean.

* Dimitris N. Politis is Assistant Professor, Department of Statistics, Purdue University, West Lafayette, IN 47907. Joseph P. Romano is Associate Professor, Department of Statistics, Stanford University, Stanford, CA 94305.

© 1994 American Statistical Association, Journal of the American Statistical Association, December 1994, Vol. 89, No. 428, Theory and Methods. Page 1303.

2. THE STATIONARY BOOTSTRAP

To describe the resampling algorithm, let p be a fixed real number in [0, 1]. Given the data X_1, ..., X_N, let

    B_{i,b} = {X_i, X_{i+1}, ..., X_{i+b-1}}    (1)

be the block consisting of b observations starting from X_i. In the case j > N, X_j is defined to be X_{j'}, where j' = j (mod N) and X_0 = X_N. That is, the stationary bootstrap method "wraps" the data X_1, ..., X_N around in a "circle," so that X_1 "follows" X_N.

The pseudo-time series X*_1, ..., X*_N is generated as follows. Let L_1, L_2, ... be a sequence of iid random variables having the geometric distribution, so that the probability of the event {L_i = m} is (1 − p)^{m−1} p for m = 1, 2, .... Independent of the X_i and the L_i, let I_1, I_2, ... be a sequence of iid variables having the discrete uniform distribution on {1, ..., N}. The first L_1 observations in the pseudo-time series are determined by the first block B_{I_1,L_1}, namely X_{I_1}, ..., X_{I_1+L_1−1}; the next L_2 observations are the block B_{I_2,L_2}, namely X_{I_2}, ..., X_{I_2+L_2−1}; and so on. The process is stopped once N observations in the pseudo-time series have been generated (though of course a pseudo-time series of arbitrary length could be generated). Once X*_1, ..., X*_N has been generated, one can compute T_N(X*_1, ..., X*_N), or more generally R_N(X*_1, ..., X*_N; T_N), conditional on the original data X_1, ..., X_N.

An alternative, perhaps simpler, description of the algorithm is the following. Let X*_1 be picked at random from the original N observations, so that X*_1 = X_{I_1}. With probability 1 − p, let X*_2 be the "next" observation in the (wrapped) original series, namely X_{I_1+1}; with probability p, let X*_2 be picked at random from the original N observations. In general, given that X*_J is determined by the Jth observation X_j in the original series, let X*_{J+1} be equal to X_{j+1} with probability 1 − p, and let X*_{J+1} be picked at random from the original N observations with probability p. The choice of the L_i having a geometric distribution follows from this description. If the observations X_1, ..., X_N are all distinct, the resampled series X*_1, X*_2, ... is, conditional on the data, a (first-order) Markov chain; in general, the order depends on the number of identical subsequences of observations. For example, if two of the original observations are identical and the remaining are distinct, X*_1, X*_2, ... is a second-order Markov chain; if m is the largest number of identical blocks, the series is an (m + 1)-order Markov chain. In any case, the following fundamental property holds.

Proposition 1. Conditional on X_1, ..., X_N, the pseudo-time series X*_1, X*_2, ... is stationary.

Hence this resampling procedure attempts to mimic the original model by retaining the stationarity property of the original series in the resampled pseudo-time series. Variants of the stationary bootstrap based on resampling blocks of random length are possible: instead of assuming that the L_i have a geometric distribution, other distributions for the L_i can be used as well, and alternative distributions for the I_i can also be considered. The scheme as proposed, with the geometric distribution, is considered a first step, and one that achieves stationarity.

The stationary bootstrap proposed here is distinct from the method proposed by Künsch (1989) and Liu and Singh (1992). Their "moving blocks" method is described as follows. Suppose that N = kb. Resample, with replacement, k blocks from the collection B_{1,b}, ..., B_{N−b+1,b} to get resampled blocks B*_1, ..., B*_k; the first b observations in the pseudo-time series are the block B*_1, the next b observations are B*_2, and so on. In the moving blocks bootstrap, each block has a fixed number b of observations; in the stationary bootstrap, the number of observations in each block is random. Some of the similarities and differences between the stationary bootstrap and the moving blocks bootstrap should be apparent. Both methods involve resampling blocks of observations. The methods differ in how they deal with end effects: there is no data after X_N, so the moving blocks method leaves an obvious end-effect problem (though it is clear that the resampling could be modified), whereas the stationary bootstrap "wraps" the data around in a circle. More important, the moving blocks method is not stationary, whereas the resampling scheme proposed here is. A difficult and fundamental problem in applying the moving blocks method is how to choose b; the corresponding issue here is how to choose p. In a way, the issue becomes one of how much "smoothing" to apply, and the two methods may be compared on this basis.
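The sequential description of the algorithm translates directly into code. The sketch below is ours, not the paper's (the function name and the use of NumPy are our own choices): start at a uniformly chosen observation; at each step, continue around the circle with probability 1 − p, or restart at a fresh uniform index with probability p.

```python
import numpy as np

def stationary_bootstrap_sample(x, p, rng):
    """Generate one stationary-bootstrap pseudo-series of the same length as x.

    With probability p the next observation is a fresh uniform draw from the
    original series (a new block starts); with probability 1 - p it is the
    successor of the previous one, wrapping the data around a circle.
    """
    x = np.asarray(x)
    n = x.size
    idx = np.empty(n, dtype=int)
    idx[0] = rng.integers(n)                 # X*_1 picked uniformly at random
    for t in range(1, n):
        if rng.random() < p:
            idx[t] = rng.integers(n)         # start a new block
        else:
            idx[t] = (idx[t - 1] + 1) % n    # continue around the circle
    return x[idx]
```

Setting p = 1 reduces to the iid bootstrap (every draw starts a new block of length 1), while p near 0 produces long blocks with expected length 1/p.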

3. THE MEAN

A primary goal of this section is to establish the validity of the stationary bootstrap for the mean. The special case of the sample mean is considered as a first step toward justifying the general method, and is of interest in its own right. Let μ = E(X_1) and set T_N(X_1, ..., X_N) = X̄_N = N^{−1} Σ_{i=1}^N X_i. The goal is to approximate the sampling distribution of N^{1/2}(X̄_N − μ) by the (conditional) distribution of N^{1/2}(X̄*_N − X̄_N), where X̄*_N = N^{−1} Σ_{i=1}^N X*_i is the mean of a stationary bootstrap pseudo-series. Under the assumption Σ_{i=1}^∞ |cov(X_1, X_{1+i})| < ∞, which is implied by typical assumptions of weak dependence, the variance of N^{1/2}X̄_N converges to

    σ² = var(X_1) + 2 Σ_{i=1}^∞ cov(X_1, X_{1+i}).    (3)

As a first step, we calculate the mean and variance of the bootstrap distribution; these may be computed without resampling. Define the circular autocovariances

    Ĉ_N(i) = N^{−1} Σ_{j=1}^N (X_j − X̄_N)(X_{j+i} − X̄_N),    (4)

where the index j + i is interpreted modulo N.

Lemma 1. Conditional on X_1, ..., X_N,

    E(X̄*_N | X_1, ..., X_N) = X̄_N

and

    var(N^{1/2} X̄*_N | X_1, ..., X_N) = σ̂²_{N,p},    (5)

where

    σ̂²_{N,p} = Ĉ_N(0) + 2 Σ_{i=1}^{N−1} (1 − i/N)(1 − p)^i Ĉ_N(i).    (6)

By the symmetry Ĉ_N(i) = Ĉ_N(N − i) of the circular autocovariances, (6) may equivalently be written as σ̂²_{N,p} = Ĉ_N(0) + Σ_{i=1}^{N−1} b_N(i) Ĉ_N(i), where

    b_N(i) = (1 − i/N)(1 − p)^i + (i/N)(1 − p)^{N−i}.    (7)

Because E(X̄*_N | X_1, ..., X_N) = X̄_N, the stationary bootstrap distribution of N^{1/2}(X̄*_N − X̄_N) has (conditional) mean 0, so that the bootstrap approximation to the sampling distribution of N^{1/2}(X̄_N − μ), which also has mean tending to 0, is correctly centered.

Remark 1. For the moving blocks bootstrap, it is not the case that E(X̄*_N | X_1, ..., X_N) = X̄_N; hence the moving blocks bootstrap distribution is not correctly centered, and it does not account for the bias of T_N as an estimator of μ (unless T_N has zero bias). The approximation may be improved by recentering, that is, by approximating the sampling distribution of N^{1/2}(X̄_N − μ) by the (conditional) distribution of N^{1/2}[X̄*_N − E(X̄*_N | X_1, ..., X_N)] (see Lahiri 1992).

Remark 2. The moving blocks bootstrap estimate of the variance of N^{1/2}X̄_N is closely related to a lag window spectral density estimate of f(0), where f(·) is the spectral density of the original process and σ² = 2πf(0); consider the estimators of this type used by Brillinger (1981) and Rosenblatt (1984), and see also Priestley (1981) and Zurbenko (1986). Assuming that f(·) exists (which it does under summability of the covariances), estimating the variance of N^{1/2}X̄_N is equivalent, up to the factor 2π, to estimating f(0). Künsch (1989) proved that the choice b ∝ N^{1/3} is optimal to minimize the mean squared error of the moving blocks bootstrap estimate of variance, an error which then tends to zero quite slowly. One cannot, however, expect the moving blocks bootstrap to possess second-order optimality properties outside the case of the mean, at least without correcting for the bias by recentering the bootstrap distribution; this weakens the claim that the bootstrap is supposed to be a general-purpose "automatic" technique.

We now prove consistency of the stationary bootstrap estimate of variance, an asymptotic property. In fact, we consider the more general (possibly nonstationary) resampling scheme where the L_i are iid with a common (possibly nongeometric) distribution.

Theorem 1. Let X_1, X_2, ... be observations from a strictly stationary process with EX_t = μ, with covariance function R(·) satisfying Σ_r |r| |R(r)| < ∞, and with fourth cumulants satisfying

    Σ_{s,r,v} |K_4(s, r, v)| < ∞,    (8)

where K_4(s, r, v) is the fourth cumulant of the joint distribution of (X_j, X_{j+s}, X_{j+s+r}, X_{j+s+r+v}). If p_N → 0 and Np_N → ∞ as N → ∞, then σ̂²_{N,p_N} → σ² in probability, where σ² is given by (3).
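Because the bootstrap variance has the closed form (6), it can be computed exactly, with no resampling. A minimal sketch (ours; NumPy-based, names our own) using the circular autocovariances (4):

```python
import numpy as np

def stationary_bootstrap_variance(x, p):
    """sigma^2_{N,p} of eq. (6): the exact conditional variance of
    sqrt(N) * (resampled mean) under the stationary bootstrap."""
    x = np.asarray(x, dtype=float)
    n = x.size
    xc = x - x.mean()
    # circular autocovariances C_N(i), i = 0, ..., N-1 (indices mod N)
    cov = np.array([np.dot(xc, np.roll(xc, -i)) / n for i in range(n)])
    i = np.arange(1, n)
    weights = (1.0 - i / n) * (1.0 - p) ** i
    return cov[0] + 2.0 * np.dot(weights, cov[1:])
```

Two sanity checks follow from the formula: at p = 1 the weights vanish and the estimate reduces to the plain sample variance (the iid-bootstrap answer), while as p → 0 it tends to 0, since a resample then degenerates to a rotated copy of the data whose mean is exactly X̄_N.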

Remark 3. We now compare the stationary bootstrap with the moving blocks bootstrap with regard to estimating the variance of N^{1/2}X̄_N. In the moving blocks scheme each resampled block has a fixed length b, whereas the stationary bootstrap samples blocks of random length with expected length 1/p. Intuitively, the two approaches should be roughly the same if the expected number of observations in each resampled block is the same for both methods, that is, if p^{−1} is approximately b. To substantiate this claim, a bias expansion can be computed:

    E(σ̂²_{N,p_N}) = σ² − 2p_N Σ_{i=1}^∞ i R(i) + O(1/(Np_N)) + o(p_N),

since b_N(i) = (1 − p_N)^i + O(1/(Np_N)); Künsch's expansion for the bias of the moving blocks bootstrap estimate of variance coincides with this if p = 1/b. Moreover, by an argument similar to Theorem 1, var(σ̂²_{N,p_N}) is of order 1/(Np_N); so if the goal remains minimizing the mean squared error of σ̂²_{N,p_N} as an estimator of σ², one should choose p = cN^{−1/3}, where the constant c depends in an intricate way on the underlying process. Fortunately, the consistency properties of the stationary bootstrap are unaffected by the choice of the constant, and getting the constant exactly right seems to enter only at a third-order level. Also, the choice of this constant appears less crucial than the choice of b in the moving blocks scheme: the stationary bootstrap estimate is less sensitive to the choice of p than the moving blocks estimate is to the choice of b.

To see why, note that, except for end effects, the moving blocks bootstrap estimate of variance is nearly equal to

    m_{N,b} = Ĉ_N(0) + 2 Σ_{i=1}^{b−1} (1 − i/b) Ĉ_N(i).    (10)

In view of (10), the moving blocks (and hence also the stationary) bootstrap variance estimates are both approximately equivalent to a lag window spectral estimate using Bartlett's kernel (see Priestley 1981 for details). But a more interesting way to view the stationary bootstrap estimate of variance is as follows: if B is a random block size having (independently) a geometric distribution with mean 1/p, then E(m_{N,B}) closely matches σ̂²_{N,p}. Hence the stationary bootstrap estimate of variance may be viewed as approximately a weighted average over the block size b of moving blocks estimates of variance, where the weights are determined by a geometric distribution; the averaging acts as a "smoothing" device.

Two Simulated Examples. To substantiate these claims, some numerical examples were considered. First, N = 200 observations X_1, ..., X_200 were generated from the model

    X_t = Z_t + Z_{t−1} + Z_{t−2} + Z_{t−3} + Z_{t−4},

where the Z_t are iid standard normal. For this model the true variance of N^{1/2}X̄_N is near σ² = Σ_k E(X_0 X_k) = 25. In Figure 1, the moving blocks and the stationary bootstrap estimates of the variance of N^{1/2}X̄_N, computed by simulation, are plotted as functions of the block size b and of 1/p. Note that the two are quite close when b is near 1/p, but that the stationary bootstrap estimate of variance is much less variable.

Figure 1. The Moving Blocks (Solid Line) and Stationary (Dotted Line) Bootstrap Estimates of Variance, Plotted Against b and 1/p.

Next, 200 observations were generated from the model

    X_t = Z_t − Z_{t−1} + Z_{t−2} − Z_{t−3} + Z_{t−4},

where again the Z_t are iid standard normal.
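The claims can also be checked numerically. The sketch below (ours; helper names our own) simulates the first model above, X_t = Z_t + ... + Z_{t−4}, and verifies by Monte Carlo that the variance of the resampled root N^{1/2}(X̄*_N − X̄_N) matches the closed-form σ̂²_{N,p} of Lemma 1.

```python
import numpy as np

def sb_indices(n, p, rng):
    """Index path of one stationary-bootstrap pseudo-series of length n."""
    idx = np.empty(n, dtype=int)
    idx[0] = rng.integers(n)
    for t in range(1, n):
        idx[t] = rng.integers(n) if rng.random() < p else (idx[t - 1] + 1) % n
    return idx

def sb_variance(x, p):
    """Closed-form conditional variance of sqrt(n) * (resampled mean)."""
    n = x.size
    xc = x - x.mean()
    cov = np.array([np.dot(xc, np.roll(xc, -i)) / n for i in range(n)])
    i = np.arange(1, n)
    return cov[0] + 2.0 * np.dot((1 - i / n) * (1 - p) ** i, cov[1:])

rng = np.random.default_rng(42)
z = rng.standard_normal(104)
x = z[4:] + z[3:-1] + z[2:-2] + z[1:-3] + z[:-4]   # X_t = Z_t + ... + Z_{t-4}
n, p, B = x.size, 0.1, 2000
roots = np.array([np.sqrt(n) * x[sb_indices(n, p, rng)].mean() for _ in range(B)])
mc_var, exact_var = roots.var(), sb_variance(x, p)  # these should nearly agree
```

Since the closed form is the exact conditional variance, the Monte Carlo estimate converges to it as B grows, for any fixed data set.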

To establish that the bootstrap distribution is close to the true sampling distribution, a mixing condition is assumed. For the sequence {X_n, n in Z}, let α_X(k) = sup_{A,B} |P(AB) − P(A)P(B)|, where A and B vary over events in the σ-fields generated by {X_n, n ≤ 0} and {X_n, n ≥ k}, respectively.

Theorem 2. Let X_1, X_2, ... be a strictly stationary process, and assume for some ε > 0 that E|X_1|^{6+ε} < ∞ and Σ_k k²[α_X(k)]^{ε/(6+ε)} < ∞. (These conditions imply condition (8) of Theorem 1; a single mixing and moment condition is assumed here for convenience, and it also implies Σ_r |r||R(r)| < ∞.) Then N^{1/2}(X̄_N − μ) tends in distribution to the normal distribution with mean 0 and variance σ² given in (3), so that

    sup_x |P{N^{1/2}(X̄_N − μ) ≤ x} − Φ(x/σ)| → 0,    (11)

where Φ(·) is the standard normal distribution function. Moreover, if p_N → 0 and Np_N → ∞ as N → ∞, then, conditional on X_1, ..., X_N,

    sup_x |P{N^{1/2}(X̄*_N − X̄_N) ≤ x | X_1, ..., X_N} − P{N^{1/2}(X̄_N − μ) ≤ x}| → 0    (12)

in probability.

The immediate application of Theorem 2 lies in the construction of confidence intervals for μ. For example, let q_N(1 − α) be a 1 − α quantile of the (conditional) distribution of N^{1/2}(X̄*_N − X̄_N) given X_1, ..., X_N. (Due to possible discreteness and uniqueness problems, the 1 − α quantile of an arbitrary distribution G should be defined to be inf{q: G(q) ≥ 1 − α}.) Then the theorem implies that P{N^{1/2}(X̄_N − μ) ≤ q_N(1 − α)} → 1 − α, so that the interval

    [X̄_N − q_N(1 − α/2)/N^{1/2}, X̄_N − q_N(α/2)/N^{1/2}]

has asymptotic coverage 1 − α. Other bootstrap confidence intervals, such as the simple percentile method, may similarly be shown to be valid in the sense of having the correct asymptotic coverage. In practice, the quantiles of the bootstrap distribution are approximated by Monte Carlo simulation of the resampling procedure.

In summary, it is clear that as long as p satisfies p_N → 0 and Np_N → ∞, the stationary bootstrap distribution is close to the true sampling distribution, and the choice of p will not enter into first-order asymptotic properties such as coverage; getting the constants right enters into second-order properties. Such considerations, though of vital importance, are beyond the scope of the present work. A step toward understanding second-order properties was presented by Lahiri (1992) in the case of the moving blocks bootstrap. Subsequent work will focus on a proper choice of p. Indeed, Theorem 2 can, with some additional effort, be generalized to consider a data-based choice for p: one could choose p̂_N from the data, say consistently, provided p̂_N satisfies the conditions of the theorem in probability.

In Figure 2, the moving blocks and stationary bootstrap estimates of the variance of N^{1/2}X̄_N are again plotted as functions of the block size b and of 1/p, for the second model. In this second model, the true variance of the sample mean is near 1, and the autocovariances alternate in sign until they become 0 for lags k greater than 4; the (standardized) stationary bootstrap estimate is nearer to 1 for a wide range of p values. Both Figure 1 and Figure 2 confirm our previous claim that the stationary bootstrap estimate of variance may be viewed approximately as a weighted average over b of moving blocks bootstrap estimates of variance, and it is generally observed that the stationary bootstrap estimate of variance is much less sensitive to the choice of p than the moving blocks estimate is to the choice of b. This behavior has been observed in other examples as well.

Figure 2. The Moving Blocks (Solid Line) and Stationary (Dotted Line) Bootstrap Estimates of Variance for the Second Model.

4. EXTENSIONS

In this section we extend the results of Section 3 to more general parameters of interest. A basic theme is that results about the sample mean readily imply results for much more general statistics.

4.1 Multivariate Mean

Suppose that the X_i take values in R^d, with jth component X_{i,j}, and that interest focuses on the mean vector μ = E(X_1). The definition of α_X(·) readily applies to the multivariate case, and the stationary bootstrap resampling algorithm is exactly the same, yielding a pseudo-multivariate time series X*_1, ..., X*_N with mean vector X̄*_N. Under moment and mixing conditions analogous to those of Theorem 2, N^{1/2}(X̄_N − μ) tends in distribution to the multivariate Gaussian distribution with mean 0 and covariance matrix Σ whose (i, j) entry is Σ_{k=−∞}^∞ cov(X_{1,i}, X_{1+k,j}), and, if p_N → 0 and Np_N → ∞,

    sup_s |P*{N^{1/2}||X̄*_N − X̄_N|| ≤ s} − P{N^{1/2}||X̄_N − μ|| ≤ s}| → 0    (13)

in probability, where ||·|| is any norm on R^d and P* refers to probability conditional on the original series. The immediate application of (13) lies in the construction of joint confidence regions for μ; various choices for the norm yield different-shaped regions. Notice how easily the bootstrap handles the problem of constructing simultaneous confidence regions. An asymptotic approach would involve finding the distribution of the norm of a multivariate Gaussian random variable having a complicated (unknown) covariance structure; the resampling approach avoids such a calculation and handles all norms with equal facility.
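The equal-tailed interval based on the quantiles q_N can be computed by Monte Carlo. A sketch (ours; function name and defaults are our own choices), inverting the quantiles of the resampled root as in the application of Theorem 2:

```python
import numpy as np

def sb_confidence_interval(x, p, alpha=0.05, B=1000, rng=None):
    """Equal-tailed stationary-bootstrap confidence interval for the mean,
    inverting the quantiles q_N of the root sqrt(N) * (mean* - mean)."""
    if rng is None:
        rng = np.random.default_rng()
    x = np.asarray(x, dtype=float)
    n = x.size
    roots = np.empty(B)
    for b in range(B):
        idx = np.empty(n, dtype=int)
        idx[0] = rng.integers(n)
        for t in range(1, n):
            idx[t] = rng.integers(n) if rng.random() < p else (idx[t - 1] + 1) % n
        roots[b] = np.sqrt(n) * (x[idx].mean() - x.mean())
    q_lo, q_hi = np.quantile(roots, [alpha / 2, 1 - alpha / 2])
    # interval [mean - q_{1-a/2}/sqrt(n), mean - q_{a/2}/sqrt(n)]
    return x.mean() - q_hi / np.sqrt(n), x.mean() - q_lo / np.sqrt(n)
```

The subtraction of the upper root quantile to form the lower endpoint (and vice versa) is the usual inversion of probability statements about the root into confidence statements about μ.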

4.2 Smooth Function of Means

For concreteness, suppose that the X_i are real-valued and that θ = (θ_1, ..., θ_p), where θ_j = E[h_j(X_1)] for real-valued functions h_1, ..., h_p. Interest now focuses on the parameter μ = f(θ), where f = (f_1, ..., f_q) is an appropriately smooth function from R^p to R^q. Define Y_i to be the vector in R^p with jth component h_j(X_i); then the Y_i are weakly dependent if the original X_i are weakly dependent (indeed, α_Y(k) ≤ α_X(k)), and we are back in the multivariate mean case. Let θ̂_N = (θ̂_{N,1}, ..., θ̂_{N,p}), where θ̂_{N,j} = Σ_{i=1}^N h_j(X_i)/N, and estimate μ = f(θ) by f(θ̂_N). The stationary bootstrap resampling algorithm is exactly the same, yielding θ̂*_N from the pseudo-series. (The h_j may also be allowed to depend on finitely many consecutive observations, say W_i = (X_i, ..., X_{i+q'}); the only caveat is that q' must remain fixed as N → ∞.)

Theorem 3. Assume that, for some ε > 0, E|h_j(X_1)|^{6+ε} < ∞ for each j, and that α_X(k) = O(k^{−r}) for some r > 3(6 + ε)/ε. Assume that f = (f_1, ..., f_q), where each f_i(y_1, ..., y_p) is a real-valued function from R^p having a nonzero differential at (y_1, ..., y_p) = (θ_1, ..., θ_p), and let D be the p × q matrix with (i, j) entry ∂f_j(y_1, ..., y_p)/∂y_i evaluated at (θ_1, ..., θ_p). Then

    N^{1/2}(θ̂_N − θ) → Z    (14)

in distribution, where Z is multivariate Gaussian with mean 0 and covariance matrix having (i, j) entry

    cov(Z_i, Z_j) = cov[h_i(X_1), h_j(X_1)] + Σ_{k=1}^∞ cov[h_i(X_1), h_j(X_{1+k})] + Σ_{k=1}^∞ cov[h_i(X_{1+k}), h_j(X_1)],    (15)

so that N^{1/2}[f(θ̂_N) − f(θ)] tends in distribution to D′Z. Moreover, if p_N → 0 and Np_N → ∞, then

    sup_x |P*{N^{1/2}||f(θ̂*_N) − f(θ̂_N)|| ≤ x} − P{N^{1/2}||f(θ̂_N) − f(θ)|| ≤ x}| → 0

in probability, and

    d(P*{N^{1/2}[f(θ̂*_N) − f(θ̂_N)] ∈ ·}, P{N^{1/2}[f(θ̂_N) − f(θ)] ∈ ·}) → 0

in probability, where d is any metric metrizing weak convergence in R^q.

As an immediate application, one can construct joint confidence regions for a finite set of autocovariances (R(1), ..., R(q)), where R(i) = cov(X_1, X_{1+i}), or for smooth functions thereof.

4.3 Differentiable Functionals

For simplicity, suppose that the X_i are real-valued with common continuous distribution function F, and that interest focuses on a parameter μ = T(F), where T is a functional of F. A sensible estimate of μ = T(F) is T(F_N), where F_N is the empirical distribution of the original series X_1, ..., X_N. Assume that T is Fréchet differentiable, that is,

    T(G) = T(F) + ∫ h_F d(G − F) + o(||G − F||)    (16)

as ||G − F|| → 0, for some (influence) function h_F centered so that ∫ h_F dF = 0; here ||·|| is the supremum norm. If, for some d > 0, E|h_F(X_1)|^{2+d} < ∞ and Σ_k [α_X(k)]^{d/(2+d)} < ∞, then N^{−1/2} Σ_{i=1}^N h_F(X_i) is asymptotically normal with mean 0 and variance

    E[h_F²(X_1)] + 2 Σ_{k=1}^∞ cov[h_F(X_1), h_F(X_{1+k})].    (17)

To handle the remainder term in (16), an empirical process central limit theorem is needed. Deo (1973) has shown that if Σ_k k²[α_X(k)]^{1/2−τ} < ∞ for some 0 < τ < 1/2, then N^{1/2}[F_N(·) − F(·)], regarded as a random element of the space of cadlag functions endowed with the supremum norm, converges weakly to Z(·), a Gaussian process with mean 0, continuous sample paths, and covariance function

    cov[Z(s), Z(t)] = E[g_s(X_1)g_t(X_1)] + Σ_{k=1}^∞ E[g_s(X_1)g_t(X_{1+k})] + Σ_{k=1}^∞ E[g_s(X_{1+k})g_t(X_1)],

where g_t(x) = I{x ≤ t} − F(t). Hence Deo's result implies that N^{1/2}[T(F_N) − T(F)] = N^{−1/2} Σ_{i=1}^N h_F(X_i) + o(N^{1/2}||F_N − F||) is asymptotically normal with mean 0 and variance given by (17).
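To illustrate the setup of Section 4.2 concretely, take the variance written as a smooth function of means: θ = (E[X], E[X²]) with h_1(x) = x, h_2(x) = x², and f(θ_1, θ_2) = θ_2 − θ_1². The sketch below (ours; the statistic and all names are our own choices) resamples the series and recomputes the statistic, exactly as the theory prescribes.

```python
import numpy as np

def sb_resample(x, p, rng):
    """One stationary-bootstrap pseudo-series (circle + geometric blocks)."""
    n = x.size
    idx = np.empty(n, dtype=int)
    idx[0] = rng.integers(n)
    for t in range(1, n):
        idx[t] = rng.integers(n) if rng.random() < p else (idx[t - 1] + 1) % n
    return x[idx]

def variance_of_means(x):
    """f(theta_1, theta_2) = theta_2 - theta_1^2, with theta_1 the mean of
    h_1(X) = X and theta_2 the mean of h_2(X) = X^2."""
    return (x * x).mean() - x.mean() ** 2

def sb_roots(x, stat, p, B, rng):
    """Resampled roots sqrt(N) * (stat(X*) - stat(X)); their quantiles give
    confidence limits for f(theta) exactly as in the mean case."""
    n = x.size
    t0 = stat(x)
    return np.array([np.sqrt(n) * (stat(sb_resample(x, p, rng)) - t0)
                     for _ in range(B)])
```

Any other smooth function of means, such as a fixed set of autocovariances or autocorrelations, slots into `stat` the same way.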

Otherexamplesfalling thisframework whereasympmeasureand cross-spectrum. and so notnormal. Let = kM( Si.L is an estimateof.L R OM from dM to R D.T(F)] is thedistribution. takingthisviewand consistency weak convergence point. F(*)] will appropriatelyconverge to those of will based on simulations be addressed.L. The onlytechnicaldifficulty showing In fact..24OXt-4 maybe appliedto yieldvalidinferences bootstrap stationary .1..04Zt-2. Suppose that paths. for validity quantilefunctionals asymptotic 4.. thewell-known Canadian lynxdata are disin the played. pracin vergence the assumedfunction the just distributionsof ticalimplementation.M. the error resampling confidence Section 4.Politisand Romano: The Stationary Bootstrap 1309 norm.t = of finite-dimensional distribution theprocess. can lett'skernelestimateoff(w). constructing butthisis theoretically of negligibility essentially applicable. The bootstrap approximationto the distributionof 4. . towiththeartificial an argument the bootstrapempiricalprocess. NUMERICALEXAMPLES FUR-THER AssumethatXi E Rd.. seriesfollows ARMA model the .M.T(FN)].* X(i-)L+M) is of variables (and c = 0). Suppose thatTi.a * 0 is terms to showp(N12[F error N(*)]. problems.mean 0. .t](x). whereY.F(t). important workwillfocuson three Subsequent .M.just regard T1.representing numberof lynxtrappings the in theyears1821 to 1934. In thissectionwe discusshow the Xt. can similar Deo's. A realization theYtseries exhibited the {Xi } by a "winThese subseries be obtainedfrom can M dow" ofwidth "moving"at lag L. greatadvantageofthe individualperiodogram approach is that it easily yields simultaneous resampling + I E[gs(X1+k)9t(X1)b oversome finite density for regions the spectral confidence k= 1 in gridof w values. by ofX1. 1992b).T(F)] is asymptotically larlyintractable.*). measureovera finite bands forthespectral because Theorem 2 is will behave correctly. To apprewith supremum the endowed spaceofcadlagfunctions the consider problem of to weakly Z( *). 
Here F*_N denotes the empirical distribution of a pseudo-sample X*_1, ..., X*_N obtained by the stationary bootstrap, conditional on the original data X_1, ..., X_N. In this way, confidence bands for the spectral measure can also be obtained.
4.4 Linear Statistics Defined on Subseries

Assume that X_i ∈ R^d. Consider the subseries S_{i,M,L} = (X_{(i-1)L+1}, ..., X_{(i-1)L+M}); these subseries can be obtained from {X_i} by a "window" of width M "moving" at lag L. For some function k_M from R^{dM} to R^D, let T_{i,M,L} = k_M(S_{i,M,L}), and set

T_N = Q^{-1} Σ_{i=1}^{Q} T_{i,M,L},   Q = [(N - M)/L] + 1,

where [·] is the greatest-integer function; M, L, and hence Q may depend on N. Here T_N is an estimate of a parameter μ ∈ R^D that may depend on the whole (i.e., infinite-dimensional) joint distribution of the stationary sequence. To apply resampling, just regard T_{1,M,L}, T_{2,M,L}, ... as a time series in its own right; weak dependence properties of the original series readily translate back into weak dependence properties of this new series, so that the stationary bootstrap may be applied to yield the approximate distribution of T_N, which is asymptotically normal with variance given by (17). The only technical complication is that we are dealing with a triangular array of variables, so that Theorem 2 must be generalized; the technical details will appear elsewhere.

For example, suppose that T_{i,M,L}(w) is the periodogram evaluated at w based on the data S_{i,M,L}. Then T_N(w) is approximately equal to Bartlett's kernel estimate of the spectral density f(w), and other kernel estimators can be obtained (approximately) by appropriate tapering of the individual periodogram estimates. A great advantage of this approach is that it easily yields simultaneous confidence regions for the spectral density over some finite grid of w values.

4.5 Future Work

Subsequent work will focus on three problems. First, theoretical results will be established justifying the bootstrap approximations for further classes of statistics. Second, higher-order properties of the stationary bootstrap procedure, and especially the choice of p in practical implementation, will be investigated. Third, comparisons of the procedures, based on simulations, will be addressed.

5. NUMERICAL EXAMPLES

In Figure 3, the well-known Canadian lynx data are displayed, representing the number of lynx trappings in the Mackenzie River district in the years 1821 to 1934; Léger, Politis, and Romano (1992) analyzed the Canadian lynx data. A histogram of the data reveals that it is skewed, and so not normal. In addition, an artificial series Y_t of length 200 was considered; Y_t is non-Gaussian and was generated from a mean-zero ARMA model, with the Z_t's being independent N(0, 1) random variables. A realization of the Y_t series is exhibited in Figure 4.

[Figure 3. The Time Series of Lynx Trappings (Annual Number of Lynx Trappings, 1821 to 1934).]
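The periodogram-averaging statistic T_N(w) described above can be sketched as follows for non-overlapping blocks (L = M), i.e. Bartlett's classical method of averaged periodograms; the function name and interface are our own illustration, not from the paper.

```python
import numpy as np

def bartlett_spectral_estimate(x, M, w):
    """Average over non-overlapping subseries of width M of the
    periodogram evaluated at frequency w: the statistic T_N(w) of
    Section 4.4 with L = M, approximately Bartlett's kernel estimate
    of the spectral density f(w)."""
    N = len(x)
    Q = N // M                      # number of complete subseries
    vals = []
    for i in range(Q):
        seg = x[i * M:(i + 1) * M]
        d = sum(seg[t] * np.exp(-1j * w * t) for t in range(M))
        vals.append(abs(d) ** 2 / (2 * np.pi * M))   # block periodogram
    return sum(vals) / Q
```

Because each block periodogram is a statistic of the subseries S_{i,M,L}, the stationary bootstrap can be applied to the sequence of block periodograms to attach standard errors to the estimate.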

Léger, Politis, and Romano (1992) constructed confidence intervals for the mean of the lynx series using the moving blocks technique; they also discussed the choice of the block size b. By the approximate correspondence of the moving blocks method with block size b and the stationary bootstrap with p = 1/b, their choice b = 25 suggests a comparable choice of p. Based on 500 stationary bootstrap replications with p = .05, the stationary bootstrap 95% confidence interval for the mean μ of the lynx data was remarkably close to the moving blocks 95% confidence interval, which was again based on 500 replications with b = 25.

But we might also consider the median m of the lynx data as the parameter of interest; because the data are skewed, with heavy tails, it is expected that the median and the trimmed means would be more efficient than the sample mean for estimation of a location parameter. The stationary bootstrap "hybrid" 95% confidence interval (i.e., the interval based on the approximation of P{√n(T_n - T) ≤ x} by P*{√n(T*_n - T_n) ≤ x}) was accordingly computed for the median m of the lynx data. The obvious point estimator is the sample median.

Turning to the artificial Y series, a histogram reveals that this data is also not normal, due to heavy tails: the marginal distribution of Y_t is a two-sided χ² distribution with 1 degree of freedom, centered and symmetric around a constant c (which was set to 0 in the simulation). The two-sided χ² distribution with 1 degree of freedom can be thought of as being "close" to the double exponential distribution (as close as the χ² distribution with 1 degree of freedom is to the χ² distribution with 2 degrees of freedom). The sample autocovariance sequence of the Y series is pictured in Figure 6; it is seen that the autocovariances at lags greater than 6 are not significantly different from 0. Because the distribution of Y_t is symmetric around c, six different estimators of c were considered: the sample mean, the sample median, and the α-trimmed means (i.e., the mean of the observations remaining after throwing away the [nα] largest and the [nα] smallest ones). To compare the procedures, first look at the sample mean case, for which a simple expression of the variance exists:

var( (1/200) Σ_{i=1}^{200} Y_i ) = (1/200) ( var(Y_1) + 2 Σ_{i=1}^{199} (1 - i/200) cov(Y_1, Y_{1+i}) ).

The stationary bootstrap estimates of the variance of the sample mean for different choices of p ∈ (0, .8) are pictured in Figure 5. Having decided on a choice of p, Table 1 reports, for each of the six estimators of c, the stationary bootstrap estimate of the variance of the corresponding estimator, based on 1,000 replications with p = .05, together with the associated 95% confidence intervals.

[Table 1. Trimmed Mean Confidence Intervals. Figure 4. The Artificial Y Series. Figure 6. Sample Autocovariance of the Y Series.]
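The exact variance expression for the sample mean displayed above (with n = 200) can be evaluated directly from the autocovariance sequence; a minimal sketch:

```python
def mean_variance(acov):
    """Exact variance of the sample mean of a stationary series of
    length n = len(acov), given acov[i] = cov(Y_1, Y_{1+i}):
      var(Ybar) = (1/n) * ( acov[0] + 2 * sum_{i=1}^{n-1} (1 - i/n) * acov[i] )."""
    n = len(acov)
    return (acov[0] + 2 * sum((1 - i / n) * acov[i] for i in range(1, n))) / n
```

For an iid series (all autocovariances beyond lag 0 equal to 0) this reduces to the familiar var(Y_1)/n, which is the benchmark against which the bootstrap variance estimates in Figure 5 can be compared.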
The sample median of the lynx data was equal to 771. But we need to attach a standard error or a confidence interval to this estimate, and the stationary bootstrap provides one directly.

For the Y series example, p was chosen by analogy to the iid case and by the approximate correspondence with the moving blocks method of order 10 (see Léger, Politis, and Romano 1992), so that the choice p = .1 is suggested. In Figures 7 through 10, histograms of the stationary bootstrap distributions of the sample mean, the .1-trimmed mean, the .3-trimmed mean, and the sample median of the Y series, based on 1,000 bootstrap replications, are pictured, together with the corresponding 95% bootstrap confidence intervals; the constant c was taken equal to 0 in this simulation. It is obvious from the histograms that the bootstrap distribution of the sample median is clearly the least disperse, and the 95% confidence interval based on the sample median (as estimated by the stationary bootstrap) is the shortest. The intuition from the asymptotic theory, according to which the median is the most efficient of the estimators considered and should be preferred, thus seems to be correct. The stationary bootstrap can therefore serve as a viable method of choosing among competing estimators.
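The "hybrid" interval mentioned above approximates the law of √n(T_n - T) by the conditional law of √n(T*_n - T_n); given an array of bootstrap replicates of the statistic, the endpoints follow by flipping quantiles (the √n factors cancel). A minimal sketch, with a function name and interface of our own (not from the paper):

```python
import numpy as np

def hybrid_ci(t_hat, t_star, level=0.95):
    """Hybrid bootstrap interval: the quantiles of T* - T_n stand in for
    those of T_n - T, giving [2*t_hat - q_hi, 2*t_hat - q_lo], where
    q_lo and q_hi are quantiles of the bootstrap replicates t_star."""
    alpha = 1 - level
    q_lo, q_hi = np.quantile(t_star, [alpha / 2, 1 - alpha / 2])
    return 2 * t_hat - q_hi, 2 * t_hat - q_lo
```

The same recipe applies to any of the six estimators of c considered above; only the statistic recomputed on each pseudo-series changes.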

To get (6), note that

σ²_{N,p} = C_N(0) + 2 Σ_{i=1}^{N-1} (1 - i/N)(1 - p)^i C_N(i);

letting j = N - i in the last sum yields the result.

[Figure 7. Bootstrap Distribution of the Y Series Sample Mean. Figure 8. Bootstrap Distribution of the Y Series .1 Trimmed Mean. Figure 9. Bootstrap Distribution of the Y Series .3 Trimmed Mean. Figure 10. Bootstrap Distribution of the Y Series Sample Median.]

APPENDIX: PROOFS

Proof of Lemma 1

For purposes of the proof, we may assume E(X_i) = 0; recall that all expectations and covariances here are conditional on the data X_1, ..., X_N. Recall also that, in the construction of the stationary bootstrap, the block length L_1 has a geometric distribution with mean 1/p. Conditioning on whether the first block extends past position 1 + i,

E(X*_1 X*_{1+i}) = E(X*_1 X*_{1+i} | L_1 > i)P(L_1 > i) + E(X*_1 X*_{1+i} | L_1 ≤ i)P(L_1 ≤ i)
              = R_{N,0}(i)(1 - p)^i + X̄²_N [1 - (1 - p)^i],

where R_{N,0}(i) = Σ_j X_j X_{(j+i) mod N} / N; this follows from the "memoryless" property of the geometric distribution. Hence

cov(X*_1, X*_{1+i}) = C_N(i)(1 - p)^i,

where C_N(0) = R_{N,0}(0) and, for 1 ≤ i ≤ N - 1, C_N(i) = R_N(i) + R_N(N - i). The expression for σ²_{N,p} displayed above now follows by summing the covariances, proving (5).

Proof of Theorem 1

By (5), we may write S_N ≡ σ²_{N,p_N} = R_{N,0}(0) + 2 Σ_{i=1}^{N-1} b_N(i) R_{N,0}(i), where the weights b_N(i) collect the geometric factors and satisfy |b_N(i)| ≤ 2. It suffices to show that S_N → σ² in probability; to accomplish this, we show that the bias and the variance of S_N tend to 0, using (7) and the fact that E[R_{N,0}(i)] = ((N - i)/N)R(i).
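The circular statistic R_{N,0}(i) in the proof above can be computed directly; note that the change of index j → (j + i) mod N shows it is symmetric, R_{N,0}(i) = R_{N,0}(N - i), which is why the cyclic covariance C_N(i) combines the two ordinary lag terms. A minimal sketch (our own illustration):

```python
def circular_autocov(x, i):
    """R_{N,0}(i) = (1/N) * sum_j x[j] * x[(j + i) mod N], the circular
    autocovariance from the proof of Lemma 1 (mean assumed 0)."""
    N = len(x)
    return sum(x[j] * x[(j + i) % N] for j in range(N)) / N
```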

For the bias, E(S_N) = R(0) + 2 Σ_{i=1}^{N-1} b_N(i)((N - i)/N)R(i), and hence E(S_N) = σ² + O(p_N) → σ². For the variance, expanding var(S_N) = Σ_i Σ_j b_N(i) b_N(j) cov[R_{N,0}(i), R_{N,0}(j)] and noting that, under the assumptions, |cov[R_{N,0}(i), R_{N,0}(j)]| ≤ S/N with S = 2 Σ_{m=-∞}^{∞} |R(m)| + K, it follows that var(S_N) → 0. This completes the proof.

Proof of Theorem 2

Without loss of generality, assume μ = 0. Recall that, in the construction of the stationary bootstrap, L_1, L_2, ... are iid geometric with mean 1/p_N, and the starting indices I_1, I_2, ... are iid uniform on {1, ..., N}. Let M be the smallest integer m such that L_1 + ... + L_m ≥ N, so that the bootstrap pseudo-series consists of the first N observations from the M blocks B_{I_1,L_1}, ..., B_{I_M,L_M}. If S_{i,b} denotes the sum of the observations in the block B_{i,b}, set E_{N,M} = (S_{I_1,L_1} + ... + S_{I_M,L_M})/N; all calculations referring to this bootstrap distribution are assumed conditional on X_1, ..., X_N.

Claim. The distribution of N^{1/2}(E_{N,M} - X̄_N), conditional on X_1, ..., X_N, tends to the normal distribution with mean 0 and variance σ². The proof of this claim will be given in five steps. Three convergences are used, each of which holds in probability:

(C1) N X̄²_N / (Np_N) → 0;
(C2) p_N var(S_{I,R}) = C_N(0) + 2 Σ_{i=1}^{N-1} (1 - p_N)^i C_N(i) + o(1) → σ², where S_{I,R} is the sum over a single block with uniform starting index I and geometric length R;
(C3) a Liapounov-type moment condition, requiring N^{δ/2} p_N E|S_{I,R} - R X̄_N|^{2+δ}, suitably normalized, to tend to 0.

Step 1. Show that, for any fixed sequence m_N satisfying m_N/(Np_N) → 1, the distribution of N^{1/2}(E_{N,m_N} - E(E_{N,m_N})) tends to the normal distribution with mean 0 and variance σ². Because E_{N,m_N} is (up to normalization) an average of iid variables, it suffices to verify the convergence of the mean and the variance together with the Liapounov condition; this is where (C1), (C2), and (C3) enter, and the Berry-Esseen bound of Katz (1963) handles the triangular array of variables. To compute the variance, condition on R:

var(S_{I,R}) = E[var(S_{I,R} | R)] + var[E(S_{I,R} | R)],

where E(S_{I,R} | R) = R X̄_N, so that var[E(S_{I,R} | R)] = var(R X̄_N) = X̄²_N(1 - p_N)/p²_N; the corresponding contribution is negligible by (C1), since N X̄²_N/(Np_N) → 0, while p_N E[var(S_{I,R} | R)] → σ² by (C2).

Step 2. Show that the same limit holds when the fixed sequence m_N is replaced by the random number of blocks M: if M is any random variable (sequence) satisfying M/(Np_N) → 1 in probability, then the normal limit with mean 0 and variance σ² still obtains. This essentially follows by an extension, to the triangular array setting, of a classical random-index central limit theorem of Chung (1974); it is also an immediate corollary of results in Hall and Heyde (1980).

Step 3. Show that M/(Np_N) → 1 in probability; since the L_i are iid geometric with mean 1/p_N, this follows from the law of large numbers.

Step 4. Show that N^{1/2}(E_{N,M} - X̄*_N) tends to 0 in probability, where X̄*_N is the bootstrap sample mean. Letting J = L_1 + ... + L_M - N, the difference E_{N,M} - X̄*_N is just N^{-1} times the sum of the J observations of the block B_{I_M,L_M} deleted after the first N observations have been sampled; by the memoryless property of the geometric distribution, this term tends to 0 in (conditional) probability, because N^{1/2}X̄_N is of order 1 in probability and Np_N → ∞.

Step 5. Combine steps 1 through 4 to prove the claim; together with step 4, the claim yields that N^{1/2}(X̄*_N - X̄_N) tends weakly to the normal distribution with mean 0 and variance σ², conditional on X_1, X_2, .... Thus the result is proved.
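The bookkeeping behind step 3 rests on the block-length law P(L = k) = p(1 - p)^{k-1}, k ≥ 1, whose mean 1/p is why roughly Np_N blocks are needed to cover a series of length N. As a small numerical check (our own illustration), the truncated series for E[L] recovers 1/p:

```python
def geometric_block_length_mean(p, kmax=100000):
    """E[L] for the block-length law P(L = k) = p * (1 - p)**(k - 1),
    k >= 1, truncated at kmax; the exact value is 1/p, so about N*p
    blocks cover a series of length N."""
    return sum(k * p * (1 - p) ** (k - 1) for k in range(1, kmax + 1))
```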

It remains to bound E|S_{i,r} - r X̄_N|^{2+δ}, where 1 ≤ r ≤ N. Note that if 1 ≤ i ≤ i + r - 1 ≤ N, then Yokoyama's (1980) moment inequality applies, yielding E|S_{i,r}|^{2+δ} ≤ K r^{1+δ/2}, where the constant K depends only on the mixing sequence {α(k)}. In the general case, where i ≤ N but i + r - 1 > N, write S_{i,r} = (X_i + ... + X_N) + (X_1 + ... + X_{i+r-1-N}) and apply Minkowski's inequality and Yokoyama's inequality to each piece to get E|S_{i,r}|^{2+δ} ≤ 2^{2+δ} K r^{1+δ/2}. Then, by Minkowski's inequality again,

(E|S_{i,r} - r X̄_N|^{2+δ})^{1/(2+δ)} ≤ (E|S_{i,r}|^{2+δ})^{1/(2+δ)} + (E|r X̄_N|^{2+δ})^{1/(2+δ)},

so that E|S_{i,r} - r X̄_N|^{2+δ} ≤ (3K)^{2+δ} r^{1+δ/2}. Hence the moment quantity in question, a geometric mixture of these bounds with weights p_N(1 - p_N)^{r-1} scaled by N^{-δ/2}, tends to 0, since Np_N → ∞; the remaining term is of order X̄_N N^{1/2}[Np_N]^{-(1+δ)/(2+δ)}, which tends to 0 in probability as well. This completes the argument.

Proof of Theorem 3

The proof follows immediately by considering linear combinations of the components and applying Theorem 2, which is applicable by Remark 4.1. Then (13) follows by the continuous mapping theorem, because a norm is almost everywhere continuous with respect to a Gaussian measure; see Serfling (1980, p. 122).

Proof of Theorem 4

The smoothness assumptions on f imply that N^{1/2}[f̂_N(·) - f(·)], evaluated over the grid of frequencies, has a limiting multivariate Gaussian distribution with mean 0 and covariance matrix DΣD'; (14) and (15) are then immediate from the proof of Theorem 2.

[Received April 1992. Revised April 1993.]

REFERENCES

Bartlett, M. S. (1946), "On the Theoretical Specification and Sampling Properties of Autocorrelated Time-Series," Journal of the Royal Statistical Society, Supplement, 8, 27-41.
Brillinger, D. R. (1981), Time Series: Data Analysis and Theory, San Francisco: Holden-Day.
Chung, K. L. (1974), A Course in Probability Theory (2nd ed.), New York: Academic Press.
Deo, C. M. (1973), "A Note on Empirical Processes of Strong-Mixing Sequences," The Annals of Probability, 1, 870-875.
Efron, B. (1979), "Bootstrap Methods: Another Look at the Jackknife," The Annals of Statistics, 7, 1-26.
Hall, P., and Heyde, C. C. (1980), Martingale Limit Theory and Its Application, New York: Academic Press.
Katz, M. L. (1963), "Note on the Berry-Esseen Theorem," The Annals of Mathematical Statistics, 34, 1107-1108.
Künsch, H. R. (1989), "The Jackknife and the Bootstrap for General Stationary Observations," The Annals of Statistics, 17, 1217-1241.
Lahiri, S. N. (1992), "Edgeworth Correction by Moving Block Bootstrap for Stationary and Nonstationary Data," in Exploring the Limits of Bootstrap, eds. R. LePage and L. Billard, New York: John Wiley.
Léger, C., Politis, D. N., and Romano, J. P. (1992), "Bootstrap Technology and Applications," Technometrics, 34, 378-398.
Léger, C., and Romano, J. P. (1990a), "Bootstrap Choice of Tuning Parameters," Annals of the Institute of Statistical Mathematics, 42, 709-735.
——— (1990b), "Bootstrap Adaptive Estimation: The Trimmed-Mean Example," The Canadian Journal of Statistics, 18, 297-314.
Liu, R. Y., and Singh, K. (1992), "Moving Blocks Jackknife and Bootstrap Capture Weak Dependence," in Exploring the Limits of Bootstrap, eds. R. LePage and L. Billard, New York: John Wiley.
Politis, D. N., and Romano, J. P. (1992a), "A Circular Block Resampling Procedure for Stationary Data," in Exploring the Limits of Bootstrap, eds. R. LePage and L. Billard, New York: John Wiley, pp. 263-270.
——— (1992b), "A Nonparametric Resampling Procedure for Multivariate Confidence Regions in Time Series Analysis," in Computing Science and Statistics, Proceedings of the 22nd Symposium on the Interface, eds. C. Page and R. LePage, New York: Springer-Verlag, pp. 98-103.
——— (1992c), "A General Resampling Scheme for Triangular Arrays of α-Mixing Random Variables With Application to the Problem of Spectral Density Estimation," The Annals of Statistics, 20, 1985-2007.
Politis, D. N., Romano, J. P., and Lai, T. L. (1992), "Bootstrap Confidence Bands for Spectra and Cross-Spectra," IEEE Transactions on Signal Processing, 40, 1206-1215.
Priestley, M. B. (1981), Spectral Analysis and Time Series, New York: Academic Press.
Rosenblatt, M. (1985), Stationary Sequences and Random Fields, Boston: Birkhäuser.
Serfling, R. J. (1980), Approximation Theorems of Mathematical Statistics, New York: John Wiley.
Yokoyama, R. (1980), "Moment Bounds for Stationary Mixing Sequences," Zeitschrift für Wahrscheinlichkeitstheorie und Verwandte Gebiete, 52, 45-57.
Zurbenko, I. G. (1986), The Spectral Analysis of Time Series, Amsterdam: North-Holland.
