
1 Introduction

EXAMPLES OF TIME SERIES: wine.tsm, signal.tsm.

Draw cos(t), cos(t/10), cos(10t): the periods are T = 2π, 10(2π), 2π/10. Frequency = 1/period. The periodic functions sin and cos are the building blocks of frequency domain analysis (make pictures of cos(t), cos(2πt), cos(2πt/20), and of yt = cos(2πt/20) + wt).

Frequency domain analysis (Ch. 4) is based on the FOURIER SERIES

  a0/2 + Σ_{n=1}^∞ [an cos nx + bn sin nx]:

under certain conditions a smooth enough periodic function can be expressed, with the help of the sin and cos functions, as such an infinite sum. If the function is not periodic, Fourier integrals replace the sums.

Earlier: the observations X1, X2, ... were i.i.d.r.v.'s (independent identically distributed random variables). This is often a good model (e.g. repeated measurements); often it is not (e.g. daily minimum temperature). When independence is not tenable, the dependence has to be measured. Correlation measures linear dependence: is Corr(Xi, Xi+1) = 0? Recall:

  Cov(X, Y) = E((X − µX)(Y − µY)).
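As a quick illustration (my own sketch, not part of the original notes; Python with numpy is assumed), one can generate the toy series yt = cos(2πt/20) + wt and check numerically that the cosine component repeats with period 20:

```python
import numpy as np

# cos(2*pi*t/20) has period 20 and frequency 1/20 cycles per time unit.
t = np.arange(100)
x = np.cos(2 * np.pi * t / 20)

# Shifting by one full period reproduces the series.
shifted_matches = np.allclose(x[:-20], x[20:])

# The toy series y_t = cos(2*pi*t/20) + w_t: a periodic signal plus white noise.
rng = np.random.default_rng(0)
y = x + rng.normal(0.0, 1.0, size=t.size)
```

Plotting x and y side by side shows how the noise hides the periodic signal, which is what frequency domain methods are designed to recover.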

  Corr(X, Y) = Cov(X, Y) / √(Var(X) Var(Y)).

The question arises: are the observations IDENTICALLY distributed? In the new terminology: is the process STATIONARY?

Time series: the index is time (minute, day, month, etc.), e.g. {Xt, t = 0, ±1, ±2, ...}. A stochastic process may have a discrete or a continuous time scale. Two main approaches:

  Time domain approach: best for short series.
  Frequency domain approach (Ch. 4): best for long series.

Here we consider time series in discrete time, t = 0, ±1, ±2, ....

The vector (Xt, Xt+1, Xt+2, ..., Xt+m) has a joint distribution function; it describes the joint probabilistic behavior of the vector:

  F(u1, u2, ..., um) = P(X1 ≤ u1, X2 ≤ u2, ..., Xm ≤ um)

for all u1, u2, ..., um. In case of independence,

  F(u1, ..., un) = ∏_{i=1}^n P(Xi ≤ ui) = ∏_{i=1}^n FXi(ui).

We can use the distribution function to calculate expected values. For example,

  µXt = E(Xt) = ∫_{−∞}^{∞} u fXt(u) du,

where fXt(u) = ∂FXt(u)/∂u is the density of Xt (we assume it exists).

Time domain approach. Examples:

1: White noise. Notation: {Wt} or {Zt}: uncorrelated, identically distributed r.v.'s with mean 0 and variance σW^2 for all t. If the noise is normally distributed (we say Gaussian), then the variables are in fact independent. (See the figure in the book.) You can create one in ITSM.

2: Autoregressive model AR(p):

  Xt = ϕ1 Xt−1 + ϕ2 Xt−2 + · · · + ϕp Xt−p + Wt,

where {Wt} is white noise and ϕi, i = 1, ..., p, are constants. E.g. AR(1): Xt = ϕXt−1 + Wt. Calculate the mean E(Xt); calculate the ACVF and ACF (try the model: specify, simulate). Expression in past white noise terms, by iteration:

  Xt = ϕ^k Xt−k + [ϕ^{k−1} Wt−k+1 + · · · + ϕWt−1 + Wt].

3: Moving averages MA(q):

  Yt = µ + ψ0 Wt + ψ1 Wt−1 + ψ2 Wt−2 + · · · + ψq Wt−q = µ + Σ_{i=0}^q ψi Wt−i,

where {Wt} is white noise and ψi, i = 0, ..., q, are constants. Calculate the mean of MA(q), e.g. of MA(1) and MA(3).

4: Models with trend and seasonality.
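The white noise, AR(1) and MA(1) models above are easy to simulate outside ITSM as well. A minimal sketch in Python/numpy (my own illustration, not from the notes; the parameter values ϕ = 0.6 and ψ = 0.5 are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(42)
n, phi, psi, sigma_w = 20_000, 0.6, 0.5, 1.0

# White noise: mean 0, variance sigma_w^2, uncorrelated.
w = rng.normal(0.0, sigma_w, size=n)

# AR(1): X_t = phi * X_{t-1} + W_t, generated by direct recursion.
x = np.zeros(n)
for i in range(1, n):
    x[i] = phi * x[i - 1] + w[i]

# MA(1): Y_t = W_t + psi * W_{t-1}  (here mu = 0, psi_0 = 1).
y = w.copy()
y[1:] += psi * w[:-1]
```

Both simulated series have sample mean near 0, and the MA(1) sample variance is near (1 + ψ²)σW², matching the hand calculations asked for in the notes.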

(a) Trend: US population; Lake Huron level 1875–1972.
(b) Seasonality: the accidental deaths series.
(c) Change in the model over time: the change-point problem (see the figures at the end). They show that it is important to detect change in the model, as otherwise both the model estimation and the forecasting will be wrong. The first three figures are from my paper "Change Detection in Linear Regression with Time Series Errors", The Canadian Journal of Statistics, 38, 2010, 67–79, and the fourth is from "Testing for changes in the covariance structure of linear processes" (joint with Berkes, I. and Horváth, L.), Journal of Statistical Planning and Inference, 139, 2009, 2044–2063. You can find papers about this topic on my website: http://www.math.ualberta.ca/~gombay/

2.2 AUTOCOVARIANCE FUNCTIONS (ACVF)

  γX(s, t) = E{(Xs − µXs)(Xt − µXt)}

It measures the linear dependence between two points of the series. Notation: γ(t, t) = Var(Xt). γX(s, t) = 0 means Xs and Xt are not linearly related; it does not mean there is no relationship/dependence between them. If Xs, Xt are normally distributed, then γX(s, t) = 0 means they are independent.

Example: white noise.

  γW(s, t) = E(WsWt) = σW^2, s = t;  γW(s, t) = 0, s ≠ t.

Example: Yt = (1/3)(Wt−1 + Wt + Wt+1), t = 0, ±1, ... (moving averages). ACVF:

  γ(s, t) = 3/9 σW^2, s = t;
  γ(s, t) = 2/9 σW^2, |s − t| = 1;
  γ(s, t) = 1/9 σW^2, |s − t| = 2;
  γ(s, t) = 0, |s − t| ≥ 3.

LAG h = |s − t|: the time separation.

WEAKLY STATIONARY PROCESS:
1. THE MEAN IS CONSTANT;
2. THE AUTOCOVARIANCE/correlation depends on |s − t| only.
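The values 3/9, 2/9, 1/9, 0 for the three-point moving average can be checked by simulation. A sketch assuming numpy (sample_gamma is my own hypothetical helper, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(1)
w = rng.normal(0.0, 1.0, size=200_000)   # white noise with sigma_w = 1

# Y_t = (W_{t-1} + W_t + W_{t+1}) / 3, the three-point moving average.
y = (w[:-2] + w[1:-1] + w[2:]) / 3

def sample_gamma(z, h):
    # Sample autocovariance at lag h (the theoretical mean here is 0).
    return float(np.mean(z[: len(z) - h] * z[h:]))
```

With 200,000 points the sample autocovariances land very close to the theoretical 3/9, 2/9, 1/9 and 0.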

In particular. Z we have EX 2 < ∞. t) γ(s. Y. Z). s)γ(t. Z) = aCov(X. EZ 2 < ∞ then for any constants a. t) . c Cov(aX + bY + c.v. EY 2 < ∞. b. that the covariance function has the linearity property: If for random variables X. 6 . note that adding a constant to a r.AUTOCORRELATON FUNCTION (ACF): ρ=√ γ(s. Z). −1 ≤ ρ(s. t) ≤ 1 ∑t j=1 Wt Example: Autocovariance of random walk Xt = 2 γX (s. t)σW . it is not stationary. is Note. t) = min(s. +bCov(Y. does not change the covariance. Verify this.

2.3 Stationary Time Series

STRICTLY (strongly) stationary time series: (Xt1, Xt2, ..., Xtk) has the same joint probability distribution as (Xt1+h, Xt2+h, ..., Xtk+h) for all finite collections of times t1, t2, ..., tk and all h = 0, ±1, ±2, ....

Mostly we work with WEAKLY stationary series: the mean is constant for all t, and γ(t + h, t) = γ(h, 0) = γ(h) for all t and all h = 0, ±1, ±2, .... This implies γ(t, t) = Var(Xt) = γ(0) = Var(X0) < ∞: the variance does not change. Notation, autocorrelation function:

  ρ(h) = γ(t + h, t) / √(γ(t + h, t + h)γ(t, t)) = γ(h)/γ(0).

Example 1: white noise has

  γW(h) = E(Wt+hWt) = σW^2, h = 0;  γW(h) = 0, h ≠ 0.

Example 2: we saw that, with σW = 1, the moving average Yt = (1/3)(Wt−1 + Wt + Wt+1) has ACVF

  γ(h) = 3/9, |h| = 0;  γ(h) = 2/9, |h| = 1;  γ(h) = 1/9, |h| = 2;  γ(h) = 0, |h| > 2

(used in smoothing).

PROPERTIES OF THE ACVF OF A STATIONARY TIME SERIES:

1.) γ(0) = Var(Xt) for all t, and the maximum of γ(h) is at h = 0, as the Cauchy–Schwarz inequality

  Cov(X, Y) = E((X − µX)(Y − µY)) ≤ √(Var(X)Var(Y))

gives

  γ(h) = Cov(Xt+h, Xt) ≤ √(Var(Xt+h)Var(Xt)) = γ(0).

2.) γ(h) = γ(−h), symmetric about 0, as Cov(X, Y) = Cov(Y, X) and |s − t| = |t − s| = h.

Calculations for the MA(1) model Xt = µ + Wt − Wt−1, with µ = 0 and {Wt} white noise:

  Var(Xt) = 2σW^2 = γ(0),
  Cov(Xt−1, Xt) = −σW^2 = γ(1),
  Cov(Xt, Xt+2) = 0, and γ(h) = 0 for h ≥ 2.

From the definition it is easy to see that the series is stationary (strongly and weakly). Let σW = 1 and graph ρ(h) as a function of h.
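More generally, for any finite moving average Xt = Σj ψj Wt−j the ACVF is γ(h) = σW² Σj ψj ψ_{j+|h|}; the numbers above are the special case ψ = (1, −1). A small helper implementing this standard formula (my own sketch, assuming numpy; ma_acvf is a hypothetical name):

```python
import numpy as np

def ma_acvf(psi, h, sigma2=1.0):
    """ACVF of X_t = sum_j psi[j] * W_{t-j} for white noise W_t with variance sigma2:
    gamma(h) = sigma2 * sum_j psi[j] * psi[j + |h|]."""
    psi = np.asarray(psi, dtype=float)
    h = abs(h)
    if h >= len(psi):
        return 0.0                       # the filter coefficients no longer overlap
    return float(sigma2 * np.dot(psi[: len(psi) - h], psi[h:]))
```

For ψ = (1, −1) this gives γ(0) = 2, γ(1) = −1, γ(2) = 0, and for ψ = (1/3, 1/3, 1/3) it reproduces the 3/9, 2/9, 1/9, 0 pattern of the smoothing example.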

ESTIMATION OF CORRELATION

We assume stationarity for these estimators; otherwise they do not make sense. The data xt1, xt2, ..., xtn are observations on Xt1, Xt2, ..., Xtn. The estimate of the mean is

  x̄ = µ̂ = (1/n) Σ_{i=1}^n xti.

SAMPLE AUTOCOVARIANCE FUNCTION:

  γ̂(h) = (1/n) Σ_{t=1}^{n−h} (xt − x̄)(xt+h − x̄),  γ̂(h) = γ̂(−h).

SAMPLE AUTOCORRELATION FUNCTION:

  ρ̂(h) = γ̂(h)/γ̂(0).

THEOREM (large sample approximation): Under some general conditions, if xt is a white noise process, then for large n, and for lags h = 1, 2, ..., H, where H is a fixed, arbitrary number, the sample ACF ρ̂(h) is approximately normally distributed with mean zero and standard deviation

  σ_{ρ̂X(h)} = 1/√n.

Example: Let Yt = 5 + Xt − 0.7Xt−1, where Xt = 1 with probability 1/2, Xt = −1 with probability 1/2, and the {Xt} observations are independent. Is Xt, Yt weakly, strongly stationary? Calculate the mean and covariance function for both processes.
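The theorem is what justifies the ±1.96/√n bounds drawn on ACF plots (e.g. by ITSM). A sketch in Python/numpy implementing γ̂ and ρ̂ exactly as defined above and checking the bound on simulated white noise (sample_acf is my own hypothetical helper, not from the notes):

```python
import numpy as np

def sample_acf(x, max_lag):
    # Sample ACF using the 1/n divisor from the notes' definition of gamma_hat.
    x = np.asarray(x, dtype=float)
    n = len(x)
    xm = x - x.mean()
    gamma0 = np.sum(xm * xm) / n
    gammas = [np.sum(xm[: n - h] * xm[h:]) / n for h in range(1, max_lag + 1)]
    return np.array([1.0] + [g / gamma0 for g in gammas])

rng = np.random.default_rng(3)
n = 2_000
rho = sample_acf(rng.normal(size=n), 20)

# White noise: rho_hat(h) is approx N(0, 1/n), so about 95% of the lags
# should fall inside +/- 1.96/sqrt(n).
frac_inside = float(np.mean(np.abs(rho[1:]) < 1.96 / np.sqrt(n)))
```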

For example. . . . ▽2 = ▽(▽xt) = ▽xt − ▽xt−1 = (xt − xt−1) − (xt−1 − xt−2) = xt − 2xt−1 + xt−2 removes quadratic trend. calculate E{▽xt}. then what does x estimate? DETRENDING the series (see graph with a trend).4 EXPLORATORY DATA ANALYSIS Stationarity is important for estimation. 10 . Etc ▽k xt = ▽(▽k−1xt). with µ = β1 + β2t we can detrend ˆ ˆ the series. . where µt is the trend. . Exercise: Let xt = β0 + β1t + wt. DIFFERENCING: ▽xt = xt − xt−1 removes linear trend. Estimate µ by µt.g. . Xt = µt + Yt. If Xt = β1 + β2t + Wt. t = 1. if the mean is not constant for observations {xt. Perform these functions in ITSM. . ˆ then as EXt = β1 + β2t = µt. . then ˆ ˆ Yt = Xt − µt ˆ is stationary. E. n. Yt is a stationary series. n}. t = 1.

Exercise: Let xt = β0 + β1t + β2t² + wt; calculate E{∇xt} and E{∇²xt}.

BACKSHIFT OPERATOR B:

  Bxt = xt−1,  B²xt = B(Bxt) = xt−2, ...,  B^k xt = xt−k.

Differencing in backshift operator notation: ∇xt = (1 − B)xt = xt − xt−1, where 1 is the identity operator; ∇²xt = (1 − B)²xt = (1 − 2B + B²)xt = xt − 2xt−1 + xt−2; ∇^d = (1 − B)^d gives differences of order d.

Example: random walk with a drift, Xt = Σ_{i=1}^t (Wi + µ). Is it stationary? ∇Xt = Xt − Xt−1 is the differenced series. Is it stationary? Calculate its mean and ACVF.

Sometimes differencing with an integer value d is too much. Then we use FRACTIONAL DIFFERENCING: with −0.5 < d < 0.5 we may get

  (1 − B)^d xt = wt   (∗)

where wt is white noise and

  (1 − B)^d = Σ_{k=0}^∞ (−1)^k binom(d, k) B^k,

with

  binom(d, k) = d(d − 1) · · · (d − k + 1) / (k(k − 1) · · · 1)

(see the binomial expansion for non-integer powers).

In the case of a "nice" STATIONARY TIME SERIES the ACF decays exponentially to zero as the lag h increases.
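The coefficients (−1)^k binom(d, k) can be computed directly from the product formula. A sketch using only the standard library (gbinom and frac_diff_coefs are my own hypothetical names); for integer d the familiar expansions are recovered, e.g. (1 − B)² has coefficients 1, −2, 1:

```python
import math

def gbinom(d, k):
    """Generalized binomial coefficient d(d-1)...(d-k+1) / k!, valid for non-integer d."""
    num = 1.0
    for j in range(k):
        num *= d - j
    return num / math.factorial(k)

def frac_diff_coefs(d, k_max):
    """Coefficients of B^k in (1 - B)^d = sum_k (-1)^k * gbinom(d, k) * B^k."""
    return [(-1) ** k * gbinom(d, k) for k in range(k_max + 1)]
```

For a fractional order such as d = 0.4 the coefficients decay slowly instead of terminating, which is exactly why fractional differencing produces long-memory behavior.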

SMOOTHING with a moving average filter: let q ≥ 0 be an integer. From the process Xt1, Xt2, ..., Xtn we can create a new process, for q ≤ t ≤ n − q,

  Wt = (1/(2q + 1)) Σ_{−q ≤ j ≤ q} Xt−j.

The new process Wt is a smoothed version of Xt: the rapid fluctuations, i.e. the noise, are removed (a low pass filter). If Xt = mt + Yt, t = 1, ..., n, with EYt = 0, and Xt has a linear trend mt = at + b, then for q ≤ t ≤ n − q

  m̂t = (1/(2q + 1)) Σ_{−q ≤ j ≤ q} Xt−j,

so Wt approximates mt = at + b. Other, more sophisticated filters exist. See the example, and try to do it in ITSM. Perform the operations of Example 1.4 to remove trend and seasonality.

After that we shall try to fit a well-known model (e.g. AR, MA, or ARMA). If the fit is good, the residuals should be approximately independent identically distributed random variables. The analysis of the residuals evaluates the success of the above operation of removing the trend and/or the seasonality.

(a) Sample autocorrelation function: we want to see the ACVF (ACF) decay exponentially fast to zero. Read the book, pp. 36–40, on the other diagnostic tests.
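A sketch of the filter with numpy's convolve (my own illustration; the trend mt = 0.3t + 1 and q = 5 are arbitrary choices): a (2q+1)-point average passes a linear trend through unchanged while cutting the noise variance by the factor 2q + 1.

```python
import numpy as np

rng = np.random.default_rng(11)
n, q = 500, 5
a, b = 0.3, 1.0

t = np.arange(n, dtype=float)
x = a * t + b + rng.normal(0.0, 2.0, size=n)   # X_t = m_t + Y_t, m_t = a*t + b

# W_t = (1/(2q+1)) * sum_{j=-q..q} X_{t-j}, defined for the inner time points.
kernel = np.ones(2 * q + 1) / (2 * q + 1)
w_smooth = np.convolve(x, kernel, mode="valid")

# For a linear trend the filter is exact, so the smoothed series should
# track m_t closely on q <= t <= n - 1 - q.
m_inner = a * t[q : n - q] + b
```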

CONDITIONAL EXPECTATION:

  E(X|Y) = ∫_{−∞}^{∞} x f(x|y) dx,  E(g(X)|Y) = ∫_{−∞}^{∞} g(x) f(x|y) dx.

E(X|Y) = E(X|Y = y) is a random variable as y varies through the values the r.v. Y can take.

PROPERTIES:

1. If X, Y are independent, then, as f(x|y) = fX(x)fY(y)/fY(y) = fX(x) (provided fY(y) ≠ 0),

  E(X|Y) = E(X|Y = y) = E(X).

2. E(X|X = x) = x and E(g(X)|X = x) = g(x); more generally, E(g(X, Y)|X = x) = E(g(x, Y)|X = x); in particular, E(f(X)g(Y)|X = x) = f(x)E(g(Y)|X = x).

Theorem:
a) E(X) = E{E(X|Y)};
b) Var(X) = E{Var(X|Y)} + Var(E(X|Y)), where Var(X|Y) = E(X²|Y) − [E(X|Y)]².

Proof: a)

  E(X) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x f(x, y) dx dy
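Both identities of the theorem can be verified numerically on a simple two-stage experiment (my own sketch, numpy assumed; the mixture parameters are arbitrary): Y is Bernoulli(1/2) and, given Y = y, X is normal with mean µ[y] and standard deviation sd[y].

```python
import numpy as np

rng = np.random.default_rng(9)
n = 400_000

# Two-stage experiment: Y ~ Bernoulli(1/2); given Y = y, X ~ N(mu[y], sd[y]^2).
mu = np.array([-1.0, 2.0])
sd = np.array([1.0, 3.0])
y = rng.integers(0, 2, size=n)
x = rng.normal(mu[y], sd[y])

# a) E(X) = E{E(X|Y)} = 0.5*(-1) + 0.5*2 = 0.5
# b) Var(X) = E{Var(X|Y)} + Var(E(X|Y)) = 0.5*(1 + 9) + 2.25 = 7.25
```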

  = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x f(x|y) fY(y) dx dy = ∫_{−∞}^{∞} fY(y) [∫_{−∞}^{∞} x f(x|y) dx] dy = ∫_{−∞}^{∞} E(X|Y = y) fY(y) dy.

b)

  E{Var(X|Y)} = E{E(X²|Y = y) − [E(X|Y = y)]²} = E{E(X²|Y = y)} − E{[E(X|Y = y)]²},

  Var(E(X|Y = y)) = E{(E(X|Y = y))²} − [E(E(X|Y = y))]².

Hence, adding and subtracting E{(E(X|Y = y))²},

  Var(X) = E(E(X²|Y = y)) − [E(E(X|Y = y))]² ± E{(E(X|Y = y))²} = E{Var(X|Y = y)} + Var(E(X|Y = y)).

Figure 1: Annual discharge of the Nile, with regression lines (annual volume; overall fit; before change; after change). Change is detected in 1899; τ̂ = 28.

Figures to show changes in the model:

Figure 2: Collegeville annual temperatures (in Celsius), with regression lines (overall fit; before change; after change). Change is detected in 1955; τ̂ = 39.

Figure 3: Chula Vista annual temperatures (in Fahrenheit), with regression lines (overall fit; before change; after change). Change is detected in 1981; τ̂ = 62.

Figure 4: The graph of the Wölfer sunspot numbers between 1700 and 1988 (n = 289).
