Ginger Davis
• Lag Operator
  – Represented by the symbol L:  $L x_t = x_{t-1}$
• Mean of $Y_t$:  $E(Y_t) = \mu_t$
White Noise Process
• Basic building block for time series processes: $\{\varepsilon_t\}$
  $E(\varepsilon_t) = 0$
  $E(\varepsilon_t^2) = \sigma^2$
  $E(\varepsilon_t \varepsilon_\tau) = 0$ for $t \neq \tau$
White Noise Processes, cont.
• Independent White Noise Process
  – Slightly stronger condition: $\varepsilon_t$ and $\varepsilon_\tau$ are independent for $t \neq \tau$
• Gaussian White Noise Process
  $\varepsilon_t \sim N(0, \sigma^2)$
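As a quick numerical check of these moment conditions, the sketch below simulates Gaussian white noise with NumPy and verifies that the sample mean, variance, and lag-1 autocovariance are close to $0$, $\sigma^2$, and $0$. The sample size, seed, and $\sigma = 2$ are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 2.0
eps = rng.normal(0.0, sigma, size=100_000)  # Gaussian white noise draws

print(eps.mean())                     # close to E[eps_t] = 0
print(eps.var())                      # close to E[eps_t^2] = sigma^2 = 4
print(np.mean(eps[1:] * eps[:-1]))    # lag-1 sample autocovariance, close to 0
```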
Autocovariance
• Covariance of $Y_t$ with its own lagged value
  $\gamma_{jt} = E[(Y_t - \mu_t)(Y_{t-j} - \mu_{t-j})]$
• Example: Calculate autocovariances for $Y_t = \mu + \varepsilon_t$:
  $\gamma_{jt} = E[(Y_t - \mu)(Y_{t-j} - \mu)] = E(\varepsilon_t \varepsilon_{t-j})$
  so $\gamma_{0t} = \sigma^2$ and $\gamma_{jt} = 0$ for $j \neq 0$
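The example above can be checked numerically. This sketch computes the standard sample autocovariance $\hat{\gamma}_j = \frac{1}{T}\sum_{t=j+1}^{T}(y_t - \bar{y})(y_{t-j} - \bar{y})$ for a simulated $Y_t = \mu + \varepsilon_t$; the values of $\mu$, $\sigma$, the sample size, and the seed are arbitrary.

```python
import numpy as np

def sample_autocov(y, j):
    """Sample autocovariance at lag j: (1/T) * sum (y_t - ybar)(y_{t-j} - ybar)."""
    T = len(y)
    ybar = y.mean()
    return np.sum((y[j:] - ybar) * (y[:T - j] - ybar)) / T

rng = np.random.default_rng(1)
mu, sigma = 5.0, 1.0
y = mu + rng.normal(0.0, sigma, size=50_000)  # Y_t = mu + eps_t

g0 = sample_autocov(y, 0)  # close to sigma^2 = 1
g1 = sample_autocov(y, 1)  # close to 0
```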
Stationarity
• Covariance-stationary or weakly stationary process
  – Neither the mean nor the autocovariances depend on the date t:
  $E(Y_t) = \mu$ for all $t$
  $E[(Y_t - \mu)(Y_{t-j} - \mu)] = \gamma_j$ for all $t$ and any $j$
Stationarity, cont.
• 2 processes
  – 1 covariance stationary, 1 not covariance stationary:
  $Y_t = \mu + \varepsilon_t$  (covariance stationary)
  $Y_t = \beta t + \varepsilon_t$  (not covariance stationary: the mean $\beta t$ depends on $t$)
Stationarity, cont.
• Covariance stationary processes
  – Covariance between $Y_t$ and $Y_{t-j}$ depends only on $j$ (length of time separating the observations) and not on $t$ (date of the observation), so
  $\gamma_j = \gamma_{-j}$
Stationarity, cont.
• Strict stationarity
  – For any values of $j_1, j_2, \ldots, j_n$, the joint distribution of $(Y_t, Y_{t+j_1}, Y_{t+j_2}, \ldots, Y_{t+j_n})$ depends only on the intervals separating the dates and not on the date itself
Gaussian Processes
• Gaussian process $\{Y_t\}$
  – Joint density
  $f_{Y_t, Y_{t+j_1}, \ldots, Y_{t+j_n}}(y_t, y_{t+j_1}, \ldots, y_{t+j_n})$
  is Gaussian for any $j_1, j_2, \ldots, j_n$
• What can be said about a covariance stationary Gaussian process?
  – It is also strictly stationary, since a multivariate normal distribution is completely determined by its means and autocovariances
Ergodicity
• A covariance-stationary process is said to be ergodic for the mean if
  $\bar{y} = \frac{1}{T}\sum_{t=1}^{T} y_t \xrightarrow{p} E(Y_t) = \mu$
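Ergodicity for the mean can be illustrated by simulation: the sketch below generates one long sample path of a stationary AR(1) (a process covered later in these notes) and checks that its time average is close to the population mean $\mu = c/(1-\phi)$. The parameter values, sample size, and seed are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(2)
mu, phi, sigma = 3.0, 0.5, 1.0
c = mu * (1 - phi)  # choose c so that the population mean is mu

T = 200_000
eps = rng.normal(0.0, sigma, size=T)
y = np.empty(T)
y[0] = mu
for t in range(1, T):
    y[t] = c + phi * y[t - 1] + eps[t]  # AR(1): Y_t = c + phi*Y_{t-1} + eps_t

# Ergodicity for the mean: the time average converges to mu = 3
print(y.mean())
```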
MA(1)
  $Y_t = \mu + \varepsilon_t + \theta \varepsilon_{t-1}$
• "moving average"
  – $Y_t$ is constructed from a weighted sum of the two most recent values of $\varepsilon$.
Properties of MA(1)
  $E(Y_t) = E(\mu + \varepsilon_t + \theta \varepsilon_{t-1}) = \mu$
  $\gamma_0 = E(Y_t - \mu)^2 = E(\varepsilon_t + \theta \varepsilon_{t-1})^2$
    $= E(\varepsilon_t^2 + 2\theta \varepsilon_t \varepsilon_{t-1} + \theta^2 \varepsilon_{t-1}^2) = (1 + \theta^2)\sigma^2$
  $\gamma_1 = E[(Y_t - \mu)(Y_{t-1} - \mu)] = E[(\varepsilon_t + \theta \varepsilon_{t-1})(\varepsilon_{t-1} + \theta \varepsilon_{t-2})]$
    $= E(\varepsilon_t \varepsilon_{t-1} + \theta \varepsilon_{t-1}^2 + \theta \varepsilon_t \varepsilon_{t-2} + \theta^2 \varepsilon_{t-1} \varepsilon_{t-2}) = \theta \sigma^2$
  $\gamma_j = E[(Y_t - \mu)(Y_{t-j} - \mu)] = 0$ for $j > 1$
MA(1)
• Covariance stationary
  – Mean and autocovariances are not functions of time
• Autocorrelation of a covariance-stationary process:
  $\rho_j = \frac{\gamma_j}{\gamma_0}$
• MA(1):
  $\rho_1 = \frac{\theta \sigma^2}{(1 + \theta^2)\sigma^2} = \frac{\theta}{1 + \theta^2}$
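The MA(1) autocorrelation formula can be verified by simulation. This sketch generates an MA(1) with $\mu = 0$ and $\theta = 0.8$ (arbitrary illustrative choices, matching the figure that follows) and checks that the sample $\hat{\rho}_1$ is close to $\theta/(1+\theta^2) = 0.8/1.64 \approx 0.488$.

```python
import numpy as np

rng = np.random.default_rng(3)
theta, sigma = 0.8, 1.0
T = 200_000
eps = rng.normal(0.0, sigma, size=T + 1)
y = eps[1:] + theta * eps[:-1]  # MA(1) with mu = 0: Y_t = eps_t + theta*eps_{t-1}

ybar = y.mean()
g0 = np.mean((y - ybar) ** 2)                   # ~ (1 + theta^2) sigma^2
g1 = np.mean((y[1:] - ybar) * (y[:-1] - ybar))  # ~ theta sigma^2
rho1 = g1 / g0                                  # ~ theta / (1 + theta^2)
```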
Autocorrelation Function for White Noise: $Y_t = \mu + \varepsilon_t$
  [Figure: autocorrelation plotted against lag (0 to 20)]
Autocorrelation Function for MA(1): $Y_t = \mu + \varepsilon_t + 0.8\,\varepsilon_{t-1}$
  [Figure: autocorrelation plotted against lag (0 to 20)]
Moving Average Processes of higher order
• MA(q): qth order moving average process
  $Y_t = \mu + \varepsilon_t + \theta_1 \varepsilon_{t-1} + \theta_2 \varepsilon_{t-2} + \cdots + \theta_q \varepsilon_{t-q}$
• Properties of MA(q)
  $\gamma_0 = (1 + \theta_1^2 + \theta_2^2 + \cdots + \theta_q^2)\sigma^2$
  $\gamma_j = (\theta_j + \theta_{j+1}\theta_1 + \theta_{j+2}\theta_2 + \cdots + \theta_q \theta_{q-j})\sigma^2, \quad j = 1, 2, \ldots, q$
  $\gamma_j = 0, \quad j > q$
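The MA(q) autocovariance formula translates directly into code. The sketch below (function name and interface are my own) writes $\gamma_j$ as a dot product of the coefficient vector $(1, \theta_1, \ldots, \theta_q)$ with a shifted copy of itself, which is exactly the sum $\theta_j + \theta_{j+1}\theta_1 + \cdots + \theta_q\theta_{q-j}$ with $\theta_0 = 1$.

```python
import numpy as np

def ma_autocov(theta, sigma2, j):
    """Autocovariance gamma_j of an MA(q) with coefficients theta = [theta_1..theta_q].

    Uses gamma_j = sigma^2 * sum_k theta_k * theta_{k+j} with theta_0 = 1,
    and gamma_j = 0 for j > q.
    """
    psi = np.r_[1.0, np.asarray(theta, dtype=float)]  # (1, theta_1, ..., theta_q)
    q = len(psi) - 1
    if j > q:
        return 0.0
    return sigma2 * np.dot(psi[: q + 1 - j], psi[j:])

# MA(1), theta = 0.8, sigma^2 = 1: gamma_0 = 1.64, gamma_1 = 0.8, gamma_2 = 0
```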
Autoregressive Processes
• AR(1): First order autoregression
  $Y_t = c + \phi Y_{t-1} + \varepsilon_t$
• Stationarity: We will assume $|\phi| < 1$
• Can represent as an MA($\infty$):
  $Y_t = (c + \phi c + \phi^2 c + \cdots) + (\varepsilon_t + \phi \varepsilon_{t-1} + \phi^2 \varepsilon_{t-2} + \cdots)$
    $= \frac{c}{1 - \phi} + \varepsilon_t + \phi \varepsilon_{t-1} + \phi^2 \varepsilon_{t-2} + \cdots$
Properties of AR(1)
  $\mu = E(Y_t) = \frac{c}{1 - \phi}$
  $\gamma_0 = E(Y_t - \mu)^2 = E(\varepsilon_t + \phi \varepsilon_{t-1} + \phi^2 \varepsilon_{t-2} + \cdots)^2$
    $= (1 + \phi^2 + \phi^4 + \cdots)\sigma^2 = \frac{\sigma^2}{1 - \phi^2}$
Properties of AR(1), cont.
  $\gamma_j = E[(Y_t - \mu)(Y_{t-j} - \mu)]$
    $= E[(\varepsilon_t + \phi \varepsilon_{t-1} + \cdots + \phi^j \varepsilon_{t-j} + \cdots)(\varepsilon_{t-j} + \phi \varepsilon_{t-j-1} + \phi^2 \varepsilon_{t-j-2} + \cdots)]$
    $= (\phi^j + \phi^{j+2} + \phi^{j+4} + \cdots)\sigma^2 = \phi^j (1 + \phi^2 + \phi^4 + \cdots)\sigma^2 = \frac{\phi^j}{1 - \phi^2}\sigma^2$
  $\rho_j = \frac{\gamma_j}{\gamma_0} = \phi^j$
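The result $\rho_j = \phi^j$ can be checked by simulating an AR(1) and comparing sample autocorrelations at the first few lags against powers of $\phi$. The sketch below uses $c = 0$, $\phi = 0.8$ (matching the figure that follows), and an arbitrary seed and sample size.

```python
import numpy as np

rng = np.random.default_rng(4)
phi, sigma = 0.8, 1.0
T = 200_000
eps = rng.normal(0.0, sigma, size=T)
y = np.empty(T)
y[0] = 0.0
for t in range(1, T):
    y[t] = phi * y[t - 1] + eps[t]  # AR(1) with c = 0, so mu = 0

ybar = y.mean()
g = [np.mean((y[j:] - ybar) * (y[: T - j] - ybar)) for j in range(4)]
rho = [gj / g[0] for gj in g]  # should be close to phi^j: 1, 0.8, 0.64, 0.512
```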
Autocorrelation Function for AR(1): $Y_t = 0.8\,Y_{t-1} + \varepsilon_t$
  [Figure: autocorrelation plotted against lag (0 to 20), decaying geometrically as $0.8^j$]
Autocorrelation Function for AR(1): $Y_t = -0.8\,Y_{t-1} + \varepsilon_t$
  [Figure: autocorrelation plotted against lag (0 to 20), alternating in sign as $(-0.8)^j$]
[Figure: four simulated sample paths of length 100: Gaussian white noise; AR(1) with $\phi = 0.5$; AR(1) with $\phi = 0.9$; AR(1) with $\phi = -0.9$]
Autoregressive Processes of higher order
• pth order autoregression: AR(p)
  $Y_t = c + \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + \cdots + \phi_p Y_{t-p} + \varepsilon_t$
• Stationarity: all roots of
  $1 - \phi_1 z - \phi_2 z^2 - \cdots - \phi_p z^p = 0$
  lie outside the unit circle
Properties of AR(p)
  $\mu = \frac{c}{1 - \phi_1 - \phi_2 - \cdots - \phi_p}$
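The root condition above is easy to check numerically. This sketch (the function name is my own) builds the polynomial $1 - \phi_1 z - \cdots - \phi_p z^p$, finds its roots with `np.roots`, and tests whether they all lie outside the unit circle.

```python
import numpy as np

def is_stationary(phi):
    """Check covariance stationarity of an AR(p) with coefficients phi = [phi_1..phi_p].

    Stationarity requires every root z of 1 - phi_1*z - ... - phi_p*z^p = 0
    to satisfy |z| > 1 (all roots outside the unit circle).
    """
    # np.roots wants coefficients from the highest power down:
    # (-phi_p, ..., -phi_1, 1) represents -phi_p*z^p - ... - phi_1*z + 1
    coeffs = np.r_[-np.asarray(phi, dtype=float)[::-1], 1.0]
    roots = np.roots(coeffs)
    return bool(np.all(np.abs(roots) > 1.0))

# AR(1) with phi = 0.5: root z = 2, stationary
# AR(1) with phi = 1 (a unit root): root z = 1, not stationary
```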