
An Introduction to Time Series

Ginger Davis

VIGRE Computational Finance Seminar


Rice University

November 26, 2003


What is a Time Series?
• Time Series
– Collection of observations $\{y_1, y_2, \ldots, y_T\}$ indexed by the date of each observation

• Lag Operator
– Represented by the symbol $L$ (see the sketch below):
$L x_t = x_{t-1}$
• Mean of $Y_t$ is $\mu_t$
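To make the lag operator concrete, here is a minimal sketch using pandas, whose `shift` method implements exactly this one-period lag (the series values and dates are made up for illustration):

```python
import pandas as pd

# A short made-up series x_1, ..., x_5 indexed by date.
x = pd.Series([1.0, 2.0, 3.0, 4.0, 5.0],
              index=pd.date_range("2003-11-26", periods=5, freq="D"))

# Applying the lag operator L: (L x)_t = x_{t-1}.
print(pd.DataFrame({"x_t": x, "L x_t": x.shift(1)}))
```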
White Noise Process
• Basic building block for time series processes

$\{\varepsilon_t\}_{t=-\infty}^{\infty}$

$E(\varepsilon_t) = 0$
$E(\varepsilon_t^2) = \sigma^2$
$E(\varepsilon_t \varepsilon_\tau) = 0$ for $t \neq \tau$
White Noise Processes, cont.
• Independent White Noise Process
– Slightly stronger condition: $\varepsilon_t$ and $\varepsilon_\tau$ are independent
• Gaussian White Noise Process

$\varepsilon_t \sim N(0, \sigma^2)$
Autocovariance
• Covariance of $Y_t$ with its own lagged value

$\gamma_{jt} = E\,(Y_t - \mu_t)(Y_{t-j} - \mu_{t-j})$
• Example: Calculate autocovariances for:
$Y_t = \mu + \varepsilon_t$
$\gamma_{jt} = E\,(Y_t - \mu)(Y_{t-j} - \mu) = E(\varepsilon_t \varepsilon_{t-j})$
so $\gamma_{0t} = \sigma^2$ and $\gamma_{jt} = 0$ for $j \neq 0$
Stationarity
• Covariance-stationary or weakly stationary
process
– Neither the mean nor the autocovariances depend on the date t:

$E(Y_t) = \mu$ for all $t$
$E\,(Y_t - \mu)(Y_{t-j} - \mu) = \gamma_j$ for all $t$ and any $j$
Stationarity, cont.
• Two processes, one covariance stationary and one not:

$Y_t = \mu + \varepsilon_t$   (covariance stationary)
$Y_t = \beta t + \varepsilon_t$   (not covariance stationary, since its mean $\beta t$ depends on $t$)
Stationarity, cont.
• Covariance stationary processes
– Covariance between $Y_t$ and $Y_{t-j}$ depends only on $j$ (the length of time separating the observations) and not on $t$ (the date of the observation)
– It follows that $\gamma_j = \gamma_{-j}$
Stationarity, cont.
• Strict stationarity
– For any values of $j_1, j_2, \ldots, j_n$, the joint distribution of $(Y_t, Y_{t+j_1}, Y_{t+j_2}, \ldots, Y_{t+j_n})$ depends only on the intervals separating the dates and not on the date itself
Gaussian Processes
• Gaussian process {Yt}
– The joint density
$f_{Y_t,\,Y_{t+j_1},\,\ldots,\,Y_{t+j_n}}(y_t, y_{t+j_1}, \ldots, y_{t+j_n})$
is Gaussian for any $j_1, j_2, \ldots, j_n$
• What can be said about a covariance-stationary Gaussian process? (It is in fact strictly stationary, since a multivariate normal distribution is completely determined by its means and autocovariances.)
Ergodicity
• A covariance-stationary process is said to be
ergodic for the mean if

$\bar{y} = \frac{1}{T} \sum_{t=1}^{T} y_t$

converges in probability to $E(Y_t)$ as $T \to \infty$
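A minimal illustration of ergodicity for the mean, using the i.i.d. process $Y_t = \mu + \varepsilon_t$ with an assumed $\mu = 3$; the sample mean approaches 3 as $T$ grows:

```python
import numpy as np

rng = np.random.default_rng(2)
mu = 3.0
for T in (10, 1_000, 100_000):
    y = mu + rng.normal(size=T)  # covariance-stationary, ergodic for the mean
    print(T, y.mean())           # sample mean -> E[Y_t] = 3 as T grows
```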


Describing the dynamics
of a Time Series
• Moving Average (MA) processes
• Autoregressive (AR) processes
• Autoregressive / Moving Average (ARMA)
processes
• Autoregressive conditional heteroscedastic
(ARCH) processes
Moving Average Processes
• MA(1): First Order MA process

$Y_t = \mu + \varepsilon_t + \theta \varepsilon_{t-1}$
• "moving average"
– $Y_t$ is constructed from a weighted sum of the two most recent values of $\varepsilon$
Properties of MA(1)
E Yt   
E Yt     E  t   t 1 
2 2


 E  t2  2 t  t 1   2 t21 

 1 2  2 
E Yt   Yt 1     E  t   t 1  t 1   t  2 

 E  t  t 1   t21   t  t  2   2 t 1 t  2 
  2
E Yt   Yt  j     0 for j>1
MA(1)
• Covariance stationary
– Mean and autocovariances are not functions of time
• Autocorrelation of a covariance-stationary process:
$\rho_j = \dfrac{\gamma_j}{\gamma_0}$
• MA(1):
$\rho_1 = \dfrac{\theta \sigma^2}{(1 + \theta^2)\,\sigma^2} = \dfrac{\theta}{1 + \theta^2}$
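A sketch comparing the sample first-order autocorrelation of a simulated MA(1) (assumed value $\theta = 0.8$) to the theoretical $\theta / (1 + \theta^2) \approx 0.488$:

```python
import numpy as np

rng = np.random.default_rng(3)
theta, T = 0.8, 200_000
eps = rng.normal(size=T + 1)
y = eps[1:] + theta * eps[:-1]   # Y_t = eps_t + theta * eps_{t-1}, with mu = 0

d = y - y.mean()
rho1 = (d[1:] * d[:-1]).mean() / (d * d).mean()
print(rho1)                      # sample rho_1
print(theta / (1 + theta ** 2))  # theoretical rho_1 ~ 0.488
```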
Autocorrelation Function for White Noise: $Y_t = \varepsilon_t$
[Figure: autocorrelation vs. lag, lags 0–20.]
Autocorrelation Function for MA(1): $Y_t = \varepsilon_t + 0.8\,\varepsilon_{t-1}$
[Figure: autocorrelation vs. lag, lags 0–20.]
Moving Average Processes
of higher order
• MA(q): qth order moving average process

$Y_t = \mu + \varepsilon_t + \theta_1 \varepsilon_{t-1} + \theta_2 \varepsilon_{t-2} + \cdots + \theta_q \varepsilon_{t-q}$
• Properties of MA(q)

$\gamma_0 = (1 + \theta_1^2 + \theta_2^2 + \cdots + \theta_q^2)\,\sigma^2$
$\gamma_j = (\theta_j + \theta_{j+1}\theta_1 + \theta_{j+2}\theta_2 + \cdots + \theta_q \theta_{q-j})\,\sigma^2, \quad j = 1, 2, \ldots, q$
$\gamma_j = 0, \quad j > q$
Autoregressive Processes
• AR(1): First order autoregression

Yt  c  Yt 1   t
• Stationarity: We will assume  1
• Can represent as an MA () :
Yt  c   t    c   t 1    c   t  2   
2

 c  2
    t   t 1    t  2  
 1   
Properties of AR(1)

$\mu = E(Y_t) = \dfrac{c}{1 - \phi}$

$\gamma_0 = E(Y_t - \mu)^2 = E(\varepsilon_t + \phi \varepsilon_{t-1} + \phi^2 \varepsilon_{t-2} + \cdots)^2$
$\quad = (1 + \phi^2 + \phi^4 + \cdots)\,\sigma^2 = \dfrac{\sigma^2}{1 - \phi^2}$
Properties of AR(1), cont.

$\gamma_j = E\,(Y_t - \mu)(Y_{t-j} - \mu)$
$\quad = E\,[(\varepsilon_t + \phi \varepsilon_{t-1} + \cdots + \phi^j \varepsilon_{t-j} + \cdots)(\varepsilon_{t-j} + \phi \varepsilon_{t-j-1} + \phi^2 \varepsilon_{t-j-2} + \cdots)]$
$\quad = (\phi^j + \phi^{j+2} + \phi^{j+4} + \cdots)\,\sigma^2 = \phi^j (1 + \phi^2 + \phi^4 + \cdots)\,\sigma^2 = \dfrac{\phi^j}{1 - \phi^2}\,\sigma^2$

$\rho_j = \dfrac{\gamma_j}{\gamma_0} = \phi^j$
Autocorrelation Function for AR(1): $Y_t = 0.8\,Y_{t-1} + \varepsilon_t$
[Figure: autocorrelation vs. lag, lags 0–20.]
Autocorrelation Function for AR(1): $Y_t = -0.8\,Y_{t-1} + \varepsilon_t$
[Figure: autocorrelation vs. lag, lags 0–20, alternating in sign.]
[Figure: simulated sample paths of length 100 for four processes: Gaussian white noise; AR(1) with $\phi = 0.5$; AR(1) with $\phi = 0.9$; and AR(1) with $\phi = -0.9$.]
Autoregressive Processes
of higher order
• pth order autoregression: AR(p)

Yt  c  1Yt 1  2Yt  2     pYt  p   t


• Stationarity: We will assume that the roots of
the following all lie outside the unit circle.

2 p
1  1 z  2 z     p z  0
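This root condition is easy to check numerically; a sketch (the helper name `is_stationary` is our own):

```python
import numpy as np

def is_stationary(phi):
    # Roots of 1 - phi_1 z - ... - phi_p z^p = 0; np.roots expects the
    # coefficient of the highest power first.
    coeffs = np.r_[-np.asarray(phi, dtype=float)[::-1], 1.0]
    return bool(np.all(np.abs(np.roots(coeffs)) > 1.0))

print(is_stationary([0.8]))       # True:  root z = 1.25 lies outside the unit circle
print(is_stationary([1.1]))       # False: root z ~ 0.91 lies inside
print(is_stationary([0.5, 0.3]))  # True:  both roots lie outside
```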
Properties of AR(p)

$\mu = \dfrac{c}{1 - \phi_1 - \phi_2 - \cdots - \phi_p}$

• Can solve for autocovariances / autocorrelations using the Yule-Walker equations (a numerical sketch follows)
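For illustration, a minimal solver for the Yule-Walker system $\gamma_j = \phi_1 \gamma_{j-1} + \cdots + \phi_p \gamma_{j-p}$, $j = 1, \ldots, p$; the same equations can be solved in either direction, and this sketch recovers the AR coefficients from given autocovariances (the helper name `yule_walker` is ours):

```python
import numpy as np

def yule_walker(gamma):
    # Solve for [phi_1, ..., phi_p] given [gamma_0, gamma_1, ..., gamma_p];
    # the coefficient matrix is Toeplitz with entries gamma_{|i-k|}.
    gamma = np.asarray(gamma, dtype=float)
    p = len(gamma) - 1
    G = np.array([[gamma[abs(i - k)] for k in range(p)] for i in range(p)])
    return np.linalg.solve(G, gamma[1:])

# Check on AR(1) with phi = 0.8, sigma^2 = 1: gamma_j = 0.8**j / (1 - 0.64).
g = [0.8 ** j / (1 - 0.64) for j in range(2)]
print(yule_walker(g))   # ~ [0.8]
```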
Mixed Autoregressive Moving
Average Processes
• ARMA(p,q) includes both autoregressive and
moving average terms

Yt  c  1Yt 1  2Yt 2     pYt  p   t


 1 t 1   2 t 2     q t q
Time Series Models
for Financial Data
• A Motivating Example
– Federal Funds rate
– We are interested in forecasting not only the
level of the series, but also its variance.
– Variance is not constant over time
[Figure: U.S. Federal Funds Rate (percent), 1955–1975.]
Modeling the Variance
• AR(p): $y_t = c + \phi_1 y_{t-1} + \phi_2 y_{t-2} + \cdots + \phi_p y_{t-p} + u_t$
• ARCH(m)
– Autoregressive conditional heteroscedastic process of order m
– The square of $u_t$ follows an AR(m) process:
$u_t^2 = \zeta + \alpha_1 u_{t-1}^2 + \alpha_2 u_{t-2}^2 + \cdots + \alpha_m u_{t-m}^2 + w_t$
– $w_t$ is a new white noise process
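A simulation sketch of an ARCH(1) innovation sequence, using the standard representation $u_t = \sqrt{h_t}\, v_t$ with $h_t = \zeta + \alpha_1 u_{t-1}^2$ (parameter values are assumed; this representation is consistent with $u_t^2$ following an AR(1)):

```python
import numpy as np

rng = np.random.default_rng(6)
zeta, alpha, T = 0.2, 0.5, 200_000
v = rng.normal(size=T)                # i.i.d. N(0, 1) shocks
u = np.zeros(T)
for t in range(1, T):
    h = zeta + alpha * u[t - 1] ** 2  # conditional variance of u_t
    u[t] = np.sqrt(h) * v[t]          # ARCH(1) innovation

# u_t itself is serially uncorrelated, but u_t^2 is autocorrelated,
# which is the volatility clustering seen in the Fed Funds example:
d = u - u.mean()
d2 = u ** 2 - (u ** 2).mean()
print((d[1:] * d[:-1]).mean() / (d * d).mean())      # ~ 0
print((d2[1:] * d2[:-1]).mean() / (d2 * d2).mean())  # ~ alpha = 0.5
```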


