
Note that this model has a similar form to the multiple linear regression model. The difference is that in regression the variable of interest is regressed onto a linear function of other known (explanatory) variables, whereas here $X_t$ is expressed as a linear function of its own past values, hence the description "autoregressive". Since the values of $X_t$ at the $p$ previous time points enter the model, it is said to be an AR(p) model.
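
To make the definition concrete, here is a minimal sketch that simulates an AR(2) process. The coefficient values, the Gaussian white noise, and the zero initial conditions are illustrative assumptions, not choices made in the text.

```python
import numpy as np

# Simulate an AR(2) process:
#   X_t = phi1 * X_{t-1} + phi2 * X_{t-2} + e_t,   e_t ~ N(0, sigma^2)
# phi1, phi2, sigma, and N below are assumed values for illustration.
rng = np.random.default_rng(0)
phi1, phi2, sigma = 0.6, -0.3, 1.0   # a stationary pair of coefficients
N = 500

X = np.zeros(N)                      # X[0] = X[1] = 0 as initial values
for t in range(2, N):
    X[t] = phi1 * X[t - 1] + phi2 * X[t - 2] + rng.normal(0.0, sigma)
```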
Now we need to estimate the parameters $\phi_i$ for prediction. There are two common methods: least squares estimation and maximum likelihood estimation (MLE).
To calculate the least squares estimators, we need to minimize the following expression (here we let $p = 2$):
$$\sum_{t=1}^{N} \left( X_t - \phi_1 X_{t-1} - \phi_2 X_{t-2} \right)^2 \qquad (15)$$

with respect to $\phi_1$ and $\phi_2$. But since we do not have the information needed for $t = 1$ or $t = 2$, we assume here that $X_1$ and $X_2$ are fixed and exclude the first two terms from the sum of squares. That is, we minimize
$$\sum_{t=3}^{N} \left( X_t - \phi_1 X_{t-1} - \phi_2 X_{t-2} \right)^2$$
Then, a similar approach to linear regression is used to obtain the parameters.
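
As a concrete illustration, the sketch below carries out this conditional least squares fit for the AR(2) case by regressing $X_t$ on $(X_{t-1}, X_{t-2})$ for $t = 3, \ldots, N$. The function name and the use of NumPy's least squares solver are assumptions made for illustration.

```python
import numpy as np

# Conditional least squares for AR(2): drop t = 1, 2 and regress
# X_t on its two lags, exactly as in multiple linear regression.
def ar2_least_squares(X):
    y = X[2:]                                # responses X_3, ..., X_N
    Z = np.column_stack((X[1:-1], X[:-2]))   # columns: X_{t-1}, X_{t-2}
    phi, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return phi                               # estimates of (phi1, phi2)
```

Applied to the simulated series above, `ar2_least_squares(X)` should recover estimates close to the coefficients used to generate it.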


Maximum likelihood estimation (MLE) is attractive because ML estimators are generally asymptotically unbiased and have minimum asymptotic variance. Therefore, we introduce this method here.
Suppose that we have a sample of dependent observations $X_t$, $t = 1, \ldots, N$, each with pdf $f(X_t)$.
Then the joint density function is:
$$f(X_1, X_2, \ldots, X_N) = \prod_{t=1}^{N} f(X_t \mid \mathbf{X}_{t-1}) \qquad (16)$$
where $\mathbf{X}_t$ denotes all observations up to and including $X_t$, and $f(X_t \mid \mathbf{X}_{t-1})$ is the conditional distribution of $X_t$ given all observations prior to $t$.
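
For Gaussian errors, each factor $f(X_t \mid \mathbf{X}_{t-1})$ in (16) with $t \ge 3$ is a normal density centred at $\phi_1 X_{t-1} + \phi_2 X_{t-2}$, so conditioning on $X_1$ and $X_2$ gives a log-likelihood that can be maximized numerically. The sketch below assumes an AR(2) with Gaussian noise and uses a general-purpose optimizer; the parameterization through $\log \sigma$ is an illustrative device to keep the variance positive, not something prescribed by the text.

```python
import numpy as np
from scipy.optimize import minimize

# Conditional maximum likelihood for a Gaussian AR(2).  The objective
# is the negative of
#   sum_{t=3}^{N} log N(X_t; phi1*X_{t-1} + phi2*X_{t-2}, sigma^2),
# i.e. the log of the product in (16) after conditioning on X_1, X_2.
def ar2_mle(X):
    def neg_loglik(theta):
        phi1, phi2, log_sigma = theta
        sigma2 = np.exp(2.0 * log_sigma)     # variance stays positive
        resid = X[2:] - phi1 * X[1:-1] - phi2 * X[:-2]
        n = resid.size
        return 0.5 * (n * np.log(2.0 * np.pi * sigma2)
                      + np.sum(resid**2) / sigma2)

    res = minimize(neg_loglik, x0=np.zeros(3))
    phi1, phi2, log_sigma = res.x
    return phi1, phi2, np.exp(log_sigma)
```

For Gaussian errors, maximizing this conditional likelihood over $\phi_1$ and $\phi_2$ reproduces the conditional least squares estimates above, which is one reason the two methods are presented together.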
