
Time Series Econometrics …

Steps to improved forecasts


Definition of Time Series
• A realization of a particular stochastic process is called a TIME SERIES.
• Every realized observation of a stochastic process is understood as a TIME SERIES.
TIME SERIES - Meaning
• Time Series deals with statistical data which are collected, observed
or recorded chronologically.
• A time series is a set of observations taken at specified times, usually
at equal intervals.
• A time series may be defined by the values Y1, Y2, … of a variable Y at times t1, t2, …
Issues related to Time Series are -

 UNDERSTANDING THE BEHAVIOUR

 ESTIMATION & FORECASTING

 ESTABLISHING THE RELATION


Some specific issues in Finance related to Time Series
 Forecasting – Share Prices, Interest Rates
 Estimating – Mean (Return) and Variance (Risk)
 Establishing the relation – Relation between spot rates and forward rates
Tools to identify the underlying behaviour of a Time Series

•Time Series Plot

•Correlogram

•Stationarity
Look at this graph:

REALIZATION OF A PROCESS THAT IS SUBJECT TO SHOCKS

Time Series –
Challenge #1

Understanding the
behaviour of a time
series!
TIME SERIES - AutoCorrelation
• 1. Sampling objects
• Cross-section: Population & Sample
• Time series: Process & Realisation
• 2. Information content in Order of data
• Cross-section: Design-based
• Time series: Chronological
• 3. Presence of serial correlation
• Cross-section: Rare
• Time series: Common (leads to inefficient estimates)
Correlogram
• A correlogram is a graph of the autocorrelations of a series at successive lags
• The autocorrelations are computed and plotted in a bar chart
• It is useful to have a benchmark to determine whether an autocorrelation is significantly large
• A simple rule is to consider only autocorrelations that are larger than the critical value of 2/√n in magnitude
• OR look at the output graph for spikes that exceed the critical value
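As a rough sketch of the rule above (using NumPy on a simulated white-noise series, so the seed, sample size, and number of lags are purely illustrative), the sample autocorrelations and the 2/√n benchmark can be computed as:

```python
import numpy as np

def acf(x, nlags):
    """Sample autocorrelations r_1 .. r_nlags of a series."""
    x = np.asarray(x, dtype=float)
    xd = x - x.mean()
    denom = np.sum(xd ** 2)
    return np.array([np.sum(xd[k:] * xd[:-k]) / denom for k in range(1, nlags + 1)])

rng = np.random.default_rng(0)
y = rng.standard_normal(500)          # white noise: no true autocorrelation
r = acf(y, nlags=10)
threshold = 2 / np.sqrt(len(y))       # rule-of-thumb band: +/- 2/sqrt(n)
significant = np.abs(r) > threshold   # spikes that cross the band
```

For a genuine white-noise series, only about 5% of the spikes should cross the band by chance; a correlogram with many large spikes suggests real serial dependence.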
Notion of Stationarity
Key basic Processes in Time Series
• White Noise Process
• Random Walk
• Random Walk with Drift
• Deterministic Trend Process
• Autoregressive Process
• Moving Average Process
White Noise Process
• In regression models, the error terms are generally assumed to arise from a white noise process
• A white noise process is one whose realizations are regarded as a sequence of serially uncorrelated random variables with zero mean and finite variance
• In simple words, a white noise process is one where the sequence of error terms is serially uncorrelated with zero mean and constant variance
• The simplest representation of a white noise process is when the realizations have the same probability distribution, e.g. are independent and identically distributed
White Noise Attributes
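A minimal sketch of these attributes, simulating an i.i.d. Gaussian white noise process with NumPy (seed and sample size are illustrative) and checking zero mean, constant finite variance, and negligible serial correlation:

```python
import numpy as np

rng = np.random.default_rng(42)
eps = rng.normal(loc=0.0, scale=1.0, size=100_000)  # i.i.d. draws, sigma = 1

mean = eps.mean()                                   # should be close to 0
var = eps.var()                                     # should be close to sigma^2 = 1
lag1_corr = np.corrcoef(eps[:-1], eps[1:])[0, 1]    # should be close to 0
```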
Random Walk with Deterministic Trend Process
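The random-walk family listed above can be sketched in a few lines of NumPy (the drift delta = 0.5 and trend slope beta = 0.5 are hypothetical values chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)
n, delta, beta = 1000, 0.5, 0.5
eps = rng.standard_normal(n)

random_walk = np.cumsum(eps)                  # y_t = y_{t-1} + e_t
rw_with_drift = np.cumsum(delta + eps)        # y_t = delta + y_{t-1} + e_t
trend_process = beta * np.arange(n) + eps     # y_t = beta*t + e_t (trend-stationary)
```

Note the difference: the first two are non-stationary in levels (the drift version wanders upward on average), while the deterministic-trend process is stationary around its trend line.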
Appreciating ways of modeling a time series!
• A Time Series may be a function of time!
 It has a trend! Yt = f(t)! Note the difference!

• A Time Series may be a function of its own history!
 The past determines the FUTURE! Yt = f(Yt-1)! Note the difference!

• A Time Series may be a function of some exogenous variables!
 Independent factors determine it! Yt = f(Xt)! Note the difference!
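The three modeling choices above can each be fitted by least squares. A sketch with NumPy on a simulated series (the data-generating parameters are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
t = np.arange(n)
x = rng.standard_normal(n)
y = 2.0 + 0.05 * t + 0.8 * x + rng.standard_normal(n) * 0.1  # simulated series

# 1) Function of time:        Yt = a + b*t   (a trend model)
b_trend, a_trend = np.polyfit(t, y, 1)

# 2) Function of own history: Yt = a + b*Y(t-1)   (an AR(1) regression)
A = np.column_stack([np.ones(n - 1), y[:-1]])
(a_ar, b_ar), *_ = np.linalg.lstsq(A, y[1:], rcond=None)

# 3) Function of exogenous X: Yt = a + b*Xt
B = np.column_stack([np.ones(n), x])
(a_x, b_x), *_ = np.linalg.lstsq(B, y, rcond=None)
```

The same data can be described by all three specifications; which one is appropriate depends on the behaviour the earlier tools (plot, correlogram, stationarity) reveal.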
These charts show that these time series are not stationary.

[Figure: time series plot of x(t) against Time, with values ranging from about -100 to 600 over roughly 5,000 observations]
What if … you perform regression analysis on a non-stationary time series? … You will be in trouble!
You may get a spurious regression !!
Examples of spurious regression !!
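The classic demonstration can be sketched with NumPy: regress one random walk on another, independent random walk. Even though the two series are unrelated by construction, the regression often reports a surprisingly high R² (the seed and sample size below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1000
y = np.cumsum(rng.standard_normal(n))   # random walk 1
x = np.cumsum(rng.standard_normal(n))   # random walk 2, independent of y

# OLS of y on a constant and x
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
r_squared = 1 - resid.var() / y.var()   # often large despite no true relation
```

Differencing both series before regressing (i.e. working with stationary changes rather than non-stationary levels) typically collapses the apparent fit, which is the standard remedy.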
Testing Violations in Regression
Assumptions of Regression Model
Our discussion on the violation of
assumptions will be as per the following
structure:

• What it is?
• What are its consequences?
• How to detect it?
• How to take care of the violation?
HETEROSCEDASTICITY
The Variance of the Error Term in a
regression model is not allowed to vary!!!
• It is assumed that the Error Term has a constant variance, and this assumption is known as the Assumption of HOMOSCEDASTICITY.
• It means that as the observations change, the variance of the error term should not change: it should remain constant and show no tendency to trend in any direction.
Possible causes of heteroscedasticity
• Measurement error can cause heteroscedasticity
• Missing variables: misspecification of a regression model may also result in heteroscedasticity
• Sampling strategies can produce heteroscedasticity
• It is also quite common in cross-sectional data
• Heteroscedasticity can arise as a result of the existence of outliers
• Skewness in the distribution of one or more regressors may cause the variance of the error term to vary
Consequences of heteroscedasticity
• Heteroscedasticity leads to a very serious problem: our estimated standard errors are wrong.
• The estimators are no longer EFFICIENT!
• Our conclusions from hypothesis testing and estimation may be WRONG!
How to Detect heteroscedasticity ?
• One Method is – Graphical Method: Take a scatter plot of residuals
against the explanatory variables or predicted values
Detection of heteroscedasticity: GRAPHICAL
Other methods to detect heteroscedasticity
1. White's Test
2. The Breusch-Pagan Test
3. The Harvey-Godfrey Test
4. The Goldfeld-Quandt Test
5. The Glejser Test
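The core idea of the Breusch-Pagan test can be sketched by hand in NumPy: regress the squared OLS residuals on the regressors; under homoscedasticity, LM = n·R² from that auxiliary regression is asymptotically chi-squared with k degrees of freedom (here k = 1, with a 5% critical value of 3.84). The simulated data and seed are illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 2000
x = rng.uniform(1.0, 10.0, n)
e = rng.standard_normal(n) * x          # error variance grows with x
y = 1.0 + 2.0 * x + e

# Step 1: OLS and squared residuals
X = np.column_stack([np.ones(n), x])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
u2 = (y - X @ b) ** 2

# Step 2: auxiliary regression of u^2 on the regressors
g, *_ = np.linalg.lstsq(X, u2, rcond=None)
r2_aux = 1 - ((u2 - X @ g) ** 2).sum() / ((u2 - u2.mean()) ** 2).sum()
lm_stat = n * r2_aux                    # compare with chi2(1) 5% critical value 3.84
```

With strongly heteroscedastic errors like these, the LM statistic is far above 3.84, so the null of homoscedasticity is rejected.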
How to take care of such a violation?
I. One approach is to derive consistent estimates of the standard errors using White Heteroscedasticity-Consistent Standard Errors and Covariance (the coefficients don't change)
II. A second approach is to use the Generalized (or Weighted) Least Squares method (the coefficients may change)
III. A third approach is to correct the specification:
 a) Deflate variables, e.g. by taking the log of the dependent variable
 b) By taking the reciprocal of variables, one can deflate the variables and avoid heteroscedasticity
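Approach I can be sketched directly in NumPy. The OLS coefficients are unchanged; only the covariance estimate changes, using White's (HC0) sandwich formula Var(b) = (X'X)⁻¹ X' diag(u²) X (X'X)⁻¹. The simulated data are illustrative:

```python
import numpy as np

rng = np.random.default_rng(9)
n = 1000
x = rng.uniform(1.0, 5.0, n)
y = 1.0 + 2.0 * x + rng.standard_normal(n) * x   # heteroscedastic errors

X = np.column_stack([np.ones(n), x])
XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ X.T @ y                  # OLS coefficients (unchanged by the correction)
u = y - X @ b

# Classical standard errors (wrong under heteroscedasticity)
sigma2 = u @ u / (n - 2)
se_ols = np.sqrt(np.diag(XtX_inv) * sigma2)

# White HC0 robust standard errors: the "sandwich" estimator
cov_hc0 = XtX_inv @ (X.T * u ** 2) @ X @ XtX_inv
se_hc0 = np.sqrt(np.diag(cov_hc0))
```

The two sets of standard errors differ noticeably here, which is exactly why hypothesis tests based on the classical ones can mislead.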
AUTOCORRELATION
Error Terms in a regression model are not allowed
to have any relationship : Autocorrelation
• It is assumed that the Error Terms have no correlation among themselves; that is, they are independent
• It means that the covariance among them is zero!!!

If the Error Terms are correlated, or have non-zero covariance among them, then we have a problem of SERIAL CORRELATION or AUTOCORRELATION !!!
Possible Reasons of Autocorrelation
1. Omitted Explanatory Variables
2. Mis-specification of the Mathematical form of the model
3. Over-reactions and ‘inefficiencies’ in financial markets
4. Inertia of the dependent variable

• The autocorrelation problem is observed more often in Time Series data


Consequences of Autocorrelation
• The estimates of the standard errors become biased and unreliable when autocorrelation is present.
• This leads to problems in hypothesis testing about the estimators and other statistics, and in confidence intervals.
How to Detect autocorrelation ?
• One method is the Graphical Method: plot residuals against observations, or against time if the series is a Time Series
• Scatter plot of residuals (et) against lagged residuals (et-1)

Statistical methods to detect
Autocorrelation
1. The Durbin-Watson test
• It is a test of autocorrelation of the first order.
• No autocorrelation is its null hypothesis.

OTHER methods to detect autocorrelation
2. The Breusch-Godfrey Serial Correlation LM Test
3. The Runs Test
The Durbin-Watson test (d)
• It is a test of autocorrelation of the first order, i.e. only et-1 is considered when testing for autocorrelation
• No autocorrelation is its null hypothesis
• But d has no unique critical value
• The sample size (n) and the number of regressors (k) are used to calculate upper (dU) and lower (dL) bounds that determine the rejection regime
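The statistic itself is simple to compute: d = Σ(et − et−1)² / Σet². It is about 2 when there is no first-order autocorrelation, below 2 for positive autocorrelation, and above 2 for negative autocorrelation. A NumPy sketch on simulated residuals (seed and AR coefficient are illustrative):

```python
import numpy as np

def durbin_watson(resid):
    """d = sum((e_t - e_{t-1})^2) / sum(e_t^2)."""
    resid = np.asarray(resid, dtype=float)
    return np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)

rng = np.random.default_rng(11)
n = 2000
white = rng.standard_normal(n)          # uncorrelated residuals

ar1 = np.empty(n)                       # AR(1) residuals: e_t = 0.8*e_{t-1} + v_t
ar1[0] = white[0]
for t in range(1, n):
    ar1[t] = 0.8 * ar1[t - 1] + rng.standard_normal()

d_white = durbin_watson(white)          # roughly 2: no autocorrelation
d_ar1 = durbin_watson(ar1)              # well below 2: positive autocorrelation
```

A useful approximation is d ≈ 2(1 − ρ), so ρ = 0.8 gives d near 0.4; the computed value must still be compared against the dL and dU bounds for the given n and k.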
How to take care of autocorrelation ?
• Ignore it and use OLS
• Use a lagged term in the model
• Use the Generalised Least Squares (GLS) procedure for pure autocorrelation (not due to specification errors)
• Use the HAC Newey-West method, though it is more suitable for large samples
MULTICOLLINEARITY
What does MULTICOLLINEARITY impact?
• If the inter-correlation among the explanatory variables (X1, X2, …) is PERFECT, then the estimates of the coefficients become indeterminate
• The standard errors become infinitely large
• Ordinary Least Squares estimators will still be unbiased even if we have multicollinearity
• If the correlation between the independent variables is higher than the correlation between the dependent and independent variables, then multicollinearity is likely
What does MULTICOLLINEARITY impact?
• Estimates become unstable and unpredictable; that’s to say, when
multicollinearity is present, the estimated coefficients are unstable in
the degree of statistical significance, magnitude and sign

• R2 becomes very high though the coefficients are not significant!!!!

• The OLS estimators and their standard errors can be sensitive to small
changes in data.
TOOLS to identify MULTICOLLINEARITY
• Test the Correlation Matrix among all X's for significance
• High R2 but few significant t-ratios
• Tolerance and Variance Inflation Factor (VIF):
 • A centered VIF > 10 indicates severe multicollinearity
 • Tolerance is the reciprocal of VIF
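The VIF can be sketched by hand in NumPy: for each regressor Xj, regress it on the remaining regressors, then VIFj = 1 / (1 − Rj²) and Tolerancej = 1 / VIFj. The three simulated regressors below (two nearly collinear, one unrelated) are illustrative:

```python
import numpy as np

def vif(X):
    """VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing X_j on the other columns."""
    X = np.asarray(X, dtype=float)
    n, k = X.shape
    out = []
    for j in range(k):
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        b, *_ = np.linalg.lstsq(others, X[:, j], rcond=None)
        resid = X[:, j] - others @ b
        r2 = 1 - resid.var() / X[:, j].var()
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

rng = np.random.default_rng(21)
x1 = rng.standard_normal(500)
x2 = x1 + rng.standard_normal(500) * 0.05    # nearly collinear with x1
x3 = rng.standard_normal(500)                # unrelated regressor

vifs = vif(np.column_stack([x1, x2, x3]))
tolerance = 1.0 / vifs
```

Here the two collinear regressors show VIFs far above the rule-of-thumb cutoff of 10, while the unrelated regressor's VIF stays near 1.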
Solutions to address Multicollinearity …
• Ignore it if the correlations are less than 0.9
• Drop all but one of the variables that are causing the correlations
• Transform the variables
• Identify the underlying factors
• Use factor analysis if there are several highly correlated independent variables