
ECONOMETRIC METHODS II: TIME SERIES SYLLABUS 2014

KRISTOFFER P. NIMARK

Class time and place: Wednesdays and Fridays 9.00-11.00 (20.017)
TA sessions: Fridays 12.00-13.00 (Room TBA)
Office hours: Thursdays 12.00-13.00, Office 23.408 (Wellington building)
Email address: knimark@crei.cat
Course website: www.kris-nimark.net

Overview
This course aims to equip students with the tools needed to produce applied macroeconomic research. Most of the material can be found in Time Series Analysis by James D. Hamilton (Princeton University Press, 1994), Bayesian Econometrics by Gary Koop (Wiley-Interscience, 2003) and John Cochrane's lecture notes Time Series for Macroeconomics and Finance, but reading articles may also be required.

Administrative matters
Grades will be based on the final exam (60%) and three homework assignments (2 × 10% + 1 × 20%).

Lecture 1 Introduction.
• What is Time Series Statistics and what is it good for?
• Philosophical issues related to probability and statistics
  – Why do we need probabilistic models to describe the world?
  – Is there true randomness in nature? Does it matter?
• Course overview
• Basics
  – Matrix algebra
  – Stochastic processes
• Difference Equations
• Lag operators
Readings: Appendix A.4-A.5 and Ch. 1-2 in Hamilton.
Date: April 1, 2014.
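
The following Python sketch is a minimal illustration of the objects Lecture 1 introduces: a stochastic process generated by a linear stochastic difference equation, written with the lag operator as (1 − φL)y_t = ε_t. The AR(1) coefficient, innovation variance and sample size below are arbitrary choices made purely for the example.

```python
import numpy as np

def simulate_ar1(phi=0.9, sigma=1.0, n=200, seed=0):
    """Simulate the AR(1) process y_t = phi * y_{t-1} + eps_t, eps_t ~ N(0, sigma^2)."""
    rng = np.random.default_rng(seed)
    eps = rng.normal(0.0, sigma, n)
    y = np.zeros(n)
    for t in range(1, n):
        # stochastic difference equation: (1 - phi*L) y_t = eps_t
        y[t] = phi * y[t - 1] + eps[t]
    return y

y = simulate_ar1()
print(y[:5])  # first few simulated observations
```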

Lecture 2-3 Stationary ARMA processes.
• Basics of stochastic processes
• MA, AR and White Noise Processes
• Auto Covariance Function
Readings: Ch. 3 in Hamilton and Ch. 3-4 in Cochrane.

Lecture 4 Estimation of Linear Models. Prediction and Forecasting.
• Maximum likelihood
• Least squares
• Linear projections
• Conditional expectations
• Prediction and ARMA processes
• Interval forecasts
Readings: Ch. 4, 5 and 8 in Hamilton, Ch. 3 in Cochrane and lecture notes on the Projection Theorem.

Lecture 5-7 Vector Auto Regressions (VARs).
• Estimation
  – Lag specification
  – Hypothesis testing
• Identification
  – Contemporaneous (Cholesky) restrictions
  – Long-run (Blanchard-Quah) restrictions
  – Sign restrictions
• Variance decompositions
• Impulse response functions
Readings: Ch. 9-11 in Hamilton and Ch. 7 in Cochrane + articles.

Lecture 8 Principal Components and FAVARs.
• Dimension reduction
• Dynamic Factor Models
• Forecasting
Readings: Stock and Watson (2010) and Bernanke, Boivin and Eliasz (2005).

Lecture 9-10 Unit Roots and Cointegration.
• Unit roots
• Cointegration
Readings: the unit root and cointegration chapters in Hamilton and Cochrane.

Lecture 11-13 State Space Models and the Kalman Filter.
• Latent variables
• State Space Models
• The Kalman Filter
• Numerical optimization
• DSGE models
• Latent Factor models
• Smoothing
Readings: Ch. 5.7 and Ch. 13 in Hamilton, lecture notes on the Kalman Filter and Goffe et al (1994).
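
As an illustrative sketch of the filtering recursions covered in Lectures 11-13, the Python code below runs the Kalman filter on a local level (unobserved component) model, y_t = μ_t + ε_t with μ_t = μ_{t-1} + η_t. The variance parameters, initial conditions and simulated data are hypothetical and chosen only for the example.

```python
import numpy as np

def kalman_filter_local_level(y, sigma_eps=1.0, sigma_eta=0.5, mu0=0.0, p0=10.0):
    """Kalman filter for y_t = mu_t + eps_t, mu_t = mu_{t-1} + eta_t (local level model)."""
    n = len(y)
    mu_filt = np.zeros(n)   # filtered estimates of the latent level mu_t
    p_filt = np.zeros(n)    # filtered state variances
    loglik = 0.0
    mu_pred, p_pred = mu0, p0
    for t in range(n):
        v = y[t] - mu_pred                  # one-step-ahead prediction error
        f = p_pred + sigma_eps ** 2         # prediction error variance
        loglik += -0.5 * (np.log(2 * np.pi * f) + v ** 2 / f)
        k = p_pred / f                      # Kalman gain
        mu_filt[t] = mu_pred + k * v        # updating step
        p_filt[t] = p_pred * (1.0 - k)
        mu_pred = mu_filt[t]                # prediction step for t + 1
        p_pred = p_filt[t] + sigma_eta ** 2
    return mu_filt, p_filt, loglik

# simulate data from the model and filter it (all parameter values are illustrative)
rng = np.random.default_rng(1)
level = np.cumsum(rng.normal(0.0, 0.5, 100))
data = level + rng.normal(0.0, 1.0, 100)
mu_hat, _, ll = kalman_filter_local_level(data)
print(round(ll, 1))
```

The same recursion also delivers the log likelihood, which is what numerical optimization (the next bullet in Lectures 11-13) would maximize over the variance parameters.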

Lecture 14 Introduction to Bayesian Statistics.
• What is Bayesian statistics?
• Objects of interest: prior, posterior, marginal likelihood, posterior odds ratio
• Bayesian Computation
Readings: Koop Ch. 1-2.

Lecture 15 Linear Regression Model with Natural Priors.
• Combining natural priors with sample information
• Model Comparison
• Monte Carlo integration
Readings: Koop Ch. 2-3.

Lecture 16 Simulating posterior distributions.
• Gibbs Sampling
• Inequality constraints
Readings: Koop Ch. 4.

Lecture 17 Non-linear models.
• Metropolis-Hastings Algorithm
• Model fit
Readings: Koop Ch. 5.

Lecture 18-19 Bayesian estimation of State Space Models.
• Local level/Unobserved component model
• DSGE models
• Prior Predictive Analysis
• Bayesian Model Averaging
Readings: Koop Ch. 8 and 11 and An and Schorfheide (2007).

Lecture 20 Course review.

References
[1] Cochrane, John, Time Series for Macroeconomics and Finance, 2005. http://faculty.chicagobooth.edu/john.cochrane/research/Papers/Time Series Book.pdf
[2] Hamilton, James D., Time Series Analysis, Princeton University Press, 1994.
[3] Koop, Gary, Bayesian Econometrics, Wiley-Interscience, 2003.
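
As a closing illustration of the posterior-simulation methods listed under Lectures 16-17, here is a minimal random-walk Metropolis-Hastings sketch in Python. The target density is a toy stand-in (a standard normal log posterior rather than a real model's posterior), and the step size and number of draws are arbitrary choices for the example.

```python
import numpy as np

def random_walk_mh(log_post, theta0, n_draws=5000, step=0.5, seed=0):
    """Random-walk Metropolis-Hastings sampler for a scalar parameter."""
    rng = np.random.default_rng(seed)
    draws = np.empty(n_draws)
    theta, lp = theta0, log_post(theta0)
    accepted = 0
    for i in range(n_draws):
        prop = theta + step * rng.normal()        # symmetric random-walk proposal
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:  # accept with probability min(1, ratio)
            theta, lp = prop, lp_prop
            accepted += 1
        draws[i] = theta
    return draws, accepted / n_draws

# toy target: a standard normal log posterior stands in for a real model's posterior
draws, acc_rate = random_walk_mh(lambda th: -0.5 * th ** 2, theta0=0.0)
print(round(draws.mean(), 2), round(acc_rate, 2))
```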