Ending the microfoundations hegemony
Simon Wren-Lewis*
Abstract: The New Classical Counter Revolution changed macromodelling methodology and ushered in the hegemony of microfounded models. Yet, in reality, at any particular point in time there is a trade-off between data coherence and theoretical coherence, and insisting that all academically respectable modelling should be at one extreme of this trade-off is a major mistake. The paper argues that if more traditional structural econometric models (SEMs) had been developed alongside microfounded models, links between finance and the real economy would have been more thoroughly explored before the financial crisis, allowing macroeconomists to respond better to the Great Recession. Just as microfoundations hegemony held back macroeconomics on that occasion, it is likely to do so again. Macroeconomics will develop more rapidly in useful directions if microfounded models and more traditional SEMs work alongside each other, and are accorded equal academic respect.
Keywords: macroeconomic modelling, new classical counter revolution, microfoundations, structural
econometric models, empirical coherence
JEL classification: B22, B41, E10
I. Introduction
When Olivier Blanchard moved from being Director of the Research Department
of the International Monetary Fund to become a fellow of the Peterson Institute in
Washington, he published a piece on the future of DSGE modelling, inspired by preparations for this volume. It was very widely read, and comments led to a second follow-up piece. Reactions to that inspired a third in the series (Blanchard, 2017), which I think
it is fair to say startled many academic macroeconomists.
In this third piece, published in January 2017, he argued that there should be two types
of model: DSGE models, which he called theory models, and models that were much closer
to the data, which he called policy models. He described these two classes of model thus:
DSGE modelers should accept the fact that theoretical models cannot, and thus
should not, fit reality closely. The models should capture what we believe are the
macro-essential characteristics of the behaviour of firms and people, and not try
to capture all relevant dynamics. Only then can they serve their purpose, remain
simple enough, and provide a platform for theoretical discussions.
Policy modelers should accept the fact that equations that truly fit the data can have only a loose theoretical justification.
This was shocking for most academic macroeconomists. They had long argued that only DSGE models should be used to analyse policy changes, because only these models were internally consistent and satisfied the Lucas critique. Largely as a result of these claims, pretty well all academic papers analysing policy in the top academic journals use DSGE models. Even the Bank of England's core policy model was a DSGE model, and institutions like the Federal Reserve seemed old-fashioned in retaining what appears to be a Blanchard-type policy model as their main model. But here was a highly respected academic macroeconomist arguing that a different type of model should be used for policy analysis.
Blanchard’s position is one that I have been publicly advocating for a number of
years, and in this paper I want to give my own reasons why I think it is correct. I want to go substantially further, and argue that the failure of academic
macroeconomics to adopt an eclectic methodological position was the key reason why
it was so unprepared for the global financial crisis. If academics ignore Blanchard and
insist on continuing the microfoundations hegemony, this will hold back the development of macroeconomics just as it did before the financial crisis.
I start in Section II with a brief history of how we got here, focusing on the seminal contribution of Lucas and Sargent. Section III argues that microfoundations modelling is a progressive research programme, and that therefore, unlike most heterodox economists, I do not think it should be abandoned. But I also focus on the central flaw in the
claim that only DSGE models can give valid policy advice. The implication is that the
revolution inspired by Lucas and Sargent was regressive in so far as it argued for the
abandonment of more traditional ways of modelling the economy.
Section IV argues that if these more traditional modelling methods had not been abandoned following the New Classical Counter Revolution, but had been allowed to coexist with microfounded macromodels, then links between the financial sector and consumption behaviour in particular would have been explored before the global financial crisis. I draw on evidence from the UK, where what Blanchard calls policy models continued to be developed until 2000, and on the empirical work of Chris Carroll and John Muellbauer. This suggests that the microfoundations hegemony has had severe costs for macroeconomics, and could by implication have significant costs in the future. I would argue that
ending the microfoundations hegemony is the most important step macroeconomists can
take to improve their policy advice, and advance their discipline. Section V concludes by
looking at how the two types of models can not only coexist, but learn from each other.
One way to think about this is in terms of Pagan's well-known trade-off curve, which places models along a spectrum running from theoretical coherence at one end to data coherence at the other. A useful analogy is with medicine. At one end of the spectrum would be a treatment with strong evidence that it worked but no obvious biological justification, whereas at the other end of the spectrum would be a treatment based on biology but with, as yet, very little evidence that it worked.
The ideal, of course, is to have a model that is both consistent with microeconomic
theory and consistent with the data. At any particular point in time that is not achievable, either because we need more research or better ideas, or because other problems
(such as aggregation) make the ideal unachievable. As with medicine, the option of
waiting until this ideal is achieved is not available to policy-makers and those that
advise them.
If you accept this framework, then it is clearly quite legitimate to build models that
represent some kind of compromise between the extremes of DSGEs and VARs. It is
these models that I describe as SEMs. (The various names on Pagan's curve occupying this middle ground need not concern us here.) SEMs are likely to be attractive to policy-makers because they typically combine enough theory to 'tell a story' while giving the policy-maker confidence that the model's predictions are consistent with past evidence. To an academic, however, the compromises involved may make model building seem more like an art than a science.
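To give a flavour of such a compromise, consider a minimal sketch of an SEM consumption equation in equilibrium-correction form (the specification is purely illustrative, not one taken from the literature discussed here):

\[ \Delta c_t = \alpha\left(c^{*}_{t-1} - c_{t-1}\right) + \beta' \Delta x_t + \varepsilon_t, \qquad c^{*}_t = \gamma' z_t, \]

where \(c^{*}_t\) is a theory-motivated long-run level of consumption determined by variables \(z_t\) such as income and wealth, while the short-run dynamics in \(\Delta x_t\) are chosen largely on the grounds of what fits the data. The long-run term tells the story; the freely estimated dynamics provide the data coherence.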
The attraction of microfounded models is that their theoretical content is not just
plausible but rigorously derived. An academic paper containing a new DSGE model
will derive every equation from individual optimization, even though the derivation for
particular equations might have been seen before in countless other papers. Only by
doing this can the paper demonstrate internal consistency (with microeconomic theory)
for the model as a whole. But how do we know that such modelling is making progress
in explaining the real world?
The answer is puzzle resolution. The modeller will select a 'puzzle', which is essentially some feature of the data or of real-world behaviour that existing models cannot explain, and then show how an extension to the model's microfoundations can account for it. In practice, however, DSGE modellers, confronted with complex dynamics and the desire to fit the data, have often added devices such as external habit formation, costs of changing investment, or price indexation.
To use the language of Wren-Lewis (2011), these are ‘tricks’ that do not in fact sum-
marize extensive ‘off-model’ microfounded research.
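For concreteness, external habit formation, one of the devices just mentioned, typically enters by replacing period utility \(u(C_t)\) with

\[ u\!\left(C_t - h X_{t-1}\right), \qquad 0 < h < 1, \]

where \(X_{t-1}\) is lagged aggregate consumption taken as given by the household. The parameter \(h\) delivers the sluggish consumption dynamics needed to fit the data, but it is not itself derived from any deeper microfounded account of behaviour.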
That a research strategy is progressive is not in itself a reason or justification for hegemony. It is quite possible that an alternative modelling strategy might also be progressive, and the two could co-exist. The reason to suspect that might be the case with macroeconomics is that microfounded models tend to be very poor at fitting the data. SEMs,
in contrast, can (almost by definition) fit the data much better. Now it could be that the
reason for this is entirely spurious correlation, and that the microfounded model is simply
the best we can do. But this seems highly unlikely, in part because of our own experience with the development of microfounded models. The explanatory ability of these models today is far greater than that of the simple RBC models of the past. The additional explanatory power
of SEMs may simply represent a set of puzzles that has yet to be solved.
The response of most economists who use microfounded models to the idea that
SEMs could be an alternative to DSGE models is normally very simple. SEMs are
internally consistent only through luck and good judgement, are therefore subject to
the Lucas critique, and therefore cannot produce valid policy conclusions. But the counterargument is straightforward. Microfounded models, because they are unable to
match so much of the data (or at least much less than an SEM), are misspecified and
therefore cannot produce valid policy conclusions. They are implicitly ignoring real
world phenomena that we currently do not fully understand.
We can show this using a simple hypothetical example. Suppose we find that the fit of a DSGE model with a standard intertemporal consumption function can be improved by adding a term in unemployment to that equation. The term is significant both in statistical terms and in terms of how it influences model properties. In ad
hoc terms it is possible to rationalize such a term as reflecting a precautionary savings
motive, and as a result the term would be immediately included under SEM criteria.
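In the notation of a standard log-linearized model (introduced here purely to fix ideas), the hypothetical example amounts to replacing the usual consumption Euler equation

\[ c_t = E_t c_{t+1} - \sigma^{-1}\left(i_t - E_t \pi_{t+1} - \rho\right) \]

with an augmented version

\[ c_t = E_t c_{t+1} - \sigma^{-1}\left(i_t - E_t \pi_{t+1} - \rho\right) - \gamma u_t, \qquad \gamma > 0, \]

where \(u_t\) is unemployment and the term in \(\gamma\) is the statistically significant effect rationalized, informally, as precautionary saving. The first equation is what the DSGE model imposes; the second is what SEM criteria would admit.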
Now it is quite possible that the model including the unemployment term, which
would no longer be a DSGE model but which would be an SEM, is no longer internally
consistent and fails to satisfy the Lucas critique. A microfounded model of precautionary saving, were it to be developed, might be able to justify the existence of the unemployment term, but with certain restrictions which were absent from the SEM. But
the project to build such a model has yet to be undertaken, and will not bear fruit for
perhaps years. So the perfectly valid question to ask is which is the better model today at providing policy advice: the DSGE model which is internally consistent but excludes the unemployment term, or the SEM that includes it? There is no reason to believe that
the advantages of internal consistency outweigh the disadvantage of a likely misspecification relative to an SEM.1
1 I am sometimes amazed when I hear DSGE modellers say 'but all models are misspecified' as if that observation made the degree of misspecification irrelevant.
The response that academics should wait until some microfoundations are developed before terms of this kind are used once again ignores the needs of policy-makers, who cannot wait. Nor is exact internal consistency obviously the overriding virtue: approximate consistency with the data and with plausible theory may be better than the exact consistency of the New Keynesian DSGE model with bad theory.
2 Not all academic work on SEMs stopped after the NCCR (see the continuing work of Ray Fair). By
‘equivalent academic status’ I mean that this work was considered publishable in the top academic journals.
One way of building such an SEM would take as its starting point the reduced form of New Keynesian models, with a forward-looking dynamic IS curve and a New Keynesian Phillips curve, and would then add the dynamics and additional variables suggested by the data.
3 The Bank of England came close to this in its BEQM model that preceded its current DSGE model. However, it chose instead to create a core and periphery model structure, which created a degree of complexity that proved too much even for the Bank.
In this method of building an SEM, the role of the DSGE model is absolutely clear.
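As a purely illustrative sketch of that starting point (the notation is introduced only for this illustration), the reduced form of the basic New Keynesian model pairs a forward-looking IS curve with a New Keynesian Phillips curve:

\[ x_t = E_t x_{t+1} - \sigma^{-1}\left(i_t - E_t \pi_{t+1} - r^{n}_t\right) \]
\[ \pi_t = \beta E_t \pi_{t+1} + \kappa x_t \]

where \(x_t\) is the output gap, \(\pi_t\) inflation, \(i_t\) the nominal interest rate, and \(r^{n}_t\) the natural real rate. An SEM built in this way would retain this theoretical core while allowing the data to determine additional lags and variables in each equation.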
References
Aron, J., Duca, J. V., Muellbauer, J., Murata, K., and Murphy, A. (2010), ‘Credit, Housing Collateral
and Consumption: Evidence from the UK, Japan and the US’, Review of Income and Wealth, 58,
397–423.
Blanchard, O. (2017), 'The Need for Different Classes of Macroeconomic Models', Peterson Institute for International Economics, Washington, DC.