
Policy & Politics • vol XX • no XX • 1–17 • © Policy Press 2021

Print ISSN 0305-5736 • Online ISSN 1470-8442 • https://doi.org/10.1332/030557321X16225469993170


Accepted for publication 01 June 2021 • First published online 05 July 2021
This article is distributed under the terms of the Creative Commons Attribution
4.0 license (http://creativecommons.org/licenses/by/4.0/) which permits
adaptation, alteration, reproduction and distribution for non-commercial use,
without further permission provided the original work is attributed. The derivative works do not need to
be licensed on the same terms.

article
The neglected politics behind evidence-based
policy: shedding light on instrument constituency
dynamics
Arno Simons, arno.simons@posteo.de
German Centre for Higher Education Research and Science Studies (DZHW),
Germany
Humboldt-Universität zu Berlin, Germany

Alexander Schniedermann, schniedermann@dzhw.eu
German Centre for Higher Education Research and Science Studies (DZHW),
Germany

Puzzled by the question of why evidence-based policy (EBP) thrives despite evidence against it, we
reconstruct the development and spread of EBP in inter- and transnational contexts and find that
this process is characterised by some of the same dynamics (including ‘structural promises’ and
‘problem chasing’) that have also been observed in many policy instruments. We therefore propose
a double reframing: EBP is (1) a ‘meta-instrument’ aiming to establish a particular role for research
in policymaking (our ideational reframing) and (2) co-evolving with an ‘instrument constituency’
motivated not only by normative goals but also by the prospect of securing an occupational niche
for itself (our social reframing). Taken together, these reframings reveal the neglected politics
behind EBP and prompt us to treat EBP as a political device rather than as an analytical framework
to explain how policymaking actually relates to research.

Key words evidence-based policy • instrument constituencies • science–policy nexus •
expertise • policy instruments • meta-instruments • multiple streams framework •
science and technology studies (STS)

To cite this article: Simons, A. and Schniedermann, A. (2021) The neglected politics behind
evidence-based policy: shedding light on instrument constituency dynamics, Policy & Politics,
vol XX, no XX, 1–17, DOI: 10.1332/030557321X16225469993170


Introduction

For over three decades, scholars have fiercely debated the prospects and challenges of
‘evidence-based policy’ (EBP).  A series of recent articles in Policy & Politics on evidence
in policymaking and the role of experts (Fleming and Rhodes, 2018; French, 2019;
Christensen, 2020; Sayer, 2020; Straßheim 2020) underscores the current interest in
the topic.While EBP seems to gain momentum, it has become apparent that attempts
to implement EBP into practice have not yielded the promised results.
The evidence on EBP is not at all in its favour. According to a recent systematic
review of nearly 400 EBP publications, the bulk of empirical studies comes to the
conclusion that ‘the successful use of research in policy-making is no straight-forward
matter’ (French, 2018: 435), because ‘[s]cience and politics are intimately entwined
in policy-making’ (431), ‘[a]daptability, insight into others, stamina, persuasiveness,
and self-mastery matter far more than analytical capacity’ (430), and ‘[t]he proportion
of organised knowledge relative to other forms of information in the best policy-
making processes is more modest than proponents of EBP imagine' (433). This shows
‘how shallow the central dogma of the EBP movement – that major improvements
in policy-making would result if only governments took research seriously – really
is’ (436). Earlier reviews have arrived at similar conclusions. Innvaer et al (2002: 239)
conclude that ‘there is, at best, only limited support for any of the many opinions
put forward in the literature on the use of research evidence by policy-makers’ (see
Oliver et al, 2014).
Is it time, then, to give up on EBP, as French (2019) asks? The critics say yes. While
not all of them reject EBP’s core dogma that more evidence is generally good for
policymaking, many have contested EBP’s definition of evidence and the claim that
evidence should play a major role in the policy process, arguing instead that EBP may
afford ‘policy-based evidence’, characterised by ‘modes of black-boxing, knowledge
monopolisation, blame avoidance and over-simplification' (see Greenhalgh and
Russell, 2009; Straßheim and Kettunen, 2014: 260). In sharp contrast, the majority of
proponents would defend the claim that policymaking should be based on evidence
to the benefit of society as a whole. However, some proponents acknowledge ‘the
irony… that the evidence base confirming any benefits (or indeed, dysfunctions) of
an evidence-based approach … is actually rather thin’ (Nutley et al, 2006: 2), and
they hope that EBP can be ‘reformed’ or ‘reinvented’ (French, 2019). Overall, it seems
that proponents and critics take ‘fundamentally different philosophical perspectives’
(Boaz et al, 2008: 239) and ‘rarely engage with each other’s arguments’ (Newman,
2017: 1). Some proponents have sensed ‘open warfare’ (Porter and O’Halloran, 2009:
742). Others pride themselves that ‘despite the emergence of a significant critical
literature … the “evidence-based practice” juggernaut rumbles on, largely undeterred’
(Nutley et al, 2007: 17).
What creates and maintains the momentum of EBP? Why does EBP thrive when
its flaws have been pointed out again and again? The key to solving this puzzle, we
contend, lies in a double reframing of EBP as (1) a (procedural) meta-instrument that
co-evolves with and is driven by (2) an instrument constituency. Our ontological
claim is the following: While EBP is not a policy instrument in the traditional sense –
but rather a ‘meta-instrument’ that frames the use of other instruments on the basis of
a normative programme – its development and spread are powered by some of the
same fundamental dynamics that underlie the development and spread of many
policy instruments (Simons and Voß, 2018; 2017). It is these dynamics that we call
the neglected politics behind EBP.
Our normative claim is that, since EBP is a meta-instrument, it should not be
misunderstood as a scientific theory of real-world policymaking. EBP embodies a
distinct vision of how policymaking should be done, namely that it should be ‘based on
evidence’. However, we cannot use this framework to understand how policymaking
actually relates to research – and how this relationship is mediated by meta-instruments
and their constituencies. To reach such an understanding, we should instead turn to
analytical frameworks and taxonomies, preferably those that build on insights from
the science and technology studies (STS) literature.
In the following section, we introduce the notions of meta policy instrument and
instrument constituency as two revised framings of EBP. Then, we devote three sections
to the reconstruction of the development and spread of EBP, from its UK origins through
its international spread to its transnational theorisation. Finally, we discuss a number
of wider implications of our theoretical and empirical contributions.

Two revised framings of EBP


What is EBP ontologically? Given the broad interest in the topic and the intensity
of the debate, the EBP literature has surprisingly little to say on that matter. EBP
is most often referred to either in ideational terms (as an ‘idea’, an ‘approach’, a
‘framework’) or in social terms (as a ‘project’, a ‘community’, a ‘movement’), but rarely
is it conceptualised any further. We propose that both framings can be sharpened
by reframing EBP as a procedural meta-instrument co-evolving with and pushed
by an instrument constituency. To do this, we draw from a recent, STS-informed
strand in the policy instrumentation literature (Lascoumes and Le Gales, 2007; Voß
and Freeman, 2015).
The ideational framing of EBP addresses the latter’s normative and prescriptive
content, that is, its ‘effort to reform or re-structure policy processes in order to
prioritise evidentiary or data-based decision-making’ (Howlett, 2009a: 153). In this
regard, EBP is like a policy instrument: ‘a device that is both technical and social,
that organizes specific social relations between the state and those it is addressed to,
according to the representations and meanings it carries’ (Lascoumes and Le Gales,
2007: 4).
More specifically, EBP aims to reorganise the relationship between policy and
science – the science–policy nexus – according to specific assumptions about the role of
research in the policy process. According to Jung et al (2014), any reorganisation of the
science–policy nexus operates on one, two, or all of the following three levels: 1) the
generation and communication of science-based expertise, 2) the regulation of the science–policy
nexus, and 3) reflexive discourses in which both the epistemic authority and the political
relevance of expertise are continually (de-)legitimised and renegotiated. We use these
levels as a search heuristic for our case study and will refer to them as ‘levels 1–3’.
EBP is not an instrument in the ordinary sense. First, it is ‘procedural’ since it
‘informs the policy process, rather than aiming to directly affect the eventual goals
of the policy’ (Sutcliffe and Court, 2005: iii; see Howlett et al, 2018). Second, EBP
works as an umbrella or frame for the use of other instruments believed to increase
the use and quality of research for policy, such as randomised controlled trials (RCTs),
systematic reviews, knowledge brokering, and regulatory impact assessments (RIA).

We therefore propose reframing EBP as a procedural meta-instrument. A meta policy
instrument can be defined as a prescriptive model for policymaking that frames the use of
particular instruments to achieve a set of substantial or procedural goals on the basis of distinct
assumptions about the nature of the policy process. EBP is procedural in a double sense: the
meta-instrument is procedural and so are the instruments it frames.
The social framing of EBP is invoked in statements such as that ‘the EBP
movement gathers momentum’ (Young et al, 2002: 216) and that the former ‘must
be anchored by powerful and effective institutions that play clearly delineated roles
in the movement’ (Haskins, 2018: 22). To better capture this social aspect of EBP,
we propose applying the instrument constituency framework, ‘an exciting new
approach for understanding … the creation of policy instruments’ (Weible, 2018: 72).
This framework re-conceptualises instruments in terms of their ‘social life’, that is,
regarding the actors and practices that develop and spread these instruments (Simons
and Voß, 2018; 2017).
We define the EBP constituency as the network of actors and practices oriented towards
developing, maintaining, and expanding EBP, both as a theoretical model and as implemented
policy practice. We further assume that the constituency is dynamically structured
by functional and structural promises (Simons and Voß, 2018; 2017). Functional
promises refer to the ability of instruments to achieve normative public goals, such
as ‘better government’ in the case of EBP. They help to enrol governments and
stakeholders into adopting EBP measures. Behind the scenes, ‘unofficial’ functional
promises such as ‘depoliticisation’ or ‘blame avoidance’ (Clarence, 2002) may play
a role as well. Structural promises are less salient. They refer to the opportunities,
roles, and positions in a world in which EBP thrives, both as a theoretical model
and as implemented practice. Consider all the new research institutes, government
agencies, think tanks, consultancies, and advisory positions that have been created
in the name of EBP.
To the extent that structural promises align individual interests towards the
development, retention, and expansion of EBP, we can expect the constituency
to begin self-organising as a collective actor (Simons and Voß, 2018; 2017). This
produces a self-reflexive discourse about shared interests leading to the establishment
of organisational structures that cater to the needs of constituency members
(associations, information platforms, training programmes). During the agenda
setting and formulation stages of the policy process, the constituency links EBP as
a solution to problems the constituency itself helps to frame, such as ‘priesthood’
and ‘ideology’ (Solesbury, 2001). Such ‘problem chasing’ has long been described
for individual ‘policy entrepreneurs’ (Kingdon, 2003[1984]; Cairney, 2018). Only
recently, it has also been understood as a much more networked activity of instrument
constituencies (Voß and Simons, 2014; Mukherjee and Howlett, 2015; Simons and
Voß, 2015; Béland and Howlett, 2016; Simons and Voß, 2018; 2017). Whenever EBP
is adopted in a given jurisdiction, the constituency will offer its expertise
during the implementation and evaluation stages, not least in anticipation of securing
long-term commitments in the day-to-day administration of EBP initiatives (Simons
and Voß, 2018; 2017).
We hope to show in the following case sections that the proposed double reframing
of EBP generates new insights into the dynamics of EBP’s development and spread.


UK origins

The term ‘EBP’ was coined around the mid-1990s by proponents of ‘evidence-based
medicine’ (EBM), a network of North American and British epidemiologists who
advocated ‘a new paradigm for medical practice’ (Guyatt et al, 1992) emphasising
the production and use of research synthesis as a means to extract ‘the evidence’
from all available research findings on a given topic (Ham et al, 1995; Sackett et al,
1996). EBM is itself a ‘meta-instrument’ – albeit for practice – since it frames the
use of systematic reviews, meta-analyses and other ‘evidence tools’ to reorganise the
generation and communication of science-based expertise (level 1 in Jung et al’s
2014 model), shifting the emphasis from research ‘findings’ to ‘evidence’. By virtue
of being linked to functional and structural promises, EBM also gathered its own
constituency, a new ‘evidence movement’ (Hansen and Rieper, 2009), whose first big
success was the foundation of the Cochrane Collaboration in the UK ‘to prepare,
maintain, and disseminate systematic, up-to-date reviews of RCTs of health care’
(Chalmers, 1993: 158). Since then, Cochrane has become an international flagship
of the EBM and later the EBP constituency, and it has served as a role model for the
institutionalisation of evidence synthesis in health care and beyond.
Chasing after problems for which the new meta-instrument could be pitched as a
solution, the EBM constituency expanded during the second half of the 1990s into
other disciplines as well as into the policy domain. Actors from non-medical backgrounds became enrolled
by the functional and structural promises linked to the new meta-instrument: its
potential both for policy change in the actors’ desired direction (functional promises)
and for securing new opportunities for themselves (structural promises). A prominent
example of the latter is the presidential address of Adrian Smith to the Royal Statistical
Society in 1996. Looking for a way to strengthen the society’s ‘profile and image’,
Smith (1996: 369–70) argued:

But what’s so special about medicine?… Perhaps there is an opportunity here
for the Society – together with appropriate allies in other learned societies
and the media – to launch a campaign, directed at developing analogues of
the Cochrane Collaboration, to provide suitable evidence bases in other areas
besides medicine, with the aim of achieving a quantal shift in the quantitative
maturity of public policy debates.

One such analogue emerged in 2000 after Cochrane founder Iain Chalmers had
teamed up with social scientists in the US and in Britain to create the Campbell
Collaboration, which ‘[d]oes for public policy what Cochrane does for health’ (Davies
and Boruch, 2001: 294). Another one emerged between 1993 and 2001 in Britain,
when efforts of Ann Oakley and other education scholars to set up a database for
evidence synthesis were supported by the Department for Education and Skills and
gradually led to the development of the ‘Evidence for Policy and Practice Information
and Co-ordinating Centre’ (EPPI, Oakley et al, 2005).
Through the enrolment of new actors, the emerging EBP constituency gained
momentum, especially in Britain. Looking for a way to set itself apart from the previous
Conservative government (a structural promise), the Labour Party (1997) adopted
EBP together with the formula ‘what counts is what works’ in 1997 as a marker of
‘modernisation’ and ‘post-ideology’ (a functional promise). This was an accolade for
EBP, and it opened an important window of opportunity for the EBP constituency.
A key goal of Tony Blair’s new government was to produce policies ‘shaped by the
evidence rather than a response to short-term pressures’ (Cabinet Office, 1999: 15),
a commitment that explicitly included all policy domains (but see Boaz et al, 2008:
243). In the name of EBP, New Labour not only funded the establishment of evidence
(synthesis) providers and knowledge brokers – such as EPPI, the National Institute
for Health and Care Excellence (NICE) (Timmins et al, 2016), and the Centre for
Management and Policy Studies (CMPS) (Haddon, 2012) – but it also used the
meta-instrumental frame to introduce RIA as a means to incorporate EBP through
‘an analysis of costs and benefits’ of all new policy proposals (National Audit Office,
2007: 11). New Labour also supported the use of pilot studies, programme evaluation,
and EBP-related funds (Boaz et al, 2008). These initiatives not only spurred the
generation and communication of science-based expertise but also a re-regulation
of the science–policy nexus (levels 1 and 2).
In 2013, the UK Cabinet Office and Treasury set up the What Works Network ‘to
embed robust evidence at the heart of policy-making and service delivery’ through
evidence and capacity building, advice and guideline development (What Works
Network, 2018: 3). The network consists of ten national centres – ‘[l]oosely based on
the model of ’ NICE (9) – which together have secured themselves an occupational
niche through the expansion ‘into new policy areas’ (7) and the attraction of ‘increasing
international attention’ (8). This gives the members of the What Works Network
‘every reason to be optimistic about the future’ (36) – again an illustration of how
structural promises drive constituency formation and integration.
Taken together, the official implementation of EBP by the UK government has
‘led to a growth in the numbers of analysts and in their status … social researchers
have become organised as a formal cadre within UK central government in the same
way as economists and statisticians’ (Boaz et al, 2008: 235).
Outside of government, more and more social scientists joined the EBP constituency
and began developing a more general theory of EBP, resulting in a more explicitly
defined meta-instrumental frame (level 3). Researchers at the newly funded Centre
for Evidence Based Policy and Practice (CEBPP) spearheaded this movement and
claimed a leading role both in ‘taking forward the development of social science
evidence for policy debate’ and in ‘developing appropriate methods and capacities
to meet the needs of the moment’ (Young et al, 2002: 223). The group developed
authority through a series of influential publications (Davies et al, 2000a; Pawson,
2001a; 2001b; Solesbury, 2001; Nutley et al, 2002; Young et al, 2002; Boaz et al,
2002) and the foundation of a new academic journal, Evidence & Policy – ‘the place
to explore [EBP’s] many meanings, how it is operationalised and how it works’.1 To
foster the reflexive organisation of the constituency, the group was also driving the
formation of an Evidence Network (2000–2008), which linked ‘some 900 researchers,
practitioners and policy makers world-wide’2 and contributed to capacity-building,
research and consultancy activities.3
The group’s ambition was to specify EBP as a meta-instrument and to spread it to
other countries, which is also a form of problem chasing (identifying similar problems
in different countries). As CEBPP’s director William Solesbury stated in 2001:

At present, evidence-based policy seems to be principally a British
commitment. The underlying generic issue of how research and policy can
better relate is debated in other countries but the concept of evidence-based
policy and practice has not entered into political discourse in other European
or North American states. However, the recent European Commission
White Paper on governance recognises that: … scientific and other experts
play an increasingly significant role in preparing and monitoring decisions.
(Solesbury, 2001: 6–7)

International spread

With Britain emerging as a forerunner of EBP, the prospects and challenges of EBP
were increasingly discussed in inter- and transnational policy circles. EBP-related
publications by CEBPP and others were featured and discussed in anglophone
journals, many of which had special issues on the topic (Davies et al, 1999; Campbell,
2002; David, 2002; Clarence, 2002; Sherman, 2003). Such events created both social
linkages among researchers across the Atlantic and ideational linkages between the
developments in Britain and the history of RCTs in the US.
The OECD picked up on EBP in the context of a workshop report about the role of
the social sciences for knowledge and decision making (OECD, 2001). CMPS director
Ron Amman (2001: 74) made ‘[t]he case for evidence-based policy’ as a solution to
the problem of ‘silo mentalities’ and the ‘fragmented government machine’ – another
illustration of problem chasing. In contrast, STS scholars Peter Weingart (2001) and
Arie Rip (2001) provided critical perspectives and foreshadowed the debate about
‘policy-based evidence’: Since ‘[f]rom the view of policy makers … evidence is one
weapon in their struggle … certain alliances between policy makers and evidence
providers [can] become more powerful … [and] the benevolence of such alliances
… is not simply a matter of evidence’ (Rip, 2001: 98).
Apparently unimpressed by such critique, the OECD has over the years fully
embraced the meta-instrument and its ‘evidence agenda’ (Burns and Schuller, 2007),
linking it not only to individual policy problems and their associated fields – such
as education (OECD, 2007), security (OECD, 2013), and youth well-being (OECD,
2017a) – but also to regulatory governance as a whole (OECD, 2015; 2017b).
Impacting levels 1 and 2, this has especially made RIA – as framed by EBP – one
of the OECD’s core tools ‘for ensuring the quality of new regulations through an
evidence-based process for decision making’ (OECD, 2015: 96).
In EU governance, EBP surfaced in the context of the functional promises of ‘Better
Regulation’ and the ‘European Research Area’. To our knowledge, the term first
appeared in the Mandelkern report, commissioned by the EU Ministers for Public
Administration to draft an action plan for the Better Regulation agenda. Among other
things, the report, whose authors included two members of the UK Cabinet Office,
advocated the use of RIA as ‘an effective tool for modern, evidence-based policy
making, providing a structured framework for handling policy problems’ (Mandelkern,
2001: ii). Whereas previously the term ‘EBP’ had not been very common in EU circles
(Böhme, 2002), it appeared more often in EU documents from 2005 onwards. EBP
has since been used as a meta-instrument (on levels 1 and 2) to frame and justify not
only the use of RIA but also the establishment of the Framework Programmes for
Research and Technological Development (European Commission, 2013) and the
High Level Group of Scientific Advisors (European Commission, 2015). Today, ‘open
and participative evidence-based policy making has a key role to play in enhancing
the legitimacy of EU action … [and in] build[ing] better regulation into all stages of
the planning and programming cycle’ (European Commission, 2019: 4).
It has been stated that EBP has a long history in the US. In fact, RCTs have been
used in the US since the 1940s and were made mandatory for drug approval in
1962. Similarly, experiments in social welfare and city planning have been conducted
from the 1960s onwards. However, as Gueron and Rolston (2013) report on the
basis of interviews with key stakeholders, widespread government support for social
experimentation and EBP only really began growing in 2002, with the creation of
the Institute of Education Sciences and its What Works Clearinghouse as the new
research arm of the United States Department of Education. Founding director
Grover Whitehurst, a key figure in the US EBP constituency, had been committed to
experimental methodologies throughout his professional career (Gueron and Rolston,
2013: 464) as well as to the ‘what works’ slogan, which had been used by New Labour
since 1997 (Whitehurst, 2012).
Already during the Bush Administration, the Office of Management and Budget
‘play[ed] a major role in stimulating and guiding the federal agency branch of the
evidence-based movement’ (Haskins, 2018: 24). Federal support for EBP grew in
2008 when the Obama administration began to roll out what Haskins and Baron
(2011: 4) call ‘the most extensive evidence-based initiatives in US history’. Among the
instruments implemented to achieve this aim were grant programmes to incentivise
agencies in their use of evidence, the strengthening of agency evaluation capacity (for
example in the White House Social and Behavioral Sciences Team), and the linking
of administrative data across agencies (Haskins, 2018).
Outside of government, several initiatives deserve mention. Since 2001,
the Coalition for Evidence-Based Policy, now subsumed by the Laura and John Arnold
Foundation, has promoted and advocated the use of evidence tools. The Coalition also
promotes the National Network of Education Research–Practice Partnerships, which
aims ‘to develop, support, and connect partnerships between education agencies and
research institutions in order to improve the relationships between research, policy,
and practice’.4 Since 2003, the Abdul Latif Jameel Poverty Action Lab has transformed
into ‘a global research center … [a]nchored by a network of 227 affiliated professors
at universities around the world’.5
To this day, Britain, the EU and the US remain the key adopters of EBP. However,
through lobbying work of the EBP constituency, ‘EBP-ish’ initiatives have also been
launched or discussed in a number of other countries, including Australia, Canada
and New Zealand (Banks, 2009; Nutley et al, 2010; Head, 2010; Lenihan, 2013;
Boaz et al, 2019). Strikingly, most of the countries that have adopted EBP follow the
Westminster model.6 We speculate that there is a connection here, but that it is due
to a common reflexive discourse on how to frame the science–policy nexus (level 3)
rather than due to aspects related to the polity in these countries. If we are right, this
could further explain why meta-instruments building on similar premises as EBP –
such as environmental markets with their mantra of ‘sound science’ and ‘cost–benefit
analysis’ – seem to be spreading predominantly in the same countries (Simons et al,
2014; Voß and Simons, 2014; Mann and Simons, 2014).


Transnational theorising

Over the past 25 years, the EBP constituency – a network of actors and practices
oriented towards developing, maintaining and expanding EBP, both as a theoretical
model and as implemented policy practice – has carved out more and more refined
interpretations of how EBP can and should be understood as a meta-instrument:
what types of evidence and instruments are included in the frame, how they work,
when, for whom, in what circumstance and why. Unsurprisingly, such debates have
painted a mixed picture. To this day, the normative and theoretical positions of EBP
supporters vary (French, 2018; 2019). But especially when contrasted with alternative
modelings of the science–policy nexus – such as those based on STS (Jung et al,
2014; Sedlačko and Staroňová, 2015) – the theoretical model behind EBP can be
reconstructed in terms of core and peripheral elements.
At its core, EBP embodies a distinct normative prescription for – rather than
analytical description of – the role of research in public policymaking (Sedlačko and
Staroňová, 2015). EBP promotes ‘an approach where evidence would take centre
stage in the decision-making process’, based on the assumed ‘desirability of both
improving the evidence base and increasing its influence on policy and practice in
the public services’ (Davies et al, 1999: 3–4). The key promise is that more research
in the policy process ultimately leads to ‘better policy-making’ (Bullock et al, 2001)
and the ‘retreat from ideology’ (Solesbury, 2001: 9), and that this is true for all sorts
of policy domains including healthcare (Brownson et al, 2017), education (Cooper
et al, 2009), and criminology (Sherman and Strang, 2007). This basic position has
been reconfirmed even by reformist proponents. Pawson (2006: 177–8), for example,
assures his readers that ‘[t]he percolation of evidence into policy is protracted
and convoluted’ even though he argues ‘for a different vision, a new paradigm, in
evidence-based policy’. Likewise, Nutley et al (2007: 297) have been ‘remaking the
case for a somewhat privileged role for research-based evidence in public policy and
service delivery’ (Nutley et al, 2007: 297): ‘Our clear expectation here is that such
enhancements of research use (inclusively defined) will, for the most part, be to the
betterment of democratic discourse, public policy making, service organisation and
the quality of public services.’
Also at its core, EBP targets three distinct problem areas (Davies et al, 2000b: 5;
see Sedlačko and Staroňová, 2015: 15) for which distinct categories of instruments
are considered. The first problem concerns the generation and quality of evidence
(first aspect of level 1) and raises ‘questions of methodology’ (Davies et al, 2000b:
4). Instruments specifically framed and advocated here include RCTs, systematic
reviews, guidelines and various forms of evidence ‘hierarchies’ (Hansen and Rieper,
2009). The second problem concerns the dissemination structures by which evidence
is communicated (second aspect of level 1). To address this problem, knowledge
brokering through organisations such as Cochrane, Campbell, EPPI and NICE is seen
as the instrument of choice (Van Kammen et al, 2006; MacKillop et al, 2020). Finally,
EBP also frames the use of instruments that help to incorporate evidence into decision
making (level 2), such as RIA and other forms of ex-ante and ex-post evaluation
(Dunlop et al, 2012; Dunlop and Radaelli, 2019).
At its periphery, EBP has undergone conceptual modifications over the years
(French, 2019). We highlight two main areas here. The first is the question of what
sort of evidence should be used to improve policymaking. Whereas dogmatic versions
of EBP assume the existence of a ‘hierarchy of evidence’ that emphasises quantitative,
experimental, and meta-analytical methodologies (Oakley et al, 2005; Shaneyfelt,
2016; Doleac, 2019), reformist positions argue that, depending on the policy field
and context, different forms of evidence, including qualitative and non-experimental
methodologies, can be useful (Hutton and Smith, 2000; Davies et al, 2000a; Cookson,
2005; Fleming and Rhodes, 2018). Suggestions have been made to replace hierarchies
with ‘typologies’ (Petticrew and Roberts, 2003) or to move ‘beyond hierarchies’
altogether (Parkhurst, 2017).
A second area of modification at the periphery concerns the underlying
understanding of the policy process (level 3). Whereas more dogmatic versions
assume that research use proceeds in a more or less straightforward way (Amman,
2001; Blunkett, 2000; Hodgkinson, 2000; Results First Initiative, 2014), reformists
have pointed to the real-world complexities of the policy process, arguing that a
more ‘realistic’ assumption of research use is captured in the ‘enlightenment model’
suggesting that scientific knowledge enters the policy sphere through indirect
and unguided channels (Nutley and Webb, 2000; Young et al, 2002; Pawson, 2006;
Sanderson, 2006; Cairney, 2016; Nutley et al, 2007; Hawkins and Parkhurst, 2016). In
the same vein, reformists have suggested changing the label from ‘EBP’ to ‘evidence-
enlightened’ or ‘evidence-informed’ policy (Davies et al, 2000a; Young et al, 2002;
Sanderson, 2003; Pawson, 2006; Boaz et al, 2019).
Do reformist versions of EBP represent ‘a compromise between political and
technocratic views of policy-making’, as Howlett (2009a: 156) suggests? To some
extent: yes. However, not all reformists have actually changed the label (for example,
Nutley et al, 2010) – according to Pawson (2006: iix) ‘evidence-informed policy… [is]
a horrible expression, all thin-lipped, prissy and politically correct’ – but even where
relabelled, their approaches remain closely associated with notions of ‘positivism’,
‘rationality’, and ‘instrumentality’ (Sedlačko and Staroňová, 2015; French, 2019; Sayer,
2020). Two decades after the first wave of EBP, the key goal, according to reformists,
still is to ‘identify “what works”’ (Boaz et al, 2019:4).
Taken together, the growth of the EBP constituency (outlined in the previous
sections) and the continuous re-specification of EBP by the constituency (discussed
here) show that the meta-instrument has co-evolved with its constituency. Similar
dynamics have been observed in other policy instruments (Simons and Voß, 2018;
2017). We see this as additional support for the claim that instrument constituencies,
as a unique subsystem focused on solutions, are a major factor for explaining the
development and spread of (meta) policy instruments.

Discussion and conclusion


Puzzled by the question of why EBP thrives despite the fact that the evidence is not in
its favour, we proposed a double reframing. In terms of its ideational content, we argued
that EBP is best understood as a procedural ‘meta-instrument’ embodying a distinct
(rationalist) view of the policy process and thereby affording and framing the use of
procedural instruments such as RCTs and RIA. In terms of its social underpinnings,
we proposed that EBP is supported and driven by its own instrument constituency
(Simons and Voß, 2018; 2017) – a network of actors and practices aligned in their
shared interest to develop and advocate EBP as a meta-instrument.

This double reframing of EBP is meant as a contribution to the literature on the
science–policy nexus, and especially the use of research in policymaking, as well as
to the broader policy instrumentation literature. First, we want to highlight the role
of meta-instruments like EBP in framing and bundling the use of other instruments
on the basis of distinct goals and assumptions about the policy process. EBP, we
demonstrated, aims at the reorganisation of the science–policy nexus to strengthen
the role of research as a key element in all stages of the policy process. Future research
should investigate how other meta-instruments operate on similar or different
normative programmes, and how this creates aligned bundles of policy instruments.
Second, we are pointing to hitherto neglected politics underlying the development
and spread of (meta-)instruments, resulting from their co-evolution with instrument
constituencies. While the EBP constituency grew when more and more actors from
academia, consulting and policy became interested in advancing EBP not only for
the normative goal of delivering ‘better government’ (the functional promise) but
also to establish an occupational niche for themselves (the structural promise), the
constituency gradually emerged as a collective agency developing EBP further and
advocating it to instrument users. Crucially, such advocacy work included ‘problem
chasing’, a phenomenon that was first described by Kingdon (2003 [1984]) and that
has recently been revived within the instrument constituency framework (Voß and
Simons, 2014; Mukherjee and Howlett, 2015; Simons and Voß, 2015; Béland and
Howlett, 2016; Simons and Voß, 2018; 2017). Our analysis here demonstrates that
‘problem chasing’ also applies to meta-instruments.
Ironically, Cairney (2018: 200) mentions ‘problem chasing’ as a strategy that
debunks ‘romantic stories of “evidence-based policymaking”… in a policy cycle with
predictable, linear stages’. But EBP also relates to ‘problem chasing’ in another, more
fundamental way because it is itself promoted as a problem-framing solution by its
constituency. Convincing policymakers and publics of the necessity to increase the
use of research in the policy process has been a key effort of the EBP constituency.
Consequently, when dealing with EBP, we should not only recognise the role of
politics in policymaking – what Cairney (2016) calls the ‘politics of EBP’ – but also
that EBP is itself a political agenda pushed by a collective agency. This is what we
call the neglected politics behind EBP.
The notion of the ‘meta policy instrument’ requires further theoretical reflection.
Its closest conceptual cousins are Radaelli’s notions of ‘meta-regulation’ (Radaelli,
2007) and ‘meta-instrument’ (Radaelli and Meuwese, 2010). However, Radaelli uses
these notions mainly to refer to what others call ‘procedural instruments’ (Howlett
et al, 2018). Only in one formulation does he refer to RIA as ‘an opportunity structure to
handle a whole set of specific instruments’ (Radaelli and Meuwese, 2010: 142, emphasis
added). Note that we treated RIA as one of the instruments framed by EBP, the
latter being our meta-instrument. This touches upon the crucial question of where
an instrument ends and where a meta-instrument begins – a question to which we
have no definite answer yet. To our understanding, the underlying issue of delineating
and nesting instruments and meta-instruments has not been solved and needs much
further reflection (Howlett, 2009b; see Howlett et al, 2018).
To conclude this article, we would like to advance a normative claim about the use
of EBP within policy analysis. Building on our ontological claim that EBP is a meta-
instrument based on a distinct normative (rationalist) programme of re-organising the
science–policy nexus to make greater use of ‘evidence’, we argue that EBP should
not be mistaken for a scientific theory of the actual role of research in policymaking.
We emphasise this because we observe certain tendencies in the EBP literature to
expand the meaning of EBP to include a broader focus on ‘understanding research
use processes’ (Nutley et al, 2007: 297). Presumably, most of us are interested in
understanding the real-world dynamics of the science–policy nexus. But EBP is the
wrong framework to generate such an understanding. To understand the science–
policy nexus and its dynamics, we should instead turn to analytical frameworks
and taxonomies. In this area, STS-inspired approaches are currently very promising
(Lascoumes and Le Gales, 2007; Voß and Simons, 2014; Jung et al, 2014; Straßheim
and Kettunen, 2014; Simons and Voß, 2018; 2017; Voß and Simons, 2018; Voß and
Freeman, 2015; Christensen, 2020).

Notes
1 https://policy.bristoluniversitypress.co.uk/journals/evidence-and-policy/about
2 https://web.archive.org/web/20080703165242/http://evidencenetwork.org/Mission.html
3 Since 2012, the UK-based Alliance for Useful Evidence – another non-governmental,
‘open access network of more than 4,300 individuals from across government, universities,
charities, businesses, and local authorities in the UK and internationally’
(https://www.alliance4usefulevidence.org/about-us/aboutus) – serves a similar aim. The alliance is
hosted by the UK’s innovation charity Nesta and specialises in promoting EBP-related
research, training, advice and advocacy.
4 http://nnerpp.rice.edu/about
5 https://www.povertyactionlab.org/about-j-pal
6 We are grateful to one of our anonymous reviewers for having pointed this out.

Funding
This work was supported by the German Ministry of Education and Research (BMBF)
under Grant FKZ: 01PU17017.

Acknowledgements
We thank Holger Straßheim for several rounds of inspiring discussions and his feedback
on earlier versions of our manuscript, Clemens Blümel for supporting our work in the
context of our joint project on the ‘Function, Reception, and Performativity of Review
Literature in Science’ (FuReWiRev) as well as three anonymous reviewers for their
critical reading and for suggesting substantial improvements.

Conflict of interest
The authors declare that there is no conflict of interest.

References
Amman, R. (2001) Evidence-based policy: taking the vision forward, in Social Sciences
for Knowledge and Decision Making, Paris: OECD, pp 73–7.
Banks, G. (2009) Evidence-based policy making: What is it? How do we get it? Canberra:
ANU Public Lecture Series. 
Béland, D. and Howlett, M. (2016) How solutions chase problems: instrument
constituencies in the policy process, Governance, 29(3): 393–409.
Blunkett, D. (2000) Influence or irrelevance: can social science improve government,
Research Intelligence, 71(6): 12–21.
Boaz, A., Davies, H., Fraser, A. and Nutley, S. (eds) (2019) What Works Now? Evidence-
informed Policy and Practice, Bristol: Policy Press.
Boaz, A., Ashby, D. and Young, K. (2002) Systematic reviews: what have they got to
offer evidence based policy and practice?, Centre for Evidence Based Policy and Practice
(CEBPP) Working Paper 2, London.
Boaz, A., Grayson, L., Levitt, R. and Solesbury, W. (2008) Does Evidence-based policy
work? Learning from the UK experience, Evidence & Policy, 4(2): 233–53.
Böhme, K. (2002) Much ado about evidence: reflections from policy making in the
European Union, Planning Theory & Practice, 3(1): 98–101.
Brownson, R., Baker, E., Deshpande, A. and Gillespie, K. (2017) Evidence-based Public
Health, Oxford: Oxford University Press.
Bullock, H., Mountford, J. and Stanley, R. (2001) Better Policy-making, London:
Cabinet Office.
Burns, T. and Schuller, T. (2007) The evidence agenda, in Evidence in Education: Linking
Research and Policy, Paris: OECD, pp 15–32.
Cabinet Office (1999) Modernising Government, London: Cabinet Office.
Cairney, P. (2016) The Politics of Evidence-based Policy Making, London: Palgrave.
Cairney, P. (2018) Three habits of successful policy entrepreneurs, Policy & Politics,
46(2): 199–215.
Campbell, H. (2002) Editorial, Planning Theory & Practice, 3(1): 7–10.
Chalmers, I. (1993) The Cochrane Collaboration: preparing, maintaining, and
disseminating systematic reviews of the effects of health care, Annals of the New
York Academy of Sciences, 703(1): 156–65.
Christensen, J. (2020) Expert knowledge and policymaking: a multi-disciplinary research
agenda, Policy & Politics. https://doi.org/10.1332/030557320X15898190680037 
Clarence, E. (2002) Technocracy reinvented: the new evidence based policy movement,
Public Policy and Administration, 17(3): 1–11.
Cooper, A., Levin, B. and Campbell, C. (2009) The growing (but still limited)
importance of evidence in education policy and practice, Journal of Educational
Change, 10(2–3): 159–71.
Cookson, R. (2005) Evidence-based policy making in health care: what it is and what
it isn’t, Journal of Health Services Research & Policy, 10(2): 118–121.
David, M. (2002) Introduction: themed section on Evidence-based policy as a concept
for modernising governance and social science research, Social Policy and Society,
1(3): 213–14.
Davies, H., Nutley, S. and Smith, P. (1999) Viewpoint: editorial: what works? The
role of evidence in public sector policy and practice, Public Money and Management,
19(1): 3–5.
Davies, H., Nutley, S. and Smith, P. (2000b) Introducing Evidence-based policy and
practice in public services, in H. Davies, S. Nutley and P. Smith (eds) What Works?
Evidence-based Policy and Practice in Public Services, Bristol: Policy Press, pp 1–11.
Davies, H., Nutley, S. and Smith, P. (eds) (2000a) What Works? Evidence-based Policy
and Practice in Public Services, Bristol: Policy Press.
Davies, P. and Boruch, R. (2001) The Campbell Collaboration: does for public policy
what Cochrane does for health, BMJ, 323(7308): 294–95.
Doleac, J.L. (2019) ‘Evidence-based policy’ should reflect a hierarchy of evidence,
Journal of Policy Analysis and Management, 38(2): 517–19.
Dunlop, C., Maggetti, M., Radaelli, C. and Russel, D. (2012) The many uses of
regulatory impact assessment: a meta-analysis of EU and UK cases, Regulation &
Governance, 6(1): 23–45.
Dunlop, C.A. and Radaelli, C.M. (2019) Policy instruments, policy learning and
politics: impact assessment in the European Union, in G. Capano, M. Howlett,
M. Ramesh, and A. Virani (eds), Making Policies Work, Cheltenham: Edward Elgar
Publishing.
European Commission (2013) REGULATION (EU) No 1291/2013 OJ L 347,
Brussels: European Commission.
European Commission (2015) COMMISSION DECISION of 16.10.2015 C(2015)
6946 Final, Brussels: European Commission.
European Commission (2019) Better Regulation: Taking Stock and Sustaining Our
Commitment, Brussels: European Commission.
Fleming, J. and Rhodes, R. (2018) Can experience be evidence? Craft knowledge
and evidence-based policing, Policy & Politics, 46(1): 3–26.
French, R. (2019) Is it time to give up on evidence-based policy? Four answers, Policy
& Politics, 47(1): 151–68.
French, R. (2018) Lessons from the evidence on evidence-based policy, Canadian
Public Administration, 61(3): 425–42.
Greenhalgh, T. and Russell, J. (2009) Evidence-based policymaking: a critique,
Perspectives in Biology and Medicine, 52(2): 304–18.
Gueron, J. and Rolston, H. (2013) Fighting for Reliable Evidence, New York: Russell
Sage Foundation.
Guyatt, G., Cairns, J., Churchill, D., Cook, D., Haynes, B., Hirsh, J., Irvine, J., Levine,
M., Levine, M. and Nishikawa, J. (1992) Evidence-based medicine: a new approach
to teaching the practice of medicine, JAMA, 268(17): 2420–25.
Haddon, C. (2012) Reforming the Civil Service: The Centre for Management and Policy
Studies, 1999–2005, London: Institute for Government. 
Ham, C., Hunter, D. and Robinson, R. (1995) Evidence based policymaking, BMJ:
British Medical Journal, 310(6972): 71.
Hansen, H. and Rieper, O. (2009) The evidence movement: the development and
consequences of methodologies in review practices, Evaluation, 15(2): 141–63.
Haskins, R. (2018) Evidence-based policy: the movement, the goals, the issues, the
promise, The ANNALS of the American Academy of Political and Social Science, 678(1):
8–37.
Haskins, R. and Baron, J. (2011) Building the Connection between Policy and Evidence:
The Obama Evidence-based Initiative, London: NESTA.
Hawkins, B. and Parkhurst, J. (2016) The ‘good governance’ of evidence in health
policy, Evidence & Policy, 12(4): 575–92.
Head, B. (2010) Reconsidering Evidence-based policy: key issues and challenges,
Policy and Society, 29(2): 77–94.
Hodgkinson, P. (2000) Who wants to be a social engineer? A commentary on David
Blunkett’s speech to the ESRC, Sociological Research Online, 5(1): 74–84.
Howlett, M. (2009a) Policy analytical capacity and Evidence-based Policy-making:
lessons from Canada, Canadian Public Administration, 52(2): 153–75.
Howlett, M. (2009b) Governance modes, policy regimes and operational plans: a
multi-level nested model of policy instrument choice and policy design, Policy
Sciences, 42(1): 73–89.
Howlett, M., Mukherjee, I. and Woo, J. (2018) Thirty years of research on policy
instruments, in H.K. Colebatch and R. Hoppe (eds), Handbook on Policy, Process and
Governing, Cheltenham: Edward Elgar Publishing, pp 147. 
Hutton, J. and Smith, P. (2000) Non-experimental quantitative methods, in H. Davies,
S. Nutley and P. Smith (eds) What Works? Evidence-based Policy and Practice in Public
Services, Bristol: Policy Press, pp 277–90.
Innvaer, S., Vist, G., Trommald, M. and Oxman, A. (2002) Health Policy-makers’
perceptions of their use of evidence: a systematic review, Journal of Health Services
Research & Policy, 7(4): 239–44.
Jung, A., Korinek, R.L. and Straßheim, H. (2014) Embedded expertise: a conceptual
framework for reconstructing knowledge orders, their transformation and local
specificities, Innovation:The European Journal of Social Science Research, 27(4): 398–419.
Kingdon, J. (2003[1984]) Agendas, Alternatives and Public Policies, New York: Longman.
Labour Party (1997) Labour Party Manifesto, London: Labour Party.
Lascoumes, P. and Le Gales, P. (2007) Introduction: understanding public policy
through its instruments-from the nature of instruments to the sociology of public
policy instrumentation, Governance: An International Journal of Policy, Administration,
and Institutions, 20(1): 1–21. 
Lenihan, A. (2013) Lessons from abroad: international approaches to promoting
evidence-based social policy, London: Alliance for Useful Evidence.
MacKillop, E., Quarmby, S. and Downe, J. (2020) Does knowledge brokering facilitate
evidence-based policy? A review of existing knowledge and an agenda for future
research, Policy & Politics, 48(2): 335–53.
Mandelkern, M. (2001) Mandelkern Group on Better Regulation, Final Report.
Mann, C. and Simons, A. (2014) Local emergence and international developments
of conservation trading systems: innovation dynamics and related problems,
Environmental Conservation, 42(4): 325–34. 
Mukherjee, I. and Howlett, M.P. (2015) Who is a stream? Epistemic communities,
instrument constituencies and advocacy coalitions in multiple streams subsystems,
Politics and Governance, 3(2): 65–75.
National Audit Office (2007) Evaluation of Regulatory Impact Assessments 2006–07,
London: National Audit Office.
Newman, J. (2017) Deconstructing the debate over Evidence-based policy, Critical
Policy Studies, 11(2): 211–26.
Nutley, S. and Webb, J. (2000) Evidence and the policy process, in H. Davies, S. Nutley
and P. Smith (eds) What Works? Evidence-based Policy and Practice in Public Services,
Bristol: Policy Press, pp 13–41.
Nutley, S., Davies, H. and Walter, I. (2002) Evidence based policy and practice: cross
sector lessons from the UK, Centre for Evidence Based Policy and Practice (CEBPP)
Working Paper 9, London. 
Nutley, S., Morton, S., Jung, T. and Boaz, A. (2010) Evidence and policy in six
European countries: diverse approaches and common challenges, Evidence & Policy,
6(2): 131–44.
Nutley, S., Walter, I. and Davies, H. (2007) Using Evidence: How Research Can Inform
Public Services, Bristol: The Policy Press.
Oakley, A., Gough, D., Oliver, S. and Thomas, J. (2005) The politics of evidence and
methodology: lessons from the EPPI-Centre, Evidence & Policy, 1(1): 5–32.
OECD (2001) Social Sciences for Knowledge and Decision Making, Paris: OECD.
OECD (2007) Evidence in Education. Linking Research and Policy, Paris: OECD.
OECD (2013) Strengthening Evidence-based Policy Making on Security and Justice in
Mexico, Paris: OECD.
OECD (2015) OECD Regulatory Policy Outlook 2015, Paris: OECD.
OECD (2017a) Evidence-based Policy Making forYouth Well-being:A Toolkit, Paris: OECD.
OECD (2017b) Improving Regulatory Governance, Paris: OECD, pp 171–224.
Oliver, K., Lorenc, T. and Innvaer, S. (2014) New directions in Evidence-based policy
research: a critical analysis of the literature, Health Research Policy and Systems, 12(1): 34.
Parkhurst, J. (2017) The Politics of Evidence: From Evidence-based Policy to the Good
Governance of Evidence, Abingdon: Routledge Studies in Governance and Public
Policy, Routledge.
Pawson, R. (2001a) Evidence based policy: I. In search of a method, Centre for
Evidence Based Policy and Practice (CEBPP) Working Paper 3, London. 
Pawson, R. (2001b) Evidence based policy: II. The promise of ‘Realist Synthesis’,
Centre for Evidence Based Policy and Practice (CEBPP) Working Paper 4, London. 
Pawson, R. (2006) Evidence-based Policy: A Realist Perspective, London: SAGE.
Petticrew, M. and Roberts, H. (2003) Evidence, hierarchies, and typologies: horses for
courses, Journal of Epidemiology & Community Health, 57(7): 527–29.
Porter, S. and O’Halloran, P. (2009) The postmodernist war on Evidence-based practice,
International Journal of Nursing Studies, 46(5): 740–48.
Radaelli, C. (2007) Whither better regulation for the Lisbon agenda?, Journal of
European Public Policy, 14(2): 190–207.
Radaelli, C.M. and Meuwese, A.C.M. (2010) Hard questions, hard solutions:
proceduralisation through impact assessment in the EU, West European Politics,
33(1): 136–53.
Results First Initiative (2014) Evidence-based Policymaking: A Guide for Effective
Government.
Rip, A. (2001) In praise of speculation, in Social Sciences for Knowledge and Decision
Making, Paris: OECD, pp 95–103.
Sackett, D., Rosenberg, W., Gray, M., Haynes, B. and Richardson, S. (1996) Evidence
based medicine: what it is and what it isn’t, British Medical Journal, 312: 71–72.
Sanderson, I. (2003) Is it ‘what works’ that matters? Evaluation and Evidence-based
Policy-making, Research Papers in Education, 18(4): 331–45.
Sanderson, I. (2006) Complexity, ‘practical rationality’ and Evidence-based policy
making, Policy & Politics, 34(1): 115–32.
Sayer, P. (2020) A new epistemology of Evidence-based policy, Policy & Politics, 48(2):
241–58.
Sedlačko, M. and Staroňová, K. (2015) An overview of discourses on knowledge in
policy: thinking knowledge, policy and conflict together, Central European Journal
of Public Policy, 9(2): 10–31.
Shaneyfelt, T. (2016) Pyramids are guides not rules: the evolution of the evidence
pyramid, BMJ Evidence-Based Medicine, 21(4): 121–22.
Sherman, L. (2003) Misleading evidence and evidence-led policy: making social
science more experimental, The Annals of the American Academy of Political and Social
Science, 589(1): 6–19.
Sherman, L. and Strang, H. (2007) Restorative Justice: The Evidence, London: Smith
Institute. 
Simons, A. and Voß, J.P. (2015) Politics by other means: the making of the emissions
trading instrument as a ‘pre-history’ of carbon trading, in B. Stephan and R. Lane
(eds) The Politics of Carbon Markets, London & New York: Routlege, pp 51–68.
Simons, A. and Voß, J.P. (2017) Policy instrument constituencies, in M. Howlett and
I. Mukherjee (eds) Handbook of Policy Formulation, Cheltenham: Edward Elgar, pp
355–72.
Simons, A. and Voß, J.P. (2018) The concept of instrument constituencies: accounting
for dynamics and practices of knowing governance, Policy and Society, 37(1): 14–35.
Simons, A., Lis, A. and Lippert, I. (2014) The political duality of scale-making in
environmental markets, Environmental Politics, 23(4): 632–49.
Smith, A. (1996) Mad cows and ecstasy: chance and choice in an evidence-based
society, Journal of the Royal Statistical Society, Series A, 159(3): 367–84.
Solesbury, W. (2001) Evidence based policy: whence it came and where it’s going,
Centre for Evidence Based Policy and Practice (CEBPP) Working Paper 1, London. 
Straßheim, H. (2020) Who are behavioural public policy experts and how are they
organised globally?, Policy & Politics, 49(1): 69–86. 
Straßheim, H. and Kettunen, P. (2014) When does Evidence-based policy turn into
policy-based evidence? Configurations, contexts and mechanisms, Evidence & Policy,
10(2): 259–77.
Sutcliffe, S. and Court, J. (2005) Evidence-based policymaking: What is it? How does it
work? What relevance for developing countries? London: Overseas Development Institute. 
Timmins, N., Rawlins, M. and Appleby, J. (2016) A Terrible Beauty: A Short History
of NICE, the National Institute for Health and Care Excellence, Nonthaburi: HITAP.
Van Kammen, J., de Savigny, D. and Sewankambo, N. (2006) Using knowledge
brokering to promote Evidence-based Policy-making: the need for support
structures, Bulletin of the World Health Organization, 84(8): 608–612. 
Voß, J.P. and Freeman, R. (2015) Knowing Governance, Basingstoke: Palgrave Macmillan.
Voß, J.P. and Simons, A. (2014) Instrument constituencies and the supply side of policy
innovation: the social life of emissions trading, Environmental Politics, 23(5): 735–54.
Voß, J.P. and Simons, A. (2018) A novel understanding of experimentation in
governance: co-producing innovations between ‘lab’ and ‘field’, Policy Sciences,
51(2): 213–29.
Weible, C. (2018) Instrument constituencies and the advocacy coalition framework:
an essay on the comparisons, opportunities, and intersections, Policy and Society,
37(1): 59–73.
Weingart, P. (2001) Paradoxes of scientific advice to politics, in Social Sciences for
Knowledge and Decision Making, Paris: OECD, pp 78–94.
What Works Network (2018) The What Works Network. Five Years On, London: What
Works Network. 
Whitehurst, G. (2012) The value of experiments in education, Education Finance and
Policy, 7(2): 107–23.
Young, K., Ashby, D., Boaz, A. and Grayson, L. (2002) Social science and the evidence-
based policy movement, Social Policy and Society, 1(3): 215.