The neglected politics behind evidence-based
policy: shedding light on instrument constituency
dynamics
Arno Simons, arno.simons@posteo.de
German Centre for Higher Education Research and Science Studies (DZHW),
Germany
Humboldt-Universität zu Berlin, Germany
Alexander Schniedermann, schniedermann@dzhw.eu
German Centre for Higher Education Research and Science Studies (DZHW),
Germany
Puzzled by the question of why evidence-based policy (EBP) thrives despite evidence against it, we
reconstruct the development and spread of EBP in inter- and transnational contexts and find that
this process is characterised by some of the same dynamics (including ‘structural promises’ and
‘problem chasing’) that have also been observed in many policy instruments. We therefore propose
a double reframing: EBP is (1) a ‘meta-instrument’ aiming to establish a particular role for research
in policymaking (our ideational reframing) and (2) co-evolving with an ‘instrument constituency’
motivated not only by normative goals but also by the prospect of securing an occupational niche
for itself (our social reframing). Taken together, these reframings reveal the neglected politics
behind EBP and prompt us to treat EBP as a political device rather than as an analytical framework
to explain how policymaking actually relates to research.
To cite this article: Simons, A. and Schniedermann, A. (2021) The neglected politics behind
evidence-based policy: shedding light on instrument constituency dynamics, Policy & Politics,
vol XX, no XX, 1–17, DOI: 10.1332/030557321X16225469993170
Introduction
For over three decades, scholars have fiercely debated the prospects and challenges of
‘evidence-based policy’ (EBP). A series of recent articles in Policy & Politics on evidence
in policymaking and the role of experts (Fleming and Rhodes, 2018; French, 2019;
Christensen, 2020; Sayer, 2020; Straßheim, 2020) underscores the current interest in
the topic. While EBP seems to be gaining momentum, it has become apparent that attempts
to put EBP into practice have not yielded the promised results.
The evidence on EBP is not at all in its favour. According to a recent systematic
review of nearly 400 EBP publications, the bulk of empirical studies comes to the
conclusion that ‘the successful use of research in policy-making is no straight-forward
matter’ (French, 2018: 435), because ‘[s]cience and politics are intimately entwined
in policy-making’ (431), ‘[a]daptability, insight into others, stamina, persuasiveness,
and self-mastery matter far more than analytical capacity’ (430), and ‘[t]he proportion
of organised knowledge relative to other forms of information in the best policy-making
processes is more modest than proponents of EBP imagine’ (433). This shows
‘how shallow the central dogma of the EBP movement – that major improvements
in policy-making would result if only governments took research seriously – really
is’ (436). Earlier reviews have arrived at similar conclusions. Innvaer et al (2002: 239)
conclude that ‘there is, at best, only limited support for any of the many opinions
put forward in the literature on the use of research evidence by policy-makers’ (see
Oliver et al, 2014).
Is it time, then, to give up on EBP, as French (2019) asks? The critics say yes. While
not all of them reject EBP’s core dogma that more evidence is generally good for
policymaking, many have contested EBP’s definition of evidence and the claim that
evidence should play a major role in the policy process, arguing instead that EBP may
afford ‘policy-based evidence’, characterised by ‘modes of black-boxing, knowledge
monopolisation, blame avoidance and over-simplification’ (see Greenhalgh and
Russell, 2009; Straßheim and Kettunen, 2014: 260). In sharp contrast, the majority of
proponents would defend the claim that policymaking should be based on evidence
to the benefit of society as a whole. However, some proponents acknowledge ‘the
irony… that the evidence base confirming any benefits (or indeed, dysfunctions) of
an evidence-based approach … is actually rather thin’ (Nutley et al, 2006: 2), and
they hope that EBP can be ‘reformed’ or ‘reinvented’ (French, 2019). Overall, it seems
that proponents and critics take ‘fundamentally different philosophical perspectives’
(Boaz et al, 2008: 239) and ‘rarely engage with each other’s arguments’ (Newman,
2017: 1). Some proponents have sensed ‘open warfare’ (Porter and O’Halloran, 2009:
742). Others pride themselves that ‘despite the emergence of a significant critical
literature … the “evidence-based practice” juggernaut rumbles on, largely undeterred’
(Nutley et al, 2007: 17).
What creates and maintains the momentum of EBP? Why does EBP thrive when
its flaws have been pointed out again and again? The key to solving this puzzle, we
contend, lies in a double reframing of EBP as (1) a (procedural) meta-instrument that
co-evolves with and is driven by (2) an instrument constituency. Our ontological
claim is the following: While EBP is not a policy instrument in the traditional sense –
rather a ‘meta-instrument’ that frames the use of other instruments on the basis of
a normative programme – its development and spread are powered by some of the
same fundamental dynamics that underlie the development and spread of many
policy instruments (Simons and Voß, 2017; 2018). It is these dynamics that we call
the neglected politics behind EBP.
Our normative claim is that since EBP is a meta-instrument it should not be
misunderstood as a scientific theory of real-world policymaking. EBP embodies a
distinct vision of how policymaking should be done, namely that it should be ‘based on
evidence’. However, we cannot use this framework to understand how policymaking
actually relates to research – and how this relationship is mediated by meta-instruments
and their constituencies. To reach such an understanding, we should instead turn to
analytical frameworks and taxonomies, preferably those that build on insights from
the science and technology studies (STS) literature.
In the following section, we introduce the notions of meta policy instrument and
instrument constituency as two revised framings of EBP. Then, we devote three sections
to the reconstruction of the development and spread of EBP, from its UK origins through
its international spread to its transnational theorisation. Finally, we discuss a number
of wider implications of our theoretical and empirical contributions.
UK origins
The term ‘EBP’ was coined around the mid-1990s by proponents of ‘evidence-based
medicine’ (EBM), a network of North American and British epidemiologists who
advocated ‘a new paradigm for medical practice’ (Guyatt et al, 1992) emphasising
the production and use of research synthesis as a means to extract ‘the evidence’
from all available research findings on a given topic (Ham et al, 1995; Sackett et al,
1996). EBM is itself a ‘meta-instrument’ – albeit for practice – since it frames the
use of systematic reviews, meta-analyses and other ‘evidence tools’ to reorganise the
generation and communication of science-based expertise (level 1 in Jung et al’s
2014 model), shifting the emphasis from research ‘findings’ to ‘evidence’. By virtue
of being linked to functional and structural promises, EBM also gathered its own
constituency, a new ‘evidence movement’ (Hansen and Rieper, 2009), whose first big
success was the foundation of the Cochrane Collaboration in the UK ‘to prepare,
maintain, and disseminate systematic, up-to-date reviews of RCTs of health care’
(Chalmers, 1993: 158). Since then, Cochrane has become an international flagship
of the EBM and later the EBP constituency, and it has served as a role model for the
institutionalisation of evidence synthesis in health care and beyond.
Chasing after problems to which the new meta-instrument could be applied, the EBM
constituency expanded during the second half of the 1990s into other disciplines as well
as into the policy domain. Actors from non-medical backgrounds became enrolled
by the functional and structural promises linked to the new meta-instrument: its
potential both for policy change in the actors’ desired direction (functional promises)
and for securing new opportunities for themselves (structural promises). A prominent
example of the latter is the presidential address of Adrian Smith to the Royal Statistical
Society in 1996. Looking for a way to strengthen the society’s ‘profile and image’,
Smith (1996: 369–70) argued:
One such analogue emerged in 2000 after Cochrane founder Iain Chalmers had
teamed up with social scientists in the US and in Britain to create the Campbell
Collaboration, which ‘[d]oes for public policy what Cochrane does for health’ (Davies
and Boruch, 2001: 294). Another one emerged between 1993 and 2001 in Britain,
when efforts of Ann Oakley and other education scholars to set up a database for
evidence synthesis were supported by the Department for Education and Skills and
gradually led to the development of the ‘Evidence for Policy and Practice Information
and Co-ordinating Centre’ (EPPI, Oakley et al, 2005).
Through the enrolment of new actors, the emerging EBP constituency gained
momentum, especially in Britain. Looking for a way to set itself apart from the previous
conservative government (a structural promise), the Labour Party (1997) adopted
EBP together with the formula ‘what counts is what works’ in 1997 as a marker of
‘modernisation’ and ‘post ideology’ (a functional promise). This meant an accolade for
EBP, and it opened an important window of opportunity for the EBP constituency.
A key goal of Tony Blair’s new government was to produce policies ‘shaped by the
evidence rather than a response to short-term pressures’ (Cabinet Office, 1999: 15),
a commitment that explicitly included all policy domains (but see Boaz et al, 2008:
243). In the name of EBP, New Labour not only funded the establishment of evidence
(synthesis) providers and knowledge brokers – such as EPPI, the National Institute
for Health and Care Excellence (NICE) (Timmins et al, 2016), and the Centre for
Management and Policy Studies (CMPS) (Haddon, 2012) – but it also used the
meta-instrumental frame to introduce regulatory impact assessment (RIA) as a means
to incorporate EBP through ‘an analysis of costs and benefits’ of all new policy
proposals (National Audit Office,
2007: 11). New Labour also supported the use of pilot studies, programme evaluation,
and EBP-related funds (Boaz et al, 2008). These initiatives not only spurred the
generation and communication of science-based expertise but also a re-regulation
of the science–policy nexus (levels 1 and 2).
In 2013, the UK Cabinet Office and Treasury set up the What Works Network ‘to
embed robust evidence at the heart of policy-making and service delivery’ through
evidence and capacity building, advice and guideline development (What Works
Network, 2018: 3). The network consists of ten national centres – ‘[l]oosely based on
the model of’ NICE (9) – which together have secured themselves an occupational
niche through the expansion ‘into new policy areas’ (7) and the attraction of ‘increasing
international attention’ (8). This gives the members of the What Works Network
‘every reason to be optimistic about the future’ (36) – again an illustration of how
structural promises drive constituency formation and integration.
Overall, the official implementation of EBP by the UK government has
‘led to a growth in the numbers of analysts and in their status … social researchers
have become organised as a formal cadre within UK central government in the same
way as economists and statisticians’ (Boaz et al, 2008: 235).
Outside of government, more and more social scientists joined the EBP constituency
and began developing a more general theory of EBP, resulting in a more explicitly
defined meta-instrumental frame (level 3). Researchers at the newly funded Centre
for Evidence Based Policy and Practice (CEBPP) spearheaded this movement and
claimed a leading role both in ‘taking forward the development of social science
evidence for policy debate’ and in ‘developing appropriate methods and capacities
to meet the needs of the moment’ (Young et al, 2002: 223). The group developed
authority through a series of influential publications (Davies et al, 2000a; Pawson,
2001a; 2001b; Solesbury, 2001; Nutley et al, 2002; Young et al, 2002; Boaz et al,
2002) and the foundation of a new academic journal, Evidence & Policy – ‘the place
to explore [EBP’s] many meanings, how it is operationalised and how it works’.1 To
foster the reflexive organisation of the constituency, the group was also driving the
formation of an Evidence Network (2000–2008), which linked ‘some 900 researchers,
practitioners and policy makers world-wide’2 and contributed to capacity-building,
research and consultancy activities.3
The group’s ambition was to specify EBP as a meta-instrument and to spread it to
other countries, which is also a form of problem chasing (identifying similar problems
in different countries). As CEBPP’s director William Solesbury stated in 2001:
International spread
With Britain emerging as a forerunner of EBP, the prospects and challenges of EBP
were increasingly discussed in inter- and transnational policy circles. EBP-related
publications by CEBPP and others were featured and discussed in anglophone
journals, many of which had special issues on the topic (Davies et al, 1999; Campbell,
2002; David, 2002; Clarence, 2002; Sherman, 2003). Such events created both social
linkages among researchers across the Atlantic and ideational linkages between the
developments in Britain and the history of RCTs in the US.
The OECD picked up on EBP in the context of a workshop report about the role of
the social sciences for knowledge and decision making (OECD, 2001). CMPS director
Ron Amman (2001: 74) made ‘[t]he case for evidence-based policy’ as a solution to
the problem of ‘silo mentalities’ and the ‘fragmented government machine’ – another
illustration of problem chasing. In contrast, STS scholars Peter Weingart (2001) and
Arie Rip (2001) provided critical perspectives and foreshadowed the debate about
‘policy-based evidence’: since ‘[f]rom the view of policy makers … evidence is one
weapon in their struggle … certain alliances between policy makers and evidence
providers [can] become more powerful … [and] the benevolence of such alliances
… is not simply a matter of evidence’ (Rip, 2001: 98).
Apparently unimpressed by such critique, the OECD has over the years fully
embraced the meta-instrument and its ‘evidence agenda’ (Burns and Schuller, 2007),
linking it not only to individual policy problems and their associated fields – such
as education (OECD, 2007), security (OECD, 2013), and youth well-being (OECD,
2017a) – but also to regulatory governance as a whole (OECD, 2015; 2017b).
Impacting levels 1 and 2, this has especially made RIA – as framed by EBP – one
of the OECD’s core tools ‘for ensuring the quality of new regulations through an
evidence-based process for decision making’ (OECD, 2015: 96).
In EU governance, EBP surfaced in the context of the functional promises of ‘Better
Regulation’ and the ‘European Research Area’. To our knowledge, the term first
appeared in the Mandelkern report, commissioned by the EU Ministers for Public
Administration to draft an action plan for the Better Regulation agenda. Among other
things, the report, whose authors included two members of the UK Cabinet Office,
advocated the use of RIA as ‘an effective tool for modern, evidence-based policy
making, providing a structured framework for handling policy problems’ (Mandelkern,
2001: ii). Whereas previously the term ‘EBP’ had not been very common in EU circles
(Böhme, 2002), it appeared more often in EU documents from 2005 onwards. EBP
has since been used as a meta-instrument (on levels 1 and 2) to frame and justify not
only the use of RIA but also the establishment of the Framework Programmes for
Research and Technological Development (European Commission, 2013) and the
High Level Group of Scientific Advisors (European Commission, 2015). Today, ‘open
and participative evidence-based policy making has a key role to play in enhancing
the legitimacy of EU action … [and in] build[ing] better regulation into all stages of
the planning and programming cycle’ (European Commission, 2019: 4).
EBP is sometimes said to have a long history in the US. Indeed, RCTs have been
used in the US since the 1940s and were made mandatory for drug approval in
1962. Similarly, experiments in social welfare and city planning have been conducted
from the 1960s onwards. However, as Gueron and Rolsten (2013) report on the
basis of interviews with key stakeholders, widespread government support for social
experimentation and EBP only really began growing in 2002, with the creation of
the Institute of Education Sciences and its What Works Clearinghouse as the new
research arm of the United States Department of Education. Founding director
Grover Whitehurst, a key figure in the US EBP constituency, had been committed to
experimental methodologies throughout his professional career (Gueron and Rolsten,
2013: 464), as well as to the ‘what works’ slogan, which had been used by New Labour
since 1997 (Whitehurst, 2012).
Already during the Bush Administration, the Office of Management and Budget
‘play[ed] a major role in stimulating and guiding the federal agency branch of the
evidence-based movement’ (Haskins, 2018: 24). Federal support for EBP grew in
2008 when the Obama administration began to roll out what Haskins and Baron
(2011: 4) call ‘the most extensive evidence-based initiatives in US history’. Among the
instruments implemented to achieve this aim were grant programmes to incentivise
agencies in their use of evidence, the strengthening of agency evaluation capacity (for
example in the White House Social and Behavioral Sciences Team), and the linking
of administrative data across agencies (Haskins, 2018).
Outside of government, several initiatives are worth mentioning. Since 2001,
the Coalition for Evidence-Based Policy, now subsumed by the Laura and John Arnold
Foundation, has promoted and advocated the use of evidence tools. The Coalition also
promotes the National Network of Education Research–Practice Partnerships, which
aims ‘to develop, support, and connect partnerships between education agencies and
research institutions in order to improve the relationships between research, policy,
and practice’.4 Founded in 2003, the Abdul Latif Jameel Poverty Action Lab has grown
into ‘a global research center … [a]nchored by a network of 227 affiliated professors
at universities around the world’.5
To this day, Britain, the EU and the US remain the key adopters of EBP. However,
through the lobbying work of the EBP constituency, ‘EBP-ish’ initiatives have also been
launched or discussed in a number of other countries, including Australia, Canada
and New Zealand (Banks, 2009; Nutley et al, 2010; Head, 2010; Lenihan, 2013;
Boaz et al, 2019). Strikingly, most of the countries that have adopted EBP follow the
Westminster model.6 We speculate that there is a connection here, but that it is due
to a common reflexive discourse on how to frame the science–policy nexus (level 3)
rather than due to aspects related to the polity in these countries. If we are right, this
could further explain why meta-instruments building on similar premises as EBP –
such as environmental markets with their mantra of ‘sound science’ and ‘cost–benefit
analysis’ – seem to be spreading predominantly in the same countries (Simons et al,
2014; Voß and Simons, 2014; Mann and Simons, 2014).
Transnational theorising
Over the past 25 years, the EBP constituency – a network of actors and practices
oriented towards developing, maintaining and expanding EBP, both as a theoretical
model and as implemented policy practice – has carved out more and more refined
interpretations of how EBP can and should be understood as a meta-instrument:
what types of evidence and instruments are included in the frame, how they work,
when, for whom, in what circumstance and why. Unsurprisingly, such debates have
painted a mixed picture. To this day, the normative and theoretical positions of EBP
supporters vary (French, 2018; 2019). But especially when contrasted with alternative
modellings of the science–policy nexus – such as those based on STS (Jung et al,
2014; Sedlačko and Staroňová, 2015) – the theoretical model behind EBP can be
reconstructed in terms of core and peripheral elements.
At its core, EBP embodies a distinct normative prescription for – rather than
analytical description of – the role of research in public policymaking (Sedlačko and
Staroňová, 2015). EBP promotes ‘an approach where evidence would take centre
stage in the decision-making process’, based on the assumed ‘desirability of both
improving the evidence base and increasing its influence on policy and practice in
the public services’ (Davies et al, 1999: 3–4). The key promise is that more research
in the policy process ultimately leads to ‘better policy-making’ (Bullock et al, 2001)
and the ‘retreat from ideology’ (Solesbury, 2001: 9), and that this is true for all sorts
of policy domains including healthcare (Brownson et al, 2017), education (Cooper
et al, 2009), and criminology (Sherman and Strang, 2007). This basic position has
been reconfirmed even by reformist proponents. Pawson (2006: 177–8), for example,
assures his readers that ‘[t]he percolation of evidence into policy is protracted
and convoluted’ even though he argues ‘for a different vision, a new paradigm, in
evidence-based policy’. Likewise, Nutley et al (2007: 297) have been ‘remaking the
case for a somewhat privileged role for research-based evidence in public policy and
service delivery’: ‘Our clear expectation here is that such
enhancements of research use (inclusively defined) will, for the most part, be to the
betterment of democratic discourse, public policy making, service organisation and
the quality of public services.’
Also at its core, EBP targets three distinct problem areas (Davies et al, 2000b: 5;
see Sedlačko and Staroňová, 2015: 15) for which distinct categories of instruments
are considered. The first problem concerns the generation and quality of evidence
(first aspect of level 1) and raises ‘questions of methodology’ (Davies et al, 2000b:
4). Instruments specifically framed and advocated here include RCTs, systematic
reviews, guidelines and various forms of evidence ‘hierarchies’ (Hansen and Rieper,
2009). The second problem concerns the dissemination structures by which evidence
is communicated (second aspect of level 1). To address this problem, knowledge
brokering through organisations such as Cochrane, Campbell, EPPI and NICE is seen
as the instrument of choice (Van Kammen et al, 2006; MacKillop et al, 2020). Finally,
EBP also frames the use of instruments that help to incorporate evidence into decision
making (level 2), such as RIA and other forms of ex-ante and ex-post evaluation
(Dunlop et al, 2012; Dunlop and Radaelli, 2019).
At its periphery, EBP has undergone conceptual modifications over the years
(French, 2019). We highlight two main areas here. The first is the question of what
sort of evidence should be used to improve policymaking. Whereas dogmatic versions
Notes
1 https://policy.bristoluniversitypress.co.uk/journals/evidence-and-policy/about
2 https://web.archive.org/web/20080703165242/http://evidencenetwork.org/Mission.
html
3 Since 2012, the UK-based Alliance for Useful Evidence – another non-governmental,
‘open access network of more than 4,300 individuals from across government, universities,
charities, businesses, and local authorities in the UK and internationally’ (https://www.
alliance4usefulevidence.org/about-us/aboutus) – serves a similar aim. The alliance is
hosted by the UK’s innovation charity Nesta and specialises in promoting EBP-related
research, training, advice and advocacy.
4 http://nnerpp.rice.edu/about
5 https://www.povertyactionlab.org/about-j-pal
6 We are grateful to one of our anonymous reviewers for having pointed this out.
Funding
This work was supported by the German Ministry of Education and Research (BMBF)
under Grant FKZ: 01PU17017.
Acknowledgements
We thank Holger Straßheim for several rounds of inspiring discussions and his feedback
on earlier versions of our manuscript, Clemens Blümel for supporting our work in the
context of our joint project on the ‘Function, Reception, and Performativity of Review
Literature in Science’ (FuReWiRev) as well as three anonymous reviewers for their
critical reading and for suggesting substantial improvements.
Conflict of interest
The authors declare that there is no conflict of interest.
References
Amman, R. (2001) Evidence-based policy: taking the vision forward, in Social Sciences
for Knowledge and Decision Making, Paris: OECD, pp 73–7.
Banks, G. (2009) Evidence-based policy making: What is it? How do we get it? Canberra:
ANU Public Lecture Series.
Béland, D. and Howlett, M. (2016) How solutions chase problems: instrument
constituencies in the policy process, Governance, 29(3): 393–409.
Oakley, A., Gough, D., Oliver, S. and Thomas, J. (2005) The politics of evidence and
methodology: lessons from the EPPI-Centre, Evidence & Policy, 1(1): 5–32.
OECD (2001) Social Sciences for Knowledge and Decision Making, Paris: OECD.
OECD (2007) Evidence in Education. Linking Research and Policy, Paris: OECD.
OECD (2013) Strengthening Evidence-based Policy Making on Security and Justice in
Mexico, Paris: OECD.
OECD (2015) OECD Regulatory Policy Outlook 2015, Paris: OECD.
OECD (2017a) Evidence-based Policy Making for Youth Well-being: A Toolkit, Paris: OECD.
OECD (2017b) Improving Regulatory Governance, Paris: OECD, pp 171–224.
Oliver, K., Lorenc, T. and Innvaer, S. (2014) New directions in evidence-based policy
research: a critical analysis of the literature, Health Research Policy and Systems, 12(1): 34.
Parkhurst, J. (2017) The Politics of Evidence: From Evidence-based Policy to the Good
Governance of Evidence, Abingdon: Routledge Studies in Governance and Public
Policy, Routledge.
Pawson, R. (2001a) Evidence based policy: I. In search of a method, Centre for
Evidence Based Policy and Practice (CEBPP) Working Paper 3, London.
Pawson, R. (2001b) Evidence based policy: II. The promise of ‘realist synthesis’,
Centre for Evidence Based Policy and Practice (CEBPP) Working Paper 4, London.
Pawson, R. (2006) Evidence-based Policy: A Realist Perspective, London: SAGE.
Petticrew, M. and Roberts, H. (2003) Evidence, hierarchies, and typologies: horses for
courses, Journal of Epidemiology & Community Health, 57(7): 527–29.
Porter, S. and O’Halloran, P. (2009) The postmodernist war on evidence-based practice,
International Journal of Nursing Studies, 46(5): 740–48.
Radaelli, C. (2007) Whither better regulation for the Lisbon agenda?, Journal of
European Public Policy, 14(2): 190–207.
Radaelli, C.M. and Meuwese, A.C.M. (2010) Hard questions, hard solutions:
proceduralisation through impact assessment in the EU, West European Politics,
33(1): 136–53.
Results First Initiative (2014) Evidence-based Policymaking: A Guide for Effective
Government.
Rip, A. (2001) In praise of speculation, in Social Sciences for Knowledge and Decision
Making, Paris: OECD, pp 95–103.
Sackett, D., Rosenberg, W., Gray, M., Haynes, B. and Richardson, S. (1996) Evidence
based medicine: what it is and what it isn’t, British Medical Journal, 312: 71–72.
Sanderson, I. (2003) Is it ‘what works’ that matters? Evaluation and evidence-based
policy-making, Research Papers in Education, 18(4): 331–45.
Sanderson, I. (2006) Complexity, ‘practical rationality’ and evidence-based policy
making, Policy & Politics, 34(1): 115–32.
Sayer, P. (2020) A new epistemology of evidence-based policy, Policy & Politics, 48(2):
241–58.
Sedlačko, M. and Staroňová, K. (2015) An overview of discourses on knowledge in
policy: thinking knowledge, policy and conflict together, Central European Journal
of Public Policy, 9(2): 10–31.
Shaneyfelt, T. (2016) Pyramids are guides not rules: the evolution of the evidence
pyramid, BMJ Evidence-Based Medicine, 21(4): 121–22.
Sherman, L. (2003) Misleading evidence and evidence-led policy: making social
science more experimental, The Annals of the American Academy of Political and Social
Science, 589(1): 6–19.
Sherman, L. and Strang, H. (2007) Restorative Justice: The Evidence, London: Smith
Institute.
Simons, A. and Voß, J.P. (2015) Politics by other means: the making of the emissions
trading instrument as a ‘pre-history’ of carbon trading, in B. Stephan and R. Lane
(eds) The Politics of Carbon Markets, London & New York: Routledge, pp 51–68.
Simons, A. and Voß, J.P. (2017) Policy instrument constituencies, in M. Howlett and
I. Mukherjee (eds) Handbook of Policy Formulation, Cheltenham: Edward Elgar, pp
355–72.
Simons, A. and Voß, J.P. (2018) The concept of instrument constituencies: accounting
for dynamics and practices of knowing governance, Policy and Society, 37(1): 14–35.
Simons, A., Lis, A. and Lippert, I. (2014) The political duality of scale-making in
environmental markets, Environmental Politics, 23(4): 632–49.
Smith, A. (1996) Mad cows and ecstasy: chance and choice in an evidence-based
society, Journal of the Royal Statistical Society: Series A, 159(3): 367–84.
Solesbury, W. (2001) Evidence based policy: whence it came and where it’s going,
Centre for Evidence Based Policy and Practice (CEBPP) Working Paper 1, London.
Straßheim, H. (2020) Who are behavioural public policy experts and how are they
organised globally?, Policy & Politics, 49(1): 69–86.
Straßheim, H. and Kettunen, P. (2014) When does evidence-based policy turn into
policy-based evidence? Configurations, contexts and mechanisms, Evidence & Policy,
10(2): 259–77.
Sutcliffe, S. and Court, J. (2005) Evidence-based policymaking: What is it? How does it
work? What relevance for developing countries? London: Overseas Development Institute.
Timmins, N., Rawlins, M. and Appleby, J. (2016) A Terrible Beauty: A Short History
of NICE, the National Institute for Health and Care Excellence, Nonthaburi: HITAP.
Van Kammen, J., de Savigny, D. and Sewankambo, N. (2006) Using knowledge
brokering to promote evidence-based policy-making: the need for support
structures, Bulletin of the World Health Organization, 84(8): 608–12.
Voß, J.P. and Freeman, R. (2015) Knowing Governance, Basingstoke: Palgrave Macmillan.
Voß, J.P. and Simons, A. (2014) Instrument constituencies and the supply side of policy
innovation: the social life of emissions trading, Environmental Politics, 23(5): 735–54.
Voß, J.P. and Simons, A. (2018) A novel understanding of experimentation in
governance: co-producing innovations between ‘lab’ and ‘field’, Policy Sciences,
51(2): 213–29.
Weible, C. (2018) Instrument constituencies and the advocacy coalition framework:
an essay on the comparisons, opportunities, and intersections, Policy and Society,
37(1): 59–73.
Weingart, P. (2001) Paradoxes of scientific advice to politics, in Social Sciences for
Knowledge and Decision Making, Paris: OECD, pp 78–94.
What Works Network (2018) The What Works Network. Five Years On, London: What
Works Network.
Whitehurst, G. (2012) The value of experiments in education, Education Finance and
Policy, 7(2): 107–23.
Young, K., Ashby, D., Boaz, A. and Grayson, L. (2002) Social science and the evidence-
based policy movement, Social Policy and Society, 1(3): 215.