Facts and Myths about Implementation Research
Policy Studies Journal, 33(4), 2005
Despite several decades of research on public policy implementation, we know surprisingly little, not
only about cumulative research results, but also about several other key aspects of this research field.
This article tries to amend these deficiencies by presenting the results of a comprehensive literature
survey. Its main purpose is to challenge, revise, and supplement some conventional wisdom about
implementation research. A second motivation is to lay the foundation for and initiate a much needed
synthesis of empirical research results. The main results are: The overall volume of publications on
policy implementation has not stagnated or declined dramatically since the mid 1980s as is commonly
asserted. On the contrary, it has continued to grow exponentially through the 1990s and into the
twenty-first century. Even more surprising is that a large number of publications are located outside
the core fields. Hence, the literature is substantially larger and more multidisciplinary than most com-
mentators realize. Doctoral dissertations are the most ignored, but probably the richest, largest, and
best source of empirical research results. The received account of the origin and location
of the disciplinary and geographical cradle of implementation studies must also be
readjusted significantly. The ethno-
centric bias of this research field toward the Western hemisphere has been, and still is, strong and some
policy sectors are given much more attention than others. Although positive in many ways, the pre-
dominant multidisciplinary character of implementation research still poses some serious problems
with respect to theory development. Thus, I discuss whether a resurgence of interest in policy imple-
mentation among policy scholars may already be occurring. Finally, I suggest that the time is long
overdue for efforts to synthesize research results in a more rigorous scientific manner than has hith-
erto been done.
KEY WORDS: public policy implementation, bibliometric survey, origin, size, development,
disciplinary structure, relevance, research agenda
Introduction
Science must begin with myths and with the criticism of myths.
—Karl Popper
0162-895X © 2005 The Policy Studies Journal
Published by Blackwell Publishing, Inc., 350 Main Street, Malden, MA 02148, USA, and 9600 Garsington Road, Oxford, OX4 2DQ.
560 Policy Studies Journal, 33:4
fragmented and inaccessible (Cooper & Hedges, 1994; Hunt, 1997). Quite a few
state-of-the-art reviews on public policy implementation research have been pub-
lished over the years (Alexander, 1985; Barrett, 2004; Goggin et al., 1990; Hill, 1997;
Hill & Hupe, 2002; Ingram, 1990; deLeon, 1999a; deLeon & deLeon, 2002; Lester &
Goggin, 1998; Lester et al., 1987; Linder & Peters, 1987; McLaughlin, 1987; Matland,
1995; May, 2003; O’Toole, 1986; 2000; 2004; Palumbo & Calista, 1990; Ryan, 1996;
Sabatier, 1986; Schofield, 2001; Schofield & Sausman, 2004; Sinclair, 2001; Winter,
1990; Winter, 2003; Yin, 1982). They have all, in their time, pointed to important
contributions and shortcomings in implementation research and offered valuable
advice with respect to improvements.
Nevertheless, surprisingly little is known about several key features of this type
of research, which will be explicated further below. Another related problem is that
most reviewers fail to explain how they arrive at their many factual statements and
interpretations. This is not uncommon among review articles in the social sciences,
according to Jackson (1980, 444), but it precludes judgments regarding the validity of
their assertions and conclusions. For the few exceptions, see O’Toole (1986), Hill and
Hupe (2002), and Sinclair (2001).
As a result, a certain story about implementation research—its origin, discipli-
nary foundation and development pattern, and so forth—has been retold so often
that it has now become institutionalized in the form of conventional wisdom. The
story has acquired this stature because it has never been challenged. Some key ele-
ments of this conventional wisdom were created by Pressman and Wildavsky (1973),
the presumed founding fathers of this type of research, at a time when it was much
more difficult to keep track of research publications and their contents. These found-
ing fathers made their claims in good faith. However, given the superior research
tools and data resources at hand today, there is now little excuse for uncritically
perpetuating that dubious account. It is time to set the record straight!
What parts of the institutionalized story about implementation research are
correct, and which are not? The story is also incomplete in certain respects; there
are some important aspects of this research about which no one has claimed
knowledge. The primary aim of this article is to provide a more complete, factual
account about implementation research at the outset of the new millennium. This
entails recounting its origin, evolution, recent development, size, disciplinary foundation,
and internal structure. The next two sections further elaborate the research questions
and related research methodologies.
Research Questions
The most frequently debated issue of the 1990s was the state of implementation
research. Is this a field of investigation where the volume of research has declined
to the extent that it needs revitalization, as many contemporary commentators have
suggested (Barrett, 2004; deLeon, 1999, 2002; Lester & Goggin, 1998; Schofield, 2001;
Schofield & Sausman, 2004; Winter, 1999, 2003)? Or is the opposite the case, as others
predicted at the outset of the previous decade (Goggin et al., 1990)?
Saetren: Facts and Myths about Implementation Research 561
As most reviewers of implementation research have had little to say about their
data sources and methodologies, I find it necessary to rectify this neglect. This study
has relied on many data sources and research methods. Nonetheless, the digitized
scientific literature databases presently available at most universities, together with
a related search instrument called bibliometrics,1 proved to be the most important by
far. Three databases in particular, the Expanded Social Science Citation Index, World
Catalogue, and Digital Dissertations (Dissertation Abstracts), were utilized because
they are interdisciplinary and together cover all major types of publications.
They contain vast amounts of information, but of variable quality.2 Still, there are
Research Results
In the early 1990s scholars like Goggin et al. (1990, 9) were quite upbeat about
the future of implementation research, proclaiming interest “is likely to grow during
the 1990s and continue well into the twenty-first century. In fact, the field of public
policy in the next decade will very likely be defined by the focus on implementa-
tion. The nineties are likely to be the implementation era” (italics in original).
This optimistic view, however, lost ground during the latter part of the 1990s as
it dawned upon an increasing number of scholars that it might be misplaced. Saetren
(1996) presented some preliminary data to the effect that a pronounced and steady
decline in research publications on policy implementation from 1984 up to 1995 had
occurred. At approximately the same time Hill (1997) and Lester and Goggin (1998)
arrived at the same conclusion. Other commentators in recent years have also var-
iously subscribed to the thesis of decline (Barrett, 2004; Hill & Hupe, 2002; O’Toole,
2000, 2004; Schofield, 2001; Schofield & Sausman, 2004; Winter, 2003). The facts of
the matter would then seem to have been settled fair and square in favor of the
decline thesis. Not quite! It turns out that we can all be simultaneously right and
wrong, depending on how we qualify our statements.
If the total picture is examined, that is, all publications jointly (Figure 1), the
data overwhelmingly refute the decline proposition. Although publications peaked
and declined somewhat during the mid-1980s, their numbers had bounced back
well before the end of that decade. Moreover, the trend continued unabated, even
growing beyond previous levels through the 1990s and into the twenty-first century.
This exponential growth amounted to the near doubling of publications during the
period after 1984 compared with the previous period (Table 1). Although the output
of some publications increased more rapidly than others, over time none of these
trends lend support to a demise or decline thesis.
This constitutes a paradox. How can it be that so many of us scholars came to
believe that the interest in implementation research had declined to an alarmingly
low level? How did the exponential growth of research literature reported here
escape our attention? The next section, which explores the disciplinary structure of
implementation research, provides some clues to an answer.
Figure 1. Number of Articles, Books, and Chapters on Policy Implementation by Year of Publication.
Many authors have long believed that studies of policy implementation are
mostly carried out by those who have an interest either in political science in
general, or in public administration and public policy in particular. This notion may
seem to follow logically from the fact that a more fundamental understanding of
implementation as an integral part of the public policy process must draw upon and
bridge insights from these fields of knowledge (Winter, 2003). This is also our justi-
fication for defining journals in these three subfields as the core. The heretical ques-
tion raised here is: Does such a core really exist?
The implication of this credo is that we should expect a clear majority of arti-
cles on policy implementation to be published in core journals. Table 2 clearly shows
that such is not the case at all. Articles published in noncore journals far outnum-
ber those appearing in core journals. An alternative and more modest expectation
can be derived from one of the so-called laws of bibliometrics, namely Bradford’s
law. It states that journals in any field of investigation can be divided into three main
categories, each containing roughly the same number of articles: (i) a nucleus of rel-
atively few journals on the subject publishing approximately one-third of all arti-
cles, defined as the core; (ii) another group consisting of many more journals that
produces approximately the same number of articles, referred to as near-core; and
(iii) an even more distant category of a still larger number of journals—the truly
noncore—which accounts for the last third of articles (Potter, 1988). To test this thesis
I divided all journals registered in a database into these same categories.3
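Bradford's three-zone partition can be sketched in a few lines of code. The journal counts below are invented purely for illustration, and the cut rule (advance to the next zone at each cumulative third of articles) is one simple way to operationalize the law, not the procedure used in this study:

```python
def bradford_zones(journal_counts):
    """Rank journals by article count and cut the cumulative output
    into three zones, each holding roughly one-third of all articles."""
    ranked = sorted(journal_counts.items(), key=lambda kv: -kv[1])
    total = sum(n for _, n in ranked)
    zones = [[], [], []]
    running, cut = 0, 0
    for journal, n in ranked:
        running += n
        zones[cut].append(journal)
        # move to the next zone once another third of the articles is covered
        if cut < 2 and running >= (cut + 1) * total / 3:
            cut += 1
    return zones  # [core, near-core, noncore]

# Hypothetical productivity data: a few prolific titles and a long tail
counts = {f"Journal {i}": c for i, c in enumerate(
    [30, 25, 20, 10, 8, 7, 6, 5] + [2] * 20 + [1] * 21)}
core, near_core, noncore = bradford_zones(counts)
print(len(core), len(near_core), len(noncore))  # few core, many noncore journals
```

The characteristic Bradford pattern appears even in this toy data: each zone holds about a third of the articles, but the number of journals per zone grows sharply from core to noncore.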
The data (Table 2) repudiate even this more moderate expectation derived from
Bradford’s law. Only in 1980 (Table 3) did I come close to the law’s prediction, as
30 percent of all articles that year appeared in core journals. It is,
however, clearly an exception to the general rule. Only 13 percent of all implemen-
tation articles between 1942 and 2003 were published in core journals. Another 14
percent appeared in the near-core category. Doctoral dissertations reveal a similar
pattern (not shown in Table 2) as approximately 15 percent of all are classified as
belonging to the core fields of political science or public administration. This means
that about three-fourths or more of these two types of publication were located
outside the core.
What academic disciplines, then, constitute the noncore group? It turns out that
education and health journals each account for 15 percent of all implementation arti-
cles (Table 2). Journals devoted to law, economics, and the environment are also impor-
tant as they jointly account for another 14 percent. The education profession figures
much more prominently among doctoral dissertations, as close to two-thirds have
this affiliation. These data attest to the pronounced multidisciplinary character of
research on public policy implementation and a surprisingly modest contribution
of scholars from the core fields. They also help explain why so many of us have
been oblivious to the continued rapid growth of implementation research during
the 1990s: it happened disproportionately in other academic fields and in a bewil-
dering array of different and largely peripheral journals (see Table 4) to which we
could not pay much attention, even if we so desired. There are, in other words, good
excuses for our previous ignorance.
Even within the core journals, the decline notion cannot be sustained. The
volume of articles in these journals did peak during the mid-1980s and subsequently
stabilized at a somewhat lower level (Figure 2). It would, however, be rather far-
fetched to claim, as some do, that the whole research field has virtually disappeared.
In fact the number of articles in core journals almost doubled during the period from
1985 compared with the previous period. Why then is the notion of the demise of
implementation research so prevalent in spite of facts that clearly suggest the contrary?
Figure 2. Articles on Policy Implementation Published in Core and Noncore Journals
by Year of Publication.
Table 5. Reviews of Books with Policy Implementation as the Key Word in Social Science Citation
Index Journals by Time Period
The paradox must also be explained by the surprising fact that implementation
research, despite its resilience in quantitative terms, went out of fashion among policy
scholars after a relatively short peak period that lasted only through the first half of
the 1980s. It is symptomatic that the last symposium on policy implementation
research was published in Policy Studies Review in the late 1980s.4 Even stronger and
more convincing support for this interpretation is provided by data on the number
of book reviews on policy implementation in journals included in the Social Science
Citation Index. Fifty-five percent of all book reviews (Table 5) appeared during the
relatively short period from 1980 to 1985, and only 16 percent date after 1990. We have
already seen how the declining interest in implementation studies from around the
mid-1980s coincided with a temporary stagnation and drop in publication output.
More interestingly, the fact that publication volume returned to previous levels
toward the end of the 1980s and has continued unabated ever since, at an even
higher level, did not mean that implementation research had once again become
fashionable among policy scholars.
Thus, Goggin et al. (1990) were simultaneously both right and wrong. Imple-
mentation research continued to grow as they predicted during the 1990s and into
the twenty-first century. The field of public policy during the same period, however,
has not been defined by its focus on implementation. On the contrary, even leading
policy scholars came to regard the implementation focus as an intellectual “dead end”
and turned their attention toward other topics such as policy change, policy net-
works, or governance. Indeed, implementation studies became so unfashionable that
even those who still considered them important mistakenly thought they had essen-
tially disappeared. This erroneous notion was exacerbated by a myopic disciplinary
conception of the research field.
Fortunately, as I have pointed out, the interest in policy implementation studies
has been profound and widespread enough to defy narrow disciplinary boundaries.
Thus, its fate has been insulated from the impact of whimsical fashion trends among
scholars in any particular academic field, such as political science.
a perfectly fragmented publication pattern). Only one article was published in a core
journal. The closest we came to the opposite pattern, and that implied by Bradford’s
law, was in 1980 when one (core) journal (Policy Studies Journal) accounted for 18
percent of all articles published that year. Nevertheless, even in that year, the one
hundred articles published appeared in 78 different journals, 14 of which were of
the core type. This fragmented publication pattern obviously poses great challenges
for any efforts at systematic knowledge accumulation.
The core consists of close to 60 different journals that collectively published 454
implementation articles over a 60-year time period (see Appendix 1). Approximately
half of these articles were published in public policy journals, one-third in public
administration journals, and one-fifth in political science journals (Table 4). It is note-
worthy that the position of public administration journals relative to policy journals
has been reversed during the 1990s compared to earlier. This pattern seems to reflect
a growing interest in implementation research among editors and contributors to
public administration journals during the 1990s. The emergence of the Journal of
Public Administration Research and Theory in the late 1980s is but one of several indi-
cators of this trend. The relatively modest but stable fraction of articles published
in political science journals is also worth noticing.
Two policy journals—Policy Studies Journal and Policy Studies Review—published
the largest share of core articles followed by the Public Administration Review. These
three journals alone published 30 percent of the total. At the other end of the spec-
trum we have the prestigious American Political Science Review with only three arti-
cles, none of which are cited frequently.
The documented explosive growth of implementation research, even in recent
years, invites the intriguing question of what its present size might be. The next
section attempts to provide an approximate answer.
Few, if any, serious efforts have ever been made to estimate the total volume of
research on public policy implementation, and for good reason: it is a very demand-
ing and difficult task. Some have even concluded that it is a futile endeavor (Hill &
Hupe, 2002). To this I would counter that even having only an approximate esti-
mate is better than having no clue at all. Furthermore, as others have emphasized,
knowing something about the scale of the total universe of publications in any
research area is a prerequisite as well as the starting point for any serious synthe-
sizing efforts (Cooper & Hedges, 1994).
There appear to have been only three earlier attempts to assess the volume of research
on public policy implementation (Hill & Hupe, 2002; O’Toole, 1986; Sinclair, 2001).
They were, however, as far as I can ascertain, limited to a particular type of
publication and time period. I know of no systematic attempt to document the
volume of several types of publication over an extended time period.
The truism that Pressman and Wildavsky (1973) pioneered the more explicit use
of implementation as a key term was in no small part created by the authors them-
selves in a postscript chapter, Appendix 2, titled “Use of ‘Implementation’ in Social
Science Literature”:
There is (or there must be) a large literature about implementation in the
social sciences—or so we have been told by numerous people. None of them
can come up with specific citations to this literature but they are certain it
must exist. . . . It must be there, it should be there, but in fact it is not.
. . . How shall we persuade others that it is fruitless to look for a literature
that does not exist? . . . (p. 166)
It has since been pointed out that many earlier scholars had studied the imple-
mentation of policies using other labels. Few, however, have challenged the notion
that Pressman and Wildavsky were the first scholars to use implementation as an
explicit analytic research term. They made their claim in good faith based on limited
library resources available at that time. My data clearly demonstrate that the claim
by Pressman and Wildavsky is grossly exaggerated. A substantial number of books,
journal articles, and doctoral dissertations using implementation or implementing as a
title word can be traced back to at least the mid-1950s, and even earlier (see Figure 1).
Doctoral students are the pioneers of implementation research! The first two
dissertations I have registered date back to the early 1930s (Charlesworth, 1933) and
1940s (Laatsch, 1942). Their authors were political scientists and their dissertations
focused on international politics. Neither feature, though, was typical of the first
wave of implementation studies. The next dissertations, by Manwell (1947) and Little (1948),
are more typical with their background in the education profession and empirical
focus on educational institutions. Before 1970 the number of doctoral dissertations
on policy implementation had reached well over a hundred.
The first two books (Loughery, 1952; Turton, 1955) dealt with implementation
aspects of U.S. educational law and the American Constitution, respectively.
Five years later another book appeared, and then another 19 before 1970. The
first article, published in a law journal in the late 1940s (Briggs, 1948), dealt with the
implementation of a policy issue that is equally relevant today: international human
rights legislation. Before 1970 another 134 articles had been published.
It is also important to note that the “take-off” in terms of publication volume
did not commence with the publication of Pressman and Wildavsky’s book, as is
commonly assumed. Rather, it started at least 10 years earlier. Likewise, my data
challenge the commonly held notion that the focus on policy implementation was
triggered by the early evaluation studies in the mid-1960s of American social reform
programs. Those events probably added momentum to the new research genre, but
they were not the starting point. To sum up: by the time Implementation was
published in 1973, close to thirty books and over two hundred journal articles and
doctoral dissertations, respectively, employing implementation or implementing as
a title word had been published or defended!
A word of caution may be in order here. I have not yet been able to investigate
closely the actual use of the implementation concept in these early publications.
Nonetheless, I assume it is not used casually or accidentally in most of them, par-
ticularly not with respect to doctoral dissertations.
Having already established that doctoral students pioneered the study of policy
implementation, I find it logical to use the location of the degree-awarding higher
educational institutions as a prime indicator of where this type of research origi-
nated. Close to half the doctoral dissertations in this field before 1970 can be traced
to American universities in the Northeast and just fewer than 40 percent belonged
to those in the Midwest region. Finally, universities on the West Coast and in the
South accounted for less than 10 percent, respectively. I found only three doctoral
Hill and Hupe (2002, 201–204) investigated what policy issues implementation
scholars study by scrutinizing articles published in the most prominent core jour-
nals during the 1990s. Their results match mine closely. I found that implementa-
tion scholars consistently pay greater attention to the following five policy issues:
educational, health, environmental, social, and economic issues (Table 6).
There is, however, considerable variation among types of publication with
respect to their relative focus on each of these five policy sectors. Doctoral disser-
tations stand out, as close to two-thirds of these deal with educational issues. There
is not much variation in this fraction over time. The bias toward one particular
policy sector is much less in books, chapters, and journal articles. Articles in core
journals focus more on environmental and social issues (as Hill and Hupe also
found) while those published in noncore journals are more preoccupied with edu-
cational and health issues. These patterns seem to indicate that political scientists
have a broader research agenda than other professions in terms of sector focus.
Disregarding doctoral dissertations—where there is little variation over time—
attention to health and environmental issues after 1984 has increased compared with
the previous period (Tables 2a and 2h in Appendix 2). The trend at the outset of this
century is that health issues are rivaling or even slightly surpassing those of educa-
tion. The overwhelming majority of sectors and issues studied are domestic. Studies
on foreign policy are conspicuously rare (1–2 percent of all publications), with the
exception of the very early works. Another trend toward the end of the previous
century is that some policy issues that initially had a domestic focus, for example,
environment and civil or human rights, have shifted toward a more international
and global focus.
To sum up, there is a clear bias in research topics among implementation schol-
ars toward a few policy sectors. Still, I find it hard to conclude that these issues and
the challenges they present are not among the salient ones in contemporary soci-
eties. To what extent this reflects a regional bias in implementation research will be
clarified next.
The overall regional bias of implementation research is quite strong (Table 7).
Close to three out of every four publications have either an American/Canadian
author or a North American empirical focus. If Europe is included, then the Western
hemisphere accounts for close to 90 percent of all publications. The pattern is to a
large extent influenced by the overwhelmingly ethnocentric focus in doctoral
dissertations (87 percent) of predominantly North American origin. Even among the
other types of publications, more than half originate in or focus on the North American
continent. The ethnocentric bias is more pronounced in articles than in books.
The good news here is that the regional focus of implementation studies has
broadened since the mid-1980s, although mostly in favor of Europe. Among the
other regions, Asia is studied about twice as often as Africa, and Latin America and
Oceania appear as the most neglected continents.
Some early implementation scholars (Smith, 1973) thought implementation
problems were primarily a third-world phenomenon while others like Pressman
and Wildavsky (1973) demonstrated their endemic character in more developed
countries as well. With few exceptions (e.g., Brinkerhoff, 1996; Cheema & Rondinelli,
1983; Grindle, 1980; Grindle & Thomas, 1990; Lazin, 1999), the community of
implementation researchers seems to have followed the lead provided by Pressman
and Wildavsky.
There are at least six factors that may have contributed to the declining interest
in implementation research among policy scholars. First is the protracted and sterile
debate about the two competing analytical paradigms labeled top-down versus
bottom-up, which dominated the research field through most of the 1980s. It was a
confusing debate in the way normative, methodological, and theoretical aspects were
almost seamlessly and indistinguishably intertwined. I suspect that this entrenched
and prolonged debate frustrated many scholars to the extent that they exited the
whole research enterprise. Second, state–society relations in many industrialized
nations have changed during the last few decades from unilateral and hierarchical
to more reciprocal and less hierarchical ones. This has been linked to a change in
governments’ political philosophy from the early 1980s most noticeably embodied
in the new radical policy programs of Reagan in the United States and Thatcher in
the United Kingdom. The advocacy of broad-scale retreat from government inter-
vention in the economy and more reliance on market-based policy instruments was
a cornerstone in these programs. These events, of course, did not escape the notice
of policy scholars who increasingly started to substitute the terms government and
implementation with those of governance and policy networks in order to better
describe new realities.
Third, many early implementation studies had a pronounced failure bias. Influ-
ential authors, like Pressman and Wildavsky (1973), even used probability theory
to argue that the chances of any governmental program succeeding were indeed
slim. Their conclusion was later disputed and shown to be overly pessimistic
because of some dubious assumptions in the calculations (Bowen, 1982). Nevertheless,
the failure notion associated with implementation research stuck. It earned the nick-
name “misery research” (Rothstein, 1988). This grim picture of the fate of public
policy making generally fueled more conservative criticism of social reform pro-
grams in the United States in the 1980s. They were increasingly perceived as futile
and a waste of money, thereby undermining their political backing in general
(Murray, 1984). Thus, implementation studies, by being hijacked for particular ide-
ological purposes, unwittingly also contributed to delegitimizing themselves as a
research genre.
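The probability logic behind the failure argument discussed above is simple multiplication: in Pressman and Wildavsky's well-known illustration, the Oakland program had to pass roughly 70 clearance points, and if each clearance is treated as independent, even high per-point probabilities of agreement compound into a tiny joint probability of success. The sketch below uses illustrative probabilities; the independence assumption it encodes is precisely what Bowen (1982) disputed:

```python
def joint_success(p_clear, n_points=70):
    """Probability of clearing all decision points, assuming each
    clearance is independent and succeeds with probability p_clear."""
    return p_clear ** n_points

# Even 95 percent agreement at every single point compounds
# to well under 3 percent overall across 70 clearance points.
for p in (0.80, 0.95, 0.99):
    print(f"p = {p:.2f} -> joint probability of success = {joint_success(p):.6f}")
```

Bowen's critique amounts to observing that clearances are not independent draws: bargaining, packaging, and repeated tries raise the effective probability well above what this multiplication suggests.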
Fourth, policy scholars themselves started to express doubts about the extent to
which the policy process could be neatly segmented into discrete stages that pro-
gressed sequentially from agenda setting, through adoption, implementation, and
subsequent policy phases (Jenkins-Smith & Sabatier, 1993; John, 1998; Nakamura,
1987; Sabatier, 1988, 1999). The stages metaphor was accused of oversimplifying and
thus also misrepresenting a more complex and recursive policy process. Although
somewhat overstated at times, the criticism has at least reminded us of what Press-
man and Wildavsky emphasized in their seminal book: that implementation cannot
fruitfully be studied and understood as a discrete policy activity. Regardless of
intention, it may also have contributed further to the delegitimization of
implementation studies.
Fifth, it is easier to write something that gets published at the early stage of a
new research genre than many years later when saying something new and origi-
nal becomes much harder. For the ambitious, prolific author it may then be more
convenient to move on to a new, emerging research topic. Last but not least,
policy scholars are probably susceptible to fashion trends in science and
society with respect to their research agendas. The implication of the last two factors
is that few, if any, research topics—including policy implementation—will be able to
hold policy scholars’ attention for any extended period of time. These factors are,
of course, not mutually exclusive. Working together, they may provide a plausible
explanation for policy scholars’ declining interest in doing implementation research.
The current situation in this field of investigation suggests that the relevance
question is not rhetorical to the larger community of policy researchers. Hence, the
relevance arguments need to be set forth. First, and as others (e.g., Hill & Hupe,
2002, 111) have recently pointed out, there is nothing in the changing state–society
interface toward more cooperative and negotiated networks that precludes an
implementation perspective. On the contrary, given these new systemic features,
representative governments still exist whose translation of policies into practice,
even in some broader sense, is a challenge and a legitimate concern. The multitude
of reports on implementation issues produced by governments and various inter-
national agencies each year attest to their endemic nature. Developments toward
supranational governing structures such as the European Union or international
agreements and conventions on many salient policy issues also typically pose for-
midable implementation challenges. Second, the criticism concerning bias toward
failure cases is probably less relevant today than it once was (O’Toole, 2000). Third,
although some caution concerning the use of the stages metaphor is justified, I, like
some others (e.g., deLeon, 1999), find its total dismissal by Sabatier and others
unwarranted and hence misguided.10 Fourth, we are not even close to a well-
developed theory of policy implementation, although as several commentators have
pointed out, some progress has been made (e.g., O’Toole, 2000; Winter, 2003). In this
respect it is unfortunate that policy scholars have lost interest in researching a
phenomenon for which they are theoretically better equipped than any other group
of scholars. This fact bestows both analytical primacy and special responsibility
upon policy scholars with respect to spearheading theorizing about policy
implementation.
574 Policy Studies Journal, 33:4
Schmidt, & Jackson, 1982) and have been applied in other related fields of the social
sciences (Hendricks & Singhal, 1997; Robertson & Seneviratne, 1995; Robertson et
al., 1993). The time is long overdue for efforts to synthesize the policy
implementation literature. What I have done is to draw up as complete a map of
the research literature as possible, one that can guide future efforts toward its
integration.
Notes
1. According to Hertzel (2003, 295–296), bibliometrics is the analysis of the structure of a body of liter-
ature using any type of quantitative method.
2. I would like to point out that problems of validity and reliability were endemic features of some of
these databases. For example, searches targeting books and doctoral dissertations specifically would
nevertheless retrieve many other types of publications as well. Publications registered more than
once were another not infrequent problem. Both flaws would, to the uncritical investigator's
dismay, produce inflated estimates of the actual number of relevant publications. WorldCat was by
far the worst in this respect. Dissertation Abstracts had mostly the former problem and less of the
latter, while the Social Science Citation Index had neither as far as I could tell.
3. Journals devoted to theory development in political science, public administration, or public policy/policy
analysis in general constitute the core. A list of these core journals is provided in Appendix 1. It should
be noted that this list does not cover all existing core journals. It covers only those in which I have
registered an implementation-titled article. Journals dedicated to the application of these three study
fields in a particular context are classified as near-core. Typical examples here are Energy Policy, Journal
of Health Politics & Policy, and Administration in Social Work. Journal of Curriculum Inquiry, Elementary
School Journal, and Journal of School Health exemplify the more distant noncore category.
4. The symposium articles appeared in a book 3 years later (Palumbo & Calista, 1990) together with
another seminal book (Goggin et al., 1990). Together they seem to mark the end of an era with respect
to interest in implementation studies by most policy scholars.
5. Actually, public policy implementation is the most correct and accurate composite key term in this
respect, but it is, unfortunately, employed so rarely by both authors and librarians as to render it
practically useless. Even the slightly simpler term policy implementation misses too many relevant
publications because of its inconsistent application. But the single term implementation, even as a title
word, creates the opposite problem as it produces many thousands of references, most of which turn
out to be irrelevant. Nevertheless, and despite a tedious and time-consuming weeding process, the
latter procedure was chosen to ensure comprehensiveness. Implementation as a key word was not used
since key words are not generally available in our databases before 1990, when they became
synonymous with title words.
6. There is no doubt, as has been repeatedly pointed out before, that the execution stage of policy
making was analyzed and studied without using the term implementation, both before and after
Pressman and Wildavsky elevated it to fame among policy scholars in the early 1970s. The problem
here is to come up with the full range of alternative terms that have been used in these cases and
which are needed to identify them. Compliance, enforcement, execution, governance, and regulation
are but a few of the many other terms that would be needed to produce a truly complete coverage
of everything that is relevant.
7. As an illustration, two-thirds of all journal articles using the title-word implementation had no
relevance to public policy in my fairly liberal judgment.
8. Three decisions were critical to the size estimate I arrived at. The first was to base my search strate-
gies on an understanding of implementation studies as a potentially multidisciplinary research field,
which meant broadening the mapping procedure beyond narrow disciplinary boundaries (e.g., Polit-
ical Science Abstracts). The second was to include doctoral dissertations, a hitherto hidden and
unrecognized important source of scientific research. The third decision was to extend the searches in all
data sources over the maximum time period feasible, both backwards and forwards.
9. Regardless of choices made during any search procedure, there are two classical problems the
researcher faces (White, 1994). The first is to miss publications that should have been included (false
negatives), a problem I have already elaborated upon above. The second is to include publications
that should not have been there (false positives). The magnitude of the second problem can only be
ascertained through a closer examination of each publication, or a random sample thereof, which is
beyond the scope of this article.
10. It is true that the public policy process does not necessarily follow the sequential logic of the stages
metaphor in all cases, especially not its later stages (like implementation, evaluation, feedback, and
policy learning). Nevertheless, the stages metaphor is important because it reflects institutional rules
and norms about how public policies should be transformed from ideas to practice in modern
political systems. The appropriateness of this analytical construct should not be taken for granted or
dismissed axiomatically. What the stages metaphor does is describe a hypothetical process
sequence, the validity of which must always remain an empirical question.
References
Alexander, E. R. 1985. “From Idea to Action: Notes for a Contingency Theory of the Policy Implementa-
tion Process.” Administration & Society 16 (4): 403–26.
Barrett, S. M. 2004. “Implementation Studies: Time for a Revival? Personal Reflections on 20 Years of
Implementation Studies.” Public Administration 82 (2): 249–62.
Borgman, C. L. 1990. Scholarly Communication and Bibliometrics. Newbury Park, CA, London, and New Delhi:
Sage.
Bowen, E. R. 1982. "The Pressman-Wildavsky Paradox: Four Addenda or Why Models Based on Probability
Theory Can Predict Implementation Success and Suggest Useful Tactical Advice for Implementers."
Journal of Public Policy 2 (1): 1–22.
Briggs, H. W. 1948. “Implementation of the Proposed International Covenant on Human Rights.” Amer-
ican Journal of International Law 42 (2): 389–97.
Brinkerhoff, D. W. 1996. “Process Perspectives on Policy Change. Highlighting Implementation.” World
Development 24 (9): 1395–402.
Charlesworth, J. C. 1933. The Implementation of the Pact of Paris. PhD diss., University of Pittsburgh.
Cheema, G. S., and D. A. Rondinelli. 1983. Decentralization and Development: Policy Implementation in Devel-
oping Countries. Beverly Hills, CA: Sage.
Cooper, H. M., and L. V. Hedges. 1994. The Handbook of Research Synthesis. New York: Russell Sage
Foundation.
Goggin, M. L., A. O'M. Bowman, J. P. Lester, and L. J. O'Toole Jr. 1990. Implementation Theory and Prac-
tice: Toward a Third Generation. Glenview, IL: Scott Foresman/Little, Brown.
Grindle, M. S. 1980. Politics and Policy Implementation in the Third World. Princeton, NJ: Princeton University Press.
Grindle, M. S., and J. W. Thomas. 1990. Public Choices and Policy Change. Baltimore/London: The Johns
Hopkins University Press.
Hendricks, K. B., and V. R. Singhal. 1997. “Does Implementing an Effective TQM Program Actually
Improve Operating Performance? Empirical Evidence from Firms That Have Won Quality Awards.”
Management Science 43 (9): 1258–74.
Hertzel, D. H. 2003. “Bibliometrics History.” In Encyclopedia of Library and Information Science, 2nd edn,
ed. M. A. Drake. New York: Marcel Dekker, 144–211.
Hill, M. 1997. “Implementation Theory: Yesterday’s Issue?” Policy and Politics 25 (4): 375–85.
Hill, M., and P. Hupe. 2002. Implementing Public Policy: Governance in Theory and in Practice. London,
Thousand Oaks, and New Delhi: Sage.
Hunt, M. 1997. How Science Takes Stock: The Story of Meta-Analysis. New York: Russell Sage Foundation.
Hunter, J., F. L. Schmidt, and G. B. Jackson. 1982. Meta-Analysis: Cumulating Research Findings across
Studies. Beverly Hills, CA: Sage.
Saetren: Facts and Myths about Implementation Research 577
Ingram, H. 1990. “Implementation: A Review and Suggested Framework.” (Chapter 18) In Public Admin-
istration: The State of the Art, ed. N. M. Lynn, and A. Wildavsky. Chatham, NJ: Chatham House,
462–480.
Jackson, G. B. 1980. “Methods for Integrative Reviews.” Review of Educational Research 50 (3): 438–60.
Jenkins-Smith, H., and P. A. Sabatier. 1993. “The Study of the Public Policy Process.” In Policy Change and
Learning: An Advocacy Coalition Approach, ed. P. A. Sabatier, and H. Jenkins-Smith. Boulder, CO: West-
view Press, 1–9.
John, P. 1998. Analysing Public Policy. London & New York: Pinter.
Kettunen, P. 2000. "Implementation Approach: The Political Scientist's Perspective." Policy Currents 10
(1–2): 3–5.
Kirst, M., and R. Jung. 1982. “The Utility of a Longitudinal Approach in Assessing Implementation: A
Thirteen-Year View of Title I, ESEA.” In Studying Implementation: Methodological and Administrative
Issues, ed. W. Williams, and R. E. Elmore. Chatham, NJ: Chatham House, 119–148.
Laatsch, M. H. A. 1942. The Implementation of the Monroe Doctrine. PhD diss., Princeton University.
Lazin, F. A. 1999. The Policy Implementation Process in Developing Nations. Stamford, Conn.: JAI Press.
deLeon, P. 1999a. “The Missing Link Revisited: Contemporary Implementation Research.” Policy Studies
Review 16 (3/4): 311–38.
———. 1999b. “Cold Comfort Indeed.” Policy Currents 8 (4): 6–8.
deLeon, P., and L. deLeon. 2002. “Whatever Happened to Policy Implementation? An Alternative
Approach.” Journal of Public Administration Research and Theory 12 (4): 467–88.
Lester, J. P. 2000. “Back to the Future in Implementation Research: A Response.” Policy Currents 10 (1–2):
2–5.
Lester, J. P., A. O'M. Bowman, M. L. Goggin, and L. J. O'Toole Jr. 1987. "Public Policy Implementation:
Evolution of the Field and Agenda for Future Research." Policy Studies Review 7 (1): 200–16.
Lester, J. P., and M. L. Goggin. 1998. “Back to the Future: The Rediscovery of Implementation Studies.”
Policy Currents 8 (3): 1–9.
Linder, S. H., and B. G. Peters. 1987. “A Design Perspective on Policy Implementation: The Fallacies of
Misplaced Prescription.” Policy Studies Review 6 (3): 459–75.
Little, J. R. 1948. A Study of Present Practices and the Opinions of Educational Authorities with Respect to the
Implementation of State Responsibility for Public Education and Its Application to the State of Colorado.
PhD diss., University of Colorado at Boulder.
Loughery, B. F. S. 1952. Parental Rights in American Educational Law. Their Basis and Implementation. Wash-
ington, DC: Catholic University of America Press.
McLaughlin, M. W. 1987. “Learning from Experience: Lessons from Policy Implementation.” Education
Evaluation and Policy Analysis 9 (2): 171–8.
Manwell, E. A. 1947. Health in High School Science. The Implementation of a Philosophy of Teaching in High
School Science with a Study of Accompanying Changes in Pupils. PhD diss., Columbia University.
Matland, R. E. 1995. “Synthesizing the Implementation Literature: The Ambiguity-Conflict Model of
Policy Implementation.” Journal of Public Administration Research and Theory 5 (2): 145–74.
May, P. 2003. “Policy Design and Implementation.” (Chapter 17) In Handbook of Public Administration, ed.
B. G. Peters, and J. Pierre. London, Thousand Oaks, CA and New Delhi, India: Sage, 223–233.
Meier, K. 1999. "Are We Sure Lasswell Did It This Way? Lester, Goggin, and Implementation Research."
Policy Currents 9 (1): 5–8.
Murray, C. 1984. Losing Ground: American Social Policy, 1950–1980. New York: Basic Books.
Nakamura, R. T. 1987. “The Textbook Policy Process and Implementation Research.” Policy Studies Review
7 (1): 142–54.
O’Toole, L. J., Jr. 1986. “Policy Recommendations for Multi-Actor Implementation: An Assessment of the
Field.” Journal of Public Policy 6 (2): 181–210.
———. 2000. “Research on Policy Implementation: Assessment and Prospects.” Journal of Public Admin-
istration Research and Theory 10 (2): 263–88.
———. 2004. “The Theory-Practice Issue in Policy Implementation Research.” Public Administration 82
(2): 309–29.
Palumbo, D. J., and D. J. Calista. 1990. Implementation and the Policy Process: Opening Up the Black Box.
New York: Greenwood Press.
Peters, B. G., and J. Pierre. 2003. Handbook of Public Administration. London, Thousand Oaks, CA, and New Delhi:
Sage.
Potter, W. G. 1988. “Of Making Many Books There Is No End: Bibliometrics and Libraries.” The Journal
of Academic Librarianship 14 (9): 1–3.
Potoski, M. 2001. "Implementation, Uncertainty and the Policy Sciences." Policy Currents 11 (1): 2–5.
Pressman, J. L., and A. B. Wildavsky. 1973. Implementation. How Great Expectations in Washington Are
Dashed in Oakland. Berkeley, CA and London: University of California Press.
Robertson, P. J., and S. J. Seneviratne. 1995. “Outcomes of Planned Organizational Change in the Public
Sector: A Meta-Analytic Comparison to the Private Sector.” Public Administration Review 55 (6):
547–57.
Rothstein, B. 1998. Just Institutions Matter: The Moral and Political Logic of the Universal Welfare State. Cam-
bridge, UK: Cambridge University Press.
Ryan, N. 1996. “Some Advantages of an Integrated Approach to Implementation Analysis. A Study of
Australian Industrial Policy.” Public Administration 74 (4): 737–53.
Sabatier, P. A. 1986. “Top-Down and Bottom-Up Approaches to Implementation Research: A Critical
Analysis and Suggested Synthesis.” Journal of Public Policy 6 (1): 21–48.
———. 1988. “An Advocacy Coalition Framework of Policy Change and the Role of Policy-Oriented
Learning Therein.” Policy Sciences 21 (2–3): 129–68.
———. 1999. “The Need for Better Theories.” (Chapter 1) In Theories of the Policy Process, ed. P. A. Sabatier.
Boulder, CO: Westview Press, 3–17.
Saetren, H. 1996. "Whatever Happened to Implementation Research?" Paper presented at the conference
of the Nordic Association of Political Science, Helsinki, Finland.
Schneider, A. L. 1999. “Terminator? Who, Me? Some Thoughts about the Study of Policy Implementa-
tion.” Policy Currents 9 (1): 1–5.
Schofield, J. 2001. “Time for a Revival? Public Policy Implementation: A Review of the Literature and
Agenda for Future Research.” International Journal of Management Review 3 (3): 245–63.
Schofield, J. 2004. “A Model of Learned Implementation.” Public Administration 82 (2): 283–308.
Schofield, J., and C. Sausman. 2004. “Symposium on Implementing Public Policy: Learning from Theory
and Practice: Introduction.” Public Administration 82 (2): 235–48.
Sinclair, T. A. P. 2001. “Implementation Theory and Practice: Uncovering Policy and Administration Link-
ages in the 1990’s.” International Journal of Public Administration 24 (1): 77–94.
Smith, T. B. 1973. “The Policy Implementation Process.” Policy Sciences 4 (2): 197–209.
Turton, J. 1955. Implementing the American Constitution. A Treatise on Constitutional Procedures with Special
Reference to the Supposed ‘Amendments.’ New York: Exposition Press.
White, H. D. 1994. “Scientific Communication and Literature Retrieval.” (Chapter 4) In Handbook of
Research Synthesis, ed. H. Cooper, and L. V. Hedges. New York: Russell Sage Foundation, 42–55.
Winter, S. 1990. “Integrating Implementation Research.” In Implementation and the Policy Process: Opening
Up the Black Box, ed. D. J. Palumbo, and D. J. Calista. New York: Greenwood Press, 19–38.
———. 1999. “New Directions for Implementation Research.” Policy Currents 8 (4): 1–5.
———. 2003. “Implementation Perspectives: Status and Reconsideration.” (Chapter 16) In Handbook of
Public Administration, ed. B. G. Peters, and J. Pierre. London, Thousand Oaks, CA, and New Delhi:
Sage, 212–222.
Yin, R. 1982. “Studying the Implementation of Public Programs.” In Studying Implementation: Method-
ological and Administrative Issues, ed. W. Williams, and R. E. Elmore. Chatham, NJ: Chatham House,
36–72.
Appendix 1
Policy Journals:
Policy Studies Journal (54)
Policy Studies Review (43)
Policy & Politics (23)
Public Policy/Journal of Policy Analysis and Management (20)
Policy Sciences (17)
Evaluation Review (15)
Journal of Public Policy (12)
Policy Analysis (7)
Policy Currents (7)
Journal of European Public Policy (4)
Canadian Public Policy (3)
Cato (3)
Policy Studies Annual Review (1)
Policy Studies (1)
Local Government Policy Making (2)
Policy Evaluation (1)
The Review of Policy Research (1)
American Journal of Evaluation (1)
Public Policy and Administration (1)
Public Administration Journals:
Public Administration Review (39)
Administration and Society (22)
Public Administration (21)
International Journal of Public Administration (15)
Australian Journal of Public Administration (12)
Journal of Public Administration Research and Theory (11)
American Review of Public Administration (11)
Indian Journal of Public Administration (8)
International Review of Administrative Science (6)
Philippine Journal of Public Administration (3)
Canadian Public Administration
Public Administration Quarterly (1)
Public Management Review (1)
Sage Public Administration Abstracts
Political Science Journals:
Publius (11)
European Journal of Political Research (9)
Western Political Research (9)
Scandinavian Political Studies (8)
Journal of Politics (7)
Political Science Quarterly (4)
American Political Science Review (3)
American Journal of Political Science (3)
Comparative Political Studies (2)
Political Research Quarterly (2)
Comparative Politics (2)
American Politics Quarterly (2)
Governance (2)
Journal of Political Economy (2)
Polity (2)
Politeia (2)
Table 2a. Books and Chapters by Policy Sector Focus and Period of Publication
Table 2g. Articles by Policy Sector, Type of Journal, and Period of Publication
Table 2h. Articles by Regional Focus, Type of Journal, and Period of Publication