
The Policy Studies Journal, Vol. 33, No. 4, 2005

Facts and Myths about Research on Public Policy Implementation: Out-of-Fashion, Allegedly Dead, But Still Very Much Alive and Relevant
Harald Saetren

Despite several decades of research on public policy implementation, we know surprisingly little, not only about cumulative research results, but also about several other key aspects of this research field. This article tries to remedy these deficiencies by presenting the results of a comprehensive literature survey. Its main purpose is to challenge, revise, and supplement some conventional wisdom about implementation research. A second motivation is to lay the foundation for, and initiate, a much needed synthesis of empirical research results. The main results are: The overall volume of publications on policy implementation has not stagnated or declined dramatically since the mid-1980s, as is commonly asserted. On the contrary, it has continued to grow exponentially through the 1990s and into the twenty-first century. Even more surprising is that a large number of publications are located outside the core fields. Hence, the literature is substantially larger and more multidisciplinary than most commentators realize. Doctoral dissertations are the most ignored, but probably the richest, largest, and best source of empirical research results. The conventional account of the origin of implementation studies, and of its disciplinary and geographical cradle, must also be readjusted significantly. The ethnocentric bias of this research field toward the Western hemisphere has been, and still is, strong, and some policy sectors are given much more attention than others. Although positive in many ways, the predominantly multidisciplinary character of implementation research still poses some serious problems with respect to theory development. I also discuss whether a resurgence of interest in policy implementation among policy scholars may already be occurring. Finally, I suggest that the time is long overdue for efforts to synthesize research results in a more rigorous scientific manner than has hitherto been done.
KEY WORDS: public policy implementation, bibliometric survey, origin, size, development, disciplinary structure, relevance, research agenda

Introduction

Science must begin with myths and with the criticism of myths.
—Karl Popper

As a new field of investigation continues to grow and expand over several decades, it becomes important to take stock of what has happened and what has been achieved. If not, the knowledge and insights gained will become increasingly
fragmented and inaccessible (Cooper & Hedges, 1994; Hunt, 1997). Quite a few
state-of-the-art reviews on public policy implementation research have been pub-
lished over the years (Alexander, 1985; Barrett, 2004; Goggin et al., 1990; Hill, 1997;
Hill & Hupe, 2002; Ingram, 1990; deLeon, 1999a; deLeon & deLeon, 2002; Lester &
Goggin, 1998; Lester et al., 1987; Linder & Peters, 1987; McLaughlin, 1987; Matland,
1995; May, 2003; O’Toole, 1986; 2000; 2004; Palumbo & Calista, 1990; Ryan, 1996;
Sabatier, 1986; Schofield, 2001, Schofield & Sausman, 2004; Sinclair, 2001; Winter,
1990; Winter, 2003; Yin, 1982). They have all, in their time, pointed to important
contributions and shortcomings in implementation research and offered valuable
advice with respect to improvements.
Nevertheless, surprisingly little is known about several key features of this type of research, which will be explicated further below. Another related problem is that most reviewers fail to explain how they arrive at their many factual statements and interpretations. This is not uncommon among review articles in the social sciences, according to Jackson (1980, 444), but it precludes judgments regarding the validity of
their assertions and conclusions. For the few exceptions, see O’Toole (1986), Hill and
Hupe (2002), and Sinclair (2001).
As a result, a certain story about implementation research—its origin, discipli-
nary foundation and development pattern, and so forth—has been retold so often
that it has now become institutionalized in the form of conventional wisdom. The
story has acquired this stature because it has never been challenged. Some key ele-
ments of this conventional wisdom were created by Pressman and Wildavsky (1973),
the presumed founding fathers of this type of research, at a time when it was much
more difficult to keep track of research publications and their contents. These found-
ing fathers made their claims in good faith. However, given the superior research
tools and data resources at hand today, there is now little excuse for uncritically perpetuating that dubious account. It is time to set the record straight!
What parts of the institutionalized story about implementation research are
correct and which are not? The story is also incomplete in certain respects. That is, there are some important aspects of this research about which nobody has claimed knowledge. The primary aim of this article is to provide a more complete, factual account of implementation research at the outset of the new millennium. This
entails recounting its origin, evolution, recent development, size, disciplinary foundation,
and internal structure. The next two sections further elaborate the research questions
and related research methodologies.

Research Questions

The most frequently debated issue of the 1990s was the question of the state of implementation research. Is this a field of investigation where the volume of research has declined to the extent that it needs revitalization, as many contemporary commentators have suggested (Barrett, 2004; deLeon, 1999a; deLeon & deLeon, 2002; Lester & Goggin, 1998; Schofield, 2001; Schofield & Sausman, 2004; Winter, 1999, 2003)? Perhaps the opposite is the case, as others predicted at the outset of the previous decade
(Goggin et al., 1990)?

There is no doubt that the volume of research on public policy implementation has grown substantially since the mid-1970s, but exactly how large is it at the beginning of the new millennium? Can this question be answered at all, or even approximately? If so, what is the answer?
Political science scholars and their colleagues in the subfields of public admin-
istration and public policy share the sacred notion that policy implementation
research is primarily their domain. Does this conviction withstand closer examina-
tion? If not, what are the other academic disciplines that make important contribu-
tions to implementation research?
The degree of concentration or fragmentation regarding the structure of scholarly
communication is essential to efforts at accumulation of knowledge in any field of
investigation (Borgman, 1990). The greater the number of articles published in a few core journals, the more favorable the conditions should be for synthesizing that research literature, and vice versa. How does implementation research fare in this
vital respect?
It is customarily asserted that virtually no research under the explicit label of
implementation (or implementing) was carried out before the publication of Press-
man and Wildavsky’s seminal book titled Implementation in 1973. Is this really true?
In the same vein it might be reasonably assumed that implementation research orig-
inated on the West Coast in the San Francisco Bay Area at the University of
California Berkeley campus, where Wildavsky was an influential teacher and
researcher during the 1960s and 1970s. Do the facts corroborate this?
What policy issues do implementation scholars study and how do they match
the supposedly salient ones in our societies? This question also has a geographical
dimension. There is no doubt that implementation research from the beginning was
predominantly a North American enterprise. The question here is to what extent the focus has shifted to other regions of the world, and their policy challenges, as the research field has matured.
Another key issue is the relevance of implementation research, and why it
became unfashionable. Finally, one wonders if some policy scholars’ recent calls for
the revival of implementation research are having any effect. If so, what direction
should reenergized research efforts take? These, then, are the main questions this article sets out to answer.

Data and Methodology

As most reviewers of implementation research have had little to say about their
data sources and methodologies, I find it necessary to rectify this neglect. This study
has relied on many data sources and research methods. Nonetheless, digitized scientific literature databases, presently available at most universities, together with a related research technique called bibliometrics,1 proved to be the most important by far. Three databases in particular, the Expanded Social Science Citation Index, World Catalogue (WorldCat), and Digital Dissertations (Dissertation Abstracts), were utilized because they are interdisciplinary and together cover all major types of publications.
They contain vast amounts of information, but of variable quality.2 Still, there are
some pieces of relevant information concerning publications listed in the databases that are not consistently registered. This applied especially to: (i) abstracts; (ii) authors' background (professional and national) and whether their publications are empirically oriented or not; (iii) research methodology employed; (iv) policy sector studied; and (v) regional focus. These deficiencies constituted the main rationale for creating our own electronically indexed and retrievable literature data file, based on the Cardbox software program.
Conceptual and practical matters related to searches in the three literature databases are discussed in the section below that deals with the size estimate issue, where they are of particular relevance.

Research Results

The Alleged Demise of Implementation Research

In the early 1990s, scholars like Goggin et al. (1990, 9) were quite upbeat about the future of implementation research, proclaiming that interest “is likely to grow during the 1990s and continue well into the twenty-first century. In fact, the field of public policy in the next decade will very likely be defined by the focus on implementation. The nineties are likely to be the implementation era” (italics in original).
This optimistic view, however, lost ground during the latter part of the 1990s as
it dawned upon an increasing number of scholars that it might be misplaced. Saetren
(1996) presented some preliminary data to the effect that a pronounced and steady
decline in research publications on policy implementation from 1984 up to 1995 had
occurred. At approximately the same time Hill (1997) and Lester and Goggin (1998)
arrived at the same conclusion. Other commentators in recent years have also var-
iously subscribed to the thesis of decline (Barrett, 2004; Hill & Hupe, 2002; O’Toole,
2000, 2004; Schofield, 2001; Schofield & Sausman, 2004; Winter, 2003). The facts of
the matter would then seem to have been settled fairly and squarely in favor of the
decline thesis. Not quite! It turns out that we can all be simultaneously right and
wrong, depending on how we qualify our statements.
If the total picture is examined, that is, all publications jointly (Figure 1), the
data overwhelmingly refute the decline proposition. Although publications peaked
and declined somewhat during the mid-1980s, their numbers had bounced back
well before the end of that decade. Moreover, the trend continued unabated, even
growing beyond previous levels through the 1990s and into the twenty-first century.
This exponential growth amounted to the near doubling of publications during the
period after 1984 compared with the previous period (Table 1). Although the output of some publication types increased more rapidly than others, none of these trends lends support to a demise or decline thesis over time.
This constitutes a paradox. How can it be that so many of us scholars came to believe that the interest in implementation research had declined to an alarmingly
low level? How did the exponential growth of research literature reported here
escape our attention? The next section, which explores the disciplinary structure of
implementation research, provides some clues to an answer.

[Figure 1. Publications on Policy Implementation by Type and Year — annual counts of articles, books, chapters, PhD dissertations, and the total, 1933–2003.]

Table 1. Publications by Type and Two Time Periods

Type of Publication 1933–84 1985–2003 Total


Articles 1,094 2,429 3,523
Books and chapters 323 682 1,005
PhD dissertations 1,091 1,682 2,773
Total 2,498 4,803 7,301
Note: Absolute values are used for the data presented in this table.

Exploring the Disciplinary Foundation of Implementation Research

Many authors have long believed that studies of policy implementation are
mostly carried out by those who have an interest either in political science in
general, or in public administration and public policy in particular. This notion may
seem to follow logically from the fact that a more fundamental understanding of
implementation as an integral part of the public policy process must draw upon and
bridge insights from these fields of knowledge (Winter, 2003). This is also our justi-
fication for defining journals in these three subfields as the core. The heretical ques-
tion raised here is: Does such a core really exist?
The implication of this credo is that we should expect a clear majority of arti-
cles on policy implementation to be published in core journals. Table 2 clearly shows
that such is not the case at all. Articles published in noncore journals far outnum-
ber those appearing in core journals. An alternative and more modest expectation
can be derived from one of the so-called laws of bibliometrics, namely Bradford’s
law. It states that journals in any field of investigation can be divided into three main
categories, each containing roughly the same number of articles: (i) a nucleus of rel-
atively few journals on the subject publishing approximately one-third of all arti-
cles, defined as the core; (ii) another group consisting of many more journals that
produces approximately the same number of articles, referred to as near-core; and
(iii) an even more distant category of a still larger number of journals—the truly
noncore—which accounts for the last third of articles (Potter, 1988). To test this thesis, I divided all journals registered in a database into these same categories.3

Table 2. Articles by Type of Journal and Time Period

Type of journal 1948–1984 1985–2003 Total

Core 15% (159) 12% (295) 13% (454)
Near core 9% (97) 16% (379) 14% (476)
Non-core 76% (838) 72% (1,755) 73% (2,593)
Total 100% (1,094) 100% (2,429) 100% (3,523)
Non-core specified:
Health 12% (126) 16% (384) 15% (510)
Education 17% (178) 14% (330) 15% (508)
Law 12% (131) 5% (131) 7% (262)
Environment 4% (41) 6% (144) 5% (185)
Economics 3% (31) 3% (74) 3% (105)
Other non-core 29% (331) 28% (692) 29% (1,023)
Note: Absolute values are in parentheses.

The data (Table 2) even repudiate this moderated expectation of Bradford's law. Only in 1980 (Table 3) did the pattern come close to the law's prediction, as 30 percent of all articles published that year appeared in core journals. That year is, however, clearly an exception to the general rule. Only 13 percent of all implementation articles between 1948 and 2003 were published in core journals. Another 14 percent appeared in the near-core category. Doctoral dissertations reveal a similar pattern (not shown in Table 2), as approximately 15 percent of them are classified as
that about three-fourths or more of these two types of publication were located
outside the core.
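
To make Bradford's partition concrete, the sketch below (Python; an illustration only, with hypothetical names, since the study classified journals substantively by subject matter, as described in note 3) ranks journals by productivity and lets each zone absorb roughly one-third of all articles.

```python
from collections import Counter

def bradford_zones(journal_per_article):
    """Partition journals into the three zones implied by Bradford's law.

    `journal_per_article` lists one journal name per article. Journals are
    ranked by how many articles they carry; the most productive journals
    that jointly carry the first third of all articles form the core, the
    next third defines the near-core, and the rest is the noncore.
    """
    counts = Counter(journal_per_article)
    total = sum(counts.values())
    zones = {"core": [], "near-core": [], "noncore": []}
    cumulative = 0
    for journal, n in counts.most_common():  # most productive journals first
        if cumulative < total / 3:
            zones["core"].append(journal)
        elif cumulative < 2 * total / 3:
            zones["near-core"].append(journal)
        else:
            zones["noncore"].append(journal)
        cumulative += n
    return zones
```

Under Bradford's law, each zone should end up holding roughly one-third of the articles; the 13/14/73 percent split in Table 2 shows how far the implementation literature departs from that expectation.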
What academic disciplines, then, constitute the noncore group? It turns out that education and health journals each account for 15 percent of all implementation articles (Table 2). Journals devoted to law, the environment, and economics are also important, as they jointly account for roughly another 15 percent. The education profession figures
much more prominently among doctoral dissertations, as close to two-thirds have
this affiliation. These data attest to the pronounced multidisciplinary character of
research on public policy implementation and a surprisingly modest contribution
of scholars from the core fields. They also help explain why so many of us have
been oblivious to the continued rapid growth of implementation research during
the 1990s: it happened disproportionately in other academic fields and in a bewildering array of different and largely peripheral journals (see Table 3) to which we
could not pay much attention, even if so desired. There are, in other words, good
excuses for our previous ignorance.
Even looking at the core journals, the decline notion cannot be sustained. The
volume of articles in these journals did peak during the mid-1980s and subsequently
stabilized at a somewhat lower level (Figure 2). It would, however, be rather far-
fetched to claim, as some do, that the whole research field has virtually disappeared.
In fact, the number of articles in core journals in the period from 1985 onward almost doubled compared with the previous period. Why, then, is the notion of the demise of implementation research so prevalent in spite of facts that clearly suggest the contrary?

Table 3. Several Measures of Fragmentation of Scholarly Communication in the Journal Literature on Public Policy Implementation for Selected Years

1970 1975 1980 1985 1990 1995 2000

Number of articles in core journals 1 6 30 20 12 17 16
Number of different core journals in which they were published 1 5 14 10 10 13 14
Journals to articles ratio in core category 1:1 83% 47% 50% 83% 77% 88%
Number of articles in noncore journals 22 31 70 67 86 117 137
Number of different noncore journals in which they were published 22 27 64 60 78 110 120
Journals to articles ratio in noncore category 1:1 87% 91% 90% 91% 94% 88%
Percentage of all articles in core journals 4% 16% 30% 23% 12% 13% 11%

Table 4. Articles on Public Policy Implementation in Core Journals by Type of Journal and Time Period

1953–79 1980–84 1985–89 1990–94 1995–99 2000–03 Total


Political science 22% (12) 17% (18) 17% (18) 20% (12) 21% (16) 23% (13) 20% (89)
Public policy 46% (25) 65% (68) 48% (50) 44% (26) 38% (29) 32% (18) 47% (216)
Public administration 32% (17) 18% (19) 35% (37) 36% (21) 41% (31) 45% (25) 33% (150)
(N) = 100% (54) (105) (105) (59) (76) (56) (454)
Note: Absolute values are in parentheses.

[Figure 2. Articles on Policy Implementation Published in Core and Noncore Journals by Year of Publication — annual counts for core and noncore journals, 1933–1999.]

Table 5. Reviews of Books with Policy Implementation as the Key Word in Social Science Citation
Index Journals by Time Period

Time Period 1972–79 1980–85 1986–90 1991–95 1996–99 2000–03 Total


Book reviews 7% (6) 55% (48) 23% (20) 8% (7) 2% (2) 6% (5) 101% (88)
Note: Absolute values are in parentheses.

The paradox must also be explained by the surprising fact that implementation research, despite its resilience in quantitative terms, went out of fashion among policy scholars after a relatively short peak period that lasted only through the first half of the 1980s. It is symptomatic that the last symposium on policy implementation research was published in Policy Studies Review in the late 1980s.4 Even stronger and more convincing support for this interpretation is provided by data on the number of book reviews on policy implementation in journals included in the Social Science Citation Index. Fifty-five percent of all book reviews (Table 5) appeared during the relatively short period from 1980 to 1985, and only 16 percent date from after 1990. We have already seen how the declining interest in implementation studies from around the mid-1980s coincided with a temporary stagnation and drop in publication output. More interestingly, the fact that publication volume returned to previous levels toward the end of the 1980s and has continued unabated ever since, at an even higher level, did not mean that implementation research had once again become fashionable among policy scholars.
Thus, Goggin et al. (1990) were simultaneously both right and wrong. Imple-
mentation research continued to grow as they predicted during the 1990s and into
the twenty-first century. The field of public policy during the same period, however, has not been defined by its focus on implementation. On the contrary, even leading policy scholars came to regard the implementation focus as an intellectual “dead end” and turned their attention toward other topics such as policy change, policy networks, or governance. Indeed, implementation studies became so unfashionable that
even those who still considered them important mistakenly thought they had essen-
tially disappeared. This erroneous notion was exacerbated by a myopic disciplinary
conception of the research field.
Fortunately, as I have pointed out, the interest in policy implementation studies
has been profound and widespread enough to defy narrow disciplinary boundaries.
Thus, its fate has been insulated from the impact of whimsical fashion trends among
scholars in any particular academic field, such as political science.

The Structure of Scholarly Communication

The previous analysis suggests that the structure of scholarly communication in implementation research is quite fragmented. To corroborate this, we need to take a closer look at the ratio of articles to the number of different journals in which they were published (the fragmentation ratio). Our findings (Table 3) lend strong support to
the fragmentation hypothesis. At the alleged infancy of implementation studies in
1970, its 23 articles were published in 23 different journals (i.e., a perfectly fragmented publication pattern), and only one of them appeared in a core journal. The closest we came to the opposite pattern, the one implied by Bradford's law, was in 1980, when a single (core) journal (Policy Studies Journal) accounted for 18 percent of all articles published that year. Nevertheless, even in that year, the one hundred articles published appeared in 78 different journals, 14 of which were of
the core type. This fragmented publication pattern obviously poses great challenges
for any efforts at systematic knowledge accumulation.
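
For concreteness, the journals-to-articles ratio reported in Table 3 can be written as

$$\text{fragmentation ratio} = \frac{J}{A} \times 100\%,$$

where \(J\) is the number of different journals and \(A\) is the number of articles they carry. For the core category in 1980, \(J = 14\) and \(A = 30\), giving \(14/30 \approx 47\) percent; a value of 100 percent (the 1:1 entries for 1970) means that every article appeared in a different journal, the maximally fragmented pattern.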

A Closer Look at the Core

The core consists of close to 60 different journals that collectively published 454
implementation articles over a 60-year time period (see Appendix 1). Approximately
half of these articles were published in public policy journals, one-third in public
administration journals, and one-fifth in political science journals (Table 4). It is note-
worthy that the position of public administration journals relative to policy journals
has been reversed during the 1990s compared to earlier. This pattern seems to reflect
a growing interest in implementation research among editors and contributors to
public administration journals during the 1990s. The emergence of the Journal of Public Administration Research and Theory in the early 1990s is but one of several indi-
cators of this trend. The relatively modest but stable fraction of articles published
in political science journals is also worth noticing.
Two policy journals—Policy Studies Journal and Policy Studies Review—published
the largest share of core articles, followed by the Public Administration Review. These
three journals alone published 30 percent of the total. At the other end of the spec-
trum we have the prestigious American Political Science Review with only three arti-
cles, none of which are cited frequently.
The documented explosive growth of implementation research, even in recent
years, invites the intriguing question of what its present size might be. The next
section attempts to provide an approximate answer.

What Is the Size of the Implementation Literature?

Few, if any, serious efforts have ever been made to estimate the total volume of
research on public policy implementation, and for good reason: it is a very demand-
ing and difficult task. Some have even concluded that it is a futile endeavor (Hill &
Hupe, 2002). To this I would counter that even having only an approximate esti-
mate is better than having no clue at all. Furthermore, as others have emphasized,
knowing something about the scale of the total universe of publications in any
research area is a prerequisite as well as the starting point for any serious synthe-
sizing efforts (Cooper & Hedges, 1994).
There appear to be only three earlier attempts to assess the volume of research on public policy implementation (Hill & Hupe, 2002; O'Toole, 1986; Sinclair, 2001). They were, however, as far as I can ascertain, limited to a particular type of publication and time period. I know of no systematic attempt to document the volume of several types of publication over an extended time period.

The universe of any research literature is located in a three-dimensional space. First, what concepts are most appropriate to identify the relevant literature? Second,
what types of publications are to be searched with chosen key words? Third, what is
the time frame for such searches? In this study, implementation as title-word was the
preferred choice since Pressman and Wildavsky were so instrumental in populariz-
ing the use of this term with regard to studies of policy execution.5 Although this
choice can be challenged,6 it is justified as follows. First, the term implementation
helps to establish a well-defined body of literature that can be retrieved by others
and subjected to replication and critical examination. Second, I think it is reason-
able to assume, as others have done before me (Sinclair, 2001, 81), that those who
want to make a contribution toward the understanding of policy implementation
are more prone to using either implementation or implementing as title words than
those who do not share this goal. The latter group is obviously more peripheral to
our mapping enterprise than the former.
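
As an illustration of the first, mechanical step of such a search strategy, the sketch below (Python; the record format and function name are hypothetical, since the actual searches were run in the commercial databases and the Cardbox file) keeps only records whose titles use the chosen title words. The manual weeding that must follow cannot be automated this way; note 7 reports that roughly two-thirds of the raw article hits were irrelevant to public policy.

```python
import re

# Matches "implementation" or "implementing" as whole words in a title.
TITLE_WORDS = re.compile(r"\bimplement(?:ation|ing)\b", re.IGNORECASE)

def first_pass_screen(records):
    """Keep bibliographic records whose title contains the chosen title words.

    `records` is a list of dicts with at least a "title" key (a hypothetical
    format). Only the mechanical title-word filter is reproduced here; judging
    relevance to public policy remains a manual step.
    """
    return [record for record in records if TITLE_WORDS.search(record["title"])]

sample = [
    {"title": "Implementing the Clean Air Act in Two States"},  # relevant hit
    {"title": "A Hardware Implementation of the FFT"},          # false positive, weeded manually
    {"title": "Street-Level Bureaucracy and Policy Delivery"},  # false negative, missed by design
]
for record in first_pass_screen(sample):
    print(record["title"])  # prints the first two titles only
```

The two classical retrieval errors noted later in the article (note 9) are both visible here: the screen admits a false positive that manual weeding must remove, and it misses relevant work that avoids the chosen title words.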
Publications range from journal articles, books, book chapters, and doctoral dissertations to reports and research papers, to name the most common types in the reference literature. We chose to focus primarily on the first four mentioned above. The time frame was
extended as far as the databases allowed. I found approximately 7,300 English pub-
lications using implementation or implementing as a title word, after eliminating
the most obviously irrelevant ones for our analysis.7 Table 1 shows their distribu-
tion by types of publication: 548 books, 432 book chapters, 3,523 journal articles, and
2,773 doctoral dissertations.8 The most surprising finding in this respect was the large number of doctoral dissertations, both in absolute terms and relative to the two other types of publications. They are obviously an unexplored and untapped, rich source for future efforts to synthesize research results.
To conclude, it is impossible to pinpoint with much precision the true size of
research publications on policy implementation. Any size estimate critically
depends on a number of steps and related judgment calls taken by the researcher
during the search procedures.9 In that sense it remains an elusive utopian ideal.
Nonetheless, I think our database is sufficiently large to ensure a reasonably approx-
imate representation of the policy implementation literature as a whole. Now it is
time to turn to the question of the origin of this elusive universe.

When Did Implementation Research Start?

The truism that Pressman and Wildavsky (1973) pioneered the more explicit use
of implementation as a key term was in no small part created by the authors them-
selves in a postscript chapter, Appendix 2, titled “Use of ‘Implementation’ in Social
Science Literature”:
There is (or there must be) a large literature about implementation in the social sciences—or so we have been told by numerous people. None of them can come up with specific citations to this literature but they are certain it must exist. . . . It must be there, it should be there, but in fact it is not. . . . How shall we persuade others that it is fruitless to look for a literature that does not exist? (p. 166)

It has since been pointed out that many earlier scholars had studied the imple-
mentation of policies using other labels. Few, however, have challenged the notion
that Pressman and Wildavsky were the first scholars to use implementation as an
explicit analytic research term. They made their claim in good faith based on limited
library resources available at that time. My data clearly demonstrate that the claim
by Pressman and Wildavsky is grossly exaggerated. A substantial number of books, journal articles, and doctoral dissertations using implementation or implementing as a title-word can be traced back to at least the mid-1950s, and even earlier (see Figure 1).
Doctoral students are the pioneers of implementation research! The first two
dissertations I have registered date back to the early 1930s (Charlesworth, 1933) and early 1940s (Laatsch, 1942). Both authors were political scientists, and their dissertations focused on international politics. Neither feature, though, was typical of the first wave of implementation studies. The next dissertations, by Manwell (1947) and Little (1948), are more typical, with their backgrounds in the education profession and empirical focus on educational institutions.
on policy implementation had reached well over a hundred.
The first two books (Loughery, 1952 and Turton, 1955) dealt with implementa-
tion aspects of U.S. educational law and the American Constitution, respectively.
Five years later another book appeared, and then another 19 before 1970. The first article, published in a law journal in the late 1940s (Briggs, 1948), dealt with the implementation of a policy issue that is equally relevant today: international human rights legislation. Before 1970, another 134 articles had been published.
It is also important to note that the “take-off” in terms of publication volume
did not commence with the publication of Pressman and Wildavsky’s book, as is
commonly assumed. Rather, it started at least 10 years earlier. Likewise, my data
challenge the commonly held notion that the focus on policy implementation was
triggered by the early evaluation studies in the mid-1960s of American social reform
programs. Those events probably added momentum to the new research genre, but
they were not the starting point. To sum up: by the time Implementation was published in 1973, close to thirty books and over two hundred each of journal articles and doctoral dissertations employing implementation or implementing as a title word had been published or defended!
A word of caution may be in order here. I have not yet been able to investigate
closely the actual use of the implementation concept in these early publications.
Nonetheless, I assume it is not used casually or accidentally in most of them, par-
ticularly not with respect to doctoral dissertations.

Where Did Implementation Research Originate?

Having already established that doctoral students pioneered the study of policy
implementation, I find it logical to use the location of the degree-awarding higher
educational institutions as a prime indicator of where this type of research origi-
nated. Close to half of the doctoral dissertations in this field before 1970 can be traced to American universities in the Northeast, and just under 40 percent to those in the Midwest region. Finally, universities on the West Coast and in the South each accounted for less than 10 percent. I found only three doctoral
dissertations from Stanford University and two from the University of California at Berkeley during this period. Wildavsky's undisputed influence on the later development of this research field is thus not yet noticeable at this early stage, as might have been expected.
On the East Coast, Harvard University figures prominently with 19 percent of
all dissertations produced in this period, and if we include the remaining Ivy League
universities their share constituted 27 percent. In the Midwest region there is a con-
centration of doctoral dissertations at universities located in Chicago, Illinois; in Michigan; and in Wisconsin.
Only 15 percent of the doctoral dissertations before 1970 are classified as belong-
ing to political science or public administration. It can therefore be concluded that
the implementation focus on public policies did not originate in political science
departments. The predominant focus on one policy sector in particular—education,
which accounts for 65 percent of the total—strongly suggests that the “midwives”
of early implementation research were located in graduate schools or departments of
education. So does the fact that the relatively few doctoral students with a political science background in this period, with few exceptions, studied policy sectors other than education.

What Policy Issues Are Typically Studied?

Hill and Hupe (2002, 201–204) investigated what policy issues implementation
scholars study by scrutinizing articles published in the most prominent core jour-
nals during the 1990s. Their results match mine closely. I found that implementa-
tion scholars consistently pay greater attention to the following five policy issues:
educational, health, environmental, social, and economic issues (Table 6).
There is, however, considerable variation among types of publication with
respect to their relative focus on each of these five policy sectors. Doctoral disser-
tations stand out, as close to two-thirds of these deal with educational issues. There
is not much variation in this fraction over time. The bias toward one particular
policy sector is much less in books, chapters, and journal articles. Articles in core
journals focus more on environmental and social issues (as Hill and Hupe also
found) while those published in noncore journals are more preoccupied with edu-
cational and health issues. These patterns seem to indicate that political scientists
have a broader research agenda than other professions in terms of sector focus.
Disregarding doctoral dissertations—where there is little variation over time—
attention to health and environmental issues after 1984 has increased compared with
the previous period (Tables 2a and 2h in Appendix 2). The trend at the outset of this
century is that health issues are rivaling or even slightly surpassing those of educa-
tion. The overwhelming majority of sectors and issues studied are domestic. Studies
on foreign policy are conspicuously rare (1–2 percent of all publications), with the
exception of the very early works. Another trend toward the end of the previous
century is that some policy issues that initially had a domestic focus, for example,
environment and civil or human rights, have shifted toward a more international
and global focus.

Table 6. Publications by Publication Type and Policy Sector Focus

Policy Sector Articles Books and Chapters PhD Dissertations Total


Education 23% 22% 63% 38%
Health 24% 9% 7% 15%
Environment 12% 14% 4% 9%
Social 8% 9% 7% 8%
Economic 8% 15% 4% 8%
Civil/Human rights 4% 6% 0% 3%
Criminal justice 2% 2% 2% 2%
Administrative reform 4% 4% 1% 3%
Agriculture/Forestry/Fishery 3% 4% 2% 3%
Energy 2% 1% 0% 1%
Manpower 2% 3% 1% 2%
Foreign/Military/Defense 1% 2% 1% 1%
(N) = percentage base (3,523) (1,005) (2,773) (7,301)

Table 7. Publications by Type and Region of Focus or Origin

Region Articles Books and Chapters PhD Dissertations Total


United States/Canada 61% 50% 87% 69%
Latin America 2% 2% 3% 2%
Europe 21% 32% 2% 15%
Africa 4% 4% 3% 4%
Asia/Middle East 9% 10% 5% 7%
Oceania 3% 2% 1% 2%
Third World 1% 3% 0% 1%
International/Global 4% 8% 0% 3%
Total: 105% 111% 101% 103%
(N) = percentage base (3,523) (1,005) (2,773) (7,301)

To sum up, there is a clear bias in research topics among implementation schol-
ars toward a few policy sectors. Still, I find it hard to conclude that these issues and
the challenges they present are not among the salient ones in contemporary soci-
eties. To what extent this reflects a regional bias in implementation research will be
clarified next.

The Ethnocentric Bias in Implementation Studies

The overall regional bias of implementation research is quite strong (Table 7).
Close to three out of every four publications have either an American/Canadian
author or a North American empirical focus. If Europe is included, then the Western
hemisphere accounts for close to 90 percent of all publications. The pattern is to a
large extent influenced by the overwhelmingly ethnocentric focus in doctoral
dissertations (87 percent) of predominantly North American origin. Even among the
other types of publications, more than half originate or focus on the North American
continent. The ethnocentric bias is more pronounced in articles than in books.
The good news here is that the regional focus of implementation studies has
broadened since the mid-1980s, although mostly in favor of Europe. Among the
other regions, Asia is studied about twice as often as Africa, and Latin America and
Oceania appear as the most neglected continents.
Some early implementation scholars (Smith, 1973) thought implementation problems were primarily a third-world phenomenon, while others, like Pressman and Wildavsky (1973), demonstrated their endemic character in more developed
countries as well. With few exceptions (e.g., Brinkerhoff, 1996; Cheema & Rondinelli,
1983; Grindle, 1980; Grindle & Thomas, 1990; Lazin, 1999), the community of
implementation researchers seems to have followed the lead provided by Pressman
and Wildavsky.

Why Did Implementation Studies Become Unfashionable?

There are at least six factors that may have contributed to the declining interest
in implementation research among policy scholars. First is the protracted and sterile
debate about the two competing analytical paradigms labeled top-down versus
bottom-up, which dominated the research field through most of the 1980s. It was a confusing debate, in which normative, methodological, and theoretical aspects were almost seamlessly and indistinguishably intertwined. I suspect that this entrenched
and prolonged debate frustrated many scholars to the extent that they exited the
whole research enterprise. Second, state–society relations in many industrialized
nations have changed during the last few decades from unilateral and hierarchical
to more reciprocal and less hierarchical ones. This has been linked to a change in governments' political philosophy from the early 1980s, most noticeably embodied in the radical new policy programs of Reagan in the United States and Thatcher in the United Kingdom. The advocacy of a broad-scale retreat from government inter-
vention in the economy and more reliance on market-based policy instruments was
a cornerstone in these programs. These events, of course, did not escape the notice
of policy scholars who increasingly started to substitute the terms government and
implementation with those of governance and policy networks in order to better
describe new realities.
Third, many early implementation studies had a pronounced failure bias. Influential authors, like Pressman and Wildavsky (1973), even used probability theory to argue that the chances of any governmental program succeeding were indeed slim. Their conclusion was later disputed and shown to be overly pessimistic because of some dubious assumptions in the calculations (Bowen, 1982).
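
A worked illustration of that multiplicative logic may be helpful. Pressman and Wildavsky counted roughly 70 clearance points in their Oakland case; assuming, for illustration, that each point is passed independently with the same probability \(p\), the joint probability of success is

$$P(\text{success}) = \prod_{i=1}^{n} p_i = p^{n}, \qquad \text{so for } n = 70: \quad 0.99^{70} \approx 0.49, \quad 0.95^{70} \approx 0.03.$$

Even near-certain agreement at every single point thus yields less than an even chance of overall success; it was precisely assumptions of this kind, such as independent, single-shot clearances with fixed probabilities, that Bowen (1982) disputed.

Nevertheless,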
the failure notion associated with implementation research stuck. It earned the nick-
name “misery research” (Rothstein, 1988). This grim picture of the fate of public policy making fueled conservative criticism of social reform programs in the United States in the 1980s. Such programs were increasingly perceived as futile
and a waste of money, thereby undermining their political backing in general
(Murray, 1984). Thus, implementation studies, by being hijacked for particular ide-
ological purposes, unwittingly also contributed to delegitimizing themselves as a
research genre.
Fourth, policy scholars themselves started to express doubts about the extent to
which the policy process could be neatly segmented into discrete stages that pro-
gressed sequentially from agenda setting, through adoption, implementation, and
subsequent policy phases (Jenkins-Smith & Sabatier, 1993; John, 1998; Nakamura,
1987; Sabatier, 1988, 1999). The stages metaphor was accused of oversimplifying and
thus also misrepresenting a more complex and recursive policy process. Although somewhat overstated at times, the criticism has at least reminded us of what Press-
man and Wildavsky emphasized in their seminal book: that implementation cannot
fruitfully be studied and understood as a discrete policy activity. Regardless of
intention it may also have generally contributed further to the delegitimization of
implementation studies.
Fifth, it is easier to write something that gets published at the early stage of a
new research genre than many years later when saying something new and origi-
nal becomes much harder. For the ambitious, prolific author it may then be more
convenient to move on to a new, emerging research topic. Last, but not least, policy scholars are probably susceptible to fashion trends in science and society with respect to their research agendas. The implication of the last two factors is that few, if any, research topics—including policy implementation—will be able to hold policy scholars' attention for any extended period of time. These factors are,
of course, not mutually exclusive. Working together they may provide a plausible
explanation for the declining interest in the policy field with respect to doing imple-
mentation research.

Are Implementation Studies Still Relevant?

The current situation in this field of investigation suggests that the relevance
question is not rhetorical to the larger community of policy researchers. Hence, the
relevance arguments need to be set forth. First, and as others (e.g., Hill & Hupe,
2002, 111) have recently pointed out, there is nothing in the changing state–society
interface toward more cooperative and negotiated networks that precludes an
implementation perspective. On the contrary, given these new systemic features,
representative governments still exist whose translation of policies into practice,
even in some broader sense, is a challenge and a legitimate concern. The multitude
of reports on implementation issues produced by governments and various international agencies each year attests to their endemic nature. Developments toward
supranational governing structures such as the European Union or international
agreements and conventions on many salient policy issues also typically pose for-
midable implementation challenges. Second, the criticism concerning bias toward
failure cases is probably less relevant today than it once was (O’Toole, 2000). Third,
although some caution concerning the use of the stages metaphor is justified, I, like some others (e.g., deLeon, 1999a), find its total dismissal by Sabatier and others
unwarranted and hence misguided.10 Fourth, we are not even close to a well-
developed theory of policy implementation, although as several commentators have
pointed out, some progress has been made (e.g., O’Toole, 2000; Winter, 2003). In this
respect it is unfortunate that policy scholars have lost interest in researching a
phenomenon for which they are theoretically better equipped than any other group
of scholars. This fact bestows both analytical primacy and special responsibility
upon policy scholars with respect to spearheading theorizing about policy
implementation.

In brief, there is no shortage of implementation challenges regardless of form and level of governing structures. Moreover, the need to understand them more fully is no less pressing today than during the first half of the 1980s. The very resilience of this research field, despite fashion fads in policy studies, attests to the enduring importance of implementation phenomena in all spheres of human society.

Is the Time for Revival Already Here?

How imminent is a revival of implementation research in policy studies? Is this just wishful thinking, or is it perhaps already taking place? There are some indica-
tions that a resurgence of interest in policy implementation as an analytical and
theoretical concept may already be occurring. It started with Hill's (1997) rhetorical question about whether implementation theory was yesterday's issue, followed by Lester and Goggin's (1998) quirky plea to their colleagues in Policy Currents titled
“Back to the Future: The Rediscovery of Implementation Studies.” These initiatives
triggered various immediate responses from other policy scholars (Kettunen, 2000; deLeon, 1999b; Lester, 2000; Meier, 1999; Potoski, 2001; Schneider, 1999), later followed by more comprehensive reflections on the subject matter (deLeon, 1999b; deLeon & deLeon, 2002; Lester, 2000; O'Toole, 2000; Schofield, 2001; Sinclair, 2001).
The fact that Public Administration devoted its spring 2004 issue to a symposium on implementing public policy, featuring a number of articles by both “new” and “old” scholars such as Schofield (2004), Schofield and Sausman (2004), Barrett (2004), and O'Toole (2004), is another noteworthy event.
Equally important in this respect is the publication of two recent books by Hill
and Hupe (2002) and Peters and Pierre (2003). The first is a textbook that tries to
sum up implementation research and link it to some other policy research topics
that have replaced it during the last 15 to 20 years. The second is a new and sub-
stantially revised edition of the Handbook of Public Administration that contains
several state-of-the-art articles on policy implementation research by seasoned
scholars such as May (2003) and Winter (2003).
It is too early to say whether these recent scholarly exchanges constitute a new
groundswell of interest in implementation research. Assuming that is the case, there
is a plethora of challenges facing a resuscitated research agenda. These challenges
are of a conceptual, theoretical, and methodological nature that others have dealt
with elsewhere (Goggin et al., 1990; Hill & Hupe, 2002; O’Toole, 2000; Winter, 2003).
The results I have reported here strongly suggest that what is needed is not
more, but better research. Indeed, the enormous and diverse research literature
documented here constitutes a problem. Over two decades ago, some reviewers already stated that “we face an abundance of information on implementation” and concluded that “we need to know how to extract knowledge from the information we already have” (Kirst & Jung, 1982, 141). Considering the phenomenal growth
in the research literature since then, the need for knowledge accumulation should
be that much more pressing. Procedures and techniques for doing this have been
developed over the last 25 to 30 years (Cooper & Hedges, 1994; Hunt, 1997; Hunter,
Schmidt, & Jackson, 1982) and have been applied in other related fields of the social
sciences (Hendricks & Singhal, 1997; Robertson & Seneviratne, 1995; Robertson et
al., 1993). The time is overdue for initiating efforts to synthesize the policy implementation literature. What I have done here is draw up as complete a map of the research literature as possible, one that can guide future efforts toward its integration.

Harald Saetren is professor of administration and organization theory at the University of Bergen, Norway.

Notes

1. According to Hertzel (2003, 295–296), bibliometrics is the analysis of the structure of a body of liter-
ature using any type of quantitative method.
2. I would like to point out that problems of validity and reliability were endemic features of some of
these databases. For example, searches targeting books and doctoral dissertations specifically would
nevertheless retrieve many other types of publications as well. Publications registered more than once were another not infrequent problem. Both flaws would—to the uncritical investigator's dismay—produce inflated estimates of the actual number of relevant publications. WorldCat was by far the worst in this respect. Dissertation Abstracts had mostly the former problem and less of the latter, while the Social Science Citation Index had neither, as far as I could tell.
3. Journals devoted to theory development in political science, public administration, or public policy/policy
analysis in general constitute the core. A list of these core journals is provided in Appendix 1. It should
be noted that this list does not cover all existing core journals. It covers only those in which I have
registered an implementation-titled article. Journals dedicated to the application of these three study
fields in a particular context are classified as near-core. Typical examples here are Energy Policy, Journal
of Health Politics & Policy, and Administration in Social Work. Journal of Curriculum Inquiry, Elementary
School Journal, and Journal of School Health exemplify the more distant noncore category.
4. The symposium articles appeared in a book 3 years later (Palumbo & Calista, 1990) together with
another seminal book (Goggin et al., 1990). Together they seem to mark the end of an era with respect
to interest in implementation studies by most policy scholars.
5. Actually, public policy implementation is the most correct and accurate composite key term in this
respect, but it is, unfortunately, employed so rarely by both authors and librarians as to render it
practically useless. Even the slightly simpler term policy implementation misses too many relevant
publications because of its inconsistent application. But the single term implementation, even as a title
word, creates the opposite problem as it produces many thousands of references, most of which turn
out to be irrelevant. Nevertheless, and despite a tedious and time-consuming weeding process, the
latter procedure was chosen to ensure comprehensiveness. Implementation as key word was not used
since key words are not generally available in our databases before 1990 when they became syn-
onymous with title-words.
6. There is no doubt, as it has been repeatedly pointed out before, that the execution stage of policy
making was analyzed and studied without using the term implementation, both before and after
Pressman and Wildavsky elevated it to fame among policy scholars in the early 1970s. The problem
here is to come up with the full range of alternative terms that have been used in these cases and
which are needed to identify them. Compliance, enforcement, execution, governance, and regulation
are but a few of the many other terms that would be needed to produce a truly complete coverage
of everything that is relevant.
7. As an illustration, two-thirds of all journal articles using the title-word implementation had no
relevance to public policy in my fairly liberal judgment.
8. Three decisions were critical to the size estimate I arrived at. The first was to base my search strate-
gies on an understanding of implementation studies as a potentially multidisciplinary research field,
which meant broadening the mapping procedure beyond narrow disciplinary boundaries (e.g., Polit-
ical Science Abstracts). The second was to include doctoral dissertations, a hitherto hidden and unrec-
ognized important source of scientific research. The third decision was to extend the searches in all
data sources over the maximum time period feasible, both backwards and forwards.
9. Regardless of choices made during any search procedure, there are two classical problems the
researcher faces (White, 1994). The first is to miss publications that should have been included (false
negatives), a problem I have already elaborated upon above. The second is to include publications
that should not have been there (false positives). The magnitude of the second problem can only be
ascertained through a closer examination of each publication, or a random sample thereof, which is
beyond the scope of this article.
10. It is true that the public policy process does not necessarily follow the sequential logic of the stages
metaphor in all cases, especially not its later stages (like implementation, evaluation, feedback, and
policy learning). Nevertheless, the stages metaphor is important because it reflects institutional rules
and norms about how public policies should be transformed from ideas to practice in modern
political systems. The appropriateness of this analytical construct should not be taken for granted or
dispelled axiomatically. What the stages metaphor does is to describe a hypothetical process
sequence, the validity of which must always remain an empirical question.

References

Alexander, E. R. 1985. “From Idea to Action: Notes for a Contingency Theory of the Policy Implementa-
tion Process.” Administration & Society 16 (4): 403–26.
Barrett, S. M. 2004. “Implementation Studies: Time for a Revival? Personal Reflections on 20 Years of
Implementation Studies.” Public Administration 82 (2): 249–62.
Borgman, C. L. 1990. Scholarly Communication and Bibliometrics. Newbury Park, CA, London, and New Delhi: Sage.
Bowen, E. R. 1982. “The Pressman-Wildavsky Paradox: Four Addenda or Why Models Based on Probability Theory Can Predict Implementation Success and Suggest Useful Tactical Advice for Implementers.” Journal of Public Policy 2 (1): 1–22.
Briggs, H. W. 1948. “Implementation of the Proposed International Covenant on Human Rights.” Amer-
ican Journal of International Law 42 (2): 389–97.
Brinkerhoff, D. W. 1996. “Process Perspectives on Policy Change. Highlighting Implementation.” World
Development 24 (9): 1395–402.
Charlesworth, J. C. 1933. The Implementation of the Pact of Paris. PhD diss., University of Pittsburgh.
Cheema, G. S., and D. A. Rondinelli. 1983. Decentralization and Development: Policy Implementation in Developing Countries. Beverly Hills, CA: Sage.
Cooper, H. M., and L. V. Hedges. 1994. The Handbook of Research Synthesis. New York: Russell Sage
Foundation.
Goggin, M. L., A. O'M. Bowman, J. P. Lester, and L. J. O'Toole Jr. 1990. Implementation Theory and Practice: Toward a Third Generation. Glenview, IL: Scott, Foresman/Little, Brown.
Grindle, M. S. 1980. Politics and Policy Implementation in the Third World. Princeton, NJ: Princeton University Press.
Grindle, M. S., and J. W. Thomas. 1990. Public Choices and Policy Change. Baltimore/London: The Johns
Hopkins University Press.
Hendricks, K. B., and V. R. Singhal. 1997. “Does Implementing an Effective TQM Program Actually
Improve Operating Performance? Empirical Evidence from Firms That Have Won Quality Awards.”
Management Science 43 (9): 1258–74.
Hertzel, D. H. 2003. “Bibliometrics History.” In Encyclopedia of Library and Information Science, 2nd edn,
ed. M. A. Drake. New York: Marcel Dekker, 144–211.
Hill, M. 1997. “Implementation Theory: Yesterday’s Issue?” Policy and Politics 25 (4): 375–85.
Hill, M., and P. Hupe. 2002. Implementing Public Policy: Governance in Theory and in Practice. London,
Thousand Oaks, and New Delhi: Sage.
Hunt, M. 1997. How Science Takes Stock: The Story of Meta-Analysis. New York: Russell Sage Foundation.
Hunter, J., F. L. Schmidt, and G. B. Jackson. 1982. Meta-Analysis: Cumulating Research Findings across
Studies. Beverly Hills, CA: Sage.
Ingram, H. 1990. “Implementation: A Review and Suggested Framework.” (Chapter 18) In Public Admin-
istration: The State of the Discipline, ed. N. B. Lynn, and A. Wildavsky. Chatham, NJ: Chatham House,
462–480.
Jackson, G. B. 1980. “Methods for Integrative Reviews.” Review of Educational Research 50 (3): 438–60.
Jenkins-Smith, H., and P. A. Sabatier. 1993. “The Study of the Public Policy Process.” In Policy Change and
Learning: An Advocacy Coalition Approach, ed. P. A. Sabatier, and H. Jenkins-Smith. Boulder, CO: West-
view Press, 1–9.
John, P. 1998. Analysing Public Policy. London & New York: Pinter.
Kettunen, P. 2000. “Implementation Approach: The Political Scientist’s Perspective.” Policy Currents 10
(1–2): 3–5.
Kirst, M., and R. Jung. 1982. “The Utility of a Longitudinal Approach in Assessing Implementation: A
Thirteen-Year View of Title I, ESEA.” In Studying Implementation: Methodological and Administrative
Issues, ed. W. Williams, and R. E. Elmore. Chatham, NJ: Chatham House, 119–148.
Laatsch, M. H. A. 1942. The Implementation of the Monroe Doctrine. PhD diss., Princeton University.
Lazin, F. A. 1999. The Policy Implementation Process in Developing Nations. Stamford, CT: JAI Press.
deLeon, P. 1999a. “The Missing Link Revisited: Contemporary Implementation Research.” Policy Studies
Review 16 (3/4): 311–38.
———. 1999b. “Cold Comfort Indeed.” Policy Currents 8 (4): 6–8.
deLeon, P., and L. deLeon. 2002. “Whatever Happened to Policy Implementation? An Alternative
Approach.” Journal of Public Administration Research and Theory 12 (4): 467–88.
Lester, J. P. 2000. “Back to the Future in Implementation Research: A Response.” Policy Currents 10 (1–2):
2–5.
Lester, J. P., A. O’M. Bowman, M. L. Goggin, and L. J. O’Toole Jr. 1987. “Public Policy Implementation:
Evolution of the Field and Agenda for Future Research.” Policy Studies Review 7 (1): 200–16.
Lester, J. P., and M. L. Goggin. 1998. “Back to the Future: The Rediscovery of Implementation Studies.”
Policy Currents 8 (3): 1–9.
Linder, S. H., and B. G. Peters. 1987. “A Design Perspective on Policy Implementation: The Fallacies of
Misplaced Prescription.” Policy Studies Review 6 (3): 459–75.
Little, J. R. 1948. A Study of Present Practices and the Opinions of Educational Authorities with Respect to the
Implementation of State Responsibility for Public Education and Its Application to the State of Colorado.
PhD diss., University of Colorado at Boulder.
Loughery, B. F. S. 1952. Parental Rights in American Educational Law. Their Basis and Implementation. Wash-
ington, DC: Catholic University of America Press.
McLaughlin, M. W. 1987. “Learning from Experience: Lessons from Policy Implementation.” Education
Evaluation and Policy Analysis 9 (2): 171–8.
Manwell, E. A. 1947. Health in High School Science. The Implementation of a Philosophy of Teaching in High
School Science with a Study of Accompanying Changes in Pupils. PhD diss., Columbia University.
Matland, R. E. 1995. “Synthesizing the Implementation Literature: The Ambiguity-Conflict Model of
Policy Implementation.” Journal of Public Administration Research and Theory 5 (2): 145–74.
May, P. 2003. “Policy Design and Implementation.” (Chapter 17) In Handbook of Public Administration, ed.
B. G. Peters, and J. Pierre. London, Thousand Oaks, CA, and New Delhi: Sage, 223–233.
Meier, K. 1999. “Are We Sure Lasswell Did It This Way? Lester, Goggin, and Implementation Research.”
Policy Currents 9 (1): 5–8.
Murray, C. 1984. Losing Ground: American Social Policy, 1950–1980. New York: Basic Books.
Nakamura, R. T. 1987. “The Textbook Policy Process and Implementation Research.” Policy Studies Review
7 (1): 142–54.
O’Toole, L. J., Jr. 1986. “Policy Recommendations for Multi-Actor Implementation: An Assessment of the
Field.” Journal of Public Policy 6 (2): 181–210.
———. 2000. “Research on Policy Implementation: Assessment and Prospects.” Journal of Public Admin-
istration Research and Theory 10 (2): 263–88.
———. 2004. “The Theory-Practice Issue in Policy Implementation Research.” Public Administration 82
(2): 309–29.
Palumbo, D. J., and D. J. Calista. 1990. Implementation and the Policy Process: Opening Up the Black Box.
New York: Greenwood Press.
Peters, B. G., and J. Pierre. 2003. Handbook of Public Administration. London, Thousand Oaks, CA, and
New Delhi: Sage.
Potoski, M. 2001. “Implementation, Uncertainty and the Policy Sciences.” Policy Currents 11 (1): 2–5.
Potter, W. G. 1988. “Of Making Many Books There Is No End: Bibliometrics and Libraries.” The Journal
of Academic Librarianship 14 (9): 1–3.
Pressman, J. L., and A. B. Wildavsky. 1973. Implementation. How Great Expectations in Washington Are
Dashed in Oakland. Berkeley, CA and London: University of California Press.
Robertson, P. J., and S. J. Seneviratne. 1995. “Outcomes of Planned Organizational Change in the Public
Sector: A Meta-Analytic Comparison to the Private Sector.” Public Administration Review 55 (6):
547–57.
Rothstein, B. 1998. Just Institutions Matter: The Moral and Political Logic of the Universal Welfare State.
Cambridge, UK: Cambridge University Press.
Ryan, N. 1996. “Some Advantages of an Integrated Approach to Implementation Analysis. A Study of
Australian Industrial Policy.” Public Administration 74 (4): 737–53.
Sabatier, P. A. 1986. “Top-Down and Bottom-Up Approaches to Implementation Research: A Critical
Analysis and Suggested Synthesis.” Journal of Public Policy 6 (1): 21–48.
———. 1988. “An Advocacy Coalition Framework of Policy Change and the Role of Policy-Oriented
Learning Therein.” Policy Sciences 21 (2–3): 129–68.
———. 1999. “The Need for Better Theories.” (Chapter 1) In Theories of the Policy Process, ed. P. A. Sabatier.
Boulder, CO: Westview Press, 3–17.
Saetren, H. 1996. “Whatever Happened to Implementation Research?” Paper presented at the conference
of the Nordic Association of Political Science, Helsinki, Finland.
Schneider, A. L. 1999. “Terminator? Who, Me? Some Thoughts about the Study of Policy Implementa-
tion.” Policy Currents 9 (1): 1–5.
Schofield, J. 2001. “Time for a Revival? Public Policy Implementation: A Review of the Literature and
Agenda for Future Research.” International Journal of Management Reviews 3 (3): 245–63.
———. 2004. “A Model of Learned Implementation.” Public Administration 82 (2): 283–308.
Schofield, J., and C. Sausman. 2004. “Symposium on Implementing Public Policy: Learning from Theory
and Practice: Introduction.” Public Administration 82 (2): 235–48.
Sinclair, T. A. P. 2001. “Implementation Theory and Practice: Uncovering Policy and Administration
Linkages in the 1990s.” International Journal of Public Administration 24 (1): 77–94.
Smith, T. B. 1973. “The Policy Implementation Process.” Policy Sciences 4 (2): 197–209.
Turton, J. 1955. Implementing the American Constitution. A Treatise on Constitutional Procedures with Special
Reference to the Supposed ‘Amendments.’ New York: Exposition Press.
White, H. D. 1994. “Scientific Communication and Literature Retrieval.” (Chapter 4) In Handbook of
Research Synthesis, ed. H. Cooper, and L. V. Hedges. New York: Russell Sage Foundation, 42–55.
Winter, S. 1990. “Integrating Implementation Research.” In Implementation and the Policy Process: Opening
Up the Black Box, ed. D. J. Palumbo, and D. J. Calista. New York: Greenwood Press, 19–38.
———. 1999. “New Directions for Implementation Research.” Policy Currents 8 (4): 1–5.
———. 2003. “Implementation Perspectives: Status and Reconsideration.” (Chapter 16) In Handbook of
Public Administration, ed. B. G. Peters, and J. Pierre. London, Thousand Oaks, CA, and New Delhi:
Sage, 212–222.
Yin, R. 1982. “Studying the Implementation of Public Programs.” In Studying Implementation: Method-
ological and Administrative Issues, ed. W. Williams, and R. E. Elmore. Chatham, NJ: Chatham House,
36–72.

Appendix 1: Journals Classified as Belonging to the Core
(Number of Articles in Each Journal in Parentheses)

Policy Journals:
Policy Studies Journal (54)
Policy Studies Review (43)
Policy & Politics (23)
Public Policy/Journal of Policy Analysis and Management (20)
Policy Sciences (17)
Evaluation Review (15)
Journal of Public Policy (12)
Policy Analysis (7)
Policy Currents (7)
Journal of European Public Policy (4)
Canadian Public Policy (3)
Cato (3)
Local Government Policy Making (2)
Policy Studies Annual Review (1)
Policy Studies (1)
Policy Evaluation (1)
The Review of Policy Research (1)
American Journal of Evaluation (1)
Public Policy and Administration (1)
Public Administration Journals:
Public Administration Review (39)
Administration and Society (22)
Public Administration (21)
International Journal of Public Administration (15)
Australian Journal of Public Administration (12)
Journal of Public Administration Research and Theory (11)
American Review of Public Administration (11)
Indian Journal of Public Administration (8)
International Review of Administrative Science (6)
Philippine Journal of Public Administration (3)
Canadian Public Administration
Public Administration Quarterly (1)
Public Management Review (1)
Sage Public Administration Abstracts
Political Science Journals:
Publius (11)
European Journal of Political Research (9)
Western Political Quarterly (9)
Scandinavian Political Studies (8)
Journal of Politics (7)
Political Science Quarterly (4)
American Political Science Review (3)
American Journal of Political Science (3)
Comparative Political Studies (2)
Political Research Quarterly (2)
Comparative Politics (2)
American Politics Quarterly (2)
Governance (2)
Journal of Political Economy (2)
Polity (2)
Politeia (2)
Annals of the American Academy of Political and Social Science (1)
British Journal of Politics & International Relations (1)
Journal of Theoretical Politics (1)
Journal of Commonwealth & Comparative Politics (1)
Schweizerische Zeitschrift für Politikwissenschaft (1)
Statsvetenskaplig Tidskrift (1)
Acta Politica (1)
Political Studies (1)
Political Quarterly (1)
Proceedings of the Academy of Political Science
World Politics (1)
West European Politics (1)
Tulane Studies in Political Science (1)

Appendix 2: More Tables on Publications by Time Period, Policy Focus, Region, etc.

Table 2a. Books and Chapters by Policy Sector Focus and Period of Publication

Policy Sector                1942–84      1985–2002    Total
Education 20% (66) 22% (151) 22% (217)
Environment 12% (38) 16% (106) 14% (144)
Health 7% (24) 10% (69) 9% (93)
Social 13% (43) 7% (51) 9% (94)
Economic 12% (39) 16% (108) 15% (147)
Total for the five sectors 64% 71% 69%
(N) = percentage base (323) (682) (1,005)
Note: Absolute values are in parentheses.

Table 2b. Books and Chapters by Regional Focus and Period of Publication

Region                       1942–84      1985–2002    Total
United States/Canada 67% (215) 42% (288) 50% (503)
Latin America 2% (7) 2% (10) 2% (17)
Europe 24% (78) 35% (239) 32% (317)
Africa 3% (10) 4% (30) 4% (40)
Asia and Middle East 6% (18) 12% (82) 10% (100)
Oceania 0% (0) 3% (17) 2% (17)
Third World unspecified 3% (8) 3% (22) 3% (30)
International and global 2% (7) 10% (71) 8% (78)
Total 107% 111% 111%
(N) = percentage base (323) (682) (1,005)
Note: Absolute values are in parentheses.

Table 2c. Articles by Policy Sector Focus and Period of Publication

Policy Sector 1942–84 1985–2003 Total


Education 26% (277) 21% (516) 23% (795)
Environment 9% (95) 13% (326) 12% (421)
Health 16% (177) 27% (650) 24% (827)
Social 8% (89) 8% (193) 8% (282)
Economic 10% (109) 7% (168) 8% (277)
Total for the five sectors 69% 76% 75%
(N) = percentage base (1,094) (2,429) (3,523)
Note: Absolute values are in parentheses.

Table 2d. Articles by Regional Focus and Period of Publication

Region 1942–84 1985–2003 Total


United States/Canada 72% (791) 56% (1,357) 61% (2,148)
Latin America 1% (7) 2% (45) 2% (52)
Europe 14% (156) 23% (565) 21% (721)
Africa 3% (36) 4% (109) 4% (145)
Asia and Middle East 6% (70) 10% (232) 9% (303)
Oceania 2% (19) 3% (77) 3% (96)
Third World unspecified 1% (12) 1% (33) 1% (45)
International and global 3% (36) 5% (112) 4% (148)
Total 102% 104% 105%
(N) = percentage base (1,094) (2,429) (3,523)
Note: Absolute values are in parentheses.

Table 2e. PhD Dissertations by Sector Focus and Time Period

Policy Sector 1933–84 1985–2003 Total


Education 62% (676) 64% (1,074) 63% (1,750)
Health 6% (63) 8% (137) 7% (200)
Social 8% (80) 6% (102) 7% (182)
Environment 3% (27) 4% (74) 4% (101)
Economic 5% (50) 3% (53) 4% (103)
Total for the five sectors 84% 85% 85%
(N) = percentage base (1,091) (1,682) (2,773)
Note: Absolute values are in parentheses.

Table 2f. PhD Dissertations by Regional Focus and Time Period

Region 1933–84 1985–2003 Total


United States/Canada 90% (978) 85% (1,426) 87% (2,404)
Latin America 2% (17) 2% (30) 2% (52)
Europe 2% (20) 3% (55) 3% (75)
Africa 3% (35) 3% (58) 3% (93)
Asia and Middle East 4% (41) 6% (102) 5% (143)
Oceania 0% (3) 1% (9) 0% (12)
Total 101% 100% 100%
(N) = percentage base (1,091) (1,682) (2,773)
Note: Absolute values are in parentheses.

Table 2g. Articles by Policy Sector, Type of Journal, and Period of Publication

Policy Sector           1948–84               1985–2003              Total
                        Core      Noncore     Core      Noncore      Core      Noncore
Education               7% (11)   29% (266)   5% (16)   23% (500)    6% (27)   25% (766)
Environment             13% (21)  8% (74)     16% (47)  13% (279)    15% (68)  12% (353)
Health                  9% (14)   17% (163)   5% (16)   28% (634)    7% (30)   26% (797)
Social                  13% (20)  7% (69)     14% (41)  8% (152)     13% (61)  7% (221)
Economic                10% (16)  10% (93)    8% (22)   7% (146)     8% (38)   8% (239)
Total                   52%       71%         48%       79%          49%       78%
(N) = percentage base   (159)     (935)       (295)     (2,134)      (454)     (3,069)
Note: Absolute values are in parentheses.

Table 2h. Articles by Regional Focus, Type of Journal, and Period of Publication

Region                     1948–84               1985–2003               Total
                           Core       Noncore    Core       Noncore      Core       Noncore
United States/Canada       70% (111)  73% (680)  64% (188)  55% (1,169)  66% (299)  60% (1,849)
Latin America              0% (0)     1% (7)     2% (5)     2% (40)      1% (5)     2% (47)
Europe                     26% (42)   12% (114)  21% (61)   24% (504)    23% (103)  20% (618)
Africa                     0% (0)     4% (36)    3% (8)     5% (101)     2% (8)     4% (137)
Asia and Middle East       3% (5)     7% (65)    8% (24)    10% (209)    6% (29)    9% (274)
Oceania                    2% (3)     2% (16)    5% (14)    3% (63)      4% (17)    3% (79)
Third World unspecified    1% (2)     1% (10)    1% (2)     2% (31)      1% (4)     1% (41)
International and global   1% (1)     4% (35)    1% (3)     5% (109)     1% (4)     5% (144)
Total                      103%       104%       105%       106%         104%       104%
(N) = percentage base      (150)      (935)      (295)      (2,134)      (454)      (3,069)
Note: Absolute values are in parentheses.
