
Factors Influencing the Use of Performance Data to Improve Municipal Services: Evidence from the North Carolina Benchmarking Project


Author(s): David N. Ammons and William C. Rivenbark
Source: Public Administration Review, Vol. 68, No. 2 (Mar. - Apr., 2008), pp. 304-318
Published by: Wiley on behalf of the American Society for Public Administration
Stable URL: http://www.jstor.org/stable/25145604
Accessed: 20-09-2015 15:13 UTC

Factors Influencing the Use of Performance Data to Improve Municipal Services: Evidence from the North Carolina Benchmarking Project

David N. Ammons
William C. Rivenbark
University of North Carolina at Chapel Hill
David N. Ammons is Albert Coates Professor of Public Administration and Government at the University of North Carolina at Chapel Hill. He is the author of Municipal Benchmarks: Assessing Local Performance and Establishing Community Standards (Sage, 2001) and Tools for Decision Making: A Practical Guide for Local Government (CQ Press, 2002). His research interests include local government management, performance measurement, and benchmarking. E-mail: ammons@sog.unc.edu

William C. Rivenbark is an associate professor of public administration and government at the University of North Carolina at Chapel Hill. He is the coauthor of Performance Budgeting in State and Local Government (M.E. Sharpe, 2003). His research interests include performance and financial management in local government. E-mail: rivenbark@sog.unc.edu

Many local governments measure and report their performance, but the record of these governments in actually using performance measures to improve services is more modest. The authors of this study examine patterns of performance measurement use among a set of North Carolina cities and conclude that the types of measures on which officials rely, the willingness of officials to embrace comparison, and the degree to which performance measures are incorporated into key management systems distinguish cities that are more likely to use performance measures for service improvement from those less likely to do so.

Surveys of local government officials suggest that the practice of collecting performance measures, at least at a fairly rudimentary level, is well established among U.S. cities and counties (Berman and Wang 2000; GASB and NAPA 1997; Melkers and Willoughby 2005; O'Toole and Stipak 2002; Poister and Streib 1999).1 Robert Behn even declares playfully, "Everyone is measuring performance" (2003, 586). In contrast, the practice of actually using these measures to influence decisions or to improve services is less apparent and far less documented (Hatry 2002).

Clearly, local governments' progress in using performance measures to influence program decisions and improve service delivery has lagged behind their pace in collecting and reporting basic measures. Nevertheless, some local governments are using their performance measures to influence program decisions and improve services. This article identifies several initiatives inspired by performance measurement among 15 North Carolina cities engaged in a decade-long comparative performance measurement project. It examines some of the likely reasons for the greater use of performance measures for service improvement decisions by this set of cities compared to cities in general, and also the reasons for varying levels of use of measures among these 15 cities.

Measures for Reporting and More?
For many years, professional associations and others have urged local government officials to measure performance for the sake of greater accountability and service improvement (see, e.g., ASPA 1992; GASB 1989; ICMA 1991; NAPA 1991). How, they have asked, can governments be truly accountable unless they not only document their financial condition but also report on service levels and, ideally, on service effectiveness and the efficiency of service delivery? And how can officials manage departments and improve services without performance measures? Evidently, many officials saw the logic of the proponents' advice or succumbed to the pressure of the growing bandwagon for performance measurement. Today, many local governments measure performance, although often at only the workload or output level. Typically, they report their measures in their budget or, perhaps, in a special report or on the government's Web site.

The record of local governments in the actual use of performance measures in managerial or policy decisions, beyond simply reporting the numbers, is much spottier. Noting the difference between the adoption of performance measures (i.e., the design and collection of measures) and implementation (i.e., actual use), Patria de Lancer Julnes and Marc Holzer (2001) conclude that only a subset of the state and local governments that collect measures actually use them in decision making.2 These authors and others who have attempted by means of broad surveys to gain information on the actual use of measures find only modest evidence of implementation and, even then, they acknowledge the possibility of overstatement when such information is self-reported and specific documentation substantiating respondents' claims is not required (Poister and Streib 1999, 332). Although a 1997 survey produced claims that performance measures had resulted in changes in program focus, budgets, and decisions of city governments, Theodore H. Poister and Gregory Streib detected a tendency for "favorable ratings of the effectiveness of these systems ... to outstrip reported impacts. [R]elatively few substantial effects were claimed" (1999, 334).

Behn counts eight purposes for performance measurement but contends that one of the eight, fostering improvement, is "the core purpose behind the other seven" (2003, 586). Those other seven (to evaluate, control, budget, motivate, promote, celebrate, and learn) are means to the desired end and core purpose: to improve. Yet hard evidence documenting performance measurement's impact on management decisions and service improvements is rare. Apart from a relatively small set of celebrated cases, for instance, New York City's CompStat, a data-driven system that proved effective in fighting crime (Silverman 2001; Smith and Bratton 2001); Baltimore's CitiStat, which expanded the concept to a wide array of municipal services (Behn 2006); and several other isolated cases reported in various venues (see, e.g., Osborne and Gaebler 1992; Osborne and Hutchinson 2004; Osborne and Plastrik 2000; Wang 2002; and the Web sites of the GASB and ICMA), most claims of performance measurement's value in influencing decisions and improving services tend to be broad and disappointingly vague (Melkers and Willoughby 2005). Even the presumed linkage to budget decisions, although promised in theory, is often difficult to detect in practice (Joyce 1997; Melkers and Willoughby 2005; O'Toole and Stipak 2002; Wang 2002).

The limited use of measures for much beyond public reporting (some detractors would say, even the nonuse by most adopters) has led some public officials and employees to question the net value of collecting measures in the first place and some scholars to note the gap between rhetoric and reality (Berman 2002; Bouckaert and Peters 2002; Coplin, Merget, and Bourdeaux 2002; Dubnick 2005; Grizzle 2002; Kelly 2002; Poister 2003; Streib and Poister 1999; Weitzman, Silver, and Brazill 2006).3 Even several of the presumed leaders in performance management enjoy reputations that they admit are somewhat inflated. In a recent study of 24 cities with outstanding managing-for-results reputations, one-third withdrew from the follow-up probe phase, with several saying that they were not as far along as their reputation would imply (Burke and Costello 2005).

Many explanations are offered for the use or nonuse of performance measures in local government. Some observers point to the support of top management as a crucial ingredient in performance measurement success (de Lancer Julnes and Holzer 2001; Page and Malinowski 2004). Some suggest that interest in performance measures among elected officials, or citizen involvement in the development and even the collection of performance measures, can be especially important or helpful (Ho and Coates 2004). Others contend that performance measurement is likely to have an influence on managerial and policy decisions only when steps are intentionally taken to integrate measures into key management systems or decision processes, for example, departmental objectives, work plans, budget proposals and decisions, and strategic planning (Poister and Streib 1999; Clay and Bass 2002).

Each of these explanations is plausible. Perhaps several others, not listed here, are as well. To pass beyond mere conjecture, however, a possible explanation needs to be tested among multiple governments, ideally in a controlled or semicontrolled setting in which comparisons can be made and claims can be confirmed. For this study, we examine the characteristics and patterns of performance measurement use among 15 cities participating in the North Carolina Benchmarking Project. Through their experience, we explore several factors that appear to distinguish municipalities that present clear evidence of the use of performance measures in the making of important decisions from others that do not.

Implementation, or actual use, of performance measurement has been defined by de Lancer Julnes and Holzer to include "the actual use ... for strategic planning, resource allocation, program management, monitoring, evaluation, and reporting to internal management, elected officials, and citizens or the media" (2001, 695). In this study, we employ a narrower definition of use. For our purposes, actual use excludes simply reporting measures or somewhat vaguely considering measures when monitoring operations.4 For us, credible claims of actual use of performance measures require evidence of an impact on decisions at some level of the organization.

North Carolina Benchmarking Project
Prompted by the desire among local government officials for better cost and performance data with which to compare municipal services, the North Carolina Benchmarking Project was established in 1995 by a set of seven municipalities and the Institute of Government at the University of North Carolina. This project is similar in principle and motive to many other cooperative projects, but it is distinctive in at least three ways. First, the organizers of this project realized, more than organizers of most similar projects have, that their undertaking would be complex, and they resisted the temptation to compare all service functions. Instead, they started small and have only gradually expanded to compare more than the seven functions targeted at the outset. Second, the project has focused meticulously on cost accounting issues and the uniform application of cost accounting rules across participating municipalities. As a result, project participants exhibit a greater than typical degree of trust in the efficiency measures, typically unit costs, developed through this project (Ammons, Coe, and Lombardo 2001). Third, the project has survived for more than a decade, and only a few undertakings of this sort can make that claim. The project's continuation is testimony to its value as perceived by the participating governments. By 2005, the North Carolina project had grown to include 15 cities and towns (herein referred to simply as cities) ranging in population from 24,357 to 599,771 residents.5 The median population in 2005 was 144,333.

In what ways is the project valuable to participating cities? Participants report a variety of benefits, both tangible and intangible. Among the intangibles cited are the importance of being among a group of cities engaged in something regarded as a progressive management initiative, increased awareness of the practices of other governments, and the project's stimulating effect in encouraging officials to consider service delivery options and to make data-driven decisions. Other reported benefits are more tangible and include improved performance measurement and reporting, handy access to better data for those instances when the governing body or chief executive requests comparisons, the ability to use project data in reports and special studies, and improved service quality and efficiency. Perhaps the greatest test for a comparative performance measurement project, as well as for performance measurement in general, is whether the performance data are being used to influence operations. A few such examples appeared early in the project's history. Many more have emerged in recent years.

Research Inquiry and Methodology
Judging from the remarks of observers who report minimal use elsewhere, the record of performance data use by participants in the North Carolina project seems reasonably good and probably surpasses that of many other local governments. If cities participating in the project make greater use of performance measures, why is this so? And why do some of the participants in the project use the data to influence operations more than other participants do?

In an attempt to answer these questions, the authors queried officials in the 15 cities participating in the North Carolina Benchmarking Project in 2005 regarding their experiences and the uses being made of project data. Unlike a random sample of cities, where claims of performance measurement use might be difficult to confirm, the participation of these cities in a coordinated project made confirmation of data use claims relatively easy. Officials in the 15 cities were queried by survey during the spring of 2005 and subsequently by on-site interviews, followed in some cases by telephone calls and e-mail correspondence for clarification and further details. The survey questionnaire inquired about broad applications of performance measures (e.g., communication with elected officials and citizens, uses in support of long-range planning, and use of measures in the budget process), preferences for measures and analytic techniques among staff (e.g., reliance on outcome, efficiency, or output measures and the methods used to analyze the measures), and documented examples of performance data being used to alter performance, to reduce costs, or to improve service quality. The responses and supporting material revealed extensive use in some cities, showed less use in others, and suggested possible factors influencing the difference.

The approach taken in this study has advantages over the two more common methods of performance measurement research: the single-city case study and the multicity survey, usually without required documentation or follow-up to confirm respondent claims. The former typically lacks the breadth supplied by a multicity study. The latter, usually in the form of a fixed-response mail survey, often produces information of questionable reliability and relevance to performance measurement practice and has been criticized as "methodologically inappropriate" (Frank and D'Souza 2004, 704). Without the requirement of documentation or the promise of follow-up, many local officials responding to such surveys are tempted to overstate their organization's adoption and use of management techniques deemed to be progressive, such as performance measurement (Wang 1997). More intensive and thorough review on a case-by-case basis provides greater assurance of an accurate reflection of conditions, as well as an opportunity to verify claims of performance measurement uses beyond reporting. This study's set of mini-case studies, individually less intensive than a full-scale case study but much more intensive than a simple survey, has the advantages of modest breadth as well as relative depth and detail. This approach provided the investigators the opportunity to confirm the assertions of municipal officials.

Using Performance Data for Service Improvement
The first instance of major impact from the use of project data occurred early in the North Carolina project's history, when officials of one of the participating cities examined the efficiency measures for residential refuse collection in other cities and found their own measures to be far out of line. The measures indicated high unit costs and low worker productivity. After first challenging but eventually acknowledging the accuracy of their counterparts' figures, officials in this city realized that the measures revealed the underutilization of labor and equipment. Because a large section of this community was served by a private hauler whose contract would soon expire, the city was able to discontinue the private refuse collection and extend its own operation into that area without adding equipment or labor. The annual savings totaled almost $400,000 (Jones 1997).

Another city used data from the benchmarking project to avoid a price hike from its residential refuse collection contractor. The contractor had initially insisted on a 10 percent increase in its new contract; however, the city used project data to argue convincingly that the contractor's efficiency was low and its record of complaints was high compared to the residential refuse performance of other project participants. The contractor backed off its price hike. Still another participating city, using data from the benchmarking project to analyze service delivery options for refuse collection, switched to automated equipment and one-person crews. That city reduced its cost per ton for refuse collection by 30 percent between 1996 and 2004.

One of the participating cities was persuaded by project data to introduce changes in its recycling program that increased its waste diversion rate from 14 percent to 24 percent over a five-year period, thereby extending the life of its landfill. Another, alarmed by recycling inefficiencies relative to its counterparts, turned to privatization and reduced the cost per ton of recyclables collected by 24 percent, yielding a savings of approximately $75,000 per year (Ammons 2000). By 2004, the savings relative to the base year had grown from 24 percent to 58 percent per ton.

Project data prompted other analyses involving different services in various cities. Fire department analysis in one case revealed underutilization of staff resources and led to the expansion of operations into emergency medical services. Relying on data from the project, a police study in one city revealed a level of staffing that was low relative to its counterparts and insufficient to meet the department's objectives regarding proactive patrols. This prompted the hiring of 33 new officers. Another study led to the establishment of a telephone response unit to deflect some of the burden placed on police officers, as documented by project data. Analyses in emergency communications and fleet maintenance in other cities revealed instances of overstaffing relative to actual service demand and led to staff reductions in these functions. Several cities used project data to help establish performance targets in various operations.

What factors have contributed to the use of performance data to improve operations in these cities, when observers so often bemoan the failure of local governments to use performance measures for anything more than performance reporting? The evidence from the North Carolina project is hardly conclusive, given the small set of cities and our reliance on self-reporting for some of the data. Nevertheless, the extent to which respondents provided facts and figures to substantiate their claims convinces us that several of the participating governments have indeed used performance data to improve service delivery. Coupled with information about performance measurement and performance management practices in these cities, the patterns of data use lead us to suggest three factors that are especially influential: the collection of and reliance on higher-order measures, that is, outcome measures (effectiveness) and especially measures of efficiency, rather than simply output measures (workload); the willingness of officials to embrace comparison with other governments or service producers; and the incorporation of performance measures into key management systems.

Collection of and Reliance on Higher-Order Measures
For more than half a century, local governments have been encouraged to measure and report their performance (Ridley and Simon 1943). Through the years, most of the city and county governments that heeded this advice gravitated toward the collection and tabulation of simple workload measures, now often called output measures, the most rudimentary type of performance measures. These measures recorded only the number of units of service produced, for example, applications processed, meters read, arrests made, or tons of asphalt laid. Workload measures had the advantage of simplicity: They were easy to count and easy to report. If an audience or reader could be impressed by the volume of activity undertaken by a department or program, these measures could serve that purpose. Workload measures answer the easiest question: How many? However, they are ill suited for answering more managerially challenging questions: How efficiently? How effectively? Of what level of quality?

Local government officials who engaged in performance measurement strictly to satisfy their obligation for accountability could do so, at least in a narrow sense, with the least expense and bother by focusing on workload measures. Raw counts of governmental activity would produce big, impressive numbers and would demonstrate, perhaps, that departments and employees were busy. Because they were easy to count and compile, the collecting of workload measures would impose minimal disruption and expense on operating departments. Nevertheless, some operating officials grumbled about devoting any time and resources to the collection of these measures, for they saw little use being made of them. More than a few questioned whether their investment in performance measurement, restricted entirely to workload measures, produced any operating benefits at all.
As noted at the beginning of this article, the value of performance measurement can be divided into two broad categories: First, it supports accountability, specifically, performance reporting; and second, service improvement. Perhaps it is axiomatic that performance measurement systems designed strictly for the former (i.e., performance reporting), especially when a premium is placed on ease of data collection, are unlikely to yield much of the latter. Systems intended solely to assure elected officials, citizens, and the media that the government is busily engaged in a broad array of high-volume activities can be designed to achieve this aim while imposing minimal disruption and expense, if these systems focus only on workload measures. Unfortunately, such a system produces feedback having very little managerial or policy value to operating officials or government executives beyond merely documenting whether demand for a service is up, down, or relatively stable. Knowing that 45 citizens were enrolled in the art class at the civic center, that the library had 32,000 visitors, that the water department repaired 600 meters, or that the police department made 200 arrests probably inspires few managers, supervisors, and employees to consider strategies to improve services. Raw workload counts simply do not inspire much managerial thinking.

In contrast, measures focusing on service quality, effectiveness, or efficiency can cause officials and employees to rethink service delivery strategies. For instance, measures revealing that persons signing up for a class at the civic center rarely re-enroll in another, that the local library's circulation per capita is among the lowest in the region, that the cost per meter repair is almost as much as the price of a new meter, or that the local home burglary rate has reached an historic high are measures of performance that are much more likely to prompt the consideration of alternate strategies to achieve better results. Unlike workload measures, these measures of efficiency and effectiveness inspire managers, supervisors, and frontline employees to diagnose the problem, if one exists, and to devise strategies to correct it. In short, they inspire managerial thinking.

Performance measurement systems that rely overwhelmingly on workload measures tend to have been designed only to satisfy a narrow view of accountability and to do so at minimal cost in terms of resources and disruption. These systems either were not designed for service improvement or, if service improvement was their purpose, were poorly designed to achieve that end. To charge them with failing to inspire improvement, although perhaps true, is an excessively harsh indictment of the officials who put these systems in place, for it misconstrues the original purpose of many of these most rudimentary attempts at performance measurement and reporting.

Over the past few decades, the cities and counties that are considered leaders in local government performance measurement have supplemented their workload measures with measures of efficiency and effectiveness. These governments have invested in systems designed for accountability and service improvement (Halachmi 2002) and therefore are justified in expecting a higher return on their investment in a more advanced system of performance measurement. Good measures of efficiency and effectiveness are more likely than output measures to inspire managerial thinking about service improvement.

Participants in a coordinated performance measurement project, such as that in North Carolina, confront a variety of challenges to achieving data comparability, but they also enjoy some advantages over individual cities or counties tackling performance measurement alone. Project administrators and fellow participants expose these governments to higher-order measures of efficiency and effectiveness and guide them away from simple reliance on workload measures. They demonstrate to one another a variety of uses for the performance data they collect. As a group, the city officials engaged in the North Carolina project had little difficulty responding to this study's inquiry, easily rattling off many examples of performance measurement's influence on local managerial recommendations and decisions. The availability of higher-order measures helped make performance measurement a more relevant management tool in these cities than it appears to be in local governments in general, where many systems continue to rely overwhelmingly on workload measures.
The importance of higher-order measures is amplified by a careful review of key distinctions among participants in the North Carolina project, including respondents' comments about where they focus their attention among the measures collected and how they use performance data. The project coordinators in some of the participating cities declared that their organization focuses its attention on one or both of the higher-order measures (efficiency and effectiveness); these officials did not even mention workload measures. By and large, these were the cities that accounted for most of the examples of the application of performance measurement for service improvement. Their officials, the ones most often and most intensively engaged in the application of performance measures in key management systems and major management decisions, also appeared to be the ones who most fully grasped the value of good efficiency and effectiveness measures in these systems and decisions and who most fully recognized the limited value of workload measures. In contrast, coordinators who said that their cities rely on all three types of measures, who said that they rely on workload and perhaps another type, or who were unable to say which types are most used tended to represent cities with average or less than average evidence of the actual application of performance measurement for service improvement. Their inability or unwillingness to discount the value of workload measures perhaps betrayed their limited attempts to apply performance measures in key management systems and decisions.

Efficiency Measures in Particular
Ideally, measures of efficiency report with precision the relationship between production outputs and the resources consumed to produce these outputs (see, e.g., Coplin and Dwyer 2000). Resources may be depicted as dollars or as some other representation of a major resource element, for example, $8 per application processed; 150 applications processed per $1,000; 2.2 applications processed per staff hour; 4,400 applications processed per full-time equivalent (FTE) administrative clerk. Each of these examples relates outputs to the dollars or human energy required to produce them. Variations that also address efficiency include measures of utilization, depicting the extent to which equipment, facilities, and personnel are fully utilized, and measures that gauge only roughly the efficiency of production processes (e.g., turnaround time, average daily backlog, percentage completed on schedule).
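To make the arithmetic concrete, here is a minimal sketch in Python. It is our illustration, not the project's, and the figures are hypothetical; unlike the four independent examples quoted above, they are internally consistent. Each style of efficiency measure is the same ratio of outputs to resources, expressed against a different resource base.

    # Illustrative sketch (hypothetical figures): every efficiency measure
    # below is one output-to-resource ratio, expressed against a different base.
    applications_processed = 44_000   # outputs for the period
    total_cost = 352_000.0            # dollars consumed producing them
    staff_hours = 20_000.0            # labor consumed, in hours
    fte_clerks = 10.0                 # labor consumed, in full-time equivalents

    cost_per_application = total_cost / applications_processed             # $8.00
    apps_per_1000_dollars = applications_processed / (total_cost / 1_000)  # 125.0
    apps_per_staff_hour = applications_processed / staff_hours             # 2.2
    apps_per_fte = applications_processed / fte_clerks                     # 4,400

    print(f"${cost_per_application:.2f} per application processed")
    print(f"{apps_per_1000_dollars:.0f} applications processed per $1,000")
    print(f"{apps_per_staff_hour:.1f} applications processed per staff hour")
    print(f"{apps_per_fte:,.0f} applications processed per FTE clerk")

Whichever base is chosen, the measure is only as trustworthy as the two ingredients of the ratio, which is why the paragraphs that follow dwell on the difficulty of capturing resources, and especially costs, accurately.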
The pursuit of greater efficiency has a prominent place in the history of American government and public administration in the 20th century, beginning with the "cult of efficiency" and extending to the current insistence on accountability for the productive use of resources (Comptroller General 1988; GASB 1994; Haber 1964; Hatry et al. 1990; Mosher 1968; Schachter 1989; Schiesl 1977). Typically, candidates for elective office promise to eliminate waste, and administrators at all levels of government swear allegiance to principles of efficiency. In reality, however, relatively few local governments are particularly adept at measuring their efficiency with much precision, and those that are have been accorded something akin to celebrity status by counterparts and admirers, for example, the cities of Sunnyvale, Indianapolis, Charlotte, and Phoenix. Aggressive assaults on inefficiency have less often been prompted by precise measurement and the desire to squeeze another dime out of unit cost or another half hour out of processing time than by the more obvious alarms of idle employees in full public view or budgets rising far beyond historic levels while vendors claim that they can do the work more cheaply.

Privatization and managed competition, the celebrated managing-for-results tactic in which municipal departments must compete with private companies or other producers for the opportunity to deliver services, have exposed vulnerabilities in many local government operations that have arisen, perhaps in part, because of the inadequate state of efficiency measurement in these governments. Unmeasured, untracked, and therefore often undetected, small inefficiencies can become large over a span of years, and eventually, cost-saving alternatives for these operations become understandably attractive.

Managed competition allows decision makers to skip past many of the intricacies and complexities of measuring efficiency at the various stages of the production process, stages and measures that should not be skipped if one is truly managing performance. All that officials need in order to make their managed competition decision are a few quality-of-service standards or measures and the bottom-line costs for the various options. Managed competition is hardly a success story for efficiency measurement; instead, it signals a surrender to the reality that many local governments have not measured or managed their efficiency very well and now find themselves vulnerable if officials are ready to test the bottom line for selected operations.

Efficiency measurement is not easy, even if the concept seems simple. A measure that relates outputs to resources with precision requires the accurate measurement of outputs and inputs. The problem for most governments lies primarily in accounting for inputs. The cost accounting systems in many local governments, if they exist at all, fail to capture total costs. Perhaps they overlook overhead or other indirect costs, ignore the cost of employee benefits (at least insofar as a particular program's costs are concerned), or fail to include annualized capital expenses. In such instances, if unit costs are calculated at all, they understate actual costs and mask inefficiency.

Some local governments desiring measures of their efficiency cope with inadequate cost accounting systems by using staff hours, labor hours, or FTE positions to reflect resources rather than dollars. This strategy dodges many cost accounting issues within their own system and has the additional advantage of permitting efficiency comparisons with other governments without worrying about differences in cost accounting rules from one government to another. Even this measure of efficiency becomes complex, however, when the time of a given employee must be divided among multiple duties and different outputs. While time-logging systems introduce complications that many operations resist, estimation techniques can introduce imprecisions that reduce the value of the measure as a diagnostic tool and as a reliable guide for performance management efforts.

In the face of these complexities, too many local governments resort to reporting "FTEs per 1,000 population" or "cost per capita" for services overall or for the services of a particular department. These are extremely crude measures of efficiency, if they can be called efficiency measures at all. Comparisons of FTEs per 1,000 population are favorites of local governments that contract out one or more major functions; with fewer employees of their own, they look good in such comparisons, regardless of whether the privatization strategy improves services or saves money. Costs per capita for services overall are typically calculated by dividing the total budget by the current population and compared with similar figures for neighboring jurisdictions or counterparts more broadly. These comparisons usually ignore differences in the quality and array of services provided by the listed governments. A city government that has no responsibility for parks or fire services because these are handled by a county government or special district will appear more efficient in a total cost per capita comparison than its full-service counterparts that have responsibility for these costly functions. Per capita cost comparisons on a function-by-function basis reduce this problem but often are plagued by cost accounting variations from city to city.
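Both of the pitfalls just described, unit costs understated by incomplete cost accounting and per capita figures that ignore what a budget actually covers, can be reproduced in a few lines of arithmetic. The sketch below is a hypothetical illustration of our own, not project data.

    # Pitfall 1: incomplete cost accounting understates unit cost.
    direct_cost = 500_000.0        # salaries and operating expenses
    benefits = 150_000.0           # employee benefits, often omitted
    overhead = 75_000.0            # indirect costs (HR, finance, facilities)
    annualized_capital = 30_000.0  # a $300,000 truck over a 10-year life
    tons_collected = 25_000

    partial = direct_cost / tons_collected
    full = (direct_cost + benefits + overhead + annualized_capital) / tons_collected
    print(f"cost per ton, direct cost only: ${partial:.2f}")   # $20.00
    print(f"cost per ton, full cost:        ${full:.2f}")      # $30.20

    # Pitfall 2: total cost per capita ignores the array of services provided.
    # Identical population and efficiency; City Y's parks and fire services
    # are funded by a county, so they never appear in its municipal budget.
    city_x = 120_000_000 / 100_000   # full-service city
    city_y = 90_000_000 / 100_000    # county handles parks and fire
    print(f"City X: ${city_x:,.0f} per capita; City Y: ${city_y:,.0f} per capita")
    # City Y "wins" the comparison without being any more efficient.

In both cases the flattering number is an artifact of what was left out of the numerator, which is why the North Carolina project's insistence on uniform cost accounting rules, discussed below, matters so much.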
For governments wishing to possess good efficiency measures and desiring to compare their efficiency to others, there are advantages in affiliating with a cooperative project that doggedly focuses on issues of cost accounting. Most unaffiliated cities have to grapple on their own with the problems noted in the preceding paragraphs. Some have overcome these problems and have established good efficiency measures that they can use not only to track changes in their own efficiency from year to year but also to compare with others, although with caution. Most, however, do not overcome the problems noted here. Because of the inadequacies of efficiency measurement and the lack of uniformity in cost accounting rules, most cities and counties wishing to compare measures of their services with other jurisdictions are well advised to focus primarily on measures of effectiveness and quality, where cost accounting and the differentiation of duties are not at issue, and only secondarily on measures of efficiency.

Participants in the North Carolina project tell a different story. In this project, which focuses a large portion of its attention on cost accounting rules and uniformity in reporting, participants claim to rely on efficiency measures as heavily as, or in some cases more heavily than, other categories of measures. The project has produced efficiency measures that participants consider reliable. Accordingly, project participants are less apt to ignore the messages they receive from these measures. Because they have expended so much effort on identifying costs precisely, when their efficiency measures suggest they are inefficient, they are unlikely to dismiss the warning. Instead, they are likely to focus on finding ways to correct the problem. The broad array of performance management initiatives reported in table 1 reflects this tendency. Participating cities are arrayed in the table from left to right roughly according to the level and significance of their use of project data to influence operations.

Among participants, some cities emphasize reliance on efficiency measures more than others do. Claims of reliance on efficiency measures appear to be necessary but insufficient as predictors of extensive use of performance measurement data. Some that claimed to use efficiency measures as much as or more than workload or effectiveness measures were not among the project leaders in performance management applications; however, others making this claim were among the leaders. Participants that did not indicate use of efficiency measures tended not to be among the performance management leaders.

Comparison with Others
Local governments in general exhibit different levels of enthusiasm for comparing their own performance statistics with the statistics of others. Some eschew interjurisdictional comparisons or engage in them only reluctantly; some are more receptive, but only if the comparisons are carefully controlled; and others appear to embrace comparisons wholeheartedly. For instance, the city of Portland, Oregon, and others voluntarily publishing Service Efforts and Accomplishments reports at the urging of the Governmental Accounting Standards Board have featured performance comparisons prominently (Portland City Auditor 2003). The growth of the performance comparison project of the International City Management Association, which included 87 cities and counties in 2004 and more than 200 by 2007, is further evidence of an enthusiasm for comparison.6 Representatives of each of the three groups (reluctant comparers, willing but cautious comparers, and enthusiastic comparers) are present among the 15 municipalities participating in the North Carolina project.

Some local governments engage in performance measurement but insist that it is not for the purpose of interjurisdictional comparison. Officials of these governments, including one or two in the North Carolina project, contend that they are more interested in reviewing their own year-to-year performance than in comparing performance with others. While comparison with one's own performance at earlier periods of time is important, reluctance to embrace external comparison is odd for a participant in a project designed primarily for that purpose and may reveal an underlying distrust of performance measurement, anxiety about the numbers being produced and what they will suggest about relative standing, or a lack of confidence in the organization's ability to improve performance. Representatives of one city (city L) have been outspoken from the start about their greater interest in year-to-year comparisons of their own performance data than in external comparisons, even if their official response to this study's inquiry indicated a willingness to compare with cities of similar size. This city's reticence about comparison with other local governments appears to extend also to its use of performance data for management. The concerns that apparently inhibit the former also inhibit the latter. These are the reluctant comparers.

A second group, the willing but cautious comparers, includes cities that are more open to external comparisons but strongly prefer comparisons only to cities of similar size, perhaps only including a select set of cities considered to be of a generally like nature by community leaders or citizens in general. Some officials are especially restrictive in their notions regarding suitable comparisons, preferring that the comparison cities not only be similar in size but also similar in other demographic characteristics and in mode of service delivery for whatever function is being compared.

Despite their preference for limiting comparisons by size, general similarity, or even more restrictive grounds, officials in this second category are clearly more open to external comparison than those in the first category who would prefer to reject it altogether. Nevertheless, even these officials reveal a degree of caution that perhaps suggests a tendency to overestimate the importance of economies of scale (hence their reluctance to be compared to larger cities), a sense of anxiety over the use of performance comparison as a management report card rather than a search for best practices, or a more modest case of the distrust of performance measurement and lack of organizational confidence attributed to officials who prefer not to engage in external comparisons at all.

Concern that service demands and the services themselves are different in fundamental ways in large and small cities, and that economies of scale will favor larger communities, fuels the reluctance of many officials to engage in comparison across population ranges. While the challenges of service delivery and the expectations of service recipients differ from community to community, the effects of scale are less clear than many who reject comparison across population ranges assume. Studies of economies of scale for local government services report different economy-of-scale rates and ceilings across various municipal functions and are sometimes contradictory in their findings (Ahlbrandt 1973; Boyne 1995; DeBoer 1992; Deller, Chicoine, and Walzer 1988; Duncombe and Yinger 1993; Fox 1980; Gyimah-Brempong 1987; Hirsch 1964, 1965, 1968; Kitchen 1976; Newton 1982; Ostrom, Bish, and Ostrom 1988; Savas 1977a, 1977b; Travers, Jones, and Burnham 1993; Walzer 1972). Nevertheless, most participants in the North Carolina project prefer comparisons with cities of similar size despite ambiguous evidence of population effects on service quality or unit costs within the project data. Their preference for comparison only with cities of similar size, even when efficiency measures are standardized as unit costs, reveals a latent belief in economies of scale stronger than the evidence of the existence and impact of such economies supports.
The desire to carefully control the comparison, not only by population but also by other factors to ensure similarity among the comparison group, suggests a general sense of anxiety among officials over the possibility that the comparison will be used as a management report card, that is, as a gauge for assessing how well or how poorly department heads and other managers are doing their jobs. Unfortunately, this anxiety can completely displace the search for best practices and produce a benchmarking design that limits the likelihood of breakthrough discoveries. Two characteristics of this group of officials hint at their concern that performance comparisons will be used as a management report card. First is their insistence on removing the population or economy-of-scale factor from the equation, even if scale economies are weak or nonexistent for a given function, rather than simply controlling for these effects. This suggests a preoccupation with having a "level playing field." When pressed, a few local government officials will contend that all the best ideas for service delivery reside only in cities of their size, and none would concede that larger municipalities are always more efficient than medium-sized or smaller ones. By insisting that their city be compared only with similarly sized municipalities, they willingly sacrifice the possibility of learning a valuable lesson from a larger or a smaller city to the belief that comparison of like cities will be a fairer comparison. Second, the preference of some that comparison units have the same mode of operation for the function being examined similarly emphasizes the importance of a level playing field. If, in fact, the comparison will be used simply to judge the performance of managers, then establishing a fair basis of comparison is indeed important. However, if the purpose is to find new ideas for improving operations, then omitting all but those operating in a similar fashion defeats this purpose.

Project participants in this second category, open mostly to comparison with "like" cities, occupied the broad middle range of participating municipalities. They were large in number and varied in their performance management activities, including some cities that were among the project's leaders and others that engaged in only a few data-driven management initiatives.

A third category, occupied consistently by only one project participant (city A) and intermittently by another (city C), includes local governments that embrace comparisons even when the initial results of these comparisons reveal their own performance to be disappointing. These are the enthusiastic comparers. For them, the comparisons are the first step in a series of steps that lead to performance improvement. The first step provides the impetus for the second and the third. These cities are more likely than others to use performance measures to improve operations. Their list of management initiatives tended to be longer or more significant in terms of documented service improvement or magnitude of budgetary impact.
Table 1 Reported Uses of Performance Data by Cities Participating in the North Carolina Benchmarking Project

Cities A through O are arrayed roughly according to the level and significance of their use of project data to influence operations, from greatest use (city A) to least. For each city, the survey recorded claimed uses of project data beyond reporting (establishing performance targets; contracting decision/management; program evaluation; budget proposal/review; other (1)), the types of measures used (workload/output, efficiency, effectiveness/outcome), the city's preference regarding comparison, and reported applications of project data.

Comparison preferences: Most cities indicated a preference for comparison with cities of similar size. City A's preference was unrestricted (average, best, worst, all); one city preferred comparison to the project participant average and selected others; two cities ultimately restricted comparisons to like cities; and in one city, departments preferred not to compare with others.

Reported applications of project data:

City A: Used to negotiate price and establish performance standards for refuse contract; to project service costs for annexation; to review staff/equipment requests; as gauge for redesign of service routes and monitoring performance; to monitor community policing performance and deployment results; influenced emergency communications work plans, leading to improved performance; influenced staffing decisions and development of work order system in asphalt maintenance; incorporated into fire department goals and objectives, performance appraisals, analysis of station locations, and analysis for fire inspections; used for analysis of fleet maintenance, identifying opportunity for staff reduction and the need for a revised vehicle replacement schedule; prompted review of HRM processes, goals, staffing, and employee benefits.

City B: Data supported move to automated refuse trucks; used to monitor refuse collection efficiency and effectiveness and waste diversion rate; to evaluate requests for additional police personnel; low ratio of calls per telecommunicator prompted analysis of emergency communications; to evaluate appropriate use of contractors in asphalt maintenance; to evaluate fleet maintenance operation and vehicle replacement policy and set performance targets for mechanics; to consider comparative employee turnover rates in compensation deliberations.

City C: Project data used to identify opportunities for more efficient deployment of refuse collection equipment and crews, yielding substantial budgetary savings; to assess the costs and benefits of backyard vs. curbside collection (leading to the introduction of a voluntary curbside program).

City D: Project data used to compare costs and workload in fire services, especially fire inspections.

City E: Data confirmed benefits of automated refuse collection; data used to evaluate contract costs; to analyze effects of service delivery options on waste diversion rates; to evaluate use of seasonal vs. permanent staffing; to evaluate performance and set work levels in police/emergency communications; to analyze equipment options for asphalt maintenance, improving efficiency; data analysis led to fire department taking a role in EMS; influenced fleet maintenance performance targets.

Cities F through O (two of which reported no applications of project data beyond reporting): Project data led to consideration of curbside collection and review of equipment and crew configuration for residential recycling services; were used in evaluating police deployment strategies, in assessing supervisory staff size in emergency communications (staff increased), in workforce planning for fire inspections, and in monitoring fleet maintenance performance; to assess police staffing and deployment, to adjust fire department work plans, and to focus improvement efforts and work planning in building inspections; prompted a shift from rear loaders to side loaders and smaller crews for refuse collection; were used to assess the pros and cons of automated residential refuse and leaf/litter collection; to assess staffing and equipment needs in residential refuse collection, including automation options, and staffing needs in police/fire services and building inspection; to identify and remedy inadequacies in performance information for emergency communications and asphalt maintenance; in analysis of fleet services; to assess conversion from backyard to curbside refuse collection; to assess the funding level for asphalt maintenance (comparative fire service statistics also prompted scrutiny); and in refuse collection contract negotiation and review of comparative asphalt maintenance costs, including centralization.

(1) "Other" uses noted by respondents included use of the project's measures in annexation studies and as a reference source for responding to manager and council requests.

Incorporating Performance Measurement into Key Management Systems
narrowly than have others. For officials subscribing to the narrowest definition, accountability means performance reporting, plain and simple. They believe that an accountable city or county government will keep the governing body, media, and citizens informed about the government's financial condition and the performance of its major functions. These governments often report performance measures in their budget documents. Some produce separate performance reports or post performance measures on their Web site. Those perceiving accountability most narrowly may be inclined to view performance measurement as a necessary chore that must be done to fulfill their accountability obligation. In this view, expenditures of dollars, time, and energy to collect and report performance measures are a cost of doing business rather than an investment in service improvement, and as such, this cost should be kept at a minimal level, if possible. Accordingly, many of these cities and counties load up their performance reports with raw counts of workload (outputs). After all, these are the simplest and cheapest measures to collect and tabulate, and perhaps the elected officials and citizens will be impressed by the number of transactions being processed or the tons of garbage being collected. The higher-order measures of efficiency and effectiveness are more difficult to compile and often are not attempted by officials taking a minimalist view of accountability.
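The distinction among these types of measures can be made concrete with a brief illustration. All figures below are hypothetical; the sketch is offered only to fix the terminology of workload (output), efficiency, and effectiveness measures:

```python
# Illustrative sketch only: the three kinds of measures discussed above,
# computed for a hypothetical refuse-collection operation.

tons_collected = 20_000      # workload/output measure: a raw count of work done
total_cost = 1_000_000.00    # dollars spent on collection
missed_pickups = 150         # collections reported missed or late
households_served = 40_000

# An efficiency measure relates cost (or effort) to output.
cost_per_ton = total_cost / tons_collected

# An effectiveness (outcome) measure gauges how well the service was delivered.
missed_rate = missed_pickups / households_served * 100

print(f"Workload:      {tons_collected:,} tons collected")
print(f"Efficiency:    ${cost_per_ton:.2f} per ton")
print(f"Effectiveness: {missed_rate:.2f} missed pickups per 100 households")
```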
A broader view of accountability includes the obligation for basic performance reporting but extends clearly beyond the raw workload counts into dimensions of service efficiency, quality, and effectiveness. Accountable officials, in this view, are responsible stewards of the government's resources who understand both their obligation to provide services that balance the community's desires for quality and efficiency and their obligation to produce evidence of their performance on this score. In order to conscientiously manage their operations, officials taking this broader view must be able to assure themselves and others that they are achieving reasonable levels of efficiency and service quality. For this, they must have reliable measures of efficiency and effectiveness (outcomes) that will either alert them to problems and prompt the development of new management strategies or reassure them that they are meeting their performance targets.

Officials taking the narrow view of accountability are less likely to venture beyond workload measures and are unlikely to try to incorporate performance measures into key management systems. For them, it seems rational and prudent to collect only the simplest measures and to divert as few resources as possible from service delivery to the measurement of performance. Given their narrow view of accountability and the minimal value of raw workload counts for management or policy decisions, they are unlikely to use performance measures meaningfully in strategic planning or management systems, performance contracts, departmental or individual work plans, performance audits, performance targets, program evaluations, service improvement strategies, cost-benefit analyses, annexation and other special studies, or budget proposals. These uses are much more likely to be found in local governments where officials take the broader view of accountability and where performance measurement is considered an indispensable ingredient in performance management. In such governments, performance measurement is a tool that provides reassurance to the manager or supervisor when performance is on target and sounds an alarm when performance falls short of expectations, signaling the need for focused attention and perhaps a new strategy, and helping the organization fulfill its obligation for conscientious management that delivers quality services in an efficient manner.

Participating municipalities in the North Carolina project were questioned about their use of project data for four management purposes beyond reporting: establishing performance targets; contracting and managed competition, including analysis of options as well as contract design and management; program evaluation; and budget proposals and review. Two of the cities (cities A and B) reported all four uses. For instance, city A, the city mentioned previously for having used benchmarking project data to avoid a price hike from its refuse collection contractor, clearly benefited from having incorporated these performance data into its contract management, monitoring, and budgeting systems. We would assert that cities A and B are among the three or four in this set that have most fully adopted the broader definition of accountability. Not coincidentally, these two cities also provide some of the most extensive examples of the application of performance data to improve operations.

Two other cities (cities D and G), recognized in other venues for the sophistication of their management systems and their use of performance data in general, have incorporated less of the data from this project into their systems and report fewer applications of the data from this project than do a few of their counterparts. Nevertheless, their level of use places them on the left-hand portion of table 1. Two other cities that report three of the four uses beyond reporting (cities C and E) are also among the project leaders in the tangible application of project data for operations improvement.

The incorporation of performance data into management systems is not a perfect predictor of the actual use of performance measurement to adjust operational processes or to improve the quality or efficiency of
services. Nor is the failure to incorporate performance data into key management systems an absolute guarantee that the organization will not use its measures for service improvement. Some of the North Carolina cities that have been slow to incorporate project data into their management systems nevertheless have been able to report beneficial applications of project data. However, even among the small set of cities engaged in the North Carolina project, a positive relationship between the incorporation of performance data in key management systems and the application of these data for service improvement is evident.
performance measurement waned in the 1980s

Conclusions "because measures were increasingly perceived


As the collecting of performance measures by city and county governments has become more common, observers have noted with increasing disappointment the meager use of these measures to improve the quality or efficiency of services. As researchers seek explanations for the use or nonuse of performance data in local government, they might look to the experience of the cities participating in the North Carolina Benchmarking Project for three possibilities. The experience of 15 participating municipalities suggests that the likelihood that performance data will be used to influence operations is enhanced by the collection of and reliance on higher-order measures, especially efficiency measures, rather than simply workload or output measures; the willingness of officials to embrace comparison with other governments or service producers; and the incorporation of performance measures into key management systems.
Acknowledgments
The authors gratefully acknowledge the assistance of Dale J. Roenigk, director of the North Carolina Benchmarking Project, in helping compile the information for this article.
Notes
1. Although reviews of performance reporting documents have revealed the tendency of some local officials to overstate their government's measurement status in surveys (Ammons 1995; Hatry 1978; Usher and Cornia 1981), it is nevertheless safe to conclude that the practice of collecting basic measures, especially workload or output indicators, is widespread.
2. Common failure to use performance measurement for purposes beyond reporting is not confined to state and local governments. A National Academy of Public Administration panel examining early federal efforts to implement the Government Performance and Results Act found "little evidence in most plans that the performance information would be used to improve program performance" (NAPA 1994, 8).
3. Grizzle notes the common complaint "that decision makers seldom use performance information to make decisions" (2002, 363). Berman writes that many managers see performance measurement as "a required management chore with few potential advantages" (2002, 349). Streib and Poister detect only "a narrow range of benefits," "few substantial impacts," and no significant effects on "bottom line issues" (1999, 119). Bouckaert and Peters observe that the costs of performance measurement are readily apparent, while the "expected (or hoped for) benefits from performance-based management" are "sometimes invisible" (2002, 360). Poister notes that interest in performance measurement waned in the 1980s "because measures were increasingly perceived as not making meaningful contributions to decision making" and that lingering "skepticism remains about both the feasibility and the utility of measurement systems" (2003, 6, 272). Coplin, Merget, and Bourdeaux contend that most government agencies do not have systems in place that make performance data "part of the decision-making processes and have not made a serious commitment to do so, whether they profess to or not" (2002, 700). Weitzman, Silver, and Brazill write that, despite the assumption that good performance data will lead to improved decision making, "the evidence to support the leap from better information to better policy is not yet substantiated" (2006, 397). Dubnick found "nothing in the existing literature . . . that would provide a logical (let alone a theoretical or empirical) link between account giving and performance" (2005, 403). Kelly reports, "We know a lot about how to construct and report performance measures, but we cannot say specifically why we go to all the trouble. According to our best evidence, nothing much changes as a result of adopting performance measurement systems" (2002, 375).
4. Though we readily acknowledge that reporting performance measures is an important use of measurement for the purpose of accountability, the focus of this study is the use of performance measures to influence decisions and improve services. Simply reporting measures does not necessarily reflect reliance on these measures for decisions. Similarly, although we agree that performance measures should be instrumental in the monitoring of operations, vague assertions of that use are easily overstated and therefore are dismissed in this study.
5. The 15 cities are Asheville, Cary, Charlotte, Concord, Durham, Gastonia, Greensboro, Hickory, High Point, Matthews, Raleigh, Salisbury, Wilmington, Wilson, and Winston-Salem, North Carolina. As a project participant, each city agrees to commit administrative resources necessary to compile its data in a timely manner and to pay an annual fee to offset the costs borne by the university in managing the project. Any North Carolina municipality may join the project. For more
information on this project, see http://www.sog.unc.edu/programs/perfmeas.
6. See information about the International City Management Association's Center for Performance Measurement at http://www.icma.org.

References
Ahlbrandt, Roger S., Jr. 1973. Efficiency in the Provision of Fire Services. Public Choice 16(1): 1-15.
American Society for Public Administration (ASPA). 1992. Resolution Encouraging the Use of Performance Measurement and Reporting by Government Organizations. Washington, DC: ASPA.
Ammons, David N. 1995. Overcoming the Inadequacies of Performance Measurement in Local Government: The Case of Libraries and Leisure Services. Public Administration Review 55(1): 37-47.
Ammons, David N. 2000. Benchmarking as a Performance Management Tool: Experiences among Municipalities in North Carolina. Journal of Public Budgeting, Accounting and Financial Management 12(1): 106-24.
Ammons, David N., Charles Coe, and Michael Lombardo. 2001. Performance-Comparison Projects in Local Government: Participants' Perspectives. Public Administration Review 61(1): 100-110.
Behn, Robert D. 2003. Why Measure Performance? Different Purposes Require Different Measures. Public Administration Review 63(5): 586-606.
Behn, Robert D. 2006. The Varieties of CitiStat. Public Administration Review 66(3): 332-40.
Berman, Evan M. 2002. How Useful Is Performance Measurement? Public Performance and Management Review 25(4): 348-51.
Berman, Evan M., and XiaoHu Wang. 2000. Performance Measurement in U.S. Counties: Capacity for Reform. Public Administration Review 60(5): 409-20.
Bouckaert, Geert, and B. Guy Peters. 2002. Performance Measurement and Management: The Achilles' Heel in Administrative Modernization. Public Performance and Management Review 25(4): 359-62.
Boyne, G. A. 1995. Population Size and Economies of Scale in Local Government. Policy and Politics 23(3): 213-22.
Burke, Brendan F., and Bernadette C. Costello. 2005. The Human Side of Managing for Results. American Review of Public Administration 35(3): 270-86.
Clay, Joy A., and Victoria Bass. 2002. Aligning Performance Measurement with Key Management Processes. Government Finance Review 18: 26-29.
Comptroller General of the United States. 1988. Governmental Auditing Standards: Standards for Audit of Governmental Organizations, Programs, Activities, and Functions. Rev. ed. Washington, DC: U.S. Government Printing Office.
Coplin, William D., and Carol Dwyer. 2000. Does Your Government Measure Up? Basic Tools for Local Officials and Citizens. Syracuse, NY: Syracuse University, Maxwell Community Benchmarks Program.
Coplin, William D., Astrid E. Merget, and Carolyn Bourdeaux. 2002. The Professional Researcher as Change Agent in the Government-Performance Movement. Public Administration Review 62(6): 699-711.
DeBoer, Larry. 1992. Economies of Scale and Input Substitution in Public Libraries. Journal of Urban Economics 32(2): 257-68.
de Lancer Julnes, Patria, and Marc Holzer. 2001. Promoting the Utilization of Performance Measures in Public Organizations: An Empirical Study of Factors Affecting Adoption and Implementation. Public Administration Review 61(6): 693-708.
Deller, Steven C., David L. Chicoine, and Norman Walzer. 1988. Economies of Size and Scope in Rural Low-Volume Roads. Review of Economics and Statistics 70(3): 459-65.
Dubnick, Melvin J. 2005. Accountability and the Promise of Performance. Public Performance and Management Review 28(3): 376-417.
Duncombe, William, and John Yinger. 1993. An Analysis of Returns to Scale in Public Production, with an Application to Fire Protection. Journal of Public Economics 52(1): 49-72.
Fox, William F. 1980. Size Economies in Local Government Services: A Review. Rural Development Research Report No. 22. Washington, DC: U.S. Department of Agriculture.
Frank, Howard A., and Jayesh D'Souza. 2004. Twelve Years into the Performance Measurement Revolution: Where We Need to Go in Implementation Research. International Journal of Public Administration 27(8-9): 701-18.
Governmental Accounting Standards Board (GASB). 1989. Resolution on Service Efforts and Accomplishments Reporting. Norwalk, CT: GASB.
Governmental Accounting Standards Board (GASB). 1994. Concepts Statement No. 2 of the Governmental Accounting Standards Board on Concepts Related to Service Efforts and Accomplishments Reporting. Norwalk, CT: GASB.
Governmental Accounting Standards Board (GASB), and National Academy of Public Administration (NAPA). 1997. Report on Survey of State and Local Government Use and Reporting of Performance Measures. Washington, DC: GASB.
Grizzle, Gloria A. 2002. Performance Measurement and Dysfunction: The Dark Side of Quantifying Work. Public Performance and Management Review 25(4): 363-69.
Gyimah-Brempong, Kwabena. 1987. Economies of Scale in Municipal Police Departments: The Case
of Florida. Review of Economics and Statistics 69(2): 352-56.
Haber, Samuel. 1964. Efficiency and Uplift: Scientific Management in the Progressive Era 1890-1920. Chicago: University of Chicago Press.
Halachmi, Arie. 2002. Performance Measurement, Accountability, and Improved Performance. Public Performance and Management Review 25(4): 370-74.
Hatry, Harry P. 1978. The Status of Productivity Measurement in the Public Sector. Public Administration Review 38(1): 28-33.
Hatry, Harry P. 2002. Performance Measurement: Fashions and Fallacies. Public Performance and Management Review 25(4): 352-58.
Hatry, Harry P., James R. Fountain, Jr., Jonathan M. Sullivan, and Lorraine Kremer. 1990. Service Efforts and Accomplishments Reporting: Its Time Has Come. Norwalk, CT: Governmental Accounting Standards Board.
Hirsch, Werner Z. 1964. Local vs. Areawide Urban Government Services. National Tax Journal 17(4): 331-39.
Hirsch, Werner Z. 1965. Cost Functions of an Urban Government Service: Refuse Collection. Review of Economics and Statistics 47(1): 87-93.
Hirsch, Werner Z. 1968. The Supply of Urban Public Services. Baltimore: Johns Hopkins University Press.
Ho, Alfred, and Paul Coates. 2004. Citizen-Initiated Performance Assessment: The Initial Iowa Experience. Public Performance and Management Review 27(3): 29-50.
International City Management Association (ICMA). 1991. Practices for Effective Local Government Management. Washington, DC: ICMA.
Jones, Ann. 1997. Winston-Salem's Participation in the North Carolina Performance Measurement Project. Government Finance Review 13(4): 35-36.
Joyce, Philip G. 1997. Using Performance Measures for Budgeting: A New Beat, or Is It the Same Old Tune? In Using Performance Measurement to Improve Public and Nonprofit Programs, edited by Kathryn E. Newcomer, 45-61. San Francisco: Jossey-Bass.
Kelly, Janet M. 2002. Why We Should Take Performance Measurement on Faith. Public Performance and Management Review 25(4): 375-80.
Kitchen, Harry. 1976. A Statistical Estimation of an Operating Cost Function for Municipal Refuse Collection. Public Finance Quarterly 4(1): 56-76.
Melkers, Julia, and Katherine Willoughby. 2005. Models of Performance-Measurement Use in Local Governments: Understanding Budgeting, Communication, and Lasting Effects. Public Administration Review 65(2): 180-90.
Mosher, Frederick C. 1968. Democracy and the Public Service. New York: Oxford University Press.
National Academy of Public Administration (NAPA). 1991. Performance Monitoring and Reporting by Public Organizations. Washington, DC: NAPA.
National Academy of Public Administration (NAPA). 1994. Toward Useful Performance Measurement: Lessons Learned from Initial Pilot Performance Plans Prepared under the Government Performance and Results Act. Washington, DC: NAPA.
Newton, K. 1982. Is Small Really So Beautiful? Is Big Really So Ugly? Size, Effectiveness, and Democracy in Local Government. Political Studies 30(2): 190-206.
Osborne, David, and Ted Gaebler. 1992. Reinventing Government: How the Entrepreneurial Spirit Is Transforming the Public Sector. Reading, MA: Addison-Wesley.
Osborne, David, and Peter Hutchinson. 2004. The Price of Government: Getting the Results We Need in an Age of Permanent Fiscal Crisis. New York: Basic Books.
Osborne, David, and Peter Plastrik. 2000. The Reinventor's Fieldbook: Tools for Transforming Your Government. San Francisco: Jossey-Bass.
Ostrom, Vincent, Robert Bish, and Elinor Ostrom. 1988. Local Government in the United States. San Francisco, CA: Institute for Contemporary Studies.
O'Toole, Daniel E., and Brian Stipak. 2002. Productivity Trends in Local Government Budgeting. Public Performance and Management Review 26(2): 190-203.
Page, Sasha, and Chris Malinowski. 2004. Top 10 Performance Measurement Dos and Don'ts. Government Finance Review 20(5): 28-32.
Poister, Theodore H. 2003. Measuring Performance in Public and Nonprofit Organizations. San Francisco: Jossey-Bass.
Poister, Theodore H., and Gregory Streib. 1999. Performance Measurement in Municipal Government: Assessing the State of the Practice. Public Administration Review 59(4): 325-35.
Portland, Oregon, Office of the City Auditor. 2003. City of Portland Service Efforts and Accomplishments: 2002-03. http://www.portlandonline.com
Ridley, Clarence E., and Herbert A. Simon. 1943. Measuring Municipal Activities: A Survey of Suggested Criteria for Appraising Administration. Chicago: International City Managers' Association.
Savas, E. S. 1977a. An Empirical Study of Competition in Municipal Service Delivery. Public Administration Review 37(6): 717-24.
Savas, E. S. 1977b. The Organization and Efficiency of Solid Waste Collection. Lexington, MA: Lexington Books.
Schachter, Hindy Lauer. 1989. Frederick Taylor and the Public Administration Community: A Reevaluation. Albany: State University of New York Press.
Schiesl, Martin J. 1977. The Politics of Efficiency: Municipal Administration and Reform in America, 1880-1920. Berkeley: University of California Press.
Silverman, Eli B. 1999. NYPD Battles Crime: Innovative Strategies in Policing. Boston: Northeastern University Press.
Smith, Dennis C., and William J. Bratton. 2001. Performance Management in New York City: CompStat and the Revolution in Police Management. In Quicker, Better, Cheaper: Managing Performance in American Government, edited by Dall W. Forsythe, 453-82. Albany, NY: Rockefeller Institute Press.
Streib, Gregory D., and Theodore H. Poister. 1999. Assessing the Validity, Legitimacy, and Functionality of Performance Measurement Systems in Municipal Governments. American Review of Public Administration 29(2): 107-23.
Travers, T., G. Jones, and J. Burnham. 1993. The Impact of Population Size on Local Authority Costs and Effectiveness. York, UK: Joseph Rowntree Foundation.
Usher, Charles L., and Gary Cornia. 1981. Goal Setting and Performance Assessment in Municipal Budgeting. Public Administration Review 41(2): 229-35.
Walzer, Norman. 1972. Economies of Scale and Municipal Police Services: The Illinois Experience. Review of Economics and Statistics 54(4): 431-38.
Wang, XiaoHu. 1997. Local Officials' Preferences of Performance Measurements: A Study of Local Police Services. PhD diss., Florida International University.
Wang, XiaoHu. 2002. Assessing Performance Measurement Impact: A Study of U.S. Local Governments. Public Performance and Management Review 26(1): 26-43.
Weitzman, Beth C., Diana Silver, and Caitlyn Brazill. 2006. Efforts to Improve Public Policy and Programs through Data Practice: Experiences in 15 Distressed American Cities. Public Administration Review 66(3): 386-99.
