
Financial Management Series

Number 8
(Instructional Version)

Performance Measurement
& Performance Based
Budgeting (PBB)
Alan Probst
Local Government Specialist
Local Government Center
UW-Extension
Part I

Overview
Performance Budgeting
 Originated in the late 1940s

 Congress enacted it through the National Security Act of 1949 for the newly formed Department of Defense

 Government Performance and Results Act (GPRA) of 1993
Performance Budgeting

Based on the assumption that presenting performance information alongside budget amounts will improve budget decision-making by focusing funding choices on program results
Performance Based Budgeting
 Performance based budgeting cannot begin until a system of performance measurement has been instituted

 A functional performance based budgeting system cannot be expected to produce the desired long-term results in its first year

 Must build a Performance Based Management System
Management Tool
Performance budgets focus on missions, goals, and objectives to explain why money is being spent and provide a way to allocate resources to achieve specific results

PBB is intended to be a management tool for program improvement, not a “carrot and stick” methodology used to “punish” departments for not meeting goals
Why is this Important?
 Most Federal grants now require outcome evaluations (performance measurement) in their applications

 Bond sales require indicators of financial condition, which are well presented by performance data

 Local government revenues are becoming insufficient, making effective use of resources imperative

 Promotes the logical tie between planning and budgeting
Why is this Important?
 Both the Governmental Accounting Standards Board (GASB) and the Government Finance Officers Association (GFOA) are promoting performance measurement, indicating it may soon become a requirement

 Provides a way to quantify for citizens how well their local government is doing compared to previous years and other similar communities, i.e., “how much bang they’re getting for their buck”
CAUTION!
One of the greatest mistakes in Performance Based Budgeting is to make simplified assumptions based on unrefined results and then apply a system of rewards and punishments based on them

Such an approach frequently yields adverse program impacts
Unintended Consequences
 If PBB is used as a reward and punishment system, how do you ensure that reducing a budget by, say, 5% for poor performance doesn’t reap a 20% decrease in future performance?

 How do you ensure you’ve considered all factors that may have affected the decline in performance?
Potential Flaws
Incorrect assumptions or conclusions

 Police: Arrests are up; we gave you more money, what’s wrong?
 Police: Arrests are down; we gave you more money, what’s wrong?
 Do more arrests mean better police work, more crime, less crime, better crime prevention, or less police work?
Example
An effective Police Department deters crime; how does one measure “deterred” crime?

“Police arrests are down 5% from last year so, under Performance Based Budgeting, we should reduce the Police budget by 5% until they improve their results”

Such a simplistic approach fails to account for the success of crime prevention efforts and community policing, thereby punishing good performance
Interim Solution
Until a performance measurement system can be fully implemented, an interim solution may help set the groundwork

 Department heads provide a bullet narrative with annual budget requests including:
 What did the department accomplish last year?
 What is different in this year’s budget request?
 What goals has the department set for the coming year?
Part II

Performance Measurement
Performance Measurement

The regular, systematic collection, analysis, and reporting of data that tracks resources used, work produced, and whether specific outcomes were achieved by an organization

Note: Measurements are only meaningful to the degree that they are a basis for strategic and operational decision-making
Performance Measurement
Performance Measurement should:

 Be based on program goals and objectives that tie to a statement of program mission or purpose
 Measure program outcomes
 Provide for resource allocation comparisons over time
 Measure efficiency and effectiveness for continuous improvement
 Be verifiable, understandable, and timely
Performance Measurement
 Be consistent throughout the strategic plan, budget, and accounting and reporting systems over time

 Be reported internally and externally (Federal grants already require it, and GASB may soon)

 Be monitored and used in managerial decision-making processes
Performance Measurement
 Be limited to a number and degree of complexity that can provide an efficient and meaningful way to assess the effectiveness and efficiency of key programs

 Motivate staff at all levels to contribute toward organizational improvement
Principles
Principle I

Establish broad goals to guide government decision-making

 Basis for the development of policies, programs, and service types and levels to be provided

 Developed after an assessment of community conditions and a review of internal operations of the government

(GFOA)
Principle II
Develop approaches to achieve goals

 A government should have specific policies, plans, programs, and management strategies to define how it will achieve its long-term goals

 It is the policies, plans, and programs that define how the government will go about accomplishing these goals

(GFOA)
Principle III
Develop a budget with approaches to achieve goals

 Prepare and adopt a financial plan and budget that moves toward achievement of goals within the constraints of available resources

 Provides for the preparation of a financial plan, capital improvement plan, and budget options

(GFOA)
Principle IV
Evaluate performance and make adjustments

 Program and financial performance should be continually evaluated, and adjustments made, to encourage progress toward achieving goals

 Based on this review, the government may need to make adjustments to the budget, plans, and policies if goals are to be achieved

(GFOA)
Performance Indicators
 Input
 Output
 Efficiency
 Service Quality
 Outcome
 Explanatory Data
Performance Indicators should:
 Be quantifiable and measurable
 Be relevant, understandable, timely,
consistent, comparable, and reliable
 Constitute a family of measures
- input
- output
- efficiency
- service quality
- outcome
Types of Performance Indicators
 Input Indicators
- resources used to produce an output
- examples
 costs (direct costs plus fringe benefits)
 labor hours
Types of Performance Indicators
 Output Indicators
- quantity of units produced
- typically under managerial control
- examples
 Miles of pipe visually inspected
 Clients served
Types of Performance Indicators
 Efficiency Indicators
- ratio of inputs used per unit of output (or outputs per input)
- examples
 Cost per unit: cost per ton of refuse collected, cost per prisoner boarded, cost per transaction, etc.
 Productivity: hours per consumer complaint, plans reviewed per reviewer, etc.
Efficiency vs. Effectiveness
Efficiency is related to cost-effectiveness, i.e., the lowest cost for a given output level

Effectiveness is related to whether the service level meets the demands of the citizens
Types of Performance Indicators
 Service Quality Indicators
- how satisfied customers are
- how accurately a service is provided
- how timely a service is provided
- examples
 Percentage of respondents satisfied with service
 Frequency of repeat repairs
 Average wait time
Types of Performance Indicators
 Outcome Indicators
- qualitative consequences associated with a program/service
- focus on the ultimate “why” of providing the service
- examples include:
 Reduction in fire deaths/injuries
 Increase in job trainees who hold a job for more than six months
 Decrease in low birth-weight babies
Four-Step Methodology*
 Step 1: Review and evaluate existing
department mission and cost center
goals
 Step 2: Identify service areas
 Step 3: Define service area objectives
 Step 4: Identify indicators that
measure progress toward objectives

*(Fairfax County, VA Performance Measurement System)


Step 1
Cost Center Goal Statement
 States what is to be accomplished (outcome)
 States what is to be provided/produced (output)
 States why the cost center exists
 Identifies customers
 Transcends several years
 Begins with “To” and a verb
Cost Center Goal Template & Example

To provide/produce (fill in service or product) to (fill in customer) in order to (state what you intend to accomplish).

Maternal and Child Health Services

To provide maternity, infant, and child health care and/or case management to at-risk women, infants, and children in order to achieve optimum health and well-being
Step 2
Identifying a Service Area
 Identify your major activities
 Do not identify every activity; only major
activities
- critical to success of agency’s mission
- consume significant portion of cost center
(department) budget
- politically sensitive or frequently in spotlight
- significant customer service focus
 Group activities that have common
objectives and/or customers
Step 3
Service Area Objectives
 Support cost center goal
 Reflect planned benefits to customers
 Allow measurement of progress
 Quantify portion of the cost center goal
that will be accomplished within the
fiscal year
 Describe quantifiable future target
(optional)
Step 3
Objective Statement Template & Example

To improve/reduce/maintain (accomplishment) by (a number or percentage), (from X to Y), toward a target of (a number).

Maternal and Child Health Services

To improve the immunization completion rate of children served by the Health Department by 3 percentage points, from 77 percent to 80 percent, toward a target of 90 percent, which is the Healthy People 2010 goal.
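Because the template quantifies a baseline, an objective, and a long-range target, progress against it is simple arithmetic. A minimal Python sketch using the immunization figures above; the year-end result is a hypothetical value added for illustration:

```python
# Hypothetical sketch: checking progress on a quantified objective.
# Baseline, objective, and target are from the immunization example;
# the year-end "actual" is invented for illustration.
baseline = 77.0   # percent completing immunization last year
objective = 80.0  # percent, this year's objective (+3 points)
target = 90.0     # percent, Healthy People 2010 target
actual = 79.0     # percent, hypothetical year-end result

gain = actual - baseline
share_of_gap = (actual - baseline) / (target - baseline) * 100
print(f"Gain: {gain:+.1f} points (objective was +{objective - baseline:.0f})")
print(f"Closed {share_of_gap:.0f}% of the gap to the {target:.0f}% target")
```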
Step 4
Indicator Definitions & Examples
Input
  Definition: resources used to produce an output
  Examples: cost (direct costs plus fringe benefits); staff hours

Output
  Definition: quantity or number of units produced; activity-oriented, measurable, and usually within managerial control
  Examples: Res. properties assessed; clients served; calls responded to
Step 4
Indicator Definitions & Examples
Efficiency
  Definition: inputs per unit of output, or outputs per input
  Examples: cost per appraisal; appraisals per appraiser

Service Quality
  Definition: timeliness, accuracy, and/or customer satisfaction of the service provided
  Examples: errors per data entry operator; response time; percentage of customers satisfied
Step 4
Indicator Definitions & Examples
Outcome
  Definition: qualitative consequences associated with a program/service; focuses on the ultimate “why” of providing a service
  Example: job trainees who hold a job for more than 6 months
Input Indicators
Resources used to produce an output
 Cost (budgeted or actual)
 Staff-year equivalents (SYE)
 Full-time equivalents (FTE)
 Direct labor hours (DLH)
Costs as an Input Indicator
Direct costs plus fringe benefits
Direct costs are those devoted to a particular service and include:
 Personnel services
 Operating expenses
 Recovered costs
 Capital equipment
Output Indicators
What was produced/provided; output indicators usually end in “ed”

Questions to ask
 What services were delivered?
 What volume was provided?
 How many units of service?
Examples

 Fire suppression: incidents responded to
 Human Resources: vacancies filled
 Library: new materials circulated
Efficiency Indicators
Inputs used per unit of output
 Cost per unit, where the input is money/dollars
 Productivity, where the input is staff hours

Examples:
 Cost per senior lunch served
 Cost per client
 Investigations conducted per detective
 Hours per fire inspection

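Both forms of efficiency indicator reduce to a simple ratio. An illustrative Python sketch; all figures here are invented:

```python
# Illustrative sketch with invented figures: the two efficiency forms.

# Cost per unit: the input is dollars.
refuse_cost = 412_000        # annual refuse collection cost ($)
tons_collected = 5_150       # tons of refuse collected
print(f"Cost per ton: ${refuse_cost / tons_collected:,.2f}")  # $80.00

# Productivity: the input is staff hours.
inspection_hours = 320       # staff hours spent on fire inspections
inspections = 128            # inspections completed
print(f"Hours per inspection: {inspection_hours / inspections:.1f}")  # 2.5
```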
Efficiency Indicators
 Fire Suppression: cost per incident
 Senior-based Services: cost per client
 Human Resources: cost per vacancy filled
 Custodial Services: cost per square foot cleaned

Service Quality Indicators
Measures customer satisfaction, timeliness, and/or accuracy of a service

Examples:
 Customer surveys
 Response logs
 Error rates
Service Quality Indicators
 Fire Suppression: average suppression response time
 Senior-based Services: percent of clients satisfied with services provided
 Human Resources: satisfaction rate with vacancy processing
 Custodial Services: percent of customers satisfied with custodial services
Outcome Indicators
 Describe the benefit of the service to the customer
 Describe what was changed or accomplished as a result of the service
 Questions to ask:
 How has the customer benefited?
 Why is the customer better off?
 What is the impact of the service?
Outcome Indicators
 Fire Suppression: fire deaths per 10,000 population; fire injuries per 10,000 population
 Senior-based Services: percent of clients who remain in the community after one year of service or information
 Human Resources: average recruitment time
 Custodial Services: percentile comparisons of cost per square foot to IFMA standards
The Logic Model
“Begin with the end in mind”

Start by asking:
 What results are we seeking?
 What are we hoping to accomplish?
 How will we accomplish it?
What is the Logic Model?
 A picture of a program
 A way to show the relationship between
what we put in (inputs), what we do
(outputs) and what results occur
(outcomes)
 Sequence of if/then relationships
 Core of program planning and
evaluation
Logic Model
Inputs (what we invest): staff, dollars, volunteers, materials, equipment, technology

Outputs (what we do): workshops, outreach, inspections

Outcomes
 Short-term: awareness, knowledge, attitudes, skills
 Medium-term: behavior, decisions, policies
 Long-term: conditions (social, economic, civic, environmental)
Logic Model – Fire Suppression

Inputs (what we invest): staff, dollars, volunteers, materials, technology

Outputs (what we do): training, inspections, emergency response, public education

Outcomes
 Short-term: inspections completed, suppression responses
 Medium-term: response time, fire containment, prevalence of smoke detectors
 Long-term: protection of lives & property (fire deaths and injuries)
Example
Service Area Objective
Teen Pregnancy Prevention

Acceptable: Reduce the teen pregnancy rate by 2 pregnancies per 1,000 population, from 42 to 40, in localities with comprehensive teen services

Unacceptable: Increase the number of localities with comprehensive teen services from 20 to 27
Performance Management Model
Goals: general goals of the program
 Provide quality services to all customers
 Maintain or improve performance
 Provide economical services

Inputs: resources
 Money
 Facilities
 Equipment
 Supplies
 Contracted services
Performance Management Model
(cont.)
Activities: work processes
 Salting roads
 Making arrests
 Processing bills
 Performing inspections

Outputs: goods & services produced
 Statistical measurements
 Miles of roads repaired
 Tons hauled or recycled
 Positions filled
Performance Management Model
(cont.)
Outcomes: results and impacts
 100% of customers will report being satisfied
 95% will be error free
 90% of services will be within +/- 2% of comparable service within the private sector

Performance Measurement
 Administration of customer satisfaction surveys
 Tracking number of jobs, error rates, average per job
 Cost comparison to private sector services
 Quarterly and annual reports summarizing services provided, outputs, and outcome achievement
Remember
 Quantify objectives
 Associate objectives with an outcome
 Word outcomes the same as objectives
 Provide a complete Family of Measures
 Avoid confusing indicator types (e.g., efficiency and service quality)
 Reference the correct baseline and target year for each objective
 Define service areas by program
objective/customers rather than process function
Part III

Benchmarking
Definition
“Formal benchmarking is the continuous, systematic process of measuring and assessing products, services, and practices of recognized leaders in the field to determine the extent to which they might be adapted to achieve superior performance.”

Benchmarking & Best Practices, Treasury Board of Canada
Another Definition

“Benchmarking is the practice of being humble enough to admit that someone else is better at something and wise enough to try and learn how to match and even surpass them at it.”
Types of Benchmarking
 Internal – commonly one year
compared to a previous year’s
performance
 External – your performance compared
to another similar organization
 Operational – your recent annual or
periodic performance
 Strategic – long term performance
Internal Operational
 Probably the most common measure at
the local government level
 Measures current performance versus
an established benchmark from prior
performance
 Example: Police Dept. criminal cases
closed this year versus the average
over the past ten years
External Operational
 Measurements against other organizations’ performance
 Example:

[Bar chart: total structure fire incidents per 10,000 population, comparing two data series for Whitewater, Platteville, Janesville, Eau Claire, and Beloit]
POLICE

[Bar chart: violent crimes reported per 1,000 population, comparing three data series]
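External comparisons only work once raw counts are normalized to a common base. A sketch of the per-10,000 calculation behind charts like those above; the city names come from the first chart, but every count and population here is invented:

```python
# Hypothetical sketch: normalizing incident counts to a per-10,000
# population rate for external benchmarking. All figures invented.
cities = {
    # city: (structure fire incidents, population)
    "Whitewater": (38, 14_000),
    "Platteville": (29, 11_000),
    "Janesville": (310, 63_000),
    "Eau Claire": (280, 66_000),
    "Beloit": (190, 37_000),
}

for city, (incidents, population) in cities.items():
    rate = incidents / population * 10_000
    print(f"{city:<12} {rate:5.1f} incidents per 10,000 population")
```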
Benchmarks
Internal Benchmarks
 Overall spending
 Growth in tax base
 Growth in income
 New home starts
 Miles within service area

External Benchmarks
 Private sector wages
 Neighboring cities
 Similar sized counties
 Statewide groupings
 Statewide averages
Benchmark Standards
 Program dollars spent per capita
 Spending per $1,000 property assessment
 Percentage growth over time
 Adjustments for inflation
 Other specific service standards
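The first two standards, and the inflation adjustment, are straightforward ratios. A Python sketch in which every figure is invented:

```python
# Illustrative sketch, all figures invented: common benchmark standards.
spending = 2_450_000             # program dollars spent this year
population = 48_500              # service-area population
assessed_value = 3_900_000_000   # total property assessment ($)

print(f"Per capita: ${spending / population:,.2f}")
print(f"Per $1,000 assessed: ${spending / (assessed_value / 1_000):.2f}")

# Percentage growth over time, adjusted for inflation:
prior_spending = 2_300_000
inflation = 0.031                # assumed price change between the years
real_growth = (spending / (prior_spending * (1 + inflation)) - 1) * 100
print(f"Real (inflation-adjusted) growth: {real_growth:.1f}%")
```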
Setting Targets
 Benchmarking
 National standards
 Mandates
 Board direction
 Past Performance
 Internal goals
 Citizen demands
Part IV

Performance Based Budgeting (PBB)
Performance Based Budgeting
(PBB)
 Performance-based budgeting relies on:

1. Strategic planning
2. Operational planning
3. Performance accountability
4. A realistic performance measurement system

to build budgets.
Performance-based Budgeting (PBB)
Performance-based budgets focus on “return on
investment”—that is, what do we get for our investment of
resources?
 Basic service level (or continuation of basic services)?
 Increased services (more services to same recipients or
expansion of same services to more recipients)?
 Better (higher quality) services?
 More efficient services (cost savings in service
delivery)?
 Mitigation or resolution of a problem?
Performance-based Budgeting
(PBB)
“PBB is budgeting for results - with an eye on the price tag”

 PBB emphasizes program effectiveness and bases decision making (whether for continuation or enhancement of a program) on outcomes.

 However, the costs of achieving those outcomes must be scrutinized to ensure efficient service delivery and maximize allocation of scarce resources.
Goal Setting
A SMART goal is defined as such:

 Specific – Is the goal clear and to the point?
 Measurable – Can you tell if it is accomplished?
 Attainable – Is it a realistic goal?
 Relevant – Is it a priority of the organization?
 Trackable – Can results be compared over time?
SMART Examples
 Yes: To respond to all fire calls within the city within 7 minutes of dispatch
 No: To protect all property within the city to a high level of safety

 Yes: To process all building permit requests within 48 hours of application
 No: To process all building permit requests in the shortest time possible
Rudimentary PBB
A rudimentary form of PBB to be implemented until a
formal system can be produced could include the
following in each department’s budget request:
 An explanation of the department’s overall goals
 An explanation of what the department has accomplished in
the past year
 An explanation of what the department intends to
accomplish in the coming year
 An explanation as to what is different from last year in the
proposed budget and why
 A GASB compliant budget showing past year budget
expenditures
Performance-based Budgeting
(PBB)
Program Structure
 Strategic plans, operational plans, and performance
based budgets are geared to program structures
 Funds are appropriated to departments/programs
 A program is a grouping of activities directed toward
the accomplishment of a clearly defined objective or
set of objectives
 Program structure is an orderly, logical array of
programs and activities that indicates the
relationship between each
TOP TEN REASONS WHY PERFORMANCE-BASED
BUDGETING WON’T WORK IN MY DEPARTMENT

10. It doesn’t matter what we do because we have federal/state funding.
9. We just reorganized and we don’t know what we’re doing yet.
8. Everything is just fine as it is; we’ve always done it this way.
7. We’re too busy getting REAL work done to bother with this.
6. We need more staff, more money, more time, more (fill in the blank) to do this.
5. We can’t target outcomes; they’re too specific.
4. We can’t measure what we do.
3. You’ll misinterpret any information we give you.
2. We can’t be accountable because we have no control over anything.
1. We’re different. This shouldn’t apply to us. We need an exemption.
 OFTEN FOLLOWED BY:
All right, just give me a form and tell me what you want me to say.

If I give them something, then they’ll go away. (Maybe this whole thing will just go away.)

NOTE: This genie won’t go back into the lamp.
INSTITUTIONALIZING PERFORMANCE
MEASUREMENT:
Make performance an integral part of your management processes.

 Use metrics to understand and measure how a process works and the results it generates.

 Develop an internal performance accountability process.

 Integrate performance into policy and budget decision making and everyday program management.
INSTITUTIONALIZING PERFORMANCE
MEASUREMENT:

POINTER: When you break down a policy, program, or process into its component parts, you use "systems logic" to develop a model of how it should work:

INPUT -> PROCESS -> OUTPUT & OUTCOME
(with QUALITY and EFFICIENCY measured across the model)

Managers should use metrics to gauge and assess programs and processes, diagnose problems, and formulate solutions.
Example

Service Area: Fire Suppression
Objective: to maintain fire loss at 0.02% or less of Total Property Valuation
Input: $249,000
Output: 77 incidents responded to
Efficiency: $3,234 average cost per response
Service Quality: 7.3 minutes average response time
Outcome: 0.027% fire loss percent
Example 2

Service Area: Street Reconstruction (Capital Facilities)
Objective: maintain construction cost growth to no more than 5 percent
Input: $1,374,500 budgeted/actual costs; staff
Output: 4 projects completed
Efficiency: 4.7% engineering design costs as a percent of total project cost
Service Quality: 75% of projects completed on time
Outcome: 7% contract cost growth
INSTITUTIONALIZING PERFORMANCE
MEASUREMENT:

Metrics (performance indicators) measure process and product.

Inputs (demand, need, size of problem, resources) -> Process -> Outputs & Outcomes (products, services, results)

Outputs / Inputs: expenditures compared to productivity; caseload per staff member

Cost / (Outputs or Outcomes): efficiency; cost per item produced, service provided, or client served; cost per result achieved

(Outputs or Outcomes) / Time: production or turnaround time; timeliness of results

Quality: effectiveness in meeting the expectations of customers, stakeholders, and other expectation groups
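These ratio families can be read as one-line functions. A minimal Python sketch; the sample figures are invented:

```python
# Minimal sketch of the three ratio families; sample figures invented.
def outputs_per_input(outputs, inputs):
    """Productivity: e.g., caseload per staff member."""
    return outputs / inputs

def cost_per_result(cost, results):
    """Efficiency: cost per item produced, client served, or result achieved."""
    return cost / results

def results_per_time(results, periods):
    """Timeliness: production or turnaround per unit of time."""
    return results / periods

print(outputs_per_input(480, 12))     # 40.0 cases per staff member
print(cost_per_result(96_000, 480))   # 200.0 dollars per case
print(results_per_time(480, 250))     # 1.92 cases closed per workday
```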
INSTITUTIONALIZING PERFORMANCE
MEASUREMENT:

The volume of performance information that must be managed can be staggering.

Watch out for the “shotgun” or “kitchen sink” approach--reporting just about every type of measurement or statistic that is already gathered or can be counted easily. This leads to a heavy emphasis on transactional data--inputs and outputs--rather than results.
INSTITUTIONALIZING PERFORMANCE
MEASUREMENT:
 Concentrate on the development of balanced sets of performance indicators in order to provide a clear picture of performance without overwhelming users with needless detail

 Five types of performance indicators:
 Input
 Output
 Outcome
 Efficiency
 Quality

 A balanced set may include more than one of any indicator type and none at all of some, but must have at least one measure of outcome, efficiency, or quality
INSTITUTIONALIZING PERFORMANCE
MEASUREMENT:

Present performance information at different levels in order to surface key data while maintaining the availability of support and explanatory material.
 Key Performance Indicators
 Supporting Performance Indicators
 General Performance Information
 Explanatory Notes

 Get consensus among data users on indicator types and levels before indicators are reported.
INSTITUTIONALIZING
PERFORMANCE MEASUREMENT:

A key performance indicator is a performance indicator that is included in the Budget Supporting Documents

Factors in determining key level include:
 Most direct measure of outcome?
 Critical success factor?
 Big ticket item?
 Hot button item?
 History and who values it?
INSTITUTIONALIZING
PERFORMANCE MEASUREMENT:

General performance information (GPI) may be included in the budget. However, values for general performance indicators are reported for prior-year actuals only.

GPI may include:
 Multi-year histories or trends
 External comparisons (national or regional)

MANAGING ACCURACY:

Beware of:

High balls and low balls (unrealistically high or low performance targets)

Instant replays (reporting the same performance level over and over, regardless of circumstances)

Greased pigs (indicators whose name, definition, or method of calculation changes so often that you can’t get a handle on them)
Beware of: (cont.)

Orphans (indicators for which no one claims responsibility)

Statistical illiteracy (calculations that don’t add up)

Limp excuses (meaningless explanations of performance variances)
MANAGING ACCURACY:
 Get the right start by developing meaningful, valid, accurate, and reliable performance indicators
 Provide documentation for each performance indicator identified in the strategic plan
 Strategic planning guidelines include performance indicators
INTEGRATING PERFORMANCE INTO
BUDGET DECISION MAKING:
 Establish the link between resources and
results early and maintain that link through
budget development, appropriation, and
budget control processes.
Set performance standards linked to appropriation levels

• Performance standards are the expected levels of performance associated with a performance indicator for a particular period and funding level. They link dollars and results.

• Performance standards are one way to demonstrate RETURN ON INVESTMENT--what we can expect to receive for our money (easier to explain to stakeholders)
Integrating Performance into budget
decision-making

Establish the link between resources and results early and maintain that link through budget development, appropriation, and budget control processes.
 During budget development, performance indicator values
associated with the funding level recommended in budget
discussions are proposed performance standards
 During the budget process, performance indicator values become
performance standards linked to the funding amounts actually
appropriated in the budget
 Performance standards may be modified only through approved
processes
 Performance standards are monitored and tracked
References

“Performance Based Budgeting – Putting the Pieces Together,” Carolyn S. Lane, Deputy Director, Office of Planning and Budget, Division of Administration, State of Louisiana, September 2006

“Performance Management: Using Performance Measurement for Decision Making,” Recommended Practice (2002 & 2007), Government Finance Officers Association (GFOA)

“Fairfax County’s Performance Measurement System,” Performance Measurement Team, Dept. of Management & Budget, Fairfax County, Virginia, June 2006

“Performance Management Handbook,” Eau Claire County, WI, January 2007

“Moving From Line Item to Performance Based Budgeting,” Craig Maher, UW Oshkosh
