
OPHA 2004

Toronto, Ontario

Benchmarking
Mini-Workbook

Charlene Beynon
Monique Stewart
OPHA 2004

If We Knew Then What We Know Now:


Benchmarking in the Real World

Dear Colleague:

The enclosed resources represent “work in progress” and are based on our “lived experiences”
from a number of public health benchmarking projects – if only we knew then what we know
now. We wish to express our sincere thanks and appreciation to the colleagues who have
participated in the benchmarking projects and have assisted us in articulating many “lessons
learned” and in developing these resources.

Our intent is to offer resources and practical tools that will make the implementation of
benchmarking in public health more meaningful, effective, efficient and successful in identifying
“best practices”.

We do hope that this Mini-Workbook will:


• enhance your understanding of benchmarking
• encourage you to explore the application of benchmarking in your work setting
• assist you in seeking out benchmarking partners
• highlight common pitfalls and identify strategies to lessen their potential impact
• encourage you to share your results on the Ontario Public Health Benchmarking
Partnership Website (www.benchmarking-publichealth.on.ca)

We welcome your ongoing comments and feedback as we move closer to identifying “best
practices” in using benchmarking to enhance the practice of public health. We invite you to
share your experiences. May we move closer to realizing the full potential of this quality
improvement tool.

Happy benchmarking!

Until next time,

Charlene Beynon
PHRED Program Director
Middlesex-London Health Unit
50 King Street
London, Ontario N6A 5L7
Phone: 519-663-5317 ext. 2484
Email: cbeynon@uwo.ca

Monique Stewart
PHRED Program Director
Ottawa Public Health
495 Richmond
Ottawa, Ontario K2A 4A4
Phone: 613-724-4122 ext. 23467
Email: monique.stewart@ottawa.ca
Benchmarking Mini-Workbook

Table of Contents

Presentation: If We Knew Then What We Know Now: Benchmarking in the Real World
Benchmarking Code of Conduct
Ontario Public Health Benchmarking Partnership: Benchmarking at a Glance
Deciding What to Benchmark
A Benchmarking Checklist
Prioritizing Program Components for Benchmarking
Recommended Resources
Tools
    Benchmarking Work Sheets
    Selecting Our Options/Increasing the Odds
    Assessing Feasibility of Success
    Draft Project Discussion Points
    Southwest Benchmarking Feasibility Assessment
Personal Notes


55th Annual Ontario Public Health Association Conference

If Only We Knew Then What We Know Now: Benchmarking in the Real World
• Charlene Beynon, Middlesex-London PHRED Program
• Monique Stewart, Ottawa PHRED Program
• Michelle Sangster Bouck, Middlesex-London PHRED Program

Benchmarking
• The process of identifying, sharing, understanding, learning from and adapting outstanding or best practices from organizations anywhere in the world in the quest for continuous improvement and breakthroughs. (APQC Benchmarking Code of Conduct, 2002)

Another Definition
• The process of consistently researching for new ideas for methods, practices, processes
• Either adopting the practices or adapting the good features and implementing them to become the best of the best. (Balm, 1992)

Learning Objectives
Thinking Smart
• Critically reflect on current practice, and incorporate research and evaluation findings to improve client services
Ends in View
• Promote benchmarking
• Explore common pitfalls and Critical Success Factors

Ontario Public Health Benchmarking Partnership
A collaborative initiative involving:
• Public Health Research, Education & Development (PHRED)
• The Association of Local Public Health Agencies (alPHa)
• The Ontario Council on Community Health Accreditation (OCCHA)

Goal of Benchmarking
• Capture comparable data in order to draw meaningful comparisons of performance between organizations, for the purpose of inspiring improvement and evaluating performance.

Pilot Projects (1998-1999)
• Pilot projects were done to address the following:
  – What is benchmarking?
  – What is the relevance of benchmarking in public health?
• 3 pilot projects:
  – Immunization Record Processes
  – Food Premises Inspection
  – Partner Notification for Chlamydia

Relevance to Public Health
• improves quality of service and program delivery
• eliminates the tendency to re-invent the wheel through recognition and sharing of information
• cost savings from improved practices (financial savings)
• supports creative initiatives
• facilitates communication, team building & networking
• promotes accountability
Lessons Learned from Pilot Projects
• Requires many steps, patience and commitment
• Data collection not standardized
• Use data that are available and easily retrievable
• Keep the indicators “simple”
• Beware of seasonal realities
• Remember context
• Resource intensive
• Determining best practices is challenging!

Public Health Benchmarking Web Site
www.benchmarking-publichealth.on.ca
• Web-based
• Health Units enter their own data
• Select comparator Health Units
• Select basis for comparison
• Program picks three best
• Can browse through practices related to different indicators

Survey of Pilot Project Participants
• Overall, participants were very positive
• Acknowledged role of PHRED and OPHBP in providing support, expertise & coordination
• Benchmarking process created networking opportunities
• Need to keep the benchmarking process simple
• Some specific program changes were made, or a conscious decision was made not to change
• 2 areas of concern: anonymity & comparability
• Participating in projects had a positive influence on participants’ practice

Benchmarking Projects
• 9 benchmarking projects completed or in progress:
  – 3 Pilot Projects
  – Breastfeeding Supports
  – Heart Health Coalitions
  – School Health
  – Universal Influenza Immunization
  – Dental Screening
  – West Nile Virus

Participation of Health Units
Benchmarking Project Development
• 32 Health Units have participated in at least one project, and 24 have participated in more than one
Benchmarking Website
• 34 Health Units have completed at least one on-line survey, and 28 have completed more than one

A Case Study: Dental Screening Benchmarking Investigation

Our Report
Dental Benchmarking Project: Report 1: Descriptive Characteristics of Dental Screening Programs in 10 Ontario Public Health Units
www.benchmarking-publichealth.on.ca
www.phred-redsp.on.ca

10 Dental Sites
• Haliburton-Kawartha-Pine Ridge
• Hamilton
• Hastings & Prince Edward
• Leeds-Grenville-Lanark
• Middlesex-London
• Niagara
• Peel
• Simcoe
• Waterloo
• Wellington-Dufferin-Guelph
Our Middlesex-London Project Team
• Charlene Beynon
• Joan Carrothers
• Meizi He
• Bernie Lueske
• Ruth Sanderson
Consultant: Monique Stewart, Ottawa PHRED Program

Strategic Questions
• Necessary prerequisites
• Advancing benchmarking in public health

If we knew then what we know now – GO for it!

The Lived Experience
From the eyes of the project team:
• focus and limit scope
• take time to develop a project plan
• know the program
• identify a few key indicators
• critically examine the data information systems

Key Questions
• What is the performance issue?
  – Is benchmarking the right tool?
  – What level of benchmarking is required?
• Does the issue justify the investment?
• Is there sufficient variability?
• Is there a commitment to change practice?

And More Lived Experiences
• ensure availability of quality data
• develop a data collection tool “fit for the task”
• develop your analysis plan
• provide specific directions re data extraction
• pilot / revise tools
• view benchmarking as an iterative process

A Closer Look
• Identify the benchmarking question
• Estimate time required
• Build the project team
• Recruit partners to ensure comparability
• Assess data availability/quality

Dental Screening Benchmarking Outputs
• rich description
• reflection & dialogue
• learning about the process and the program
• local program changes
Best Practices: Definition
• processes that yield better outcomes (effectiveness)
• less resources, not more (efficiency)
Question: Why are they successful?

The Question: Is There a Better Way?
• collect the indicator data first?
• then collect the practice data based on indicator results?
• need to know vs nice to know

Best Practices: Caveats
• misnomer – unlikely there is one best practice
• prerequisite practices
• importance of comparators
• customization
• time limited

Best Practices: What Is the Benchmark?
• Is there an “industry” standard?
• data sources, e.g. RRFSS, CCHS
• literature
• expert opinion

Best Practices: For Next Time
• identify at the outset how you intend to define best practices
• will the indicators and data support the definition?

Key Messages
• Benchmarking is a quality improvement tool that can promote exemplary performance and demonstrate accountability
• Success is dependent on many factors
• Comparable quality data is a prerequisite!
• Potential is tremendous

For Further Information:
Charlene Beynon, Middlesex-London PHRED Program, 519-663-5317 ext. 2484, cbeynon@uwo.ca
Monique Stewart, Ottawa PHRED Program, 613-724-4122 ext. 23467, monique.stewart@ottawa.ca
ONTARIO PUBLIC HEALTH BENCHMARKING PARTNERSHIP
(OPHBP)

The Ontario Public Health Benchmarking Partnership is a collaborative initiative of the Public
Health Research, Education & Development (PHRED) Programs, the Association of Local
Public Health Agencies (alPHa), and the Ontario Council on Community Health Accreditation
(OCCHA).

MISSION: To promote and facilitate the development and sustainability of
benchmarking in Ontario public health units.

BENCHMARKING: The process of consistently researching for new ideas for
methods, practices, and processes; and of either adopting
the practices or adapting the good features, and
implementing them to become the best of the best. (Balm,
1992, p. 16)

CODE OF CONDUCT
PURPOSE: To address the appropriate behaviour for all participants involved in
benchmarking in Ontario public health units through the Ontario Public Health
Benchmarking Partnership (OPHBP).

To contribute to efficient, effective and ethical benchmarking, public health units and all other
individuals /partners agree to abide by the following principles for benchmarking:

1. Principle of Information Exchange:
Be willing to provide the same level of information that you request, in any benchmarking
exchange.
2. Principle of Use:
Use information obtained through the benchmarking partnership only for the purpose of
improving operations within the partnering health units themselves. External use or
communication of a benchmarking partner’s name with their data or results and/or
observed practices requires the permission of that health unit.
3. Principle of Preparation:
Demonstrate commitment to the efficiency and effectiveness of the benchmarking process
with adequate preparation at each process step.
4. Principle of Contact:
Initiate contacts, whenever possible, through the benchmarking contact person designated
by the participating health unit /organization. Obtain mutual agreement with the contact on
any hand-off of communication or responsibility to others. Obtain a partner’s permission
before providing their name in response to a contact request.

5. Principle of Confidentiality:
Treat any benchmarking interchange as something confidential to the individuals and
organizations involved. Information obtained must not be communicated outside the
Ontario Public Health Benchmarking Partnership without prior consent of participating
benchmarking partners. An organization’s participation in a benchmarking project should
not be communicated to any third party without their permission.

ETIQUETTE AND ETHICS:


The Ontario Public Health Benchmarking Partnership believes participation in the
benchmarking process is based on openness and trust. The following guidelines apply to all
participants in the benchmarking process:
• Treat information obtained through participation in the benchmarking process as internal
privileged information.
• Do not disparage a participating health unit’s business or operations to a third party.
• Enter into each benchmarking project with a mutual goal of improvement.

BENCHMARKING PROTOCOL
• Know and abide by the Benchmarking Code of Conduct.
• Have basic knowledge of benchmarking and follow the benchmarking process.
• Have determined what to benchmark, identified key performance variables, recognized
superior performing partners and completed a rigorous self-assessment.
• Have developed a questionnaire and will share this in advance if requested.
• Have the authority to share information.
• Work through a specified host and mutually agree on scheduling and meeting
arrangements.
Face-to-face meeting guidelines:
• Provide meeting agenda in advance.
• Be professional, honest, courteous and prompt.
• Introduce all attendees and explain why they are present.
• Adhere to the agenda; maintain focus on benchmarking issues.
• Use language that is universal.
• Do not share proprietary information without prior approval from the proper authority of
all participants.
• Share information about your process(es) if asked, and consider sharing study results.

SUMMARY: BENCHMARKING CODE OF CONDUCT


Be willing to give what you get.
Respect confidentiality.
Keep information internal.
Don’t refer without permission.
Be prepared at initial contacts.
Source/Adapted from: The Electronic College of Process Innovation. The Benchmarking Code of Conduct.
(http://www.dtic.mil/c3i/bpred/0057.htm)
Reference: Balm, G.J. (1992). Benchmarking: A practitioner’s guide for becoming and staying the best of the
best. Schaumburg, Illinois: QPMA Press.

ONTARIO PUBLIC HEALTH BENCHMARKING PARTNERSHIP
BENCHMARKING AT A GLANCE

Benchmarking:
the process of consistently researching for new ideas for methods, practices, processes; and of either
adopting the practices or adapting the good features and implementing them to become the best of the best.
(Balm 1992, p. 16)

A Seven Step Benchmarking Process:


• Identify what needs to be benchmarked
• Determine performance measures, collect and analyze internal data
• Select benchmarking partners, e.g. comparator health units
• Access data from benchmarking partners
• Identify best practices and determine performance gaps
• Implement an action plan
• Monitor results and recalibrate benchmarks
. . . and the cycle continues (Sales & Stewart, Benchmarking Tool Kit, 1998)
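To make the cycle concrete, the sketch below walks through one pass with invented numbers. It is an illustration only, not an OPHBP tool; every value and unit name in it is hypothetical.

```python
# A minimal sketch of one pass through the seven-step cycle above,
# using invented indicator values for illustration only.

internal_rate = 72.0          # step 2: your unit's indicator, e.g. % of target screened
partner_rates = {             # step 4: data from comparator health units (fictitious)
    "Unit A": 81.0,
    "Unit B": 68.0,
    "Unit C": 77.5,
}

# Step 5: the best observed result becomes the working benchmark;
# the performance gap is your distance from it.
benchmark_unit = max(partner_rates, key=partner_rates.get)
benchmark = partner_rates[benchmark_unit]
gap = benchmark - internal_rate

print(f"Benchmark: {benchmark}% ({benchmark_unit})")
print(f"Performance gap: {gap:+.1f} percentage points")

# Steps 6-7: implement an action plan, monitor results, then recalibrate
# the benchmark as partners improve . . . and the cycle continues.
```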

What factors influence your decision to initiate a benchmarking investigation?

Some factors to consider:


• Is this program “benchmarkable”? Can you map out program components?
• Are other “organizations” experiencing better results?
• Do the differences in “results” merit the investment of resources?
• Is this “issue” an organizational priority? Are there more urgent priorities?
• Is there political/community pressure to “do things differently”?
• What question(s) do you want answered? You need to be specific.
• What processes are used by other sites that achieve better results? (It is important to
know your process – sometimes articulating your process will highlight where changes
are indicated.)

Other issues to explore:


What level of data collection is needed?
• Is the information you need to answer your question(s) already available, e.g. from
literature searches or systematic reviews? If so, are the findings transferable to your
setting, or are modifications required? Do you have enough information to make and
support a recommendation? If so, develop, implement and evaluate an action plan.

• OR do you need to initiate a more comprehensive benchmarking investigation?

Additional factors to consider:


Determine organizational readiness to proceed.
Is there:
• management and organizational support, including commitment, dedicated time and
budget?
• involvement of front line staff?
• a benchmarking team with the necessary commitment and expertise?
• an environment conducive to change, i.e. depending on the findings, are you willing to
introduce a new program or to change a program component?
• a reasonable timeline?

If so, initiate a benchmarking investigation – refer to the above seven-step process.

Remember to keep it manageable.

Factors to consider when selecting benchmarking partners:

Look for organizations that are:


• recognized for their expertise
• similar1 to yours, e.g. serve a similar population (rural/urban, age distribution,
socioeconomic status, staffing mix, geography, etc.)
• accessible, e.g. for site visits, conference calls
• willing to share information and to participate
1. A partner can be from a different sector but offer a similar service, e.g. community health centre, call centre.

The number of partners will depend on your benchmarking question, timeline and resources available.

For Additional Information - see the Benchmarking Tool Kit

References:
Balm, G. J. (1992). Benchmarking: A practitioner’s guide for becoming and staying best of the best.
Schaumburg, Illinois: QPMA Press
Sales, P. D., & Stewart, P. J. (1998). Benchmarking Tool Kit: a blueprint for public health practice.
PHRED Program: Middlesex-London Health Unit, Ottawa-Carleton Health Department.
Source:
Beynon, C. (1999). Benchmarking. From Regional Workshops, Public Health Needs, Effective
Interventions, Benchmarking: Implications for Public Health Units
Recommended Readings:
Ammons, D. N. (2001). Municipal benchmarks: Assessing local performance and establishing
community standards. Thousand Oaks, California: Sage Publications, Inc.
Keehley, P., Medlin, S., MacBride, S., and Longmire, L. (1997). Benchmarking for best practices in the
public sector: Achieving performance breakthroughs in federal, state and local agencies. San
Francisco: Jossey-Bass.

November 7, 2001/Reviewed: November 2004 Charlene Beynon, Middlesex-London Health Unit

Deciding What to Benchmark

The process should:

be meaningful
be highly visible
be resource intensive
have a history of problems
have the opportunity to improve
have an environment conducive to change
be understood
support the agency mission, vision and strategic directions
need ideas from other sources to be improved

Is this process worth benchmarking at this time?

– Keehley, 1997, pp. 87-88

A Benchmarking Checklist

Before embarking on a benchmarking investigation assess the following and determine the
likelihood of a successful outcome.

Key Decision Points (answer Yes or No):

1. Can a S.M.A.R.T.1 benchmarking question be identified?

2. Is there a measure to determine what is “best”?

3. Is performance improvement possible?

4. Is there a willingness to make changes?

5. Is there a sustained commitment to participate for the project’s duration?

6. Can a detailed program description, addressing how the program is delivered, be
articulated for each comparator’s program?

7. Can a few key indicators (effectiveness and efficiency) be identified?

8. Is the required data available?

9. If data are not available, is it feasible with current resources to retrieve or collect them?

10. Can data collection be standardized?

11. Is benchmarking expertise available?

12. Are there sufficient resources, e.g. staffing, budget, etc., to sustain the project?

1. S = Specific, M = Measurable, A = Achievable, R = Realistic, T = Timeline

This Checklist has not been validated. It has been developed from experiences gained from
benchmarking public health programs and is intended to stimulate strategic dialogue and
decision-making with an overall goal of increasing the likelihood of a successful outcome.
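In that spirit, a project team could tally the twelve decision points informally before committing. The snippet below is a minimal sketch with made-up answers; the idea of flagging any “No” for discussion is an illustration of ours, not part of the Checklist itself.

```python
# Informal tally of the twelve Key Decision Points above.
# The answers shown are invented; any "No" is flagged for discussion.

answers = {
    "S.M.A.R.T. benchmarking question identified": True,
    "Measure to determine what is 'best'": True,
    "Performance improvement possible": True,
    "Willingness to make changes": True,
    "Sustained commitment for the project's duration": True,
    "Detailed program description for each comparator": False,
    "A few key indicators identified": True,
    "Required data available": False,
    "Feasible to retrieve or collect missing data": True,
    "Data collection can be standardized": True,
    "Benchmarking expertise available": True,
    "Sufficient resources to sustain the project": True,
}

risks = [point for point, yes in answers.items() if not yes]
print(f"{len(answers) - len(risks)} of {len(answers)} decision points answered 'Yes'")
for point in risks:
    print(f"Discuss before proceeding: {point}")
```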

For further information contact:


Charlene Beynon
Middlesex-London Health Unit
cbeynon@uwo.ca
519-663-5317 ext. 2484

Prioritizing Program Components
for Benchmarking

Criterion: Ease of data collection
Rationale: New data collection for the sole purpose of benchmarking is a barrier to
implementation.
Scoring: + no new data collection required / 0 minor effort will likely be required /
- substantial effort required for new data collection

Criterion: Data collection tool
Rationale: Drafting and piloting measurement tools has the potential to be resource and
time consuming.
Scoring: + data collection tool(s) available / 0 minor tool creation required /
- substantial effort likely required to develop tools

Criterion: Sufficient program implementation
Rationale: Cannot benchmark an activity that is not being done.
Scoring: + component actively implemented / 0 some implementation /
- little or no implementation

Criterion: Sufficiently similar program implementation
Rationale: Mandatory Program requirements leave discretion regarding approaches;
processes need to be sufficiently similar to allow comparison.
Scoring: + programs highly similar / 0 similar / - large divergence in approaches

Criterion: Resource commitment
Rationale: The more resources are committed to a component, the more likely a health
unit is to engage in a benchmarking effort.
Scoring: + significant resources / 0 moderate resources / - little or no resources committed

Source: Ontario Public Health Benchmarking Partnership
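One informal way to apply the rubric is to score each candidate component +1, 0 or -1 on every criterion and compare totals. The sketch below does exactly that with invented component names and scores; the rubric itself does not prescribe summing, so treat the totals as a conversation starter rather than a verdict.

```python
# Hypothetical scoring of candidate program components against the five
# criteria above, using +1 / 0 / -1 to mirror the rubric's +/0/- scale.

CRITERIA = [
    "ease of data collection",
    "data collection tool",
    "sufficient implementation",
    "sufficiently similar program",
    "resource commitment",
]

# Invented example components and scores, for illustration only.
components = {
    "Screening outreach": [+1, 0, +1, +1, 0],
    "Referral follow-up": [0, -1, +1, 0, +1],
    "Record keeping":     [-1, -1, 0, +1, 0],
}

# Rank components by total score, highest first.
for name, scores in sorted(components.items(),
                           key=lambda item: sum(item[1]), reverse=True):
    detail = ", ".join(f"{c}: {s:+d}" for c, s in zip(CRITERIA, scores))
    print(f"{name}: total {sum(scores):+d} ({detail})")
```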

RECOMMENDED RESOURCES

www.benchmarking-publichealth.on.ca

BENCHMARKING:

Ammons, D.N. (2001). Municipal benchmarks: Assessing local performance and establishing
community standards. Thousand Oaks: Sage Publication Inc.

Ammons, D.N. (1996). Municipal benchmarks: Assessing local performance and establishing
community standards. Thousand Oaks: Sage Publication Inc.
Balm, G.J. (1992). Benchmarking: A practitioner’s guide for becoming and staying best of the
best. Schaumburg, Illinois: QPMA Press.
Beynon, C. & Wilson, V. (1998). Benchmarking in Public Health: An idea whose time has
come. Public Health & Epidemiology Report Ontario, 9(7), 162-163.

Bolan, S. (2001). Competitive calibration. Computing Canada, 27(10), p. 25.

Camp, R. C. (1989). Benchmarking: The search for best practices that lead to superior
performance. Part I: Benchmarking defined. Quality Progress, January, 61-68.

Camp, R. C. (1989). Benchmarking: The search for best practices that lead to superior
performance. Part II: Key process steps. Quality Progress, February, 70-75.

Camp, R. C. (1989). Benchmarking: The search for best practices that lead to superior
performance. Part III: Why benchmark? Quality Progress, March, 61-68.

Codling, S. (1996). Best Practice Benchmarking: An International Perspective. Texas: Gulf
Publishing Company.

Dattakumar, R. & Jagadeesh, R. (2003). A review of literature on benchmarking.
Benchmarking, 10(3), 176-209.

Davies, A. J. & Kochhar, A. K. (2000). A framework for the selection of best practices.
International Journal of Operations and Production Management, 20(10), 1203.

Doebbeling, B.N., Vaughn, T.E., Woolson, R.F., Peloso, P.M., Ward, M.M., Letuchy, E., Boots-
Miller, B.J., Tripp-Reimer, T., & Branch, L.G. (2002). Benchmarking veterans affairs medical
centers in the delivery of preventive health services. Medical Care, 40(6), 540-554.

Dunkley, G., Stewart, M., Basrur, S., Beynon, C., Finlay, M., Reynolds, D., Sanderson, R., &
Wilson, V. (2001). Benchmarking in public health. Public Health & Epidemiology Report
Ontario, 12(6), 211-215.

Dunkley, G., Wilson, V., & Stewart, M. (2000). Benchmarking Pilot Project: Testing the
concept in public health. Public Health & Epidemiology Report Ontario, 11(1), 14-21.

Ellis, J., Cooper, A., Davies, D., Hadfield, J., Oliver, P., Onions, J. & Walmsley, E. (2000)
Making a difference to practice: Clinical benchmarking part 2. Nursing Standard, 14(33), 32-
35.

Fitz-Enz, J. (1992). Benchmarking best practices. Canadian Business Review, 19(4), 28-31.

Gunasekaran, A. (2001). Benchmarking tools and practices for twenty-first century
competitiveness. Benchmarking, 8(2), 86-87.

Hermann, R. C. & Provost, S. (2003). Interpreting measurement data for quality improvement:
Standards, means, norms, and benchmarks. Psychiatric Services, 54(5), 655-657.

Johnson, B., & Chambers, J. (2000). Food service benchmarking: Practices, attitudes and
beliefs of foodservice directors. The American Dietetic Association, 100(2), 175-180.

Keehley, P., Medlin, S., MacBride, S., & Longmire, L. (1997). Benchmarking for best practices
in the public sector: Achieving performance breakthroughs in federal, state and local
agencies. San Francisco: Jossey-Bass.

Loveridge, N. (2002). Benchmarking as a quality assessment tool. Emergency Nurse, 9(9),
24-29.

Mancuso, S. (2001). Adult-centered practices: Benchmarking study in higher education.
Innovative Higher Education, 25(3), 165-181.

Ossip-Klein, D.J., Karuza, J., Tweet, A., Howard, J., Obermiller-Powers, M., Howard, L., Katz,
P., Griffin-Roth, S., & Swift, M. (2002). Benchmarking implementation of a computerized
system for long-term care. American Journal of Medical Quality, 17(3), 94-102.

Sales, P.D., & Stewart, P.J. (1998). Benchmarking Tool Kit: A blueprint for public health
practice. Middlesex-London and Ottawa-Carleton Public Health Research, Education and
Development (PHRED) Programs.

Tepper, D. (2002). Benchmarking: Measuring productivity and outcomes. PT-Magazine of
Physical Therapy, 10(1), 48-52.

Vassallo, M.L. (2000). Benchmarking and evidence-based practice: Complementary
approaches to achieving quality process improvement. Seminars in Perioperative Nursing,
9(3), 121-124.

Wilson, B. & Beynon, C. (1998). Introducing benchmarking to Ontario Health Units. Public
Health & Epidemiology Report Ontario, 9(8), 183-186.

Witt, M. J. (2002). Practice re-engineering through the use of benchmarks: Part II. Medical
Practice Management, March/April, 237-242.

Zairi, M. (1998). Benchmarking for Best Practice. London: Butterworth-Heinemann.

Zairi, M. & Leonard, P. (1994). Practical Benchmarking: The Complete Guide. London:
Chapman & Hall.

PROGRAM LOGIC MODELS:

Dwyer, J. (1996). Applying a program logic model in planning and evaluation. Public Health &
Epidemiology Report Ontario, 7(2), 38-46.

Rush, B. & Ogborne, A. (1991). Program logic models: Expanding their role and structure for
program planning and evaluation. The Canadian Journal of Program Evaluation. 6(2), 95-106.

McLaughlin, J.A., & Jordan, G.B. (1999). Logic models: A tool for telling your program’s
performance story. Evaluation and Program Planning, 22(1), 1-14.

WEBSITES FOR PROGRAM LOGIC MODELS:


• www.benchmarking.co.uk
• www.bja.evaluationwebsite.org/html/roadmap/basic/program_logic_models
• http://garberconsulting.com/Program_Logic_Model.htm
• www.uottawa.ca/academic/med/epid/toolkit.htm

ONTARIO PUBLIC HEALTH BENCHMARKING PARTNERSHIP – DOCUMENTS & REPORTS:

✰ see - www.benchmarking-publichealth.on.ca (documents & reports)


• Benchmarking: Breastfeeding Support in Public Health – December 2000
• Towards Benchmarking Heart Health Coalitions: Developing A Systematic Process for
Documenting and Enriching Community/Health Unit Partnerships – April 2001
• Benchmarking in Public Health in Ontario: Web-Site User’s Manual (2nd Ed.) – July 2002

Charlene Beynon
Middlesex-London Health Unit
PHRED Program
cbeynon@uwo.ca
519-663-5317 ext. 2484

Monique Stewart
City of Ottawa Public Health & Long-Term Care Branch
PHRED Program
monique.stewart@ottawa.ca
613-724-4122 ext. 23467

November 7, 2001/Updated: November 2004

Tools

The following resources were developed from “lessons learned” from the Dental Screening
Benchmarking Investigation to assist future project teams in:
• determining if benchmarking is the tool of choice, and
• mapping out prerequisites for a successful benchmarking investigation.

These resources were used to facilitate discussion with colleagues in the Southwest to
determine the feasibility of conducting a benchmarking investigation focusing on postpartum
depression.

o Benchmarking Work Sheets


o Selecting Our Options/Increasing the Odds
o Assessing Feasibility of Success
o Draft Project Discussion Points
o Southwest Benchmarking Feasibility Assessment

We look forward to improving these resources and welcome your questions and comments.

Benchmarking Worksheets

1. Identify major components for a Program Logic Model.

2. What component do you want to benchmark? i.e. Where do you need to improve performance? And where can you
improve performance?

3. What is your benchmarking question?

We want to benchmark:

4. What is the benchmark? (i.e. What is the gold standard? What will you compare your results to in order to identify “Best
Practices”?)

5. Develop a Program Logic Model for the component to be benchmarked.

Component

Activities

Target Audience

Short Term Outcomes

Long Term Outcomes


6. Identify a maximum of 3-4 indicators

Principle: include both effectiveness and efficiency indicators.

Does it work? Is it worth doing?

Indicators Data Sources Retrievable

1.

2.

3.

4.

An indicator:
a quantifiable measure of an outcome or objective
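To show the effectiveness/efficiency pairing in action, here is a toy calculation with invented figures: coverage and referral completion ask “does it work?”, while cost per child screened asks “is it worth doing?”. None of the numbers come from an actual program.

```python
# Toy indicator calculations using invented figures, for illustration only.

children_screened = 4200      # program output
children_eligible = 5600      # target population
referrals_made = 450
referrals_completed = 310     # follow-up care actually received
program_cost = 126_000.00     # total program cost in dollars

# Effectiveness indicators: does it work?
screening_coverage = 100 * children_screened / children_eligible
referral_completion = 100 * referrals_completed / referrals_made

# Efficiency indicator: is it worth doing?
cost_per_child_screened = program_cost / children_screened

print(f"Screening coverage: {screening_coverage:.1f}%")
print(f"Referral completion: {referral_completion:.1f}%")
print(f"Cost per child screened: ${cost_per_child_screened:.2f}")
```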

For further information contact:


Charlene Beynon
Middlesex-London Health Unit
50 King Street, London, Ontario, N6A 5L7
cbeynon@uwo.ca
519-663-5317 ext. 2484

Benchmarking Worksheet
Selecting Our Options/Increasing the Odds

Program Component1:
• What is the benchmarking question?
• What is the benchmark?
• What are the indicators?
• Can we get the data?
• Other

Post Note:
The goal is to focus on only one program component. Other components can be addressed in subsequent benchmarking
investigations. If there is debate about which component should be addressed, this Work Sheet can assist in articulating which
focus is more likely to yield a successful benchmarking outcome.

November 2003 – Revised: November 2004


1. See the Program Logic Model.

Benchmarking Worksheet
Assessing Feasibility of Success

Component to be benchmarked:
Assess each of the following:
• Ease of Data Collection
• Data Collection Tools
• Sufficient Program Implementation
• Sufficiently Similar Implementation
• Resource Commitment

Postpartum Depression
Benchmarking Investigation
Draft Project Discussion Points
1. Identify major components for a Program Logic Model.
2. What component do you want to benchmark?
   Principle: select only one.
3. What benchmarking question do you want to answer?
   Principle: the question should be SMART.a
   Question: is there a benchmark?
4. Initiate discussion about how you intend to identify “Best Practices”.
5. What data is required to answer the question?
   Question: is the data easily retrievable?
6. Identify a maximum of 3-4 indicators.
   Principle: include both effectiveness and efficiency indicators.
7. Review the Benchmarking Checklist.
   Decision Point: i) continue, ii) different action, iii) stop.
8. Develop a questionnaire or other data collection tool to collect indicator data only.*
9. Pilot the questionnaire and revise as needed. Several iterations may be required.
10. Collect indicator data.
11. Review findings with practitioners and other recognized experts.
12. Collect practice data to link with best results.
13. Identify “Best Practices”.
14. Implement.
15. Evaluate.
16. Disseminate experiences.
17. Monitor results.

a = S-Specific, M-Measurable, A-Achievable, R-Realistic, T-Timeline

* Usually indicator and practice data are collected simultaneously. Collecting indicator data
first is an option that requires further study; it is presented as one way to keep the project
manageable and time limited, and is based on experiences from other projects.

Although represented as a series of linear steps, benchmarking is an iterative process.

November 2003 – Revised: November 2004

Postpartum Depression
Southwest Benchmarking Feasibility Assessment

The Question: Is benchmarking the right tool to answer the question(s) that need to be
answered?

When to Benchmark*:

1. Is there a performance issue?

2. Is someone doing better? Is it really better? Compared to what?

3. Can we identify and compare practices?

4. Are the practices transferable and able to be customized to other settings?

* see the “Benchmarking Checklist” and “Prioritizing Program Components for Benchmarking”

Key Phrases:
• Who is doing better at . . . (identifying women at risk for postpartum depression)?
• What practices are most effective and efficient at . . . (identifying women at risk for
postpartum depression)?
  – type of contact?
  – with what tool?
  – when?

November 2003

Personal Notes

