OPHA 2004 Toronto, Ontario

Benchmarking Mini-Workbook

Charlene Beynon
Monique Stewart

OPHA 2004

If We Knew Then What We Know Now: Benchmarking in the Real World
Dear Colleague: The enclosed resources represent “work in progress” and are based on our “lived experiences” from a number of public health benchmarking projects –If only we knew then what we know now. We wish to express our sincere thanks and appreciation to those colleagues who have participated in the benchmarking projects and have assisted us in articulating many “lessons learned” and developing these resources. Our intent is to offer resources and practical tools which will make the implementation of benchmarking in public health more meaningful, effective, efficient and successful in identifying “best practices”. We do hope that this Mini-Workbook will: • • • • • enhance your understanding of benchmarking encourage you to explore the application of benchmarking in your work setting assist you in seeking out benchmarking partners highlight common pitfalls and identify strategies to lessen their potential impact encourage you to share your results on the Ontario Public Health Benchmarking Partnership Website (www.benchmarking-publichealth.on.ca)

We welcome your ongoing comments and feedback as we move closer to identifying “best practices” in using benchmarking to enhance the practice of public health. We invite you to share your experiences. May we move closer to realizing the full potential of this quality improvement tool. Happy benchmarking! Until next time,

Charlene Beynon
PHRED Program Director
Middlesex-London Health Unit
50 King Street
London, Ontario N6A 5L7
Phone: 519-663-5317 ext. 2484
Email: cbeynon@uwo.ca

Monique Stewart
PHRED Program Director
Ottawa Public Health
495 Richmond
Ottawa, Ontario K2A 4A4
Phone: 613-724-4122 ext. 23467
Email: monique.stewart@ottawa.ca

OPHA 2004 Benchmarking Mini-Workbook
Table of Contents
Presentation: If We Knew Then What We Know Now: Benchmarking in the Real World ..... 1
Benchmarking Code of Conduct ..... 5
Ontario Public Health Benchmarking Partnership: Benchmarking at a Glance ..... 7
Deciding What to Benchmark ..... 9
A Benchmarking Checklist ..... 11
Prioritizing Program Components for Benchmarking ..... 13
Recommended Resources ..... 15
Tools ..... 19
  Benchmarking Work Sheets ..... 21
  Selecting Our Options/Increasing the Odds ..... 25
  Assessing Feasibility of Success ..... 27
  Draft Project Discussion Points ..... 29
  Southwest Benchmarking Feasibility Assessment ..... 31
Personal Notes ..... 33

55th Annual Ontario Public Health Association Conference

If Only We Knew Then, What We Know Now: Benchmarking In The Real World
Charlene Beynon, Middlesex-London PHRED Program
Monique Stewart, Ottawa PHRED Program
Michelle Sangster Bouck, Middlesex-London PHRED Program

Benchmarking
The process of identifying, sharing, understanding, learning from and adapting outstanding or best practices from organizations anywhere in the world in the quest for continuous improvement and breakthroughs.
(APQC Benchmarking Code of Conduct, 2002)

OPHA 2004 Learning Objectives
Thinking Smart
Critically reflect on current practice, and incorporate research and evaluation findings to improve client services
Ends in View
Promote benchmarking
Explore common pitfalls and Critical Success Factors

Another Definition
The process of consistently researching for new ideas for methods, practices, and processes; and of either adopting the practices or adapting the good features, and implementing them to become the best of the best.
(Balm, 1992)

Ontario Public Health Benchmarking Partnership
A collaborative initiative involving:
Public Health Research, Education & Development (PHRED)
The Association of Local Public Health Agencies (alPHa)
The Ontario Council on Community Health Accreditation (OCCHA)

Goal of Benchmarking
Capture comparable data in order to draw meaningful comparisons of performance between organizations, for the purpose of inspiring improvement and evaluating performance.

Pilot Projects (1998-1999)
Pilot projects were undertaken to address the following:
– What is benchmarking?
– Relevance of benchmarking in public health

Relevance to Public Health
improves quality of service and program delivery
eliminates the tendency to re-invent the wheel through recognition and sharing of information
cost savings from improved practices (financial savings)
supports creative initiatives
facilitates communication, team building & networking
promotes accountability

3 Pilot Projects:
– Immunization Record Processes
– Food Premises Inspection
– Partner Notification for Chlamydia

1

Lessons Learned from Pilot Projects
Requires many steps, patience and commitment
Data collection not standardized
Use data that are available and easily retrievable
Keep the indicators “simple”
Beware of seasonal realities
Remember context
Resource intensive
Determining best practices is challenging!

Public Health Benchmarking Web Site
www.benchmarking-publichealth.on.ca
Web-based
Health Units enter own data
Select comparator Health Units
Select basis for comparison
Program picks three best
Can browse through practices related to different indicators
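For readers curious about the mechanics, the “picks three best” step is essentially a ranking of comparator health units on a chosen indicator. The sketch below is a minimal illustration of that idea only; the unit names, indicator values, and the assumption that higher values are better are all invented and are not drawn from the OPHBP website.

```python
# Hypothetical sketch of the "pick three best" comparison step.
# Unit names and indicator values are invented for illustration.

indicator_results = {
    "Unit A": 0.92,  # e.g. proportion of records processed within target time
    "Unit B": 0.81,
    "Unit C": 0.88,
    "Unit D": 0.76,
}

def top_performers(results, n=3, higher_is_better=True):
    """Rank comparator health units on one indicator and return the top n."""
    ranked = sorted(results.items(), key=lambda item: item[1],
                    reverse=higher_is_better)
    return ranked[:n]

for unit, value in top_performers(indicator_results):
    print(f"{unit}: {value:.2f}")
```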

Survey of Pilot Project Participants
Overall, participants were very positive
Acknowledged role of PHRED and OPHBP in providing support, expertise & coordination
Benchmarking process created networking opportunities
Need to keep benchmarking process simple
Some specific program changes were made, or a conscious decision was made not to change
2 areas of concern: anonymity & comparability
Participating in projects had a positive influence on participants’ practice

A case study: Dental screening benchmarking investigation

Benchmarking Projects
9 benchmarking projects completed or in progress:
– 3 Pilot Projects
– Breastfeeding Supports
– Heart Health Coalitions
– School Health
– Universal Influenza Immunization
– Dental Screening
– West Nile Virus

Our Report
Dental Benchmarking Project: Report 1: Descriptive Characteristics of Dental Screening Programs in 10 Ontario Public Health Units
www.benchmarking-publichealth.on.ca
www.phred-redsp.on.ca

Participation of Health Units

Benchmarking Project Development:
32 Health Units have participated in at least one project, and 24 have participated in more than one

Benchmarking Website:
34 Health Units have completed at least one on-line survey, and 28 have completed more than one on-line survey

10 Dental Sites
Haliburton-Kawartha-Pine Ridge
Hamilton
Hastings & Prince Edward
Leeds-Grenville-Lanark
Middlesex-London
Niagara
Peel
Simcoe
Waterloo
Wellington-Dufferin-Guelph

2

Our Middlesex-London project team
Charlene Beynon
Joan Carrothers
Meizi He
Bernie Lueske
Ruth Sanderson
Consultant: Monique Stewart, Ottawa PHRED Program

Strategic Questions

Necessary prerequisites

Advancing benchmarking in public health

The Lived Experience

If we knew then, what we know now

From the eyes of the project team:
focus and limit scope
take time to develop a project plan
know the program
identify a few key indicators
critically examine the data
information systems
– GO for it!
Key Questions
What is the performance issue?
– Is benchmarking the right tool?
– What level of benchmarking is required?

And more lived experiences
ensure availability of quality data
develop a data collection tool “fit for the task”
develop your analysis plan
provide specific directions re data extraction
pilot / revise tools
view benchmarking as an iterative process

Does the issue justify the investment?
Is there sufficient variability?
Is there a commitment to change practice?

A Closer Look
Identify the benchmarking question
Estimate time required
Build the project team
Recruit partners to ensure comparability
Assess data availability/quality

Dental Screening Benchmarking Outputs
rich description
reflection & dialogue
learning about the process and the program
local program changes

3

Best Practices
Definition: processes that yield better outcomes (effectiveness) with fewer resources, not more (efficiency)
need to know vs nice to know
Question: Why are they successful?

The Question: Is there a better way?
collect the indicator data first?
then collect the practice data based on indicator results?

Best Practices
Caveats:
misnomer – unlikely one best practice
prerequisite practices
importance of comparators
customization
time limited

Key Messages
Benchmarking is a quality improvement tool that can promote exemplary performance and demonstrate accountability
Success is dependent on many factors
Comparable quality data are a prerequisite!
Potential is tremendous

Best Practices
What is the benchmark?
is there an “industry” standard?
data sources, e.g. RRFSS, CCHS
literature
expert opinion

For Further Information:
Charlene Beynon
Middlesex-London PHRED Program
519-663-5317 ext. 2484
cbeynon@uwo.ca

Monique Stewart
Ottawa PHRED Program
613-724-4122 ext. 23467
monique.stewart@ottawa.ca

Best Practices
For next time:
identify at the outset how you intend to define best practices
will the indicators and data support the definition?

4

ONTARIO PUBLIC HEALTH BENCHMARKING PARTNERSHIP (OPHBP)
The Ontario Public Health Benchmarking Partnership is a collaborative initiative of the Public Health Research, Education & Development (PHRED) Programs, the Association of Local Public Health Agencies (alPHa), and the Ontario Council on Community Health Accreditation (OCCHA).

MISSION: To promote and facilitate the development and sustainability of benchmarking in Ontario public health units.

BENCHMARKING: The process of consistently researching for new ideas for methods, practices, and processes; and of either adopting the practices or adapting the good features, and implementing them to become the best of the best. (Balm, 1992, p. 16)

CODE OF CONDUCT
PURPOSE: To address the appropriate behaviour for all participants involved in benchmarking in Ontario public health units through the Ontario Public Health Benchmarking Partnership (OPHBP).

To contribute to efficient, effective and ethical benchmarking, public health units and all other individuals/partners agree to abide by the following principles for benchmarking:

1. Principle of Information Exchange: Be willing to provide the same level of information that you request, in any benchmarking exchange.

2. Principle of Use: Use information obtained through the benchmarking partnership only for the purpose of improving operations within the partnering health units themselves. External use or communication of a benchmarking partner’s name with their data or results and/or observed practices requires the permission of that health unit.

3. Principle of Preparation: Demonstrate commitment to the efficiency and effectiveness of the benchmarking process with adequate preparation at each process step.

4. Principle of Contact: Initiate contacts, whenever possible, through the benchmarking contact person designated by the participating health unit/organization. Obtain mutual agreement with the contact on any hand-off of communication or responsibility to others. Obtain a partner’s permission before providing their name in response to a contact request.

5

5. Principle of Confidentiality: Treat any benchmarking interchange as something confidential to the individuals and organizations involved. Information obtained must not be communicated outside the Ontario Public Health Benchmarking Partnership without prior consent of participating benchmarking partners. An organization’s participation in a benchmarking project should not be communicated to any third party without their permission.

ETIQUETTE AND ETHICS:
The Ontario Public Health Benchmarking Partnership believes participation in the benchmarking process is based on openness and trust. The following guidelines apply to all participants in the benchmarking process:
• Treat information obtained through participation in the benchmarking process as internal privileged information.
• Do not disparage a participating health unit’s business or operations to a third party.
• Enter into each benchmarking project with a mutual goal of improvement.

BENCHMARKING PROTOCOL
• Know and abide by the Benchmarking Code of Conduct.
• Have basic knowledge of benchmarking and follow the benchmarking process.
• Have determined what to benchmark, identified key performance variables, recognized superior performing partners and completed a rigorous self-assessment.
• Have developed a questionnaire and will share this in advance if requested.
• Have the authority to share information.
• Work through a specified host and mutually agree on scheduling and meeting arrangements.

Face-to-face meeting guidelines:
• Provide meeting agenda in advance.
• Be professional, honest, courteous and prompt.
• Introduce all attendees and explain why they are present.
• Adhere to the agenda; maintain focus on benchmarking issues.
• Use language that is universal.
• Do not share proprietary information without prior approval from the proper authority of all participants.
• Share information about your process(es) if asked, and consider sharing study results.

SUMMARY: BENCHMARKING CODE OF CONDUCT
Be willing to give what you get. Respect confidentiality. Keep information internal. Don’t refer without permission. Be prepared at initial contacts.
Source/Adapted from: The Electronic College of Process Innovation. The Benchmarking Code of Conduct. (http://www.dtic.mil/c3i/bpred/0057.htm)
Reference: Balm, G.J. (1992). Benchmarking: A practitioner’s guide for becoming and staying the best of the best. Schaumburg, Illinois: QPMA Press.

6

ONTARIO PUBLIC HEALTH BENCHMARKING PARTNERSHIP
BENCHMARKING AT A GLANCE
Benchmarking: the process of consistently researching for new ideas for methods, practices, processes; and of either adopting the practices or adapting the good features and implementing them to become the best of the best. (Balm, 1992, p. 16)

A Seven Step Benchmarking Process:
• Identify what needs to be benchmarked
• Determine performance measures, collect and analyze internal data
• Select benchmarking partners, e.g. comparator health units
• Access data from benchmarking partners
• Identify best practices and determine performance gaps
• Implement an action plan
• Monitor results and recalibrate benchmarks . . . and the cycle continues
(Sales & Stewart, Benchmarking Tool Kit, 1998)
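The “determine performance gaps” step turns on simple arithmetic: how far is each unit’s result from the benchmark? A minimal sketch follows; the benchmark value and unit results are invented for illustration and are not from any OPHBP project.

```python
# Minimal sketch of the "determine performance gaps" step.
# The benchmark value and unit results are invented for illustration.

benchmark = 0.90  # e.g. a target proportion drawn from literature or top performers

unit_results = {"Unit A": 0.92, "Unit B": 0.81, "Unit C": 0.88}

for unit, result in unit_results.items():
    gap = benchmark - result  # positive gap = performance below the benchmark
    status = "meets benchmark" if gap <= 0 else f"gap of {gap:.2f}"
    print(f"{unit}: {result:.2f} ({status})")
```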

What factors influence your decision to initiate a benchmarking investigation?
Some factors to consider:
• Is this program “benchmarkable”? Can you map out program components?
• Are other “organizations” experiencing better results?
• Do the differences in “results” merit the investment of resources?
• Is this “issue” an organizational priority? Are there more urgent priorities?
• Is there political/community pressure to “do things differently”?
• What question(s) do you want answered? Need to be specific.
• What processes are used by other sites that achieve better results? (Important to know process – sometimes articulating your process will highlight where changes are indicated.)

Other issues to explore:
What level of data collection is needed?
• Is the information you need to answer your question(s) already available, e.g. literature searches, systematic reviews? If so, are the findings transferable to your setting or are modifications required?
• Do you have enough information to make and support a recommendation? If so, develop, implement and evaluate an action plan.

7

OR do you need to initiate a more comprehensive benchmarking investigation?

Additional factors to consider:
Determine organizational readiness to proceed. Is there:
• management and organizational support, including commitment, dedicated time and budget?
• involvement of front line staff?
• a benchmarking team with the necessary commitment and expertise?
• an environment conducive to change, i.e. depending on findings, are you willing to introduce a new program or to change a program component?
• a reasonable timeline?

If so, initiate a benchmarking investigation – refer to the above seven-step process.

Remember to keep it manageable.
Factors to consider when selecting benchmarking partners. Look for organizations that:
• are recognized for their expertise
• are similar¹ to yours, e.g. serve a similar population (rural/urban, age distribution, socioeconomic status), with a similar staffing mix, geography, etc.
• are accessible, e.g. for site visits, conference calls
• are willing to share information and to participate

¹ Can be from a different sector but offer a similar service, e.g. community health centre, call centre.

The number of partners will depend on your benchmarking question, timeline and resources available.

For Additional Information - see the Benchmarking Tool Kit
References:
Balm, G.J. (1992). Benchmarking: A practitioner’s guide for becoming and staying best of the best. Schaumburg, Illinois: QPMA Press.
Sales, P.D., & Stewart, P.J. (1998). Benchmarking Tool Kit: A blueprint for public health practice. PHRED Program: Middlesex-London Health Unit, Ottawa-Carleton Health Department.

Source: Beynon, C. (1999). Benchmarking. From Regional Workshops, Public Health Needs, Effective Interventions, Benchmarking: Implications for Public Health Units.

Recommended Readings:
Ammons, D.N. (2001). Municipal benchmarks: Assessing local performance and establishing community standards. California: Sage Publications Inc.
Keehley, P., Medlin, S., MacBride, S., & Longmire, L. (1997). Benchmarking for best practices in the public sector: Achieving performance breakthroughs in federal, state and local agencies. San Francisco: Jossey-Bass.

November 7, 2001/Reviewed: November 2004

Charlene Beynon, Middlesex-London Health Unit

8

Deciding What to Benchmark

The process should:
• be meaningful
• be highly visible
• be resource intensive
• have a history of problems
• have the opportunity to improve
• have an environment conducive to change
• be understood
• support the agency mission, vision and strategic directions
• need ideas from other sources to be improved

Is this process worth benchmarking at this time?

- Keehley, 1997, pp 87-88

9

10

A Benchmarking Checklist

Before embarking on a benchmarking investigation, assess the following and determine the likelihood of a successful outcome.

Key Decision Points (answer Yes or No):
1. A S.M.A.R.T.¹ benchmarking question can be identified?
2. There is a measure to determine what is “best”?
3. Performance improvement is possible?
4. There is a willingness to make changes?
5. There is a sustained commitment to participate for the project’s duration?
6. A detailed program description addressing how the program is delivered can be articulated for each comparator’s program?
7. A few key indicators (effectiveness and efficiency) can be identified?
8. Required data are available?
9. If data are not available, it is feasible with current resources to retrieve or collect them?
10. Data collection can be standardized?
11. Benchmarking expertise is available?
12. There are sufficient resources, e.g. staffing, budget, etc. to sustain the project?

¹ S = Specific, M = Measurable, A = Achievable, R = Realistic, T = Timeline

This Checklist has not been validated. It has been developed from experiences gained from benchmarking public health programs and is intended to stimulate strategic dialogue and decision-making with an overall goal of increasing the likelihood of a successful outcome.

For further information contact:
Charlene Beynon
Middlesex-London Health Unit
cbeynon@uwo.ca
519-663-5317 ext. 2484

11
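When several people complete the checklist, a simple tally of the twelve yes/no decision points can support a go/no-go discussion. The sketch below is illustrative only: the item labels paraphrase the decision points above, and the cut-off of nine “yes” answers is an invented threshold, not part of the (unvalidated) checklist.

```python
# Illustrative tally of the twelve checklist decision points.
# Labels paraphrase the checklist; the cut-off is arbitrary.

answers = {
    "SMART question identified": True,
    "Measure of 'best' exists": True,
    "Improvement possible": True,
    "Willingness to change": True,
    "Sustained commitment": False,
    "Program description articulated": True,
    "Key indicators identified": True,
    "Required data available": False,
    "Feasible to collect missing data": True,
    "Data collection standardized": False,
    "Benchmarking expertise available": True,
    "Sufficient resources": True,
}

yes_count = sum(answers.values())
print(f"{yes_count}/{len(answers)} decision points answered 'yes'")
if yes_count < 9:  # illustrative cut-off, not validated
    print("Consider addressing the 'no' items before proceeding.")
```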

12

Prioritizing Program Components for Benchmarking
Criteria: Ease of data collection
Rationale: New data collection for the sole purpose of benchmarking is a barrier to implementation
Scoring:
+ no new data collection required
0 minor effort will likely be required
- substantial effort required for new data collection

Criteria: Data collection tool
Rationale: Drafting and piloting of measurement tools has the potential to be resource and time consuming
Scoring:
+ data collection tool(s) available
0 minor tool creation required
- substantial effort likely required to develop tools

Criteria: Sufficient program implementation
Rationale: Cannot benchmark an activity that is not being done
Scoring:
+ component actively implemented
0 some implementation
- little or no implementation

Criteria: Sufficiently similar program implementation
Rationale: Mandatory Program requirements leave discretion regarding approaches; processes need to be sufficiently similar to allow comparison
Scoring:
+ programs highly similar
0 similar
- large divergence in approaches

Criteria: Resource commitment
Rationale: The more resources are committed to a component, the more likely participants are willing to engage in a benchmarking effort
Scoring:
+ significant resources
0 moderate resources
- little or no resources committed

Source: Ontario Public Health Benchmarking Partnership
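Teams weighing several candidate components sometimes find it helpful to turn the +/0/- ratings into a rough total. The sketch below shows one way to do that; the component names and individual scores are hypothetical, and mapping “+” to +1, “0” to 0, and “-” to -1 is our own simplification rather than part of the Partnership’s tool.

```python
# Illustrative sketch of applying the +/0/- scoring to candidate program
# components. Component names and scores are hypothetical; the criteria
# names follow the table above.

CRITERIA = [
    "ease_of_data_collection",
    "data_collection_tool",
    "sufficient_implementation",
    "similar_implementation",
    "resource_commitment",
]

# Scores per component: +1 for "+", 0 for "0", -1 for "-"
candidates = {
    "Screening follow-up": {"ease_of_data_collection": 1, "data_collection_tool": 0,
                            "sufficient_implementation": 1, "similar_implementation": 1,
                            "resource_commitment": 1},
    "Community outreach":  {"ease_of_data_collection": -1, "data_collection_tool": -1,
                            "sufficient_implementation": 0, "similar_implementation": 0,
                            "resource_commitment": 1},
}

def priority(scores):
    """Total score; a higher total suggests a more benchmarkable component."""
    return sum(scores[c] for c in CRITERIA)

for name, scores in sorted(candidates.items(),
                           key=lambda kv: priority(kv[1]), reverse=True):
    print(f"{name}: {priority(scores):+d}")
```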

13

14

RECOMMENDED RESOURCES
www.benchmarking-publichealth.on.ca

BENCHMARKING:

Ammons, D.N. (2001). Municipal benchmarks: Assessing local performance and establishing community standards. Thousand Oaks: Sage Publications Inc.

Ammons, D.N. (1996). Municipal benchmarks: Assessing local performance and establishing community standards. Thousand Oaks: Sage Publications Inc.

Balm, G.J. (1992). Benchmarking: A practitioner’s guide for becoming and staying best of the best. Schaumburg, Illinois: QPMA Press.

Beynon, C. & Wilson, V. (1998). Benchmarking in Public Health: An idea whose time has come. Public Health & Epidemiology Report Ontario, 9(7), 162-163.

Bolan, S. (2001). Competitive calibration. Computing Canada, 27(10), 25.

Camp, R.C. (1989). Benchmarking: The search for best practices that lead to superior performance. Part I: Benchmarking defined. Quality Progress, January, 61-68.

Camp, R.C. (1989). Benchmarking: The search for best practices that lead to superior performance. Part II: Key process steps. Quality Progress, February, 70-75.

Camp, R.C. (1989). Benchmarking: The search for best practices that lead to superior performance. Part III: Why benchmark? Quality Progress, March, 61-68.

Codling, S. (1996). Best Practice Benchmarking: An International Perspective. Texas: Gulf Publishing Company.

Dattakumar, R. & Jagadeesh, R. (2003). A review of literature on benchmarking. Benchmarking, 10(3), 176-209.

Davies, A.J. & Kochhar, A.K. (2000). A framework for the selection of best practices. International Journal of Operations and Production Management, 20(10), 1203.

Doebbeling, B.N., Vaughn, T.E., Woolson, R.F., Peloso, P.M., Ward, M.M., Letuchy, E., BootsMiller, B.J., Tripp-Reimer, T., & Branch, L.G. (2002). Benchmarking veterans affairs medical centers in the delivery of preventive health services. Medical Care, 40(6), 540-554.

Dunkley, G., Stewart, M., Basrur, S., Beynon, C., Finlay, M., Reynolds, D., Sanderson, R., & Wilson, V. (2001). Benchmarking in public health. Public Health & Epidemiology Report Ontario, 12(6), 211-215.

Dunkley, G., Wilson, V., & Stewart, M. (2000). Benchmarking Pilot Project: Testing the concept in public health. Public Health & Epidemiology Report Ontario, 11(1), 14-21.

15

Ellis, J., Cooper, A., Davies, D., Hadfield, J., Oliver, P., Onions, J. & Walmsley, E. (2000). Making a difference to practice: Clinical benchmarking part 2. Nursing Standard, 14(33), 32-35.

Fitz-Enz, J. (1992). Benchmarking best practices. Canadian Business Review, 19(4), 28-31.

Gunasekaran, A. (2001). Benchmarking tools and practices for twenty-first century competitiveness. Benchmarking, 8(2), 86-87.

Herman, R.C. & Provost, S. (2003). Interpreting measurement data for quality improvement: Standards, means, norms, and benchmarks. Psychiatric Services, 54(5), 655-657.

Johnson, B., & Chambers, J. (2000). Food service benchmarking: Practices, attitudes and beliefs of foodservice directors. The American Dietetic Association, 100(2), 175-180.

Keehley, P., Medlin, S., MacBride, S., & Longmire, L. (1997). Benchmarking for best practices in the public sector: Achieving performance breakthroughs in federal, state and local agencies. San Francisco: Jossey-Bass.

Loveridge, N. (2002). Benchmarking as a quality assessment tool. Emergency Nurse, 9(9), 24-29.

Mancuso, S. (2001). Adult-centered practices: Benchmarking study in higher education. Innovative Higher Education, 25(3), 165-181.

Ossip-Klein, D.J., Karuza, J., Tweet, A., Howard, J., Obermiller-Powers, M., Howard, L., Katz, P., Griffin-Roth, S., & Swift, M. (2002). Benchmarking implementation of a computerized system for long-term care. American Journal of Medical Quality, 17(3), 94-102.

Sales, P.D., & Stewart, P.J. (1998). Benchmarking Tool Kit: A blueprint for public health practice. Middlesex-London and Ottawa-Carleton Public Health Research, Education and Development (PHRED) Programs.

Tepper, D. (2002). Benchmarking: Measuring productivity and outcomes. PT-Magazine of Physical Therapy, 10(1), 48-52.

Vassallo, M.L. (2000). Benchmarking and evidence-based practice: Complementary approaches to achieving quality process improvement. Seminars in Perioperative Nursing, 9(3), 121-124.

Wilson, B. & Beynon, C. (1998). Introducing benchmarking to Ontario Health Units. Public Health & Epidemiology Report Ontario, 9(8), 183-186.

Witt, M.J. (2002). Practice re-engineering through the use of benchmarks: Part II. Medical Practice Management, March/April, 237-242.

Zairi, M. (1998). Benchmarking for Best Practice. London: Butterworth-Heinemann.

Zairi, M. & Leonard, P. (1994). Practical Benchmarking: The Complete Guide. London: Chapman & Hall.

16

PROGRAM LOGIC MODELS:

Dwyer, J. (1996). Applying program logic model in planning and evaluation. Public Health & Epidemiology Report Ontario, 17(2), 38-46.

Rush, B. & Ogborne, A. (1991). Program logic models: Expanding their role and structure for program planning and evaluation. The Canadian Journal of Program Evaluation, 6(2), 95-106.

McLaughlin, J.A., & Jordan, G.B. (1999). Logic models: A tool for telling your program’s performance story. Evaluation and Program Planning, 22(1), 1-14.

WEBSITES FOR PROGRAM LOGIC MODELS:
• www.benchmarking.co.uk
• www.bja.evaluationwebsite.org/html/roadmap/basic/program_logic_models
• http://garberconsulting.com/Program_Logic_Model.htm
• www.uottawa.ca/academic/med/epid/toolkit.htm

ONTARIO PUBLIC HEALTH BENCHMARKING PARTNERSHIP – DOCUMENTS & REPORTS:
✰ see www.benchmarking-publichealth.on.ca (documents & reports)
• Benchmarking: Breastfeeding Support in Public Health – December 2000
• Towards Benchmarking Heart Health Coalitions: Developing A Systematic Process for Documenting and Enriching Community/Health Unit Partnerships – April 2001
• Benchmarking in Public Health in Ontario: Web-Site User’s Manual (2nd Ed.) – July 2002

Charlene Beynon
Middlesex-London Health Unit PHRED Program
cbeynon@uwo.ca
519-663-5317 ext. 2484

Monique Stewart
City of Ottawa Public Health & Long-Term Care Branch PHRED Program
monique.stewart@ottawa.ca
613-724-4122 ext. 23467

November 7, 2001/Updated: November 2004

17

18

Tools
The following resources were developed from “lessons learned” from the Dental Screening Benchmarking Investigation to assist future project teams in:
• determining if benchmarking is the tool of choice, and
• mapping out prerequisites for a successful benchmarking investigation.

These resources were used to facilitate discussion with colleagues in the Southwest to determine the feasibility of conducting a benchmarking investigation focusing on postpartum depression.

o Benchmarking Work Sheets
o Selecting Our Options/Increasing the Odds
o Assessing Feasibility of Success
o Draft Project Discussion Points
o Southwest Benchmarking Feasibility Assessment

We look forward to improving these resources and welcome your questions and comments.

19

20

Benchmarking Worksheets

1. Identify major components for a Program Logic Model.

2. What component do you want to benchmark? i.e. Where do you need to improve performance? And where can you improve performance?

3. What is your benchmarking question? We want to benchmark:

21

4. What is the benchmark? (i.e. What is the gold standard? What will you compare your results to in order to identify “Best Practices”?)

5. Develop a Program Logic Model for the component to be benchmarked.

Worksheet columns: Component / Activities / Target Audience / Short Term Outcomes / Long Term Outcomes

22

6. Identify a maximum of 3-4 indicators.
Principle: include both effectiveness indicators (“Does it work?”) and efficiency indicators (“Is it worth doing?”).

Worksheet columns: Indicators (1. 2. 3. 4.) / Data Sources / Retrievable

An indicator: a quantifiable measure of an outcome or objective
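As a concrete illustration of pairing an effectiveness indicator with an efficiency indicator, the sketch below computes screening coverage (“Does it work?”) and cost per child screened (“Is it worth doing?”) for two comparator units. All field names and figures are invented; real indicators would come from the worksheet above and locally available data.

```python
# Hedged sketch of one effectiveness and one efficiency indicator per
# comparator unit. Data, field names and figures are invented.

units = [
    {"name": "Unit A", "children_screened": 4200, "children_eligible": 5000,
     "program_cost": 63000.0},
    {"name": "Unit B", "children_screened": 2700, "children_eligible": 3000,
     "program_cost": 48600.0},
]

for u in units:
    coverage = u["children_screened"] / u["children_eligible"]   # Does it work?
    cost_per_child = u["program_cost"] / u["children_screened"]  # Is it worth doing?
    print(f'{u["name"]}: coverage {coverage:.0%}, cost/child ${cost_per_child:.2f}')
```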

For further information contact:
Charlene Beynon
Middlesex-London Health Unit
50 King Street, London, Ontario, N6A 5L7
cbeynon@uwo.ca / 519-663-5317 ext. 2484

23

24

Benchmarking Worksheet: Selecting Our Options/Increasing the Odds

Worksheet columns: Program Component¹ / What is the benchmarking question? / What is the benchmark? / What are the indicators? / Can we get the data? / Other

Post Note: The goal is to focus on only one program component. Other components can be addressed in subsequent benchmarking investigations. If there is debate about which component should be addressed, this Work Sheet can assist in articulating which focus is more likely to yield a successful benchmarking outcome.
November 2003 – Revised: November 2004
¹ Program Logic Model

25

26

Benchmarking Worksheet: Assessing Feasibility of Success

Worksheet columns: Component to be benchmarked / Ease of Data Collection / Data Collection Tools / Sufficient Program Implementation / Sufficiently Similar Implementation / Resource Commitment

27

28

Postpartum Depression Benchmarking Investigation

Draft Project Discussion Points
1. Identify major components for a Program Logic Model.
2. What component do you want to benchmark? Principle: Select only 1.
3. What benchmarking question do you want to answer? Principle: the question should be SMARTᵃ. Question: is there a benchmark?
4. Initiate discussion about how you intend to identify “Best Practices”.
5. What data is required to answer the question? Question: is the data easily retrievable?
6. Identify a maximum of 3-4 indicators. Principle: include both effectiveness and efficiency indicators.
7. Review Benchmarking Checklist. Decision Point: i) continue, ii) different action, iii) stop.
8. Develop questionnaire or other data collection tool to collect indicator data only*.
9. Pilot the questionnaire and revise as needed. Several iterations may be required.
10. Collect indicator data.
11. Review findings with practitioners and other recognized experts.
12. Collect practice data to link with best results.
13. Identify “Best Practices”.
14. Implement.
15. Evaluate.
16. Disseminate experiences.
17. Monitor results.

a = S-Specific, M-Measurable, A-Achievable, R-Realistic, T-Timeline

* Usually indicator and practice data are collected simultaneously. This option requires further study. It is being presented as one way to keep the project manageable and time limited, and is based on experiences from other projects. Although represented as a series of linear steps, benchmarking is an iterative process.
November 2003 – Revised: November 2004

29

30

Postpartum Depression Southwest Benchmarking Feasibility Assessment

The Question: Is benchmarking the right tool to answer the question(s) that need to be answered?

When to Benchmark*:
1. Is there a performance issue?
2. Is someone doing better? Is it really better? Compared to what?
3. Can we identify and compare practices?
4. Are the practices transferable and able to be customized to other settings?
* See the “Benchmarking Checklist” and “Prioritizing Program Components for Benchmarking”

Key Phrases:
• Who is doing better at . . . (identifying women at risk for postpartum depression)?
• What practices are most effective and efficient at . . . (identifying women at risk for postpartum depression)?
  – type of contact?
  – with what tool?
  – when?

November 2003

31

32

Personal Notes

33

34

35

36
