
Learning Event for Commissioners

Using Data to Support System Improvement

Robert Lloyd, PhD
Vice President
Institute for Healthcare Improvement

21 January 2016, 0900-1330
London Law Society

So, why do we need a dialogue on Using Data to Support Health Systems Improvement?

Consider the following issues

- The focus on measurement will only increase in health and social services.
- The role of measurement: is it for the patient, the family or the care giver? For staff? For the public, politicians or for researchers? Who is the customer of the measurement system?
- Ultimately, measurement should be for those receiving the output of our processes.
- Financial measures, for example, usually have been for someone else, not the patient or family.
- How do we open a new mindset and dialogue on measurement, since historically much of the measurement for health and social services has been required and done by external groups and used for passing judgement?
2015 Institute for Healthcare Improvement/R. C. Lloyd

So, why do we need a dialogue on Using Data to Support Health Systems Improvement?

A few more things to think about

- If we trust the data but it is lagged by several quarters, a year or more, how do we use it for improvement?
- How can we develop measurement systems that reflect current performance rather than being aggregated by quarter or year?
- During the last 5 years we have seen a new perspective emerging: the data collected nationally are expected to drive improvement at the sites of care. How do we make this happen? Can it happen?
- Improvement can only happen if the people who produce the actual work own the measures and the data, not someone removed from the work.

Discussion Questions for Today


Question 1
What is the difference between a Commissioning
process that is focused on QA and one that is focused on
QI? How do we strike a balance between assurance and
improvement?
Question 2
How do we analyse data from a QI perspective and what
questions should we ask about the results?
Question 3
How can Commissioners support providers in building
capacity and capability for improvement?

What is Quality?
We want to know what you think is the
definition of quality.
Use the sticky notes on your table.
Fill in the following statement:
Quality is ___________________.
Place your note(s) on the designated
flipchart.

Quality is
a combination of value and outcome in the eyes of the consumer
a product or service delivered with 100% satisfaction the first time, every time
a product or service that provides an expected value
a product that lasts, for the best price
a satisfied customer
a very good product or service - one you would want again
above standard results or outcomes
an excellent product or service delivered by professional, friendly,
knowledgeable people in a timely manner at the appropriate time
an unending struggle for excellence
accurate results to health care consumers
anticipation and fulfillment of needs
A vision which provides growth and satisfaction for the customer or consumer of
our service
attentive and excellent patient care
attention to detail, timeliness, competence
being the best, best of the best!
being present for every experience
best result possible in a given category

What is Quality?

"Quality is meeting and exceeding the customer's needs and expectations and then continuing to improve."
W. Edwards Deming

On the use of Statistical Analysis in assessing Quality in Health Care

"These statistics will enable us to ascertain what diseases and ages press most heavily on the resources of particular hospitals."

"They (i.e., the statistics) will show subscribers how their money is being spent, what amount of good is really being done with it, or whether the money is doing mischief rather than good."

Florence Nightingale
(1820-1910)

Health Care Quality Improvement

A broad range of activities of varying degrees of complexity and methodological and statistical rigor through which health care providers develop, implement, and assess small-scale interventions, identify those that work well, and implement them more broadly in order to improve clinical practice.

The Ethics of Improving Health Care Quality & Safety: A Hastings Center/AHRQ Project. Mary Ann Baily, PhD, Associate for Ethics & Health Policy, The Hastings Center, Garrison, New York, October 2004.

Let's start by thinking about the Messiness of Life

[Diagram: Patient encounter with physician → A healthy and productive member of society]

Is life this simple? If it was this simple we wouldn't need to be here discussing improvement!

Life looks more like this

There are numerous direct effects between the independent variables (the Xs) and the dependent variable (Y).

[Diagram: independent variables (Age, Gender, Current health status, Coordination of Care, Communication) measured at Time 1, Time 2 and Time 3, each with a direct effect on the dependent variable, the Patient Assessment Score (could be health outcomes, functional status or satisfaction).]

Well, actually, it looks like this!

In this case, there are numerous direct and indirect effects between the independent variables and the dependent variable. For example, X1 and X4 both have direct effects on Y, plus there is an indirect effect due to the interaction of X1 and X4 conjointly on Y.

[Diagram: the same variables (Age, Gender, Current health status, Coordination of care, Communication) across Time 1, Time 2 and Time 3, now with direct and indirect paths to the Patient Assessment Score, and residuals R1-R5 and RY attached to each variable. R = residuals or error terms representing the effects of variables not included in the model.]

Quality is about improving Complex Problems! But...

"Some problems are so complex that you have to be highly intelligent and well informed just to be undecided about them."
--Laurence J. Peter

The Quality Pioneers

W. Edwards Deming (1900-1993)
Walter Shewhart (1891-1967)
Joseph Juran (1904-2008)

Dr. Walter Shewhart

"Both pure and applied science have gradually pushed further and further the requirements for accuracy and precision. However, applied science is even more exacting than pure science in certain matters of accuracy and precision."

Applied Science requires two types of knowledge

Subject Matter Knowledge: knowledge basic to the things we do in life. Professional knowledge. Knowledge of work processes.

Science of Improvement (SOI) Knowledge: the interplay of the theories of systems, variation, knowledge, and psychology.

Knowledge for Improvement

Improvement: learn to combine Subject Matter Knowledge and SOI Knowledge in creative ways to develop effective changes for improvement.

Improving the messiness of life requires applied science.

[Diagram: the path model from earlier, with Xs, residuals (R1-R5, RY) and Time 1-3, repeated to make the point.]

Exercise
Assessing the Messiness of Life!

- Do you think Commissioners and providers regularly view issues as being rather messy and complex, or do they see them as simple problems that should be resolved quickly and easily (i.e., X causes Y)?
- List a few of these messy problems that you are currently addressing and why they are this way.
- On a scale of 1-10, how messy is each of these problems? (1 = not very messy to 10 = extremely messy.)
- Do you have current measures for these messy problems that allow you to determine just how complex and challenging each problem is?
- If you have measures, do you feel that they are valid, reliable and appropriate?

Exercise
Assessing the Messiness of Life!

Worksheet columns:
- What is the topic of this Messy Problem?
- How messy is this problem? Select a number 1-10, with 1 = not very messy and 10 = extremely messy.
- List the current measures you have for this Messy Problem. Do you have baseline data on these measures?
- Do you feel that these measures are valid, reliable and appropriate?

The Challenge: moving from QA to QI

The Challenge:
Moving from the Old Way to the New Way

Old Way (Quality Assurance): quality is judged against a requirement, specification or threshold. No action is taken on occurrences on the better side of the threshold; action is taken on all occurrences on the worse side (reject defectives).

New Way (Quality Improvement): action is taken on the whole distribution of quality, shifting it from worse toward better rather than merely rejecting defectives below a threshold.

Source: Robert Lloyd, PhD, 2012

The Scientific Method provides the foundation for all Quality Improvement models and approaches

Deductive Phase (general to specific): Theory and Prediction (ideas & hypotheses) → Select & Define Indicators → Data Collection (plans & methods)

Inductive Phase (specific to general): Data Analysis and Output → Interpretation of the Results (asking why?) → Theoretical Concepts → Information for Decision Making

Source: R. Lloyd. Quality Health Care, 2004, p. 153.

Understanding the Timeline is Critical

API Model for Improvement (1996)

Source: Moen, R. and Norman, C. "Circling Back: Clearing up Myths about the Deming Cycle and Seeing How it Keeps Evolving," Quality Progress, November 2010: 22-28.

Quality Models & Approaches Across the Years
Human Factors/Ergonomics (Ancient Greece initially then
refined in 1857 and then again in 1949)
The International Federation of the National Standardizing
Associations (ISA) (1926)
International Organization for Standardization (ISO) (1947)
Toyota Production System (1950s)
Six Sigma (Motorola, 1980s)
Baldrige Criteria (1987)
European Foundation for Quality Management (EFQM)
(1988)
Model for Improvement (1996)


Adding Six Sigma & Lean to the Timeline

- F. Taylor, The Principles of Scientific Management (1911)
- Toyoda family (Sakichi Toyoda, Kiichiro Toyoda); Taiichi Ohno (1950-1980): Toyota Production System; later Womack & Jones
- Bill Smith (1986), Motorola: Six Sigma
- Mikel Harry (1988), Motorola: MAIC
- Michael George (1991): Integration
- Forrest Breyfogle III (1992): Integration
- Scoville & Little, Comparing Lean and Quality Improvement (2014)

Reference: Wortman 2001

See the Appendices for further details on the history of QI.

Evolution of Quality Management over time


Age of Craftsman
Age of Mass Production
Age of Quality Management

Evolution of Quality Management (1850-1974)

Evolution of Quality Management (1978-2014)

Fourth Generation Management (Dr. Brian Joiner)

Evolution of Quality Management in Healthcare

What is Lean?

In short
The choice of a quality system, approach
or model should be driven by the
objectives of the organization, its culture
and its products or services!
The decision should NOT be driven by
how popular a particular approach is or
even if it has been used successfully in
other settings.
Institute for Healthcare Improvement, 2004

The Key: Constancy of Purpose!

The Quality Improvement Journey for IHI
(blending Juran's and Deming's approaches)

Juran's Quality Trilogy: Quality Planning, Quality Control, Quality Improvement

Deming's System of Profound Knowledge


The Juran Trilogy

The Juran Trilogy consists of three types of activities:
- Quality Planning
- Quality Control (or Quality Assurance)
- Quality Improvement

Quality Planning:
- Setting aims
- Selecting improvement projects
- Selecting team and providing resources

Juran on Quality Control

Quality Control (QC): "Quality control is the regulatory process through which we measure actual quality performance, compare it with quality goals, and act on the difference" (Juran, 1988).

This is usually done by operations (e.g., clinicians and managers) with support from a QC Department.

The Juran Trilogy Journey

Deming's Lens of Profound Knowledge

"The system of profound knowledge provides a lens. It provides a new map of theory by which to understand and optimise our organisations."
(Deming, Out of the Crisis)

[Diagram: QI at the centre of four interlocking parts - Appreciation of a system, Theory of Knowledge, Understanding Variation, Human Behaviour.]

It provides an opportunity for dialogue and learning!

What insights might be obtained by looking through the Lens of Profound Knowledge?

Appreciation for a System
- Interdependence, dynamism of the parts
- Direct, indirect and interactive variables
- The system must have an aim
- The whole is greater than the sum of the parts

Theory of Knowledge
- The world is not deterministic
- What theories drive the system?
- Can we predict?
- Learning from theory and experience
- Operational definitions (what does a concept mean?)
- PDSAs for learning and improvement

Human Behaviour
- Interaction between people
- Intrinsic versus extrinsic motivation
- Beliefs, values & assumptions
- What is the Will to change?

Understanding Variation
- Variation is to be expected!
- Common or special causes of variation
- Data for judgement or improvement?
- Ranking, tampering & performance management
- Potential sampling errors

Exercise
Profound Knowledge

- Apply the Lens of Profound Knowledge to an improvement project. This is best accomplished with an improvement team.
- Use the PK Worksheet (next page) to record your responses. Remember that there are no right or wrong responses.
- Engage in a dialogue on PK (not a debate, a discussion or idle chit-chat, but rather a true dialogue about the theories and assumptions surrounding the project and the degree to which it is messy).
- Share the results of this exercise with others to obtain their thoughts and input.

Profound Knowledge Worksheet

- Appreciation for a System
- Human Behaviour
- Theory of Knowledge
- Understanding Variation

Can you help providers start to apply Profound Knowledge to their messy problems?

1996: API* added three basic questions to supplement the PDSA Cycle. The PDSA Cycle is used to develop, test, and implement changes.

The Model for Improvement:
- Is applicable to all types of organizations.
- Provides a framework for the application of improvement methods guided by theory.
- Emphasizes and encourages the iterative learning process of deductive and inductive thinking.
- Allows project plans to adapt as learning occurs.

*API = Associates in Process Improvement

The IHI Approach

When you combine the 3 questions with the PDSA cycle, you get the Model for Improvement.

Langley, J. et al. The Improvement Guide. Jossey-Bass Publishers, 2009.

Foundation for the QI Learning: Deming's System of Profound Knowledge

Key Improvement Methods (used with Subject Matter Knowledge for Improvement):
- Model for Improvement with PDSA
- Shewhart charts
- Operational Definitions
- Analytic Studies
- Graphical Data Analysis
- Intrinsic motivation
- Multi-disciplinary teams

Characteristics of the Applied Science of Improvement:
1. Bias toward action learning
2. Focus on prediction of future outcomes
3. Multiple testing cycles before implementation
4. Visual display to learn from data
5. Learning from special and common causes
6. Simple and complex study designs
7. Ongoing interaction of scientists and practitioners

Seven Propositions (providing the philosophical and theoretical base):
1. Grounded in the Scientific Method
2. Foundation of conceptualistic pragmatism
3. Embraces a weak form of psychologism
4. Considers context of justification and discovery
5. Recognizes value of operationism
6. Variation is defined by chance-cause system
7. Systems theory

Source: Provost, L., Perla, R., Parry, G. "Seven Propositions of the Science of Improvement: Exploring Foundations." Q Manage Health Care Vol. 22, No. 3, 2013: 170-186.

Dialogue
Science of Improvement
What is your current level of knowledge about the
Science of Improvement (SOI)?
Could you explain to a provider how the SOI can
help them to achieve better performance?
Are you and your colleagues prepared to engage in
a dialogue with providers on how to move from a QA
perspective to a QI perspective?
What structures and processes can be established to
support providers in their quality journeys?

Why are you measuring?

- Research? (testing theory and building new knowledge; efficacy)
- Accountability or Judgement? (making comparisons; no change focus)
- Improvement? (improving the effectiveness or efficiency of a process)

The answer to this question will guide your entire quality measurement journey!

The Three Faces of Performance Measurement

Aim:
- Improvement: improvement of care (efficiency & effectiveness)
- Accountability (Judgement): comparison, choice, reassurance, motivation for change
- Research: build new theories and knowledge (efficacy)

Methods, test observability:
- Improvement: test observable
- Accountability: no test, evaluate current performance
- Research: test blinded or controlled

Bias:
- Improvement: accept consistent bias
- Accountability: measure and adjust to reduce bias
- Research: design to eliminate bias

Sample size:
- Improvement: just enough data, small sequential samples
- Accountability: obtain 100% of available, relevant data
- Research: "just in case" data

Flexibility of hypothesis:
- Improvement: flexible hypotheses, changes as learning takes place
- Accountability: no hypothesis
- Research: fixed hypothesis (null hypothesis)

Testing strategy:
- Improvement: sequential tests
- Accountability: no tests
- Research: one large test

Determining if a change is an improvement:
- Improvement: analytic statistics (statistical process control), run & control charts
- Accountability: no change focus (maybe compute a percent change or rank order the results)
- Research: enumerative statistics (t-test, F-test, chi square, p-values)

Confidentiality of the data:
- Improvement: data used only by those involved with improvement
- Accountability: data available for public consumption and review
- Research: research subjects' identities protected

Adapted from: Leif Solberg, Gordon Mosser and Sharon McDonald, Journal on Quality Improvement vol. 23, no. 3 (March 1997), 135-147.

Example of Data for Judgement

Source: Provost, Murray & Britto (2010)



How Is the Error Rate Doing?

Source: Provost, Murray & Britto (2010)



How is Perfect Care Doing?

Source: Provost, Murray & Britto (2010)

So, how do you view the Three Faces of Performance Measurement?

[Diagram: Research, Judgment and Improvement shown in alternative arrangements - as separate silos, or as faces of an integrated whole.]

Integrating the Three Faces of Performance Measurement

The three faces of performance measurement should not be seen as mutually exclusive silos. This is not an either/or situation.

All three areas must be understood as a system. Individuals need to build skills in all three areas. Organizations need translators who are able to speak the language of each approach. The problem is that individuals identify with one of the approaches and dismiss the value of the other two.

Dialogue
Why are you measuring?

- How much of your organization's energy is aimed at improvement, accountability and/or research?
- Does one form of performance measurement dominate your journey?
- Is your organization building silos or a Rubik's cube when it comes to data collection and measurement?
- Do you think the three approaches can be integrated or are they in fact separate and distinct silos?
- How many translators exist within your organization? Are people being developed for this role?

Now, how would you design a study to improve performance?

Life is full of options!

Enumerative versus Analytic Studies and Related Statistical Techniques

"The teaching of pure statistical theory in universities, including the theory of probability and related subjects, is almost everywhere excellent. Application to enumerative studies is mostly correct, but application to analytic problems is deceptive and misleading.

Analysis of variance, t-test, confidence intervals, and other statistical techniques taught in books, however interesting, are inappropriate because they provide no basis for prediction and because they bury the information contained in the order of production. Most if not all computer packages for analysis of data, as they are called, provide flagrant examples of inefficiency."

Dr. Deming, Out of the Crisis, page 132.

Enumerative versus Analytic Studies

Deming classified studies into two types depending on the type of action that will be taken:

- Enumerative Studies: ones in which action will be taken on the entire universe. The aim of an enumerative study is estimation of some aspect of the universe. Action will be taken on the universe based on this estimate through the sampling frame. The U.S. Census is a classic example of an enumerative study.
- Analytic Studies: ones in which action will be taken on a cause system to improve performance of a product, process, or system in the future. The aim of an analytic study is prediction that one of several alternatives will be superior to the others in the future. In an analytic study, the focus is on the cause system. There is no identifiable universe, as there is in an enumerative study, and, therefore, no frame.

Source: Quality Improvement Through Planned Experimentation by R. Moen, T. Nolan and L. Provost, McGraw-Hill, New York, 1999, 2nd edition.

"It is possible, in an enumerative problem, to reduce errors of sampling to any specified level. In contrast, in an analytic problem, it is impossible to compute the risk of making a wrong decision."

W. E. Deming, "On Probability as Basis for Action," The American Statistician, November 1975, vol. 29, no. 4, pages 146-152.

Enumerative and Analytic Studies

Enumerative: a Pond
Pull one sample from this spot, walk away and make a conclusion about the total pond!
- Fixed population: universe, frame
- Random sampling
- Probability based
- Purpose: determine how much variation in a sample; apply learning to the sample (should not extrapolate); reject or do not reject the sampled population
- Hypothesis, statistical tests (t-test, F-test, chi square, p-values)

Analytic: a River
But, how do you pull a sample from a moving process?
- No fixed population; the population is an ongoing stream of data
- Also uses judgment sampling
- Not totally based on probability
- Purpose: how much variation, and of what type; take action on the underlying process to improve the future outcome of the process
- Run charts or Shewhart control charts
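The contrast above can be made concrete: enumerative summaries ignore the order of production, which is exactly what an analytic view examines. A minimal sketch with invented data, showing two series that are indistinguishable to enumerative statistics but very different in time order:

```python
# Hypothetical illustration: two series with identical values, hence identical
# enumerative summaries (mean, standard deviation), but different behaviour
# when examined in the order of production.
from statistics import mean, stdev

stable   = [4, 6, 5, 3, 6, 4, 5, 5, 4, 6, 3, 5]  # values bouncing around randomly
trending = sorted(stable)                          # same values, drifting upward in time

for name, series in [("stable", stable), ("trending", trending)]:
    # Enumerative view: both series look exactly the same
    print(name, round(mean(series), 2), round(stdev(series), 2))

# Analytic view: look at the order of the data points
runs_up = sum(b > a for a, b in zip(trending, trending[1:]))
print("successive increases in the trending series:", runs_up)
```

Only by plotting or examining the data in time order can the drift in the second series be seen, which is why analytic studies rely on run and Shewhart charts rather than summary statistics.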

Different Types of Studies

The approach to research and the statistical methods used should be based on the question(s) being asked.

- Descriptive Study: summarize all the fish in one barrel by type.
- Enumerative Study: take a sample from one barrel as a point estimate (audit) of the fish and generalize to all barrels on the boat's deck.
- Analytic Study: understand the process that places fish in one barrel by studying previous and future barrels. Why are these fish in this barrel?

Case Study: The Chicago Tribune
Monday, September 19, 2011

"The purpose of the study, which represents the most comprehensive examination of railroad pedestrian fatalities in northeastern Illinois, was to determine the factors leading to the incidents and recommend solutions," the researchers said.

Does this purpose sound like it will be an enumerative or analytic study?

The Chicago Tribune
Monday, September 19, 2011

Variables in the study:
- Train type (Metra, Amtrak or freight)
- Number of pedestrian deaths by age
- Number of pedestrian deaths by gender
- Pedestrian death rate by Metra route
- Pedestrian deaths (count) and rate by municipality
- Percentage of deaths by season

The Chicago Tribune


Monday, September 19, 2011


The Chicago Tribune
Monday, September 19, 2011

"Fatal rail pedestrian incidents are occurring at an average of about one every 10 days in the Chicago area," the study said. Last week, there were two, both on Thursday.

Now what do you think?


Is this an enumerative or analytic study?

Enumerative Studies frequently suffer from 20-20 Hindsight!

"Managing a process on the basis of monthly (or quarterly) averages is like trying to drive a car by looking in the rear view mirror."

D. Wheeler, Understanding Variation, 1993.

Dialogue
Enumerative and Analytic Studies

- When you consider the use of data in the Commissioning Process, do you think it is designed around an Enumerative or an Analytic approach?
- If it is aligned more with an Enumerative approach, how will this lead to improving care processes and outcomes?
- If you think the use of data in the Commissioning Process is more aligned with an Analytic approach, then what are you doing to convey this approach to providers?

Read more about Enumerative and Analytic Studies

In the spring of 2010 the BMJ sponsored the Vin McLoughlin Symposium on the Epistemology of Improving Health Care. The papers that grew out of this symposium are freely available online under the BMJ journals unlocked scheme:
http://qualitysafety.bmj.com/site/about/unlocked.xhtml

Epistemology (from Greek ἐπιστήμη (epistēmē), meaning "knowledge, science", and λόγος (logos), meaning "study of") is the branch of philosophy concerned with the nature and scope (limitations) of knowledge.

It addresses the questions:
- What is knowledge?
- How is knowledge acquired?
- How do we know what we know?

BMJ Quality & Safety
April 2011, Vol. 20, Suppl. 1

Measurement focuses on the 2nd question.

Langley, G. et al. The Improvement Guide. API, 2009.

But, do you know the Milestones in the Quality Measurement Journey (QMJ)?

Milestones in the Quality Measurement Journey

AIM (How good? By when?)
Concept
Measure
Operational Definitions
Data Collection Plan
Data Collection
Analysis
ACTION

Source: R. Lloyd. Quality Health Care: A Guide to Developing and Using Indicators. Jones and Bartlett Publishers, 2004.

Milestones in the Quality Measurement Journey

AIM: reduce patient falls by 37% by the end of the year
Concept: reduce patient falls
Measure: inpatient falls rate (falls per 1000 patient days)
Operational Definition: # falls / inpatient days
Data Collection Plan: weekly; no sampling; all IP units
Data Collection: unit collects the data
Analysis: control chart (u-chart)
ACTION
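The Analysis milestone above names a u-chart. A minimal sketch of how its centre line and limits are computed for a falls rate per 1000 patient days; the weekly counts and patient days here are invented for illustration, not taken from the slides:

```python
# u-chart sketch for an inpatient falls rate (falls per 1000 patient days).
# The limits vary with each subgroup's exposure (patient days).
from math import sqrt

falls        = [3, 5, 2, 4, 6, 3, 4, 5]                       # weekly fall counts (hypothetical)
patient_days = [900, 1100, 950, 1000, 1200, 980, 1050, 1010]  # weekly patient days (hypothetical)

# Centre line: overall falls per 1000 patient days across all weeks
u_bar = 1000 * sum(falls) / sum(patient_days)

for f, d in zip(falls, patient_days):
    n = d / 1000                        # subgroup size in units of 1000 patient days
    ucl = u_bar + 3 * sqrt(u_bar / n)   # 3-sigma limits, wider for weeks with fewer patient days
    lcl = max(0.0, u_bar - 3 * sqrt(u_bar / n))  # a rate cannot be negative
    rate = 1000 * f / d
    flag = "special cause?" if rate > ucl or rate < lcl else ""
    print(f"rate={rate:5.2f}  LCL={lcl:4.2f}  UCL={ucl:5.2f}  {flag}")
```

A point outside its limits would signal a special cause worth investigating before acting on the process as a whole.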


NHS Mental Health Dashboard

But remember to build a Cascading System of Measures

Look at your system of measures as a cascade!

A Cascading Approach to Measurement

MACRO: percent of service users on antipsychotics with baseline investigations

MESO: complication rates; percent compliance with all bundles

MICRO: percent compliance with the Physical observations bundle + percent compliance with the Cardiac investigations bundle + percent compliance with the Pathology investigations bundle

The Quality Measurement Journey

AIM (Why are you measuring?)
Concept
Measure
Operational Definitions
Data Collection Plan
Data Collection
Analysis
ACTION
Copyright 2013 Institute for Healthcare Improvement/R. Lloyd

You have performance data!
Now, what do you do with it?

Understanding variation conceptually

"If I had to reduce my message for management to just a few words, I'd say it all had to do with reducing variation."
W. Edwards Deming

The Problem!

Aggregated data presented in tabular formats or with summary statistics will not help you measure the impact of process improvement efforts. Aggregated data can only lead to judgment, not to improvement.

Average Percent of Patients who Fall
Static View of Before and After the Implementation of a New Protocol

[Chart: the percent of patients who fall drops from 5.2% at Time 1 (before the protocol was implemented) to 3.8% at Time 2 (after).]

WOW! A significant drop from 5% to 4%!

Conclusion: the protocol was a success! A 20% drop in the average!

Average Percent of Patients who Fall
Dynamic View of Before and After the Implementation of a New Protocol

[Control chart: the percent of patients who fall plotted over 24 months, with UCL = 6.0, CL = 4.0 and LCL = 2.0, and monthly values ranging from about 1.0 to 9.0; the protocol was implemented partway through the period.]

Now what do you conclude about the impact of the protocol?
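The limits on a chart like the one above are computed from the data themselves. One common way to do this for individual monthly values is an individuals (XmR) chart; the sketch below uses made-up monthly percentages and the standard 2.66 moving-range constant, purely to illustrate the "dynamic view":

```python
# XmR (individuals) chart sketch: judge a before/after change by plotting
# every month against control limits, rather than comparing two averages.
def xmr_limits(series):
    """Centre line and limits: mean +/- 2.66 * average moving range."""
    moving_ranges = [abs(b - a) for a, b in zip(series, series[1:])]
    cl = sum(series) / len(series)
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return cl, cl - 2.66 * mr_bar, cl + 2.66 * mr_bar

monthly_fall_pct = [5.0, 3.2, 4.8, 2.9, 5.5, 3.6, 4.1, 4.9, 3.0, 5.2, 3.8, 4.4,  # before protocol
                    4.6, 3.1, 4.9, 3.4, 4.2, 5.0, 2.8, 4.7, 3.5, 4.3, 3.9, 4.5]  # after protocol

cl, lcl, ucl = xmr_limits(monthly_fall_pct)
print(f"CL={cl:.2f}  LCL={lcl:.2f}  UCL={ucl:.2f}")

# If every point before and after the protocol stays inside the limits,
# the apparent before/after drop in averages is just common-cause variation.
outside = [x for x in monthly_fall_pct if x < lcl or x > ucl]
print("points outside limits:", len(outside))
```

With these invented values no point falls outside the limits, which is exactly the trap of the static view: two aggregate averages can differ even when the underlying process has not changed at all.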

If you don't understand the variation that lives in your data, you will be tempted to ...

- Deny the data ("It doesn't fit my view of reality!")
- See trends where there are no trends
- Try to explain natural variation as special events
- Blame and give credit to people for things over which they have no control
- Distort the process that produced the data
- Kill the messenger!

Dr. Campbell's Insight on Distortion

"The more any quantitative social indicator is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor."

"Campbell's Law," from Assessing the Impact of Planned Social Change, 1976.
https://www.globalhivmeinfo.org/CapacityBuilding/Occasional%20Papers/08%20Assessing%20the%20Impact%20of%20Planned%20Social%20Change.pdf
http://www.sciencedirect.com/science/article/pii/014971897990048X

Donald T. Campbell, PhD, social psychologist (1916-1996)

Dr. Deming's Cycle of Fear

[Diagram: a cycle linking Increased Fear, Micromanagement, Kill the Messenger and Filtered Information.]

Source: William Scherkenbach. The Deming Route to Quality and Productivity. Ceep Press, Washington, DC, 1990, page 71.

Dr. Walter A. Shewhart

"A phenomenon will be said to be controlled when, through the use of past experience, we can predict, at least within limits, how the phenomenon may be expected to vary in the future."

W. Shewhart. Economic Control of Quality of Manufactured Product, 1931.

What is the variation in one system over time?
Walter A. Shewhart, early 1920s, Bell Laboratories

Every process displays variation:
- Controlled variation: a stable, consistent pattern of variation produced by chance, constant causes (the static view).
- Special cause variation: assignable causes; the pattern changes over time (the dynamic view, plotted over time against the UCL and LCL).

Types of Variation

Common Cause Variation
- Is inherent in the design of the process
- Is due to regular, natural or ordinary causes
- Affects all the outcomes of a process
- Results in a stable process that is predictable
- Also known as random or unassignable variation

Special Cause Variation
- Is due to irregular or unnatural causes that are not inherent in the design of the process
- Affects some, but not necessarily all aspects of the process
- Results in an unstable process that is not predictable
- Also known as non-random or assignable variation

Point: Variation exists!

Common Cause does not mean "Good Variation." It only means that the process is stable and predictable. For example, if a patient's systolic blood pressure averaged around 165 and was usually between 160 and 170 mmHg, this might be stable and predictable but completely unacceptable.

Similarly, Special Cause variation should not be viewed as "Bad Variation." You could have a special cause that represents a very good result (e.g., a low turnaround time), which you would want to emulate. Special Cause merely means that the process is unstable and unpredictable.
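One way to distinguish a special cause (good or bad) from common-cause noise on a run chart is the standard "shift" rule: six or more consecutive points on the same side of the median. A small sketch with invented turnaround-time data; the rule and threshold follow common run-chart guidance:

```python
# Run-chart "shift" rule sketch: detect 6+ consecutive points on one side
# of the median. Points exactly on the median neither count nor break a run.
from statistics import median

def longest_run_one_side(series):
    med = median(series)
    longest = run = 0
    prev_side = 0
    for x in series:
        side = (x > med) - (x < med)   # +1 above the median, -1 below, 0 on it
        if side == 0:
            continue                   # skip points that land on the median
        run = run + 1 if side == prev_side else 1
        prev_side = side
        longest = max(longest, run)
    return longest

turnaround = [42, 45, 40, 47, 44, 38, 36, 35, 34, 33, 35, 34]  # minutes, hypothetical
if longest_run_one_side(turnaround) >= 6:
    print("Shift detected: investigate a special cause (good or bad).")
```

Here the shift is a sustained drop in turnaround time, a "good" special cause worth understanding and emulating, which is exactly the point above.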

Appropriate Management Response to Common & Special Causes of Variation

Is the process stable? YES (only common cause variation):
- Right choice: if not at target, change the process.
- Wrong choice: treat normal variation as a special cause (tampering).
- Consequence of the wrong choice: increased variation!

Is the process stable? NO (special + common cause variation):
- Right choice: investigate the origin of the special cause.
- Wrong choice: change the process.
- Consequence of the wrong choice: wasted resources! (time, effort, morale, money)

Source: Carey, R. and Lloyd, R. Measuring Quality Improvement in Healthcare: A Guide to Statistical Process Control Applications. ASQ Press, Milwaukee, WI, 2001, page 153.

Questions

1. Is the process stable? If so, it is predictable. The chart will tell you if the process is stable and predictable.
2. Is the process capable? You have to decide if the output of the process is capable of meeting the target or goal you have set! (NOTE: we will talk about setting targets and goals shortly.)

Attributes of a Leader Who Understands Variation

Leaders understand the different ways that variation is viewed.
They explain changes in terms of common causes and special causes.
They use graphical methods to learn from data and expect others to consider variation in their decisions and actions.
They understand the concept of stable and unstable processes and the potential losses due to tampering.
Capability of a process or system is understood before changes are attempted.

Dialogue
Common and Special Causes of Variation

Select several measures you review on a regular basis.
Do you and other CCG members, as well as providers, evaluate these measures according to the criteria for common and special causes of variation?
If not, what criteria do you use to determine if data are improving or getting worse?
Do these methods allow you to understand the variation inherent in the data?

[Chart: Percent of Cesarean Sections Performed, Dec 95-Jun 99, plotted by month; UCL = 27.70, CL = 18.02, LCL = 8.35]
[Chart: Number of Medication Errors per 1000 Patient Days, plotted by week; UCL = 13.39, CL = 4.42, LCL = 0.00]

Copyright 2013 Institute for Healthcare Improvement/R. Lloyd

Conclusions
Understanding Variation

1. The same data can show different patterns of variation depending on how much of it you present and how you statistically analyse and display the data.
2. Data presented over time (i.e., plotting the data by day, week or month) is the only way you will ever be able to improve any aspect of quality or safety!
3. Avoid using aggregated data and enumerative statistics if you are serious about improving quality and safety!
4. A leader's job is to understand patterns of variation and ask why!

Copyright 2013 Institute for Healthcare Improvement/R. Lloyd

Understand variation statistically

Unplanned Returns to ED w/in 72 Hours

Month:   M  A  M  J  J  A  S  O  N  D  J  F  M  A  M  J  J  A  S
ED/100:  41.78 43.89 39.86 40.03 38.01 43.43 39.21 41.90 41.78 43.00 39.66 40.03 48.21 43.89 39.86 36.21 41.78 43.89 31.45
Returns: 17 26 13 16 24 27 19 14 33 20 17 22 29 17 36 19 22 24 22

[u chart: Rate per 100 ED Patients by month; UCL = 0.88, Mean = 0.54, LCL = 0.19]

STATIC VIEW
Descriptive Statistics
Mean, Median & Mode
Minimum/Maximum/Range
Standard Deviation
Bar graphs/Pie charts

DYNAMIC VIEW
Run Chart
Control Chart
(plot data over time)
Statistical Process Control (SPC)

89
2015 Institute for Healthcare Improvement/R. C. Lloyd

How do we analyze variation for quality improvement?

With Statistical Process Control (SPC) charts!

Run and Control Charts are the best tools to determine:
1. The variation that lives in the process
2. If our improvement strategies have had the desired effect.

90

Three Uses of SPC Charts

1. Make process performance visible: plotting data over time to understand the variation!
[Chart: Current Process Performance: Isolated Femur Fractures; Minutes ED to OR per Patient (0-1200) for sequential patients 1-64]

2. Determine if a change is an improvement
[Chart: Process Improvement: Isolated Femur Fractures; Minutes ED to OR per Patient for sequential patients 1-64]

3. Determine if we are holding the gains
[Chart: Holding the Gain: Isolated Femur Fractures; Minutes ED to OR per Patient for sequential patients 1-64]

How do we analyze variation statistically for quality improvement?

A Run Chart:
is a time series plot of data (a measure plotted over time)
the centerline is the Median
4 Run Chart rules are used to determine if there are random or non-random patterns in the data

A Control Chart:
is a time series plot of data (a measure plotted over time)
the centerline is the Mean
added features include Upper and Lower Control Limits (UCL & LCL)
5 Control Chart rules are used to determine if the data reflect common or special causes of variation

92
Copyright 2013 Institute for Healthcare Improvement/R. Lloyd
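The two chart types above differ mainly in their centerline and, for the control chart, the added limits. A minimal sketch of both calculations, using made-up wait-time-style values (not data from this deck) and the standard XmR (individuals chart) constant of 2.66:

```python
# Sketch: run chart centerline (median) vs. individuals (XmR) control chart
# limits (mean +/- 2.66 * average moving range). Data values are illustrative.
from statistics import median, mean

data = [41.8, 43.9, 39.9, 40.0, 38.0, 43.4, 39.2, 41.9, 41.8, 43.0, 39.7, 40.0]

# Run chart: the centerline is the MEDIAN of the plotted points
run_centerline = median(data)

# Control chart: the centerline is the MEAN; limits come from the moving range
moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
mr_bar = mean(moving_ranges)
cl = mean(data)
ucl = cl + 2.66 * mr_bar   # 2.66 = 3 / d2, with d2 = 1.128 for subgroups of 2
lcl = cl - 2.66 * mr_bar

print(f"Run chart median: {run_centerline:.2f}")
print(f"Control chart: CL={cl:.2f}, UCL={ucl:.2f}, LCL={lcl:.2f}")
```

The same data can sit on either chart; only the centerline and the presence of limits change, which is why the deck treats the run chart as the entry point and the control chart as the fuller analysis.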

Let's start fitting the pieces together

The Goal: To build information and learning for improvement.

2015 Institute for Healthcare Improvement/R. C. Lloyd

Dementia Diagnosis Rates, April-June 2014

Organisation Name | Region | Apr 14 | May 14 | Jun 14
NHS Barking & Dagenham CCG | NE | 55.10 | 54.58 | 55.33
NHS Harrow CCG | NW | 38.14 | 38.37 | 38.76
NHS Redbridge CCG | NE | 49.95 | 49.71 | 50.03
NHS Sutton CCG | South | 45.13 | 45.18 | 46.40
NHS Havering CCG | NE | 45.47 | 46.17 | 47.11
NHS Richmond CCG | South | 52.85 | 52.31 | 53.40
NHS Kingston CCG | South | 42.53 | 41.98 | 41.99
NHS Croydon CCG | South | 46.50 | 46.50 | 46.73
NHS Camden CCG | NE | 64.88 | 65.27 | 65.21
NHS Hillingdon CCG | NW | 42.84 | 41.38 | 42.88
NHS Bexley CCG | South | 50.04 | 50.18 | 50.91
NHS Enfield CCG | NE | 49.49 | 49.08 | 50.10
NHS Greenwich CCG | South | 54.80 | 54.64 | 55.33
NHS Bromley CCG | South | 44.89 | 44.98 | 45.21
NHS Lewisham CCG | South | 53.52 | 53.62 | 54.50
NHS Wandsworth CCG | South | 56.12 | 56.17 | 56.86
LONDON AREA TEAM | LAT | 54.94 | 54.90 | 55.49
NHS West London (K&C & QPP) CCG | NW | 57.35 | 57.41 | 56.05
NHS City and Hackney CCG | NE | 68.78 | 68.53 | 68.54
NHS Newham CCG | NE | 63.87 | 63.68 | 63.82
NHS Merton CCG | South | 49.88 | 49.46 | 50.52
NHS Southwark CCG | South | 58.57 | 55.74 | 56.33
NHS Waltham Forest CCG | NE | 54.29 | 54.48 | 54.69
NHS Barnet CCG | NE | 57.53 | 57.65 | 57.50
NHS Hammersmith and Fulham CCG | NW | 57.03 | 57.20 | 60.32
NHS Hounslow CCG | NW | 54.26 | 53.77 | 53.73
NHS Central London (Westminster) CCG | NW | 59.15 | 59.59 | 61.10
NHS Brent CCG | NW | 54.37 | 55.23 | 55.86
NHS Haringey CCG | NE | 53.92 | 53.57 | 55.72
NHS Tower Hamlets CCG | NE | 66.62 | 66.97 | 66.89
NHS Ealing CCG | NW | 54.19 | 54.28 | 54.94
NHS Lambeth CCG | South | 55.50 | 57.50 | 57.71
NHS Islington CCG | NE | 69.88 | 70.41 | 70.27

Organisation Name | Region | Apr 14 | May 14 | Jun 14 | Jul 14 | Aug 14 | Sep 14 | Oct 14 | Nov 14 | Dec 14 | Jan 15 | Feb 15 | Mar 15
NHS Barking & Dagenham CCG | NE | 55.10 | 54.58 | 55.33 | 55.57 | 54.31 | 56.25 | 59.47 | 61.69 | 62.77 | 62.84 | 63.07 | 63.96
NHS Harrow CCG | NW | 38.14 | 38.37 | 38.76 | 37.97 | 37.44 | 40.09 | 40.24 | 42.30 | 43.14 | 39.35 | 43.29 | 50.30
NHS Redbridge CCG | NE | 49.95 | 49.71 | 50.03 | 49.21 | 48.18 | 48.98 | 49.30 | 53.45 | 55.71 | 56.05 | 57.38 | 59.62
NHS Sutton CCG | South | 45.13 | 45.18 | 46.40 | 45.51 | 44.13 | 47.94 | 54.31 | 54.63 | 56.68 | 55.82 | 56.21 | 55.56
NHS Havering CCG | NE | 45.47 | 46.17 | 47.11 | 46.42 | 46.35 | 47.67 | 48.20 | 49.67 | 49.87 | 50.15 | 51.14 | 51.61
NHS Richmond CCG | South | 52.85 | 52.31 | 53.40 | 51.40 | 50.76 | 53.07 | 52.06 | 54.83 | 55.82 | 58.04 | 60.20 | 63.60
NHS Kingston CCG | South | 42.53 | 41.98 | 41.99 | 41.82 | 39.27 | 41.12 | 40.62 | 42.82 | 48.17 | 49.30 | 51.28 | 51.92
NHS Croydon CCG | South | 46.50 | 46.50 | 46.73 | 46.66 | 46.18 | 46.51 | 46.28 | 47.51 | 48.78 | 50.33 | 51.43 | 51.83
NHS Camden CCG | NE | 64.88 | 65.27 | 65.21 | 63.84 | 62.56 | 65.02 | 66.57 | 67.39 | 67.00 | 67.45 | 67.00 | 68.73
NHS Hillingdon CCG | NW | 42.84 | 41.38 | 42.88 | 41.62 | 41.75 | 42.37 | 42.99 | 43.95 | 47.09 | 48.72 | 52.40 | 54.23
NHS Bexley CCG | South | 50.04 | 50.18 | 50.91 | 49.86 | 51.11 | 50.41 | 50.38 | 51.87 | 52.63 | 53.65 | 55.41 | 57.56
NHS Enfield CCG | NE | 49.49 | 49.08 | 50.10 | 48.14 | 49.03 | 51.91 | 52.29 | 52.51 | 53.78 | 55.68 | 56.44 | 59.73
NHS Greenwich CCG | South | 54.80 | 54.64 | 55.33 | 55.60 | 55.77 | 56.84 | 56.12 | 57.78 | 59.72 | 59.88 | 62.95 | 69.33
NHS Bromley CCG | South | 44.89 | 44.98 | 45.21 | 43.81 | 43.46 | 44.94 | 48.07 | 48.22 | 49.51 | 49.99 | 52.30 | 57.56
NHS Lewisham CCG | South | 53.52 | 53.62 | 54.50 | 53.77 | 54.28 | 52.96 | 53.33 | 52.61 | 52.94 | 53.17 | 58.36 | 61.52
NHS Wandsworth CCG | South | 56.12 | 56.17 | 56.86 | 56.03 | 54.87 | 55.95 | 55.78 | 56.48 | 55.92 | 56.37 | 58.62 | 58.61
LONDON AREA TEAM | LAT | 54.94 | 54.90 | 55.49 | 54.72 | 54.51 | 55.62 | 56.35 | 57.79 | 58.87 | 60.33 | 62.60 | 65.79
NHS West London (K&C & QPP) CCG | NW | 57.35 | 57.41 | 56.05 | 55.77 | 53.71 | 57.91 | 61.53 | 63.26 | 64.69 | 65.23 | 68.57 | 73.06
NHS City and Hackney CCG | NE | 68.78 | 68.53 | 68.54 | 66.51 | 66.17 | 67.83 | 68.83 | 67.96 | 68.22 | 68.54 | 69.41 | 70.22
NHS Newham CCG | NE | 63.87 | 63.68 | 63.82 | 64.14 | 62.66 | 63.85 | 63.71 | 63.93 | 64.77 | 65.81 | 65.68 | 68.35
NHS Merton CCG | South | 49.88 | 49.46 | 50.52 | 49.75 | 49.48 | 51.86 | 51.30 | 52.39 | 53.52 | 55.80 | 57.52 | 66.45
NHS Southwark CCG | South | 58.57 | 55.74 | 56.33 | 55.66 | 58.04 | 57.16 | 58.52 | 63.19 | 63.47 | 64.39 | 67.49 | 68.54
NHS Waltham Forest CCG | NE | 54.29 | 54.48 | 54.69 | 53.99 | 53.25 | 54.09 | 53.77 | 56.48 | 56.52 | 62.97 | 66.36 | 70.31
NHS Barnet CCG | NE | 57.53 | 57.65 | 57.50 | 57.47 | 56.60 | 57.57 | 57.78 | 57.96 | 58.52 | 62.64 | 64.30 | 67.70
NHS Hammersmith and Fulham CCG | NW | 57.03 | 57.20 | 60.32 | 60.41 | 60.17 | 62.23 | 61.47 | 60.11 | 60.49 | 62.94 | 65.63 | 68.18
NHS Hounslow CCG | NW | 54.26 | 53.77 | 53.73 | 53.43 | 52.84 | 54.26 | 54.73 | 54.25 | 55.18 | 57.55 | 61.99 | 69.68
NHS Central London (Westminster) CCG | NW | 59.15 | 59.59 | 61.10 | 59.67 | 59.97 | 62.60 | 62.17 | 63.25 | 63.38 | 64.76 | 69.88 | 71.68
NHS Brent CCG | NW | 54.37 | 55.23 | 55.86 | 55.80 | 55.05 | 55.89 | 56.58 | 58.87 | 59.58 | 66.06 | 68.97 | 70.70
NHS Haringey CCG | NE | 53.92 | 53.57 | 55.72 | 54.85 | 53.21 | 54.16 | 53.48 | 54.30 | 55.31 | 56.94 | 61.17 | 64.23
NHS Tower Hamlets CCG | NE | 66.62 | 66.97 | 66.89 | 66.86 | 67.54 | 66.52 | 66.71 | 66.45 | 66.14 | 71.40 | 71.93 | 73.09
NHS Ealing CCG | NW | 54.19 | 54.28 | 54.94 | 54.49 | 56.45 | 54.80 | 55.13 | 57.21 | 57.60 | 57.91 | 60.14 | 62.98
NHS Lambeth CCG | South | 55.50 | 57.50 | 57.71 | 57.71 | 57.53 | 58.18 | 62.70 | 63.80 | 64.74 | 64.99 | 65.28 | 64.30
NHS Islington CCG | NE | 69.88 | 70.41 | 70.27 | 69.39 | 67.82 | 69.03 | 68.85 | 69.08 | 71.27 | 72.91 | 74.70 | 77.83

Dementia Diagnosis Rates for 32 NHS CCGs, April 2014-March 2015

How do we improve performance of the system with this data?

2015 Institute for Healthcare Improvement/R. C. Lloyd

NHS Mental Health Dashboard:


The beginning of a bridge between
Enumerative and Analytic studies

2015 Institute for Healthcare Improvement/R. C. Lloyd

But now, let's look at the data from an Analytic Approach: 32 CCGs (London Team)

[I Chart: All London Area Teams' Dementia Diagnosis Rate, April 2014-March 2015; Mean = 57.6, with UCL and LCL shown; y-axis (Dementia Diagnosis Rate) from 35.00 to 80.00]

A Trend: 6 or more consecutive data points increasing (or decreasing)

London Area Team - I Chart. Created by Forid Alom, ELFT
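The trend rule annotated on this chart can also be checked programmatically. The sketch below applies it to the London Area Team rates from the table earlier in the deck; the function is an illustrative implementation of the "6 or more consecutive points in one direction" rule, not IHI's full rule set:

```python
# Sketch: the run-chart trend rule — 6 or more consecutive data points
# increasing (or decreasing) signal a non-random pattern.
# Values are the LONDON AREA TEAM dementia diagnosis rates, Apr-14 to Mar-15.
rates = [54.94, 54.90, 55.49, 54.72, 54.51, 55.62,
         56.35, 57.79, 58.87, 60.33, 62.60, 65.79]

def has_trend(points, length=6):
    """True if `length` or more consecutive points move in one direction."""
    run = 1
    direction = 0  # +1 rising, -1 falling, 0 none yet
    for prev, curr in zip(points, points[1:]):
        step = (curr > prev) - (curr < prev)
        if step != 0 and step == direction:
            run += 1          # the run of points in this direction grows
        else:
            run = 2 if step != 0 else 1
            direction = step
        if run >= length:
            return True
    return False

print(has_trend(rates))  # prints True: the rates climb steadily from Aug-14 on
```

This matches what the eye sees on the chart: the sustained rise in the second half of the year is a non-random signal, not month-to-month noise.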

Looking at Data from an Analytic Approach: 18 CCGs

[I Chart: 18 selected CCGs' Dementia Diagnosis Rate, April 2014-March 2015; Mean = 53.5, with UCL and LCL shown; y-axis (Dementia Diagnosis Rate) from 35.00 to 80.00]

A Trend: 6 or more consecutive data points increasing (or decreasing)

I Chart of selected 18 CCGs. Created by Forid Alom, ELFT

[Dashboard of 18 London Area Teams' Dementia Diagnosis Rates, April 2014-March 2015: small-multiple I charts, each with its own UCL, LCL and monthly points, for NHS Barking & Dagenham, Harrow, Redbridge, Sutton, Havering, Richmond, Kingston, Croydon, Camden, Hillingdon, Bexley, Enfield, Greenwich, Bromley, Lewisham, Wandsworth, West London (K&C & QPP), and City and Hackney CCGs]

Created by Forid Alom, ELFT

Exercise
Understanding Variation across 18 CCGs

For these 18 selected CCGs:
What do we learn from these 18 charts?
Are all 18 CCGs performing the same?
Do all 18 charts match the overall performance pattern shown on the aggregated chart?
Do these 18 CCGs exhibit common or special causes of variation?
Are any of the CCGs demonstrating excellent performance?
What will it take to get these 18 CCGs performing as a system?
Should each CCG's improvement strategy be the same?

100
2015 Institute for Healthcare Improvement/R. C. Lloyd


So, we've looked at the aggregate performance for the entire system (all 32 CCGs in the London area).

Then, we looked at the aggregate performance for a segment of the system (18 CCGs).

Finally, we developed a dashboard of the 18 CCGs' performance over time on control charts.

Created by Forid Alom, ELFT
2015 Institute for Healthcare Improvement/R. C. Lloyd

I know... what can a CCG do to improve system performance?

2015 Institute for Healthcare Improvement/R. C. Lloyd

What can a CCG do to support system improvement?

104

Use the Commissioning data and the related findings to identify opportunities for provider improvement.
Help providers to take responsibility for their data.
Understand the factors that drive a particular measure.
Look at data as a time series, not in the aggregate or with summary statistics.
Work with providers to set up improvement teams to work on improving the measures.
Stress that providers need to identify a dedicated group of QI advisors and coaches who can support the improvement teams in their work.
Build capacity and capability for improvement thinking and practice throughout the system (from the Board and Non-Execs through senior management, middle management and front-line staff).
Create a process to review progress of the improvement teams.
Be transparent with data and results.
2015 Institute for Healthcare Improvement/R. C. Lloyd

A Driver Diagram with Aim, Primary and Secondary Drivers

It starts with having a strategic focus (the AIM). It then requires identifying the factors that drive the outcomes!

AIM, then Primary Drivers, then Secondary Drivers:

Primary Driver: Reducing Harm
Secondary Drivers: Falls; Medication errors; Physical violence; Pressure ulcers; Restraints

Primary Driver: Reliable delivery of evidence-based care

Primary Driver: Improving patient and carer experience

Primary Driver: Reducing delays and inefficiencies in the system
Secondary Drivers: Right care, right place, right time; Improved access to services at the right location

2015 Institute for Healthcare Improvement/R. C. Lloyd

A plan for building capacity and capability for the science of improvement is also essential
2015 Institute for Healthcare Improvement/R. C. Lloyd

All staff: estimated number needed to train = 5000. Needs: introduction to quality improvement, identifying problems, change ideas, testing and measuring change. Pocket QI commenced in October 2015, with an aim to reach 200 people by Dec 2016; all staff receive an intro to QI at induction.

Staff involved in or leading QI projects: estimated number needed to train = 1000. Needs: deeper understanding of improvement methodology, measurement and using data, leading teams in QI. 500 people have undertaken the ISIA so far; Wave 5 = Luton/Beds (Sept 2016 to Feb 2017).

QI coaches: estimated number needed to train = 45. Needs: deeper understanding of improvement methodology, understanding variation, coaching teams and individuals. 30 QI coaches graduating in January 2016; a second cohort to be identified and trained in mid-late 2016.

Internal experts (QI team): estimated number needed to train = 11. Needs: deep statistical process control, deep improvement methods, effective plans for implementation & spread. Currently have 3 improvement advisors, with 1.5 wte deployed to QI; to increase to 8 IAs in 2016/17 (6 wte).

Board: Needs: setting direction and big goals, executive leadership, oversight of improvement, being a champion, understanding variation to lead. Most Executives will have undertaken the ISIA; annual Board session with IHI and regular Board development discussions on QI.

Experts by experience: Needs: introduction to quality improvement, how to get involved in improving a service, practical skills in confidence-building, presentation, contributing ideas, and a support structure for service user involvement. Bespoke QI learning sessions for service users and carers; over 50 attended in 2015. Build into recovery college syllabus, along with confidence-building, presentation skills etc.

Then it is time to lay out your Quality Measurement Journey

ELFT Quality Dashboards: Quality Dashboard (organisation-level view), including Safety

Finally, build the ability to track individual teams

ACCESS TO SERVICES COLLABORATIVE DASHBOARD, December 2015

[Collaborative level: Access to Services Collaborative (9/11 teams), baseline data, Jan 2014 to Nov 2015, as of December 2015
- No. of referrals received: C chart, centerline shifting from 1211.0 to 1021.8
- Average waiting time from referral to 1st face-to-face appt: X-bar chart, centerline shifting from 60.7 to 52.2 days
- % of 1st face-to-face appt DNAs: P chart, centerline shifting from 32.50% to 25.52%]

Where would the average be for all this data?

[Service level: Psychological Therapy Service (City and Hackney, Newham & Tower Hamlets), baseline data, as of December 2015
- No. of referrals received (PTS): I chart, centerline 211.7
- Average waiting time from referral to 1st face-to-face appt (PTS): X-bar chart, centerline shifting from 104.0 to 88.9 days
- % of 1st face-to-face appt DNAs (PTS): P chart, centerline 29.75%]

[Team level: QI0043 & QI0175 Newham Psychological Therapy Service, baseline data, as of December 2015
- No. of referrals received (NH PTS): I chart, centerline 58.4
- Average waiting time from referral to 1st face-to-face appt (NH PTS): X-bar chart, centerline shifting from 85.4 to 56.6 days
- % of 1st face-to-face appt DNAs (NH PTS): P chart, centerline shifting from 32.73% to 22.91%]

[Team level: QI0104 Newham Memory Service, baseline data, as of December 2015
- No. of referrals received (NH Memory Service): I chart, centerline 124.6
- Average waiting time from referral to 1st face-to-face appt (NH Memory Service): X-bar chart, centerline 28.5 days
- % of 1st face-to-face appt DNAs (NH Memory Service): P chart, centerline 17.20%]

All 4 acute admissions wards in Tower Hamlets started working on violence reduction at the end of 2014.

[U Chart: Incidents resulting in physical violence (PICUs only) per 1000 occupied bed days (OBD), plotted fortnightly from 06-Jan-14 to 21-Dec-15 with UCL and LCL; annotations mark "Conversations with wards re violence work", "QI Work begins", Learning Sets 1-7, "Introduce safety culture bundle", "Safety Huddle outcomes" and "General Adult wards go smoke free"; centerline 5.8 in the baseline period and 2.5 in the PDSA period]

BASELINE DATA (BEFORE): our baseline data told us our wards were experiencing 5.8 violent incidents per 1000 occupied bed days every two weeks.

PDSA DATA (AFTER): we started testing change ideas to improve how we communicate and work together, and to better identify and anticipate when our service users might feel their needs weren't being met.

The number of violent incidents has now more than halved, to 2.5 incidents per 1000 Occupied Bed Days every two weeks.
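For reference, the U chart used for these incident rates has limits that depend on each period's exposure (occupied bed days), so the limits widen when a ward has fewer bed days. A minimal sketch of the calculation, with illustrative counts rather than the wards' actual data:

```python
# Sketch: U chart centerline and limits for a rate such as incidents per
# 1000 occupied bed days (OBD). Counts and OBD figures are illustrative.
import math

incidents = [12, 9, 14, 8, 11, 10, 7, 13]        # incidents per two-week period
obd = [1.9, 2.1, 2.0, 1.8, 2.2, 2.0, 1.9, 2.1]   # occupied bed days, in 1000s

u_bar = sum(incidents) / sum(obd)  # centerline: total incidents / total exposure

limits = []
for n in obd:
    sigma = math.sqrt(u_bar / n)           # limits widen as exposure shrinks
    ucl = u_bar + 3 * sigma
    lcl = max(0.0, u_bar - 3 * sigma)      # a count-based rate cannot go below 0
    limits.append((lcl, ucl))

print(f"u-bar = {u_bar:.2f} incidents per 1000 OBD")
```

Varying limits are one reason a U chart (rather than an I chart) is the usual choice when the area of opportunity changes from period to period.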

What is your current level of knowledge about quality measurement?

This self-assessment is designed to help quality facilitators and improvement team members gain a better understanding of where they personally stand with respect to the milestones in the Quality Measurement Journey (QMJ). What would your reaction be if you had to explain why it is preferable to plot data over time rather than using aggregated statistics and tests of significance? Can you construct a run chart or help a team decide which measure is more appropriate for their project?

You may not be asked to do all of the things listed below today or even next week. But, if you are facilitating a QI team or expect to be able to demonstrate improvement, sooner or later these questions will be posed. How will you deal with them?

The place to start is to be honest with yourself and see how much you know about concepts and methods related to the QMJ. Once you have had this period of self-reflection, you will be ready to develop a learning plan for yourself and those on your improvement team.

Source: R. Lloyd, Quality Health Care: A Guide to Developing and Using Indicators. Jones & Bartlett Publishers, 2004: 301-304.

Measurement Self-Assessment Response Options

Use the following Response Scale. Select the one response which best captures your opinion.
1. I'd definitely have to call in an outside expert to explain and apply this topic/method.
2. I'm not sure I could apply this appropriately to a project.
3. I am familiar with this topic but would have to study it further before applying it to a project.
4. I have knowledge about this topic, could apply it to a project but would not want to be asked to teach it to others.
5. I consider myself an expert in this area, could apply it easily to a project and could teach this topic/method to others.

Source: R. Lloyd, Quality Health Care: A Guide to Developing and Using Indicators. Jones & Bartlett Publishers, 2004: 301-304.

Measurement Self-Assessment Worksheet


Source: R. Lloyd, Quality Health Care: A Guide to Developing and Using Indicators.
Jones & Bartlett Publishers, 2004: 301-304.

Measurement Topic or Skill

Response Scale
1

1. Help people in my organization understand where and how measurement fits into our quality journey
2. Facilitate the development of clear Aim Statements
3. Move teams from concepts to specific quantifiable measures
4. Build clear and unambiguous operational definitions for our measures
5. Develop data collection plans (including stratification and sampling strategies)
6. Explain why plotting data over time (dynamic display) is preferable to using aggregated data and summary statistics (static display)
7. Explain the differences between random and non-random variation
8. Construct run charts (including locating the median)
9. Explain the reasoning behind the run chart rules
10. Interpret run charts by applying the run chart rules
11. Explain the various types of control charts and how they differ from run charts
12. Construct the various types of control charts
13. Explain the control chart rules for special causes and interpret control charts
14. Help teams link measurement to their improvement efforts
Source: R. Lloyd, Quality Health Care: A Guide to Developing and Using Indicators. Jones & Bartlett Publishers, 2004: 301-304.
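Items 8-10 in the list above concern building and interpreting run charts. As an illustrative sketch (not from the slides; the function name and data are hypothetical), the following computes the median centerline and applies one common run chart rule: a shift, signalled by six or more consecutive points on the same side of the median, with points falling exactly on the median ignored.

```python
from statistics import median

def run_chart_shift(values, min_run=6):
    """Return (median, shift_detected): whether the data show a shift,
    i.e. min_run or more consecutive points all above or all below the
    median. Points exactly on the median neither break nor extend a run,
    per the usual run chart conventions."""
    m = median(values)
    run = 0
    side = 0  # +1 = above the median, -1 = below
    for v in values:
        if v == m:
            continue  # on the median: skip
        s = 1 if v > m else -1
        run = run + 1 if s == side else 1
        side = s
        if run >= min_run:
            return m, True
    return m, False

# Hypothetical wait times (minutes): the last several points sit below
# the median, which the rule flags as a shift.
waits = [12, 11, 13, 12, 14, 10, 9, 8, 8, 7, 8, 7, 7, 6]
m, shift = run_chart_shift(waits)
```

This is only one of the run chart rules the worksheet refers to; trends and astronomical points are judged separately.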

Knowing how to properly use Shewhart Control Charts
(Wait Time to See the Doctor)
[XmR chart: wait times in minutes for 16 patients in February (the baseline period) and 16 patients in April, with an intervention between the two periods. Baseline limits: UCL = 15.3, CL = 10.7, LCL = 6.1. Where will the process go?]

Freeze the control limits and centerline, extend them, and compare the new process performance to these reference lines to determine whether a special cause has been introduced as a result of the intervention.
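The limits on the chart come from the baseline data alone. As a hedged sketch of the standard XmR (individuals chart) calculation, using made-up wait times rather than the slide's actual data: the centerline is the mean of the individual values, and the limits sit at the mean plus or minus 2.66 times the average moving range.

```python
def xmr_limits(baseline):
    """Compute the centerline and natural process limits for an XmR
    (individuals) chart from baseline observations.
    UCL/LCL = mean +/- 2.66 * average moving range, where 2.66 is the
    standard individuals-chart constant."""
    n = len(baseline)
    cl = sum(baseline) / n
    # moving ranges: absolute differences between consecutive points
    mr = [abs(baseline[i] - baseline[i - 1]) for i in range(1, n)]
    mr_bar = sum(mr) / len(mr)
    return cl, cl + 2.66 * mr_bar, cl - 2.66 * mr_bar

# Illustrative baseline of 16 February wait times (minutes). These limits
# are then frozen and extended as reference lines for the April data.
cl, ucl, lcl = xmr_limits([11, 10, 12, 9, 11, 13, 10, 11, 12, 10,
                           9, 11, 12, 10, 11, 12])
```

With this illustrative baseline the limits come out close to the slide's values, but the input data here are invented, not the slide's.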

Using a Shewhart Control Chart
(Wait Time to See the Doctor)

[The same XmR chart: 16 patients in February and 16 patients in April, with baseline limits UCL = 15.3, CL = 10.7, LCL = 6.1.]

Freeze the control limits and compare the new process performance to the baseline, using the UCL, LCL and CL from the baseline period as reference lines.

A special cause is detected: a run of 8 or more data points on one side of the centerline, reflecting a shift in the process.
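A sketch of the comparison described above, with illustrative data: post-intervention points are judged against the frozen baseline limits, flagging any point beyond the UCL or LCL and any run of eight or more consecutive points on one side of the frozen centerline. The function and the April values are hypothetical, not the slide's data.

```python
def special_causes(points, cl, ucl, lcl, run_length=8):
    """Flag special-cause signals in post-intervention data judged against
    FROZEN baseline limits: any point outside the limits, plus a run of
    `run_length` or more consecutive points on one side of the centerline."""
    signals = []
    run, side = 0, 0
    for i, v in enumerate(points):
        if v > ucl or v < lcl:
            signals.append((i, "beyond limits"))
        s = 1 if v > cl else (-1 if v < cl else 0)
        if s != 0 and s == side:
            run += 1
        else:
            run, side = (1, s) if s != 0 else (0, 0)
        if run == run_length:
            signals.append((i, "run of %d on one side of CL" % run_length))
    return signals

# Hypothetical April wait times compared to the frozen February limits
april = [9, 8, 9, 7, 8, 9, 8, 7, 9, 8, 7, 8, 9, 8, 7, 5]
flags = special_causes(april, cl=10.7, ucl=15.3, lcl=6.1)
```

Here the sustained drop below the centerline triggers the run rule; a point below the LCL is also a signal, though after an improvement that is the direction you want.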

Using a Shewhart Control Chart
(Wait Time to See the Doctor)

[The same XmR chart, now with new limits computed for the post-intervention period.]

Make new control limits for the process to show the improvement.

But the Charts Don't Tell You
The reason(s) for a special cause.
Whether or not a common cause process should be improved (is the performance of the process acceptable?).
How the process should actually be improved or redesigned.
130

Improvement Teams need a Framework for Performance Improvement
Establish appropriate measures.
Set an aim and goal for each measure.
Develop theories and predictions on how they plan to achieve the aim, and an appropriate time frame for testing.
Test theories, implement change concepts, follow the measures over time and analyse the results with SPC.
Revise the strategy as needed.

A Few Thoughts on Benchmarking

132

"Benchmarking is the continuous process of measuring products, services, and practices against the toughest competitors or those companies recognized as industry leaders."

Benchmarking is a structured process; it is first and foremost a search for knowledge and learning.

Camp, R. Benchmarking: The Search for Industry Best Practices that Lead to Superior Performance. ASQ Press, 1989.

A benchmark is a noun.
Benchmarking, on the other hand, is a verb that requires
exploration and investigation of why the benchmark number
was achieved!

Benchmarking versus Comparative Reference Data

133

[Diagram contrasting a true benchmark with comparative reference data, e.g. regional or national norms.]

More thoughts on Benchmarking


1. Benchmarking uses numbers and data but if you stop at the
numbers you will never achieve the potential that
benchmarking offers.
2. Benchmarking is a way to identify and understand best
practices that enable organizations to realize new levels of
performance (i.e., the targets and goals that can become
benchmarks).
3. Confusion over these concepts leads an organization to accept a number, either from an internal or external source, as THE Benchmark. This orientation typically leads to a fairly singular focus on the numbers (outcomes) without giving due consideration to the interplay of the structures and processes that produce the numbers.
(continued)
Source: R. Lloyd. Quality Health Care: A Guide to Developing and
Using Indicators. Jones and Bartlett Publishers, 2004.


More thoughts on Benchmarking (continued)

4. While you will hear organizations claim that "We are benchmarking," this statement usually means that they are hoping to hit the benchmark metric but have not developed a strategy for achieving this ethereal number. By what method will you get there?
5. The end result is usually confusion amongst the staff and frequently unrealistic expectations on the part of management and the board.

Source: R. Lloyd. Quality Health Care: A Guide to Developing and Using Indicators. Jones and Bartlett Publishers, 2004.


Five Phases of Benchmarking and the Ten Related Specific Steps

A Few Thoughts on Setting Targets and Goals

137

138

On Setting Targets and Goals
"Goals are necessary for you and for me, but numerical goals set for other people, without a road map to reach the goal, have effects opposite to the effects sought." By what method do you plan to achieve the goal?
Deming, W. E. Out of the Crisis. Massachusetts Institute of Technology, Cambridge, MA, 1992.

139

On Setting Targets and Goals
Targets are short-term markers of achievement, typically reached over a span of several months.
Goals, on the other hand, are more long term in nature, usually in the range of 3-5 years.
A target or a goal can be based on a benchmark (the noun) if it is derived from an organisation that is considered the best of the best.
The benchmarking process (as a verb) is therefore one of the best ways to develop a plan for achieving new performance levels.

Dialogue: Setting Targets and Goals

140

Do you distinguish between targets and goals? Or do you consider them to be synonymous?
How do you set targets and goals?
Would you say some targets and goals that are set are arbitrary?

Suggestions on Setting Targets and Goals

141

Establish baseline data on the relevant measure to determine what the current performance of the process actually is.
Develop a control chart to determine the statistical capability of the process.
Use the control chart as a basis to discuss the chances (probability) that the current process will be able to achieve the proposed target or goal.
If the current baseline performance is far from the target or goal, a discussion must occur around Deming's basic question: By what method?
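The suggestions above can be reduced to a simple feasibility check. This sketch (hypothetical function and numbers, not from the slides) asks whether a proposed target even lies within the natural limits of the baseline control chart; if it does not, the current process cannot be expected to deliver it, and Deming's "by what method?" conversation, i.e. redesign, is unavoidable.

```python
def target_feasibility(target, ucl, lcl):
    """Crude capability check: does a proposed target fall inside the
    natural process limits computed from baseline data? A target outside
    the limits cannot be expected from the current process without
    fundamental redesign."""
    if lcl <= target <= ucl:
        return "within the process's capability (possible, though not guaranteed)"
    return "outside the process's capability -- ask 'by what method?'"

# Baseline wait-time process: CL = 10.7 minutes, limits 6.1 to 15.3
print(target_feasibility(target=12, ucl=15.3, lcl=6.1))  # within limits
print(target_feasibility(target=4, ucl=15.3, lcl=6.1))   # demands redesign
```

A target inside the limits is one the process could produce by chance alone; real goal-setting still requires asking whether that performance is acceptable.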

The Primary Drivers of Organisational Improvement
Will: having the Will (desire) to change the current state to one that is better.
Ideas: developing Ideas that will contribute to making processes and outcomes better.
Execution: having the capacity and capability to apply QI theories, tools and techniques that enable successful Execution of your ideas.

How prepared is the system? Self-assess each of the key components*:
Will (to change): Low / Medium / High
Ideas: Low / Medium / High
Execution: Low / Medium / High

*All three components MUST be viewed together. Focusing on one or even two of the components will guarantee suboptimised performance. Systems thinking lies at the heart of QI!

A closing thought
"It must be remembered that there is nothing more difficult to plan, more doubtful of success, nor more dangerous to manage than the creation of a new system. For the initiator has the enmity of all who would profit by the preservation of the old institution and merely lukewarm defenders in those who would gain by the new one."
Machiavelli, The Prince, 1513

Copyright 2013 Institute for Healthcare Improvement/R. Lloyd

Appendix A

Evolution of Quality Management over time


Age of Craftsman
Age of Mass Production
Age of Quality Management

Evolution of Quality Management (1950-1974)

Evolution of Quality Management (1978-2014)

Fourth Generation Management (Dr. Brian Joiner)

Evolution of Quality Management in Healthcare

What is Lean?

Evolution of Quality Management

[Flow diagram of the organization viewed as a system (after Deming): Suppliers (A-F) feed Production of the Product or Service, which flows through Distribution to Customers. Market Research and Measurement & Feedback on customer need drive the Design & Redesign of Processes & Products and the Plan to Improve, aligned to the Purpose of the Organization and underpinned by Support Processes.]

Evolution of Quality Management

Age of the Craftsman (B.C. - 1800's)
Theory of Management:
Person doing the work manages the entire job, from planning to job completion.
Craftsman is responsible for communication with suppliers and customers.
Direct customer feedback provides the definition of quality.
Rewards are tied to the customer.
Impact on Quality:
Quality = High cost.
Responsibility for quality belongs to the craftsman.

Age of Mass Production (Early 1800's - Present)
Theory of Management:
Scientific study is used for simplification of methods for individual tasks.
Planning is separated from execution.
Focus of management is on production at low cost.
Rewards are tied to the individual.
Impact on Quality:
Quality = High cost and low productivity.
Focus is on reducing costs.
Quality is achieved by inspection and sorting.
Simplification objective establishes the Q.C. Department to measure and report.

Age of Quality Management (1950's - Present)
Theory of Management:
Management views all work as processes that link to form a system.
The focus of management is on improving the system.
Improvement requires partnership between suppliers and customers.
Rewards are tied to the customer and teamwork.
Impact on Quality:
Quality = Low cost and high productivity.
Quality is the focus of the organization.
Quality is defined by the need of the customer.
Q.C. Department assumes the role of consultant for improvement activities.

Source: Ron Moen, Associates in Process Improvement

Evolution of Quality Management (1950-1974)
1951 - Total Quality Control published by Armand Feigenbaum.
1956 - Western Electric Statistical Quality Control Handbook.
1958 - Genichi Taguchi begins teaching his methods of loss function and robust design.
1962 - Quality Circles start; Kaoru Ishikawa asked a number of Japanese companies to participate in an experiment.
1974 - Kaoru Ishikawa publishes Guide to Quality Control, 7 simple tools for improvement.

Source: Ron Moen, Associates in Process Improvement

Evolution of Quality Management (1978-2014)
1978 - George Box, William G. Hunter and J. Stuart Hunter publish their landmark book Statistics for Experimenters.
1979 - Philip Crosby publishes Quality is Free.
1980 - Quality revolution begins in the US: NBC airs "If Japan Can, Why Can't We?"; Deming consults for Ford and GM.
1987 - Malcolm Baldrige National Quality Award is established.
1994 - Deming publishes The New Economics, which emphasizes the use of the System of Profound Knowledge.
Present - Quality programs spread to service industries under a variety of names, tools and approaches. Proliferation of quality programs: TQM, Six Sigma, Kaizen, SQC, SPC, Taguchi Methods, Benchmarking, CQI, Lean Six Sigma, etc. Attempts are being made to package the various contributions from the past into an overall "one best" approach.

Source: Ron Moen, Associates in Process Improvement

Fourth Generation Management (Dr. Brian Joiner)
Dr. Brian Joiner, a student of Dr. Deming's, described four generations of management:
First Generation: do it yourself.
Second Generation: master craftsperson takes on apprentices but remains the model and arbiter of production (and quality).
Third Generation: manage by results, usually by specifying the goals required without detailing the methods (by what method?).
Fourth Generation: simultaneous focus on three "chunks" of work: quality, the scientific approach and all one team (the Joiner Triangle; see the next slide for details).

149

The Joiner Triangle

150

The Joiner Triangle provides a framework for implementing Quality Improvement. It consists of:
Quality, as seen through the eyes of our customers.
The Scientific Approach, as the methodology for solving problems and making decisions: iterative learning, using data effectively, to build and maintain effective methods.
All One Team, aimed at unifying staff work efforts, getting all employees involved with quality efforts, collaboration and respect for people.

Evolution of Quality Management in Healthcare
B.C. - Hippocrates (3rd century B.C.). Medicine was and is taught and learned as a craft.
1973 - Avedis Donabedian proposed measuring the quality of healthcare by observing structure, processes, and outcomes.
1970s - Quality Assurance (QA) of hospital care using structural standards.
1980s - QA by government and insurers. The regulatory route relied on punishment and blame.
1986 - Joint Commission on the Accreditation of Healthcare Organizations (JCAHO) announced its Agenda for Change and stated that the philosophical context for the Agenda for Change is set by the theories of Continual Quality Improvement (QI).
1986 - National Demonstration Project (NDP) on Quality Improvement in Healthcare: a demonstration project to explore the application of modern quality improvement methods to healthcare.
1990 - NDP report: Berwick, D., Godfrey, J. and Roessner, J. Curing Health Care. Jossey-Bass, 1990.
1991 - Don Berwick founded the Institute for Healthcare Improvement (IHI), committed to redesigning health care delivery systems in order to ensure the best health care outcomes at the lowest costs.
1993 - IHI adopts the API Model for Improvement as its foundation for improvement.

Source: Ron Moen, Associates in Process Improvement

152

Beginning of Modern Health Care QI
The National Demonstration Project on Quality Improvement in Health Care (NDP)
20 hospitals and 21 quality improvement experts
8 months: September 1986 to June 1987
Initial and summary conferences
Curing Health Care
Dr. Don Berwick formed IHI at the end of the project

Lessons from Curing Health Care
(Berwick et al, 1990)

153

Lesson 1: Quality Improvement Tools Can Work in Health Care
Lesson 2: Cross-Functional Teams Are Valuable in Improving Health Care Processes
Lesson 3: Data Useful for Quality Improvement Abound in Health Care
Lesson 4: Quality Improvement Methods are Fun to Use
Lesson 5: Costs of Poor Quality Are High and Savings are Within Reach
Lesson 6: Involving Doctors is Difficult
Lesson 7: Training Needs Arise Early
Lesson 8: Non-clinical Processes Draw Early Attention
Lesson 9: Health Care Organizations May Need a Broader Definition of
Quality
Lesson 10: In Health Care, as in Industry, the Fate of Quality Improvement
Is First of All in the Hands of Leaders

154

What is Lean?
"Reducing the timeline from customer order to building and delivering a product by eliminating waste."
- Jeff Liker, The Toyota Way
"All we're trying to do is shorten the time line."
- Taiichi Ohno (credited with developing lean at Toyota)

Lean: a systematic approach to identifying and eliminating waste (non-value-added activities) through continuous improvement by flowing the product at the pull of the customer in pursuit of perfection (Improvement Guide, p. 463).

Lean incorporates aspects of Quality Planning, Quality Control, and Quality Improvement.

Why Lean?

155

The term was given its current meaning at MIT in 1987.
It was born of a need to describe a product development, production, supplier management, customer support, and planning system (exemplified by Toyota practice) for what it did.
Compared to traditional mass production methods (e.g., GM), this system required less time, human effort, capital, and space to produce products with fewer defects in wider variety more quickly.
"Because it needed less of every input to create value, we called it lean."
From: James P. Womack, President, Lean Enterprise Institute

156

The Lean Ideal (Aim)


The output is defect free.
The product or service is delivered in response to
customer need (pull, on demand).
The response is immediate.
Products or services are provided 1x1 in the unit
size of use (tailored to the identified needs of the
customer).
Work is done without waste.
Work is done safely.
Work is done securely.
Spear, S. and H. K. Bowen (1999). "Decoding the DNA of the Toyota Production
System." Harvard Business Review 77(5): 96-106.

157

The Core Ideas of Lean Thinking
All value is the result of a process (which we often call a value stream).
Move a manager's focal plane to the organization's value-creating processes, rather than the organization itself and the utilization of its assets.
For each value stream (process):
Accurately specify the value desired by the customer.
Identify every step in the value stream and remove the waste.
Make value flow from beginning to end.
Based on the pull of the customer.
In pursuit of perfection.
From: James P. Womack, President, Lean Enterprise Institute

158

Common Tools and Methods in Lean

5 S Workplace Organization
Visual Management
Continuous Flow / Cell / JIT
Production Layout
Small Lot Production
Quick Setup / Changeover
TPM (Total Productive Maintenance)
Standardized Work
Level Scheduling
Pull System KANBAN
Supplier Rationalization

Appendix B
General References on Quality
The Improvement Guide: A Practical Approach to Enhancing Organizational Performance. G. Langley, K. Nolan, T. Nolan, C. Norman, L. Provost. Jossey-Bass Publishers, San Francisco, 1996.
Quality Improvement Through Planned Experimentation. 2nd edition. R. Moen, T. Nolan, L. Provost. McGraw-Hill, NY, 1998.
The Improvement Handbook. Associates in Process Improvement, Austin, TX, January 2005.
"A Primer on Leading the Improvement of Systems," Don M. Berwick, BMJ, 312: pp. 619-622, 1996.
"Accelerating the Pace of Improvement - An Interview with Thomas Nolan," Journal of Quality Improvement, Volume 23, No. 4, The Joint Commission, April 1997.
159

Appendix C
References on Measurement
Brook, R. et al. "Health System Reform and Quality." Journal of the American Medical Association 276, no. 6 (1996): 476-480.
Carey, R. and Lloyd, R. Measuring Quality Improvement in Healthcare: A Guide to Statistical Process Control Applications. ASQ Press, Milwaukee, WI, 2001.
Lloyd, R. Quality Health Care: A Guide to Developing and Using Indicators. Jones and Bartlett Publishers, Sudbury, MA, 2004.
Provost, L. and Murray, S. The Healthcare Data Guide. Jossey-Bass, 2011.
Nelson, E. et al. "Report Cards or Instrument Panels: Who Needs What?" Journal of Quality Improvement, Volume 21, Number 4, April 1995.
Solberg, L. et al. "The Three Faces of Performance Improvement: Improvement, Accountability and Research." Journal of Quality Improvement 23, no. 3 (1997): 135-147.
160

Appendix D
Robert Lloyd, PhD - Bio
Robert Lloyd, PhD is Vice President for the Institute for Healthcare Improvement (IHI). Dr. Lloyd provides leadership in the areas of performance improvement strategies, statistical process control methods, development of strategic dashboards, and building capacity and capability for quality improvement. He also serves as lead faculty for various IHI initiatives and demonstration projects in North America, the UK, Sweden, Qatar, Denmark, New Zealand, Australia and Africa.
Before joining IHI, Dr. Lloyd served as the Corporate Director of Quality Resource Services for Advocate Health Care (Oak Brook, IL). He also served as Senior Director of Quality Measurement for Lutheran General Health System (Park Ridge, IL), directed the American Hospital Association's Quality Measurement and Management Project (QMMP) and served in various leadership roles at the Hospital Association of Pennsylvania. The Pennsylvania State University awarded all three of Dr. Lloyd's degrees. His doctorate is in agricultural economics and rural sociology.
Dr. Lloyd has written many articles and chapters in books. He is also the co-author of the internationally acclaimed book Measuring Quality Improvement in Healthcare: A Guide to Statistical Process Control Applications (American Society for Quality Press, 2001, 5th printing) and the author of Quality Health Care: A Guide to Developing and Using Indicators (Jones and Bartlett, Sudbury, MA, 2004).
Dr. Lloyd lives in Chicago with his wife Gwenn and their amusing dog, Cricket. The Lloyds have a 21-year-old daughter, Devon, who is in her final year of university, majoring in performance dance and choreography.