
CHAPTER 107

Medical Errors and Patient Safety


Jeremy Souder • Jennifer S. Myers

The 1999 Institute of Medicine (IOM) report To Err Is Human: Building a Safer Health System
estimated that 44,000 to 98,000 patients die annually in the United States as a result of medical
errors. This report showed that errors in medicine were poorly understood and infrequently investigated, largely because the culture of hospitals and the medical profession viewed these errors as the product of human failings. This belief led to a person-focused approach to error investigation, in which strategies to reduce medical errors involved censuring or re-educating staff about proper
protocols and procedures. However, it is now known that this approach deprives organizations of
the greatest opportunities for meaningful learning and improvement.
Since the IOM report, health care has increasingly been adopting a systems-focused approach to
medical error. In this view, nothing is intrinsically different about medical errors as compared to
errors in other complex, high-risk industries such as aviation or nuclear power generation. Like
medicine, these fields rely heavily on human innovation and expertise to function normally, but
long ago recognized that human errors are symptoms of deeper organizational problems. Like
respiratory failure or heart failure, errors in medicine demand a diagnostic search for underlying
causes and systemic repairs to prevent error recurrence.
This chapter presents a framework in which to understand, investigate, and prevent errors in
the intensive care unit (ICU). These principles are widely used in the field of patient safety and
can be applied to other areas of health care.

Key Patient Safety Concepts and Definitions


All health care organizations display characteristics of complex adaptive systems, in that they
contain groups and individuals who have the freedom to act in unpredictable ways and whose
actions are interconnected. High-performing, complex organizations follow three broad rules.
First, leaders give general directions by articulating values, establishing a clear organizational
mission, and setting objectives. Second, resources and permissions are provided to the appropriate personnel within the organization, who are given incentives to efficiently and safely fulfill patient
needs. Finally, organizational constraints prevent providers from giving inefficient or unsafe care.
These three rules are expressed through organizational structures and processes. Structures are
the organizational and management hierarchy, physical facilities, staffing, and capital allocated to
perform a process. A process is the way humans and other resources interact to deliver patient care.
Together, structure and process create the final products of health care, which are referred to
as outcomes.
An error is a defect in process that arises from a person, the environment in which he or she
works, or, most commonly, an interaction between the two. In the field of patient safety, negative
outcomes are termed adverse events. Because patients may experience adverse events as a result of
their underlying illnesses, preventable adverse events are differentiated from unpreventable adverse
events; the former are due to error, whereas the latter are not. Errors that do not result in patient
harm are termed near misses and are more common than adverse events. Safety experts recognize that studying near misses is just as valuable for preventing future errors. Table 107.E1 illustrates
the difference between these terms.
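These definitions reduce to two questions: did an error occur, and was the patient harmed? As a purely illustrative sketch (the function and names below are invented to mirror the definitions above, not part of any reporting standard), the logic can be written as follows:

```python
from enum import Enum


class SafetyEvent(Enum):
    NO_SAFETY_EVENT = "no safety event"
    NEAR_MISS = "near miss"                                       # error, no harm
    PREVENTABLE_ADVERSE_EVENT = "preventable adverse event"       # error leading to harm
    UNPREVENTABLE_ADVERSE_EVENT = "unpreventable adverse event"   # harm without error


def classify_event(error_occurred: bool, patient_harmed: bool) -> SafetyEvent:
    """Classify an event using the chapter's definitions."""
    if error_occurred and patient_harmed:
        return SafetyEvent.PREVENTABLE_ADVERSE_EVENT
    if error_occurred:
        return SafetyEvent.NEAR_MISS
    if patient_harmed:
        return SafetyEvent.UNPREVENTABLE_ADVERSE_EVENT
    return SafetyEvent.NO_SAFETY_EVENT


# Example from Table 107.E1: heparin given to a patient with a known history of HIT.
print(classify_event(error_occurred=True, patient_harmed=True).value)
# -> preventable adverse event
```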

Errors in Complex Systems


Errors in complex systems can be divided into two types based on where they occur in the
system. Active failures occur at the sharp end of a complex system, so named because it contains the people and processes that, owing to their proximity to the patient and the harm, are most easily identified when an adverse event occurs. Active failures always involve a human error, such as an
act of omission (forgetting) or commission (misperceptions, slips, and mistakes). Research done
in human factors has shown that the incidence of human error is predictable based on the nature
of a task, the number of steps in a process, and the context in which these occur. Table 107.E2
provides examples of these types of errors. Although active failures are easiest to identify, they
represent the tip of the iceberg and nearly always rest on deeper and more massive latent condi-
tions in the organization.
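To see why the number of steps matters, consider a simple illustration. The per-step error rate below is an assumed figure, not data from this chapter; the point is only how quickly even a small per-step probability compounds across a multi-step process:

```python
def prob_at_least_one_error(per_step_error_rate: float, steps: int) -> float:
    """Probability that a multi-step process (steps treated as independent)
    contains at least one human error."""
    return 1.0 - (1.0 - per_step_error_rate) ** steps


# Assume, illustratively, a 1% chance of error on any single step.
for steps in (1, 10, 50):
    p = prob_at_least_one_error(0.01, steps)
    print(f"{steps:>3} steps -> {p:.0%} chance of at least one error")
# Roughly: 1 step -> 1%, 10 steps -> 10%, 50 steps -> 39%
```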
Latent conditions are defects in a system that make it error-prone. Arising from the physical
environment and as unintended consequences of decisions made by organizational leaders,
managers, and process designers, latent conditions are the unforeseen blunt end of a system
that has “set people up” to fail at the sharp end. Indeed, investigations of most near misses and preventable adverse events identify multiple causative latent conditions. For example, consider
an investigation of a central line–associated bloodstream infection in the ICU. Several poten-
tial latent conditions for this infection are listed in Table 107.E3. Note that if this investiga-
tion had focused on active failures alone, it would have stopped short and blamed providers
without identifying the underlying latent conditions that allowed the error or preventable
infection to occur.
Latent conditions breed human error through a variety of factors. Knowledge factor impair-
ment makes an individual’s impression of what is happening inaccurate or incomplete. An
example would be an intelligent intern struggling to apply extensive prior “book learning” in
the context of actual clinical practice. Excessive mental workload, fatigue, and distractions
make it difficult to focus attention and maintain an accurate overview of the complex situation
at hand, also known as situational awareness. One example of the latter is the difficulty that an
ICU physician may have in remembering to order and follow up on coagulation factors every
6 hours for a patient on heparin while simultaneously managing multiple other critically ill
patients. Impaired attention also increases the use of heuristics, which are the cognitive short-
cuts we use to increase mental efficiency during times of stress. Although they may increase
productivity in the short term, heuristics also increase certain types of human errors. Lastly,
strategic factors force providers into difficult trade-offs between opposing objectives when time
and resources are limited and risks and uncertainties abound. An example would be deciding
whether or not to give the last open ICU “crash” bed to a medical/surgical floor patient who is
hemodynamically stable, but difficult to manage on the floor because of observation or moni-
toring demands.
Figure 107.1 illustrates that on one side, the expression of human error or expertise at the
sharp end is governed by the evolving demands of the situation being managed and on the
other side by the organizational context in which an individual is operating. Organizational
structures and culture at the blunt end of the system determine what resources and constraints
people at the sharp end experience, powerfully influencing how well they will be able to use
their knowledge, focus their attention, and make decisions during the course of their work
(see Figure 107.1).


TABLE 107.E1  Patient Safety Terms and Examples

Adverse event: A patient is administered heparin and develops HIT.
Unpreventable adverse event: A patient who has no known history of HIT is administered heparin and develops HIT.
Preventable adverse event: A patient with a known history of HIT is administered heparin and develops HIT.
Near miss: A patient is prescribed heparin, but before drug administration the pharmacist notes that the patient has a history of HIT and calls the doctor to cancel the order before it reaches the patient.

HIT, heparin-induced thrombocytopenia (see Chapter 45).

TABLE 107.E2  Human Error in Active Failures: Types and Examples

Omission (forgetting): Physician forgets to order heparin for DVT prophylaxis.
Commission (misperception): Physician recalls that the patient had a recent history of severe hemorrhage (when in fact it was an otherwise similar patient the physician had admitted the same night after being roused from sleep) and therefore does not order heparin for DVT prophylaxis.
Commission (mistake): Physician notes mild thrombocytopenia and thinks it is a contraindication to heparin for DVT prophylaxis, so heparin is not ordered.
Commission (slip): Admitting physician believes he or she has ordered heparin DVT prophylaxis and has actually done so, but for the wrong patient.

DVT, deep venous thrombosis.

TABLE 107.E3  Latent Failures: Domains and Examples Related to a Central Line Infection

Policies, procedures, and processes: Absence of a standardized, evidence-based approach to central line insertion or maintenance.
Information systems: Lack of documentation, or of access to documentation, related to the timing and observation of dressing changes.
Materials and equipment: Unavailability of chlorhexidine or dressing kits needed for central line maintenance; changes of line kits without education of personnel regarding how to use the new kits.
Work environment and architecture: Insufficient lighting or a room setup that made sterile central line insertion and maintenance difficult.


[Figure 107.1 schematic: at the BLUNT END, organizational structure, culture, leadership, and management processes create latent conditions and determine the resources and constraints (physical environment, equipment, staffing levels, training, policies) available at the front line; at the SHARP END, these shape the expression of human expertise or error through knowledge factors, attention factors, and strategic factors, and thus active failures, as operators manage an evolving, dynamic, complex situation.]

Figure 107.1 Sharp-end operators mediate the interaction between evolving work situations and the organizational context (structures and processes). Factors at the blunt end markedly influence the likelihood that an operator will express human expertise or human error. (Based on an image from Woods DD, Dekker SWA, Cook RI, et al: Behind Human Error, 2nd ed. Aldershot, UK: Ashgate, 2010.)

Tools for Error Analysis in the ICU


Root cause analysis (RCA) is a widely used patient safety tool. In this retrospective analysis, latent
conditions, or “root” causes of an error, are identified through a deliberate process that explicitly
moves past proximal, seemingly obvious active failures. In an RCA, a facilitator leads a multidis-
ciplinary team through a review of the event to uncover system failures in an open environment
that explicitly avoids assigning blame.
Diagrams are often used in or created from a root cause analysis, and they can help demon-
strate the relationship between the various factors that contributed to an error or unanticipated
event. Figure 107.2 shows a cause-and-effect diagram for an ICU patient who suffers a gastroin-
testinal bleed related to a heparin dosing error. A discussion focused on active failure would spend
time talking about why the physician ordered the wrong heparin dose. A root cause analysis would
identify multiple contextual factors and latent conditions that allowed this error to occur.
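One latent condition in Figure 107.2, the absence of a weight requirement in the CPOE system, also suggests the kind of system-level countermeasure an RCA might produce. The sketch below is hypothetical: the field names and the dose ceiling are illustrative assumptions, not an actual CPOE rule or heparin protocol.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class HeparinBolusOrder:
    patient_id: str
    bolus_units: float
    documented_weight_kg: Optional[float] = None  # None if no weight is on file


# Illustrative ceiling only; real heparin protocols are weight-based and indication-specific.
MAX_BOLUS_UNITS_PER_KG = 80.0


def hard_stops(order: HeparinBolusOrder) -> List[str]:
    """Return hard-stop messages; an empty list means the order can be signed."""
    stops = []
    if order.documented_weight_kg is None:
        # Forcing function: the order cannot proceed without a documented weight.
        stops.append("A documented patient weight is required before ordering heparin.")
    elif order.bolus_units > MAX_BOLUS_UNITS_PER_KG * order.documented_weight_kg:
        stops.append("Bolus exceeds the weight-based maximum; pharmacist review required.")
    return stops


print(hard_stops(HeparinBolusOrder(patient_id="123", bolus_units=8000)))
# -> ['A documented patient weight is required before ordering heparin.']
```

A hard stop of this kind removes reliance on any one clinician's memory or vigilance, which is the essence of a systems-level fix.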
In addition to dissecting the medical error, RCA teams create action plans for improvement,
assign responsibility for those improvements, and consider metrics that allow the organization
to measure the results of the intervention. Failure Modes and Effects Analysis (FMEA) is a highly
structured, prospective method in which a process is studied to determine what types of errors may
result from its operation in the system and what effects or outcomes the error is likely to cause.
The likely effects of a failure are prioritized according to their severity, frequency, and detect-
ability. This detailed analysis is then used to develop countermeasures and redesign the process
to make it more reliable. Other important tools for patient safety include leadership walk rounds
with frontline staff to observe and learn more about real-world operating conditions and safety concerns, and the use of easily accessible incident reporting systems to promote reporting of near misses and other adverse events. Key to each of these is subsequent follow-up and action, with feedback to and recognition of the frontline staff who have participated in identifying problems. Each of these activities facilitates a culture of safety (discussed later).

[Figure 107.2 cause-and-effect diagram: the effect is that a patient was given an overdose of heparin and suffered a GI bleed. Contributing causes: order placed overnight and not discussed during morning multidisciplinary rounds (communication); no verbal communication between MD and RN regarding the heparin order; overnight nurse did not know that a weight was required for heparin orders (training); the ordering MD was not the primary MD for this patient (handoffs in care); incomplete written and verbal handoff, with no patient weight or specific dose discussed for a high-risk medication (heparin) that may need to be ordered; MD entered a bolus dose of heparin that was not weight based; lack of evidence-based guidelines for heparin administration (policies/procedures); no requirement for patient weight in CPOE when ordering heparin (information systems); no pharmacy review of heparin dosing prior to administration (procedures).]

Figure 107.2 Cause-and-effect diagram for heparin medication error causing patient harm. GI, gastrointestinal; CPOE, computer physician order entry system.
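The FMEA prioritization by severity, frequency, and detectability is often condensed into a single risk priority number, the product of the three scores. The scales and scores below are illustrative assumptions rather than values taken from this chapter:

```python
from dataclasses import dataclass


@dataclass
class FailureMode:
    description: str
    severity: int       # 1 (negligible harm) to 10 (catastrophic harm)
    occurrence: int     # 1 (very rare) to 10 (very frequent)
    detectability: int  # 1 (almost always caught) to 10 (rarely caught)

    @property
    def risk_priority_number(self) -> int:
        # Higher product = higher priority for countermeasures and redesign.
        return self.severity * self.occurrence * self.detectability


# Hypothetical failure modes for an ICU heparin-ordering process.
modes = [
    FailureMode("Bolus ordered without a documented weight", 8, 6, 7),
    FailureMode("Verbal handoff omits pending heparin order", 6, 5, 6),
    FailureMode("Pharmacy review skipped before administration", 7, 3, 5),
]

for mode in sorted(modes, key=lambda m: m.risk_priority_number, reverse=True):
    print(f"RPN {mode.risk_priority_number:>3}: {mode.description}")
```

The failure modes with the highest products would be the first targets for process redesign.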

Error Disclosure
There is always an ethical duty to disclose an error to a patient when it results in harm. Nonetheless, multiple barriers to disclosure remain in health care, including the profound negative emotions (guilt, fear, anger, remorse, and isolation) that physicians experience when their patients are harmed by an error. Compounding these emotions, many physicians are unsure of what to say and how to say it, given the fear of litigation. Addressing these factors can help lower the emotional barriers to disclosure.
To maintain the trust of patients and their families, errors should be disclosed as soon as
possible after the event has occurred. The person whom the patient identifies as being responsible
for his or her care should make the disclosure. This is usually the most senior physician but
may vary depending on the clinical circumstances. For example, an intensivist would most likely
disclose an error in the ICU. However, if the error were related to a surgical intervention that
occurred while the patient was in the ICU, it may best be disclosed by the surgeon or collaboratively
by both the intensivist and the surgeon. Additional information on this topic can be found in
Chapter 109.


Conclusion: Establishing a Culture of Safety in the ICU


A culture of safety is critical to reducing errors and improving ICU performance and is character-
ized by a pervasive preoccupation with identifying and preventing error. Culture is formed by
the attitudes and behaviors of the people who work in a particular area as well as their leaders. A
critical precondition for a culture of safety is a just culture, in which front-line individuals are held
accountable for competent performance, but not the shortcomings of the organization in which
they work. Clinicians have a readily accessible incident reporting system and know that they will
not be punished for using it to report safety concerns. Once such a culture is established, health care providers at all levels of the organization can share accountability for safe, reliable care; feel safe themselves when reporting problems, near misses, and errors; and have both the forum and the expectation to do so.
The analysis of reported incidents and problems generates information that these providers and
their leaders can use to improve patient care.
In other complex industries, organizations that consistently maintain safe performance by
fostering the principles outlined here are termed high-reliability organizations. These organiza-
tions articulate their safety objectives clearly, recognize that people are the preeminent guardians
of the system, and are preoccupied with learning from errors and creating effective error-reduction
strategies. The challenge for medicine is to learn from the success of high-reliability organizations
and adopt their principles in order to achieve the highest levels of safety in our field.

An annotated bibliography can be found at www.expertconsult.com.

Bibliography
Classen DC, Kilbridge PM: The roles and responsibility of physicians to improve patient safety within health
care delivery systems. Acad Med 77:963-972, 2002.
This article outlined challenges to the development of a culture of safety and recommendations for the creation of
high-reliability health care organizations.
Cook RI, Woods DD: Operating at the sharp end: the complexity of human error. In: Bogner M (ed): Human
Error in Medicine. 2nd ed. Hillsdale, NJ: L. Erlbaum Associates, 1994, pp 255-310.
This chapter elucidated Reason’s model of the “sharp” and “blunt” ends of complex systems, dissected how blunt
end factors influence human performance at the sharp end, and described how complex systems fail.
Dekker SW: Patient Safety: A Human Factors Approach. Boca Raton, FL: CRC Press, 2011.
This book defines and explains the role of human factors in medical error and outlines an approach to look for
safety solutions and risk everywhere within the health care system.
Donabedian A: The quality of care: how can it be assessed? JAMA 260:1743-1748, 1988.
In this article the author outlined concepts essential to assessing the quality of medical care and the relationship
between structures, processes, and outcomes.
Gallagher TH, Studdert D, Levinson W: Disclosing harmful medical errors to patients. N Engl J Med
356:2713-2719, 2007.
This article summarized the available evidence around standards in medical error disclosure and recent legal
developments.
Kohn L, Corrigan J, Donaldson M (eds): To Err Is Human: Building a Safer Health System. Washington,
DC: National Academy Press, 2000.
This report raised awareness of the magnitude of errors in medicine and identified faulty systems as the primary
reason for error. It charged health care organizations with identifying and fixing systems flaws.
Lawton R, McEachan R, Giles SJ, Sirriyeh R, Watt IS, Wright J: Development of an evidence-based
framework of factors contributing to patient safety incidents in hospital settings: a systematic review.
BMJ Qual Saf 21(5):369-380, 2012.
This systematic review article described a contributory factors framework to analyze patient safety events in the
hospital setting from a synthesis of the literature to date on the topic.
Marx D: Patient safety and the just culture: a primer for health care executives. New York, NY: Columbia
University, 2001.
This white paper is a primer for organizational leaders on the topic of “Just Culture” which is an approach
to deciding upon how to best hold individuals accountable for mistakes while simultaneously understanding and
accepting the concept of blameless culture for systems error.
Moray N: Error reduction as a systems problem. In: Bogner M (ed): Human Error in Medicine. 2nd ed.
Hillsdale, NJ: L. Erlbaum Associates, 1994, pp 67-91.
This chapter outlined the human and systems constraints and limitations that affect behavior and the likelihood
of human error.
Nolan T: System changes to improve patient safety. BMJ 320:771-773, 2000.
This article delineated an approach to safe systems design informed by human factors and reliability engineering.
Plsek P: Redesigning health care with insights from the science of complex adaptive systems. Crossing the
Quality Chasm, Appendix B. Washington, DC: National Academy Press, 2001, pp 309-322.
This manuscript contrasted mechanical and adaptive systems, described complex adaptive systems, and articulated
the way simple underlying rules govern their performance.
Reason J: Human error: models and management. BMJ 320:768-770, 2000.
In this article, cognitive psychologist James Reason distilled his seminal work on human error, the fundamental
differences between a person and systems approach to error analysis, latent conditions and active failures, and
characteristics, especially redundancy of safeguards, present in high-reliability organizations.
Senders JW: Medical devices, medical errors, and accidents. In: Bogner M (ed): Human Error in Medicine.
2nd ed. Hillsdale, NJ: L. Erlbaum Associates, 1994, pp 159-177.
This chapter discussed the nature of human errors, syntax and taxonomies of error, remedies, and barriers to their
implementation.
Spear S, Bowen HK: Decoding the DNA of the Toyota production system. Harv Bus Rev 77:97-106, 1999.
This article articulated the simple rules used by the high-reliability organization Toyota to govern its complex business.
Wachter RM: Understanding Patient Safety. New York: McGraw Hill, 2008.
This concise primer outlined basic concepts and domains of patient safety.