The 1999 Institute of Medicine (IOM) report To Err Is Human: Building a Safer Health System
estimated that 44,000 to 98,000 patients die annually in the United States as a result of medical
errors. This report showed that errors in medicine were poorly understood and infrequently inves-
tigated, largely because of the culture of hospitals and the medical profession that saw these errors
as the product of human failings. This belief led to a person-focused approach to error investigation,
in which strategies to reduce medical errors involved censuring or re-educating staff about proper
protocols and procedures. However, it is now known that this approach deprives organizations of
the greatest opportunities for meaningful learning and improvement.
Since the IOM report, health care has increasingly been adopting a systems-focused approach to
medical error. In this view, nothing is intrinsically different about medical errors as compared to
errors in other complex, high-risk industries such as aviation or nuclear power generation. Like
medicine, these fields rely heavily on human innovation and expertise to function normally, but
long ago recognized that human errors are symptoms of deeper organizational problems. Like
respiratory failure or heart failure, errors in medicine demand a diagnostic search for underlying
causes and systemic repairs to prevent error recurrence.
This chapter presents a framework in which to understand, investigate, and prevent errors in
the intensive care unit (ICU). These principles are widely used in the field of patient safety and
can be applied to other areas of health care.
Downloaded from ClinicalKey.fr by Elsevier on November 25, 2023. For personal use only.
No other uses without authorization. Copyright ©2023. Elsevier Inc. All rights reserved.
107—MEDICAL ERRORS AND PATIENT SAFETY 959
events; the former are due to error, whereas the latter are not. Errors that do not result in patient
harm are termed near misses and are more common than adverse events. Safety experts recognize
that near misses are equally valuable to study for preventing future errors. Table 107.E1 illustrates
the difference between these terms.
TABLE 107.E3 ■ Latent Failures: Domains and Examples Related to a Central Line Infection
Policies, procedures, and processes: Absence of a standardized, evidence-based approach to
central line insertion or maintenance.
Information systems: Lack of documentation, or of access to documentation, related to the timing
and observation of dressing changes.
Materials and equipment: Unavailability of chlorhexidine or dressing kits needed for central line
maintenance; changes of line kits without educating personnel on how to use the new kits.
Work environment and architecture: Insufficient lighting or a room setup that made sterile central
line insertion and maintenance difficult.
960 6—PROFESSIONALISM AND INTERPERSONAL AND COMMUNICATION SKILLS
[Figure 107.1 diagram: at the blunt end, latent conditions (resources and constraints: physical
environment, equipment, staffing levels, training, policies); at the sharp end, the expression of
human expertise or error (knowledge factors, attention factors, strategic factors; goal conflicts,
obstacles, perceived risks and opportunities), producing active failures within an evolving,
dynamic, complex situation.]
Figure 107.1 Sharp-end operators mediate the interaction between evolving work situations and the organi-
zational context (structures and processes). Factors at the blunt end markedly influence the likelihood that an
operator will express human expertise or human error. (Based on an image from Woods DD, Dekker SWA,
Cook RI, et al: Behind Human Error, 2nd ed. Aldershot, UK: Ashgate, 2010.)
Figure 107.2 Cause-and-effect diagram for a heparin medication error causing patient harm. GI, gastrointestinal;
CPOE, computerized physician order entry system.
concerns and the use of easily accessible incident reporting systems to promote reporting of near
misses and other adverse events. Key to each of these is subsequent follow-up and action, with
feedback to and recognition of the frontline staff who have participated in identifying problems.
Each of these activities facilitates a culture of safety (discussed later).
Error Disclosure
There is always an ethical duty to disclose an error to a patient when it results in harm. Nonetheless,
multiple barriers to disclosure remain in health care, including the profound negative emotions that
physicians experience when their patients are harmed by an error: guilt, fear, anger, remorse, and
isolation. Compounding these emotions, many physicians are unsure of what to say and how to say
it, given the fear of litigation. Addressing these factors can help alleviate the emotional barriers.
To maintain the trust of patients and their families, errors should be disclosed as soon as
possible after the event has occurred. The person whom the patient identifies as responsible
for his or her care should make the disclosure. This is usually the most senior physician but
may vary depending on the clinical circumstances. For example, an intensivist would most likely
disclose an error in the ICU. However, if the error were related to a surgical intervention that
occurred while the patient was in the ICU, it may best be disclosed by the surgeon or collaboratively
by both the intensivist and the surgeon. Additional information on this topic can be found in
Chapter 109.
Bibliography
Classen DC, Kilbridge PM: The roles and responsibility of physicians to improve patient safety within health
care delivery systems. Acad Med 77:963-972, 2002.
This article outlined challenges to the development of a culture of safety and recommendations for the creation of
high-reliability health care organizations.
Cook RI, Woods DD: Operating at the sharp end: the complexity of human error. In: Bogner M (ed): Human
Error in Medicine. 2nd ed. Hillsdale, NJ: L. Erlbaum Associates, 1994, pp 255-310.
This chapter elucidated Reason’s model of the “sharp” and “blunt” ends of complex systems, dissected how blunt
end factors influence human performance at the sharp end, and described how complex systems fail.
Dekker SW: Patient Safety: A Human Factors Approach. Boca Raton, FL: CRC Press, 2011.
This book defines and explains the role of human factors in medical error and outlines an approach to
looking for safety solutions and risks everywhere within the health care system.
Donabedian A: The quality of care: how can it be assessed? JAMA 260:1743-1748, 1988.
In this article the author outlined concepts essential to assessing the quality of medical care and the relationship
between structures, processes, and outcomes.
Gallagher TH, Studdert D, Levinson W: Disclosing harmful medical errors to patients. N Engl J Med
356:2713-2719, 2007.
This article summarized the available evidence around standards in medical error disclosure and recent legal
developments.
Kohn L, Corrigan J, Donaldson M (eds): To Err Is Human: Building a Safer Health System. Washington,
DC: National Academy Press, 2000.
This report raised awareness of the magnitude of errors in medicine and identified faulty systems as the primary
reason for error. It charged health care organizations with identifying and fixing systems flaws.
Lawton R, McEachan R, Giles SJ, Sirriyeh R, Watt IS, Wright J: Development of an evidence-based
framework of factors contributing to patient safety incidents in hospital settings: a systematic review.
BMJ Qual Saf 21(5):369-380, 2012.
This systematic review article described a contributory factors framework to analyze patient safety events in the
hospital setting from a synthesis of the literature to date on the topic.
Marx D: Patient safety and the just culture: a primer for health care executives. New York, NY: Columbia
University, 2001.
This white paper is a primer for organizational leaders on the topic of "Just Culture," an approach to
holding individuals accountable for mistakes while simultaneously understanding and accepting a
blameless response to systems errors.
Moray N: Error reduction as a systems problem. In: Bogner M (ed): Human Error in Medicine. 2nd ed.
Hillsdale, NJ: L. Erlbaum Associates, 1994, pp 67-91.
This chapter outlined the human and systems constraints and limitations that affect behavior and the likelihood
of human error.
Nolan T: System changes to improve patient safety. BMJ 320:771-773, 2000.
This article delineated an approach to safe systems design informed by human factors and reliability engineering.
Plsek P: Redesigning health care with insights from the science of complex adaptive systems. Crossing the
Quality Chasm, Appendix B. Washington, DC: National Academy Press, 2001, pp 309-322.
This manuscript contrasted mechanical and adaptive systems, described complex adaptive systems, and articulated
the way simple underlying rules govern their performance.
Reason J: Human error: models and management. BMJ 320:768-770, 2000.
In this article, cognitive psychologist James Reason distilled his seminal work on human error, the fundamental
differences between a person and systems approach to error analysis, latent conditions and active failures, and
characteristics, especially redundancy of safeguards, present in high-reliability organizations.
Senders JW: Medical devices, medical errors, and accidents. In: Bogner M (ed): Human Error in Medicine.
2nd ed. Hillsdale, NJ: L. Erlbaum Associates, 1994, pp 159-177.
This chapter discussed the nature of human errors, syntax and taxonomies of error, remedies, and barriers to their
implementation.
Spear S, Bowen HK: Decoding the DNA of the Toyota production system. Harv Bus Rev 77:97-106, 1999.
This article articulated the simple rules used by the high-reliability organization Toyota to govern its complex business.
Wachter RM: Understanding Patient Safety. New York: McGraw Hill, 2008.
This concise primer outlined basic concepts and domains of patient safety.