
Audio Transcript

Usability and Human Factors


Electronic Health Records and
Usability
Lecture a

Health IT Workforce Curriculum


Version 4.0/Spring 2016
This material (Comp 15 Unit 6) was developed by Columbia University, funded by the Department of
Health and Human Services, Office of the National Coordinator for Health Information Technology under
Award Number 1U24OC000003. This material was updated by The University of Texas Health Science
Center at Houston under Award Number 90WT0006.

This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc-sa/4.0/.
Slide 1

Welcome to Usability and Human Factors, Electronic Health Records and Usability.
This is Lecture a.
In this unit we will apply principles of usability and design to critiquing EHR systems and
to making recommendations for iterative improvement.

Slide 2

By the end of this unit students will be able to:


1. Describe and define usability as it pertains to the EHR
2. Explain the challenges of EHR design and usability in typical workflow

Slide 3

In 2010, three major organizations, the Agency for Healthcare Research and Quality (AHRQ), the National Research Council (NRC), and the Healthcare Information and Management Systems Society (HIMSS), published reports on usability. The NRC report came first: after a two-year study, in which experts traveled around the country visiting some of the institutions with the best healthcare IT, it concluded that while computer science has adequately met the needs of back-end systems, what is needed is better front-end development that provides cognitive support to clinicians. Usability is a critical part of the user experience.
The other reports also focused on usability, pointing out its influence on errors (which in medicine can be fatal), user satisfaction, and productivity.

Slide 4

In 2014, certification criteria were revised to include safety-enhanced design. In 2015, the United States Department of Health and Human Services, in the final rule published in the Federal Register on October 16, 2015, recommended following specific standards and guidelines for testing and evaluation methods published by the National Institute of Standards and Technology (NIST) and by the International Organization for Standardization.

Slide 5

Users form their impression of software from their experience above all; poor
experiences can lead to profound dissatisfaction (including refusal to use the system),
abuse, dangerous workarounds, and other serious consequences. For example, we
have seen the results of poor usability affect the outcome of elections.

Slide 6

In 2009, HIMSS published ten aspects of usability: simplicity, naturalness, consistency, minimizing cognitive load, efficient interactions, forgiveness, feedback, effective use of language, effective information presentation, and preservation of context.

Slide 7

Current research by the National Center for Cognitive Informatics and Decision Making
in Healthcare (the NCCD) has found that a great user interface follows established
human interface design principles that are based on the way users (doctors, nurses,
patients etc.) think and work. There are 14 general design principles that can be applied
to the development of EHRs, and they are an expansion and elaboration of Nielsen’s 10
principles that are discussed in other units in this component. We will use these
guidelines in the remainder of this unit.

Slide 8

Let’s look at some reasons why EHR usability is not yet optimal. Vendor contracts may forbid customers (even customers of the same EHR) from discussing their experiences. Publication of screenshots and other information may be forbidden by copyright; this hinders research.

Slide 9

The AHRQ report found that many legacy systems currently in use are more than 10
years old, and implementation plans can take decades. Best practices have not been
defined yet, though AHRQ and other associations are working on this. Expectations are
unclear, communication limited, and many vendors do not do formal usability testing, or
only do it to a limited extent. Because of the lack of formal standards and training, usability may be perceived as overly subjective and therefore difficult to measure. As we will show later, this is not the case.

Slide 10

However, the increased interest and focus on this problem means that there is
increasing involvement of users in design. The AHRQ report on vendor practices found
that vendors attempt to compete on usability, users demand better products, and plans
for formal usability testing are increasing. Vendors also say they are amenable to
changing design if given guidelines.

Slide 11

Some users and researchers are discouraged at the extremely poor usability of some
systems, which has led to errors (including fatal errors). Political and power struggles in
implementations can ensue, as the introduction of technology can also change power
relationships, as well as radically alter workflow and work practices. Lack of appropriate
clinician input during design has sometimes resulted in systems that are at best difficult to use and at worst dangerous.

Slide 12

Below are some quotes from the AHRQ report on vendor practices.
“The field is competitive so there is little sharing of best practices to the community. The industry should not look towards vendors to create these best practices. Other entities must step up and define [them] and let the industry adapt.”
“Products are picked on the amount of things they do, not how well they do them.”
“There are no standards most of the time, and when there are standards, there is no enforcement of them. The software industry has plenty of guidelines and good best practices, but in HIT, there are none.”

Slide 13

A study published in 2015 by Ratwani and colleagues found that there was a lack of
adherence to ONC certification requirements. For example, only 22% of the vendor
reports had used at least 15 participants with clinical backgrounds for usability tests.
Ratwani and colleagues stated, “The lack of adherence to usability testing may be a
major factor contributing to the poor usability experienced by clinicians. Enforcement of
existing standards, specific usability guidelines, and greater scrutiny of vendor UCD
processes may be necessary to achieve the functional and safety goals for the next
generation of EHRs.”

Slide 14

Let’s look at some examples of egregious usability problems, prepared by Scot Silverstein, an informatician who had to create mock screenshots (based on real systems) because of copyright restrictions. His website contains more examples.
Some basic examples are the placement of related data far apart, such as one real system that required the user to find the different components of blood pressure (systolic and diastolic) four screens apart. Another example is diagnosis lists that make rare diagnoses more easily clickable than common ones.

Slide 15

Take a look at this mock screenshot. What do you see that is suboptimal or could lead
to error?

Slide 16

Instead of simply highlighting information that should trigger an alert, the system states that there are no indicator flags.

Slide 17

The results section says that the result is negative, and that the result is final. Most busy clinicians would likely stop reading here.

Slide 18

Then, there is an addendum saying that the culture is actually positive for MRSA, a
dangerous infection that often spreads in hospitals.

Slide 19

This sort of bad design has several consequences. It forces clinicians to search for indications of normalcy or danger. It creates a disparity with the lab system, which normally flags abnormal results, and this can lead to miscommunication between personnel. This is a real case in which the patient was not treated for a dangerous infection for 24 hours. The system is CCHIT certified despite the bad design. This example also shows one of the changes from paper to computer: in a paper system the erroneous first result could have been crossed out, preventing the mistake.

Slide 20

This slide shows an alphabetized problem list from a real system. It does not meet the needs of clinicians, who would want to see the problems in order of severity or importance.

Slide 21

The list is created by the system automatically, and the clinician does not have the
ability to edit or delete entries. The entries can be incorrect because many people put
information into the system, and may make selections for convenience, such as the
nurse who entered the atrial fibrillation diagnosis to speed up the order fulfillment.
Unbelievably, the wrong entry can only be removed by the vendor.

Slide 22

Thus the list contains multiple diabetes diagnoses, only one of which is accurate. The lack of a controlled terminology makes term management difficult. The list also includes useless information, such as the 'medication use, long term' item.

Slide 23

This screen shows excessive density, complexity, lack of organization or marking that
could make it easier to read, extraneous information, and general clutter.

Slide 24

This is a grid that the user must scroll to see some of the information. However, when the user scrolls...

Slide 25

...the row and column headers that tell which information belongs to each column disappear. Thus the user must keep track either mentally or (more likely) by placing fingers on the screen. Otherwise it would be easy to lose track of columns or misread information, potentially causing errors.

Slide 26

This screen has excessive, repetitious information that is not needed and is distracting, such as the units included in every cell instead of in the header row. There is a lack of focus and clarity; lab panel components are scattered.

Slide 27

This concludes lecture a of Usability and Human Factors, Electronic Health Records
and Usability.

In this unit we examined vendor practice reports by the Agency for Healthcare Research and Quality. These provided key rules and roles for vendors. In addition, this lecture provided examples of how wrong data can be entered into EHR systems (error). In the next lecture we will continue by discussing usability concepts.

Slide 28 (Reference slide)

No Audio.
Slide 29 (Reference slide)

No Audio.
Slide 30 (Final slide)

No audio.
End.
