
Public health informatics: Policy, design, and evaluation

Rebecca Randell
Professor in Digital Innovations in Healthcare

r.randell@bradford.ac.uk

Structure of lecture

• Public health data collection
• Public health informatics in low resource settings
• UK public health informatics policy
• Designing public health informatics
• Evaluating public health informatics

Public health informatics policy

PHE Strategy 2020-25
Ambitions:
• Predictive prevention: To help provide people, particularly those in vulnerable and disadvantaged groups, with personalised public health interventions that empower them to take greater control of their health and prevent avoidable illness.
• Enhanced data and surveillance capabilities: To
develop a world-leading public health data and
surveillance infrastructure that supports effective
decision-making and action by generating public
health intelligence which is accessible, consistent,
flexible, timely and of high quality.
WHO Global strategy on digital health 2020-2025
• 4 guiding principles, e.g. “Promote the appropriate use
of digital technologies for health” (adaptable to
different countries and contexts, support equity in
access to digital resources – “digital determinants of
health”)
• 4 strategic objectives, e.g. “Advocate people-centred
health systems that are enabled by digital health”
• Framework for action: commit, catalyse, measure,
enhance and iterate

Designing public health informatics

Requirements elicitation
• Requirement: a statement about an intended product that specifies what the product is expected to do or how it will perform
• Requirements specification: used as the basis of design or to select existing software

Defining requirements



Types of requirements
• Functional: what the software should do
• Non-functional (not a definitive list): look, usability, interaction, data, performance, maintainability, support
For example, "the dashboard shall display weekly case counts by region" would be a functional requirement, while "reports must load within a few seconds" would be a non-functional (performance) requirement.


Methods of requirements elicitation
• Requirements analysis,
capturing requirements
• Need to understand users:
capabilities, tasks and
goals, context
• Interviews
• Documentation
• Look at existing software

Interviews for requirements elicitation
• Semi-structured – allowing follow-up
questions
• Questions typically focus on:
– Background (profession, training) and
role
– Normal duties, decisions, and use of
technologies
– Challenges they experience
– What they would want from the new
technology

Translation into requirements
• To identify functional
requirements: tasks,
questions they want to
answer (information
needs)
• To identify non-functional requirements:
what else do they say
about how the system
needs to be?



Personas
• Rich descriptions of typical
users that designers can
focus on and design the
technology for
• Not real people, but realistic
rather than idealised
• Focus on goals, rather than
job roles
• Skills, attitudes, tasks,
environment
• Make them feel real: name,
photograph, personal details,
e.g. what they do in their
spare time



Example persona
• Name: Mary Brown
• Age: 51
• Gender: Female
• Occupation: Clinical Manager
• Hobbies: Reading and Hiking

Mary is a clinical manager at Eden Medical Centre. She has a strong background in nursing and healthcare administration. Mary has over 20 years' experience in public health. She has dedicated her career to ensuring the efficient and compassionate delivery of healthcare services.
Scenarios of use
• Descriptions of common work activities performed by individuals who occupy specific roles in specific contexts
• Describe everyday tasks in plain language



Example scenario of use
Critical Incident: Managing the Effects of a
Snow Storm
It is mid-December and a snow storm has
shut down most public and private
operations in the area. The county covers a
wide geographic area with many
microclimates, so conditions vary from bad
to worse. Many areas are without power.
Roads are blocked by snow and ice. Public
transit is running sporadically or not at all.
Many public health staff members
commute to work via car or bus. Some
staff members live far from the public
health sites where they work. Mary must
use the local health jurisdiction’s
operations support information system to
help manage the effects of the snow storm
at her public health centres (Location A
and Location B). (from Reeder & Turner,
2011)



Challenges of requirements elicitation
• Difference between what people do and what they say – idealised accounts; addressed by observation: seeing the activity you're trying to support in the context of existing tools and technologies and the broader work
• Requirements evolve; addressed by iterative (and ongoing) development



The design cycle

Elicit requirements → Design → Evaluation → (back to requirements)

Co-design

• Working with intended users


• Participatory design, co-production

• Experience-based co-design:
1. Reflection, analysis,
diagnosis, and description
2. Imagination and visualisation
3. Modelling, planning, and
prototyping
4. Action and implementation



SYNC SYSTEM

QualDash

A co-design workshop
• Use the tasks as the
basis for activities
• Gain a better
understanding of tasks
• For each task:
– What information is
used?
– How is the
information
presented?
• Could also add new
tasks
Understanding current tasks/situation
QualDash vs EVEREST project

QualDash – task-driven project

EVEREST – situation-driven project

Understanding current situation
EVEREST project

Understanding current tasks
QualDash

Multiple tasks
• Select the most pressing
questions to be answered
at a glance at a dashboard
• Sketch the layout of a
static dashboard that could
provide the minimally
sufficient information for
answering these tasks
• Select/add follow-up tasks
that arise from these initial
tasks



Understanding how tasks fit together

Iterative co-design

• Prototype: one
manifestation of a design
• Useful for discussing and
evaluating ideas with
intended users
• Low-fidelity prototype:
Paper-based
• High-fidelity prototype:
Looks more like the final
product and provides more
functionality



Low vs high fidelity
• Low:
 Quick and cheap to
produce
 Quick and easy to revise
 Can evaluate multiple
design ideas
 Useful as a
communication device
- Big jump to fully
functional system
- Doesn’t tell you about
usability



Low vs high fidelity
• High:
 (Almost) complete functionality
 Fully interactive – so can assess usability
 Look and feel of intended product (see the sketch below)
- More resource-intensive to develop
- Potential of being mistaken for the final product
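To make this concrete, here is a minimal sketch (not from the original slides) of how a high-fidelity mock-up of a single dashboard view might be put together in code before committing to a full system; it assumes Python with matplotlib, and all counts shown are invented for illustration.

```python
# Illustrative high-fidelity-style mock-up of one dashboard view (hypothetical data).
import matplotlib.pyplot as plt

months = ["Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]
cases = [120, 135, 150, 180, 210, 260]        # hypothetical monthly case counts
completeness = [96, 94, 97, 92, 90, 88]       # hypothetical % of records fully coded

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 3))

# Panel 1: answers the task "how many cases per month?"
ax1.bar(months, cases)
ax1.set_title("Cases per month")
ax1.set_ylabel("Count")

# Panel 2: answers the task "is data completeness meeting the (assumed) 95% target?"
ax2.plot(months, completeness, marker="o")
ax2.axhline(95, linestyle="--")
ax2.set_title("Data completeness (%)")
ax2.set_ylim(80, 100)

fig.suptitle("Draft dashboard view for co-design feedback")
fig.tight_layout()
plt.show()
```

A mock-up like this can be shown to intended users and revised between workshops, while still being clearly distinguishable from the final product.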
Challenges of co-design
• People having time to
participate



Reading list
• Reeder, B. and A. M. Turner (2011). "Scenario-based
design: A method for connecting information system
design with public health operations and emergency
management." Journal of Biomedical Informatics 44(6):
978-988.
• Turner, A. M., B. Reeder and J. Ramey (2013). "Scenarios,
personas and user stories: User-centered evidence-based
design representations of communicable disease
investigations." Journal of Biomedical Informatics 46(4):
575-584.
• Reeder, B., R. A. Hills, G. Demiris, D. Revere and J. Pina (2011). "Reusable design: A proposed approach to Public Health Informatics system design." BMC Public Health 11(1): 1-8.
Evaluating public health informatics

Evaluating public health informatics
Evaluation is a systematic process of assessing a product or process. The purpose is to provide information that will be used in decision-making.



Types of evaluation
• Formative Evaluation: This is conducted
during the development or planning
phase to inform and improve the
project.
• Summative Evaluation: Performed at
the end of a project or program to
measure overall impact and
effectiveness.
• Process Evaluation: Focuses on the implementation and execution of a project to ensure adherence to intended plans.
Three broad categories
• Controlled setting involving users: usability
tests and experiments
• Natural settings involving users
(in-situ/real world evaluation)
• Settings not involving users

Usability evaluation

Usability evaluation
Usability evaluation is the process of assessing the effectiveness, efficiency, and satisfaction with which users can use a product. The goal is to improve the user experience and ensure that the product meets the needs of its intended users.

Types of usability evaluation
• Usability Testing: observe real users as they interact with a product or interface.
• Heuristic Evaluation: experts or evaluators assess a product against a set of predefined usability guidelines. The most common set of guidelines is Nielsen's usability principles.
• Think-Aloud Testing: requires users to verbalise their thoughts while interacting with a product. The purpose is to identify pain points.
Heuristic evaluation
• Form of expert review
• 3-5 experts
• Experts in what?
• Undertake tasks and assess against a set of heuristics (a sketch of how findings can be recorded follows the principles below):
– Nielsen's usability principles
– Heuristics for dashboard visualisations
– Visualisation value framework

Nielsen’s usability principles
1. Visibility of system status
2. Match between system and
the real world
3. User control and freedom
4. Consistency and standards
5. Error prevention

Nielsen’s usability principles
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Aesthetic and minimalist design
9. Help users recognize, diagnose, and recover
from errors
10. Help and documentation
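As an illustration (not from the original slides) of how heuristic evaluation findings are often written up: each problem is recorded against the heuristic it violates, usually with a severity rating (a 0-4 scale is a common choice). The Python sketch below uses entirely hypothetical findings for a public health dashboard.

```python
# Illustrative recording and summary of heuristic evaluation findings (hypothetical data).
from collections import Counter
from dataclasses import dataclass

@dataclass
class Finding:
    heuristic: str     # which usability principle is violated
    description: str   # what the evaluator observed
    severity: int      # 0 = not a problem ... 4 = usability catastrophe (assumed 0-4 scale)

# Hypothetical findings from evaluators reviewing a public health dashboard
findings = [
    Finding("Visibility of system status", "No indication that data are still loading", 3),
    Finding("Match between system and the real world", "Charts use internal codes for regions", 2),
    Finding("Error prevention", "Date filter accepts an end date earlier than the start date", 3),
    Finding("Visibility of system status", "Time of last data refresh is not shown", 2),
]

# Summarise: number of problems per heuristic, then the most severe issues first
print(Counter(f.heuristic for f in findings))
for f in sorted(findings, key=lambda f: f.severity, reverse=True):
    print(f"[{f.severity}] {f.heuristic}: {f.description}")
```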

Visualisation value framework
• 10 heuristics
1. Time savings a
visualisation provides
2. Insights and insightful
questions a visualisation
spurs
3. Overall essence of the
data a visualisation
conveys
4. Confidence about the
data and its domain a
visualisation inspires
Think aloud technique
• Controlled setting
• User given tasks to complete
• User ‘thinks aloud’ as they
complete them
• Sit with the user/record the
user
• How many users?
• What do you want variation in?
e.g. job role, familiarity with
technology, age, health literacy
Additional methods
• Questionnaire
– System Usability Scale (a scoring sketch follows this list)
– Based on TAM: perceived ease of use and perceived usefulness
• Interview or focus group
• Cognitive Walkthrough: evaluators take on the role of users and step through a product, describing their thought processes and actions at each step.
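By way of illustration (not from the original slides): the System Usability Scale produces a 0-100 score from ten responses on a 1-5 scale. Odd-numbered (positively worded) items contribute (response - 1), even-numbered items contribute (5 - response), and the sum is multiplied by 2.5. A minimal Python sketch, assuming one participant's responses are held in a simple list:

```python
# Compute a System Usability Scale score from one participant's ten responses.
def sus_score(responses):
    """Return the SUS score (0-100) for ten responses, each on a 1-5 scale."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses, each between 1 and 5")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # 0-based index: even i = odd-numbered item
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5

# Hypothetical responses from a single participant
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # -> 85.0
```

Bangor et al. (2008), on the reading list below, discuss how such scores can be interpreted.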

Controlled experiment
• Controlled setting
• User given tasks to complete
• Observe/video record
• Assess participants' performance: completion, accuracy, time (see the analysis sketch after this list)
• Minimum of 15 participants
• What do you want variation in?
• Complement with questionnaire/interview/focus group
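A minimal sketch (not from the original slides) of how such performance data might be compared across two versions of a system, assuming Python with scipy available; all numbers are invented for illustration, and the appropriate statistical test depends on the actual study design.

```python
# Illustrative analysis of completion and task time for two interface versions (hypothetical data).
from statistics import mean
from scipy import stats

# Task completion times in seconds for participants who finished the task
times_a = [95, 110, 102, 120, 88, 105, 99, 115, 92, 108, 101, 97, 111, 104, 93]
times_b = [82, 90, 78, 99, 85, 88, 92, 80, 87, 95, 83, 89, 91, 84, 86]

# Completion flags (1 = completed, 0 = abandoned)
completed_a = [1, 1, 1, 0, 1, 1, 1, 1, 0, 1, 1, 1, 1, 1, 1]
completed_b = [1, 1, 1, 1, 1, 1, 0, 1, 1, 1, 1, 1, 1, 1, 1]

print(f"Completion rate A: {mean(completed_a):.0%}  B: {mean(completed_b):.0%}")
print(f"Mean task time A: {mean(times_a):.1f}s  B: {mean(times_b):.1f}s")

# Welch's t-test on task times (one simple option for comparing the two groups)
result = stats.ttest_ind(times_a, times_b, equal_var=False)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
```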
In situ evaluation
• Just go in and see what happens?
• Data collection informed by anticipated benefits but
flexible to respond to unanticipated uses and
impacts
• Fieldnotes
• Video recording
• Audio recording
• Interviews
• Questionnaires, diaries
• BUT time-consuming
Designing an evaluation

• Will users accept this technology?


• How does the system change how people
work?
• How does the system change...?
• How does use of the technology change as
people become more familiar with it, and what
impact does that have?
• Why is the system not having the impact we
expected?
Reading list
• https://www.nngroup.com/articles/ten-usability-heuristics/
• Yen, P.-Y. and S. Bakken (2009). "A comparison of usability evaluation methods: heuristic evaluation versus end-user think-aloud protocol - an example from a web-based communication tool for nurse scheduling." AMIA Annual Symposium Proceedings 2009: 714-718.
• Bangor, A., P. T. Kortum and J. T. Miller (2008). "An Empirical
Evaluation of the System Usability Scale." International Journal of
Human-Computer Interaction 24(6): 574-594.
• Favela, J., M. Tentori and V. M. Gonzalez (2010). "Ecological
validity and pervasiveness in the evaluation of ubiquitous
computing technologies for healthcare." International Journal of
Human-Computer Interaction 26(5): 414-444.
Questions?
