
Journal of the American Medical Informatics Association, 00(0), 2021, 1–10

doi: 10.1093/jamia/ocab010
Review

Toolkits for implementing and evaluating digital health: A systematic review of rigor and reporting

Myron Anthony Godinho,1 Sameera Ansari,1 Guan Nan Guo,1,2 and Siaw-Teng Liaw1

1WHO Collaborating Centre on eHealth (AUS-135), School of Population Health, UNSW Sydney, New South Wales, Australia; 2National Drug and Alcohol Research Centre, University of New South Wales, Sydney, New South Wales, Australia

Corresponding Author: Emeritus Prof. Dr. Siaw-Teng Liaw, Collaborating Centre on eHealth (AUS-135), School of Population Health, UNSW Sydney, NSW 2052, Australia (siaw@unsw.edu.au)

Received 16 September 2020; Revised 8 January 2021; Editorial Decision 11 January 2021; Accepted 15 January 2021

ABSTRACT

Objective: Toolkits are an important knowledge translation strategy for implementing digital health. We studied how toolkits for the implementation and evaluation of digital health were developed, tested, and reported.

Materials and Methods: We conducted a systematic review of toolkits that had been used, field tested, or evaluated in practice, and published in the English language from 2009 to July 2019. We searched several electronic literature sources to identify both peer-reviewed and gray literature, and records were screened as per systematic review conventions.

Results: Thirteen toolkits were eventually identified, all of which were developed in North America, Europe, or Australia. All reported their intended purpose, as well as their development process. Eight of the 13 toolkits involved a literature review, 3 did not, and 2 were unclear. Twelve reported an underlying conceptual framework, theory, or model: 3 cited normalization process theory and 3 others cited the World Health Organization and International Telecommunication Union eHealth Strategy. Seven toolkits were reportedly evaluated, but details were unavailable. Forty-three toolkits were excluded for lack of field testing.

Discussion: Despite a plethora of published toolkits, few were tested, and even fewer were evaluated. Methodological rigor was of concern, as several did not include an underlying conceptual framework, literature review, or evaluation and refinement in real-world settings. Reporting was often inconsistent and unclear, and toolkits rarely reported being evaluated.

Conclusion: Greater attention needs to be paid to rigor and reporting when developing, evaluating, and reporting toolkits for implementing and evaluating digital health so that they can effectively function as a knowledge translation strategy.

Key words: digital health, toolkit, framework, implementation, evaluation, eHealth

© The Author(s) 2021. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For permissions, please email: journals.permissions@oup.com

INTRODUCTION

The 2018 World Health Assembly Resolution 71.7 on digital health recognized the role of digital technologies in achieving universal health coverage and other targets of the Sustainable Development Goals, urging stakeholders "to assess their use of digital technologies for health. . . and to prioritise [their] development, evaluation, implementation, scale-up and greater use. . ."1 To this end, toolkits are being increasingly used to guide the implementation and evaluation of digital health interventions and systems. Toolkits are a knowledge translation (KT) strategy used to communicate messages or share decision aids, tools, or goods to improve health, educate, or change practice or behavior among diverse populations (including patients, carers, clinical and managerial health professionals, policymakers, and community and health organizations) and the health system.2
In 2012, the World Health Organization (WHO) and International Telecommunication Union (ITU) developed the National eHealth Strategy Toolkit, which provides guidance to governments on the tools and processes to be considered in the development, implementation, monitoring, and evaluation of a national strategy.3 This WHO-ITU toolkit has prompted many public and private organizations, including global aid agencies, to design domain-specific toolkits, interactively mapping and addressing the digital health strategy, universal health coverage, and Sustainable Development Goals. Examples of such toolkits are the COCIR eHealth Toolkit by the European Coordination Committee of the Radiological, Electromedical and Healthcare IT Industry4 and the Global Digital Health Index Indicator Guide.5 However, little is known about the success or otherwise of the use of these toolkits. Significant resources are needed to produce such toolkits, underscoring the need that they be of adequate quality and have evidence from user and utility testing and evaluation that they actually "work" in the real world. The WHO Collaborating Centre for eHealth (AUS-135) was tasked with conducting a systematic review on implementation and evaluation of toolkits for digital health, given the emerging evidence and lack of syntheses on this topic. The review is the first part of a wider project to support WHO activities in digital health strategy development, implementation, capacity building, and evaluation.6

Background
A 2014 scoping review of toolkits used to disseminate health knowledge or support practice change found that only 31 (37%) of the 83 toolkits included had been evaluated to any extent.2 The majority (70%) did not specify the evidence base from which they drew, and their effectiveness as a KT strategy was rarely assessed. To truly inform health and health care, toolkits should include comprehensive descriptions of their content, be explicit regarding content that is evidence based, and include an evaluation of their effectiveness as a KT strategy, addressing both clinical and implementation outcomes.2 This message is reinforced by a 2015 systematic review that concluded that toolkits should be informed by high-quality evidence and theory, and should be evaluated using rigorous study designs to explain the factors underlying their effectiveness and successful implementation.7 A recent qualitative study of clinic and community perceptions of a general intervention toolkit asserted that "unless the toolkit is used, it won't help solve the problem." The authors recommended that studies be conducted to determine when and how toolkits are used, and that funders, policymakers, researchers, and leaders in primary care and public health allocate resources to foster toolkit development, testing, implementation, and evaluation.8

A cursory search of the literature found no reviews of digital health toolkits, including for implementation or evaluation. To the best of our knowledge, there are currently no defined standards or critical appraisal checklists for measuring or evaluating the quality of digital health toolkits.

Defining key terms
The Agency for Healthcare Research and Quality (AHRQ) defines a toolkit as "a collection of related information, resources, or tools that together can guide users to develop a plan or organise efforts to follow evidence-based recommendations or meet evidence-based specific practice standards."9 The AHRQ also defines a tool as "an instrument (e.g., survey, guidelines, or checklist) that helps users accomplish a specific task that contributes to meeting a specific evidence-based recommendation or practice standard." As a KT strategy, toolkits should be concise, understandable, and clearly focused, to enable the user to understand and operationalize or evaluate digital health strategies through a clear step-by-step process.

As defined by the WHO Global Strategy on Digital Health 2020-2025, digital health is "the field of knowledge and practice associated with the development and use of digital technologies to improve health."10 It is "a broad umbrella term encompassing eHealth (which includes mHealth)" and "expands the concept of eHealth to include digital consumers, with a wider range of smart and connected devices, such as the Internet of Things, advanced computing, big data analytics, artificial intelligence (AI) including machine learning, and robotics."10

Review objectives
For toolkits to function as a robust and trustworthy KT strategy, they should (1) be developed with adequate methodological rigor and (2) transparently report the process of their development, testing, and refinement.

This review focused on knowledge-to-action digital health toolkits, studying their development process (for both rigor and reporting), their intended use, and their level of application. We also considered whether toolkit evaluations provided evidence for their effectiveness as a KT strategy, and thus identified determinants of successful toolkits. The review addressed the following questions about toolkits for implementation and evaluation of digital health that have been practically used, tested, or evaluated in any way:

1. What are the toolkit's characteristics?
• What is the purpose of the toolkit?
• Who is it meant to be used by?
• Which operational level is it intended to be applied at?
2. Is the toolkit development process methodologically rigorous and adequately reported?
• Is the toolkit informed by a literature review or evidence synthesis?
• Is the toolkit based on an underlying conceptual framework?
• Is the toolkit informed by expert consensus?
• Has the toolkit been evaluated?
• Has the toolkit reported the process of its development, testing, evaluation, and refinement?

MATERIALS AND METHODS

This systematic review was conducted from March 2019 to May 2020. Our methods were informed by the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines, and modified to suit this review's meta-research objectives.11

Search methods and strategy
We searched several electronic databases for literature published from 2009 to July 2019, including Global Health, Scopus, ProQuest, Web of Science, and PubMed. Relevant gray literature (eg, technical reports, dissertations, patents, meeting reports, annual reports, government publications) was also identified using Google Advanced Search, and the first 10 pages of relevance-sorted results were perused for relevant material. Other highly relevant sources in the public domain that were specifically searched included: the WHO Institutional Repository for Information Sharing, Digital Impact Alliance,12 Asian Development Bank, Asia eHealth Information Network, MEASURE Evaluation and Health Data Collaborative, Digital Square Global Goods,13 Health Metrics Network, AHRQ, COCIR, Joint Learning Network, Healthcare Information and Management Systems Society, U.S. Office of the National Coordinator, and The Open Group. Gray literature is being identified on an ongoing basis, as part of the WHO Collaborating Centre activities.6

Records were pooled using Covidence to create a single literature corpus and duplicates were removed.14 The search strategy comprised groups of similar search terms combined into strings using Boolean operators, for example, (toolkit OR tool) AND (eHealth OR mHealth OR digital health) AND (evaluation OR assessment). The search strings are listed in Supplementary Appendix 1.
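A query of this shape can be composed programmatically before being adapted to each database's syntax. The sketch below is a minimal illustration of the Boolean structure described above, not the review's actual tooling; the term groups are abridged from the example, and the full search strings are in Supplementary Appendix 1.

```python
# Illustrative sketch only: compose a Boolean search string from term groups.
# Term groups are abridged from the example in the text; the full strings
# used in the review are listed in Supplementary Appendix 1.
ARTEFACT_TERMS = ["toolkit", "tool"]
DOMAIN_TERMS = ["eHealth", "mHealth", "digital health"]
ACTIVITY_TERMS = ["evaluation", "assessment"]

def or_group(terms):
    """Join synonyms with OR, quoting multiword phrases."""
    quoted = [f'"{t}"' if " " in t else t for t in terms]
    return "(" + " OR ".join(quoted) + ")"

query = " AND ".join(or_group(g) for g in [ARTEFACT_TERMS, DOMAIN_TERMS, ACTIVITY_TERMS])
print(query)
# (toolkit OR tool) AND (eHealth OR mHealth OR "digital health") AND (evaluation OR assessment)
```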
Screening process
Search results were pooled in an EndNote X7 library (Clarivate, Philadelphia, PA) for reference management and exported to Covidence for screening. Duplicates were removed using both EndNote X7 and Covidence. All records pertaining to the same toolkit and its testing or evaluation were grouped together. Records were screened for inclusion in 2 stages: (1) title and abstract and (2) full text. Using Covidence, each record was screened by 2 reviewers working independently, choosing to include or exclude records based on selection criteria. During title and abstract screening, records were excluded or progressed to full-text screening if both reviewers agreed. During full-text screening, records were excluded or included if both reviewers agreed. Disagreements between reviewers at either screening stage were resolved through arbitration and consensus with a third reviewer.
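The agreement rule applied at both stages can be summarized as a small decision function. This is a hypothetical sketch of the logic described above (names and structure are ours), not software used in the review.

```python
# Hypothetical sketch of the dual-review decision rule described above:
# two independent votes per record, with a third reviewer arbitrating
# disagreements at either screening stage.
def screening_decision(vote_a: str, vote_b: str) -> str:
    """Each vote is 'include' or 'exclude'."""
    if vote_a == vote_b:
        return vote_a          # unanimous: apply the shared decision
    return "arbitrate"         # conflict: escalate to a third reviewer

assert screening_decision("include", "include") == "include"
assert screening_decision("include", "exclude") == "arbitrate"
```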
Selection criteria
We included any toolkit for the implementation and evaluation of digital health that had been used, field tested, or evaluated in practice. We excluded non-English literature, conference abstracts, reviews, practice guidelines, and checklists. Conceptual frameworks are sometimes described as a tool or toolkit because they are used to guide approaches to implementation and evaluation; we restricted our selection to the AHRQ instrumental definition of tool and toolkit and excluded conceptual frameworks from this review.

Data extraction and quality appraisal
A data extraction template was developed to capture data of relevance to the review's questions. Two independent reviewers extracted data from each full text, and all authors reviewed completed data extraction sheets and resolved disagreements through discussion and consensus. As there are currently no critical appraisal checklists for toolkit quality, during data extraction we captured any available information regarding toolkit development methods to serve as an indicator of methodological rigor. Specifically, this included the conduct of a literature review, the incorporation of expert consensus, and the use of toolkit field testing (eg, pilot testing, use, evaluation). Recognizing that there are currently no reporting criteria for toolkits, we extracted the reporting characteristics of toolkits that reported their development processes, an approach informed by our prior work on reporting completeness.15 The details captured by the data extraction form fields are outlined in Table 1.
RESULTS

Of the 1473 records sourced from searches of electronic databases, gray literature, and perusal of relevant journals, 138 duplicates were removed and 1335 records were screened at the title and abstract stage (Figure 1). Of these, 1231 records were excluded and 104 articles proceeded to full-text screening. During full-text screening, 88 full-text articles were excluded (reasons for these are outlined in Figure 1) and 16 full-text articles (13 toolkits) were included for data extraction.
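The selection counts reconcile arithmetically at each stage of the flow, which can be verified directly; this is a trivial sanity check of the figures above, not part of the review's methods.

```python
# Sanity check of the study-selection counts reported above
# (not part of the review's methods; it just verifies the arithmetic).
identified, duplicates = 1473, 138
screened = identified - duplicates          # 1335 title/abstract records
excluded_ti_ab = 1231
full_text = screened - excluded_ti_ab       # 104 full-text articles
excluded_full_text = 88
included = full_text - excluded_full_text   # 16 articles (13 toolkits)
assert (screened, full_text, included) == (1335, 104, 16)
```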
The earliest toolkit we included was developed in 2010, with more being developed each subsequent year (Table 2). All toolkits reported their intended purpose: 7 were for digital health implementation, 11 for digital health evaluation, and 5 addressed both. While toolkits for use at the national or sectoral level were developed by international agencies, the U.S. Agency for International Development developed most of the toolkits for health organization use. Almost all the toolkits were developed in North America or Europe, except for one from Australia.

Methods of development
All of the 13 included toolkits reported their development process (Table 3), and all involved expert consensus. Eight of 13 toolkits reported that their development process involved a literature review, while 3 did not, and a further 2 were unclear. Twelve reported basing the toolkit on an underlying conceptual framework, theory, or model; of these, 3 cited the use of normalization process theory and 3 others cited the WHO-ITU eHealth Strategy. Only 1 toolkit did not report using an underlying framework. The intended user groups for each toolkit included government policymakers, implementing partners, donors, system managers, system users, and care program managers. Because evidence of the toolkit being used or field tested in practice was required for inclusion, all the toolkits we included necessarily reported this. Seven toolkits were reportedly evaluated,16–20,27,28,32–35,41–45 but detailed evaluation results were only available for 3 toolkits16–20,27,28 and, when available, addressed usability, relevance, and facilitators and barriers to toolkit use. The PRISM (Performance of Routine Information System Management) toolkit is a good example of toolkit development and reporting, as it is based on a literature review, an underlying framework, and stakeholder consensus,18 and has been evaluated in several real-world settings, the findings of which have been reported.16,17 The extracted data for included toolkits are available in Supplementary Appendix 2.
Table 1. Data extraction form fields and description

Field | Description
Toolkit | What is the toolkit called?
Year | In which year was the toolkit published?
Organization(s) | Which organization(s) developed the toolkit?
Summary (rationale and scope) | A brief summary of the toolkit's rationale, purpose, aim/scope, and tools/components, if described.
Purpose: Implementation | Is the toolkit intended for use in implementation?
Purpose: Evaluation | Is the toolkit intended for use in evaluation?
Target user group | Who are the target user group, as specified by the toolkit itself?
Toolkit development overview | Description of the stages and processes involved in toolkit development
Reported process | Did the toolkit report its development process?
Literature review | Did the development process include a literature review or evidence synthesis?
Conceptual framework | Did the development process include a conceptual framework?
Expert consensus | Did the development process include expert consensus?
Field testing | Did the development process include field testing?
Evaluation | Was the toolkit evaluated?
Organizational level | Which organizational level does the toolkit address? (ie, the national health system level, health facility level, or operational/staff/user level)
Figure 1. Flow diagram of study selection process. Ab: Abstract; DH: digital health; DHI: digital health implementation; Ti: Title.

DISCUSSION

What we learned
Very few of the potentially relevant toolkits met the inclusion criterion of field testing, let alone evaluation; this could indicate a lack of field testing or poor reporting of field testing. The description of the development of the toolkits was often inconsistent and unclear, possibly reflecting an inconsistent methodology.

The toolkits for implementing and evaluating digital health broadly fell into 3 categories, based on the organizational level they applied to: the national health system level, health organization or facility level (eg, health information system), or user level (eg, individual or health professional) (Table 4). These align closely with the organizational levels identified by the "Framework of e-health for improved health service delivery,"47 which might offer a common conceptual framework to enable a multiperspective and interprofessional approach to implementation and evaluation across the levels. Furthering this line of thinking, more coordinated implementation of digital health could be achieved by better conceptual synchronicity between toolkits at all levels.

Following the release of the WHO-ITU National eHealth Strategy Toolkit, many domain-specific operational and technical toolkits have emerged and are continuing to emerge. The plethora of largely untested toolkits we excluded risks "cognitive overload" among intended users; a limited number of focused, instructive, well-tested, and clearly reported toolkits would likely be a more effective KT strategy to guide implementation of digital health interventions and systems. Our findings indicate that increasingly more toolkits were made available over time. In light of the burgeoning demand for health services,48,49 many more such toolkits will likely be developed as emerging digital technologies (eg, artificial intelligence, robotics, "-omics") are more widely adopted to support increasingly integrated models of care.50–53 However, generic frameworks or guidance for methodologically rigorous toolkit development are few and far between. The AHRQ offers some guidelines,9 but these mainly relate to reporting and communication quality, rather than to rigor.

Recommendations for developing and reporting a digital health toolkit
For toolkits to successfully fulfill their purpose as a tool for KT, it is imperative that they demonstrate adequate methodological rigor, to convey "true" knowledge (epistemology) about a given implementation reality (ontology). This process should also include their evaluation, which ought to test and demonstrate the toolkits' real-world applicability and usability. Furthermore, they should be trustworthy, and transparently and completely report their development process. Users should understand how a toolkit was developed in order to meaningfully appraise its relevance and applicability to their context and situation.

Table 2. Characteristics of included toolkits (listed chronologically)

Year | Toolkit, Case Examples, and Evaluation | Institution | Purpose | Summary
2010-2019 | Performance of Routine Information System Management (PRISM) Toolkit16–18 | MEASURE Evaluation | Implementation and evaluation | Conceptual framework and associated data collection and analysis tools to assess, design, strengthen, and evaluate RHIS.
2011 | eHealth Implementation toolkit19,20 | eHealth Unit, University College London | Implementation and evaluation | Toolkit to enable senior staff to analyze challenges likely to arise when implementing an e-health initiative. Intended to promote critical thinking, not to replace it (ie, not a "tick-box" tool).
2012 | National eHealth Strategy Toolkit3,21–24 | WHO-ITU | Implementation | Comprehensive practical guidance for development of a national eHealth strategy, action plan, and monitoring framework.
2015 | mHealth Health Assessment and Planning for Scale (MAPS) toolkit25,26 | WHO | Implementation and evaluation | Comprehensive self-assessment and planning guide to improve the potential for scaling up and achieving long-term sustainability in mHealth.
2016 | HIS assessment support tool27,28 | WHO Regional Office for Europe | Evaluation | European-specific version of the National HIS Assessment toolkit,29 which uses the HMN framework to guide HIS evaluation, for the achievement of HMN goals.
2017 | Evaluating Person-Centred Digital Health and Wellness at Scale30 | Computer and Information Sciences, University of Strathclyde | Implementation and evaluation | Flexible toolkit used to evaluate an evolving large-scale, national digital health project.
2017 | Informatics Capability Maturity Toolkit31 | Academic GP Unit, UNSW Medicine | Evaluation | Assisted self-assessment tool to reflect on and document the informatics capability maturity of a health facility.
2018 | Routine HIS Rapid Assessment Tool32,33 | MEASURE Evaluation | Evaluation | Toolkit for assisting health information system managers to identify RHIS gaps using global standards, and to identify where resources should be invested for system strengthening.
2018 | eHealth Literacy Assessment Toolkit34,35 | Department of Public Health, University of Copenhagen | Implementation and evaluation | Toolkit for assessing individuals' health literacy and digital literacy across 7 dimensions, using a mix of existing and newly developed scales.
2019 | Coordinating Digital Transformation Toolkit36–40 | Digital Square, PATH | Implementation | Toolkit for implementing practical strategies, enabling approaches, and best practices for successfully coordinating the digital health sector.
2019 | Global Digital Health Index46 | Global Digital Health Index | Evaluation | Index for tracking, monitoring, and evaluating progress in digital health technology at the country level across the 7 components of the WHO-ITU eHealth strategy framework.
2019 | HIS Interoperability Maturity (HISIM) Toolkit41–43 | Health Data Collaborative and MEASURE Evaluation | Evaluation | Self-administered toolkit designed to monitor, evaluate, and report on domains and components required for a country's digital HIS to exchange data (interoperate) with other health systems, to inform a plan for a strong, responsive, and sustainable national HIS.
2019 | HIS Stages of Continuous Improvement (HISSCI) Toolkit44,45 | Health Data Collaborative and MEASURE Evaluation | Evaluation | Toolkit to help countries or organizations holistically assess, plan, and prioritize interventions and investments to strengthen an HIS.

HIS: health information system; HMN: Health Metrics Network; ITU: International Telecommunication Union; mHealth: mobile health; RHIS: routine health information system; WHO: World Health Organization.

Starting by asking, "What makes a good toolkit?," we build on this logic, pooling together our study findings and guidance on guideline development,54 to propose a preliminary standard approach to developing, testing, and reporting toolkits for implementing and evaluating digital health interventions (Table 5). Developing a good toolkit should include robust methodological rigor and complete and transparent reporting. These, in turn, are characterized by several characteristics (Table 5, left column) that can be operationalized through the recommendations provided (Table 5, right column). These recommendations are not intended as gospel, but rather as a starting point for discourse among the digital health research and implementation community on this important matter. By this review's own critical standards, we recommend that these be subjected to expert review and consensus, to create guiding criteria for developing and reporting DHI toolkits. As there are currently no critical appraisal checklists for toolkit quality, we recommend that similar principles be used to inform the development of a standardized tool for assessing toolkits' quality and rigor, with a scoring system, for example (Table 5).
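To make the idea of a scoring system concrete, one could count the number of Table 5 criteria that a toolkit satisfies. The sketch below is entirely hypothetical (criterion names paraphrase Table 5); as the review notes, no validated appraisal instrument for toolkits currently exists.

```python
# Entirely hypothetical sketch of a simple scoring system for toolkit
# rigor and reporting, paraphrasing the Table 5 criteria; no such
# validated instrument currently exists.
RIGOR_CRITERIA = [
    "philosophically_grounded", "evidence_based", "theoretically_grounded",
    "participatory_methods", "testing_and_refinement",
    "toolkit_synthesis", "toolkit_evaluation",
]
REPORTING_CRITERIA = [
    "underlying_logic", "trustworthy", "instructional_clarity",
    "personalized_communication", "applicability",
]

def appraise(toolkit: dict) -> tuple[int, int]:
    """Return (rigor score, reporting score) as counts of criteria met."""
    rigor = sum(bool(toolkit.get(c)) for c in RIGOR_CRITERIA)
    reporting = sum(bool(toolkit.get(c)) for c in REPORTING_CRITERIA)
    return rigor, reporting

# Example: a toolkit with a literature review, a framework, and clear instructions
print(appraise({"evidence_based": True, "theoretically_grounded": True,
                "instructional_clarity": True}))   # -> (2, 1)
```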

Limitations
Our study did not include non-English literature; including literature published in other languages would add relevance and validity to this study.

CONCLUSION

The findings of this review raise concerns regarding the potential of toolkits to effectively facilitate digital health implementation and evaluation. Despite a plethora of published digital health toolkits, very few demonstrated their application in real-world testing, and even fewer had any evidence of having been evaluated. Of those that included demonstrated use cases, methodological rigor was of concern, as several did not include an underlying conceptual framework, literature review, or evaluation and refinement in real-world settings. Reporting of approaches and methods was often inconsistent and unclear, and toolkits rarely reported being evaluated. Toolkit development should exhibit greater methodological rigor, whether in the evidence base (eg, literature review), theoretical grounding (ie, underlying conceptual framework), participatory approach (eg, co-creation, consensus processes), or in the testing, evaluation, and refinement of the toolkits in real-world settings.

As a vital component of the widespread, global rollout of digital health, it is imperative that, as a knowledge translation strategy, toolkits fulfill their function efficiently and effectively. Greater attention needs to be paid to developing, evaluating, and reporting toolkits to ensure that they effectively perform their intended function.

FUNDING
None.

AUTHOR CONTRIBUTIONS
MAG led the review and wrote the first draft of the manuscript. SA coordinated the review and conducted the database searches. All authors conceptualized the work, contributed to data collection, participated in screening search results, critically revised the manuscript, approved the final version to be published, and agree to be accountable for all aspects of the work. MAG, S-TL, and SA participated in data extraction. S-TL guided the overall direction of the work.
Table 3. Toolkit development approach and methods

Year | Toolkit | Literature Review | Expert Consensus | Underlying Framework | Intended User Group | Evaluated
2010-2019 | PRISM Toolkit16–18 | Yes | Yes | PRISM framework | Managers, system evaluators, and policymakers | Yes
2011 | eHealth Implementation toolkit19,20 | Yes | Yes | Normalization process theory | Senior staff and managers of eHealth implementation | Yes
2012 | National eHealth Strategy Toolkit3,21–24 | Unclear | Yes | WHO-ITU framework | Policymakers | No
2015 | MAPS toolkit25,26 | Yes | Yes | MAPS framework | Project managers and project teams | No
2016 | HIS assessment support tool27,28 | No | Yes | European Health Information Initiative framework | HIS evaluators | Yes
2017 | Evaluating Person-Centred Digital Health and Wellness at Scale30 | No | Yes | Normalization process theory | Care managers | No
2017 | Informatics Capability Maturity Toolkit31 | Yes | Yes | Capability maturity models | Care providers and managers of health organizations | No
2018 | Routine HIS Rapid Assessment Tool32,33 | Yes | Yes | Health Facility and Community Information System Standards | Managers of HIS, programs, and data | Yes
2018 | eHealth Literacy Assessment Toolkit34,35 | No | Yes | The eHealth Literacy Framework | eHealth intervention users | Yes
2019 | Coordinating Digital Transformation Toolkit36–40 | Yes | Yes | Not reported | Governments, implementing partners, donors | No
2019 | Global Digital Health Index46 | Unclear | Yes | WHO-ITU framework | Policymakers | No
2019 | HIS Interoperability Maturity (HISIM) Toolkit41–43 | Yes | Yes | Principles of digital development; several maturity models and assessment tools | Policymakers in low-resource settings | Yes
2019 | HIS Stages of Continuous Improvement (HISSCI) Toolkit44,45 | Yes | Yes | National eHealth Strategy Toolkit (WHO-ITU); Health Metrics Network Assessment (WHO); HIS Strengthening Model (MEASURE Evaluation); Demand and Readiness Tool | Policymakers | Yes

HIS: health information system; ITU: International Telecommunication Union; MAPS: mHealth Health Assessment and Planning for Scale; PRISM: Performance of Routine Information System Management; WHO: World Health Organization.
Table 4. Generic classification of toolkits by organizational level

Organizational Level | Toolkit
Health system level | Coordinating Digital Transformation Toolkit36–40; Global Digital Health Index46; National eHealth Strategy Toolkit3,21–23
Health organization/facility level | HIS Interoperability Maturity (HISIM) Toolkit41–43; HIS Stages of Continuous Improvement (HISSCI) Toolkit44,45; Routine HIS Rapid Assessment Tool32,33; HIS assessment support tool27,28; Performance of Routine Information System Management (PRISM) Toolkit16,17; Informatics Capability Maturity Toolkit31
Users (care providers and patients) level | eHealth Literacy Assessment Toolkit34,35; Evaluating Person-Centred Digital Health and Wellness at Scale30; mHealth Health Assessment and Planning for Scale (MAPS) toolkit25,26; eHealth Implementation toolkit19,20

HIS: health information system.

Table 5. Preliminary considerations in developing toolkits for implementing and evaluating digital health

What Makes a Good Toolkit? | Recommendations for Digital Health Intervention Toolkits

Rigor
• Philosophically grounded | Make the philosophical stance, assumptions, and motivations explicit.
• Evidence based | Include a literature review.
• Theoretically grounded | Include a conceptual framework, model, or theory. Describe the framework-building process.
• Participatory methods (co-creating toolkits with users) | Include all stakeholders in the development process, especially intended user groups (eg, during needs assessment, establishing expert consensus). Use a methodology and study design that can incorporate appropriate methods (eg, case study or mixed methods approaches).
• Testing and refinement | Describe the type of testing that the toolkit underwent (eg, usability testing, pilot testing, expert review, focus group with users). Describe the context in which the toolkit was tested, and how generalizable this is to other contexts (eg, realist evaluation). Explain how the findings of testing informed the toolkit or framework development process.
• Toolkit synthesis | Describe the research design and logic being used for framework synthesis. Outline the resources used in developing the toolkit. Clearly detail the case(s) being used to develop and test the toolkit. Ensure alignment of objectives between review, expert consensus, and evaluation. Describe the approach used for triangulation of findings from different sources, and integration (synthesis) of findings into the final framework.
• Toolkit evaluation | Outline how the toolkit was evaluated (ie, the approach, study design, program logic, etc.). Describe the unintended consequences of the toolkit's use in the context of complex adaptive systems (ie, both benefits and harms). Describe how and why the toolkit was deemed to be "fit" for its intended purpose.

Reporting
• Underlying logic and reasoning | Report the toolkit's purpose and motivation.
• Trustworthy | Explicitly report the development process in context.
• Instructional clarity | Clearly describe the step-by-step process of how to use the toolkit.
• Personalized communication | Ensure clarity of communication to all intended users.
• Applicability | Ensure that the toolkit meets the needs of the system and/or individual users; of organizational users (ie, clinical, managerial, and technical staff); and of community and/or citizen users.
DATA AVAILABILITY STATEMENT
The data underlying this article are available in the article and in its online supplementary material.

ACKNOWLEDGMENTS
We thank Dr Jitendra Jonnagaddala and Dr Padmanesan Narasimhan for their insights toward conceptualizing the review, and Ms Donna Medeiros for her recommendations on sources of toolkits.

CONFLICT OF INTEREST STATEMENT
This review was conducted to meet the Terms of Reference for the WHO Collaborating Centre for eHealth (AUS-135).

REFERENCES
1. World Health Organization. Seventy-First World Health Assembly, Agenda item 12.4, Digital Health. 2018. https://apps.who.int/gb/ebwha/pdf_files/WHA71/A71_R7-en.pdf Accessed September 15, 2020.
2. Barac R, Stein S, Bruce B, Barwick M. Scoping review of toolkits as a knowledge translation strategy in health. BMC Med Inform Decis Mak 2014; 14 (1): 121.
3. World Health Organization and International Telecommunication Union. National eHealth Strategy Toolkit. 2012. https://apps.who.int/iris/bitstream/handle/10665/75211/9789241548465_eng.pdf?sequence=1&isAllowed=y Accessed September 15, 2020.
4. European Coordination Committee of the Radiological, Electromedical, and Healthcare IT Industry. COCIR eHealth Toolkit. Integrated Care: Breaking the Silos. 2015. https://www.cocir.org/uploads/media/15013.COC_2.pdf Accessed September 15, 2020.
5. Global Digital Health Index. Global Digital Health Index Indicator Guide. https://static1.squarespace.com/static/5ace2d0c5cfd792078a05e5f/t/5c1153d1352f53f8337b8dfb/1544639443105/GDHI-Indicator+Guide.pdf Accessed September 15, 2020.
6. WHO Collaborating Centre for eHealth. Terms of Reference: AUS 135 WHO Collaborating Centre for eHealth. 2019. https://sphcm.med.unsw.edu.au/sites/default/files/sphcm/Centres_and_Units/eHealth_terms_reference.pdf Accessed March 20, 2019.
7. Yamada J, Shorkey A, Barwick M, Widger K, Stevens BJ. The effectiveness of toolkits as knowledge translation strategies for integrating evidence into clinical care: a systematic review. BMJ Open 2015; 5 (4): e006808.
8. Davis MM, Howk S, Spurlock M, McGinnis PB, Cohen DJ, Fagnan LJ. A qualitative study of clinic and community member perspectives on intervention toolkits: "Unless the toolkit is used it won't help solve the problem." BMC Health Serv Res 2017; 17 (1): 497.
9. Agency for Healthcare Research and Quality (AHRQ). AHRQ Publishing and Communications Guidelines, Section 6: Toolkit Guidance. 2013. https://www.ahrq.gov/sites/default/files/publications/files/pcguide6.pdf Accessed September 15, 2020.
10. World Health Organization. Global Strategy for Digital Health 2020-2024. 2019. https://extranet.who.int/dataform/upload/surveys/183439/files/Draft%20Global%20Strategy%20on%20Digital%20Health.pdf Accessed September 15, 2020.
11. Ansari S, Godinho MA, Jonnagaddala J, Narasimhan P, Guo G, Liaw S-T. A systematic review of toolkits for implementation and evaluation of digital health interventions. 2019. https://www.crd.york.ac.uk/prospero/display_record.php?ID=CRD42019147273 Accessed September 15, 2020.
12. Digital Impact Alliance. Principles for Digital Development. 2019. https://digitalprinciples.org/ Accessed September 15, 2020.
13. Digital Square. Global Goods Guidebook: PATH. 2019. https://digitalsquare.org/s/Global-Goods-Guidebook_V1.pdf Accessed September 15, 2020.
14. Veritas Health Innovation. Covidence systematic review software. 2019. www.covidence.org Accessed September 15, 2020.
15. Godinho MA, Gudi N, Milkowska M, Murthy S, Bailey A, Nair NS. Completeness of reporting in Indian qualitative public health research: a systematic review of 20 years of literature. J Public Health 2019; 41 (2): 405–11.
16. Hotchkiss DR, Aqil A, Lippeveld T, Mukooyo E. Evaluation of the performance of routine information system management (PRISM) framework: evidence from Uganda. BMC Health Serv Res 2010; 10 (1): 188.
17. MEASURE Evaluation. PRISM Case Studies: Strengthening and Evaluating RHIS. Chapel Hill, NC: University of North Carolina at Chapel Hill; 2008.
18. Aqil A, Lippeveld T, Hozumi D. PRISM framework: a paradigm shift for designing, strengthening and evaluating routine health information systems. Health Policy Plan 2009; 24 (3): 217–28.
19. MacFarlane A, Clerkin P, Murray E, et al. The e-health implementation toolkit: qualitative evaluation across four European countries. Implement Sci 2011; 6 (1): 122.
20. Murray E, May C, Mair F. Development and formative evaluation of the e-Health Implementation Toolkit (e-HIT). BMC Med Inform Decis Mak 2010; 10 (1): 61.
21. Darcy N, Elias M, Swai A, Danford H, Rulagirwa H, Perera S. eHealth strategy development: a case study in Tanzania. J Health Inform Africa 2014; 2 (2). doi: 10.12856/JHIA-2014-v2-i2-107
22. Ali S. Formulation of a National e-Health Strategy Development Framework for Pakistan [master's thesis]. Calgary, Alberta, Canada: University of Calgary; 2013.
23. Riazi H, Jafarpour M, Bitaraf E. Towards national eHealth implementation–a comparative study on WHO/ITU National eHealth Strategy Toolkit in Iran. Stud Health Technol Inform 2014; 205: 246–50.
24. Hamilton C. The WHO-ITU national eHealth strategy toolkit as an effective approach to national strategy development and implementation. Stud Health Technol Inform 2013; 192: 913–6.
25. World Health Organization. The MAPS Toolkit: mHealth Assessment and Planning for Scale. Geneva, Switzerland: World Health Organization; 2015.
26. Labrique AB, Wadhwani C, Williams KA, et al. Best practices in scaling digital health in low and middle income countries. Global Health 2018; 14 (1): 103.
27. Verschuuren M, Diallo K, Calleja N, Burazeri G, Stein C. First experiences with a WHO tool for assessing health information systems. Public Health Panor 2016; 2 (3): 379–82.
28. World Health Organization Regional Office for Europe. Support Tool to Assess Health Information Systems and Develop and Strengthen Health Information Strategies. Copenhagen, Denmark: WHO Regional Office for Europe; 2015.
29. World Health Organization. Assessing the National Health Information System: An Assessment Tool. Version 4.00. Geneva, Switzerland: World Health Organization; 2008.
30. McGee-Lennon M, Bouamrane M-M, Grieve E, et al. A flexible toolkit for evaluating person-centred digital health and wellness at scale. In: Duffy VG, Lightner N, eds. Advances in Human Factors and Ergonomics in Healthcare. New York, NY: Springer; 2017: 105–18.
31. Liaw S-T, Kearns R, Taggart J, et al. The informatics capability maturity of integrated primary care centres in Australia. Int J Med Inform 2017; 105: 89–97.
32. MEASURE Evaluation. Validating the Effectiveness of a Rapid Assessment Tool for Routine Health Information Systems. Chapel Hill, NC: MEASURE Evaluation, University of North Carolina; 2018.
33. MEASURE Evaluation. Routine Health Information System Rapid Assessment Tool: Implementation Guide. Chapel Hill, NC: MEASURE Evaluation; 2018.
34. Karnoe A, Furstrand D, Christensen KB, Norgaard O, Kayser L. Assessing competencies needed to engage with digital health services: development of the eHealth literacy assessment toolkit. J Med Internet Res 2018; 20 (5): e178.
35. Knudsen AK, Kayser L. Validation of the eHealth Literacy Assessment tool (eHLA). Int J Integr Care 2016; 16 (6): 349.
36. PATH. Coordinating Digital Transformation: Ethiopia. Seattle, WA: Digital Square; 2019.
37. PATH. Coordinating Digital Transformation: Nepal. Seattle, WA: Digital Square; 2019.
38. PATH. Coordinating Digital Transformation: Tanzania. Seattle, WA: Digital Square; 2019.
39. PATH. Coordinating Digital Transformation: Replication Guide. Seattle, WA: Digital Square; 2019.
40. PATH. Coordinating Digital Transformation: Overview. Seattle, WA: Digital Square; 2019.
41. MEASURE Evaluation. Health Information Systems Interoperability Maturity Toolkit. 2019. https://www.measureevaluation.org/resources/tools/health-information-systems-interoperability-toolkit Accessed September 15, 2020.
42. MEASURE Evaluation. Building a Strong and Interoperable Digital Health Information System for Uganda. Chapel Hill, NC: MEASURE Evaluation; 2018.
43. MEASURE Evaluation. Building a Strong and Interoperable Health Information System for Ghana. Chapel Hill, NC: MEASURE Evaluation; 2018.
44. MEASURE Evaluation. HIS Stages of Continuous Improvement Toolkit. Chapel Hill, NC: MEASURE Evaluation; 2017.
45. MEASURE Evaluation. Mapping a Path to Improve Uganda's Health Information System Using the Stages of Continuous Improvement Toolkit - Workshop Report. Chapel Hill, NC: MEASURE Evaluation; 2019.
46. Mechael P, Ke Edelman J. The State of Digital Health 2019. Global Development Incubator; 2019. https://static1.squarespace.com/static/5ace2d0c5cfd792078a05e5f/t/5d4dcb80a9b3640001183a34/1565379490219/State+of+Digital+Health+2019.pdf Accessed September 15, 2020.
47. World Health Organization Regional Office for South-East Asia. Regional Strategy for Strengthening eHealth in the South-East Asia Region (2014-2020). 2015. https://apps.who.int/iris/bitstream/handle/10665/160760/SEA-HSD-366%20Rev.pdf?sequence=1&isAllowed=y Accessed September 15, 2020.
48. Lozano R, Fullman N, Mumford JE, et al. Measuring universal health coverage based on an index of effective coverage of health services in 204 countries and territories, 1990–2019: a systematic analysis for the Global Burden of Disease Study 2019. Lancet 2020; 396 (10258): 1250–84. doi: 10.1016/S0140-6736(20)30750-9
49. Murray CJL, Abbafati C, Abbas KM, et al. Five insights from the Global Burden of Disease Study 2019. Lancet 2020; 396 (10258): 1135–59.
50. Marcelo A, Medeiros D, Ramesh K, Roth S, Wyatt P. Transforming Health Systems Through Good Digital Health Governance. 2018. www.adb.org/sites/default/files/publication/401976/sdwp-051-transforming-health-systems.pdf Accessed September 15, 2020.
51. Godinho MA, Borda A, Kostkova P, Molnar A, Liaw S-T. Serious games for unboxing global digital health policymaking. BMJ Simul Technol Enhanc Learn 2020; 6: 255–6.
52. Godinho MA, Ashraf MM, Narasimhan P, Liaw S-T. Community health alliances as social enterprises that digitally engage citizens and integrate services: a case study in Southwestern Sydney (protocol). Digit Health 2020; 6: 205520762093011.
53. Godinho MA, Jonnagaddala J, Gudi N, Islam R, Narasimhan P, Liaw S-T. mHealth for integrated people-centred health services in the Western Pacific: a systematic review. Int J Med Inform 2020; 142: 104259.
54. Logullo P, MacCarthy A, Kirtley S, Collins GS. Reporting guideline checklists are not quality evaluation forms: they are guidance for writing. Health Sci Rep 2020; 3 (2): e165. doi: 10.1002/hsr2.165
