Toolkits for Implementing and Evaluating Digital Health
doi: 10.1093/jamia/ocab010
Review
1WHO Collaborating Centre on eHealth (AUS-135), School of Population Health, UNSW Sydney, New South Wales, Australia; 2National Drug and Alcohol Research Centre, University of New South Wales, Sydney, New South Wales, Australia
Corresponding Author: Emeritus Prof. Dr. Siaw-Teng Liaw, WHO Collaborating Centre on eHealth (AUS-135), School of Population Health, UNSW Sydney, NSW 2052, Australia (siaw@unsw.edu.au)
Received 16 September 2020; Revised 8 January 2021; Editorial Decision 11 January 2021; Accepted 15 January 2021
ABSTRACT
Objective: Toolkits are an important knowledge translation strategy for implementing digital health. We studied how toolkits for the implementation and evaluation of digital health were developed, tested, and reported.

Materials and Methods: We conducted a systematic review of toolkits that had been used, field tested, or evaluated in practice and published in English from 2009 to July 2019. We searched several electronic literature sources to identify both peer-reviewed and gray literature, and records were screened as per systematic review conventions.

Results: Thirteen toolkits were eventually identified, all of which were developed in North America, Europe, or Australia. All reported their intended purpose, as well as their development process. Eight of the 13 toolkits involved a literature review, 3 did not, and 2 were unclear. Twelve reported an underlying conceptual framework, theory, or model: 3 cited the normalization process theory, and 3 others cited the World Health Organization and International Telecommunication Union eHealth Strategy. Seven toolkits were reportedly evaluated, but details were unavailable. Forty-three toolkits were excluded for lack of field testing.

Discussion: Despite a plethora of published toolkits, few were tested, and even fewer were evaluated. Methodological rigor was of concern, as several did not include an underlying conceptual framework, literature review, or evaluation and refinement in real-world settings. Reporting was often inconsistent and unclear, and toolkits rarely reported being evaluated.

Conclusion: Greater attention needs to be paid to rigor and reporting when developing, evaluating, and reporting toolkits for implementing and evaluating digital health so that they can effectively function as a knowledge translation strategy.
© The Author(s) 2021. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For permissions, please email: journals.permissions@oup.com

INTRODUCTION

The 2018 World Health Assembly Resolution 71.7 on digital health recognized the role of digital technologies in achieving universal health coverage and other targets of the Sustainable Development Goals, urging stakeholders "to assess their use of digital technologies for health. . . and to prioritise [their] development, evaluation, implementation, scale-up and greater use. . ."1 To this end, toolkits are being increasingly used to guide the implementation and evaluation of digital health interventions and systems. Toolkits are a knowledge translation (KT) strategy used to communicate messages or share decision aids, tools, or goods to improve health, educate, or change practice or behavior among diverse populations—including patients, carers, clinical and managerial health professionals, policymakers, community and health organizations—and the health system.2

In 2012, the World Health Organization (WHO) and International Telecommunication Union (ITU) developed the National eHealth Strategy Toolkit, which provides guidance to governments …

… that together can guide users to develop a plan or organise efforts to follow evidence-based recommendations or meet evidence-based specific practice standards."9 The AHRQ also defines a tool as "an instrument (e.g., survey, guidelines, or checklist) that helps users accomplish a specific task that contributes to meeting a specific evidence-based recommendation or practice standard." As a KT strategy, toolkits should be concise, understandable, and clearly focused, to enable the user to understand and operationalize or evaluate digital health strategies through a clear step-by-step process.
were perused for relevant material. Other highly relevant sources in the public domain that were specifically searched included: WHO Institutional Repository for Information Sharing, Digital Impact Alliance,12 Asian Development Bank, Asia eHealth Information Network, MEASURE Evaluation and Health Data Collaborative, Digital Square Global Goods,13 Health Metrics Network, AHRQ, COCIR, Joint Learning Network, Healthcare Information and Management Systems Society, U.S. Office of the National Coordinator, and The Open Group. Gray literature is being identified on an ongoing …

RESULTS

Of the 1473 records sourced from searches of electronic databases, gray literature, and perusal of relevant journals, 138 duplicates were removed and 1335 records were screened at the title and abstract stage (Figure 1). Of these, 1231 records were excluded and 104 articles proceeded to full-text screening. During full-text screening, 88 full-text articles were excluded (reasons for these are outlined in Figure 1) and 16 full-text articles (13 toolkits) were included for data extraction.
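The screening counts reported above follow simple arithmetic that readers can verify against Figure 1. As an illustrative sketch (the function name and structure are ours, not from the review), the flow is internally consistent:

```python
def screening_flow(identified, duplicates, excluded_title_abstract, excluded_full_text):
    """Return record counts remaining after each PRISMA-style screening stage."""
    screened = identified - duplicates               # after duplicate removal
    full_text = screened - excluded_title_abstract   # proceeding to full-text screening
    included = full_text - excluded_full_text        # included for data extraction
    return screened, full_text, included

# Numbers reported in this review's Figure 1
screened, full_text, included = screening_flow(1473, 138, 1231, 88)
print(screened, full_text, included)  # 1335 104 16
```

Running this reproduces the counts in the text: 1335 records screened, 104 full-text articles assessed, and 16 articles (13 toolkits) included.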
Figure 1. Flow diagram of study selection process. Ab: Abstract; DH: digital health; DHI: digital health implementation; Ti: Title.
sional approach to implementation and evaluation across the levels. Furthering this line of thinking, more coordinated implementation of digital health could be achieved by better conceptual synchronicity between toolkits of all levels.

Following the release of the WHO-ITU National eHealth Strategy toolkit, many domain-specific operational and technical toolkits have emerged and are continuing to emerge. The plethora of largely untested toolkits we excluded risks "cognitive overload" among intended users; a limited number of focused, instructive, well-tested, and clearly reported toolkits would likely be a more effective KT strategy to guide implementation of digital health interventions and systems. Our findings indicate that more toolkits have become available over time. In light of the burgeoning demand for health services,48,49 many more such toolkits will likely be developed as emerging digital technologies (eg, artificial intelligence, robotics, "-omics") are more widely adopted to support increasingly integrated models of care.50–53 However, generic frameworks or guidance for methodologically rigorous toolkit development are few and far between. The AHRQ offers some guidelines,9 but these are mainly related to reporting and communication quality, rather than to rigor.

Table 2. continued

2010–2019 | Performance of Routine Information System Management (PRISM) Toolkit16–18 | MEASURE Evaluation | Implementation and evaluation | Conceptual framework and associated data collection and analysis tools to assess, design, strengthen, and evaluate RHIS

2019 | Global Digital Health Index54 | Global Digital Health Index | Evaluation | Index for tracking, monitoring, and evaluating progress in digital health technology at the country level across the 7 components of the WHO-ITU

HIS: health information system; HMN: Health Metrics Network; ITU: International Telecommunication Union; mHealth: mobile health; RHIS: routine health information system; WHO: World Health Organization.

Recommendations for developing and reporting a digital health toolkit

For toolkits to successfully fulfill their purpose as a tool for KT, it is imperative that they demonstrate adequate methodological rigor, to convey "true" knowledge (epistemology) about a given implementation reality (ontology). This process should also include their evaluation, which ought to test and demonstrate the toolkits' real-world applicability and usability. Furthermore, they should be trustworthy, and transparently and completely report their development process. Users should understand how a toolkit was developed in order to meaningfully appraise its relevance and applicability to their context and situation.

Starting by asking, "What makes a good toolkit?," we build on this logic, pooling together our study findings and guidance on guideline development,54 to propose a preliminary standard approach to developing, testing, and reporting toolkits for implementing and evaluating digital health interventions (Table 5). Developing a good toolkit requires robust methodological rigor and complete, transparent reporting. These, in turn, comprise several characteristics (Table 5, left column) that can be operationalized through the recommendations provided (Table 5, right column). These recommendations are not intended as gospel, but rather as a starting point for discourse among the digital health research and implementation community on this important matter. By this review's own critical standards, we recommend that these be subjected to expert review and consensus, to create guiding criteria for developing and reporting DHI toolkits. As there are currently no critical appraisal checklists for toolkit quality, we recommend that similar principles be used to inform the development of a standardized tool for assessing toolkits' quality and rigor, with a scoring system, for example (Table 5).

Limitations

Our study did not include non-English literature; including literature published in other languages would add relevance and validity to this study.

CONCLUSION

The findings of this review raise concerns regarding the potential of toolkits to effectively facilitate digital health implementation and evaluation. Despite a plethora of published digital health toolkits, very few demonstrated their application in real-world testing, and even fewer had any evidence of having been evaluated. Of those that included demonstrated use cases, methodological rigor was of concern, as several did not include an underlying conceptual framework, literature review, or evaluation and refinement in real-world settings. Reporting of approaches and methods was often inconsistent and unclear, and toolkits rarely reported being evaluated. Toolkit development should exhibit greater methodological rigor, whether in the evidence base (eg, literature review), theoretical grounding (ie, underlying conceptual framework), participatory approach (eg, co-creation, consensus processes), or in the testing, evaluation, and refinement of the toolkits in real-world settings.

As a vital component of the widespread, global rollout of digital health, it is imperative that, as a knowledge translation strategy, toolkits fulfill their function efficiently and effectively. Greater attention needs to be paid to developing, evaluating, and reporting toolkits to ensure that they effectively perform their intended function.

FUNDING

None.

AUTHOR CONTRIBUTIONS

MAG led the review and wrote the first draft of the manuscript. SA coordinated the review and conducted the database searches. All authors conceptualized the work, contributed to data collection, participated in screening search results, critically revised the manuscript, approved the final version to be published, and agree to be accountable for all aspects of the work. MAG, S-TL and SA participated in data extraction. S-TL guided the overall direction of the work.
Year | Toolkit | Literature Review | Expert Consensus | Underlying Framework | Intended User Group | Evaluated

HIS: health information system; ITU: International Telecommunication Union; MAPS: mHealth Assessment and Planning for Scale; PRISM: Performance of Routine Information System Management; WHO: World Health Organization.
Table 5. Preliminary considerations in developing toolkits for implementing and evaluating digital health

What Makes a Good Toolkit? / Recommendations for Digital Health Intervention Toolkits

Rigor
• Philosophically grounded: Make the philosophical stance, assumptions, and motivations explicit.
• Evidence based: Include a literature review.
• Theoretically grounded: Include a conceptual framework, model, or theory; describe the framework-building process.
• Participatory methods (co-creating toolkits with users): Include all stakeholders in the development process, especially intended user groups (eg, during needs assessment and establishing expert consensus); use a methodology and study design that can incorporate appropriate methods (eg, case study or mixed methods approaches).
• Testing and refinement: Describe the type of testing the toolkit underwent (eg, usability testing, pilot testing, expert review, focus groups with users); describe the context in which the toolkit was tested and how generalizable this is to other contexts (eg, realist evaluation); explain how the findings of testing informed the toolkit or framework development process.
• Toolkit synthesis: Describe the research design and logic used for framework synthesis; outline the resources used in developing the toolkit; clearly detail the case(s) used to develop and test the toolkit; ensure alignment of objectives between review, expert consensus, and evaluation; describe the approach used to triangulate findings from different sources and integrate (synthesize) them into the final framework.
• Toolkit evaluation: Outline how the toolkit was evaluated (ie, the approach, study design, program logic, etc.); describe the unintended consequences of the toolkit's use in the context of complex adaptive systems (ie, both benefits and harms); describe how and why the toolkit was deemed "fit" for its intended purpose.

Reporting
• Underlying logic and reasoning: Report the toolkit's purpose and motivation.
• Trustworthy: Explicitly report the development process in context.
• Instructional clarity: Clearly describe the step-by-step process of how to use the toolkit.
• Personalized communication: Ensure clarity of communication to all intended users.
• Applicability: Ensure that the toolkit meets the needs of the system and/or individual users; of organizational users (ie, clinical, managerial, and technical staff); and of community and/or citizen users.
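The Discussion proposes developing a standardized appraisal tool with a scoring system built on these principles. As a purely hypothetical sketch (no such validated instrument exists, as the review notes), the criterion names below are adapted from Table 5, and the unweighted fraction-of-criteria-met score is an assumed scheme, not the authors' method:

```python
# Hypothetical scoring checklist for appraising toolkit quality.
# Criterion names are adapted from Table 5; the weighting (all equal)
# and the 0-1 score are illustrative assumptions only.
TABLE5_CRITERIA = [
    "philosophically_grounded",
    "evidence_based",           # eg, includes a literature review
    "theoretically_grounded",   # eg, underlying conceptual framework
    "participatory_methods",
    "testing_and_refinement",
    "toolkit_synthesis",
    "toolkit_evaluation",
    "reporting_transparency",
]

def score_toolkit(assessment: dict) -> float:
    """Score a toolkit as the fraction of Table 5 criteria it satisfies (0-1).

    `assessment` maps criterion name -> bool; missing criteria count as unmet.
    """
    met = sum(bool(assessment.get(c, False)) for c in TABLE5_CRITERIA)
    return met / len(TABLE5_CRITERIA)

# Example: a toolkit reporting a literature review and a framework,
# but no testing or evaluation
example = {"evidence_based": True, "theoretically_grounded": True}
print(score_toolkit(example))  # 0.25
```

In practice, such a tool would need the expert review and consensus process the authors recommend before criteria, weights, or thresholds could be fixed.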
REFERENCES

12. Digital Impact Alliance. Principles for Digital Development. 2019. https://digitalprinciples.org/ Accessed September 15, 2020.
13. Digital Square. Global Goods Guidebook. Seattle, WA: PATH; 2019. https://digitalsquare.org/s/Global-Goods-Guidebook_V1.pdf Accessed September 15, 2020.
14. Veritas Health Innovation. Covidence systematic review software. 2019. www.covidence.org Accessed September 15, 2020.
15. Godinho MA, Gudi N, Milkowska M, Murthy S, Bailey A, Nair NS. Completeness of reporting in Indian qualitative public health research: a systematic review of 20 years of literature. J Public Health 2019; 41 (2):
… of the eHealth literacy assessment toolkit. J Med Internet Res 2018; 20 (5): e178.
35. Knudsen AK, Kayser L. Validation of the eHealth Literacy Assessment tool (eHLA). Int J Integr Care 2016; 16 (6): 349.
36. PATH. Coordinating Digital Transformation: Ethiopia. Seattle, WA: Digital Square; 2019.
37. PATH. Coordinating Digital Transformation: Nepal. Seattle, WA: Digital Square; 2019.
38. PATH. Coordinating Digital Transformation: Tanzania. Seattle, WA: Digital Square; 2019.
… 1565379490219/State+of+Digital+Health+2019.pdf Accessed September 15, 2020.
47. World Health Organization Regional Office for South-East Asia. Regional Strategy for Strengthening eHealth in the South-East Asia Region (2014–2020). 2015. https://apps.who.int/iris/bitstream/handle/10665/160760/SEA-HSD-366%20Rev.pdf?sequence=1&isAllowed=y Accessed September 15, 2020.
48. Lozano R, Fullman N, Mumford JE, et al. Measuring universal health coverage based on an index of effective coverage of health services in 204 countries and territories, 1990–2019: a systematic analysis for the Global