
Original Article

Critical Appraisal Tools and Reporting Guidelines for Evidence-Based Practice
Robin K. Buccheri, RN, PhD, NP, FAAN • Claire Sharifi, BS, MLIS

ABSTRACT

Keywords: critical appraisal tools, evidence-based nursing, evidence-based practice, reporting guidelines

Background: Nurses engaged in evidence-based practice (EBP) have two important sets of tools: critical appraisal tools and reporting guidelines. Critical appraisal tools facilitate the appraisal process and guide a consumer of evidence through an objective, analytical evaluation process. Reporting guidelines, checklists of items that should be included in a publication or report, ensure that the project or guidelines are reported with clarity, completeness, and transparency.
Purpose: The primary purpose of this paper is to help nurses understand the difference between critical appraisal tools and reporting guidelines. A secondary purpose is to help nurses locate the appropriate tool for the appraisal or reporting of evidence.
Methods: A systematic search was conducted to find commonly used critical appraisal tools and reporting guidelines for EBP in nursing.
Rationale: This article serves as a resource to help nurses navigate the often-overwhelming terrain of critical appraisal tools and reporting guidelines, and will help both novice and experienced consumers of evidence more easily select the appropriate tool(s) to use for critical appraisal and reporting of evidence. Having the skills to select the appropriate tool or guideline is an essential part of meeting EBP competencies for both practicing registered nurses and advanced practice nurses (Melnyk & Gallagher-Ford, 2015; Melnyk, Gallagher-Ford, & Fineout-Overholt, 2017).
Results: Nine commonly used critical appraisal tools and eight reporting guidelines were found and are described in this manuscript. Specific steps for selecting an appropriate tool as well as examples of each tool’s use in a publication are provided.
Linking Evidence to Action: Practicing registered nurses and advanced practice nurses must be able to critically appraise and disseminate evidence in order to meet EBP competencies. This article is a resource for understanding the difference between critical appraisal tools and reporting guidelines, and identifying and accessing appropriate tools or guidelines.

INTRODUCTION
Nurses engaged in evidence-based practice (EBP) have two important sets of tools: (a) critical appraisal tools that aid in assessing evidence for validity, reliability, and applicability to clinical practice, and (b) reporting guidelines that aid in the structured, comprehensive, and transparent dissemination of outcomes and findings during the publication process. Both critical appraisal tools and reporting guidelines are distinct entities and each is essential to EBP. Selecting the most appropriate critical appraisal tool or reporting guideline can be very challenging for both novice and expert consumers of evidence.
The primary purpose of this paper is to help nurses understand the difference between critical appraisal tools and reporting guidelines. A second purpose is to help them find the appropriate tool for the job, whether that job is the critical appraisal of evidence or reporting the results of an EBP project, a research study, or a clinical practice guideline (CPG).
This article provides definitions and descriptions of critical appraisal tools and reporting guidelines and rationales for their use. A selection of frequently used critical appraisal tools and reporting guidelines are described and instructions are provided for selecting the most appropriate tools. Information on how to access the full text of selected critical appraisal tools and reporting guidelines is provided, as well as examples of each tool’s use in a publication.

BACKGROUND
Rationale for Using Critical Appraisal Tools
In order to answer a clinical question to improve practice, nurses must be able to evaluate the body of evidence on a topic. Critical appraisal, defined by Duffy (2005) as “an objective, structured approach that results in a better understanding of a study’s strengths and weaknesses” (p. 282), is the process that allows the nurse to identify evidence that comes from

Worldviews on Evidence-Based Nursing, 2017; 00:0, 1–10. 1



© 2017 Sigma Theta Tau International

rigorous, reliable, unbiased, and methodologically appropriate research (Melnyk & Fineout-Overholt, 2015).
Critical appraisal tools allow nurses to evaluate the evidence using structured questions and/or a checklist. However, they are not a one-size-fits-all resource, and nurses often turn to a familiar critical appraisal tool, regardless of whether or not it is the most appropriate tool for the methodology of the article they are reviewing. Compounding the problem is the lack of a “gold standard” critical appraisal tool and the sheer volume of available tools. This can make matching the tool to the type of evidence problematic, particularly for novice consumers of evidence (Katrak, Bialocerkowski, Massy-Westropp, Kumar, & Grimmer, 2004).
Having the skills to select the appropriate tool or guideline is an essential part of meeting EBP competencies for both practicing registered nurses and advanced practice nurses (Melnyk & Gallagher-Ford, 2015; Melnyk, Gallagher-Ford, & Fineout-Overholt, 2017). Additionally, critical appraisal is an EBP competency for both of these groups of practicing nurses (Melnyk, Gallagher-Ford, Long, & Fineout-Overholt, 2014). In order to educate nurses to evaluate a body of literature and translate research into practice, academic institutions must lay the foundation by teaching students to critically appraise research and other types of evidence using the most appropriate tools.

Rationale for Using Reporting Guidelines in Publishing
Reporting guidelines—checklists of items that researchers should include in a publication—ensure that the research process, EBP and quality improvement projects, and CPGs are reported with clarity and in a manner that allows for critical appraisal. Reporting guidelines often specify a minimum set of items that need to be reported in order to provide a clear and transparent account of the process and findings (National Library of Medicine, 2015).
Opaque reporting is directly associated with biased conclusions and, less directly, with errors in biomedical publishing and the inefficient use of scarce resources. As Moher, Altman, Schulz, Simera, and Wager (2014) state, “without a clear understanding of how a study was done, readers are unable to judge whether the findings are reliable” (p. 4). A systematic review by Samaan et al. (2013) found that adherence to reporting guidelines in the medical literature was suboptimal, and the authors recommended that educators incorporate guidelines into the curriculum to increase the amount of medical literature that adheres to reporting guidelines. Incorporating reporting guidelines into nursing education would help registered and advanced practice nurses achieve EBP competencies related to disseminating the evidence (Melnyk et al., 2017).

SEARCH METHODOLOGY
One author amassed a bibliography of critical appraisal tools and reporting guidelines during her 8 years of teaching EBP at the doctoral level. The collection was expanded through conference attendance, reviewing EBP textbooks, and networking with other EBP nurse educators. Next, both authors collaborated on a comprehensive search to validate the list and to identify other commonly used critical appraisal tools and reporting guidelines. PubMed, CINAHL, and Scopus were searched using a combination of keywords and subject headings for the following concepts: critical appraisal, critique tool, and reporting guidelines.
Nine critical appraisal tools and eight reporting guidelines were selected based on their relevancy to nursing, their ease of use, and their reported frequency of use. The literature discussing the development and use of each selected tool and guideline was reviewed. A brief synopsis of each tool was developed, along with tables to help select the appropriate tool or guideline, information about how to access the full text of the tool or guideline, and an example of the tool or guideline in a publication. Where one tool serves both functions—that is, where it was developed to be both a critical appraisal tool and a reporting guideline—we have noted it and included the tool in both categories.

CRITICAL APPRAISAL TOOLS
The following steps provide a roadmap for selecting an appropriate critical appraisal tool:

(1) Determine the type of evidence to be appraised. Prioritize preappraised evidence (systematic review, meta-analysis, meta-synthesis, CPGs) over individual primary research studies (Melnyk et al., 2017).
(2) Go to Table 1 and identify the tools appropriate for that type of evidence.
(3) Read the brief summaries of relevant tools in the “Summaries of Selected Critical Appraisal Tools” section below and select the most appropriate one.
(4) Go to Table 2 to locate the full text of the tool and a citation for an article that demonstrates the tool in use.

SUMMARIES OF SELECTED CRITICAL APPRAISAL TOOLS
Below are brief descriptions of nine frequently used critical appraisal tools that are also displayed in Table 1. Information on how to access each critical appraisal tool and an example of each tool’s use in an article are included in Table 2.

Appraisal of Guidelines for Research and Evaluation II
The Appraisal of Guidelines for Research and Evaluation II (AGREE II) instrument is a critical appraisal tool specifically for CPGs. It was first developed in 2003 by the AGREE Collaboration, an international group of guideline developers. The original instrument was refined and AGREE II was released

Table 1. Selected Critical Appraisal Tools

Tools (name of rating scale or checklist):
1. AGREE II (Brouwers et al., 2010)
2. CASP checklists (Critical Appraisal Skills Programme, 2017)
3. Cochrane Risk of Bias Tool (Higgins et al., 2011)
4. EPQA Guidelines (Lee et al., 2013)*
5. GRADE (Dijkers, 2013)
6. JBI checklists (Joanna Briggs Institute, 2016)
7. Johns Hopkins Research Evidence Appraisal Tool (Dearholt & Dang, 2012)
8. Johns Hopkins Non-Research Evidence Appraisal Tool (Dearholt & Dang, 2012)
9. Rapid Critical Appraisal Checklists (Melnyk & Fineout-Overholt, 2015)

Developed for use in evidence-based practice (in the order above): N, N, N, Y, N, Y, Y, Y, Y

Types of evidence covered, with the number of tools marked as appropriate for each: meta-analysis (2); systematic review (5); literature review (1); randomized controlled trial (5); cohort study (4); case-control study (4); meta-synthesis (1); qualitative study (4); expert opinion (2); evidence-based practice project (2); quality improvement project (1); clinical practice guideline (5).

Note. For information on accessing the full text of a tool and to see an example of its use, see Table 2.
*Developed to be both a critical appraisal tool and a reporting guideline.

in 2010 (Brouwers et al., 2010). The AGREE II can be used as a quality assessment tool for readers of clinical guidelines. The checklist covers six quality domains and each domain has between two and six questions. The AGREE II can be found at https://www.agreetrust.org/resource-centre/.

Critical Appraisal Skills Programme Checklists
Critical Appraisal Skills Programme (CASP) checklists were developed in 1993 and are a product of the CASP from Oxford, England. CASP checklists are critical appraisal tools, and CASP offers checklists for the following eight types of research: systematic reviews, randomized controlled trials, diagnostic studies, economic evaluations, qualitative research, case-control studies, cohort studies, and clinical prediction rules (Critical Appraisal Skills Programme, 2017). The checklists all have between 10 and 12 yes-or-no items with some open-ended questions. These checklists were developed for use in educational workshops and may be challenging for novices working independently. The various CASP checklists can be found at https://www.casp-uk.net/#!casp-tools-checklists/c18f8.

Cochrane Risk of Bias Tool
This tool was developed to assess the risk of bias in each study reported in a Cochrane Systematic Review. Bias occurs when, because of methodological flaws, authors overestimate or underestimate the effect of interventions. Bias can affect the validity of study findings. In clinical trials, common types of bias include selection bias, performance bias, detection bias,


Table 2. Accessing Critical Appraisal Tools and Examples of Their Use

AGREE II
  Full text: https://www.agreetrust.org/resource-centre/agree-reporting-checklist/
  Example: Tremblay et al. (2011)

CASP checklists: qualitative
  Full text: https://www.casp-uk.net/#!casp-tools-checklists/c18f8
  Example: Masood, Thaliath, Bower, and Newton (2011)

CASP checklists: quantitative
  Full text: https://www.casp-uk.net/#!casp-tools-checklists/c18f8
  Example: Smith, Walker, and Russell (2007)

Cochrane Risk of Bias Tool
  Full text: https://handbook.cochrane.org/chapter_8/table_8_5_a_the_cochrane_collaborations_tool_for_assessing.htm
  Example: van Esch, Stegeman, and Smit (2017)

EPQA Guidelines
  Full text: Lee et al. (2013)
  Example: Milner (2014)

GRADE
  Full text: https://www.gradeworkinggroup.org/
  Example: Dellinger et al. (2013)

Joanna Briggs Institute Critical Appraisal Tools
  Full text: https://joannabriggs.org/research/critical-appraisal-tools.html
  Example: Paton, Hatton, Rome, and Kent (2016)

Johns Hopkins Research Evidence Appraisal Tool
  Full text: https://www.hopkinsmedicine.org/evidence-based-practice/_docs/appendix_e_research_evidence_appraisal_tool.pdf
  Example: Santos, Woith, Freitas, and Zeferino (2016)

Johns Hopkins Non-Research Evidence Appraisal Tool
  Full text: https://www.hopkinsmedicine.org/evidence-based-practice/_docs/appendix_f_nonresearch_evidence_appraisal_tool.pdf
  Example: Gutierrez, Silbert-Flagg, and Vohra (2014)

Rapid Critical Appraisal Checklists
  Full text: found in Melnyk and Fineout-Overholt (2015)
  Example: Hoffman Snyder and Facchiano (2011)

attrition bias, and reporting bias (Higgins, Altman, & Sterne, 2011). Unlike many of the other tools described in this paper, the Cochrane Risk of Bias Tool supports just one column in an evidence table—the risk of bias column. The Cochrane Risk of Bias Tool includes seven items, and each item has a “Support for Judgment” field that provides background information on how to evaluate that item, and a “Review Authors’ Judgment” field that includes examples of language that can be included in an evidence table. The Cochrane Risk of Bias Tool is published in chapter 8 of the Cochrane Handbook for Systematic Reviews of Interventions and can be found at https://handbook.cochrane.org/chapter_8/table_8_5_a_the_cochrane_collaborations_tool_for_assessing.htm.

Evidence-Based Process Quality Assessment Guidelines
Evidence-Based Process Quality Assessment (EPQA) Guidelines were created in 2013 in response to both the

proliferation of publications reporting on EBP projects, as well as the lack of critical appraisal tools to evaluate rigor and quality of EBP projects before implementing recommendations to change practice (Lee, Johnson, Newhouse, & Warren, 2013). The EPQA guidelines checklist is based on the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Statement (PRISMA) Tool, but with specific edits to make it applicable to publications that discuss EBP projects. The checklist contains 34 items and can be used as a critical appraisal tool for readers of EBP project reports. It can also be used by nursing educators for discussions with students about the attributes of EBP projects. More information about EPQA can be found at https://www.ncbi.nlm.nih.gov/pubmed/23387900.

Grading of Recommendations Assessment, Development, and Evaluation
Grading of Recommendations Assessment, Development, and Evaluation (GRADE) was developed by an international panel in 2011 (Dijkers, 2013). GRADE was designed to provide one systematic approach for evaluating the quality of medical evidence and grading the strength of recommendations in systematic reviews, health technology assessments, and CPGs (Guyatt et al., 2011). The goal was to reduce bias and assist in the development of “a common, sensible and transparent approach to grading quality (or certainty) of evidence and strength of recommendations” (Grade Working Group, n.d.). GRADE guidelines outline criteria for grading the quality of evidence for each study outcome, for upgrading and downgrading evidence, and for rating the overall quality of the evidence. GRADE has been adopted for use by organizations such as the Cochrane Collaboration and the World Health Organization (Dijkers, 2013). GRADE is part of GRADEpro, a software package for guideline development and adoption. More information about GRADE can be found at https://www.gradeworkinggroup.org/.

Joanna Briggs Institute: Critical Appraisal Tools
Joanna Briggs Institute (JBI), an international organization dedicated to the promotion and adoption of EBP, offers a selection of critical appraisal tools (Joanna Briggs Institute, 2016). There are 13 tools, each of which addresses a specific type of study or other form of evidence. Each tool contains an introduction to JBI and a checklist followed by an in-depth explanation of each question. Each checklist contains a series of critical appraisal questions and ends with an overall appraisal decision. The questions and explanations are clearly written and could be utilized by novice consumers of evidence. The checklists can be found at https://joannabriggs.org/research/critical-appraisal-tools.html.

Johns Hopkins Research Evidence Appraisal Tool
The Johns Hopkins Research Evidence Appraisal Tool (Dearholt & Dang, 2012) is a tool and rating scale that facilitates the critical appraisal of evidence. It is a commonly used tool appropriate for both novice and expert consumers of evidence. The Research Evidence Appraisal Tool includes questions that facilitate the evaluation of the study design and level of evidence. The tool asks users to answer three fairly simple questions, the answers to which allow users to determine the methodology of the study, and hence the level of evidence. Levels of evidence range from I (randomized controlled trial) to III (nonexperimental-qualitative). The tool also includes a section on appraising systematic reviews, meta-analyses, and meta-syntheses. The next section of the tool walks users through appraising the quality of the research study through the use of a 16-item checklist for research studies and a 12-item checklist for systematic reviews, meta-analyses, or meta-syntheses. More information, as well as permissions and the full text of the Research Evidence Appraisal Tool, can be found at https://www.hopkinsmedicine.org/evidence-based-practice/jhn_ebp.html.

Johns Hopkins Non-Research Evidence Appraisal Tool
The Johns Hopkins Non-Research Evidence Appraisal Tool (Dearholt & Dang, 2012) first guides users through identifying what type of nonresearch item they are reading—a CPG, a consensus or policy statement, a literature review, an expert opinion piece, an organizational experience, a case report, or a community standard, clinician experience, or consumer preference article. Within each nonresearch item subsection, there is an evaluation checklist. It is an appropriate tool for both novice and expert consumers of evidence. More information, as well as permissions and the full text of the Non-Research Evidence Appraisal Tool, can be found at https://www.hopkinsmedicine.org/evidence-based-practice/jhn_ebp.html.

Rapid Critical Appraisal Checklists: EBP in Nursing and Healthcare
Melnyk and Fineout-Overholt’s (2015) Evidence-Based Practice in Nursing and Healthcare contains a series of Rapid Critical Appraisal Checklists, all of which are appropriate tools for novice and expert consumers of evidence. There is a General Appraisal Overview for All Studies that contains fields for the article citation, the Population, Intervention, Comparison, Outcome, Timeframe (PICOT) question (Melnyk & Fineout-Overholt, 2015), and a very general overview of the study (e.g., purpose, design, sampling). This general appraisal form is followed by Rapid Critical Appraisal Checklists for the following types of literature: descriptive studies, EBP implementation or quality improvement projects, cohort studies, randomized controlled trials, systematic reviews of clinical interventions and treatments, qualitative evidence, and evidence-based guidelines. The checklists contain between 3 and 32 items. The checklists can be found in Evidence-Based Practice in Nursing and Healthcare (https://www.lww.com/Product/9781451190946).


Table 3. Selected Reporting Guidelines

Type of evidence and the appropriate reporting guideline:
- Meta-analysis: PRISMA Guidelines (Moher et al., 2009)
- Systematic review: PRISMA Guidelines (Moher et al., 2009)
- Randomized controlled trial: CONSORT checklist and flow diagram (Turner et al., 2012)
- Cohort study: STROBE Guidelines (Vandenbroucke et al., 2007)
- Case-control study: STROBE Guidelines (Vandenbroucke et al., 2007)
- Cross-sectional study: STROBE Guidelines (Vandenbroucke et al., 2007)
- Meta-synthesis: ENTREQ (Tong et al., 2012)
- Qualitative study: COREQ (Tong et al., 2007)
- Evidence-based practice project: EPQA Guidelines (Lee et al., 2013)*
- Quality improvement project: SQUIRE 2.0 Guidelines (Ogrinc et al., 2016)
- Clinical practice guideline: AGREE Reporting Checklist (Brouwers et al., 2016)

Note. Locate the type of evidence you are disseminating to identify the appropriate reporting guideline.
*Developed to be both a critical appraisal tool and a reporting guideline.

REPORTING GUIDELINES
The following steps provide a roadmap for selecting an appropriate reporting guideline:

(1) Determine the type of evidence to be disseminated.
(2) Go to Table 3 and identify the appropriate guideline to report that type of evidence.
(3) Read the brief summary of the relevant reporting guideline in the “Summaries of Selected Reporting Guidelines” section below and select the most appropriate one.
(4) Go to Table 4 to locate the full text of the reporting guideline and a citation for an article using this guideline.

SUMMARIES OF SELECTED REPORTING GUIDELINES
Below are brief descriptions of eight guidelines that nurses are likely to encounter. The guidelines below are listed in Table 3.

AGREE Reporting Checklist
The AGREE Reporting Checklist was developed to improve the comprehensiveness, completeness, and transparency of practice guidelines (Brouwers, Kerkvliet, & Spithoff, 2016). The 23-item checklist aligns with the structure of the AGREE II and retains its six quality domains. The checklist can be found at https://www.agreetrust.org/resource-centre/agree-reporting-checklist/.

CONsolidated Standards of Reporting Trials
CONsolidated Standards of Reporting Trials (CONSORT; Consort: Transparent Reporting of Trials, 2010) was developed to provide standardized guidelines for the transparent reporting of randomized clinical trials (Turner et al., 2012). It consists of a 25-item checklist that provides detailed information to be reported under six categories (title and abstract, introduction, methods, results, discussion, and other information) and a flow diagram that includes four categories (enrollment, allocation, follow-up, and analysis). It asks for the specific number of subjects who participated from initial assessment of eligibility to number of subjects included and excluded in the final analysis,

Table 4. Accessing Reporting Guidelines and Examples of Their Use

AGREE Reporting Checklist
  Full text: https://www.agreetrust.org/resource-centre/agree-reporting-checklist/
  Example: Deery (2017)

CONSORT
  Full text: https://www.equator-network.org/reporting-guidelines/consort/
  Example: O’Brien et al. (2015)

COREQ
  Full text: https://www.equator-network.org/reporting-guidelines/coreq/
  Example: Alnahedh, Suttle, Alabdelmoneam, and Jalbert (2015)

EPQA Guidelines
  Full text: Lee et al. (2013)
  Example: Milner (2014)

ENTREQ
  Full text: https://www.equator-network.org/reporting-guidelines/entreq/
  Example: Hall, Leach, Brosnan, and Collins (2017)

PRISMA Statement
  Full text: https://www.equator-network.org/reporting-guidelines/prisma/
  Example: Minges and Redeker (2016)

SQUIRE 2.0
  Full text: https://www.equator-network.org/reporting-guidelines/squire/
  Example: Repique, Vernig, Lowe, Thompson, and Yap (2016)

STROBE Guidelines
  Full text: https://www.equator-network.org/reporting-guidelines/strobe/
  Example: Walston, Cabrera, Bellew, Olive, Lohse, and Bellolio (2016)

and reasons for inclusion and exclusion. The checklist can be found at https://www.consort-statement.org/.

COnsolidated Criteria for REporting Qualitative Research
The COnsolidated Criteria for REporting Qualitative Research (COREQ) is a checklist developed as a reporting guideline for the explicit and comprehensive reporting of qualitative studies that use in-depth interviews and focus groups (Tong, Sainsbury, & Craig, 2007). The 32-item checklist covers three domains: research team and reflexivity, study design, and analysis and findings. The checklist was developed from a comprehensive search for existing guidelines to assess qualitative research reports. The authors reported finding no comprehensive reporting checklist for qualitative research, so the items retrieved were compiled into the COREQ. More information on the checklist can be found at https://www.equator-network.org/reporting-guidelines/coreq.

EPQA Guidelines
According to Milner (2016), “EPQA [Guidelines] can be used as reporting standards for EBP projects” (p. 301). The EPQA Guidelines checklist contains 34 items that can serve as a reporting guideline for authors writing an EBP report. Milner suggests that authors address all 34 items and state in their report that EPQA Guidelines were followed both in the planning and reporting of their EBP project. More information about EPQA Guidelines can be found at https://www.ncbi.nlm.nih.gov/pubmed/23387900.

ENhancing Transparency in REporting the Synthesis of Qualitative Research
The ENhancing Transparency in REporting the Synthesis of Qualitative Research (ENTREQ) reporting guideline was created in 2012 (Tong, Flemming, McInnes, Oliver, & Craig, 2012). ENTREQ provides a reporting guideline for meta-synthesis articles—articles that synthesize qualitative research. The ENTREQ reporting guideline consists of 21 items that are grouped into five distinct domains: introduction, methods and methodology, literature search and selection, appraisal, and synthesis of findings. The ENTREQ reporting guideline can be found at https://bmcmedresmethodol.biomedcentral.com/articles/10.1186/1471-2288-12-181.

PRISMA Statement
The PRISMA Statement was developed in 2009 by an international group of researchers who revised the QUality Of Reporting Of Meta-analyses (QUOROM) Statement to include systematic reviews (Moher, Liberati, Tetzlaff, Altman, & The PRISMA Group, 2009). PRISMA consists of a flow diagram and a checklist of 27 items that are essential to clear,


transparent systematic review reporting (Moher, Liberati, Tetzlaff, & Altman, 2009). PRISMA is a tool authors can use to improve the reporting quality of their systematic reviews and meta-analyses. Improved reporting of systematic reviews and meta-analyses results in increased transparency, and allows readers to more effectively evaluate the quality and findings of these publications (Liberati et al., 2009; Moher et al., 2009). More information on PRISMA can be found at https://www.ncbi.nlm.nih.gov/pubmed/19621072.

Revised Standards for QUality Improvement Reporting Excellence 2.0
Standards for QUality Improvement Reporting Excellence (SQUIRE) guidelines were developed to provide a framework for authors reporting results of system-level approaches designed to improve health care (quality, safety, value). The most recent version, SQUIRE 2.0 (2017), includes 18 categories (each with multiple items) that should all be considered but are not all applicable to every report (Ogrinc et al., 2016). The SQUIRE 2.0 Explanation and Elaboration with examples, the Guidelines, and the Checklist can all be found at https://www.squire-statement.org.

STrengthening the Reporting of OBservational studies in Epidemiology Guidelines
The STrengthening the Reporting of OBservational studies in Epidemiology (STROBE) Guidelines were created in 2007 by an international group of epidemiologists, methodologists, statisticians, researchers, and journal editors (Vandenbroucke et al., 2007). STROBE Guidelines are intended to strengthen the reporting of observational epidemiological studies (Vandenbroucke et al., 2007). Specifically, STROBE checklists exist for cohort studies, case-control studies, and cross-sectional studies (Vandenbroucke et al., 2007). STROBE also makes available an Explanation and Elaboration article, which discusses each checklist item and provides examples of transparent reporting. The Explanation and Elaboration article can be found at https://doi.org/10.1371/journal.pmed.0040297, and the full text of all checklists can be found at https://www.strobe-statement.org/index.php?id=available-checklists.

Additional Reporting Guideline Resource
In addition to the selected guidelines summarized above, Enhancing the QUAlity and Transparency Of health Research (EQUATOR Network) is a useful resource for identifying additional reporting guidelines. The EQUATOR Network, founded in 2006 and funded by the UK National Health Service (NHS) National Knowledge Service, currently maintains a library that contains over 200 reporting guidelines (Moher et al., 2014). In addition, the EQUATOR Network provides extensive tool kits to improve the reporting of health research studies and can be found at https://www.equator-network.org.

SUMMARY AND CONCLUSIONS
Critical appraisal tools help nurses move from subjective evaluation toward a more objective and analytical assessment of evidence. Reporting guidelines improve both transparency and the quality of publications and reports. Together these tools help nurses attain EBP competencies (Melnyk et al., 2017) as well as improve general critical thinking skills (Whiffin & Hasselder, 2013).
Although critical appraisal tools and reporting guidelines are useful tools that have the potential to improve scholarship and EBP, identifying and selecting the appropriate tool is a potentially challenging and frustrating experience for both novice and expert consumers and reporters of evidence. By providing clear descriptions of each tool, as well as tables that provide easy reference for matching the type of tool with an article’s methodology, this article lessens that challenge and minimizes frustration.
Facilitating the selection of appropriate critical appraisal tools and reporting guidelines is useful to nurses with varying levels of competency in EBP. Nurses who are just learning how to critically appraise research and other types of evidence will find the overview of the different types of critical appraisal tools particularly useful. For those with more advanced EBP competencies, this article will serve as both a resource for selecting a critical appraisal tool that can be used during the evidence review process, and as a resource for identifying reporting guidelines for use when writing up reports to disseminate evidence. WVN

LINKING EVIDENCE TO ACTION
• Practicing registered nurses and advanced practice nurses must be able to critically appraise and disseminate evidence in order to meet EBP competencies.
• Differentiating between a critical appraisal tool and a reporting guideline is an essential EBP skill, as is selecting the appropriate tool or guideline.
• Selecting the appropriate critical appraisal tool or reporting guideline has the potential to make the critical appraisal and publishing processes more effective and less frustrating and laborious.
• Increased use of critical appraisal tools and reporting guidelines will support EBP and improve nursing practice.

Author information
Robin K. Buccheri, Professor, School of Nursing & Health Professions, University of San Francisco, San Francisco, CA, USA; Claire Sharifi, Reference Librarian and Primary Liaison, School

8 Worldviews on Evidence-Based Nursing, 2017; 00:0, 1–10.



C 2017 Sigma Theta Tau International
of Nursing & Health Professions, Gleeson Library, Geschke Center, University of San Francisco, San Francisco, CA 94117, USA

Address correspondence to Dr. Robin Buccheri, School of Nursing & Health Professions, Cowell Hall #222, University of San Francisco, 2130 Fulton Street, San Francisco, CA, USA; buccherir@usfca.edu

Accepted 2 June 2017
Copyright © 2017, Sigma Theta Tau International

References

Alnahedh, T., Suttle, C. M., Alabdelmoneam, M., & Jalbert, I. (2015). Optometrists show rudimentary understanding of evidence-based practice but are ready to embrace it: Can barriers be overcome? Clinical and Experimental Optometry, 98(3), 263–272. https://doi.org/10.1111/cxo.12238

Brouwers, M. C., Kerkvliet, K., & Spithoff, K., on behalf of the AGREE Next Steps Consortium. (2016). The AGREE Reporting Checklist: A tool to improve reporting of clinical practice guidelines. British Medical Journal, 352(3), e1–e2. https://doi.org/10.1136/bmj.i1152

Brouwers, M., Kho, M. E., Browman, G. P., Burgers, J. S., Cluzeau, F., Feder, G., . . . Zitzelsberger, L., for the AGREE Next Steps Consortium. (2010). AGREE II: Advancing guideline development, reporting and evaluation in healthcare. Canadian Medical Association Journal, 182, 839–842. https://doi.org/10.1503/090449

Consort: Transparent Reporting of Trials. (2010). Welcome to the CONSORT Web site. Retrieved from https://www.consort-statement.org/

Critical Appraisal Skills Programme. (2017). CASP checklists. Retrieved from https://www.casp-uk.net/checklists

Dearholt, S., & Dang, D. (2012). Johns Hopkins nursing evidence-based practice: Models and guidelines (2nd ed.). Indianapolis, IN: Sigma Theta Tau International.

Deery, C. (2017). Clinical practice guidelines proposed the use of pit and fissure sealants to prevent and arrest noncavitated carious lesions. Journal of Evidence-Based Dental Practice, 17(1), 48–50. https://doi.org/10.1016/j.jebdp.2017.01.008

Dellinger, R. P., Levy, M. M., Rhodes, A., Annane, D., Gerlach, H., Opal, S. M., . . . Zimmerman, J. L. (2013). Surviving sepsis campaign: International guidelines for management of severe sepsis and septic shock: 2012. Critical Care Medicine, 41(2), 580–637. https://doi.org/10.1097/CCM.0b013e31827e83af

Dijkers, M. (2013). Introducing GRADE: A systematic approach to rating evidence in systematic reviews and to guideline development. KT Update: Center on Knowledge Translation for Disability and Rehabilitation Research, 1(5), e1–e9. Retrieved from https://www.ktdrr.org/products/update/v1n5/

Duffy, J. R. (2005). Critically appraising quantitative research. Nursing and Health Sciences, 7(4), 281–283. https://doi.org/10.1111/j.1442-2018.2005.00248.x

GRADE Working Group. (n.d.). What is GRADE? Retrieved from https://gradeworkinggroup.org/

Gutierrez, E., Silbert-Flagg, J., & Vohra, S. (2014). Natural health product use and management in pediatrics: An integrative review. European Journal of Integrative Medicine, 6(2), 226–233. https://doi.org/10.1016/j.eujim.2013.12.020

Guyatt, G., Oxman, A. D., Akl, E. A., Kunz, R., Vist, G., Brozek, J., . . . Schunemann, H. J. (2011). GRADE guidelines: 1. Introduction—GRADE evidence profiles and summary of findings tables. Journal of Clinical Epidemiology, 64, 383–394.

Hall, H., Leach, M., Brosnan, C., & Collins, M. (2017). Nurses' attitudes towards complementary therapies: A systematic review and meta-synthesis. International Journal of Nursing Studies, 69, 47–56. https://doi.org/10.1016/j.ijnurstu.2017.01.008

Higgins, J. P. T., Altman, D. G., & Sterne, J. A. C. (2011). Assessing risk of bias in included studies. In J. P. T. Higgins & S. Green (Eds.), Cochrane handbook for systematic reviews of interventions (Version 5.1.0). Cochrane Collaboration. Retrieved from http://handbook-5-1.cochrane.org/

Hoffman Snyder, C. R., & Facchiano, L. (2011). An evidence-based critical appraisal of a topic: Effectiveness of high dose donepezil for advanced Alzheimer's disease. Journal for Nurse Practitioners, 7(3), 201–206. https://doi.org/10.1016/j.nurpra.2011.01.018

Joanna Briggs Institute. (2016). Critical appraisal tools. Retrieved from https://joannabriggs.org/research/critical-appraisal-tools.html

Katrak, P., Bialocerkowski, A. E., Massy-Westropp, N., Kumar, V. S., & Grimmer, K. A. (2004). A systematic review of the content of critical appraisal tools. BMC Medical Research Methodology, 4(22), e1–e11. https://doi.org/10.1186/1471-2288-4-22

Lee, M. C., Johnson, K. L., Newhouse, R. P., & Warren, J. I. (2013). Evidence-based practice process quality assessment: EPQA Guidelines. Worldviews on Evidence-Based Nursing, 10(3), 140–149. https://doi.org/10.1111/j.1741-6787.2012.00264.x

Liberati, A., Altman, D. G., Tetzlaff, J., Mulrow, C., Gøtzsche, P. C., Ioannidis, J. P. A., . . . Moher, D. (2009). The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate healthcare interventions: Explanation and elaboration. BMJ, 339, e1–e34. https://doi.org/10.1136/bmj.b2700

Masood, M., Thaliath, E. T., Bower, E. J., & Newton, J. T. (2011). An appraisal of the quality of published qualitative dental research. Community Dentistry and Oral Epidemiology, 39(3), 193–203. https://doi.org/10.1111/j.1600-0528.2010.00584.x

Melnyk, B. M., & Fineout-Overholt, E. (2015). Evidence-based practice in nursing and healthcare: A guide to best practice. Philadelphia, PA: Wolters Kluwer Health.

Melnyk, B. M., & Gallagher-Ford, L. (2015). Implementing the new essential evidence-based competencies in real-world clinical and academic settings: Moving from evidence to action in improving healthcare quality and patient outcomes (Editorial). Worldviews on Evidence-Based Nursing, 12(2), 67–69.

Melnyk, B. M., Gallagher-Ford, L., & Fineout-Overholt, E. (2017). Implementing the evidence-based practice (EBP) competencies in healthcare. Indianapolis, IN: Sigma Theta Tau International.

Melnyk, B. M., Gallagher-Ford, L., Long, L. E., & Fineout-Overholt, E. (2014). The establishment of evidence-based practice competencies for practicing registered nurses and advanced practice nurses in real-world clinical settings: Proficiencies to improve healthcare quality, reliability, patient outcomes, and costs. Worldviews on Evidence-Based Nursing, 11(1), 5–15.

Milner, K. A. (2014). 10 steps from EBP project to publication. Nursing, 44(11), 53–56. https://doi.org/10.1097/01.NURSE.0000454954.80525.8c

Minges, K. E., & Redeker, N. S. (2016). Delayed school start times and adolescent sleep: A systematic review of the

experimental evidence. Sleep Medicine Reviews, 28, 82–91. https://doi.org/10.1016/j.smrv.2015.06.002

Moher, D., Altman, D. G., Schulz, K. F., Simera, I., & Wager, E. (2014). Guidelines for reporting health research: A user's manual. Chichester, UK: Wiley.

Moher, D., Liberati, A., Tetzlaff, J., Altman, D. G., & The PRISMA Group. (2009). Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. PLoS Med, 6(7), e1000097. https://doi.org/10.1371/journal.pmed.1000097

National Library of Medicine. (2015). Research reporting guidelines and initiatives: By organization. Retrieved from https://www.nlm.nih.gov/services/research_report_guide.html

O'Brien, K., Bracht, M., Robson, K., Ye, X. Y., Mirea, L., Cruz, M., . . . Lee, S. K. (2015). Evaluation of the family integrated care model of neonatal intensive care: A cluster randomized controlled trial in Canada and Australia. BMC Pediatrics, 15, e1–e9. https://doi.org/10.1186/s12887-015-0527-0

Ogrinc, G., Davies, L., Goodman, D., Batalden, P., Davidoff, F., & Stevens, D. (2016). SQUIRE 2.0 (Standards for QUality Improvement Reporting Excellence): Revised publication guidelines from a detailed consensus process. BMJ Quality & Safety, 25(12), 986–992. https://doi.org/10.1136/bmjqs-2015-004411

Paton, J., Hatton, A. L., Rome, K., & Kent, B. (2016). Effects of foot and ankle devices on balance, gait and falls in adults with sensory perception loss: A systematic review. JBI Database of Systematic Reviews and Implementation Reports, 14(12), 127–162. https://doi.org/10.11124/JBISRIR-2016-003229

Repique, R. J. R., Vernig, P. M., Lowe, J., Thompson, J. A., & Yap, T. L. (2016). Implementation of a recovery-oriented training program for psychiatric nurses in the inpatient setting: A mixed-methods hospital quality improvement study. Archives of Psychiatric Nursing, 30(6), 722–728. https://doi.org/10.1016/j.apnu.2016.06.003

Samaan, Z., Mbuagbaw, L., Kosa, S., Borg Debono, V., Dillenburg, R., Zhang, S., . . . Thabane, L. (2013). A systematic scoping review of adherence to reporting guidelines in health care literature. Journal of Multidisciplinary Healthcare, 2013(6), 169–188. https://doi.org/10.2147/JMDH.S43952

Santos, S. C. V. O., Woith, W., Freitas, M. I. P., & Zeferino, E. B. B. (2016). Methods to determine the internal length of nasogastric feeding tubes: An integrative review. International Journal of Nursing Studies, 61, 95–103. https://doi.org/10.1016/j.ijnurstu.2016.06.004

Smith, T. O., Walker, J., & Russell, N. (2007). Outcomes of medial patellofemoral ligament reconstruction for patellar instability: A systematic review. Knee Surgery, Sports Traumatology, Arthroscopy, 15(11), 1301–1314. https://doi.org/10.1007/s00167-007-0390-0

SQUIRE. (2017). SQUIRE 2.0 guidelines. Retrieved from https://www.squire-statement.org/index.cfm?fuseaction=Page.ViewPage&PageID=471

Tong, A., Flemming, K., McInnes, E., Oliver, S., & Craig, J. (2012). Enhancing transparency in reporting the synthesis of qualitative research: ENTREQ. BMC Medical Research Methodology, 12(181), e1–e8. https://doi.org/10.1186/1471-2288-12-181

Tong, A., Sainsbury, P., & Craig, J. (2007). Consolidated criteria for reporting qualitative research (COREQ): A 32-item checklist for interviews and focus groups. International Journal for Quality in Health Care, 19(6), 349–357.

Tremblay, M. S., LeBlanc, A. G., Janssen, I., Kho, M. E., Hicks, A., Murumets, K., . . . Duggan, M. (2011). Canadian sedentary behaviour guidelines for children and youth. Applied Physiology, Nutrition and Metabolism, 36(1), 59–64. https://doi.org/10.1139/H11-012

Turner, L., Shamseer, L., Altman, D. G., Weeks, L., Peters, J., Kober, T., . . . Moher, D. (2012). Consolidated standards of reporting trials (CONSORT) and the completeness of reporting of randomised controlled trials (RCTs) published in medical journals. Cochrane Database of Systematic Reviews, 11, MR000030. https://doi.org/10.1002/14651858.MR000030.pub2

Vandenbroucke, J. P., Von Elm, E., Altman, D. G., Gøtzsche, P. C., Mulrow, C. D., Pocock, S. J., . . . STROBE Initiative. (2007). Strengthening the reporting of observational studies in epidemiology (STROBE): Explanation and elaboration. PLoS Med, 4(10), e297. https://doi.org/10.1371/journal.pmed.0040297

van Esch, B. F., Stegeman, I., & Smit, A. L. (2017). Comparison of laryngeal mask airway vs tracheal intubation: A systematic review on airway complications. Journal of Clinical Anesthesia, 36, 142–150. https://doi.org/10.1016/j.jclinane.2016.10.004

Walston, J. M., Cabrera, D., Bellew, S. D., Olive, M. N., Lohse, C. M., & Bellolio, M. F. (2016). Vital signs predict rapid-response team activation within twelve hours of emergency department admission. Western Journal of Emergency Medicine: Integrating Emergency Care with Population Health, 17(3), 324–330. https://doi.org/10.5811/westjem.2016.2.28501

Whiffin, C. J., & Hasselder, A. (2013). Making the link between critical appraisal, thinking and analysis. British Journal of Nursing, 22(14), 831–835.

doi 10.1111/wvn.12258
WVN 2017;00:1–10
