
Original article

Development of key performance indicators for emergency departments in Ireland using an electronic modified-Delphi consensus approach

Abel Wakai^a, Ronan O'Sullivan^b,c, Paul Staunton^d, Cathal Walsh^e, Fergal Hickey^f and Patrick K. Plunkett^g

Objective The objective of this study was to develop a consensus among emergency medicine (EM) specialists working in Ireland for emergency department (ED) key performance indicators (KPIs).

Methods The method employed was a three-round electronic modified-Delphi process. An online questionnaire with 54 potential KPIs was set up for round 1 of the Delphi process. The Delphi panel consisted of all registered EM specialists in Ireland. Each indicator on the questionnaire was rated using a five-point Likert-type rating scale. Agreement was defined as at least 70% of the responders rating an indicator as 'agree' or 'strongly agree' on the rating scale. Data were analysed using standard descriptive statistics. Data were also analysed as the mean of the Likert rating with 95% confidence intervals (95% CIs). Sensitivity of the ratings was examined for robustness by bootstrapping the original sample. Statistical analyses were carried out using SPSS version 16.0.

Results The response rates in rounds 1, 2 and 3 were 86, 88 and 88%, respectively. Ninety-seven potential indicators reached agreement after the three rounds. In the context of the Donabedian structure–process–outcome framework of performance indicators, 41 (42%) of the agreed indicators were structure indicators, 52 (54%) were process indicators and four (4%) were outcome indicators. Overall, the top-three highest rated indicators were: presence of a dedicated ED clinical information system (4.7; 95% CI 4.6–4.9), ED compliance with minimum design standards (4.7; 95% CI 4.5–4.8) and time from ED arrival to first ECG in suspected cardiac chest pain (4.7; 95% CI 4.5–4.9). The top-three highest rated indicators specific to clinical care of children in EDs were: time to administration of antibiotics in children with suspected bacterial meningitis (4.6; 95% CI 4.5–4.8), separate area available within EDs (seeing both adults and children) to assess children (4.4; 95% CI 4.2–4.6) and time to administration of analgesia in children with forearm fractures (4.4; 95% CI 4.2–4.7).

Conclusion Employing a Delphi consensus process, it was possible to reach a consensus among EM specialists in Ireland on a suite of 97 KPIs for EDs. European Journal of Emergency Medicine 20:109–114 © 2013 Wolters Kluwer Health | Lippincott Williams & Wilkins.

European Journal of Emergency Medicine 2013, 20:109–114

Keywords: Delphi technique, emergency departments, emergency medicine, key performance indicators, performance indicators

^aEmergency Care Research Unit (ECRU), Division of Population Health Sciences, Royal College of Surgeons in Ireland (RCSI), Dublin, ^bEmergency Department, Our Lady's Children's Hospital, Crumlin, ^cDepartment of Paediatrics, University College Dublin (UCD), ^dDepartment of Emergency Medicine, Beaumont Hospital, ^eDepartment of Statistics, Trinity College, Dublin, ^fDepartment of Emergency Medicine, Sligo General Hospital, Sligo and ^gDepartment of Emergency Medicine, St James's Hospital/School of Medicine, Faculty of Health Sciences, Trinity College, Dublin, Ireland

Correspondence to Abel Wakai, MD, FRCSI, FCEM, Royal College of Surgeons in Ireland (RCSI), Dublin, Ireland
Tel: +353 1 402 2304; fax: +353 1 402 2764; e-mail: awakai@rcsi.ie

Received 6 September 2011 Accepted 27 January 2012

Supplemental digital content is available for this article. Direct URL citations appear in the printed text and are provided in the HTML and PDF versions of this article on the journal's website (www.euro-emergencymed.com).

0969-9546 © 2013 Wolters Kluwer Health | Lippincott Williams & Wilkins. DOI: 10.1097/MEJ.0b013e328351e5d8

Introduction
Performance monitoring is a continuous process that involves collecting data to determine whether a service meets desired standards or targets. It is dependent on the availability of good-quality information on healthcare, which can only be achieved by having a systematic process to ensure that data are collected consistently, both within and across organizations. One tool that is frequently used to assist performance monitoring at an organizational level, and which can subsequently contribute to improvement of quality and safety, is key performance indicators (KPIs).

KPIs are performance indicators in key areas of a service. They are specific and measurable elements of healthcare that can be used to assess the quality of care. They are measures of performance, based on standards determined through evidence-based academic literature or through the consensus of experts when evidence is unavailable [1]. They are used to identify areas in which performance is good and meets desired standards and those in which it requires improvement. In and of themselves, KPIs cannot improve quality; however, they can effectively act as flags or alerts to identify good practice and provide comparability within and between similar services, identifying areas having opportunities for improvement and those in which more detailed investigation of standards is warranted. Increasing numbers of countries are using performance
indicators to evaluate the quality of clinical care, whereas some link payment to achievement [2].

Traditionally, measurement of emergency department (ED) performance has centred on time targets [3,4]. Although standards based on time targets may reduce ED patient stay, there are dangers in focusing too heavily on only time spent in the ED. Time targets are a blunt instrument for monitoring ED performance, as they do not differentiate the time spent on administering active treatment to the patient from the time spent by the patient in just waiting for the next step in their care [5]. Furthermore, time targets incentivize moving the patient rapidly through the ED, without a counterbalance to ensure that the patient receives the highest quality of care [5].

Modern EDs perform ever increasing numbers of diagnostic investigations and therapeutic interventions before patient disposition [5]. Consequently, such EDs require a more balanced suite of KPIs to measure ED quality of care. Suites of KPIs should ideally reflect not only timeliness but also quality of care and clinical outcomes. The development of a suite of KPIs suitable for audit is also essential in defining the role of the ED and monitoring the standard of care delivered by emergency medicine (EM) within the healthcare system. The aim of this study was to develop a consensus among EM specialists working in Ireland for a suite of KPIs aimed at monitoring the performance of EDs in Ireland.

Methods
A systematic review of the academic literature was carried out to establish the evidence base for performance indicators directly referable to EM practice in Ireland. The proposed study was approved in April 2009 by the Joint Research Ethics Committee of St James's Hospital and the Adelaide and Meath Hospital incorporating the National Children's Hospital, Dublin. A three-round electronic modified-Delphi study was conducted between September 2009 and October 2010.

Systematic literature review
We searched the following electronic databases to identify studies on ED performance indicators directly referable to EM practice in Ireland: HealthStar (1966 to September 2009), MEDLINE (1950 to October 2009), Embase (1980 to October 2009) and the System for Information on Grey Literature in Europe. Additional efforts were made to locate eligible studies by cross-referencing from the reference lists of major articles on the subject. No language or other limitations were imposed.

Delphi panel recruitment and sample
Purposeful sampling was used to ensure that appropriate experts were invited to participate [6,7]. A Delphi panel of all registered EM specialists (Consultants in Emergency Medicine and Fellows of the College of Emergency Medicine awaiting Consultant appointment) working in Ireland was recruited by initially using personal contacts and then employing the 'snowballing' technique [8]. One study investigator (A.W.) made contact with all EM specialists in Ireland (through electronic mail), inviting them to participate in the Delphi process. Each specialist was sent an introductory letter describing the study.

Delphi process
An online/web-based electronic questionnaire was used for all rounds of this Delphi process. The first-round questionnaire consisted of items (KPIs) derived from the systematic review of the academic literature and considered by the study investigators as relevant to EM practice in Ireland.

A pilot study was carried out using four eligible Delphi panellists for each Delphi round to refine the study questionnaire. At the pilot stage, the questionnaire items and the format of the information were agreed upon. After the pilot study, the weblink to the study questionnaire was sent to the panellists and they were asked to rate each KPI electronically (by clicking on the appropriate answer) using a 5-point Likert rating scale that ranged from 'strongly disagree' (1) to 'strongly agree' (5). The panellists then submitted their responses to the questionnaire electronically to a data collection office. The study investigators were blinded to the source of the data submitted by the data collection officer, who was not involved in the interpretation of the study.

At least 70% of Delphi panellists had to rate a KPI in the high 'agreement' range of scores (score 4 or 5) for the indicator to be selected for further inclusion and development. In the questionnaire for each Delphi round, panellists were given the opportunity to suggest additional KPIs that they felt were appropriate to be considered in the next round of the process.

Once the KPIs were rated by panellists, the next step involved identifying the KPIs rated in the high 'agreement' range of scores (score 4 or 5) and the additional KPIs suggested by the panellists. KPIs that did not reach agreement (<70% of the panellists rating the KPI in the high 'agreement' range of scores) in rounds 1 and 2 were included in the following Delphi round. KPIs that reached agreement and statements that reached consensus at the end of each round were not reiterated in subsequent rounds. Panellists were asked to rerate each KPI that did not reach agreement in the subsequent round in the light of the results and the qualitative comments of respondents in the previous round.
Data collection and analyses
The study involved three phases of data collection and analyses.

Phase one
Fifty-four potential ED KPIs informed by the results of the systematic review of the academic literature and
considered by the study investigators to be relevant to EM practice in Ireland provided the starting point for this Delphi process. The weblink to the online first-round questionnaire containing a set of 54 potential indicators was sent to all the EM specialists in Ireland.

The study investigators were provided with responses in a blind format by the data collection office. A summary report of the first-round results was prepared within a month of completion of the first Delphi round and the results were sent back to the panellists. This comprised graphs showing the collated responses from the first-round questionnaire and a synopsis of any qualitative comments made by respondents to that round.

Phase two
The panellists were sent a second-round questionnaire asking them to rate KPIs suggested by respondents to the Delphi round-1 questionnaire and to rerate the KPIs that had not reached agreement in the previous round.

The same 5-point Likert rating scale used in the first-round questionnaire was used for the second-round questionnaire. The KPIs that reached agreement and the statements that reached consensus at the end of round 2 were not reiterated in round 3. The same predetermined decision rule of at least 70% of panellists agreeing (rating of 4 or 5) was used to determine whether a KPI reached agreement.

Phase three
In the final round of the Delphi study, those KPIs that had not reached consensus at the end of round 2, together with a summary of the rest of the panel's findings, were presented. In this round, we used the same 5-point Likert rating scale and the same predetermined decision rule regarding agreement used in the previous two rounds.

Data and statistical analysis
Data were collated online and analysed using SPSS version 16.0 (SPSS Inc., Chicago, Illinois, USA). Data were analysed using standard descriptive statistics. The mean of the Likert rating and 95% confidence interval (95% CI) were also calculated for the data from the questionnaire of each Delphi round. The proportion of individuals in each Likert category for each KPI was also reported in the summary report of each Delphi round. The sensitivity of the ratings was examined for robustness by bootstrapping the original sample.
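As a minimal illustration of these analyses, the sketch below (Python with NumPy, used here purely for illustration; the study itself used SPSS) computes the mean Likert rating with a conventional 95% CI and then checks its robustness with a bootstrap percentile interval obtained by resampling the original sample. The panel ratings are hypothetical.

```python
# Hypothetical sketch: mean Likert rating with a conventional 95% CI,
# plus a bootstrap percentile interval obtained by resampling the
# original sample with replacement, as a robustness check.
import numpy as np

rng = np.random.default_rng(42)
ratings = np.array([5, 5, 4, 4, 5, 4, 3, 5, 4, 4])  # one KPI, ten panellists

# Conventional 95% CI: mean +/- 1.96 * standard error of the mean.
mean = ratings.mean()
sem = ratings.std(ddof=1) / np.sqrt(len(ratings))
print(f"mean {mean:.2f}, 95% CI {mean - 1.96 * sem:.2f} to {mean + 1.96 * sem:.2f}")

# Bootstrap: resample the panel 10,000 times and take the 2.5th and
# 97.5th percentiles of the resampled means.
boot_means = np.array([
    rng.choice(ratings, size=len(ratings), replace=True).mean()
    for _ in range(10_000)
])
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"bootstrap 95% interval {lo:.2f} to {hi:.2f}")
```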
Results
Systematic literature review
Of the 693 citations identified by the comprehensive literature search, there was no publication on performance indicators directly referable to EM practice in Ireland.

Response rate and number of potential key performance indicator questionnaire items
In the first Delphi round, 41 of the 48 (86%) EM specialists invited to participate responded to the questionnaire. The first-round questionnaire contained 54 potential KPIs, of which 36 (67%) reached agreement. In the second Delphi round, 50 of the 57 (88%) EM specialists invited to participate responded to the questionnaire. The second-round questionnaire contained 68 potential KPIs, of which 47 (69%) reached agreement. In the third Delphi round, 52 of the 59 (88%) EM specialists invited to participate responded to the questionnaire, which contained 36 potential indicators, of which 15 (42%) reached agreement. The increase in the number of specialists eligible to be Delphi panellists from rounds 1 to 3 was the result of newly qualified specialists and new Consultant appointments in the specialty of EM in Ireland during the study period.

Donabedian classification of key performance indicators that reached agreement
After the three Delphi rounds, 97 of the potential KPIs in the study questionnaires reached agreement (see Supplementary Tables 1–3, Supplemental digital content 1, http://links.lww.com/EJEM/A18). In terms of the Donabedian structure–process–outcome framework, 41 (42%) of the indicators that reached agreement were structure indicators, 52 (54%) were process indicators and four (4%) were outcome indicators.

Key performance indicators that reached agreement
Overall, the top-three highest rated indicators (reported as the mean values of the Likert rating with 95% CIs) after all three Delphi rounds were as follows:

(1) presence of a dedicated ED clinical information system (4.7; 95% CI 4.5–4.9);
(2) ED compliance with minimum design standards recommended by the Irish Association for Emergency Medicine (IAEM) (4.7; 95% CI 4.5–4.8);
(3) time from ED arrival to first electrocardiogram in patients with suspected cardiac chest pain (4.7; 95% CI 4.5–4.9).

Seven potential KPIs specific to Paediatric Emergency Medicine (PEM) reached agreement across all three Delphi rounds. The first-round questionnaire contained 11 potential KPIs specific to PEM, three of which reached agreement. The second-round questionnaire contained two potential KPIs specific to PEM, both of which reached agreement. The third-round questionnaire contained nine potential indicators specific to PEM, two of which reached agreement.


The top-three highest rated indicators (reported as means of the Likert rating with 95% CIs) specific to PEM were as follows:

(1) time to administration of antibiotics in children with suspected bacterial meningitis (4.6; 95% CI 4.5–4.8);
(2) separate area to assess children in EDs treating both adults and children (4.4; 95% CI 4.2–4.6);
(3) time to administration of analgesia in children with forearm fractures (4.4; 95% CI 4.2–4.7).

Discussion
The aim of this study was to develop a consensus among EM specialists working in Ireland for a suite of KPIs to monitor the performance of EDs in Ireland. A Delphi technique was used to reach consensus in this study as it is a widely accepted method of achieving convergence of opinion on real-world knowledge solicited from experts within a topic [9–12].

We initially carried out a systematic literature review for two reasons: first, to ensure that there were no previous studies on KPIs directly referable to EM practice in Ireland; second, to inform our choice of indicators for the Delphi questionnaires we used in this study. The systematic review confirmed the existence of a knowledge gap in relation to evidence-based KPIs for EDs in Ireland. These findings are consistent with the previously identified knowledge gap in relation to a robust evidence base for ED quality-of-care indicators and performance measures [13–15].

There was a high participation rate by the Delphi panellists (all EM specialists in Ireland) invited to participate in this study. A typical Delphi panel size ranges from 15 to 35, with the expectation that 35–75% of invitees will actually participate [6]. In this study, the participation rates in Delphi rounds 1, 2 and 3 were 86, 88 and 88%, respectively. This makes the findings of this Delphi process particularly robust, diminishing the likelihood that the findings are compromised by nonresponse error. Furthermore, when we investigated the location of the nonrespondents, there was no geographical bias or practice bias in relation to their EM practice (adult EM, adult and paediatric EM, and paediatric EM only). The robustness of the findings of this study was further confirmed by examining the sensitivity of the Likert ratings in each Delphi round by bootstrapping (resampling) the original samples.

In the context of the Donabedian structure–process–outcome framework for quality-of-care indicators, the findings of this study are consistent with the practical reality of the majority of quality improvement initiatives [16]. The majority (54%) of the 97 KPIs that reached agreement were process indicators. These are events that occur while the patient is in the ED [17]. Process indicators are accessible, practical and amenable to change and are thus usually the focus of quality improvement efforts. By contrast, only 4% of the 97 KPIs that reached agreement were outcome indicators (those events occurring after the patient leaves the ED). Outcome indicators evaluate the effect of care and typically include readmission, mortality, morbidity, patient satisfaction and quality of life [13,17]. Although policymakers and patients have shown greater interest in outcome measures than in either structure or process measures, few outcome measures have currently been validated for implementation in the emergency-care setting [13,18,19]. This is because outcome measures are challenging and difficult to validate in the ED setting. Methods of risk adjustment in the ED setting are imperfect, and attributing a patient's clinical outcome after ED care to whether or not a specific treatment intervention was performed is generally difficult [20,21]. Furthermore, as admitted patients spend a relatively small fraction of their overall hospital stay in the ED, attribution of patient outcomes to ED care alone is probably invalid.

Using ED patient satisfaction as an outcome indicator is controversial because of poor response rates and the negative association with ED overcrowding, both of which are often out of the control of the providers of ED care [22]. In this study, use of ED patient satisfaction was proposed as a KPI by respondents in round 2. It was therefore included for consideration by the Delphi panellists in the round-3 questionnaire but failed to reach agreement; only 67% of the panellists in round 3 rated patient satisfaction as a potential ED KPI in the high 'agreement' range of scores (score 4 or 5).

The two highest rated indicators at the end of all three Delphi rounds in this study were structure indicators, namely the presence of a dedicated ED clinical information system and ED compliance with the IAEM-recommended minimum design standards. Structure indicators relate to aspects that are present before the patient visits the ED [17]. They are related to the characteristics of the care environment and include the physical layout of the department, the equipment (e.g. the presence of an electronic health record), available laboratory facilities, staff resources (e.g. the nurse-to-patient ratio), protocols, clinical guidelines and the accreditation status of the ED [17]. ED management often focuses much attention on structural issues (building, equipment, protocols); although structural issues may be relevant to quality improvement initiatives, they are not the primary focus of most such initiatives. Nevertheless, the findings of this study suggest that EM specialists in Ireland currently consider structural issues an important factor in ED quality improvement initiatives in Ireland. This view is supported by numerous position statements published by IAEM highlighting the importance of structural issues in
relation to the improvement of the quality of ED patient care in Ireland [23–25]. This may be taken as an indication of a frustrated focus on poorly implemented recommendations.

Although in this consensus study the majority of potential KPIs included in the questionnaire for each Delphi round were applicable to both adult and paediatric ED practice (e.g. time to administration of antibiotics in sepsis of any cause, proportion of ED patients triaged within the accepted timeframe of the triage scale used), the questionnaire for each round also contained potential KPIs specific to the ED care of children (e.g. time to administration of analgesia in children with forearm fractures and time to administration of antipyretics in children with a temperature greater than 38.5°C if not given in the preceding 6 h). Some of the KPIs relevant to both adult and paediatric EM that reached agreement in this study have also been chosen as clinical indicators of ED care in a previous consensus study on the ED care of children [14]. For example, in this study the proportion of patients who make an unscheduled return attendance to the ED within 7 days of discharge reached agreement as an ED KPI.

Valid and reliable KPIs depend on the availability of high-quality data. Consequently, both the WHO and, in Ireland, the Health Information and Quality Authority recommend performing a feasibility analysis before using KPIs for performance monitoring in healthcare services. This is because the feasibility of collecting the relevant minimum data set (MDS) is always a limiting factor. An MDS is defined as the core data identified as the minimum required to measure performance for a KPI [1,26]. However, this study did not aim to measure criteria such as the feasibility, reliability (interrater reliability and test–retest reliability), sensitivity and specificity of KPIs; future studies are required to measure these criteria as part of the process of building a framework for measuring ED quality of care in Ireland. In the meantime, using a scientifically valid methodology, we have started to measure the availability, reliability and internal consistency of MDS items relevant to some of the KPIs that reached consensus in this study [27].

Study limitations
Although the Delphi technique is a widely used and accepted consensus-building method for gathering data from respondents within their area of expertise [9] and has been used previously to develop quality-of-clinical-care indicators for EDs [10–12], this study has some limitations.

First, the focus of this study was on ED patient care, and the Delphi panel consisted of medical specialists in EM only, rather than focusing on the emergency care 'system' involving other emergency care providers and settings (e.g. prehospital emergency care). Performance indicators that are based on separate services may be less patient-oriented than measures based on a 'systems' approach to emergency and urgent care [12]. However, we consider this a necessary starting point, not the end point, for the development of performance indicators for the emergency care system in Ireland.

Second, the external validity of the findings of this study may be limited because many of the agreed KPIs probably reflect what the Delphi panellists consider priorities in relation to ED quality improvement in Ireland. This may not be directly transferable to EM practice in other countries. Nevertheless, the methodological framework for this study and some of the agreed KPIs may inform the development of ED KPIs in other countries.

Third, some of the agreed KPIs in this study may be considered to be outside the control of ED care providers (e.g. the proportion of admitted patients transferred to an inpatient ward within 6 h of ED arrival). Nevertheless, this does not diminish the value of such KPIs, which can still effectively act as flags or alerts to identify good practice, provide comparability between EDs and hospitals and identify areas of opportunity for improvement and those in which a more detailed investigation of standards is warranted [1].

Fourth, as a suite, the KPIs are generic and 'context-free', and their interpretation in the broader Irish healthcare system may necessarily be limited.

Finally, the study investigators must assume that all the Delphi panellists, as specialists in EM, had relevant knowledge of the evidence base for all 97 KPIs that reached consensus. However, it is conceivable that not all panellists were aware of the evidence supporting all the KPIs that reached agreement.

Conclusion
Improving the quality of ED care requires EM specialists to first define performance indicators relevant to key areas of ED patient care. In this study, EM specialists in Ireland reached consensus on a suite of 97 KPIs to monitor the performance of EDs in Ireland. This suite of KPIs may be useful in starting to build a framework for measuring ED quality of care in Ireland.

Acknowledgements
This study was supported by grants from the following: the Irish Association for Emergency Medicine; the National Children's Research Centre (NCRC), Our Lady's Children's Hospital, Crumlin, Dublin, Ireland; and the Department of Emergency Medicine Research and Education Fund, St James's Hospital, Dublin, Ireland. The authors thank all Emergency Medicine specialists in Ireland for their support and participation in this study.

Conflicts of interest
There are no conflicts of interest.

References
1 Health Information and Quality Authority (HIQA). Guidance on developing key performance indicators and minimum data sets to monitor healthcare quality, September 2010. Available at: http://www.hiqa.ie/publication/guidance-developing-key-performance-indicators-kpis-and-minimum-data-sets-monitor-health. [Accessed 27 July 2011].
2 Lindenauer PK, Remus D, Roman S, Rothberg MB, Benjamin EM, Ma A, Bratzler DW. Public reporting and pay for performance in hospital quality improvement. N Engl J Med 2007; 356:486–496.
3 Health Service Executive (HSE). Emergency department task force report, 2007. Available at: http://www.hse.ie/eng/services/Publications/services/Hospitals/HSE_Publications/Emergency_Department_Task_Force_Report_.pdf. [Accessed 24 January 2011].
4 NHS Executive. Quality and performance in the NHS: NHS performance indicators. London: Department of Health; 2000.
5 Department of Health. A&E clinical quality indicators implementation guidance. Available at: http://www.dh.gov.uk/en/Publicationsandstatistics/Publications/PublicationsPolicyAndGuidance/DH_122868. [Accessed 24 January 2011].
6 Steele R, Bosma H, Johnston MF, Cadell S, Davies B, Siden H, Straatman L. Research priorities in paediatric palliative care: a Delphi study. J Palliat Care 2008; 24:229–239.
7 Ospina MB, Bond K, Schull M, Innes G, Blitz S, Rowe BH. Key indicators of overcrowding in Canadian emergency departments: a Delphi study. CJEM 2007; 9:378–379.
8 Mason J. Qualitative researching. Thousand Oaks, CA, USA: Sage Publications; 1996.
9 Hsu C, Sandford BA. The Delphi technique: making sense of consensus. Pract Assess Res Eval 2007; 12:1–8.
10 Lindsay P, Schull M, Bronskill S, Anderson G. The development of indicators to measure the quality of care in emergency departments following a modified-Delphi approach. Acad Emerg Med 2002; 9:1131–1139.
11 Beattie E, Mackway-Jones K. A Delphi study to identify performance indicators for emergency medicine. Emerg Med J 2004; 21:47–50.
12 Coleman P, Nicholl J. Consensus methods to identify a set of potential performance indicators for systems of emergency and urgent care. J Health Serv Res Policy 2010; 15 (Suppl 2):12–18.
13 Pines JM, Fee C, Fermann GJ, Ferroggiaro AA, Irvin CB, Mazer M, et al. The role of the Society for Academic Emergency Medicine in the development of guidelines and performance measures. Acad Emerg Med 2010; 17:e130–e140.
14 Guttmann A, Razzaq A, Lindsay P, Zagorski B, Anderson GM. Development of measures of the quality of emergency department care for children using a structured panel process. Pediatrics 2006; 118:114–123.
15 Welch S, Augustine J, Camargo CA Jr, Reese C. Emergency department performance measures and benchmarking summit. Acad Emerg Med 2006; 13:1074–1080.
16 Donabedian A. Evaluating the quality of medical care. Milbank Q 1966; 44:166–200.
17 Graff L, Stevens C, Spaite D, Foody J. Measuring and improving quality in emergency medicine. Acad Emerg Med 2002; 9:1091–1107.
18 Krumholz HM, Normand SL, Spertus JA, Shahian DM, Bradley EH. Measuring performance for treating heart attacks and heart failure: the case for outcome measurement. Health Aff 2007; 26:75–85.
19 National Quality Forum. NQF-endorsed standards. Available at: http://www.qualityforum.org/Measures_List.aspx. [Accessed 27 July 2011].
20 Iezzoni LI. The risks of risk adjustment. JAMA 1997; 278:1600–1607.
21 Glance LG, Dick A, Mukamel DB, Li Y, Osler TM. Are high-quality cardiac surgeons less likely to operate on high-risk patients compared to low-quality surgeons? Evidence from New York State. Health Serv Res 2008; 43:300–312.
22 Pines JM, Iyer S, Disbot M, Hollander JE, Shofer FS, Datner EM. The effect of emergency department crowding on patient satisfaction for admitted patients. Acad Emerg Med 2008; 15:825–831.
23 Irish Association for Emergency Medicine (IAEM). Standards for emergency department design and specification for Ireland 2007. Available at: http://www.iaem.ie/images/stories/iaem/publications_position_statements/2007/iaem_standards_for_ed_design__specification_for_ireland_300907.pdf. [Accessed 27 July 2011].
24 Irish Association for Emergency Medicine (IAEM). Health and safety standards for Irish emergency departments – providing an optimum environment for patients and staff. Available at: http://www.iaem.ie/images/stories/iaem/publications_position_statements/2007/iaem_health__safety_standards_for_irish_eds_120607.pdf. [Accessed 27 July 2011].
25 Irish Association for Emergency Medicine (IAEM). Staffing needs for emergency departments in Ireland 2006. Available at: http://www.iaem.ie/images/stories/iaem/publications_position_statements/2006/iaem_staffing_needs_for_eds_in_ireland_231106.pdf. [Accessed 27 July 2011].
26 World Health Organisation (WHO). Mental Health Information Systems (Mental Health Policy and Service Guidance Package). Geneva, Switzerland: WHO; 2005.
27 Wakai A, McCabe A, Cummins FH, McCoy S, Cronin J, Anagor C, et al. The availability and reliability of minimum data set items for four emergency department key performance indicators – a pilot study. Natl Inst Health Sci Res Bull 2011; 6:65.
