Developing Benchmarks for Clinical Engineering Activities: A Methodology
Jonathan A. Gaev
In fall 2006, AAMI selected ECRI Institute to perform a study to determine if it is feasible and desirable to develop benchmarks for the activities of hospital-based clinical engineering (CE) departments. The study reported here corresponds to the first phase of the project. The ultimate goal of the project is to enable departments from different institutions to compare their performance. This study included a review of literature related to benchmarking, field interviews, and an analysis of ECRI Institute's benchmarking projects. ECRI Institute's project also included a series of structured interviews with more than 30 healthcare executives whose job descriptions ranged from CE managers to hospital chief executive officers (CEOs). ECRI Institute then developed criteria to identify a representative set of indicators (RSI) that would highlight the data elements required to establish benchmarks; this step would evaluate the feasibility of collecting the data. We need to emphasize that the purpose of developing this RSI is to evaluate the desirability and feasibility of benchmarking hospital-based CE department activities. The goal of the project was not to develop a final set of recommended indicators for the entire CE community. The RSI include the percentage of repairs completed within one working day; total CE cost/device serviced; percentage of preventive maintenance (PM) complete; percentage of technician time spent on maintenance; customer satisfaction; CE department development; and technology management intensity. ECRI Institute proposed that the equipment inventory and hospital parameters representing the frequency of use of the equipment under CE management be used to establish the context in which to interpret the set of indicators to compare CE departments.
As the RSI were primarily derived from the requirements expressed by decision-makers and based on data elements that are feasible to obtain by most CE departments, ECRI Institute has concluded that it is feasible and desirable to develop the benchmarks described above.
ECRI Institute's method to demonstrate feasibility and desirability is to develop an RSI that could be used as benchmarks, and then show that the indicators are both desirable and feasible. By representative set, we mean that the indicators are meaningful, but may not be the final or optimal set of indicators that will be used by the CE community. The final set of indicators may be determined in another project or study. Desirability is determined by identifying the indicators that are of most interest to those who will use the information. Feasibility is determined by the challenges in collecting the data required to determine the indicators under real-life conditions. This study needed to demonstrate both feasibility and desirability, but used desirability as the starting point for the analysis. The steps in developing the RSI were:

Research: Literature review, interviews, and review of ECRI Institute experience to collect the following:
1. List of indicators
2. Methodology for developing indicators of CE department performance
3. Needs/desires of CE department managers/supervisors for data related to department performance
4. Hospital, department, or inventory characteristics to be considered when comparing indicators
5. Concerns of CE department members
6. CE department activities
7. Questions for additional investigation

Indicator development: From the information obtained during research, we developed a comprehensive list of indicators and applied a logical process developed for this project to narrow this down to a limited, desirable, and feasible RSI.

Validity testing: We conducted a limited check of the RSI by applying several management scenarios to determine how the RSI would respond to real-life problems. Comprehensive data collection and verification was beyond the scope of this project.

[Figure: The methodology used by ECRI Institute in phase 1 of the study.]

Literature Review
We conducted an extensive review of available CE literature and business publications, looking for sources relevant to CE benchmarking. There were some significant omissions in the literature—for example, no methodology for developing indicators of CE department performance was found. Almost no literature discussed the needs of hospital managers above the level of CE department director in terms of data related to the department's performance, and we didn't find any literature that described how to develop peer groups. These deficiencies in the literature alerted us to information that we would need to obtain through the interview process.

Interviews
We conducted two rounds of interviews. Ten people were interviewed in the first round. Their job descriptions included biomedical equipment technician (BMET), CE manager, director of clinical engineering, vice president of facilities, and hospital CEO. The institutions, each in a different state, ranged in size from 102 beds to a multi-facility health system with 2,072 beds. The goals of the first round were to learn about the concerns of CE department employees and superiors related to benchmarking, and to design the questions used in the second round. The questions that resulted from that process included:
• What information is essential for you to monitor the performance of the CE department?
• How does using measures of performance impact your activities or decisions?
• What information regarding CE department performance is requested by your superiors or peers?
• What are your biggest concerns on benchmarking?

In the second round we interviewed 22 people (including two who were interviewed in the first round). Their job descriptions included CE manager, clinical engineer, director of clinical engineering, and members of the executive suite, and also included representatives from third-party service organizations. Among our goals for this second round of interviews was to discover indicators that were not mentioned in the literature.

Summary of Interview Results
Based on our findings and experience, we concluded that hospital managers two levels above the CE director do not typically deal with operational problems of the CE department. They assume that these issues have been handled by the CE director and his or her supervising administrator. The senior hospital managers we spoke with focus on the financial aspects of the CE department and want to know that their hospital is getting a good value for the funds expended, judged on an annual basis. From their point of view, CE services would not need to be benchmarked more frequently than that. CE department directors, on the other hand, want their superiors to interpret the value of the CE department in the proper context by only comparing it to the appropriate peer group in other institutions. A significant challenge to the adoption of benchmarks is to ensure that fair, "apples to apples" comparisons are made between institutions (i.e., peer groups). Addressing this point is crucial to the adoption of benchmarks, as organizations won't be motivated to share information if they feel that it won't be used appropriately.

ECRI Institute was surprised about what wasn't said in either round of interviews. The emphasis given by all interviewees was that the CE department's goal was to make sure that the equipment was up and running, in good condition, and ready to be used by the clinical staff. They expected the CE department to keep the equipment safe by performing preventive maintenance, attending Environment of Care meetings, and handling recalls. No group emphasized the CE department's role in patient safety, even when we prompted interviewees by asking how the CE department contributes to patient safety; nor did they mention additional projects that the department did that were not directly related to equipment service (such as investigating a new type of telemetry device or providing equipment planning services for a new emergency department).
From the literature review and interviews, we recorded each indicator—more than 125 in all—exactly as it was stated, together with indicators mentioned by the interviewees. In the database we noted the source for each indicator.

Additional Indicators
ECRI Institute's Health Systems Group regularly conducts onsite reviews of CE department performance. For many years, ECRI Institute also sold and serviced a computerized maintenance management system (Hospital Equipment Control System™, now sold through Phoenix Data Systems in their AIMS.NET product). It continues to sell and support the Inspection and Preventive Maintenance System. Both systems are still used in hundreds of hospitals. From those experiences and others, we are aware of indicators of CE department performance currently used in the field. We added those indicators to the list, bringing the total to more than 140 indicators.

Selection Criteria
Combining the information that we had gathered, we developed a list of characteristics that would be required for the RSI. Each indicator must be:
• Independent of the size of the institution.
• Meaningful and desirable to both the CE department director and supervisor.
• Intuitive in meaning.
• Easy to communicate to the senior management team and clinicians.
• Based on data easily collected by the CE department.
• Revealing as internal and external benchmarks.

The RSI, taken together as a set of indicators, must:
• Demonstrate the tradeoffs between performance and cost.
• Address the majority of time spent on CE department activities.

We removed many redundancies, indicators that could easily be calculated from other indicators (for example, if we know the "repair cost per device" we could calculate the "repair cost per 100 devices"), and indicators that were mentioned in the literature but not by the interviewees or identified by ECRI Institute. Preference was given to indicators that are already used by many institutions, as it is a significant advantage to know the existing strengths and weaknesses of an indicator and to know that the data can be collected and has proven useful. The final list contained 42 indicators, so further analysis was required to identify the RSI and evaluate the indicators for desirability and feasibility.

Definitions of Benchmarking and Related Concepts
Indicators: Numbers that measure an activity of a CE department. An example is the monthly PM completion rate (percentage of scheduled PMs that are completed in a month).
Benchmark: A reference value for an indicator. The reference value may be asserted (many hospitals set a benchmark of 100% for the monthly PM completion rate of life-support equipment—a rate mandated by the Joint Commission for this type of equipment) or established using other methods such as internal or external benchmarking.
Internal Benchmark: The reference value for the indicator is established by comparing the performance of the institution to itself. For example, if a CE department achieved an average monthly PM completion rate of 97% in 2006, it may choose to use 97% as the reference value for its 2007 monthly PM completion rate.
External Benchmark: The reference value for the indicator is established by comparing the performance of one institution to its peer group (see below). For example, if CE departments at similar institutions achieve an average monthly PM completion rate of 95% for general medical equipment, institutions within that peer group may use 95% as a benchmark to measure the performance of their CE departments.
Peer Group: A group of institutions with similar characteristics that are relevant to the benchmarking indicators. For example, the number of pieces of medical equipment and the specific types (general biomedical, imaging, clinical laboratory, etc.) in the inventory managed by the CE department may be important factors in establishing a peer group.
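To make these definitions concrete, here is a minimal Python sketch (ours, not part of the study) that computes a monthly PM completion rate and derives internal and external benchmarks from it. The sample numbers and function names are hypothetical.

```python
# Illustrative only: toy numbers, not data from the study.

def pm_completion_rate(completed, scheduled):
    """Monthly PM completion rate: percentage of scheduled PMs completed."""
    return 100.0 * completed / scheduled

# Internal benchmark: compare the institution to its own prior performance.
rate_2006 = pm_completion_rate(completed=485, scheduled=500)  # 97.0%
internal_benchmark_2007 = rate_2006                           # reuse 2006 as the 2007 reference

# External benchmark: compare one institution to the average of its peer group.
peer_rates = [94.0, 95.5, 96.0, 94.5]                         # hypothetical peer institutions
external_benchmark = sum(peer_rates) / len(peer_rates)        # ~95%, as in the example above

print(f"2006 rate: {rate_2006:.1f}%")
print(f"Internal benchmark for 2007: {internal_benchmark_2007:.1f}%")
print(f"External (peer group) benchmark: {external_benchmark:.1f}%")
```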
Representative Set of Indicators
The list of 42 indicators was reviewed by two former managers of CE activities, a senior member of ECRI Institute's Health Systems Group, and ECRI Institute's technical director and vice president of health technology evaluation and safety. To ensure that we developed a feasible set of indicators, the review team confirmed that the data required for each indicator typically resided in systems under the control of the CE department, or could be obtained from other departments with a modest effort. In some cases, the definition that was initially stated for the indicator was revised so that the resulting indicator would use data more easily acquired by the majority of CE departments. We compared the proposed indicators with the job responsibilities described in the Journal of Clinical Engineering's salary survey. This was an iterative process, as we needed to first ensure that the individual indicator met the selection criteria and then to ensure that the set of indicators met the selection criteria. One of the RSI requirements is that it represents the majority of CE activities; Table 1 (page 274) also shows that all activities are covered and that some indicators reflect the impact of more than one activity. We needed to create an indicator that addressed the CE department's technology management activities, as those efforts occupy about 15% of the CE department's time (Journal of Clinical Engineering 2005). This indicator, "Technology Management Intensity," is described in the next section.

The indicators are summarized in Table 1 (page 274); several other candidates were also considered but not included in this presentation. We recognize that if a set of indicators will be proposed to the CE community to benchmark CE activities as a result of the first phase of AAMI's study, additional discussion regarding the definitions of each indicator will be required. For the purpose of this project, we assume that the data are collected for benchmarking purposes on an annual basis. We believe that the abbreviated definitions presented here are sufficient to determine the desirability and feasibility of benchmarking CE department activities. The indicators presented in Table 1 are summarized below.

To facilitate comparisons among CE departments, each department would need to prepare a complete inventory of all medical equipment under its management (including equipment maintained under service contracts and "time and material service" provided by outside vendors). The inventory would need to be separated into three categories: general biomedical equipment, imaging equipment, and clinical laboratory equipment. We recognize that more specific definitions regarding the equipment are needed (for example, some CE departments maintain hospital beds and some don't).

AAMI's Involvement in Indicator Development
AAMI has been involved in the exploration of CE department indicators for many years. Ted Cohen published the results of the first year of AAMI's Validating Metrics pilot project (BI&T, Jan/Feb 1997). He proposed three indicators for repair and maintenance services: ratio of service cost to acquisition cost, repair requests completed per device, and average turnaround time per repair. To validate the indicators and develop a central database of information, he requested data from 100 clinical engineers, but only received satisfactory data from seven institutions, which was insufficient for validation. The ratio of service cost to acquisition cost has become widely cited in the literature and is used by many third-party service organizations and some CE departments to benchmark their programs.

The Indicators
• % repairs completed in one working day = [number of CM events completed in one working day / total number of CM events] * 100%

This indicator requires that an accurate start and stop time and date be entered for each repair (also called corrective maintenance or CM). We would have preferred using an indicator that reflected all of the time that the equipment was unavailable to the clinician, starting when the call was received by the CE department and ending when the equipment was returned to service. However, we did not propose "starting the clock" with the receipt of the call, as many CE departments are not able to record this precise data. In this study, we considered that the time spent on the repair also includes activities required to coordinate outside services and purchasing/requisitions related to the repair. If this indicator is used for benchmarking, CE departments that enter work order information only when they return the equipment to service would need to change their practice. They would need to either enter the dates and times of when the repair began and ended and when the equipment was returned to service, or they would need to create a user-defined field so that the technician could declare that the repair was (or was not) completed within one working day.

• Total CE cost / device serviced = total cost for all CE activities / total number of devices receiving service

The total CE cost includes administrative and management costs, CE department costs not directly related to service, and service contracts. Data from various institutions can be more accurately compared if the internal labor costs are based on salaries without applying overhead, as the overhead varies with each institution. The denominator "devices serviced" includes all devices that received service through the CE department.

We did not select the service cost ratio (service cost / device acquisition cost) as an indicator because the research conducted confirms that many CE departments cannot easily obtain the device acquisition cost for the entire inventory under their management. The indicator had desirability, but not feasibility.

• % PM complete = [# PM events completed / # PM events scheduled] * 100%

According to our interviews, this indicator is very important to many senior healthcare managers, as they associate safety with making sure that PM is up to date. They also know that equipment needs to be appropriately maintained to comply with Joint Commission requirements. We have included % PM complete in the representative set of CE department indicators to ensure that the set of indicators appears credible to the senior management team. ECRI Institute recommends that CE departments review their PM intervals for each device category, as some departments may find that they could discontinue PM for some types of devices without increasing their failure rate.

• % Technician time spent on maintenance = 100% * [Time spent on incoming testing, inspection, PM, and corrective maintenance] / [2,080 hours * number of technicians]

To facilitate comparison among facilities, ECRI Institute applied a neutral standard of 2,080 hours (52 weeks * 40 hours/week), but recognizes that this will vary depending on each institution's policies for breaks, vacation time, etc. The time spent on maintenance activities also includes coordinating outside services and purchasing/requisitions related to maintenance. Technicians spend time on useful activities beyond equipment maintenance, so this indicator will not reach 100%. We recognize that clinical engineers and some CE department directors also spend time maintaining equipment, but did not include their efforts in this indicator, as we felt that focusing on BMET maintenance activities would make it easier to collect the information required and interpret the results.
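As a feasibility illustration (ours, not from the study), the following Python sketch computes the four CMMS-derived indicators defined above from hypothetical work-order records. The record fields, counts, and costs are invented assumptions; a real implementation would also need the department's own definition of a "working day."

```python
from datetime import datetime, timedelta

# Hypothetical CMMS work orders; field names are illustrative assumptions.
cm_events = [
    {"opened": datetime(2006, 3, 1, 9), "closed": datetime(2006, 3, 1, 15)},  # same-day repair
    {"opened": datetime(2006, 3, 2, 9), "closed": datetime(2006, 3, 6, 10)},  # multi-day repair
]
pm_scheduled, pm_completed = 500, 485
total_ce_cost = 900_000.0      # all CE activities, per the definition above
devices_serviced = 3_000
maintenance_hours = 7_200.0    # incoming testing + inspection + PM + CM, BMETs only
technicians = 5

# Calendar day used as a stand-in for "working day" in this sketch.
pct_repairs_one_day = 100.0 * sum(
    (e["closed"] - e["opened"]) <= timedelta(days=1) for e in cm_events
) / len(cm_events)

pct_pm_complete = 100.0 * pm_completed / pm_scheduled
cost_per_device = total_ce_cost / devices_serviced
pct_tech_time_on_maintenance = 100.0 * maintenance_hours / (2080.0 * technicians)

print(f"% repairs completed in one working day: {pct_repairs_one_day:.0f}%")
print(f"Total CE cost / device serviced: ${cost_per_device:.2f}")
print(f"% PM complete: {pct_pm_complete:.1f}%")
print(f"% technician time on maintenance: {pct_tech_time_on_maintenance:.1f}%")
```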
• Customer satisfaction (5-point scale)

The customer is usually defined as the clinical staff. Many institutions already have a customer satisfaction survey that meets their needs, and we do not recommend changing established surveys. Instead of prescribing a single survey or set of questions to be used by all institutions, we propose that the CE director have the results of the facility's internal survey translated into a single numeric score, derived from a 5-point scale where 1 = poor, 3 = average, and 5 = exceptional. These survey results are likely to be most helpful for internal benchmarking.

• CE department development = hours spent on development activities per year / [# of BMETs + Clinical Engineers + CE department Managers]

CE department development applies to the hospital staff of the CE department. These activities include attending on- or off-site training courses, professional meetings, conferences, and other events that improve staff skills.

• Technology Management Intensity = [# hours spent on these activities in one year / Total number of working hours for all CE department employees in one year] * 100%

Technology management intensity represents activities that contribute to patient care but have not been captured in the other categories of inspection, preventive maintenance, and repair. ECRI Institute prepared a list of technology management activities (such as incident investigation and risk management, and development of policy and procedures for technology management) based on our review of the literature and direct experience reviewing the performance of CE activities. CE departments can review the list of technology management activities in Table 2 (page 275) and record the time spent on those activities during the year.

Peer Groups
Benchmarking requires that CE department performance in one institution be compared to similar CE departments in other institutions. For example, if two institutions are both 250-bed facilities with equally skilled staff and similar equipment responsibilities, but one has twice as much imaging equipment as the other, we would expect their "Total CE cost/device" to differ substantially. The frequency of use of the equipment will also impact service activities. If two institutions both have 250 beds, offer similar services, and have similar types and quantities of equipment, but one has many more in-patient admissions and outpatient visits than the other, we would also expect their Total CE cost/device to differ.

To show that it is feasible to benchmark CE department activities, we needed to identify the information required to establish peer groups for CE departments and to show that it is feasible to obtain that information. Two parameters are required to establish peer groups: the equipment inventory (the number of items for imaging, clinical laboratory, and general medical equipment) and the frequency of use for that equipment.

To obtain the inventory information, each institution would need to organize that information in a standard format and agree to share it with other institutions (the data would be reported in a confidential manner to ensure anonymity). This could be accomplished by establishing a central database accessible to all institutions who contribute—a project that ECRI Institute believes to be feasible. We have several examples of implementing this concept. ECRI Institute has designed and maintained databases of confidential hospital information that is shared among other institutions regarding the cost of equipment and reports of problems with medical devices. The Department of Health for the State of Pennsylvania collects information describing CTs and MRIs, the number of cardiac catheterization laboratories, and the utilization of many medical services for hospitals in Pennsylvania. A rough estimate puts the annual cost at less than $50 per hospital, assuming that half of the hospitals in the United States purchase the information. For the purpose of determining feasibility, since databases containing the types of information required to determine equipment inventory have been developed, we conclude that it is feasible to obtain information for the types of hospital equipment.
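To illustrate how the two peer-group parameters named above might be applied, here is a hedged Python sketch (ours, not the study's method) that groups hypothetical hospitals by banding their inventory counts and admissions. The hospitals, counts, and band widths are all invented assumptions.

```python
# Hypothetical hospital records; all numbers are invented for illustration.
hospitals = [
    {"name": "A", "imaging": 120, "clin_lab": 300, "general": 2400, "admissions": 18000},
    {"name": "B", "imaging": 110, "clin_lab": 310, "general": 2300, "admissions": 17500},
    {"name": "C", "imaging": 40,  "clin_lab": 90,  "general": 800,  "admissions": 5000},
]

def peer_key(h):
    """Band each peer-group parameter; the band widths are arbitrary assumptions."""
    return (h["imaging"] // 50, h["clin_lab"] // 150,
            h["general"] // 1000, h["admissions"] // 10000)

groups = {}
for h in hospitals:
    groups.setdefault(peer_key(h), []).append(h["name"])

for key, members in groups.items():
    print(key, "->", members)  # A and B share a peer group; C stands alone
```

A production approach would need validated thresholds and the utilization parameters discussed next; the banding here only shows the mechanics.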
Direct measures of frequency of use are not available; however, we believe that the frequency of use may be related to hospital parameters that are found in the American Hospital Association (AHA) guide, such as the number of medical services offered, in-patient data (number of admissions), and the number of out-patient visits. Further studies would be required to demonstrate the correlation between equipment maintenance parameters (such as the number of failures) and the related hospital parameters. Since the AHA information is readily available, we conclude that it is feasible to obtain information for the intensity of equipment use.

Testing the Indicators
We tested the indicators by proposing hypothetical scenarios that are likely to occur in practice, recording how the indicators change based on the scenario, and evaluating those changes to see if they are consistent with our practical experience. If the indicators did not respond to the scenario, they would be judged to be poor indicators and new indicators would need to be selected. The following simple scenarios are provided for illustrative purposes to represent some of the management scenarios that were considered.

Scenario 1
Management challenge: Your expenses are too high. Reduce your staff!

Result 1: Looking at the technician productivity, it appears that the CE department can reduce staffing by one technician, so one BMET was laid off. The remaining technicians were able to handle the additional workload, as reflected by the increase in the "% technician time spent on maintenance." Laying off the BMET decreased the overall expenses for the CE department, as reflected in the decrease in "Total CE cost/device." The PMs that were scheduled were completed on time, and there was no change in the percentage of repairs that were completed in one working day. Here's the impact:

Indicator | Impact
*% repairs completed within one working day | No Change
*Total CE cost/device serviced | Decreased
% PM complete | No Change
% Technician time spent on maintenance | Increased
Customer satisfaction | No Change
CE department development | No Change
Technology management intensity | No Change

In this case, the department really did have too many people. Reducing staff saved money and didn't adversely affect performance.

Result 2: Looking at the technician productivity, it appears that the CE department can reduce staffing by one technician, so one BMET was laid off. This time, the remaining technicians were not able to handle the additional workload. They spent more time on CM and PM, which was reflected by the increase in "% technician time spent on maintenance." The technicians fell behind on their repair work, as shown in the decrease in "% repairs completed within one working day," which led to a decrease in customer satisfaction. Since senior management at the hospital and the Joint Commission place a very high priority on the PM completion rate, the department made sure to complete all of its PMs on time, so there was no change in the "% PM complete." The overall expenses for the CE department were decreased, so the "Total CE cost/device" decreased. Here's the impact:

Indicator | Impact
*% repairs completed within one working day | Decreased
*Total CE cost/device serviced | Decreased
% PM complete | No Change
% Technician time spent on maintenance | Increased
Customer satisfaction | Decreased
CE department development | No Change
Technology management intensity | No Change

The department did not have too many people. Reducing staff saved money but adversely affected performance.
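Before turning to the next scenario, here is a minimal Python sketch (our illustration, not the study's tooling) of how the Increased/Decreased/No Change labels in these impact tables can be derived by comparing indicator values before and after a simulated management change. The values and the tolerance are hypothetical.

```python
def impact(before, after, tolerance=0.5):
    """Label the direction of change for one indicator; tolerance is arbitrary."""
    if after > before + tolerance:
        return "Increased"
    if after < before - tolerance:
        return "Decreased"
    return "No Change"

# Hypothetical indicator values around the Scenario 1 / Result 2 staff reduction.
baseline = {"% repairs in one day": 90.0, "Total CE cost/device": 300.0,
            "% PM complete": 98.0, "% tech time on maintenance": 70.0,
            "Customer satisfaction": 4.2}
after_layoff = {"% repairs in one day": 80.0, "Total CE cost/device": 270.0,
                "% PM complete": 98.0, "% tech time on maintenance": 78.0,
                "Customer satisfaction": 3.6}

for name in baseline:
    print(f"{name}: {impact(baseline[name], after_layoff[name])}")
```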
Scenario 2
Management challenge: The head of nursing wants to reduce medical errors. Do something!

Result: The CE department spent more time with the procurement process and convinced the hospital to standardize with a particular model of infusion pumps. They also implemented a "smart pump" system in the areas with the most critical patients. This effort was led by the clinical engineer, who was assisted by one of the BMETs. The clinical engineer was finishing up an equipment planning project for the emergency department and took on the smart pump project when the other project was finished. The smart pump project required more effort than was anticipated, so the clinical engineer was not able to spend as much time on CE department development activities, which adversely impacted "CE department development." The BMET spent a bit less time servicing equipment, but still got his work done. For now, the nursing department is pleased with the CE department response, which is reflected in their increased "Customer satisfaction" score. Here's the impact:

Indicator | Impact
*% repairs completed within one working day | No Change
*Total CE cost/device serviced | No Change
% PM complete | No Change
% Technician time spent on maintenance | No Change
Customer satisfaction | Increased
CE department development | Decreased
Technology management intensity | Increased

Table 1. The representative set of indicators and data sources. Indicators with an asterisk are reported for the following equipment categories: imaging, clinical laboratory, and general hospital equipment (such as physiological monitors and infusion pumps).

Indicator | CE Department Activity (from the Journal of Clinical Engineering article) | Data Source
*% repairs completed within one working day | Repairs | Computerized Maintenance Management System (CMMS)
*Total CE cost/device serviced | All activities are considered in the cost calculation | CMMS
*% PM complete | Scheduled PM and safety testing | CMMS
% Technician time spent on maintenance | Repairs; scheduled PM and safety testing; incoming testing; coordination of outside services and purchasing/requisitions | CMMS and CE department work time recording system
Customer satisfaction | Most activities may be affected | CE department or hospital survey
CE department development | Most activities | CE department work time recording system
Technology management intensity | Clinical support; incident investigation/risk management | CE department work time recording system
Table 2. Technology Management Intensity Table.

Technology Evaluation
• Attend/participate/manage technology assessment programs
• Conduct/manage technology assessments
• Conduct/manage comparative device evaluations

Technology Planning and Assessment
• Assistance in equipment standardization efforts
• Providing capital budget assistance
• Supporting new technology forecasting, assessment, and planning efforts
• Capital equipment process, including multi-year planning
• Equipment planning and design
• Systems analysis and support (e.g., technology integration)
• Attend and support capital planning meetings and functions

Technology Management
• Centralized service contract management
• Clinical department rounds
• Other support for clinical departments
• Wireless network management and support
• Health information technology and medical device integration
• Incident investigation/risk management
• Hazard and recall management
• Support of specific patient safety activities

Special Activities
• Support of hospital research activities through device design, modifications, research and development, direct device operation, and other technical support
• Device design and development
• Research support
• Development of the CE profession and policy influence (e.g., participation in and leadership of industry associations and standards-setting organizations)
• Community outreach/education
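As a sketch of how a department might turn Table 2 time logs into the last two indicators, the following Python example (ours, with invented numbers) applies the CE department development and technology management intensity formulas given earlier.

```python
# Invented annual figures for one hypothetical department.
development_hours = 320.0           # training courses, meetings, conferences
tech_mgmt_hours = 1_800.0           # total hours logged against Table 2 activities
bmets, engineers, managers = 5, 1, 1
working_hours_per_person = 2_080.0  # the neutral 52 weeks * 40 hours standard

staff = bmets + engineers + managers
dev_per_person = development_hours / staff
tmi = 100.0 * tech_mgmt_hours / (working_hours_per_person * staff)

print(f"CE department development: {dev_per_person:.1f} hours/person/year")
print(f"Technology management intensity: {tmi:.1f}%")  # ~12%, near the ~15% cited
```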
Discussion
The overall goal of phase 1 of this project is to identify the desirability and feasibility of developing benchmarks for CE activities in U.S. hospitals. We spoke with many healthcare professionals while conducting this study, and all of them agreed that it would be helpful to have meaningful benchmarks of CE department performance and that the existing measures were not sufficient. We conclude that benchmarking of CE department activities, primarily maintenance activities, is desirable.

These indicators were selected to demonstrate the feasibility and desirability of benchmarking CE department performance. Although we believe that they are helpful indicators of CE department performance, we do not intend for them to be interpreted as the recommended set of indicators. We want to collect ideas from the CE community on how to develop an RSI. Please send your comments on the development of the RSI, as well as any other comments that you have regarding this study, to the author at jgaev@ecri.org.

ECRI Institute strongly believes that there are many other activities that may be performed by a CE department that have significant value for the institution, especially activities that improve patient safety, but we recognize the great difficulty in quantifying those other activities and do not feel that meaningful indicators can easily be developed. Maintenance activities represent the majority of the CE department budget and therefore get more attention by senior managers than other CE department activities; maintenance activities were an obvious focus for this study.

The key challenge was to find out what is feasible. Benchmarking requires that hospitals provide specific data, and management (either at the CE director level or above) must allocate the required resources. An expensive data collection program will not succeed; only a program that facilitates participation will be feasible. Because of this, we proposed using indicators that require data that can be obtained from current information sources and systems available at small hospitals (fewer than 200 beds).

Making sure that the definitions are consistently applied regarding equipment to be included in the inventory and services performed is hard to achieve, but benchmarking helps CE managers to use their time well to effectively focus their analysis on the "right" areas. In the real world, even with very precise systems for collecting information, the data collected will have some errors. The data do not need to be perfect to enable managers to use them well. The Veterans Health Administration (VHA) has effectively used benchmarking data for more than 20 years, even though the data they receive is not perfect. Based on a report that they receive of the performance of VHA institutions, ensuring a good "apples-to-apples" comparison, their CE managers identify areas where the performance of their department differs significantly from that of their peer group. Once those areas are identified, the CE manager knows that it will be worthwhile to find out the reasons for the variance, as he or she may discover that changes can be made to improve practices and obtain better results.

It would be very interesting to gather data and then test the relationships between the indicators. Testing hospital data would enable us to know if hospitals with higher levels of equipment use (as measured by number of in-patient and out-patient admissions) really do have higher maintenance costs. That type of knowledge would enable us to confirm the parameters used to establish meaningful peer groups and to therefore develop effective benchmarks for the performance within those groups.

The final and most persuasive test of indicators, though, is whether they are used. If managers find that the indicators can truly be used as benchmarks and that benchmarking helps them to improve their processes and decision-making, then benchmarking would be shown to be useful (as well as feasible and desirable).

Conclusion
We have developed and applied a logical, transparent methodology to the development of CE activity indicators, producing a set of representative indicators that can be used to compare performance of the majority of the activities performed by CE departments.

Acknowledgments
ECRI Institute's team included Jonathan Gaev, Rob Maliff, Jim Keller, and Jonathan Treadwell. Significant contributions were also made by Mark S. Brody, Phil Englert of Catholic Health Initiatives, Harvey Kostinsky, Tim Ritter of the Department of Veterans Affairs, and Malcolm Ridgway of Masterplan.

Jonathan A. Gaev, MSE, CCE, HEM, PMP, is the director of technical programs, Health Devices Group, ECRI Institute.
FOR MORE ON BENCHMARKING: please see "On Sculpture, Baseball, and Benchmarking…" on page 332.

References
AAMI. ANSI/AAMI EQ56:1999, Recommended practices for a medical equipment management program. Association for the Advancement of Medical Instrumentation; 2004:19–31.
American College of Clinical Engineering. ACCE Body of Knowledge Survey Results-2005.
American College of Clinical Engineering. Guideline for medical equipment management programs (MEPs).
American Hospital Association. AHA Guide, 2005 edition (based on data collected as of June 30, 2004). © 2004 Health Forum LLC, an affiliate of the American Hospital Association.
American Hospital Association. Estimated Useful Lives of Depreciable Hospital Assets, revised 2004 edition. Health Forum Inc.
Autio DD, Morris RL. Clinical engineering program indicators. In: The Biomedical Engineering Handbook, 2nd Ed. (2000):170-1–170-9.
Barber F, Strack R. The surprising economics of a people business. The Harvard Business Review (June 2005):81–90.
Bates et al. The impact of computerized physician order entry on medication error prevention. Journal of the American Medical Informatics Association 6(4):313–321.
Bauld TJ. Productivity: standard terminology and definitions. Journal of Clinical Engineering 12(2):139–145.
Brown S et al. A Comprehensive Review of Development and Testing for National Implementation of Hospital Core Measures; 2006.
Campbell S. Safety in numbers. 24x7, June 8, 2006.
Cohen J. Statistical Power Analysis for the Behavioral Sciences, 2nd Ed. Lawrence Erlbaum Associates; 1988.
Cohen T. Validating medical equipment repair and maintenance metrics: a progress report. Biomed Instrum Technol 31(1):23–32.
Cohen T et al. Benchmark indicators for medical equipment repair and maintenance. Biomed Instrum Technol 29(4):308–321.
David Y, Rohe D. Design of Clinical Engineering Quality Assurance and Risk Management Programs. Journal of Clinical Engineering 11(6):435–443.
ECRI Institute. Guidance article: best practices for health technology management. Health Devices 35(12):437–448.
ECRI Institute Experience and HECS™ Reports.
Fennigkoh L. Cost-effectiveness and productivity. In: The Biomedical Engineering Handbook, 2nd Ed.; 2000.
Flex Monitoring Team. Financial Indicators for Critical Access Hospitals. Briefing Paper No. 7. University of Minnesota, University of North Carolina at Chapel Hill, University of Southern Maine (May 2005). Available at http://www.flexmonitoring.org. (Accessed Sept. 21, 2006.)
Fotopoulos M. Who drank my wine? Biomed Instrum Technol 40(6):418.
Furst E. Productivity and cost-effectiveness of clinical engineering. Journal of Clinical Engineering 11(2):105–113.
Gater L. Are you benchmarking yet? 24x7 (December 2006):24–31.
Gordon GJ. Breakthrough Management: A New Model for Hospital Technical Services. AAMI; 1995.
Haas J. Focus On: Thomas Jefferson University Hospital. 24x7, August 2005.
Hansen DK. Salary survey: benchmarking your employment information. 24x7, 2006.
Hertz E. Developing Quality Indicators for a Clinical Engineering Department. In: Plant, Technology and Safety Management Series: Measuring Quality in PTSM. Joint Commission on Accreditation of Healthcare Organizations (1990):29–33.
Hinesly D. The business of running an in-house biomed program. 24x7. Published January 12, 2004.
Joint Commission on Accreditation of Healthcare Organizations. Comprehensive Accreditation Manual for Hospitals: The Official Handbook 2007.
Joint Commission on Accreditation of Healthcare Organizations. Joint Commission Primer on Indicator Development and Application: Measuring Quality in Healthcare; 1990.
Kaplan S, Norton P. Translating Strategy into Action: The Balanced Scorecard. The Harvard Business School Press; 1996.
National Committee for Quality Assurance. Health Plan Employer Data and Information Set 2007.
S-Business: Defining Today's Technology Services Business. AFSMI.
Stiefel RH. Developing an effective inspection and preventive maintenance program. Biomed Instrum Technol 36(4):231–236.
Stiefel RH. How to be in complete and continuous compliance with the JCAHO standards. Biomed Instrum Technol 36(6):405–408.
Tackel IS et al. Clinical engineering program productivity and measurements. Journal of Clinical Engineering 18(6):501–509.
The Leapfrog Group Hospital Quality and Safety Survey, V3. Washington, DC. (Accessed Sept. 21, 2006.)
Thomas R. Department benchmarks. Biomed Instrum Technol 37(6):398–404.
Tips from the field: How to strengthen your customer service program. Health Facilities Management (May 2003):28. Available at http://www.hfmmagazine.com. (Accessed Sept. 28, 2006.)
Triola M. Elementary Statistics, 8th Ed. Addison Wesley, NY; 2001.
VHA Directive 2006-15. Benchmarking VHA biomedical engineering operations. Department of Veterans Affairs, Washington, DC; March 27, 2006.
Wang B. A new perspective on clinical technology management. Biomed Instrum Technol 37(3):181–189.
Wang B et al. Global failure rate: a promising medical equipment management outcome benchmark. Journal of Clinical Engineering (July/September 2006):145–151.
2005 Survey of salaries and responsibilities for hospital biomedical/clinical engineering and technology personnel. Journal of Clinical Engineering (October/December 2005):229–244.