Benchmarking in Dutch healthcare

Towards an excellent organisation

Drs. R.J.C. Poerstamper MBA Drs. A. van Mourik – van Herk Drs. A.C. Veltman

Benchmarking in Dutch healthcare
Towards an excellent organisation

Robbert-Jan Poerstamper Anneke van Mourik - van Herk Aafke Veltman

© PricewaterhouseCoopers, Amsterdam 2007

What is the added value of benchmarking? What makes benchmarking a success and what does the future hold? Just a few questions out of the many that the HEAD Association (the Dutch Association of Finance Managers in Healthcare) raised when we discussed a publication on benchmarking as part of our sponsorship agreement. Many HEAD Association members are experienced benchmarkers, and their ideas for improvement and critical comments have helped us fine-tune our benchmarking model over the years. We highly value the important lessons we draw from their feedback, lessons which remind us that it is always possible to make benchmarking better: more inspiring, more instructive and less taxing for its participants. We would like to thank everyone who provided us with material and who took the trouble to read and comment on draft versions of this report. We're particularly grateful for the invaluable feedback from our colleagues Jane Duncan and Maura Kelly at PricewaterhouseCoopers in Ireland. A special word of thanks is also due to the members of the steering committee and the sounding-board group who helped to create this publication and whose names you will find in our first appendix. Their input has helped to make this a co-production of many interested parties. This publication also provides an excellent opportunity to thank the very many that have enabled us to engage in the benchmarking process: first of all our customers, of course - initially the Ministry of Health, Welfare and Sport and later also industry associations ActiZ and VGN and the care providers themselves. And let's not forget all those people who have helped us improve the quality of our benchmarking by participating in sounding-board groups or by giving us their assessments. But most of all we pay tribute to the hundreds of care organisations and many hundreds of thousands of employees and clients who have provided data and sent in questionnaires. Thanks to them, benchmarking in Dutch healthcare has become what it is today: a valuable management tool used by increasing numbers of healthcare providers. We'd also like to thank the team of translators and editors - Anita Graafland, Willemien Kneppelhout and Tom Scott - who have worked so hard to ensure that this English language version is not only accurate but also, we hope, a pleasure to read. The authors


6 Key success factor 5: Do not leave everything to external consultants 3.2 Other definitions of benchmarking 2.1 Our definition of benchmarking 2.7 Key success factor 6: Aligning benchmark to regular records 3.1 Rationale for benchmarking 1.10 Key success factor 9: Strength through repetition 2 3 .3 Learning and improving 1.2 Key success factor 1: Optimise learning 3.4 Key success factor 3: A multidimensional approach 3.6 Benchmarking: increasingly embedded 2.3 Benchmarking and Total Quality Management 2.8 Reasons for not benchmarking 1.1 When is benchmarking an appropriate tool? 3.5 Transparency and profile 1.5 History of benchmarking 2.9 In conclusion Benchmarking: comparing and improving 2.2 Positioning 1.7 Benchmarking as necessity How to make benchmarking a success 3.8 Key success factor 7: Sensitive data handling 3.4 Relationships between performances 1.5 Key success factor 4: High-quality tools 3.7 Benchmarking and accountability 1.6 Information for industry associations 1.9 Key success factor 8: No compulsory benchmarking 3.4 Definitions: differences and similarities 2.Contents Preface Contents Introduction 3 5 9 11 11 12 14 15 16 16 17 20 20 21 21 22 23 24 26 27 29 31 31 33 37 38 38 40 40 41 42 43 1 Why benchmark? 1.3 Key success factor 2: The benchmark model should be broadly based 3.

8 Profile of a healthcare benchmark model Benchmarking model for healthcare benchmarks 5.5 Quality of care 5. care and home care benchmark 7.3 A phased approach to healthcare benchmarking Healthcare benchmark: notable features 7.2 Classification by benchmarking objective 4.4 Classification by reference group: internal or external benchmarking 4.9 Best practices 5.4 Different types of benchmarking 4.2 Child healthcare benchmark 7.5 Classification by level of organisation 4.1 Classification criteria 4.3 Classification by what is being measured 4.13 Benchmark strategic management information The step-by-step benchmarking process 6.5 Partial benchmarks in mental healthcare 7.6 Quality of the job 5.7 Social responsibility 5.7 Classification by research process 4.10 Explaining performance 5.3 Healthcare administration agency benchmark 7.7 Benchmarking Dutch hospitals 45 45 45 46 48 49 49 51 53 55 56 59 59 66 72 73 75 76 79 82 84 86 87 87 89 96 103 103 104 105 106 110 111 113 5 6 7 .2 Keehley’s step-by-step plan 6.4 Benchmarking care for the disabled 7.8 Relationship between building blocks 5.1 Benchmarking phases in the public sector 6.2 Building blocks of benchmark surveys 5.3 The financial building block 5.12 Reporting results 5.1 Nursing.1 Input and strategic themes 5.6 Benchmarking the healthcare chain 7.11 Innovation 5.6 Classification by use of normative standards 4.

4 More research into cost-to-reward ratios 8.2 Excellence and innovation 8.6 More dynamic reporting 8.8 Simplified data supply 8.7 Continuous benchmarking 8.8 Innovations in benchmarking 8.3 Benchmarking outside the box 8.1 Towards performance excellence 8.9 Introduction of XBRL 121 121 123 124 124 125 127 128 130 130 Appendices A B C D E Steering committee and sounding-board group Bibliography Benchmark studies Dutch healthcare abbreviations and acronyms Endnotes 133 135 137 145 149 150 .5 More benchmark partner involvement 8.


of course – anyone who for policy-making or operational reasons wishes to engage in benchmarking – e. PricewaterhouseCoopers publishes a report on a specific topic every year.’ Under our sponsorship agreement. ‘Benchmarking is a clear trend in Dutch healthcare and there is a demand for more background information.’ the HEAD Association commissioned PricewaterhouseCoopers.g. acronyms and abbreviations to our English readers. All participants can compare their outcomes with those of others. The report focuses on the benchmarks that PricewaterhouseCoopers has conducted with other consultancies and agencies in the Dutch healthcare and related sectors over the past decade. although we have strived to explain typically Dutch phenomena. Our target readership includes – aside from HEAD Association members. .Introduction ‘Write a report on benchmarking. and emphatically does not purport to be. an academic study. quality managers and any interested readers outside the field of healthcare. and in particular with those of their best-in-class peers. As our key target readership is in the healthcare sector. and involves collecting and reporting on data from different organisations and organisational units. That said. Benchmarking was selected for its immediate relevance to many healthcare providers in the Netherlands. directors of healthcare organisations. we do believe that this report might also be of interest to people in other sectors. However. real-world experience is put in the context of the literature on the subject. It is not. This helps to identify areas for improvement and appropriate action. we have assumed that readers will be familiar with a number of key healthcare terms. This report aims to enable its readers to make conscious decisions in benchmarking: Is it worth our while to benchmark? How do we achieve the maximum possible return? Benchmarking is the process of systematically comparing performance as a starting point for improvement. In 2005 the spotlight was on social responsibility and for 2006 the focus is on benchmarking.

what did not and what we ended up changing. Appendix C presents a comprehensive review of the benchmark studies and those who commissioned them. then. but since 2003 the industry associations have been their sole sponsors. PricewaterhouseCoopers has been responsible for the financial building block and for overall programme management in all these benchmarks. healthcare administration agencies and mental healthcare. . Welfare and Sport and the industry associations initially acted as co-sponsors of the healthcare benchmarks. Our benchmarks – which we refer to in this report as healthcare benchmarks – comprise a series of comprehensive projects in nursing.For the purposes of this report. Our large-scale benchmark model has also been applied in the Dutch vocational education and training and housing corporation sectors. while we also include a few benchmarks on sub-areas such as treasury and invoicing. detailing the number of participants and the names of consultants and agencies we have worked with. The Ministry of Health. care and home care. and are thus able to describe what went well. we have carried out extensive research into the literature of benchmarking. The report’s focus on benchmarks in which we have been involved ourselves reflects the fact that we’re familiar with all their ins and outs.

’ . profile and image.’ ‘We want to improve the management of our organisation. but to further understanding here we present our working definition: Benchmarking is the process of systematically comparing performance as a starting point for improvement.’ Positioning ‘We want to see how we’re doing compared with others. drawing on research literature and our own and others’ experience with healthcare benchmarks.’ ‘We want to set target standards.1 Why benchmark? This section describes the benefits of benchmarking for both benchmark participants and any other parties involved. 1.’ ‘We want to put our performance in perspective.1 Rationale for benchmarking Learning and improving ‘We want feedback on and insight into areas in need of improvement. A benchmark allows organisations to gauge their own position.’ ‘We want to identify areas of particular expertise on which to focus.’ ‘We don’t want to fall below the average.’ ‘We want to share key success factors.’ ‘We want to broaden our view.’ ‘We want confirmation that we’re doing better/worse than others. creates a learning curve to help improve performance and helps boost transparency.’ ‘We want to learn from best practice.’ ‘We want to use the benchmark to set priorities. The following sections will review varying definitions of benchmarking.

As the diamond-shaped symbol shows. Figure 1. Benchmarking in Dutch healthcare Transparency ‘We want to communicate our performance to our clients.’ ‘We want to supply management information to our industry association. The blue triangles indicate the scores of the participating organisations.’ ‘We want to boost the industry’s image.’ 1. The figure’s horizontal line captures the average score.g. with the emanating lines demarcating the confidence intervals. do you rank among the leaders or the laggards? Benchmarking helps to broaden your perspective and to make your organisation less inward-looking. the relevant healthcare provider clearly lags the average.’ ‘We want to be externally accountable.2 Positioning Benchmarking allows comparison of your organisation’s performance and that of other benchmark participants and provides insight into where you stand relative to other organisations – e.’ ‘We want to present a clear profile to the outside world. It is taken from a report submitted by a participant in a home care benchmark and in this instance relates to client assessment of the care provider’s accessibility.1 gives an example of the kind of information that might arise from a benchmarking exercise. .

55 Average 68. how big the gap is and how they rank in competitive league tables.37 154.1 Example of benchmark information: accessibility of the healthcare provider Source: 2004 home care benchmark survey Benchmark thuiszorg Benchmark participants will be presented with their scores on all issues in a similar way. Table 1.5 8 7.5 Figure 1. with a number of aggregate scores also provided.00 17.5 7 6.36 10.72 28. the Benchmark verpleeg.Why benchmark? ! 9.1 Example of benchmark information: time spent on care in care homes in minutes per client per day Time Direct client-facing Indirect client-facing Non-client-facing Total Your organisation 77. allowing care providers to compare their performance with both the average and the best-performing organisations.5 1 5 9 13 17 21 25 29 33 37 41 45 49 53 57 61 65 69 73 77 81 Healthcare providers Average score Z-org 2004: 8.5 9 Client assessment 8.64 9.90 100.2 Your score 2004: 7. Our second example was taken from the benchmark study on nursing and care homes.48 21.09 Source: 2004/2005 nursing and care home benchmark study Benchmark verpleeg.en verzorgingshuizen . Table 1.74 111.en verzorgingshuizen.02 Best practice 108.1 provides a review of care provided per client in care homes.45 23. Participants can thus identify precisely where their clients are more or less satisfied than those of other care providers.

If you have opted for a specific make-up of your workforce.4 % 11. A ‘traffic-light’ system highlights performances and immediately shows up areas where improvements have – or have not – been made. shows workforce assessments of their working conditions.8 % 15. Note that benchmarking is always a means and never an end in itself. for instance.3. it allows you to identify any areas in need of improvement and actions that need to be taken. Figure 1.8 % 11.2 % 24.3 % 10. Top 10 areas for improvement according to your clients Organisation: • Annual evaluation meeting • Fewer personnel changes • Improved accessibility of manager by telephone • Improved backup for carers in case of illness etc. a benchmark will also produce a comparison over time. ." Benchmarking in Dutch healthcare 1. Providing comprehensive insight into your performance. • Greater account taken of client wishes • Reduced waiting period for start to home care • More convenience services + supplementary care • Greater telephone availability • More oral information Provision of care: • Greater focus on quality of life 7. Figure 1. as cited by clients (in percentages of clients citing relevant areas) Source: 2004 home care benchmark Benchmark thuiszorg If previous benchmarking has been carried out. and that benchmark outcomes should always be tested against your own vision and policies.2 Example of benchmark information: areas for improvement in home care.9 % 10.3 % 11.8 % 25.2 provides an example of possible areas for improvement according to clients in the Dutch home care industry.8 % 15. for instance.3 % Figure 1. benchmark outcomes in this area are bound to deviate from those recorded for other participants and need not be a reason for change.3 Learning and improving A benchmark typically shows up any areas for learning.

The 2004/2005 home care study suggested that the smaller the span of middle management control. showed a connection between better client assessments and higher spending on management. The 2004 home care benchmark. The same benchmark revealed positive work ratings from staff in organisations where managers applied effective management and had their business operations in order.Why benchmark? # Overall score Energy boosters Response Work stressors Wellbeing Wellbeing Figure 1. the more positive the clients were. A lesson to be drawn from the first example could be not to cut down on management too rigorously as this might jeopardise quality.4 Relationships between performances A benchmark also provides insight into relationships between performances. These types of findings help set the outcomes of the benchmark against the implications for other scores.3 Example of benchmark information: comparison of workforce assessments 2002 and 2004 Source: 2004 home care benchmark Benchmark thuiszorg 1. for example. . but also that this relationship disappeared when management costs exceeded around 6 per cent of the overall spend.

They do so because they want to. 0-4 years Source: 2005 JGZ financial benchmark Industry associations have been known to use benchmark-derived financial data when negotiating pricing and additional resources.5 Transparency and profile Care providers use benchmark outcomes in communicating with stakeholders to improve their profiles and promote transparency. child healthcare (JGZ).4 Example of industry information: breakdown of costs per child per year.$ Benchmarking in Dutch healthcare 1. a benchmark will always produce outcomes that capture the state of play in the industry. 1. Figure 1. 350 300 250 200 150 100 50 0 Total costs per child Average total costs per child € 267 Figure 1. But industry associations also use their benchmarks . External stakeholders such as regional client organisations and financial backers are equally interested. works councils and supervisory boards. Sometimes this backfires: home care providers in the Netherlands at one point faced retrenchments because their benchmark had showed up opportunities for efficiency improvement. Benchmark outcomes feature on the agendas of internal stakeholders such as client councils. but stakeholders also insist that they do. These serve as input for industry associations when promoting the interests of their members. or at the very least among the benchmark participants.4 has an example of industry information.6 Information for industry associations Aside from outcomes for individual care providers.

Only when participants are not judged on the basis of their results will they be prepared to openly discuss less brilliant outcomes. Sometimes. Obviously. The Raad voor de Volksgezondheid en Zorg (RVZ) . For example. . allowing governments.7 Benchmarking and accountability Clients are looking for information to help them pick the right care provider. has found that benchmarking changes the nature and intensity of the relationship between government. anecdotal evidence has supervisory boards using benchmarks to set targets for management. benchmarking serves to prevent the government or other national bodies from launching an investigation. Insurers want to see proof that the organisation meets specific conditions – often related to quality of care and financial health – before agreeing contracts. The first bank asking about benchmark outcomes before extending a loan has been spotted. 1. or MBO Raad. a benchmark requires a safe environment guaranteeing anonymous results or exclusive outcome-sharing with a self-selected group of peers. the Dutch Council for Public Health and Care. regulators and financial backers to adopt a less 1 interventionist approach. regulators and independent agencies. when the Netherlands Association of Vocational Education and Training Colleges. benchmarking contributes to an industry’s positive image. market and private enterprise. it was not just demonstrating its commitment to benchmarking. revealed that it had completed its first benchmarking study. Healthcare benchmarks may thus help to instil and increase trust in the health sector. it was also telling the government there was no need to press ahead with the information-seeking exercise the latter had proposed. And yet there is a fundamental difference between a benchmark used for accountability purposes and a benchmark used to help improve performance and encourage mutual learning.Why benchmark? % to underline their willingness to be transparent and to show that they make no secret of their strengths and weaknesses. By demonstrating in this way that its members are prepared to self-reflect and work on their performance. the line between communication and accountability is fuzzy. and benchmark outcomes are already being used in official accountability statements to governments. If used for the latter purpose.

‘A choice can be made for a broader or a narrower perspective. 4 . Watson questions whether. Competitive issues are argued to be less suitable for benchmarking. Watson even feels that benchmarking represents a fundamental shift in thinking about competition. Organisations would hold back information or make things look better than they really are. Drawing on an example of two companies that lived by the adage that competitors do not tell each other 3 anything and that proved singularly unsuccessful as a result. some argue.& Benchmarking in Dutch healthcare If benchmark results are used to hold them to account. They would display strategic behaviour and thus totally disrupt any learning curve. Any organisation that does not do so is not really prepared to learn and will go on the defensive if they do not like the scores. a competitive stance is the best way to go. one key factor will be whether the benchmark in the relevant sector is still developing. at the end of the day. improving their market environment in the process. organisations could shy away from proper use of benchmarks.’ Healthcare benchmarks span the whole range of these views. The authors of Benchmarking in de publieke sector (‘Benchmarking in the public sector’) would seem to consider learning and being held accountable as a gradated difference. In 2 its report Presteren door excelleren (‘Performing by excelling’). arguing that benchmarking should serve both purposes. Nonsense. For the rather more limited purpose of accountability. the Dutch government explicitly linked performance improvement and transparency/accountability. A broader perspective implies measuring and improving. others say. with the learning curve a vital ingredient for benchmarking organisations. many public organisations can stick to benchmarking in its narrowest sense: comparison with a benchmark as a means of determining relative performance. he finds that competing companies are increasingly tackling shared problems together and openly debating them. In fact. To an extent. organisations may wonder if its outcomes are sufficiently valid and reliable to be used for accountability purposes. If it is. Yet others feel a distinction should be made between competitive industries and sectors where competition is less of a feature. In the long term he sees little gain in acting as competitors only. Any self-respecting organisation will compare and report without reservations.

Aggregation levels will typically differ.g. The best way to go is obviously to collect the same data for both benchmarking and accountability purposes wherever possible. whether they benchmark or not. Association for Care of the Disabled in the Netherlands). care and home care) and the Inspectie voor de gezondheidszorg (IGZ) (the Dutch Healthcare Inspectorate) apply standards of responsible healthcare for both benchmarking and accountability purposes. the Vereniging Gehandicaptenzorg Nederland (VGN. has ensured that a key quality indicator in the benchmark now also features in its annual disclosures. benchmark participants are in danger of having to disclose a completely different set of data for accountability purposes – not exactly an encouraging scenario. ActiZ (the Dutch association for nursing. By making a change to its annual healthcare report (accountability). As accountability is required anyway. supplemented where applicable with data used for one of these purposes only. healthcare providers had best make sure that the required data are streamlined as much as is feasible and that definitions are harmonised. A basic set of data thus emerges for use towards both learning and accountability. That said. with disclosures to regulators aggregated at high levels – e. at the level of the organisation – while benchmarking requires lower levels of aggregation as its outcomes are intended to feed into actions for improvement. . at least some of the information disclosed is exactly the same. and one that would put a double burden on healthcare providers. Whatever the approach. both benchmarking and accountability data should basically be the same as the data needed for healthcare providers’ internal strategic management.Why benchmark? ' We take the view that learning and accountability are fundamentally different goals. If they do not. This makes for maximum alignment and keeps the burden on the organisation to a minimum.

’ ‘We’ve just come out of a restructuring process.’ ‘We reckon we’re doing quite well.’ Some organisations have reservations about the use of benchmarking. Accountability and regulatory disclosures are fundamentally different from learning and improving.’ ‘Our organisation does not compare with others.9 In conclusion It is our experience that benchmarking can produce many rewards for organisations.’ ‘We’ve just come out of a merger. while others see practical obstacles.’ ‘We think the costs of benchmark participation are too high.’ Practical impediments ‘We feel it’s too much work. image improvement and input towards policy-making are other benefits. 1.’ ‘We don’t like washing our dirty linen in public. . And.’ ‘We already know our weaknesses. Their doubts and concerns may be quite legitimate. but aligning data sets is vital if organisations are not to face a double burden.8 Reasons for not benchmarking This section has so far only discussed the potential benefits of benchmarking. with gauging their position and identifying areas for improvement being the most basic. on very rare occasions. But what reasons could organisations have not to participate in benchmark studies? Doubts as to its use ‘We use different methods that produce comparable results. an organisation will indeed be so unique that no others could offer any points for learning. Organisations aware of the areas in need of improvement have little more to expect from a benchmark. Benchmarking in Dutch healthcare 1. But sometimes there is something else lurking in the shadows: an organisation’s unwillingness to change. Transparency. It is also true that benchmarking requires investment: not just in the cost of participation but also in tackling the areas in need of improvement. And not every juncture is the right time for benchmarking: the situation after a restructuring or merger is often not the most representative time at which to gauge how an organisation is doing – albeit that some will want to benchmark precisely at this juncture to determine the baseline situation.

In addition to comparing and improving. All these elements will feature in this section. and. . The section also touches on the relationship between benchmarking and Total Quality Management and describes how benchmarking has developed into a management tool that is widely used across the world and has now also made inroads into the public and healthcare sectors. best practice. In our view. continuous also refers to a system that allows organisations to start their benchmarking process at any given time and retrieve their comparative data from a database. This reads as follows: Benchmarking is a continuous and systematic process for generating strategic management information by equally measuring and comparing both the efficiency and quality of performance. ‘Comparing’.2 Benchmarking: comparing and improving This second section sets out the various definitions of benchmarking found in the literature. implies that benchmarking is not a one-off exercise: organisations taking the trouble to make improvements will want see the effects of their efforts over time. 2. above all. but here is just a taste of what is in store. a balance between efficiency and quality. Continuous. with the express purpose of identifying starting points for the improvement of an organisation’s own performance by adopting best practices. strategic management information. for one. ‘learning’ and ‘improving’ typically crop up in almost all of them.1 Our definition of benchmarking The working definition we presented in the introductory section – benchmarking as the process of systematically comparing performance as a starting point for improvement – in fact derives from a rather more extended definition that we have arrived at in the course of our benchmarking research and surveys. our broader definition includes several other elements that are key: continuous.

systematic process for evaluating products. it should lead somewhere. Learning should be action-oriented. Edwards Deming . Benchmarking also means aiming higher. those that excel in the product or process to be studied. with learning not just the product of measuring (quantitative) but also involving investigating (qualitative). and comparing one’s own performances with these best practices. and work processes of organisations that are recognized as representing best practices for purpose of organisational improvement’. and preliminary research should help narrow down the list of suitable benchmark partners. with the aim of identifying one’s own position and improving one’s own performance. for one. that is to say. 2. Their advice? ‘Adapt. don’t adopt. And striking a balance between efficiency and quality in our book means that a benchmark should always cover multiple dimensions. Benchmarking involves continuous measuring of trends and developments on the basis of a series of activities. reckons it is dangerous to copy and argues that people should understand the background to what they want to do. Camp defines benchmarking as systematically investigating the performance and underlying processes and practices of one or more leading reference organisations in a particular field.2 Other definitions of benchmarking How do others define benchmarking? Spendolini defines benchmarking as a ‘continuous. Best practice of course implies organisations that serve as examples to others. an extremely high-quality healthcare provider might not present such a good picture. if its quality came at such a steep price that the organisation’s continuity was at risk. services.’ 6 5 . Several authors warn of the dangers of indiscriminate imitation of best 7 practices. this also means that benchmarking is more than a simple comparison with the average. i.Benchmarking in Dutch healthcare Strategic management information implies that benchmark data should provide clear information that allows management to do what it is there to do: manage the organisation. After all. Benchmarking is not restricted to specific types of activities or organisations. 8 Watson feels the same. Implicitly.e.

Benchmarking: comparing and improving


Westinghouse calls benchmarking a ‘continuous search for and application of significantly better practices that leads to superior competitive performance’. Note that the Westinghouse definition is rather more ambitious than that of most others. We will return to the use of benchmarking to achieve superior performance in Section 8. In Benchmarking in de publieke sector the authors describe benchmarking as creating insight into the relative performance of organisations within a group through comparison with a benchmark organisation. However, they also observe that organisations typically aim for more, with benchmarking also expected to contribute to improving the way institutions or companies function. Performance should not just be measured and compared, but where possible also improved. Benchmarking in the public sector is primarily seen as a tool to measure and enhance effectiveness – i.e. are the right things being done? – and efficiency – are things being done well and affordably? In other words: the learning curve is key. For Van Gangelen the learning aspect is so important that he includes it in his definition: systematically investigating the performance and underlying processes and practices of one or more leading reference organisations in a particular field, and comparing one’s own performance with these best practices, resulting in action-oriented learning. The European Commission also includes learning in its definition, which is the briefest we have found: ‘benchmarking is improving by learning through comparison’.
12 11 10



Benchmarking and Total Quality Management

Benchmarking and Total Quality Management are related concepts. TQM can be described as a way of managing an organisation that results in the continuous improvement of all processes and thus meets or, better still, 13 surpasses the expectations of customers and principals. TQM builds on three key principles: meeting customer expectations; managing processes and continuous improvement. Benchmarking and TQM are comparable approaches. Benchmarking could be argued to be a sophisticated 14 quality management tool and successful benchmarks often feature in TQM 15 strategies. Daft in fact sees benchmarking as one of the TQM techniques on a


Benchmarking in Dutch healthcare

par with outsourcing and continuous improvement. He feels that benchmarking is often a very useful boost to energy and direction in a TQM 17 programme. Benchmarking and TQM differ in that benchmarking focuses on key issues and best-in-class comparisons, while TQM covers all aspects of an organisation and 18 may also be totally internally focused. Bendell sees the current interest in benchmarking as ‘a natural evolution from total quality management’ and in fact takes TQM one step further. TQM focuses on a set of minor inefficiencies in need of improvement, but small incremental improvements are not enough in this day and age, Bendell reckons. Global competition requires quantum changes that are only achievable through benchmarking.



Definitions: differences and similarities

Definitions of benchmarking would seem to agree on a number of points. Measuring, comparing, understanding, learning and improving keep cropping up. And so do terms like systematic and continuous, clearly putting benchmarking in a different bracket from one-off corporate comparisons. Many definitions include terms like gauge, reference or best practice: comparing one’s own performance with a better one. In fact, the term benchmark itself literally refers to such a gauge or reference: a benchmark is a feature in the landscape used as a point of reference by a land surveyor. The term is also used to indicate sea levels, as Figure 2.1 shows.

Benchmarking: comparing and improving


Figuur 2.1 A benchmark The differences in benchmarking definitions typically involve the areas being benchmarked: performance or process, efficiency and/or quality. Like Watson, we tend to see these differences as different types or generations of 20 benchmarking rather than as fundamentally different definitions. Our definition of healthcare benchmarks differs from most other definitions presented in this section on one notable point: its multidimensional nature, 21 combining both efficiency and quality. We see the multidimensional or integrated nature of healthcare benchmarks as absolutely crucial, as this avoids a one-sided approach. Using a one-dimensional benchmark increases the risk of launching actions for improvement that do indeed enhance the benchmarked performance but only at the expense of other areas – leaving the organisation no better and perhaps even worse off. We therefore consider the multidimensional element an integral part of our benchmarking definition.

Virtually simultaneous with Phase 4. The focus of learning has now shifted from processes to fundamentally changing performance – which makes this type of benchmarking strategic. in another part of the world and/or culture. In other words: the search is on for the best product or process. e.$ Benchmarking in Dutch healthcare 2. Benchmarking may have started in the private sector but it has long since made the transition to the public sector. In global benchmarking. this phase again reflects a minor difference in regardless of whose it is. September 1998). • Phase 5: global benchmarking (1993 . Roughly coinciding with Phase 2. In 1998 a very tentative research question read: ‘Is it possible to develop a single. provincial. As for the healthcare benchmarks at the heart of this report. . • Phase 3: process benchmarking (1982-1988). this phase is primarily about a difference in emphasis: learning from best-in-class performers. Learning from both the processes and products of the competition.5 History of benchmarking Benchmarking first emerged in the 1950s and 1960s. integrated benchmark model for nursing and care homes and if so under what conditions?” (Request for a feasibility study on a benchmark for nursing and care. Dissecting and analysing competitors’ products to identify technical advances and then copy them. Watson attributes the rise of benchmarking to Frederick Taylor.g. at local. 23 Bullivant and Watson break down the development of benchmarking into five phases: • Phase 1: reverse engineering (1950-1975). national and European level. before really taking off in 22 subsequent decades. • Phase 4: strategic benchmarking (1988 . organisations seek to learn from players doing essentially the same thing but in a totally different external setting. • Phase 2: competitive benchmarking (1976-1986). we would rate the following quotations as revealing of their a proponent of corporate comparisons as early as the late 19th century.

But it is doing so within a broader framework. 2. drawing on state-of-the-art ICT and complying with other information trajectories. beaten only by strategic planning 28 and Customer Relationship Management (CRM). the Dutch child healthcare system (JGZ. with Bendell even referring to its use as ‘booming’. the request was to set up a sector-wide and forward-looking benchmark investigation: ‘Develop a continuous benchmark for the totality of nursing. covering 0-19 years of age) has no comprehensive industry-wide benchmark. to end up as these benchmark studies’ sole sponsors. Benchmarking has 25 become a popular tool.’ The ministry may have taken the initiative. Welfare and Sport to launch an investigation in 1998 into the possibilities for benchmarking in all sectors governed by the Algemene Wet Bijzondere 24 Ziektekosten (AWBZ. as part of its Beter Voorkomen (Prevention is Better) programme that also encompasses financial accountability. Things started out like this: ‘The Government has tasked the Ministry of Health.’ The change in who is giving the assignment is also interesting. and we are seeing the government provide the first push by way of a project structure and subsidies. The tool is also highly regarded: in 2004 its satisfaction rating was significantly above the average for other management tools. . a mere eight years later. but industry associations quickly developed into co-sponsors. 26 Bain & Company periodically investigates the use of management tools across 27 the world. In 2002 benchmarking even ranked second. To date. The benchmark subsequently changed from a subsidised project into an activity paid for by healthcare providers themselves. In 2004 it came third among the 21 most used tools. with benchmarking featuring very high in its rankings. In some instances the government is still acting as the driving force behind benchmarks in sectors that have had no sector-wide benchmarking. care and home care.6 Benchmarking: increasingly embedded Benchmarking is evolving into an ever more firmly embedded tool with an increasingly broader horizon in terms of reference groups. Exceptional Medical Expenses Act).Benchmarking: comparing and improving % In 2006.

1 4. 2005 management tool survey Bain & Company also finds benchmarking all over the world. In Europe. An informal experts group. Accenture’s 2006 global survey into the use of 31 benchmarking within public administration finds that government and government bodies are increasingly reporting the use of benchmarking as a tool. making it second only to strategic planning. The network has a particular focus on types of 32 international benchmarking.7 3.2.2 Most used management tools in 2004 Source: Bain & Company.6 3. with the one exception of Asia.3 4.97 3.91 3.5 3. the use of benchmarking would move up a slot in Figure 2. A sure sign of its growing popularity in the public as well as the private sector was the creation of the International Benchmarking Network under the auspices of the Organisation for Economic Cooperation and Development (OECD). where the tool is clearly less popular.& Benchmarking in Dutch healthcare Customer segmentation Outsourcing Benchmarking CRM Strategic planning 60% 65% 72% 73% 73% 75% 79% 70% 75% 80% 85% Level of satisfaction with various tools Outsourcing CRM Customer segmentatin Benchmarking Strategic planning 3.98 4.89 3.14 4.8 3.9 4 3.4 Figure 2.2 4. leading the consulting firm to conclude that it is not a fad but a consistently used instrument. The group first met in Paris on 21 November 30 29 . a hefty 88 per cent of respondents used benchmarking as a management tool. the network’s objective is to monitor benchmarking developments in public sector organisations and to gather and disseminate such information. Ignoring Asia. Benchmarking featured high on the list of tools surveyed by Bain & Company for over a decade.

and its planned activities include maintaining a database of web links on benchmarking in the public sector.7 Benchmarking as necessity Various studies suggest that benchmarking is becoming a necessity in both the private and the not-for-profit sectors. In the Netherlands. companies will have to match or exceed best practice at their competitors all across the world. In its survey of benchmarking in the public sector. products or services.Benchmarking: comparing and improving ' 1997. it has become imperative that organisations make improvements that bring real breakthroughs in production processes. Bendell lists three developments driving benchmark studies: global competition. but also in the public sector in return for tax revenues. Unchanged or shrinking public sector budgets and rising healthcare demand are combining to necessitate major improvement. by contrast. or the European Quality Award for Business Excellence. Accenture observes that the objective of the benchmarking exercise (performance improvements 35 and/or cost-cutting) often derives from heightened outside pressure. organisations are increasingly having to improve their efficiency and/or quality – or die. 33 2. In the face of fierce competition or the battle for funding. especially in the public sector. Benchmarking serves as a means to help achieve such improvements and at the same time demonstrate that the organisation is indeed seeking to improve. not just in the private sector in return for disposable purchasing power. Higher expectations in the community are also playing a part. People want value for money. benchmarking is still seen as something out of the ordinary. The authors of Benchmarking in de publieke sector believe that the United Kingdom and the United States have a clear edge in benchmarking. Lastly. 34 . Holland’s equivalent is the Nederlandse 36 Kwaliteitsprijs en -onderscheiding (the Dutch Quality Award). service processes. And winning awards brings more kudos. To survive. for instance. prices/publicity and the need for breakthrough projects. too: think the Malcolm Baldridge Award in the United States.

• Benchmarking is not suitable for all issues. cost-intensive benchmarking exercise.1 When is benchmarking an appropriate tool? Is benchmarking a panacea for improvement? Needless to say. it will not be necessary to engage in a time and energy-consuming. This section lists them and suggests solutions to potential challenges. Like any other management tool. reviewing such issues as: • preconditions for benchmarking • optimising learning • the need for a broad-based benchmark model • the use and purpose of external consultants • the added value of a multidimensional approach • tool quality requirements • aligning data gathering with general administrative duties • the ethics of benchmarking • the importance of voluntary participation • the importance of repeat benchmarking 3. If relatively simple matters are at stake that can easily be solved internally. it isn’t. benchmarking of course also comes with its preconditions and pitfalls. . If benchmarking is to be appropriate. • Benchmarking is no ‘quick fix with instant payback’. a range of preconditions will have to be met.38 • Benchmarking always requires follow-up and is not a method for implementing improvement. • Benchmarking makes demands on an organisation.39 Key preconditions are 40 senior management commitment and a culture and structure that are conducive to benchmarking.3 How to make benchmarking a success Benchmarking has a lot to offer. Benchmarking alone will not do the trick. Its limitations are reviewed at 37 length in Benchmarken in de publieke sector but are pointed out just as often in studies about private-sector benchmarking. the question in itself supplies the answer: No. and we would argue that it has its place in Dutch healthcare.

Watson even argues that an organisation resisting change is not yet ripe for quality. training facilities (benchmarking needs to be taught) and pre-established monitoring mechanisms Paraphrasing the words of a PrimaVera Working Paper. benchmarking requires a learning organisation. he feels: it can be managed. where are we taking the organisation? Senior management commitment is absolutely crucial. First is that the structure of the organisation allows differences. then. 44 43 . jobs or parts of the organisation • a pre-established TQM system • a framework encouraging information-sharing • a team-driven approach. i. is the appropriate course of action in dealing with complex issues with no obvious solutions. De Vries and Van der Togt identify the following elements in an organisation’s culture as conducive to benchmarking: • a focus on external measures (customer requirements or the performance of best-in-class organisations) instead of internal priorities • aiming for the very best • a willingness to change 42 • a willingness to learn or unlearn According to De Vries and Van der Togt . And benchmarking will really only come into its own if an organisation’s culture and structure meet certain 41 requirements. Its support should go further than merely allocating time and resources: senior management needs to be involved closely in the entire process and communicate the importance of benchmarking. organisations ready for quality see change as an exciting challenge. This paper’s authors also identify a number of preconditions if benchmarking is to be successful. benchmarking will be no more 45 than a one-off investigation without any teeth.! Benchmarking in Dutch healthcare Benchmarking. the organisation needs to have a vision that puts changes in perspective. In his view.e. Second. Quality readiness is no force of nature. In the absence of any willingness to change. that its culture encourages learning and experimenting with organisational change. a structure conducive to benchmarking typically displays the following features: • a focus on processes and operations and not on people. And lastly.

In our healthcare benchmarks we have also heard it said that not much is being done with benchmark outcomes. other strategic priorities and cultural aspects such as disposition to change. Grotenhuis’s dissertation arrives at a similar conclusion: ‘The effect of culture on the success [of a merger] can be managed. Surprisingly. Kishor Vaidya et al. KPMG argues that managers use culture as an easy excuse for not doing their 47 48 jobs. but that only four per cent of them have any inkling as to how to adjust their 50 51 procedures and systems subsequently. Reviewing twelve benchmarking initiatives. Half of respondents felt little or nothing had been done with their views. he argues. somewhat to their surprise it would seem. 52 49 . In one employee survey we asked staff that had taken part in the previous survey whether it had brought about any change. the success of a benchmarking exercise hinges on the degree to which the organisation actually manages to implement change.’ 3. Van Gangelen says more or less the same: ‘Benchmarking turns out to make a crucial contribution to obtaining insight but does not appear to inspire action-oriented learning.2 Key success factor 1: Optimise learning When all is said and done. Accenture . citing potential reasons such as limited capacity for change in the organisation. finds that nearly all companies and public organisations come out of a benchmarking study knowing what issues they score less well on.’ He considers that there is no evidence that benchmarking leads to the implementation of new knowledge in the organisation. But in a recent study into the success of mergers. benchmarking does not always produce learning.How to make benchmarking a success !! And what if the organisational culture is not favourable to benchmarking? Is 46 that an excuse? Van Gangelen’s case studies indeed suggest that it is sometimes used that way. Lack of transparency hides a fear of being held accountable. that this does not seem to affect the tool’s popularity. also find no trace of any measures for change but conclude. for one.

He explains that new insights are not applied because they do not fit in with our views of the world and reality. and really getting to the bottom of the problem. He reckons a key condition for achieving improvement is being able to handle the irrationality in organisations and managers. but because of our internal map of reality. And change is not something people do easily. Argyris has also done a lot of research into organisations’ capacity for learning. These views – or what he calls our ‘mental models’ – prevent us from thinking and acting in any other ways than those in which we are used to thinking and acting. he recommends. Managerial and professional behaviour creates defence mechanisms and then clothes these with various types of argument. He even goes as far as to call this the ‘skilled incompetence’ of teams of people who are experts at not learning.!" Benchmarking in Dutch healthcare So what prevents optimum learning? The literature on benchmarking has little to say about the absence of learning. A management team that really wants to learn is not just interested in the reality of the organisation but also in the actual nature of the management team itself. it will be tempted to blame the benchmark survey. as 55 54 53 . His findings gel with what Kets de Vries identifies as the need to ‘fight with the demon’ in his observations about organisations. but that is by no means to say that change is easy. Senge is convinced that less is being learned than should be possible not because of lack of will or good intentions. is to ‘shatter illusions’. he argues. seeing a ‘world of difference’ between identifying and analysing symptoms. And sometimes rightly so. One would be that learning and improving imply change. The real task. The learning organisation would do well to devote a great deal of attention to these mental models and discuss them at length. more specifically into the capacity for learning of management teams – or rather the absence thereof. But why would an organisation invest so much in a benchmark to then do nothing about it? We can only put forward a number of hypotheses. If benchmark outcomes are not what the organisation expected them to be. Watson may argue that organisations unwilling to change are not ripe for quality. even when they understand that it is necessary.

Prepare them for potential outcomes and their implications or consequences. The first condition for any change is that the need for change is recognised.How to make benchmarking a success !# we know from experience. Change experts call these the need for change. Answers to these questions provide pointers to ways of encouraging learning and improving. Don’t forget. Now this is a much less rational process. Benchmark outcomes are also rational: scores that reveal whether one performs better or worse than others. we soon hit on a number of fundamental change principles. we are now talking about the willingness to change of people who will actually have to implement the change. But whether the need for change actually leads to change then depends on a willingness to change – a willingness to think outside the box and take a risk. the willingness to change and the capacity for change – the three key tenets of any change programme. In terms of the rational aspects: • Ensure that participants understand early in the process how the benchmark works and are sufficiently aware of its potential for learning and improving. But occasionally the Not Invented Here syndrome kicks into action: ‘It wasn’t us who invented the indicator or method. The third and final precondition of change is the capacity for change. • Ensure that participants take time early in the process to think about the consequences of participating in the benchmark. .’ Fundamental change principles When looking into the question of how to encourage learning and improving. Are the people who need to change capable of changing? Do they have the knowledge and expertise to apply fresh insights? Focusing on the benchmark: do they know how to interpret benchmark outcomes? And how to translate these into action? Again. And they are not necessarily always the same people who have decided to participate in the benchmark in the first place. although the challenge of change always brings to light qualities that are rather less rational. Recognising the need for change is a largely rational mental process. we are talking rational aspects here. so we doubt its validity. and – in this instance – that the benchmark is agreed to be a tool that could potentially help in learning and improving.

not just in participating in a benchmark but even more so in implementing its findings. For ‘receiving feedback’ these golden rules are: – Do not perceive feedback as a personal attack. optimum use of benchmarking’s potential for learning makes demands on an organisation. • Think about how you will handle feedback. . Show senior management’s commitment by freeing up time and allocating resources to the benchmark. Let’s not forget that management set the standard.!$ Benchmarking in Dutch healthcare Ensure that participants are adequately advised of how the outcomes may be interpreted and translated into improvement measures. – Continue to treat the messenger with respect. demands that translate into questions to potential participants: • Think first about what you expect from the benchmark. The golden rules of communication apply to benchmarks. too. As we have seen. What kind of information are you most eager to find? What are you especially curious about? Which organisations would you like to be benchmarked against? Which organisation or organisations would you say display best practice today? • Encourage your organisation to set benchmark objectives: how good do we want to be? • What are we hoping to achieve with the benchmark? Draw up a plan stating what you intend to do with the benchmark outcomes. Find out what the message means and what the exact significance of the feedback is. – Do not immediately go on the defensive.

it is these organisations that will have to provide the data. but a broadly based benchmark is also crucially important. It is through this approach that we ensure that the model reflects the real world. The authors of Benchmarking in 56 de publieke sector also list a broadly based model as a key success factor. The building blocks that make up our healthcare benchmarks. After all. . Moreover.3 Key success factor 2: The benchmark model should be broadly based Willingness to change is key if any learning is to happen. of course. for instance.1 Deming Cycle 3. carried out by employees in their day-to-day work processes. advising that the approach to and implementation of a benchmark should be planned in consultation with all organisations involved.How to make benchmarking a success !% Embedding The crunch is how a benchmark’s findings and conclusions are embedded in the organisation. derive from the INK model that many healthcare providers in the Netherlands are familiar with. is the ultimate test for any benchmark. and that the model so developed needs to be appropriately communicated to all participants. This. The benchmark will yield its biggest rewards if the organisation translates its insights into concrete actions. Act Check Plan Do Figure 3. that feature in regular planning and control cycles and in Deming’s famous Plan-Do-Check-Act Cycle. that is. they often have a keen insight into who is performing well or less well – they know the story behind the numbers. It is our experience that a benchmark model needs to be developed in close consultation with the industry organisation and a working group of organisations acting as a sounding board. Actions.

Section 4 captures the strengths of the multidimensional approach in greater depth.’ Touching on its multidimensional aspects. Facts and figures can be validated and . or not to our minds at least. The title it chose for its discussion of the Dutch home care benchmark was ‘Benchmarking for home care: yes. 57 3. In a publication entitled Improving 58 the performance of health care systems . a fact-based approach primarily means that we stick to the facts. are a fact-based approach and optimum use of ICT. discussing building blocks and their interrelationships. We have said it before: working cheaply may be simple. but the quality of the research tools is of course crucially important for both the usability of the outcomes and for the backing enjoyed by the benchmarking process.4 Key success factor 3: A multidimensional approach A key feature of our healthcare benchmarks is their integrated multidimensional approach: investigation into efficiency should always include quality. we reckon. The proof would seem to be that best-in-class organisations typically score highly on individual building blocks but do not always command the highest scores: too much emphasis on one element invariably detracts from others.5 Key success factor 4: High-quality tools It may seem frivolous even to mention. it is possible. but if cheap means low quality the organisation will not be a good reference point for other organisations. the OECD reviewed eight projects in four different countries. More specific success factors here. the review calls the relevant benchmark an encouraging example.!& Benchmarking in Dutch healthcare 3. Logistics and content are equally important here: organisations should be able to devote their energies to the results of the benchmark and spend as little time and energy as possible on technical hiccups or distractions in terms of content.’ We have repeatedly and successfully drawn national and international attention to our benchmarking approach. As the term implies. Benchmarking in de publieke sector sees an integrated approach as a key success factor for benchmarks: ‘Always opt for an integrated approach comparing both financial and non-financial indicators and explicitly taking into account the resources – financial and otherwise – available to the individual organisations. and vice versa.

This is why we tend to use more general rankings. but at the very least per theme and individual section of the benchmark. Research tool quality is also reflected in ICT usage. ‘Doesn’t this fact-based approach provide a false sense of security?’ we are sometimes asked. by having clients agree or disagree with specific statements or list how often they have had specific experiences (the latter is now generally agreed to be the most exact line of questioning).36?’ Fair questions. a typical one being three categories with A containing the top quartile of best-scoring organisations. Which is also why we use a scoring system. ICT also comes into play in feedback reports. It is an arbitrary breakdown.34 and one scoring 7. In addition.How to make benchmarking a success !' objectively considered. . our more recent benchmarks feature a web-based tool for the sign-up procedure. an open text box allowing clients to raise their own 59 issues. Some pressed for a more qualitative approach. but in practice this is hardly possible and not really necessary either. if benchmark participants so desire. There are always several. In the early years of healthcare benchmarking there was certainly some debate as to whether client views could be captured in a quantitative gauge. storing all data that feed into the analyses. To avoid any false sense of security we will never identify a single best practice.g. ‘Is there really much to choose between a care provider scoring 7. which are automatically generated and include the relevant organisation’s data. the same also applies to employee surveys. We calculate scores per question if at all possible. This allows us to clearly pinpoint an organisation’s position among its fellow participants and to identify best practices. of course. e. Of course. and performance is compared to average scores on these best practices. A database is set up for every benchmark. category B the middle 50 per cent and C the bottom quartile of least good scores. but so is any other. It is now generally accepted that client views can indeed be adequately captured in a score. This type of survey can always also include a question that allows the client to pick the most urgent from a list of improvements or.

Make the necessary internal and external data gathering part and parcel of day-to-day operations. Aligning data flows requires full attention. So let us stick to what others have to say on the subject: ‘Consultants can be a great help. for your organization and to improve your organization. But that does not detract from Bendell’s exhortation that it is the organisation that needs to do something with the benchmark and not the consultants. If the data can be retrieved directly from the organisation’s records. and you would not expect an entirely objective perspective in a report written by one such consultant. But aligning should never come at the expense of quality. This would sharply reduce the chances of unearthing interesting interrelationships or finding useful clues for change. 3. 62 . participation becomes much less of a burden. But they should not do it for you. Benchmarking should be done by your organization. Keehley takes much the same view. an investment of time – matches the degree to which the data have to be supplied specifically for the benchmark.7 Key success factor 6: Aligning benchmark to regular records The degree to which organisations see benchmark participation as a burden – i. Align your regular administration of financial and non-financial data with your benchmark partners. thinking about one’s own performance and processes and." Benchmarking in Dutch healthcare 3.’ We concur. In the Dutch healthcare benchmarks. The authors of Benchmarking in de publieke sector recommend: ‘Embed periodic benchmarking in regular quality improvement activity within the organisation.’ 60 61 Bendell argues.6 Key success factor 5: Do not leave everything to external consultants Should you benchmark yourself or should you have external consultants come in and do it for you? Views on this issue vary widely.e. Getting the organisation ready for benchmarking. research consultants relieve organisations of much of the work: the questionnaires are provided. most importantly. It will not do simply to benchmark with data that happen to be there. the answers are analysed and the organisations receive feedback reports. devising and implementing improvements – these are and will remain tasks for the organisation itself.

too. but they do give a very good indication. We have found that there is a much greater willingness to share if a benchmark involves a small group of organisations working closely together in workshops. ‘Don’t organisations abuse the benchmark? How do you know whether you are getting genuine information?’ Our rejoinder is that the benchmark was designed by and for the participating organisations. for 64 . Of course. Organisations have to be willing to share information. these checks are not infallible. and that the responsibility for supplying true and accurate information lies squarely with them.8 Key success factor 7: Sensitive data handling Benchmarking requires openness. it would not be very logical for organisations to supply misleading information: they would be investing a lot of time.How to make benchmarking a success " 3. How to make organisations trust one another? And how to prevent benchmarking from resembling corporate espionage. Organisations refuse to share certain information. such as product selling prices. this primarily happens in benchmarks covering large groups of participants and not involving selection of benchmark partners by the organisations themselves. Needless to say. developed by companies and institutions working together in the Performance Improvement Group. In our work with healthcare benchmarks we also sometimes come across concerns about secrets getting out. Years of experience have taught us that strategic behaviour only happens once in a while – experience that derives from consistency checks included in the benchmark. We are often asked about this. ‘What you do not wish for yourself. Sharing information with other organisations is different from disclosing information to an insurer. with companies making off with each other’s secrets? US literature on the subject often calls this the ethics of benchmarking. Transparency is not the same as public disclosure. Besides. Benchmarking in de publieke sector observes that it is easy for organisations in benchmarks to make things look better than they are. among other things. money and energy in a benchmark whose outcomes they could not trust if they supplied incorrect data or thought that others did. Ethical guidelines to benchmarking have been set down in the Benchmarking Code of Conduct as developed by the American 63 Productivity & Quality Center and in the European Benchmarking Code of Conduct. do not do to others’ is Bendell’s key ethical principle of benchmarking.

Organisations do voice serious concerns about public disclosure. with government. based on the Business Excellence Model. A better way to go about this would be to turn a benchmark into such a success that organisations feel they are missing out if they are not participating." Benchmarking in Dutch healthcare instance. But sometimes reality catches up with views on public disclosure: the Dutch Healthcare Inspectorate on its website discloses individual quality scores by organisation – the same quality scores that feature in the benchmarks.’ We share their reservations. say. the Consumentenbond (the . for example. the dangers of public disclosure and the potential loss of support would seem to be a particular concern of authors working on benchmarking in the public sector. and finding the benchmark’s data in the media is something else again. An obligation to benchmark every two years. ‘In the Netherlands it seems to be rather too soon to start thinking along these lines. public disclosure could mean competitors making off with secrets after all. In the literature on benchmarking. Of course. arguing that this will encourage strategic reporting and lead to loss of control. In 2006. might run into so much resistance that any prospect of benchmark learning would be lost and the benchmark miss its objective altogether.9 Key success factor 8: No compulsory benchmarking 65 The authors of Benchmarking in de publieke sector also discuss whether benchmarking should be compulsory in some sectors. In the Dutch healthcare benchmarks it is standard practice for the organisations themselves to decide whom they wish to disclose individual benchmark outcomes to – the researchers definitely do nothing of the kind. perhaps a special Public Sector Quality Award. there is no denying that social pressure on organisations to benchmark is increasing. Another communication tool might make for an apter contribution to quality control than benchmarking. 3. Numbers can take on a life of their own. Moreover. Not joining when most others in the industry are doing so may seem almost suspect. are prone to misinterpretation and might create an unjustifiably negative picture. and is sometimes emphatically publicised. financial backers and clients expressly asking for benchmark information.

3. The literature bears this out: benchmarking’s biggest impetus often comes from repetition that helps to 66 show up its effects. with its magazine devoting a great deal of space to the fact that a number of hospitals had not participated. In fact. where they were ‘outed’ by name.How to make benchmarking a success "! Dutch consumers’ association) published a series of surveys into the quality of healthcare in the Netherlands. Section 8 has more on the ultimate in benchmark repetition. .10 Key success factor 9: Strength through repetition Benchmarking is not a one-off event. the continuous benchmark. hospitals that had not participated in any survey were the subject of a separate text box.

1 Classification criteria The literature shows that benchmarks are classified on the basis of the following criteria: • • • • • the benchmarking objective the nature of what is being measured the internal or external orientation of the benchmark (reference group) the level of organisation to which the benchmark applies use of normative standards We will add a sixth criterion to the above: • the research process Organisations planning to participate in benchmarking should decide for each of the above criteria what they consider to be important in a benchmark. Based on our own experience. we can add two further distinctions: benchmarks aimed primarily at determining one’s position (in which case general outcomes suffice) and benchmarks that seek to improve business practices (in which case more specific pointers are needed).4 Different types of benchmarking Benchmarking comes in different shapes and sizes. In doing so. . Each individual type is not necessarily suitable for all purposes. Choosing the most effective type of benchmark requires insight into how it can be used. This is addressed in this section. In this respect. ranging from one-dimensional to multidimensional and from small internal to industry-wide benchmarks.2 Classification by benchmarking objective Benchmarking can be classified by the objective of the benchmarking exercise. 4. De Vries and Van der Togt make a distinction between benchmarks that focus on internal efficiency and those aimed at external effectiveness. they will create an organisation profile with which they can go in search of a benchmark that matches it. 4.

’ The distinction between process and performance is made by various 68 authorities. such as their complaints procedures or telephone availability. such as econometric methodologies. the most important distinction is that between performance benchmarking and process benchmarking.’ They take process benchmarking to mean: ‘Undertaking a detailed examination within a group of organisations of the processes which produce a particular output. When classifying by what is being measured. Watson also implicitly applies this approach: processes serve as an . complex analytical techniques. An example of classifying by what is being measured. Cowper 67 and Samuels define performance benchmarking as: ‘Comparing the performance of a number of organisations providing a similar service. That is why benchmarks in healthcare focus on performance. And whereas relatively simple indicators tend to suffice in the latter case. Our experience with healthcare benchmarks is that comparing on the basis of process alone does not adequately meet the needs of the benchmarking partners. is to classify by scope. They want confirmation that the process in question has actually resulted in strong performance.3 Classification by what is being measured Benchmarks can also be classified by what is being measured. in principle. in which case allocating weightings to the various components is a tricky exercise."$ Benchmarking in Dutch healthcare 4. Whereas benchmarking may seek to compare the total performance of an organisation with that of other organisations. enabling comparisons to be made between dissimilar companies in different industries. Competitive benchmarking compares the performance of businesses that compete with one another in the same market. which is more gradated than a principle-based classification. measure a whole host of things: simple or highly complex. it may also simply seek to compare individual aspects of organisations. with a view to understanding the reasons for variations in performance and incorporating best practice. and use processes only to explain how the performance was 69 achieved. and therefore wish to know how the organisation has performed. factors that can be measured quantitatively or those that can only be described in qualitative terms. In this context Grayson distinguishes two different approaches: competitive benchmarking and process benchmarking. Benchmarking can. Process benchmarking focuses on the functionality of comparable processes. are needed in the first.

An example of a surprising outcome given by Watson is a survey by McKinsey into the commercial profits generated by new technological products. Yet the McKinsey study yielded a different outcome. and how can you prove that this improvement is the result of the care provided? That is why the focus on the quality of care in existing healthcare benchmarks serves to replace a lack of measurable outcomes. This debate has since subsided. Many healthcare providers felt that it did not apply to them and was a term coined in the corporate world that was not appropriate to the healthcare sector. The industry – in particular non-curative healthcare – still has a long way to go when it comes to unambiguously measuring the outcome of care. Until that time (1983) the world believed that the success of a new product was determined first and foremost by the degree to which development costs had been kept under control.Different types of benchmarking "% explanation for performance. . The question now is what the performance. for example. of healthcare providers actually is. So in this industry a timely product launch was found to contribute more to profits than cost-control. which makes the benchmark all the more interesting. Healthcare benchmarking to date has focused only on output. or outcome. which is to say on the amount of care. How can you uniformly measure an improvement in well-being on a national scale. During the first years after the introduction of healthcare benchmarks there was considerable debate about the term ‘performance’. He argues that the explanation given may be completely different from what the partners in benchmarking expected. namely that products whose development costs had overrun the budget (in some cases substantially) but that had been launched according to schedule appeared to be far more profitable than products that had remained within budget but were launched later than planned.

internal benchmarks compare partners within a single collaborative venture. An organisation examines the products or services of a competitor. An example of such benchmarking would be a car manufacturer that buys a car made by a direct competitor and subsequently takes it apart to examine how it has been made. business lines or university departments within a single organisation. 71 Odenthal and Van Vijfeijken state that this type of benchmarking is particularly suitable for comparing indicators that are not sector-specific. • External industry benchmarking. with a comparable denomination or of a similar type. Odenthal 72 and Van Vijfeijken observe that looking beyond existing boundaries can yield . A further analysis of the literature shows that many researchers are in favour of including organisations from other industries in the benchmark. An organisation compares itself with organisations in the same industry and in other industries. such as the overhead percentage.4 Classification by reference group: internal or external benchmarking Harrington & Harrington make a distinction between internal and external benchmarks and identify five types of benchmarking: • Internal benchmarking. Healthcare benchmarks are in the last group. • External competitive benchmarking. A variation on this type of benchmarking. a benchmark against other organisations in the healthcare sector.e. such as schools with comparable numbers of students."& Benchmarking in Dutch healthcare 4. The main question in such comparisons is: who performs better under the same circumstances? • External generic benchmarking (cross-industry). An organisation compares itself with other organisations in the same industry that are not direct competitors. is combined (if the organisation in question so wishes) with a benchmark against departments or districts within their own organisation. 70 Odenthal and Van Vijfeijken refer to a similar kind of benchmarking as ‘functional benchmarking’: comparing one’s performance or processes with those of organisations in similar circumstances. districts. without this other party being involved. i. A comparison between locations. where an external industry benchmark. • Combined internal and external benchmarking. or matters relating to facility management or personnel policy.

After all. the organisation as a whole is important. 4. Does the benchmark relate to a department. for example. This distinction has proved to be essential in healthcare benchmarks. Theoretical indicators could lead to spurious best practices which may be impossible for ‘flesh and blood’ organisations. particularly if policy is made at that level. the organisation as a whole tends to be too general a level for the identification of specific areas for improvement. learn from the way in which the health sector operates. division or department appears to be more suitable.Different types of benchmarking "' new perspectives and solutions. . criteria for quality awards such as the European Quality Award and the Dutch Quality Award) and benchmarking that uses non-normative empirical data. the organisations involved could negate the outcomes by referring to an invalid standard: the Not Invented Here (NIH) syndrome. Eleven years ago. Education could. That said.6 Classification by use of normative standards Benchmarking methodologies can also be classified by the standard used (the 74 benchmark). Klages makes a distinction between benchmarking that uses predefined normative standards (e. for instance in the areas of knowledge management. Another example is a hospital benchmarking exercise in the United States where the patient admission process was compared with the way in which airlines and hotels run their 73 check-in counters.g.5 Classification by level of organisation This classification is based on the level at which data are gathered and analysed. we believed that this would lead to greater objectivity. In this respect the level of a specific location. too. 4. rostering and planning. a division or the organisation as a whole? In our experience. We also found that existing indicators had been developed from theory and could not always be used in practice. our initial idea was to benchmark on the basis of definable normative standards. if indicators that are not generally accepted are used in a benchmark. however. we could not find any standards (indicators) that were both generally accepted and easily measurable. In practice. and – more importantly – were not always recognised.

either by formulating ambitious standards or by emulating organisations abroad or organisations in other industries. the organisations compare their operations with those of the best organisations included in the benchmarks. And the best performers themselves usually need to put in considerable effort to 75 remain the industry leaders. we are aware that this approach has its limits. we have so far opted for comparisons based on non-normative empirical data in the application of healthcare benchmarks. With respect to the identification of standards. cross-industry and/or international benchmark. In a report about benchmarking in the public sector in the United Kingdom. Experience to date has shown that these standards are highly suitable for use in benchmarking. One could point out that there may still be room for improvement even in the best-performing organisations in a particular industry.# Benchmarking in Dutch healthcare In view of the above. that are easily measurable and that can be incorporated in an integrated approach. of course. health inspectors and other parties are jointly developing quality standards that need to be met by all organisations and to be tested from time to time. For the time being. . Care providers.’ That is why we believe that the benchmark of the future (see Section 8) should include a normative. there is still much to be gained by taking the best-performing organisations in an industry as the point of reference. So is this the best possible solution? We see it as a process. Standards for responsible healthcare set specifically for the healthcare sector are expected to apply soon. In other words. Organisations aiming for true performance excellence tend to look beyond the performance of their own industry. the following warning should be heeded: select standards that enjoy widespread support. In some cases the adoption of standards may be closer than one might think. They wish to excel. clients. This also has its drawbacks. 76 Cowper and Samuels draw a conclusion that was formulated as follows by Professor Helmut Klages: ‘The strategy to “let the figures speak for themselves” produces the desired learning effect only if the comparisons discloses levels well below the norm. and speculate further that it may become 77 negative above the norm. That said. One can assume that the learning effect decreases to zero at an average level of performance. The standards combine scores 78 taken from client surveys with scores for indicators used by the Dutch Healthcare Inspectorate.

In another type of benchmark. questionnaires. Healthcare benchmarks come under the heading of fact-based benchmarks. they compare themselves with objective facts and there is little contact between the participants and the research designers or researchers. They use a fixed set of tools (analytical model. The main focus here is on the role played by the benchmark participants in designing the research process. Contact with the researchers takes place exclusively through the helpdesk. or they will be advised that they can download the reports. The organisation will then provide details using a web application. Organisations considering taking part in a benchmark can download pertinent information or attend an information session.Different types of benchmarking # 4. the next step is the sign-up procedure. There will be no contact with a researcher. the participants themselves develop their own benchmark and are in close contact with external experts.7 Classification by research process In addition to the five classification criteria distinguished so far. it will generally follow a fixed format. the organisations concerned are invited to take part in a workshop in which several organisations discuss the results. which we have called the research process. with the exception of the client survey. where researchers will actually visit the organisation if the client is unable to respond in writing. which we have called the fact-based benchmark. the benchmark reports will be sent to the participating organisations. The same applies to the stages during which data are gathered and analysed. After the results have been published. but as a rule the benchmarking partners are not directly involved in designing the questionnaires and reports and there tends to be little direct contact between participants and researchers. which will be validated and analysed by the researchers. In one type of benchmark. which we have called the interactive benchmark. score calculations and reports) and fixed benchmark items. The only times when the organisations benchmarked are in contact with the external research team are during the . If an organisation decides to take part. We will illustrate this by describing the benchmarking process in the healthcare sector. the participants use existing benchmarking tools. we would add a sixth criterion. which will always be available. unless the organisation gets in touch with the helpdesk. The organisations will provide benchmark data. In the final stage. if any. Questionnaires and reports are drawn up in cooperation with the client and with a group of organisations that act as a sounding board. Whilst it can make a few choices.

this is rarely the case in the second type. requires participants to be more closely involved in developing the benchmark from start to finish. In the interactive benchmark. performance is measured against objective. design the questions. validated and representative data. It goes without saying that these two types of benchmarks place different requirements on organisations. the benchmarking partners are in close contact from the very start and the external experts – if any – act as process managers and supervise the sessions. such as insufficiently ambitious benchmarks. The benchmark teams jointly establish the benchmark. via the helpdesk and during the workshop at the end. There are also practical reasons for choosing a particular type of benchmark. carry out the analyses and discuss the results. the second type is better suited to small-scale surveys. performance can be measured against a previous benchmark. An internal team will be required to initiate an awareness-raising process during the benchmark survey. Interactive benchmarks are based on a totally different research process. In the first type of benchmark. The second type of benchmark. Whereas large-scale surveys may include several dozen or even several hundred . Here. In the fact-based benchmark. The first requires that the organisations benchmarked largely give shape to the learning process themselves. determine which data should be gathered. Whereas the first type of benchmark is better suited to large-scale surveys with several dozen or even hundreds of partners (in which case jointly developing the benchmark is practically impossible). And whereas the fact-based benchmark may disregard the learning needs of individual participants. it may also be measured against non-representative data or qualitative descriptions. embed the benchmark results in the organisation and see to the implementation of quality-improvement measures. the interactive model. which would diminish the learning effect.# Benchmarking in Dutch healthcare information sessions at the start of the process. the interactive benchmark could lead to repeatedly ‘reinventing the wheel’ and possibly less-than-optimum solutions.

4. this only leaves the option of the fact-based benchmark in view of practical and cost considerations. Participants tend not to be directly involved in benchmark tools and content.e. in combination with an internal benchmark. which we will discuss in more detail in the next section. Classification criterion Objective What is being measured Reference group Healthcare benchmarks To improve efficiency and effectiveness.8 Profile of a healthcare benchmark model This section has discussed six different ways of classifying benchmarks. Level of organisation (company/foundation) as a whole. PricewaterhouseCoopers’ healthcare benchmark model.1 Profile of PricewaterhouseCoopers benchmark model Source: PricewaterhouseCoopers . First and foremost. we can create a profile for each benchmark. Processes only to the extent that they explain performance. Fact-based benchmark. performance. but also departments and products. small-scale surveys tend to be limited to about ten. By using these classification criteria.Different types of benchmarking #! partners. If standards are used they must be generally accepted by the industry. Level of organisation Use of standards Research method Table 4. comparing with other organisations in the same industry. i. External industry benchmark. So if industry associations strive for broad participation. has the following profile. To begin with no standards: performance is compared with that of best performers.

which does not mean that there is no room for improvement. For further information on this. We shall refer to this principle as the multidimensional approach.5 Benchmarking model for healthcare benchmarks In this section we will address more closely the benchmarking model used in recent decades in the benchmark surveys we have carried out together with our clients and other partners in the health sector. As far as we are concerned. or dimensions. The key features of the model are financial performance and quality. and we consider it to be a basic premise. the model will continue to form the basis of good benchmarking. This means that this section is based on practical experience. various benchmark studies. With this model the survey results yield strategic management information. . Strategic analyses Input Environment Building blocks Financial performance Strategic themes Results Industry Resources Identification of best practices Quality of care Quality of the job Data measured Organisation Historical context Department/unit Explaining performance Benchmark strategic management information Figure 5. A crucial characteristic of the model is that it is built up of several building blocks.1 Benchmark analysis model Source: PricewaterhouseCoopers. which the organisations can use to adjust their strategy or use of people/resources in order to improve performance. see Section 8.

That is why this benchmark paid particular attention to identifying the factors that influence the motivation of existing employees and the appeal of the organisation as an employer. Consequently. And the organisation employs resources: capital. The organisation will have to take account of its environment (or. we often start a benchmark survey with a session in which representatives of the partners are asked to name a number of themes about which information could be gathered. The left shows the organisation’s input. people.1 Input and strategic themes Strategic analyses Input Environment Building blocks Financial performance Strategic themes Results Industry Resources Identification of best practices Quality of care Quality of the job Data measured Organisation Historical context Department/unit Explaining performance Benchmark strategic management information The model should be read from left to right. . or starting point. Which product-market combinations do we wish to deliver? How do we deal with a tight labour market? How can we remain financially sound? How can we become the very best? The answers to these questions will largely determine the way in which people and other resources are employed. more precisely. being in the health sector. In the 2004 home care benchmark the tight labour market was named as one of the strategic themes. which so far has little to do with benchmarking. The organisation will then identify its strategic themes on the basis of this starting point. equipment and of course. In other words: the strategic themes determine an organisation’s structure and work processes. The benchmark can provide useful information for the organisations’ strategic themes. 5.#$ Benchmarking in Dutch healthcare We will discuss each component of the model in turn in the following subsections. the environment ultimately determines the organisation’s goal) and its historical context.

for example. We opted for a classification into three possible strategic positions. The organisations that took part in the benchmark completed a short questionnaire to determine their position. ICT in healthcare) while a cost leader focuses on offering the lowest possible price. A client leader focuses on optimum customer satisfaction. product leader and cost leader.Benchmarking model for healthcare benchmarks #% Strategic position Given the importance of strategic themes in influencing the choices organisations make. it would not be correct to treat the performance of different organisations in the same way. we proposed including their strategic position in the benchmark survey. In our analyses. this being so. and that product leaders would have a bigger investment budget for product development. In the benchmark we opted for a construction in which the position could vary per product. no more than 20 per cent of all home care providers actually opted for one of the three positions. by 73 organisations (some organisations completed questionnaires for more than one product). We assumed that the client leaders in the benchmark would score highest on the building block ‘client assessment’. imagine that home care organisations would choose the position of client leader for the product ‘personal care’. These researchers make a distinction between the position of client leader. of those that did. all chose the position of client leader. that cost leaders would score highest on the financial building block.g. A large majority of the organisations did not make a clear choice. We could. Treacy and Wiersema hold that successful companies choose to pursue one specific position. Our hypothesis was that an organisation’s performance was related to the strategic position it had opted for and. with service and long-term customer loyalty being key concepts. First of all. we related the benchmark performance to these positions. . or chose two or three positions. A product leader offers sophisticated and innovative products (e. but the position of cost leader in the case of ‘housekeeping services’. We were able to test this hypothesis in the 2004 home care benchmark. based on a 79 theory developed by Treacy and Wiersema. The findings of the study did not confirm our hypothesis. The questionnaire was completed 88 times.

draw any conclusions from such small numbers. of course. no correlation was found between the position opted for and the performance delivered. and not even for the public sector. The hypothesis that the strategic position opted for by organisations influences their performance has not been rejected. but that time will no doubt come. its implementation in the organisation was still in its infancy. which is much higher than the average of 20 per cent referred to above. First of all. . A third interpretation is that whilst they may have chosen a position explicitly. Secondly.#& Benchmarking in Dutch healthcare Secondly. The time may not yet be ripe to include this aspect in the benchmark. but this does give us grounds to suggest that including the strategic position of organisations in the benchmark can add value. there was also evidence indicating that the strategic position of organisations did affect their performance. Treacy and Wiersema did not develop their theory specifically for the healthcare sector. That said. it could be that the three positions distinguished and/or the questionnaire are not yet 80 sufficiently geared to the healthcare sector. Three of the five best performers in the benchmark survey had chosen a clear position. No more than five of the 17 organisations that made a clear choice in favour of client leadership scored above average in the client survey. We may not. it may be that most organisations have so far chosen their strategic position implicitly and have not related it to the structure of their organisation or their work processes. Several benchmark partners advised us that this instrument had helped them in the awareness-raising process. These findings can be interpreted in different ways.

As this is a well-known model in healthcare. The central part of the figure depicts the building blocks. equity .3 The financial building block Strategic analyses Input Environment Building blocks Financial performance Strategic themes Results Industry Resources Identification of best practices Quality of care Quality of the job Data measured Organisation Historical context Department/unit Explaining performance Benchmark strategic management information The financial building block has so far formed part of all healthcare benchmarks. In defining the various areas measured.2 Building blocks of benchmark surveys Strategic analyses Input Environment Building blocks Financial performance Strategic themes Results Industry Resources Identification of best practices Quality of care Quality of the job Data measured Organisation Historical context Department/unit Explaining performance Benchmark strategic management information Let us return to the analytical model. we decided to follow as closely as possible the INK management model designed by the Dutch Quality Institute. and therefore several dimensions.Benchmarking model for healthcare benchmarks #' 5. however. An organisation’s financial performance is based on cost. In all cases. namely the performance aspects measured. 5. These building blocks constitute the key elements of the benchmark survey. a benchmark should include several building blocks. its outcomes are familiar to participants and easily applicable to their own operations. The building blocks examined differ per benchmark.

Other weightings are also possible. healthcare providers also deliver meals. In the healthcare sector this means the amount of care delivered per euro.2 Weighting of financial building block components (example) Source: PricewaterhouseCoopers. various benchmark studies . Even an organisation that delivers much more care per euro than other providers. accounts for 10 per cent of the total score. which we took to mean costs in relation to production. The components are weighted in order to calculate a total score. The financial position.$ Benchmarking in Dutch healthcare and financial position. for example. A similar problem would arise in the long term if an organisation’s financial position were not sufficiently sound. for example. Our first benchmarking studies equated financial performance with efficiency. And whereas this does not cover all aspects of healthcare delivery – after all. Financial considerations always play a role in healthcare and the literature shows that the cost side of things is usually included in healthcare benchmarking. This dimension is excluded only in a few small-scale benchmarks in the public sector. If. That is why the financial building block was extended to include the components ‘financial position’ and ‘net profit/loss’. or chips in immediately in the event of losses. an organisation’s results are not necessarily a good indicator of performance. the government or another financial backer permits only limited capital accumulation. Total score financial performance 20% 10% 70% Efficiency (DEA) Net profit/loss Financial position (budget ratio) Figure 5. laundry services and financial overviews – the definition was considered to suffice at the time by the benchmarking partners. has a problem if costs exceed income in a given year. as efficiency is not the only aspect of financial performance that providers must deliver on. net profit/loss makes up 20 per cent of the score and efficiency 70 per cent. for example. The definition of financial performance has been extended since the 2004 home care benchmark survey. as the delivery of care is the core competency of healthcare providers. but the degree to which a care provider itself is able to influence performance should also be borne in mind.

works well in that context. time was measured by registering the care and treatment provided. How do we calculate the amount of care delivered per euro? The answer to this question could provide organisations with valuable insights into their financial management and possible cost savings. which may be related to staffing levels. The large number of observations made during the course of a week enabled the calculation of reliable averages. If we really want to know how much care is provided. Despite anecdotal evidence that staff found this an interesting experience. we first need to answer three sub-questions: how much care is delivered. All carers/nursing staff were issued with a palmtop computer on which they had to indicate which activity they were carrying out for which client(s). Time registration. we need to measure it. rather than nursing days. There was a clear wish to gain insight into the amount of care delivered using a simpler method. or care complexity modules. a method used in outpatient care. a method developed by Customers Choice and known in the Netherlands by the acronym ZBR. Some . Today (2007) this is much easier than it was some time ago. The activities entered were automatically clustered into activity groups and AWBZ functions (long-term care functions as specified under the Exceptional Medical Expenses Act) with the measurements spanning a period of a week. we came across a practical problem. now that we work with so-called ZZPs. and how can the amount of care be responsibly matched to costs? In answering the first sub-question. In order to give a complete answer. The amount of care delivered is now measured through production targets or registered production. Another development is that the amount of care provided is limited given the requirement that the care delivered should be matched to need. what are the costs involved.Benchmarking model for healthcare benchmarks $ How do we calculate the amount of care delivered per euro? Let us return to efficiency. They had to register this information every twenty minutes using pre-coded answers. the benchmarking partners indicated that they felt it to be a heavy burden. It is for this reason that the most recent benchmarks for care of the disabled and for nursing. We are better able to keep tabs on the amount of care delivered than we were a few years ago. During the early years of healthcare benchmarks. care and home care (VVT) no longer require that time is registered (although care providers may do so on a voluntary basis). but a different kind of time-keeping is needed in inpatient care.

and do they know in advance whether their employees would actually find it a burden? Taking stock of costs In each benchmarking exercise. but the amount of care actually provided can differ substantially within a ZZP. Benchmarking organisations could learn a great deal from comparing their own production with that of others. even after adjusting for the complexity of care. production targets and registered production are expressed in terms of ZZPs. In other words. this gives the . the following basic procedure applies. the salary costs. Time measurements are then used to calculate the percentage of contract hours spent on client care. The first item to be measured is the personnel costs per contract hour per group of employees or level of expertise. showing up clear areas for improvement. we would like to query the suitability of this solution. Time measurements conducted in nursing and care homes in the past have shown that some organisations were able to deliver many more minutes per client per day than others. We would therefore like to urge benchmark participants to examine whether the benefits of time measurement make up for the burden of keeping time. In residential care. These pointers for improvement disappear when measurements are based on care complexity modules. This enabled organisations to monitor such issues as whether an above-average amount of time was lost waiting for a lift to arrive. In terms of research quality. as non-client-facing activities were also recorded in the ZBRs. After adjusting for group-based care. Even so. or on transporting equipment. Do participants know exactly how much time is spent on measurements of this kind. time measurements enabled comparison of time spent on each activity. however. taking stock of costs is tailored to the specific industry being measured. It is unlikely that these differences have completely disappeared. In the past.$ Benchmarking in Dutch healthcare providers are even experimenting with electronic patient files in which the care provided is registered daily. Another advantage of time-keeping was that it provided a measure of total productivity. are related to the length of the working week (or ‘contract hours’) as stated in the employee’s contract of employment. special allowances etc.

3 presents the model for nursing and care. the share of each level of expertise in each function. In this way the benchmark can be used to provide pointers for improvement in an organisation’s financial performance. for instance to calculate the effects of a new budget or of certain cost measures. Amount of care in relation to costs Calculating efficiency gives the cost per AWBZ function (or ZZP) per time unit.Benchmarking model for healthcare benchmarks $! personnel costs per hour of client care. nursing and care benchmark . with the aid of the ZBR. Figure 5. The cost price model can also be used independently of the benchmark.3 Cost price model for nursing and care (See figure on next page) Source: PricewaterhouseCoopers. The costs per care complexity module could also be calculated in this manner. A calculation of this kind shows up the actual costs for each client-facing hour. The price level can then be explained by each of the underlying factors and any relationships between these factors. broken down by level of expertise. Figure 5. The costs are calculated for each level of expertise. which in turn can be used to calculate the costs per hour per AWBZ function by determining. These costs are multiplied by an extra amount for overhead and materials.

7am to 10am 10am to 4pm 3pm to 11pm 11pm to 4am Product and service catalogue B: l treatment V: l accommodation HVZ x hours PVA x hours VP x hours OB x hours OB x hours AB x hours AB x shifts B x hours V x yes/no . Mon Tue AB: l intervention in crisis situations l assisting groups of clients. number of clients in homogeneous group Day Night number of clients (day/night) number of clientfacing hours per function Cost price per function per time unit p (price) HVZ PV VP OB AB B accommodation Non-clientspecific = personnel costs healthcare workers per client-facing for example: HVZ: 10 4 3 24 24 48 l cleaning private quarters Group matrix Client Personnel costs healthcare workers € Assignable costs € Overhead € HVZ PV VP OB AB B accommodation Function PV: l washing Care Nursing VP: Wed Fri Sat Sun Thu l reserved procedures.Cliënt Care need Capital expenses & costs related to buildings Costs related to premises/buildings Depreciation Total costs Material & other costs Personnel costs Uniform cost units in keeping with Nzi schedule Organisation Regional determination Assessment Board of need (RIO) CAO Personnel costs healthcare workers AWBZ entitlement Food Hotel General costs Patient/residentrelated costs Personnel costs management and senior staff Personnel costs general and administrative staff Personnel costs hotel staff = Collective Labour Agreement Functions FWG Category (time unit) Healthcare workers 10 = Healthcare job evaluation system 9 8 7 25 25 6 20 hours 5 Costs per contract hour per level of expertise Direct client-facing hours Indirect client-facing hours l Sick leave l Leave/holidays l Training l Other work l Meetings l Personal time l Travelling time Share of client-facing hours per level of expertise Personnel costs healthcare workers per client-facing hour per level of Staff payroll department costs per contract hour productivity personnel costs 16 4 3 10 = healthcare workers per client-facing housekeeping personal care nursing support activities activation treatment accommodation 2 1 l Gross salary l Shift work bonus l Net social security Staff insourcing at group level costs per contract hour productivity personnel costs 0 contributions including savings premiums l Net pension personnel costs HVZ PV VP OB AB = healthcare workers per client-facing Insourcing healthcare factor workers per client-facing hour B V HVZ PV VP OB AB B V contributions l Holiday allowance l Overtime l Other allowances Contract hours Outsourced to third parties costs per contract hour productivity Activities Care delivered number of client-facing hours per function Max. Assistance OB: 7am to 11pm 11pm to 7 Weekly pattern l assisting clients with daily activities l assisting groups of clients. etc. etc. etc.

These organisations are found on the efficiency frontier.Benchmarkanalysemodel voor zorgbenchmarks $# Efficiency measurement entails more than cost price calculations per function per hour. we were able to use cluster classification. many home care providers offer comprehensive care packages that are very similar. Care providers to the left of this line are less efficient. These days. although its value has diminished over the years. One hour of care given to clients requiring highly complex care may be more expensive than an hour given to clients with less complex needs. Cost prices in organisations that focus on delivering housekeeping services care rather than nursing cannot be simply compared with organisations in which the situation is the reverse. This means that cost prices should not be indiscriminately compared with one another. It should also include a comparison of different organisations. for instance if more highly skilled staff are used.4. We calculated the cluster classifications used in the benchmarks with the aid of the DEA method. too. Here. We solved this problem of comparison in the benchmarks for nursing and care homes by grouping the organisations into clusters based on the complexity of 81 care. as shown in Figure 5. bearing in mind the differences in care complexity among clients. and subsequently to calculate which organisations offer most care per euro. DEA stands for Data Envelopment Analysis and is a method used to automatically calculate which clusters of care providers offer a similar product mix. Things have changed substantially since the days when family care providers offered a package that differed markedly from that offered by the home nursing organisations. Cluster classification was used mainly in measuring efficiency. . but the results of the client survey were also reported per cluster. The same problem was found in the home care benchmark study.

various benchmark studies The literature also recommends grouping similar organisations into clusters. This is also the case in other performance benchmarks.$$ Benchmarking in de zorg Number of hours of 150 14 11 15 housekeeping 12 services 13 per NLG 100 100 10 8 Efficiency frontier 9 50 7 5 Home care provider 4 6 3 0 1 50 2 100 150 Number of hours of care per NLG 100 Figure 5. 5.5 Quality of care Strategic analyses Input Environment Building blocks Financial performance Strategic themes Results Industry Resources Identification of best practices Quality of care Quality of the job Data measured Organisation Historical context Department/unit Explaining performance Benchmark strategic management information Healthcare benchmarks invariably consist of one or more quality building blocks in addition to the financial building block. a drawback would be that the influence of the product mix becomes less visible and that the benchmarking organisation may take inappropriate measures as a result. In our view. .4 Data Envelopment Analysis Source: PricewaterhouseCoopers. 82 Kaczmakre and Metcalfe suggest adjusting the outcomes for product mix differences as an alternative to calculating the outcomes per cluster.

but who are they to judge whether the intravenous drip was properly sterilised? All parties meanwhile agree that client assessments form an essential part of quality assessments and that quality involves more than just medical or technical nursing skills. we want to know what the client’s own assessment is. He or she (and in particular she) may notice whether the window sills have been thoroughly dusted. Client surveys used to assess care of the disabled are developed as part of the benchmarking exercise. but only if the client is truly incapable of answering any questions. What to do if clients are unable to judge the quality? Can we speak of a client’s judgement if the client himself or herself is unable to assess the situation. but to other stakeholders. We sometimes address additional questions to a client’s family. Measuring client assessments is indispensable in an industry in which clients can sometimes even switch to another care provider if they are not satisfied with the quality offered. such as that of a relative or carer? And who is to determine whether a client is competent to judge? If at all possible. but the principle is the same. they would argue. these . That is why client assessments have been included as a fixed parameter in the standards for responsible care referred to earlier. The benchmark participants indicate whether clients are competent to judge. What do clients know about the quality of care. In the existing nursing. And whereas 83 there are methods to measure the quality of care by observing clients . even if this means that individual interviews need to be conducted or that questionnaires need to be especially adapted. From the very beginning we have evaluated the quality of care using client assessments. care and home care benchmark this is done with the aid of the client surveys used to measure responsible care. Whilst we are aware of the drawbacks of this approach – do organisations assess the situation in a comparable fashion? – we see no alternative as things stand now.Benchmarkanalysemodel voor zorgbenchmarks $% The quality building blocks in healthcare benchmarking always represent the quality of care. not so much to the home care providers themselves or to industry associations. In other sectors this element of benchmarking may be referred to as the quality of service or the quality of the products. as in the case of clients suffering from dementia or the mentally disabled? May we then replace the client assessment by someone else’s assessment. We had to defend this choice at first.

Written surveys are easy to conduct. and for individual interviews in situations where response to a written questionnaire would probably be too low (residents of nursing and care homes and the mentally disabled). we should keep a close tab on developments in order to further improve this aspect of healthcare benchmarking. Where possible. but the danger of this is that clients influence each other. That said. Telephone interviews are not as time-consuming (no travelling time for the interviewer. we have therefore opted for written questionnaires in our healthcare benchmarks (home care clients. Interviews or questionnaires? Effective measurement of client assessments is a tricky business. On top of that. even if the clients are fully capable of stating their views. family). the quality of the interviewers is continuously monitored. and then only on a limited scale and in addition to the written questionnaires. interviewing is extremely time-consuming and costly. We only use an Internet tool for relatives of clients. care and home care. Whilst this drawback does not exist when clients are interviewed. We have so far not used Internet surveys in benchmarking care of the disabled or nursing. . Costs can be reduced by interviewing clients in groups. no time needed to introduce the interviewer). a danger of face-to-face interviews is that the client may be influenced by the interviewer. Each method seems to have its pros and cons. and they are time-consuming and expensive. but here we find that clients are not always keen to be interviewed by telephone. To ensure that the interview results are as objective as possible and comparable.$& Benchmarking in de zorg methods also make use of third parties (observers). as we assume that many clients in these sectors would have problems taking part. but the drawback is that the questionnaire could be completed by someone other than the client.

and developing user-friendly survey tools should no longer be a problem. for example. In the housing corporation benchmark. but we should at least ensure that the costs of written questionnaires or face-to-face interviews do not deter us from taking stock of clients’ opinions at regular intervals.5 Log-on screen client survey housing corporation benchmark Source: Pilot housing corporation benchmark . Figure 5.5). The pilot for this benchmark was recently completed. It may be somewhat crude to say that there is a bright future for digital client surveys. Clients and their families alike are becoming increasingly comfortable with the Internet. and software is currently being developed for the disabled to go online. as personal contact will always be the preferred choice. Providing group instructions on location in nursing and care homes could enable residents to subsequently complete the questionnaires by themselves. the entire client survey was digitised (see Figure 5.Benchmarkanalysemodel voor zorgbenchmarks $' The Internet The possible uses of web surveys need to be explored further.

In some cases. The indicators used included medication and fall incidents. as one would think that aspects such as bedsores or being strapped down would influence a client’s opinion.% Benchmarking in de zorg Content of client survey Whereas the contents of client surveys are tailored to the specific industry being benchmarked. showing respect) and the actual care provided (such as the quality of personal care. Surveys can usually be divided into questions relating directly to the care provided and questions relating to how the care is organised. In the most recent nursing. • Staff always comply with arrangements made. meals or daily activities). for example. RAI. Examples of statements used in the client survey of the nursing and care home benchmark: • The personal grooming I receive here is very good. • If I ask my carers to do something. carers themselves are also asked to give their opinions on the quality of care. Despite the importance of client assessments to quality measurement. • Residents receive adequate assistance when they need to go to the toilet. We expected to find a clear correlation between the client’s assessment and the benchmarking organisation’s score on care-specific quality indicators. they reflect how clients experience the care they receive. But our analyses consistently found either no . • My care plan was drawn up in consultation with me. In short. The nursing and care home benchmarks. The first type of questions addresses the way in which care providers treat clients (listening to clients. they sometimes give me the feeling that I am a burden. some benchmarks also use other methods to measure the quality of care. These scores are based on multiple measurements of the quality of care: a client survey and an evaluation by the Dutch Healthcare Inspectorate. use care-specific 84 quality indicators based on the Resident Assessment Instrument. The second type includes questions such as whether the number of staff suffices and whether the healthcare providers are easily accessible. some issues feature in all surveys. • I’m happy with the way they use the hoist. care and home care benchmark the quality of care is derived from how benchmark participants score on the standards for responsible care.

namely with a questionnaire directed solely at management. was far too large for the two to be correlated. The debate about the role of quality control has been superseded by more recent insights into the factors that make an organisation successful. We will come back to this in more detail in Section 8. The number of organisations with a high client assessment score and a low score on care-specific indicators. Shortly after we began benchmarking. but when choosing a care provider other aspects appear to play an important role. Watson calls this a basic requirement: clients only consider it to be a dimension of quality if care providers fail to deliver. or vice versa. either general or specific. . They look at the way in which they are treated by their carers and take their professional expertise for granted. It is useful to take note of the fact that the care-specific indicators are not what clients themselves find important.Benchmarkanalysemodel voor zorgbenchmarks % correlation or only a weak one. Might management paint too rosy a picture of the degree to which paper measures are applied in practice? We never found out. This suggests that these two indicators measure quite different aspects of quality. The only correlation found was between the use of quality control systems and care-specific indicators in the 2003 nursing and care home benchmark study. In other words: do organisations that have quality procedures in place offer better quality than those that don’t? Analysis has shown that there was no correlation here either. Clients see a serious medical error as a severe lack of quality. we saw the application of a quality control system as a measure of the quality of care. We asked ourselves whether this could be explained by the way in which quality control was surveyed.

As these surveys were not developed specifically for benchmarking purposes. the possibility of conducting surveys via the Internet has. in 2007. As employees are the most important production factor in the healthcare industry. of course. The survey findings have consistently yielded practical recommendations and were in many respects correlated with other building blocks. for instance in surveys in which total scores could not be calculated. usually written ones. Employee surveys make use of questionnaires. such as wage levels. Despite these drawbacks. been addressed.% Benchmarking in de zorg 5.6 Quality of the job Strategic analyses Input Environment Building blocks Financial performance Strategic themes Results Industry Resources Identification of best practices Quality of care Quality of the job Data measured Organisation Historical context Department/unit Explaining performance Benchmark strategic management information The second quality-related building block in most benchmarks is ‘quality of the job’. we used existing employee surveys to measure the quality of the job. or where the scores for different themes could not be compared with one another. yet so far benchmarking participants have indicated that not all . questions of this kind did not measure an organisation’s performance. it is crucial that the organisation for which they work is a good employer. Participants in the most recent healthcare benchmarks have decided to include employee surveys as an integral part of the benchmarking exercise from the very start. Another drawback was that some of these surveys included questions about aspects of quality on which individual organisations had no influence. This could create problems. employee surveys have always proved to be very useful in benchmarking. In the early days of benchmarking. which is in fact the essence of benchmarking. Now. they were not always fully aligned with the other benchmarking dimensions. In other words.

care and home care benchmark also offers participants the option of completing either a paper version or an online version. have an added value for society. by the same token. the pilot housing corporation benchmark has opted for an Internet version. The most recent nursing. This building block addresses questions such as: ‘How do we work in conjunction with other organisations?’. This easy-to-follow approach encourages them to take part in the – much cheaper – Internet survey. . 5. In other words: quality in the eyes of society. ‘How do we relate to our financial backers?’ and ‘How do we contribute to the employment situation in our region?’ The social dimension of quality is grounded in the belief that organisations in general and publicly funded organisations in particular should act like socially responsible businesses and should.Benchmarkanalysemodel voor zorgbenchmarks %! their employees are able to complete an online survey. employees are provided with a log-on code giving them direct access to a questionnaire. either at work or at home. In a standard introductory letter. The social responsibility building block has been used twice in healthcare benchmarks: once in the 2002 home care benchmark study and once in the pilot benchmark for healthcare administration agencies. By contrast.7 Social responsibility Some benchmarks have included a third dimension of quality (not included in the figure): the quality of the care provider as a player amid its stakeholders. additionally offering employees the possibility to request a paper version. which we will refer to here as ‘social responsibility’.

questions on efforts to deliver good ‘chain care’ produced much higher scores than those on the degree to which organisations informed stakeholders about their policies. The pilot benchmark for healthcare administration agencies did yield information about individual agencies. it might mean that the organisation is a tough negotiator when it comes to production targets. We were therefore unable to calculate a social responsibility score for individual organisations. client organisations and municipal authorities. a national analysis (1. for instance.%" Benchmarking in de zorg Examples of statements and questions in the social responsibility building block are: • The home care provider always does what has been agreed (five answer 85 categories). it tends to indicate that something else is the matter. The questionnaire made a distinction between different types of stakeholders. other healthcare providers. • What is the average number of days between the actual admission of the client and the own contribution as paid by the client? (Facts)86 • What percentage of applications for the awarding of government grants was submitted on time? (Facts)87 This building block appeared to be difficult to measure in the home care benchmark study. Our initial idea was to ask stakeholders to complete a written questionnaire comprising questions that would reflect the care provider’s social responsibility. While the outcomes yield more valuable strategic management information if a distinction is made between different types of stakeholders. such as financial backers. • The healthcare admfinistration agency is well-informed about specific regional circumstances. It . some stakeholders decided not to respond at all and others informed us that they would prefer to have a bilateral meeting with the care provider rather than completing a questionnaire. For example. As this obviously compromises the anonymity of the answers. a financial backer is somewhat negative about an organisation’s performance in this respect. breaking them down into categories may result in a very small number of stakeholders per category – in some cases a mere handful.312 useable questionnaires) indicated that respondents had substantially more positive views about some issues than about others. or even no more than one. but brought to light another issue. And if a client representative body is somewhat negative about the care provider. However. If.

The healthcare administration agencies themselves felt there was a big chance that this would yield the same findings. That said.8 Relationship between building blocks Strategic analyses Input Environment Building blocks Financial performance Strategic themes Results Industry Resources Identification of best practices Quality of care Quality of the job Data measured Organisation Historical context Department/unit Explaining performance Benchmark strategic management information Once the results of the individual building blocks in a benchmark study are known. And so it was decided in both cases not to include social responsibility in future benchmarks. and the vocational education sector has also decided to include this building block in their next benchmarking exercise. the question that arose was what added value there would be in including these same indicators when launching the benchmark country-wide. In other words: the area for learning had already been identified.Benchmarkanalysemodel voor zorgbenchmarks %# showed quite clearly that healthcare administration agencies scored less well on a number of indicators. Whereas some of the correlations found are weak and merely indicative. The housing corporation benchmark has included social responsibility as a building block. . others are significant and have been confirmed by regression analysis. Are employees with a higher average salary more positive about their jobs than employees with a lower salary? If clients rate the way in which they are treated by staff poorly. as these items were the same for all healthcare administration agencies in the pilot. the notion that society’s views matter has not been discarded. could that be explained by the fact that the organisation employs many low-skilled workers? Analysis of the relationships between the building blocks has yielded a wealth of possible relationships. and are consistent over time and across industries. However. 5. the relationships between the various building blocks can be analysed.

the logical choice would seem to be to compare oneself with the organisations that score best on this item. As a rule.9 Best practices Strategic analyses Input Environment Building blocks Financial performance Strategic themes Results Industry Resources Identification of best practices Quality of care Quality of the job Data measured Organisation Historical context Department/unit Explaining performance Benchmark strategic management information The main question that arises after the analysis has been performed is who the benchmarked organisations should try to emulate. Some benchmarks showed a relationship between good financial performance and good quality of care. Similarly. Whereas almost all best performers score well on . as organisations are not one-dimensional. Efficiency and quality are a good match. 5. it would seem logical for it to emulate the organisations that perform best in the client survey. if it’s an organisation’s efficiency that needs to be improved. If a healthcare provider wishes to improve the quality of care. but this was. but need not necessarily go hand in hand. one can say that organisations that score well on quality of care also score well on quality of the job. however. Could it be that a very high score on one building block takes its toll on other building blocks? We found that this was indeed the case. admittedly. healthy staff (the quality indicators measured related mainly to matters of health) may be expected to contribute to financially sound business operations. not always the case. This would suggest that people feel better if they work for organisations that perform well financially. We found a relationship between the organisation’s financial performance and several aspects of quality of the job. Conversely.%$ Benchmarking in de zorg Clients were found to be more positive about the quality of care than were healthcare staff (and more positive than the clients’ contact at the care provider). they do not appear to hold true. Yet however logical these conclusions may seem.

but only the care providers given in light blue also performed well in the employee survey. 8.5 Total score financial survey Figure 5. The organisations in the top right corner of the figure score well on the financial building block and in the client survey.5 9. In our reports.5 8.6 Matrix of total scores home care benchmark Source: 2004 home care benchmark . An exceptionally high level of efficiency tends to go hand in hand with middle-of-the-road quality.4 8.6). providing them with a good overall score.5 6. because they score less well in the employee survey. they tend not to rank among the very best. and vice versa. One could compare it to all-round speedskating championships: the all-round winner is usually found among the top five for most of the individual events. The care providers given in light blue and black are not.Benchmarkanalysemodel voor zorgbenchmarks %% individual building blocks.6 Client assessment 8.5 5. In short.0 7. True best practices are found in organisations that have succeeded in striking a balance between the building blocks. best practices are determined by the total score. The matrix has three dimensions. The blue organisations in the top right corner (BP) are best-practice organisations. but need not win over every single distance in order to become the all-round champion. as it uses colours in addition to showing the position. we usually visualise this in a matrix presenting the different scores (see Figure 5.8 4.2 8.5 7.

This line is arbitrary by definition. the line is drawn for pragmatic reasons. In some cases there is a natural divide. An organisation may never.%& Benchmarking in de zorg We consider the above method of identifying best practices to be a very effective one. We have. the identification of best practices has not always been given the highest priority. . Identifying best practices using the above method also has a drawback: it is only effective with a sufficiently large group of participants. however. And the risk of making comparisons with high-scoring organisations per building block is that the approach becomes too one-sided. We expect that the theory of performance excellence (see Section 8) will offer a solution to this in the future. for example. for example because the group of best practices should be neither too big (as participants would no longer be able to set themselves apart). Once we know which aspects of an organisation lead to performance excellence. Ambitious organisations that compare their performance with ‘good practices’. nor too small (in which case the make-up of the group would likely be too one-sided). always set certain limiting conditions. And the greater the number of building blocks included in the exercise. as benchmarks are not based on absolute standards. for example. the more participants are needed for there to be enough organisations that score well on all building blocks. we would advise participants always to carefully consider whether it would be wise to discard best practices. however. From a research perspective. Given this drawback. It is also for this reason that it has been suggested to make do with ‘good practices’ (a broader group of good performers) or to work with the best-scoring organisations per building block after all. run the risk of not achieving their full potential. be identified as a best practice if it scores below average on a particular building block. it is easier to determine whom to emulate in order to achieve the desired level. In the early years of benchmarking. there was considerable debate about the question of where to draw the line. More often. for instance where a group of organisations scored high and the next best group scored substantially lower.

In short.10 Explaining performance Strategic analyses Input Environment Building blocks Financial performance Strategic themes Results Industry Resources Identification of best practices Quality of care Quality of the job Data measured Organisation Historical context Department/unit Explaining performance Benchmark strategic management information ‘Explaining performance’ (or business operations or process management. or whatever you want to call it) features at the bottom of the model discussed in this section. . it will want to know how these others have succeeded in doing so. actual performance was not usually measured. an organisation may need to gain insight into the business operations of other organisations in order to know what actions for improvement are needed. This information may be enough for an organisation to take appropriate measures. Performance can be explained in part by aspects of performance or facts that feature in the same building block. An example would be the finding that clients are not very positive because they feel they are not sufficiently consulted in the development of their care plan.Benchmarkanalysemodel voor zorgbenchmarks %' 5. Are clients’ opinions about the quality of meals related to the cooking method used? Do care providers that perform well financially have access to more extensive management information than others? Is it true that best-practice organisations are less hierarchical than others? A related issue. this part of the model addresses the question of how the organisation has achieved its financial and quality performance. it will want to know how their performance can be explained. Our rationale for doing so was as follows: if an organisation measures its performance against that of others and concludes that they are performing better. In other words. although not strictly an aspect of business operations. Whereas taking stock of business operations has been a part of healthcare benchmarking for years. Conversely. is the question of whether cultural differences could explain differences in performance.

until we found out that the correlation between a limited set of indicators and poor performance was just as strong. drawing from the most recent literature or yet another round of interviews with well performing organisations. But this gave rise to yet another issue: if we know which factors explain good performance. indicators were developed and tested. The expected relationships between performance and processes failed to materialise. Blood. we did manage to bring to light a number of relationships over the years (see box). organisational set-up. the pilot benchmark addressing care of the disabled and the pilot benchmark for healthcare administration agencies have all made efforts to gain better insight into business operations and culture. to put it more accurately. etc. financial policy. The findings showed that good performers did not all have the same pattern of business operations.& Benchmarking in de zorg The home care benchmark study. The questionnaires used for this purpose range from a small set of questions about aspects considered to be promising to hundreds of questions addressing such issues as healthcare processes. For weren’t we all determined to find an explanation for good performance? New questions were invented and reinvented. to complete them. facility management. In one of the benchmark studies. or certain processes were found to be linked not only to performance excellence but also to exceptionally bad performance. correlations found in one benchmarking study did not reappear in the next. what reason would there be to continue to investigate this? . the benchmark studies in nursing and care. Or. that the results did not yield a clear picture. no doubt. for example. The expertise of specialist agencies was called in. Did this warrant the conclusion that a focus on the big picture and management that rises above the detail lead to better performance? We thought so. hypotheses were formulated. sweat and tears have been put into the questionnaires. And so it was rather frustrating to have to conclude that the results of these efforts were meagre. for a while. not only to design them but also. a significant positive correlation was found between use of a relatively limited amount of management information and good performance. It was only the organisations halfway down the league table that used more extensive information. Still.

it does not appear to negatively affect their work enjoyment and health. • Compliance with arrangements made is an important factor in clients’ overall assessment of the quality of care.89 • The bigger the healthcare teams. Whereas this increases the pressures at work. . the less positive the client 90 assessment. clients are negative about compliance with arrangements made.92 A finding in all benchmarks was that strong guidance and commitment by middle management is related to positive client assessment. the more negative clients are. they report sick more often (which in turn has a negative effect on the organisation’s financial performance). • If staff are negative about the planning system. • The bigger the healthcare teams. by which we mean planning the arrangements made by healthcare staff and clients. • The more insight one has into the cost price. The presence of shared values was also found to be crucial. positive staff assessment and often also to good financial performance. centralised or decentralised. We found the following: • If staff are negative about the planning system. We found the best example of correlations in the 2004 home care benchmark in the area of planning. Whereas the employee survey included questions about the employee’s opinions on planning and rostering. • A decentralised planning system is correlated with higher staff productivity. The process management building block inquired into the planning system – whether it was manual or automated. the less cost-cutting takes 91 place in care and services. the less likely it is that clients will be consulted in drawing up a care plan. This might be explained by the fact that it is easier for staff to respond to unforeseen circumstances. the client survey inquired whether the arrangements made were complied with and how clients felt about this. • The more volunteers.Benchmarkanalysemodel voor zorgbenchmarks & Examples of relationships between performance and business operations Organisations with a limited number of management layers score better 88 on quality of the job.

care and home care) decided to introduce the building block ‘innovation’. enabling benchmark partners to learn from each other in this area too. for example. As discussed in Section 8. Not only do these new insights shed a fresh light on the factors that affect good performance. We shall discuss these insights in more detail in Section 8. fact-based factors from the very start. ActiZ (the Dutch association for nursing. in calculating the scores for the quality of care we adjusted for a number of client characteristics. they also offer the possibility to identify the degree to which these factors are at play. This enables us to determine how far advanced an organisation is on the road to performance excellence. This enabled us to chart client assessments by region and to examine whether there were any differences between urban and rural areas. it will be more difficult to win a positive client assessment than in a rural village. care and home care benchmark. ActiZ sees the capacity to innovate as a crucial factor for success and survival. the debate about business operations has changed course as a result of new insights into performance excellence. In our healthcare benchmarks we have taken account of external. particularly if the organisations benchmarked have no influence on these factors. 5. An organisation located in an area known to be unsafe. this notion is corroborated in the contemporary literature.11 Innovation In developing the new nursing. will have to spend more money on security than one located in a relatively safe area. The influence of external factors One of the questions that should be addressed in a benchmarking exercise is what influence external factors have on the performance of benchmark participants.& Benchmarking in de zorg As far as we are concerned. We are currently co-designing an instrument to measure the innovative power of organisations. Or. in a city known for its critical and articulate residents. We did this by making various cross-sections (if there were a large enough number of participants). Further. such . There would be no point in emulating an organisation that succeeded in performing well merely because its external circumstances were more favourable. the reason being that it is felt to be ‘unfair’ not to take account of circumstantial factors.

in later benchmarks. when comparing aspects of performance on which the organisation has no influence. this relationship became more diffuse. more human scale. It therefore makes sense to take account of these factors when comparing organisations – at least. both in financial terms and in terms of the quality delivered. large organisations might do well to organise their operations on a smaller. However. persuasion and type of organisation. And the good scores found among organisations with a distinct religious persuasion remind us of the discussion about strategic positioning: clear strategic choices that are also implemented in practice seem to be conducive to good performance. The analyses of the circumstantial factors yielded a number of interesting findings: • In most benchmarks we found a relationship between the size of the organisation and its performance: as a rule. • Organisations in urban areas score almost consistently lower on all building blocks than organisations in rural areas. the ‘external’ factors found to be related to performance were the level of urbanisation and the size. sex and whether or not the client was a resident of a care home or 93 nursing home. however. • Clients of care homes rate the quality of care more positively than clients of nursing homes. for example. This does not mean. This could be a valuable finding for all organisations. we can say that small organisations (stand-alone or part of a group) performed better. • Organisations of a particular religious persuasion perform better than others. or even disappeared altogether. even if the number of organisations with a specific persuasion is on the decline. . In summary. If small organisations are found to perform better. that nothing can be learned from unlike organisations.Benchmarkanalysemodel voor zorgbenchmarks &! as age.

and it is therefore important that this information is available. care and home care (VVT). In practice.&" Benchmarking in de zorg 5. however. the allocation of resources to locations and/or departments does influence the performance of the organisational entity concerned. are reported at various levels. query the value of the finding that the average score in the employee survey is 7 if two of the individual organisations score no higher than 5. Healthcare benchmarks therefore usually offer the possibility of reporting at a lower level. enabling . Does it refer to a group comprising several locations or to an individual location? And can a large group of organisations be compared with a foundation or trust that runs a single. In practice. especially not for large organisations. In this case. the data measured and analysed. The discussion tends to focus on financial data: participants sometimes indicate that it is difficult or even impossible to break down data by location or department. The central level is that of the benchmarked organisation itself. Comparing information at the aggregate level is not always useful. A special application has been developed for the most recent benchmarks for care of the disabled and for nursing. One might. for example. the organisation would probably want to have information per location. small organisation? These are questions that need to be answered prior to the benchmark survey to avoid disappointment afterwards.12 Reporting results Strategic analyses Input Environment Building blocks Financial performance Strategic themes Results Industry Resources Identification of best practices Quality of care Quality of the job Data measured Organisation Historical context Department/unit Explaining performance Benchmark strategic management information The right side of the benchmark model shows that the benchmark outcomes. or maybe at unit level. at least for the quality tools. defining this level may not be as simple as it might seem.

The data presented in these generic reports cannot be attributed to individual healthcare providers. we made a habit of reporting aggregate results to the healthcare industry as a whole. nor should they be. for instance. ActiZ is developing an application to benchmark individual products in the nursing. can be identified. or relationships between performance and business operations. The finding that a positive assessment by clients of their daily activities is related to their carers’ positive assessment of their work. Conclusions about individual care providers can only be drawn for very large organisations that have access to data specified per organisational entity. or at least to all participants. but this is usually not possible. . The industry level is also the level at which relationships between the building blocks.Benchmarkanalysemodel voor zorgbenchmarks &# participants to specify their organisational set-up and to indicate the desired level of reporting (which in turn determines the level at which they need to provide the data). Individual benchmarking participants tend to expect that industry-wide reporting will yield an ‘integral analysis’ of their own organisation. of course. From the very start. This is not always clear to all parties concerned. Individual benchmarking participants and their sub-entities are usually not the only parties to receive a report on the benchmarking results. is only statistically valid if the number of observations is sufficiently large. preferably comparing the outcomes with those found in previous benchmarking exercises. The purpose of generic reporting is to present the state of play in the industry as a whole. care and home care benchmark. Generic industry reports contribute to the transparency of the industry concerned and are often widely distributed.

Benchmarking is therefore a cyclical process. . the care provider may decide to make adjustments to its strategy or to existing solutions addressing strategic issues. or it may decide to employ its people and resources in a different way.13 Benchmark strategic management information Strategic analyses Input Environment Building blocks Financial performance Strategic themes Results Industry Resources Identification of best practices Quality of care Quality of the job Data measured Organisation Historical context Department/unit Explaining performance Benchmark strategic management information The last dimension of the benchmark model – presented at the bottom of the figure – shows that participants can use the benchmark results as strategic management information for their own organisation.&$ Benchmarking in de zorg 5. If improvements are needed. This information will enable participants to consider possible actions for improvement.

the analytical models and the parameters (funding. external data collection and analysis. internal data collection and analysis. De Vries and Van der Togt are in favour of a cyclical model comprising four main steps – preparation. data collection and analysis. Some authors cite the Deming Cycle: plan. research. These 97 phases are made up of a total of twenty activities. and aftercare 95 – which in turn are subdivided into a total of nine substeps. find benchmarking partners and develop models and instruments • an implementation phase during which data are gathered. A review of the literature reveals general agreement that benchmarking should at least 94 include preparation.6 The step-by-step benchmarking process This section looks into the different phases of benchmarking. planning and implementation of actions. proposal for improvement and implementation of improvements. .). the indicators to be compared. And Keehley has designed an eleven-step plan. analysis and 98 action. engaging third parties. check and act (improve). 6. do (research). While some researchers present more extensive step-by-step plans than others. The authors begin by describing a design phase in which the partners discuss the purpose of the benchmarking exercise.1 Benchmarking phases in the public sector We shall first address the phases described in Benchmarking in de publieke sector. etc. Bullivant’s approach consists of twelve steps divided into three phases: planning. and follow-up. they all distinguish: • a preparation phase to identify what to benchmark. Harrington and 96 Harrington divide the benchmarking process into five phases: planning. validated and analysed • a phase during which findings are reported and discussed Most researchers see benchmarking as a cyclical process in which benchmark participants use their findings to implement improvements and subsequently to measure the effects via a new benchmarking exercise. publications.

For example. offers a framework that encompasses the organisation’s performance. The second activity during the design phase is the identification of the indicators against which benchmarking takes place. The parameters include such things as the research costs and how these costs are to be divided. as well as organisational parameters concerning leadership. organisation of the primary process. The first is to identify goals and parameters. When identifying the goals. designed by the Dutch Quality Institute. HRM. whether or not to engage third parties during the implementation phase. the following elements should be addressed: learning versus accountability. The last phase in the benchmark process is follow-up. The preparation phase is followed by an implementation phase consisting of data gathering. analysis and reporting. government bodies) and the manner in which the findings should be 99 disclosed. the much-used INK management model referred to earlier. . regulators.&& Benchmarking in Dutch healthcare The design phase consists of three activities. the role of other stakeholders (financial backers. The third activity during the design phase entails designing a comparative and 101 explanatory model to determine the depth of analysis. client assessments. 100 strategy and policy. employee assessments and the views of society at large. and resource organisation. A common framework has proven to be a good starting point for a discussion of the indicators to be selected. and a general description of the indicators to be compared. an integral approach versus a partial approach. when participants are supported in their efforts to work towards improvement and the benchmark is evaluated.

His benchmarking process consists of eleven steps and starts only after an organisation has selected a process. Of course. see Section 7).2 Keehley’s step-by-step plan 102 The most extensive benchmarking method is Keehley’s step-by-step plan. It is judicious to prioritise the processes to be 103 benchmarked using explicit criteria. service or function to be benchmarked. We would also make a case for addressing an organisation’s performance in the broadest sense. ‘Do we have a learning organisation culture. . It should address questions such as ‘How experienced are we in benchmarking?’. PricewaterhouseCoopers’ perspective We would argue in favour of opting for an integrated benchmark rather than selecting individual processes to be benchmarked.The step-by-step benchmarking process &' 6. An example would be a benchmark of an organisation’s treasury activities (for further details. or do we suffer from We Are Different or Not Invented Here syndromes?’ Other factors are the strategic importance of the activities to be benchmarked and possible external pressure from clients. the organisation should take account of the degree to which it is ‘ready’ for benchmarking. Selecting one or a few issues is only useful if these issues exist more or less in isolation and are barely related to other issues. competitors or politicians. We shall look into his step-by-step plan and give PricewaterhouseCoopers’ views on each phase. Step 0: Select the processes to be benchmarked Keehley sees the selection of the processes to be benchmarked as the very first step. In doing this. product. which should not be taken lightly.


Benchmarking in Dutch healthcare

this would also entail a selection, as it is not possible to include all aspects of an organisation’s performance in a benchmarking exercise. A selection that is made as part of an integral approach is different, however, in the sense that it entails a selection of an organisation’s core activities, or the (temporary) exclusion of issues for which information is lacking.

Step 1: Determine the purpose and scope
After having selected what is to be benchmarked, Keehley’s first step in the actual benchmarking process is to determine the purpose and scope of the project. The organisation concerned should explain why they want to benchmark and what they hope to achieve. It is crucial that all parties involved discuss and document the reasons for benchmarking and the expected results. They should also meticulously define the scope of the project in terms of breadth, depth, time and resources, so that participants do not lose sight of the original brief during the course of the exercise. If one fails to do so, there is a danger of taking on board other processes that are also worth examining, and before you know it the project becomes unmanageable and loses its focus. The purpose and scope of a project can be recorded in a document that will serve as a contract between the organisation benchmarked and the benchmarking team. PricewaterhouseCoopers’ perspective In our view, two aspects should be distinguished: involving all relevant actors in the organisation, and staying focused. The first element is addressed in more detail elsewhere in this report. As for remaining focused, we share Keehley’s view that this should feature prominently during the preparations. It is vital that the scope of the project is spelled out in a plan of action, for instance, as well as in agreements with individual participants. Only then is it possible to keep control of the process and avoid disappointment afterwards. This does not mean, of course, that we should stick rigidly to what has been agreed at the start of the benchmarking exercise. Each benchmarking study should be a voyage of discovery for the participants and for ourselves. The need to incorporate innovations in a benchmark or to respond to changing legislation means that there will always be things that have to be reinvented. In such cases parties would do well to give each other the space to discard categorisation that has no basis in reality, say, or to re-examine that ‘one and only’ promising explanatory factor one more time.

The step-by-step benchmarking process


Step 2: Know yourself, or: understand your own processes
Keehley underlines the need to make adequate preparations and take stock of the existing situation. When this step has been completed, the benchmark team will have a detailed flowchart of its own processes and some idea of potential performance indicators, bottlenecks and solutions. This introspection will yield a project plan for the remaining benchmark project. The plan details what needs to be done when, and by whom. PricewaterhouseCoopers’ perspective As the healthcare industry tends to use ready-made instruments, some of the work in this step does not always need to be done. Even so, it is important in any organisation that a dedicated team focus on the benchmarking exercise so that the organisation will not be taken by surprise when it is asked to provide information or, at a later stage, by the benchmark findings themselves.

Step 3: Search for potential benchmarking partners
This step begins with the organisation drawing up an extensive list of potential partners. Important sources of information are research literature, personal contacts, benchmark databases (such as the Benchmarking Exchange of the American Society for Quality Control) and the services of consultants specialised in benchmarking. The initial list will then be trimmed using carefully chosen selection criteria, leaving only the ‘perfect few’. One would, for example, need to check whether a potential partner actually has a best practice and whether the best practice is not only proven to be successful, but is also transferable and repeatable in the sense that it is not linked to unique 104 circumstances. Other criteria should also be borne in mind, such as the degree to which the potential partner is similar to, or differs from, one’s own organisation. Organisations initiating a benchmarking study for the first time would do well to restrict the exercise to internal or similar partners. As a rule, one could say that the more experienced an organisation is in benchmarking, the more capable it will be of benchmarking against unlike partners. Note that real breakthroughs are most likely to happen when unlike partners are compared.


Benchmarking in Dutch healthcare

PricewaterhouseCoopers’ perspective In healthcare benchmarking the search for partners is usually initiated by the industry associations: they announce to their members when the next benchmark study is due to take place, provide information and recruit. By contrast, the benchmarking exercise for housing corporations opted for the establishment of an independent benchmarking body charged with recruiting the benchmarking partners.

Step 4: Select indicators
This step comprises selecting the performance indicators to be used to compare one’s own benchmarked processes with the performance of the benchmarking partners. As was the case with the selection of potential partners, a list of all possible indicators is drawn up and subsequently pared down to manageable proportions. The indicators should at least cover the most important steps in the benchmarked process, as well as the key influencing variables. In selecting the performance indicators, the benchmarking team should align the organisation’s own information needs with practical considerations such as cost, time and the willingness of the partners to gather information – for they are not endlessly willing to do so.

PricewaterhouseCoopers’ perspective
In healthcare benchmarking, selecting indicators is a process that necessarily involves only a representative selection of the participants (brought together in a sounding-board group). Whereas we agree that the list of indicators should stay manageable, it should not become too short. Organisations truly striving to launch actions for improvement should not expect to be able to identify such actions on the basis of rough outcomes. Too small a number of indicators could lead to disappointment (‘We already know this’) or to distorted findings (‘We didn’t take into account the outcomes that were not AWBZ-related because that would have been too cumbersome’).

Step 5: Collect internal information
An organisation needs to review its own performance before asking other organisations to provide information about how they are performing. In this step, the performance indicators selected in the previous step are therefore applied to the process, function or service benchmarked.

deepen knowledge and understanding of the findings. If the best-performing organisations also meet the other selection criteria. assuming that it has access to a database of recent comparative data. Step 6: Collect information about partner organisations Information about performance indicators is gathered about every organisation included in the list of selected potential partners. in the sense that it is important for organisations to closely examine their own processes. The performance of one’s own organisation is then compared with the information about the potential partner organisations. Such comparisons will show up which partners perform significantly better than others. Keehley suggests that. each organisation should be able to decide for itself when it wishes to embark on a benchmarking exercise. data collection and data analysis in healthcare benchmarking are coordinated and carried out by research agencies in consultation with the principal and the sounding-board groups. Going on excursions ‘just to kick the tires’ adds very little value. Step 7: Analyse the gap In the previous steps. Additional information can be gathered with the aid of questionnaires or telephone interviews.The step-by-step benchmarking process '! PricewaterhouseCoopers’ perspective In healthcare benchmarking. or – better still – by visiting the partner organisations. We fully support Keehley’s idea. that they are willing to participate and to assist the benchmarking organisation in the more laborious data collection phase that follows. the benchmarking participant gathered information about its own organisation as well as provisional data about potential partners and additional. PricewaterhouseCoopers’ perspective As things stand now. the more comprehensive the information provided. information relating to all participating organisations is requested simultaneously to allow for comparison. Participants in healthcare benchmarks typically come into contact with each other later on in the process – usually after the results have been released. Such site visits. the better the comparison will be. Step 7 . of course. in future. in-depth information about the actual partners. provided they are well-prepared. they will become the benchmarking partners – provided. Whereas the comparisons do not need to be exhaustive in every respect.

• Take a systematic approach to selecting recommendations. There is a danger of getting bogged down in peripheral details. or at least with an invitation to take part in such workshops. That is why healthcare benchmarking exercises are often concluded with a series of workshops involving a limited number of participants. The start of this phase is the development of an action or implementation plan that provides an answer to the key questions: who does what and when? Resistance to change can be reduced by ensuring the active participation of all parties concerned in the implementation process. you will be more likely to detect opportunities for fundamental improvement of your own processes. 105 PricewaterhouseCoopers’ perspective The participants in healthcare benchmarking are given feedback on these analyses. These data allow the benchmarking team to identify performance gaps and best practices. • Keep an open mind for the unexpected. The analyses have a quantitative and a qualitative component. Cloning the practices and processes of model organisations can have dramatic repercussions that widen the performance gap rather than narrowing it. Select various excellent practices on the basis of explicit criteria. But the real pointers for improvement only come to the fore in consultations with other organisations. that simply adopting a best practice will automatically lead to the desired results. the main focus of the project is to incorporate opportunities for improvement. If you focus on the differences and exceptions. One should not assume. Keehley argues that a copycat approach tends to result from superficial internal research and anecdotal 106 external research. a qualitative analysis is performed to answer key questions such as: what does the benchmark partner do in the same way as we do. and what does it do differently? Keehley gives the following tips: • Avoid paralysis by overanalysing. . however. After a performance gap has been quantitatively identified. also referred to as stop-and-shop benchmarking.'" Benchmarking in Dutch healthcare addresses the latter category of information. Step 8: Implement measures to close the gap Once best practices have been identified and selected and management has adopted the recommendations of the benchmarking team.

The step-by-step benchmarking process '# PricewaterhouseCoopers’ perspective Strictly speaking. It is therefore important that. Do the performance indicators reflect improvements? PricewaterhouseCoopers’ perspective See step 11. Step 11: Restart search for best practices Benchmarking is an ongoing process. during the implementation phase and the initial stages of change. Step 9: Monitor the results Improvement measures need special care and attention shortly after they have been introduced. familiar practices. the search . implementing actions for improvement may not be a part of the benchmarking process. As soon as a change has been implemented. Actions for improvement that fit logically into the planning and monitoring cycle are more likely to become firmly embedded in the organisation than actions taken independently of the cycle. as the introduction of new practices often changes the organisation and the environment in which it operates is not static. and as organisations gain more benchmarking experience. there is scope for further improvement. the participants monitor whether the measure is being introduced as planned and whether the desired results are being achieved in practice. to have access to the benchmarking findings when the budget and policy plans are being drawn up. it may well be the most important phase for the organisations concerned. Perfect organisations do not exist. Appropriate timing of actions for improvement is just as important as a sound approach. PricewaterhouseCoopers’ perspective See step 11. That said. of course. Re-evaluating the process essentially means carrying out a mini-benchmarking exercise to measure the effects of the changes implemented on everyday practice and on progress towards the goals set. Organisations are always at risk of slipping back into old. Step 10: Adapt based on the results The benchmarking process needs to be adapted from time to time. It is more efficient.

6.1.'$ Benchmarking in Dutch healthcare for best practices can be extended to include lesser-known organisations and more complex processes. . We have been commissioned to do so by ActiZ. or at least one or several aspects of it.3 A phased approach to healthcare benchmarking We will now discuss the phases we generally apply in healthcare benchmarking. more frequently. PricewaterhouseCoopers’ perspective We have noticed that the call for continuous benchmarking is becoming ever louder. Participants want a better understanding of the impact of the measures taken on their performance and position. there are also partners who would like to repeat the benchmarking exercise. adapted for our purposes. Whilst most benchmarking partners are in favour of a frequency of once every two years. We have summarised the phases in Figure 6. This is one of the reasons why we are now designing a system in which the organisations themselves can determine the frequency and starting dates. These coincide with those described in Benchmarking in de 107 publieke sector and with Keehley’s step-by-step plan.

The step-by-step benchmarking process '% Further development of the benchmark model Gathering. the research itself (phases 2. Needless to say. structuring and validating data Analysing data Reporting the findings Discussing the findings Figure 6. 3 and 4) and actions to be taken in response to feedback on findings (phase 5). Phase 1: Further development of the benchmark model We tend to speak of further development as we are no longer starting from scratch. As the actions for improvement no longer strictly form part of the benchmarking process. The sounding-board groups in which participants from the organisations benchmarked are represented play a central role in this . every exercise has new elements that need to be incorporated in the model. various benchmark studies The five phases in the figure can be clustered into three main phases: preparations (phase 1). and are able to build on experience gained in earlier benchmark studies. The latter phase feeds into the actual implementation of improvements. That said. they have not been included in the figure. bringing about organisational improvement is the very reason for benchmarking in the first place.1 Benchmarking phases Source: PricewaterhouseCoopers.

gathering the requested data can be a very time-consuming affair and in some cases they may even be unable to provide the information at all. Sanford Berg (University of Florida) does not mince his words: ‘If managers do not have the date required for such comparisons.’ 108 . they tend to say they will put in an effort to provide the information in future benchmarking studies. Using sophisticated ICT and well-structured data can simplify analysis considerably. The question that arises in such cases is what to do in the next benchmarking exercise. during which partners register for participation in the benchmark. The sign-up procedure. They contribute to the development of innovations. structuring and validating data Once the benchmark model and its constituent building blocks have been developed. much time was spent on defining the information needed in terms of whether organisations were able to provide this information. If the participants are unable to retrieve information directly from their own records. then one must question what they are actually managing. evaluate new models and questions. but even if clear and accepted definitions are available. Phase 2: Gathering. this may not always be possible. and if necessary test new instruments. In our experience.'& Benchmarking in Dutch healthcare phase. In the early days of healthcare benchmarking. concludes the first phase. Reliable and comparable data can only be supplied on the basis of clear-cut definitions. Stop requesting this information altogether? The answer usually depends on how important the organisations rate the information for their own processes. such as: • conducting interviews using questionnaires (in particular in the client survey) • asking client and employee contacts to complete questionnaires • asking the organisation benchmarked to provide information • requesting data about the organisation from other information sources All data are saved and structured in a dedicated benchmark database. This can be done in a variety of ways. particularly data relating to the financial building block – probably because they feel they could not afford to do without this information. participants will begin to gather information for the benchmark.

tables and figures to web-based reports in which recipients can click on any subject they would like to investigate further. however. the outcomes lose all meaning (‘normal is simply the average of deviations from the average’). If the level of aggregation in reporting is too high. which is distributed digitally or in printed format. the client and the sounding-board group. of course.The step-by-step benchmarking process '' Phase 3: Analysing data With the aid of software analysis tools and the assistance of research agencies. The solution lies in adjusting the level of detail to the specific target group: whereas a company’s board of directors is interested in highly aggregated information. where participants are able to decide for themselves when they wish to carry out particular elements of the benchmarking exercise. . that the information is clearly and accessibly presented). or even conduct their own analyses. Phase 4: Reporting findings In healthcare benchmarking we make a distinction between individual and generic reporting. which requires greater detail. In a separate development. the division or company managers want more detail. individual reporting is evolving from straightforward Word documents with text. and the managers of individual teams or offices want even more detailed information tailored to their own units (provided. At the same time. we first carry out analyses for each individual building block and subsequently analyses that go beyond the level of individual building blocks and address the relationship between the building blocks and the identification of best practices. We find that benchmark participants are asking for general outcomes and the big picture so that they can set priorities. The analytical phases are what make research agencies tick. generic reporting tends to consist of a ‘straightforward’ report in Word or sometimes in PowerPoint. They do not want to get bogged down in detail. We are always interested in knowing what the outcomes are and hope to discover unexpected relationships and further insights. Industry-wide. Reporting to individual benchmarking partners is in a state of flux. Web-based reporting is the only solution in the case of continuous benchmarking. they want strategic management information in order to implement concrete actions for improvement.

Although this is possible in theory. Final score 10 9 8 7 6 5 4 1 5 9 13 17 21 25 29 33 37 41 45 49 53 57 Ranking final scores on financial performance (FP) building block 61 65 Figure 6.2 Example representation of an organisation’s position Source: 2004 home care benchmark study . but is not always what the benchmarking participants are looking for. We make an effort to let figures and tables speak for themselves. An individual organisation’s relative position is illustrated by marking it in a different colour (see Figure 6. for instance by using colours: red represents a below-average score. The client-specific information as well as data about other participants is presented in tables and figures. Use of standard texts is unavoidable. This reduces the need for individually tailored reporting. green an above-average score. it would be an extremely costly exercise. but 109 specific outcomes may cause standard texts to pop up.2). They sometimes expect a report tailored to their specific organisation. Any additional notes are the same for all participants. Benchmarking in Dutch healthcare The reports are automatically generated and subsequently filled with information from a benchmark database using a fixed format.

The findings typically help further improve subsequent benchmarking exercises. The tool – Esculapio – presents historic trends and forecasts alongside current data. As the data in this tool were not linked to a database. If they wish. A precursor to this tool is an Excel application known as the click model. PricewaterhouseCoopers Italy recently presented a more fleshed-out example of a dashboard report for a one-dimensional benchmark. used to report on the financial building block of the 2004 home care benchmark study. Participants are presented with general information. and have on-screen access to more detailed information.The step-by-step benchmarking process  The most future-proof method now available is reporting of information that can be accessed by a benchmarking partner at any point in time and subsequently updated with the most recent organisation-specific outcomes as well as the most up-to-date national figures. . The tool should offer the possibility of drawing up. they could be used only once. saving and printing one’s own reports. This system allows the incorporation of as many levels as desired. Phase 5: Discussing the findings The benchmark findings are discussed with the participants in nationwide meetings as well as in small-scale workshops. using tools such as dashboard or cockpit reporting. In healthcare benchmarking. the benchmarking partners can then initiate discussions or workshops themselves. Users can opt either to retrieve ready-made analyses or to carry out further analyses themselves. a tool of this kind is still being developed. These discussions should always include an evaluation of the value and user-friendliness of the benchmark.

which jointly participate in the benchmark. PricewaterhouseCoopers In 2006 a new benchmarking model was set up for the nursing. with the basic package offering data-gathering and reporting at . is a feature of this benchmark. care and home care benchmark Source: VVT continuous benchmark. care and home care sectors. 7. This section discusses key aspects of benchmarking surveys in the Dutch healthcare sectors.7 Healthcare benchmark: notable features All benchmarks have a number of distinguishing features – features that might be a source of inspiration to other benchmarks. with the exception of the benchmarking exercises focusing on the hospital sector.1 Benchmark model: continuous nursing. The new building block. PricewaterhouseCoopers is or was involved in many of these surveys. care and home care benchmark Figure 7.1 Nursing. It has several modules. known in the Dutch healthcare sector by its acronym VVT. along with new ideas on performance excellence. the mental healthcare association GGZ and the healthcare chain. Financial performance Staff Clients HPOs Innovative strength Figure 7. ‘innovation’. care and home care benchmark.1 shows the benchmark model for the new nursing.

at the same time. for instance. will build on from the Responsible Care Standards Project as soon as this is feasible. draws on a survey of clients and/or their families as well as the views of the Dutch Healthcare Inspectorate. the subject of a lengthier discussion in Section 8. The ‘innovation’ building block is being developed. as we have noted.g. the benchmark is based on products as much as is practicable: for inpatient care these are care complexity-derived products and for outpatient care the focus is on care as provided under the Social Support Act and the Exceptional Medical Expenses Act. Nearly all building blocks include innovations. This project aims to arrive at a broad-based description of the quality of care and. no compulsory time-keeping – it has. child healthcare is the remit of home care providers and the GGD community health services." Benchmarking in Dutch healthcare overall organisation level plus one level below – all other levels are optional. ‘Quality of care’. and between expected costs and expected budget. but rather on the ability to innovate and the parameters for innovation. 7. debt servicing charges and gap analyses. While the ‘financial performance’ building block itself has been simplified – e. but also involves the ability to adopt solutions devised by others and/or adapt these to the organisation’s needs. The staff monitor includes an employee survey and data gathered via ‘financial performance’ such as FTE breakdown and sick leave. ‘Quality of work’ has been simplified and alignment with other building blocks improved.g.2 Child healthcare benchmark In the Netherlands. been extended to include treasury data – e. A start has also been made on benchmarking performance excellence. Lastly. It has been agreed to focus not so much on innovation projects. This ability is not restricted to finding totally new solutions. which they provide . Innovation has been defined as the organisation’s ability to keep reinventing itself and so adapt to changing circumstances. as every organisation would interpret this differently. Time-keeping is a separate option. These gap analyses compute the difference between the current budget and the budget based on care complexity. Home care organisations are responsible for healthcare to children between 0 and 4 years of age.

Welfare and Sport and directed by ZonMw. The differences between the Dutch home care providers and the GGDs also have a part to play here. some organisations were already indicating that this would not happen again: they wanted to allocate costs to products for the sake of their own business operations as much as for any subsequent benchmark. . At the time. GGD Nederland (the association of all community health services) and the Association of Netherlands Municipalities (VNG) are running a project called Beter Voorkomen (Prevention is Better) until 2008. as records still reflected the system of input funding that had been in force until 2003. Moreover. The GGDs provide healthcare to children and young people aged between 4 and 19 (school healthcare. 7. the industry organisation representing providers of care insurance in the Netherlands. At the same time. counselling. To encourage alignment between the two types of child healthcare and improve overall child healthcare in the Netherlands. the Netherlands organisation for health research and development.Healthcare benchmark: notable features # through the baby health clinics (‘consultatiebureaus’) well-known to all parents in the Netherlands. For the 2004 home care benchmark we conducted a financial benchmark of the child healthcare supplied by the country’s home care providers. customised care. One of its key features is a benchmark that is designed to contribute to a joint frame of reference. Working with Van Naem & Partners. with resource allocation criteria deciding what goes where. sponsored by the Ministry of Health. This benchmark showed that organisations found it very difficult to attribute revenues and costs to individual products. etc. the community health service is embedded in the municipal set-up. ActiZ. One key consideration underlying this benchmark is that these organisations’ funding is more diverse than that of many other healthcare providers. with each municipality deciding its own financial performance and care targets. A number of organisations do both. municipalities are free to buy additional. the Health Care Insurance Board (CVZ). we drew up a plan of action for this benchmark in 2006.). later getting on board as co-sponsor. with Zorgverzekeraars Nederland.3 Healthcare administration agency benchmark The benchmark for healthcare administration agencies was commissioned by their regulator.

at least in part. the 2005 benchmark produced many instructive outcomes. ES as employee survey and FP as financial performance.g. The benchmark also contained elements that might prove very useful in benchmarking healthcare insurers. one of them being greater insight into FTE breakdown. it proved impossible to identify the choices that contributed most to quality. And with the client survey inadequate as a proper quality gauge.$ Benchmarking in Dutch healthcare This benchmark started off as a broadly-based instrument with a financial component. 7. Comprising such building blocks as ‘quality of care’. As a result. . even when they have recently had dealings with one. the Vereniging Gehandicaptenzorg Nederland (VGN. ‘quality of the job’ and ‘financial performance’. the benchmark is currently in full swing. e.4 Benchmarking care for the disabled Having run a pilot in 2004. priorities shifted and the national benchmark ended up as a single client survey and financial performance investigation. The financial performance survey showed up the difficulty such agencies had in deciding who to compare themselves with. additional comparisons with self-selected reference groups or rather more customised notes to individual reports. Figure 7. The client survey revealed that many clients have no real concept of the role of a healthcare administration agency. Customised benchmarking is also possible: organisations are able to receive more detailed. many found it impossible to adequately rate the performance of healthcare administration agencies. on-demand reports. All that said. Costs varied. sector-wide benchmark in 2006. Association for Care of the Disabled in the Netherlands) commissioned a full-fledged. A successful pilot was completed and the aim was to launch the benchmark countrywide. with 108 organisations participating. with CS standing for client survey. However. waiting-list mediation. quality building blocks and research into operations. Healthcare administration agencies also found it a challenge to define new products in uniform and measurable terms.2 captures the planning process for the benchmark. but this was found to reflect policy choices.

Healthcare benchmark: notable features % Preparing benchmark survey Figure 7. It also includes workshops to discuss benchmark outcomes. with the same applying to the different services provided: outpatient day programmes. slightly mentally disabled. If relevant. data are sorted by client group. residential living. A product/market matrix thus emerges. and heavily behaviourally disturbed. slightly mentally disabled. sensory disabled. mentally disabled. outpatient day care. treatment. Integration and developing building blocks Workshops with peer groups Data set for annual review Organisation reports and countrywide presentation Integrated analyses . and inpatient day programmes. All building blocks making up this benchmark take account of a breakdown into client categories: physically disabled.2 Care for the disabled: a review of the benchmarking timeline Source: 2004 care for the disabled benchmark The timeline coincides with the one set out in the annual healthcare review.

sample survey. like many others before it. In selecting their participating units. and wonder when the next step will come. an organisation would sometimes find that it did not have the requisite data at the level of the relevant unit. At the end of the day. Also.3 Care for the disabled: product/market combinations Source: 2004 care for the disabled benchmark Starting the benchmark is projected to require a time investment on the part of the organisation of 23 working days on average: • 8 days for general project coordination • 6 to 12 days for the client survey (information. encouraging participation) • 5 days for the provision of financial data We have found a careful sign-up procedure to be key – and time-consuming. setting up appointments) • 2 days for the employee survey (information. the final outcome – sector-wide recognition of the benchmark – proved worthwhile. Keeping its member . participants are unable to see all the hard work going on behind the scenes. we extended the sign-up term by a number of weeks. As the aim is to embed the benchmark’s results throughout the organisation. organisations were free to decide which units to enrol in the benchmark – and that also required time. And then the process would start all over again. this took time. as we had failed to factor in that these aspects would cause time pressures. we agreed it was a good thing to discuss the decision to participate at various levels. This benchmark. However. was testimony to the importance of frequent communication with participants to secure ongoing commitment. After all. However.& Benchmarking in Dutch healthcare Service groups Target client groups VG Children Adults LG Children Adults ZG Children Adults LVG SGLVG Non-residen Outpatient care tial care at home Outpatient day programmes Outpatient treatment Residential care Residential living Day programme Treatment Figure 7.

organisations participating in the survey received boxes of questionnaires for each internal unit enrolled during sign-up. of course. In January 2007.000 interviews will be required. Posters are sent out to draw attention to the client and employee surveys. Contrary to expectations. but can also use the tool. the companies conducting the survey. also to identify the ‘real’ gap. Results have since come in: a response in excess of 50 per cent. and all centred on the idea that participants will stay on board only if they know what is going on. a web-based click-to-calculate service. which has undergone a few minor changes since then. . Admittedly. Organisations can opt either to write to all parents/relatives or to use a sample of half the number of these. the survey uses the ‘quality of the job’ questionnaire. The tool was included to help show up the difference between an organisation’s current revenues and the revenues it might look forward to after the introduction of care complexity packages. As in 2004. have now carried out 10. Organisations may find their results in their individual reports. but organisations end up hearing how clients feel and not just other people’s opinions of what clients are likely to feel – however well-meaning those others may be. Quite a package. by letter and email. This time. to work out the effects of their own additional scenarios – and. the organisations’ contacts are invited to attend a series of informative meetings. a solid percentage. when the time comes. Klanq and Perspectief. while contacts receive sample letters they can send to clients/parents and employees.Healthcare benchmark: notable features ' organisations regularly updated through its network.000 interviews. this is a time-consuming and costly affair. Adapted questionnaires have been developed specifically for parents or relatives. employees were also given the option to complete the questionnaire online. The benchmark’s financial building block includes what is called a gap analysis. compensation by care complexity package is as yet unclear. Numbers recorded at sign-up suggest that a total of 18. An unusual feature of this particular benchmark is that client surveys are primarily conducted through one-on-one interviews. so a number of scenarios have been included instead. VGN has also plumped for its own benchmark logo. making all communication immediately recognisable as benchmark-related. VGN publishes a newsletter as well as issuing brochures on different building blocks. And to top it all off. The only benchmark tool that has seen sector-wide application before is the employee survey. then. a tool for calculating the gap between revenues and costs based on current prices.

that the organisation had participated in the previous survey. lifestyle training results • overhead charges . Lastly. but it’s a start. Noting ten – not sector-wide – projects over the past ten years. Benchmarking in Dutch healthcare Another noteworthy feature of this benchmark is its alignment with regulatory accountabilities under the banner ‘one-time delivery only’. then. Following the end of the benchmarking exercise in September 2007. 7. all organisations received their individual reports connecting the various building blocks and comparing the employee survey outcomes with those for 2004 – provided. the brochure lists benchmarks on: • the health information system ZORGIS’s key figures • key facility figures. GGZ Nederland sees benchmarking primarily as a tool for learning – and not therefore as an accountability-driven instrument. we 110 will draw on a GGZ Nederland brochure. benchmark survey results informed the sector-wide benchmark report presented on 13 September 2007. PricewaterhouseCoopers has not been involved in the benchmarking efforts by the mental healthcare association GGZ. which also included all other benchmark building blocks.5 Partial benchmarks in mental healthcare To date. with an annual cycle covering a specific theme and identifying relevant performance indicators • clinical psychotherapy (care provision outcomes) • inpatient rehabilitation centres (addiction care. care provision outcomes) • addiction care. Observing that benchmarking is now firmly rooted in the sector and no longer needs the association to act as a key driving force. of course. Granted. For this subsection. allowing organisations to use these data for their social reports. It also feels that benchmarking has a part to play in innovation. the brochure provides a guide encouraging organisations to grasp the nettle. this involves only a small batch of data at present. as a quality tool that can help generate and spread new knowledge. Which is why we try to provide feedback on client and employee surveys as fast as we possibly can.

The feasibility study’s key conclusions were: . with outcome defined as both the difference before and after the intervention or treatment.e. the Dutch Healthcare Inspectorate and the Ministry of Health. 7. and the difference with and without intervention. The Erasmus Medical Centre’s Institute of Health Policy and Management (iBMG) and Prismant were commissioned to develop the benchmark by ZonMw. including quality. the care that stroke patients receive. The first benchmark tool for chain care across sectors is now in place for the care given after a cerebrovascular accident or CVA. This set of indicators was agreed by care providers. its authors present a comprehensive benchmark model covering both costs and effectiveness. this also applies to an example of mental healthcare benchmarking outside Dutch borders: in Towards National Benchmarks for Australian Mental Health Services. Incidentally.Healthcare benchmark: notable features  Project structures in GGZ’s benchmarks range from independent project organisations to a platform for discussion to (in some instances) external consultants. the mental healthcare sector has not yet opted for multidimensional benchmarks. We are no longer talking individual performances by individual organisations here. European Stroke Day. As the brochure reveals. Their model comprises a set of performance indicators and outcome indicators. home care and general practitioners. which discusses the creation and operation of the chain benchmarking tool in great depth. i. we have found a sector-wide set of performance indicators for the quality of care provided. In addition to the GGZ Nederland brochure. professional associations. insurers. On 10 May 2005.6 Benchmarking the healthcare chain Measuring ‘chain care’ adds a whole new dimension to benchmarking. but the joint performance by all the healthcare organisations that make up the chain. client organisations. nursing homes. Welfare and Sport. this set is very suitable for benchmarking indeed. but it does run benchmarks that explicitly investigate care outcomes. Featuring in the annual healthcare review. The discussion paper also introduces an index of case complexity as a performance indicator. the Netherlands organisation for health research and development. rehabilitation centres. they released their research report Stroke services 111 gespiegeld (‘Comparing stroke services’). The benchmark measures cooperation between hospitals.

• All care chains are unique. Geographical regions can compare benchmarking outcomes with their own working practices and use these to make improvements. . Benchmarking in Dutch healthcare • Benchmarking care chains and best practice selection are feasible. The fact that chains were willing to invest in the benchmark was a major achievement in itself. The report also suggests that government and regulators such as the Dutch Healthcare Inspectorate should make the benchmark compulsory but. • Both leaders and laggards have room for improvement: chains have much to learn from sharing. It simply will not do for only a section of the chain to benchmark. • Benchmarking would appear a distinct possibility for other care chains as well. but benchmarking helps to achieve better results and reduce regional differences. The report argued the case for standardisation of general repeat elements such as employee motivation surveys. that they should nurture and promote it. • Benchmarking a care chain requires a great deal of commitment from every single link in the chain. even if they do not.

at least. sector-wide.snellerbeter. Dutch comparative studies of the hospital sector A – non-exhaustive – list of comparative studies into the Dutch hospital sector yields the following: • • • • • • • • • • • the NVZ database (NVZ being the Dutch Hospitals Association) the Dutch Healthcare Inspectorate’s performance indicators annual performance comparisons by Algemeen Dagblad and Elsevier a survey into the best care per condition by Consumentenbond. Insurers’ websites also 112 disclose comparative studies. . For instance.g. radiology and laboratories Prismant management features www. integrated benchmark is not available just yet. That said. a generally accepted. Partial surveys cover the usual comparisons of waiting periods and the effects of hospital care that typically make it into the country’s magazines and newspapers. e. the Dutch consumers’ association Ernst & Young financial key figures treasury and interest comparisons by PricewaterhouseCoopers comparisons of operations between leading clinical hospitals.7 Benchmarking Dutch hospitals The hospital sector is benchmarked by a wide range of partial benchmarks – or comparative studies.Healthcare benchmark: notable features ! 7. Zorgverzekeraars Nederland reports that healthcare insurer Agis’s website provides links to recent surveys by newspaper Algemeen Dagblad. and supports comparisons between survey outcomes. weekly magazine Elsevier and Roland Berger Strategy Zorgverzekeraars Nederland’s purchasing guide costing model used by the Diagnosis Treatment Combinations (DBCs in the Dutch acronym) NVZ database The Dutch Hospitals Association NVZ runs a database storing a plethora of data on hospitals that allow for comparative analyses. facility management.

e. drawing on the hospitals’ annual accounts. . these reviews have become more comprehensive over time. The association wrapped up its review with a league table revealing which hospitals typically scored best and which lagged behind. It also gave hospitals brownie points for each time they agreed to participate. All questionnaires focused on a different condition and investigated the quality of care." Benchmarking in Dutch healthcare Dutch Healthcare Inspectorate performance indicators The Dutch Healthcare Inspectorate has developed a set of performance indicators that all Dutch hospitals report on and that are publicly disclosed.g. The same indicators inform hospitals’ annual quality reviews/annual reviews. on hospital websites. Industry associations initially came out against these surveys. PricewaterhouseCoopers’ treasury and interest comparisons PricewaterhouseCoopers carries out annual treasury and interest comparisons for hospitals. Although controversial. Consumentenbond questionnaires In 2005 and 2006 the Dutch consumers’ association sent out questionnaires to hospitals and independent treatment centres a total of eight times. The hospitals themselves do not initiate these comparisons. as they felt that fragmented investigations such as these put too great a burden on the organisations. Measuring both quantitative and qualitative data. Algemeen Dagblad’s and Elsevier’s annual performance comparisons Both the newspaper Algemeen Dagblad and the weekly Elsevier publish annual comparisons of Dutch hospitals. discussed at greater length below. Ernst & Young’s financial key figures Ernst & Young periodically releases a review of financial key figures per hospital. these indicators allow for comparisons between hospitals by stakeholders and hospitals alike.

To all intents and purposes. e. capacity and finance. the insurer may wish to agree quality improvement. Zorgverzekeraars Nederland’s purchasing guide The sector organisation representing the providers of care insurance in the Netherlands has released a guide to hospital quality indicators. with participating hospitals having access to benchmark data. capacity and care provided – thus allowing them to select which group of hospitals they would like to be benchmarked against. Accurately indicating how costs may be . which informs purchasing agreements between care insurers and care providers. The indicators were uniformly defined countrywide. Prismant management features Prismant. www. Drawing on the indicators. Comparing hospital scores is part of this: if a hospital has come in at a below-average score. where insurer and organisation are free to negotiate price and quality.g. DBC costing model A number of market leaders have developed a costing model in preparation for the introduction of DBCs in hospitals. staff deployment. the insurers measure the care provided and take this into account in their negotiations. Indicators cover eight Diagnosis Treatment Combinations (widely known by the Dutch acronym DBC) in the so-called B segment. provide the means for hospitals to run their own business comparisons in different areas – finance. the healthcare management consultants. and three A-segment DBCs. laboratories and parts of facility management. even though it was not created with benchmarking in mind. Specific departments of the hospitals are compared.Healthcare benchmark: notable features # Operations at leading clinical hospitals compared The country’s leading 19 clinical hospitals have agreed on participation in comparisons and benchmarking. The Sneller Beter programme sees Dutch hospitals share best practices in the field of business operations. Areas benchmarked typically include care provided. the purchasing guide may serve as a benchmark.

the model serves as a very appropriate vehicle for comparing hospitals. with focuses including customer satisfaction. outreach activities and the use of sophisticated ICT to prevent medication errors. A few international examples of benchmarking or comparative studies: International benchmarking An unusual example of a benchmark focusing on a specific condition is the international benchmark on cataract operations. Kaiser Permanente. The purpose of this comparative study of eye hospitals in Europe is to further improve the quality 113 of cataract treatment. The authors feel benchmarking has an increasingly important part to play in gaining insight into quality of care improvement. As it started years ago. the process has given rise to extensive databases and independent agencies that analyse these data per hospital and in many cases even per doctor. This also makes it very useful for analyses of factors explaining cost differences. United States In the United States hospitals are benchmarked on the basis of a set of parameters imposed by the government.$ Benchmarking in Dutch healthcare allocated to individual DBCs. found that professional debate and knowledge exchange had given a sharp boost to the quality of treatment. As it is based on uniform definitions and allocation. but even here benchmarking – and particularly multidimensional benchmarking – is not always the generally accepted methodology. does a lot of internal benchmarking. Kaiser Permanente also benchmarks itself against nationwide results. The Rotterdam Eye Hospital is a benchmark partner. an American health maintenance organisation (HMO). Putters et al. the model has been widely implemented across the hospital sector. while providing comprehensive transparency on the breakdown of costs. with areas covered including treatment results. protocols and . comparing doctors and sharing best practices. A comparison between American hospitals will throw up a number of best practices. Comparative studies of the hospital sector outside the Netherlands Outside the Netherlands there are examples of best practices in benchmarking.

Healthcare benchmark: notable features % operations. broken down into teaching hospitals and general hospitals in three different size categories. The rating is somewhat similar to the Dutch Healthcare Inspectorate performance indicators. is that organisations are unable to sit down and exchange experiences. The benchmark draws on the hospitals’ annual accounts. of course. for which additional data need to be submitted. . the hospitals receive a report comparing their data with those of other participants in their subgroup. Participants do have the option of having further analysis done. with hospitals awarded stars for their performance. All data are broken down by speciality and the software incorporates a large number of automatic checks for consistency. care provided and FTE breakdown. without being told who those others are. completeness and probability. Kaiser Permanente is able to use internal benchmarking as a strategic management instrument. The drawback to this low threshold. Every year. it is repeated every year and has well over 120 hospitals participating. As it owns a large number of hospitals. Germany PricewaterhouseCoopers in Germany runs a hospital benchmark covering data on finance. which means that the hospitals need not make any extra effort other than granting permission for the use of their data. albeit that the NHS’s is a centralised rating system applicable to all NHS hospitals. United Kingdom The National Health Service compares hospitals on a standardised set of parameters known as the NHS performance rating. Launched in 2001.

gnash your teeth tomorrow. Market forces should come increasingly into play and drive the need for . making improvements difficult to implement. The reports take two to four hours to produce. knirscht morgen mit den Zähnen (‘Head in the sand today. care insurers. With chains of hospitals being formed. such learning is likely to focus on single areas but as time goes on the call for multidimensional benchmarking will increase. multidimensional benchmarking is necessary to show up the interrelationships between results: overall results and/or the way results are connected may not become visible from a single-dimension benchmarking exercise. patients and the media to present transparent results and as they themselves cast around for opportunities to learn and improve their operations.’) Greater interest in hospital benchmarks We see benchmarking garnering more interest on the part of Dutch hospitals in the near future. In complex organisations like hospitals.& Benchmarking in Dutch healthcare Clinical treatment Average length of stay Benchmark in baseline year 2005 Development of median Figure 7.5 Sample report from German benchmark Source: PwC Deutsche Revision The most notable feature of this benchmark is the brevity of its timeline: as soon as a subgroup has enough members to enable the creation of reliable comparisons. the need for benchmarking as a strategic management tool will also increase. participants can obtain their reports with a single mouse click. Initially. as they come under greater pressure from government. The German benchmark’s motto: Wer heute den Kopf in den Sand steckt. as the holding company will want to raise performance by comparing the hospitals that make up the chain.

the hospital sector has its own dynamics and simply adopting a model and tools will not do. and to implement improvements within the hospital itself. for another. the partnerships of medical specialists so typical of the Dutch hospital landscape might need to be brought on board. In mid 2006 it transpired that hospitals were losing interest income as they were invoicing later than had been assumed in the standard interest rate payment. The benchmark involves small groups that really allow their fellow participants to look behind the scenes. care and home care benchmark (innovation. particularly if it takes on board the additions currently being developed in the nursing. . Of course. We feel that the multidimensional benchmark model discussed at great length in Section 5 would be appropriate for hospitals. born out of the treasury benchmark we briefly touched on above.Healthcare benchmark: notable features ' improved performance – albeit that hospitals’ willingness to share information might decline as a result. Benchmarking the DBC process We will end this section with an example of a process benchmark. and do so in great depth. Participants leave with a customised assignment and return at a later date to discuss progress. Following data collection and processing. hospitals are large enough to be able to use benchmarking internally to compare performances by department and doctor. but the groups only include organisations from the same sector. For one thing. best practice and how to help other organisations achieve these standards. Treasury benchmarks are conducted across the healthcare spectrum. hospitals will need to be appropriately categorised for benchmarking purposes. performance excellence). workshops are held where benchmark partners discuss their own results. Treasury benchmarks The treasury benchmarks run by PricewaterhouseCoopers focus on treasury aspects only. A number of hospitals then decided to compare their invoicing processes. And lastly. benchmarking might need more than the cooperation of hospital boards: depending on the precise nature of the benchmark. The aim of these comparisons is to improve the treasury function and to optimise net interest income.

the benchmarking hospitals discuss the process from opening to collecting. collection periods. participants are creating their own opportunities for improvement. Major differences have emerged. agreeing and measuring key figures for every stage of the process.  Benchmarking in Dutch healthcare DBC process: key features of multiple processes DBC process: key features of multiple processes Opening DBC Closing DBC Validating DBC Invoicing DBC Collecting DBC Work in progress Accounts receivable Figure 7. . Major areas include market share versus share of accounts receivable.6 DBC process benchmarked Source: PricewaterhouseCoopers (DBCs) Assisted by PricewaterhouseCoopers. past-due status of accounts receivable. and of course the length of the entire process – all of which are broken down by insurer and DBC. By comparing the outcomes and exchanging tips.

After all. 114 Accenture’s international study puts it among oft-used instruments.1 lists key 115 features of excellent performers on the basis of published results. Organisations boasting superior performance show better long-term results than others in terms of healthy financial operations and satisfied customers. in terms of both content and tools. with the literature and real-world benchmarking exercises scoured for examples of less taxing methods or more successful reporting.1 Towards performance excellence We have briefly touched upon the challenge of creating a benchmark that does not just underpin performance improvement but actually provides the impetus for excellence. Reviewing and analysing over 90 studies with a view to identifying the characteristics of excellently performing organisations. but no less than 78 per cent of respondents – government bodies in the main – feel they are not putting the tool to optimum use. Benchmarking should be more than a good way to bring the laggards up to speed: it could also help the ambitious achieve superior excellence. There are no reasons to assume that benchmarking is perfect as it is. Granted. De Waal found a number of recurrent themes and organised these into clusters demonstrably linked to performance excellence. . 8. benchmarking should be benchmarked as much as anything else.8 Innovations in benchmarking This final section reviews innovations in benchmarking. he argues. Table 8.

The organisation only hires exceptional people of entrepreneurial spirit – provided they are a good cultural fit – and deals swiftly and effectively with non-performers. Processes The reward system is seen as fair. 2006 Excellence is a relative concept. you stop being good. Culture The organisation has strong and meaningful core values. Robust plans are in place for achieving objectives. Technology Flexible ICT systems have been implemented throughout the organisation. Sharing best practices within the organisation is actively encouraged. Benchmarking in Dutch healthcare Cluster Organisational design Key features (examples) The organisation is straightforward and flat and has no barriers between units. Systems are exceptionally user-friendly.1 Hallmarks of excellence Source: A. Organisational members The organisation sees and treats its members as its main instrument to achieve its objectives. Leadership Management has an effective.A. As soon as other organisations achieve the same level of excellence. Strategy The strategy is absolutely clear to all members of the organisation. ‘If you stop getting better. Management challenges itself and others to always achieve better results. Outward-looking The organisation is committed to adding value for its clients. focused and strong leadership style. de Waal. openness and trust are key to the organisational culture. Characteristics of a High Performance Organisation.’ . an organisation will have to excel even more if it is to 116 stand out from the crowd. Organisational assets are put to highly effective use. Transparency. Table 8. The organisation has unreservedly opted to benchmark against best-in-class performers in the industry.

Innovations in benchmarking 


To illustrate how the drive for performance excellence can spark far-reaching 117 changes in company cultures, here is what management guru Tom Peters had 118 to say on several visits to the Netherlands: • ‘Reward brilliant failures, punish mediocre successes.’ • ‘Find people who don’t live by the rules. Excellence can only be obtained if you risk more than others think is safe and if you dream more than others think is practical. Or do you want your tombstone to say “He always made budget”?’ • ‘Talent is all a company has got. HR should sit at the head of the table. A real leader doesn’t do finance or sales, a real leader does people.’ • ‘I hate mission statements.’


Excellence and innovation

De Waal, like many others, sees a direct link between excellent performance and innovation. In fact, innovation is considered an essential ingredient of performance excellence. Watson’s Strategic Benchmarking has many examples of the success of innovative companies and of how less innovative rivals have fallen behind. He links innovation to creativity and to exceeding customer expectations. Imaginative understanding of customer requirements creates wildly enthusiastic customers who will do whatever they can to lay their hands on the product. Innovation requires being able to think out of the box and even sometimes forgoing the backing of others. Citing Compaq as a successful example, Watson describes how the computer manufacturer started focusing on design and making computers smaller and lighter, while all its rivals were still engaged in improving stability and reliability. Though technically no better than the competition, Compaq’s products were undoubtedly more 119 innovative – and a massive hit. Watson also points out that innovation is an inflationary phenomenon: every new development in the outside world requires fresh innovation. In its 2005 Management Tools & Trends study, Bain & Company identifies innovation as the next big organisational challenge reported by its respondents. Our own review of excellently performing organisations reveals that many of their key characteristics coincide with the best practices that emerge from our


Benchmarking in Dutch healthcare

benchmarks – albeit that the benchmarks paint a fragmented picture, as we have never investigated all characteristics at the same time. But best practices such as flat organisation structures, a focus on employee development, shared values and robust, dynamic leadership confirm De Waal’s findings. In its VVT benchmark, industry association ActiZ decided to start measuring a number of characteristics of excellence. To a large extent, the benchmark’s current building blocks already capture these characteristics. Changes to be introduced will involve a different clustering, often transcending building blocks, but will limit the need for new data collection as the answers to questions in the current building blocks are simply rearranged. The idea is that performance excellence scores will provide greater insight into ways of improving operations and more information about any gaps between an organisation’s own operations and those of excellent performers. Needless to say, we are very eager to see the first results emerge in the autumn of 2007.


Benchmarking outside the box

The literature on benchmarking often argues the case for studying 121 organisations outside one’s own industry or in other countries (Watson ), a point frequently raised in the healthcare benchmarks as well. Home care organisations, in particular, many of which have participated in benchmarking exercises three times, are keen to take this next step, and we definitely plan to investigate the possibilities in our current benchmarks.


More research into cost-to-reward ratios

Up to this point, we have primarily focused on the potential rewards of benchmarking and not on its costs, which, of course, there are – and not just the direct costs of participating, but also of time invested. Organisations provide data, employees and clients spend time completing questionnaires. In this report we have refrained from quantifying these costs, as there are too many differences between the benchmarks reviewed. Obviously, a benchmark accessing client views through questionnaires via the Internet or on paper will be much less costly than a benchmark involving face-to-face client interviews. Besides, in many cases the relevant data are not intended exclusively for the benchmark. The VVT benchmark, for one, no longer uses separate client

Innovations in benchmarking 


surveys, but draws on surveys carried out as part of its responsible care programme and accountability to the Dutch Healthcare Inspectorate. Most benchmarking exercises in the past few years have asked participants for their post-benchmarking views on cost-to-reward ratios among other matters. On the whole, respondents were positive – if they had not been, we would not have been able to continue the benchmarks – but that is not to say there were no critical comments. Invariably, some organisations felt a benchmark was too general and not sufficiently instructive, while others cited too much focus on detail. These differences perhaps reflect the purpose of the benchmarking exercise. A more generalised benchmark would be better suited to gauging one’s own position relative to others, while a rather more detailed benchmark would also be appropriate for generating concrete strategic management information. Designing a modular benchmark might be the answer here, combining a generalised benchmark as its backbone and more detailed modules at the customer’s choice. The latest benchmarks are testing this approach to some extent, but it would be rash to say that the optimum balance has been found. This will require more research. Whatever balance is struck, benchmarking remains a matter of one thing affecting another. The desire of some – a concise benchmark providing copious amounts of concrete, strategic management information – is likely always to remain a utopian ideal.


More benchmark partner involvement

Section 3 highlighted the disappointing phenomenon that organisations spend a great deal of effort participating in benchmarks but that learning and improving based on benchmark results is decidedly lagging. Our observation has been no different. To date, we have typically responded by attempting to enhance our benchmark reports, making them more accessible by capturing key results in scorecards, developing applications that allow participants to click on the information

if applicable. it is virtually impossible – and in any case very. Without dismissing the importance of benchmark reporting out of hand – listing points of action. Customising questionnaires for each individual organisation is one idea. Participants electing to include additional questions would receive individual reports on these and. After all. which would be clearly defined so that all users know exactly what is meant. organisations have been pretty clear in their desire to see flexibility and. $ Benchmarking in Dutch healthcare they wish to see. but add the option of additional questions that individual participants can select. Real-world experience shows that flexible questionnaires work: PricewaterhouseCoopers in Germany created a benchmark for hospitals using precisely such questionnaires. very expensive – to completely tailor a questionnaire to a single organisation and thus individually process and report on the response. listing points of action – but all to no avail. we are talking additional questions here. if other participants were found to have answered a specific question in this or any previous rounds. Nonetheless. not completely different questionnaires. Perhaps the following suggestion might prove useful: keep the backbone of the benchmark unchanged with identical questions for all participants. This approach would ensure the continued quality of the information. the relevant subsection of the building block. applying colour codes to indicate specific areas for improvement. Nor did the exercise involve a small group: their number can easily top 100 participants. In addition to setting up a library of customised questions. The requirements of the individual organisations would underpin the creation of a library of such additional questions. would also receive comparative data. interactive and communicative methods might help enhance participants’ involvement and . if a hotly debated one. benchmarking requires shared sets of data to really classify as benchmarking. is an encouraging road to pursue further – the solution would appear to lie in another direction entirely: getting participants more involved in conducting the benchmarking survey themselves. Single questions would indicate their building block category and. Moreover. retain the nature of comparative data and still meet individual requirements. when all is said and done. we reckon.

In addition to identifying the relationships between outcomes. At most. A word of warning here: causal or statistically significant relationships have not been found to exist in by any means all cases. an organisation should also be able to get the five best scores for each building block at the click of a mouse or make its own analysis . of course. Benchmarking will never be an entirely mechanical process. as any correlation at organisation level might be coincidental. One idea mooted is to design a simple tool that would enable benchmarking teams in an organisation to indicate what they think their scores will be and what scores they are aiming to achieve.6 More dynamic reporting Benchmark reporting to organisations can and should be more dynamic. web-based reporting should enable organisations to carry out these types of analyses with the aid of their own computer models. What would be the best group size? Should the organisations themselves put the groups together? What role. if any. predicated on the assumption that the data for all participants show up sufficient interrelationships for the computer models to actually work. But there is definitely room for improvement in the reporting tool. this does not mean that these apply to all organisations.Innovations in benchmarking  % commitment. have the researchers and the industry association to play? 8. We are occasionally asked to produce integrated analyses at the level of the organisation – but that is simply not possible. Lastly. And even if such relationships are uncovered at the aggregate level. All this is. What we can do is offer tools to show what would happen if the same interrelationships applied to the organisation as to the aggregate. This will make it easier to identify actions for improvement. we may be able to say what the chances are that something or other happens if the organisation pushes this button or that. we work together with industry associations and sounding-board groups on the best approach to the workshops that discuss benchmark outcomes. These figures could then be compared with the actual scores later in the process. but also to get a handle on the effects of any such action. What button should we push to reduce costs by x per cent? Which other aspects would that affect? How do we see the impact of our efforts after the previous benchmarking exercise? How much could our client score go up or down if we raised overheads by x per cent? Dynamic.

allowing for a pre-agreed timeline. even if they are using ICT. always needs a certain amount of time for sign-up and data collection. What it . the system should calculate best practices and countrywide trends at regular intervals. Moreover. nothing need keep it from investigating this by starting a new benchmarking exercise – even when other organisations benchmark less frequently. The system should also reduce the costs of benchmark participation – an advantage not to be underestimated. One-time benchmarking. Continuous benchmarking implies a system allowing organisations to decide when to start the benchmarking exercise themselves. either immediately after input or periodically. making for optimum embedding in their own management cycles. perhaps the most important being that the process will be shortened considerably. 8. The aim is to create a database enabling automatic processing of data. A third benefit is that there is no longer any need for organisations to wait for the next benchmarking exercise to roll around. or to select dummy tables to help print outcomes that include columns to indicate the organisation’s proposed actions for improvement and who will be implementing them. & Benchmarking in Dutch healthcare of differences between organisational units.7 Continuous benchmarking ActiZ’s request for a new VVT benchmark was not just about the integration of the various benchmarks for nursing homes. care homes and home care but also about the creation of a continuous benchmark. a continuous benchmark allows organisations to benchmark when they see fit. In the case of repeat benchmarks. It should also be able to access suggestions for improvement if it scores low. organisations may choose not to participate in all building blocks but to use their most recent benchmark figures as a reference point. Data submitted are automatically compared with averages and best practices in the database. by contrast. In addition. There are many advantages to such a system. and organisations have immediate and automatic access to these analyses. Researchers subsequently need time for analysis. If an organisation wants to know whether its improvement measures have had any effect. using a web-based application to complete and return questionnaires without any intervention from consultants.

however. with older data periodically removed. A number of issues still need resolving. This means that at least one group of organisations will have to benchmark simultaneously for the database to be filled with their data.Innovations in benchmarking  ' will look like in detail is as yet unknown. is that enough data be available to set up a reliable database. A primary requisite of a continuous benchmark. This issue might perhaps be addressed by adding another comparison based on fixed values alongside that based on performances by other organisations. the database can add fresh data from repeat participants and/or data from new participants. such as data validation – in-built automatic testing? – and the vexed matter of constantly moving averages: even if it records the exact same performance. Once up and running. as numerous changes have since been made to both organisational funding and benchmarking tools. Previous benchmarks are of only limited use here. . an organisation might score above average in one benchmarking exercise and below average in the next.

the continuous benchmark could – with the permission of the organisations themselves. a number of organisations have been arguing the merits of introducing the XBRL open standard. of course – draw on other sources. and as long as definitions are the same it should be less of a problem for organisations to supply these data than any other information that might be requested. aggregate data are often collated from lower-level data. Another problem we see here is that the annual care review has data at the aggregate level of the organisation. trust offices. the annual healthcare review database. Working together with Dutch trade and industry. All that said. e. Plus which. Welfare and Sport has now joined the Dutch Taxonomy Project. initially only for financial data but eventually also for other quantitative data.8 Simplified data supply To simplify data supply for participating organisations. 8. tax advisers and software suppliers. Expectations should not run too high. intermediary bodies such as accountancy firms. XBRL allows for rapid and efficient collation of data for accountability . This taxonomy is a data dictionary that is built into financial software. even if featuring in the annual care review. whereas benchmarking requires lower-level data. Marking the relevant data in an organisation’s records. At the time of writing this report.! Benchmarking in Dutch healthcare 8. The Ministry of Health.g.9 Introduction of XBRL Because an organisation’s records should be as closely aligned to national data requests as possible. VGN and ActiZ were scouring the annual healthcare report database for data that might be useful in a continuous benchmark. a collaborative venture between the Dutch justice and finance ministries that is looking to standardise and simplify the financial information that organisations are expected to supply to the government. But even before this. further streamlining is a good thing. the ministries are developing a Dutch XBRL taxonomy (XBRL stands for eXtensible Business Reporting Language). though: annual healthcare reporting is not yet compulsory and some data. will not be available in the short run. healthcare providers were expected to be obliged to publish an annual care review with effect from the 2007 financial year.

annual accounts in more than one language.123 The annual healthcare review and benchmarking also require other data. including PricewaterhouseCoopers. exchange electronically. An instance document is not readily readable and needs to be presented correctly. Style sheets may be used to present information in a specific way. Partners in the project. • Instance document: The XBRL-generated document or ‘instance document’ contains the actual financial data as described in the taxonomy.g. e. the language underpinning XBRL) to allow relatively easy integration of all the requisite information from a care provider’s systems. Style sheets also allow for conversion into other formats such as PDF or HTML webpage. XBRL breaks down into three distinctive elements: • Taxonomy: A taxonomy is a bit like a dictionary or an index. . A taxonomy describes which data feature in the document and where. Data users can put together their own reports by reading instance documents in their own. • Style sheet. Data become easy to collect.Innovations in benchmarking ! purposes and simplifies electronic data exchange with the government 122 through re-use of information. such as client and employee numbers. XBRL-compliant applications. The objective of the project is to ease the administrative burden through the creation of a basic structure by the government and a timely adjustment to the taxonomy of their own organisation and infrastructures by intermediary bodies. analyse and. if need be. It is possible to develop software drawing on XML (eXtensible Markup Language. It describes all the potential financial data or elements as well as the relationships – statistical or otherwise – between them. have signed a covenant committing to the creation of an XBRL taxonomy – and the reduction in accountancy fees that this makes possible. process further.

Appendices .

Born Sector: Mental healthcare G.J.J.J. Nijholt Sector: Care for the disabled G. Poerstamper G. van Berlo H.A Steering committee and sounding-board group Members of the steering committee Representing HEAD G. Postma PwC PwC Utrecht Utrecht Alysis Zorggroep Trivium Zorggroep Delfzicht Ziekenhuis Arnhem Hengelo Delfzijl Members of the sounding-board group Representing HEAD A. Jansdal Harderwijk . van der Wijk Representing PwC R.A. Meijerhof De Geestgronden Bennebroek ASVZ Groep Leerdam VZC LindeStede Wolvega Ziekenhuis St.A. Naaktgeboren Sector: Nursing and care A. Members Sector: Hospitals and rehabilitation centres J. Bonté P.

Straks M. Dopper E. De Braal Representing PwC G. Gerritsen Alysis Zorggroep Arnhem +=HA M. van den Dungen Viataal Sint Michielsgestel +KHA L.J. van Eijck D. Veltman PwC PwC Utrecht Utrecht GGZ Regio Breda Breda ActiZ VGN Utrecht Utrecht .!$ Benchmarking in Dutch healthcare B. Quality managers +KHA G. Postma A. Personnel manager L. Roelink C. Industry associations Alysis Zorggroep Arnhem +=HA M. Board Amerpoort ASVZ Baarn +=HA B.

London. Prestaties van zorgaanbieders gemeten. T. 1998. et al. et al.B. D. (2004/2005 nursing and care homes benchmark: Measuring performance of care providers. General report. Utrecht. Longman. review of applicability benchmarking tools in outpatient care. Algemeen rapport. 2004) Benchmark verpleeg.B Bibliography Literature/reports • Accenture study entitled Assessment of Benchmarking Within Government. 2002) Tweede test benchmarkinstrumentarium. R. Kluwer. (1989) Benchmarking: The Search for Industry Best Practices that Lead to Superior Performance. Algemeen rapport. (1998) Benchmarking for competitive advantage. in Dutch only) • Arcares (November. in Dutch only) • Arcares (May. quoted in Accenture press release dated 31 July 2006. Utrecht. and Schon. Utrecht. S. (1994) Benchmarking for Continuous Improvement in the Public Sector. in Dutch only) • Berg. 2003) Tweede test benchmarkinstrumentarium. (2003 nursing and care homes benchmark: Measuring performance of care providers. General report. General report. C. • Argyris. Utrecht. in Dutch only) • Arcares (February. • Arcares (April.. ASQ Quality Press. University of Florida. . • Camp. • Bendell. 2005) Benchmark verpleeg. 2006) Water Benchmarking Support System: Survey of Benchmarking Methodologies (abstract).. Prestaties van zorgaanbieders gemeten. 2006) Rapportage vooronderzoek continue benchmark V&V. onderzoek toepasbaarheid benchmarkinstrumentarium in extramurale zorg. De excellente overheidsorganisatie. Addison-Wesley. in Dutch only) • Bullivant.. (First pilot of benchmarking tools. Utrecht.A. J. in Dutch only) • Arcares (November. (Second pilot of benchmarking tools. 2002) Eerste test benchmarkinstrumentarium.en verzorgingshuizen 2004/2005. (Second pilot of benchmarking tools. (Report on preliminary investigations into continuous benchmark of nursing and care. (The excellent government organisation. in Dutch only) • Arcares (February. • Bentlage F.M. (1978) Organizational Learning: A Theory of Action Perspective. Harlow. Public Utility Research Center. and Kip J. Utrecht. Algemeen rapport.en verzorgingshuizen 2003. Boelens J. (March.

van and van der Zwart R. Hoogwout. M.) Management Accounting Handbook. December 2006 issue of its consumer guide magazine. (1996) ‘Benchmarking’. J. (August.A. Evaluation and Strategic Management in the Public Sector. (Benchmarking in the public sector: using performance comparison as management tool. K. F. Nr. (Dutch consumers’ association. C. (2004) Benchmarking in de publieke sector.5. Amsterdam. S. . (Showing the value of your care. OECD. International benchmarking experiences from OECD countries. 20-21 February 1997. R. Jong. Public Controlling Reeks. dissertation. and Samuels. McGraw Hill. M. ‘Back to the basics of benchmarking’. Patterns of acculturation in technology acquisitions. Oxford. 4. 2005) Benchmarken in de Openbare Sector. J. Conference on international benchmarking. R. New York. 2003) Health Sciences Digest. (European benchmark in elderly care homes (benchmarking three organisations). 33 No. Consumentengids. Erasmus Universiteit. How benchmarking contributes to organisational learning. • Harrington. Jackson Jr (1994). pp. • Ernst & Young (December. 1995. and Harrington J. in Dutch only) • Healthcare Commission (2006) 2006 Community Mental Health Survey. and Poerstamper. in Dutch only) • Grayson. H. Goudriaan. Information Strategy Committee Discussion Paper No. • Eagar. Quality. (ed. December 2006. (1996) High Performance Benchmarking. 2000).. et al. S. F. Rijksuniversiteit Groningen. South London & Maudsley NHS Trust. • Eenennaam. R. 20-2. In Dury C. • Gangelen. Butterworth-Heinemann. 2004): Europese benchmark in zorginstellingen voor ouderen (test in drie instellingen). A. Understanding Management. Vol.) Benchmarking. (1996) ‘Performance benchmarking in the public sector: the United Kingdom experience’. ‘Towards National Benchmarks for Australian Mental Health Services’. de. (Benchmarking in the public sector.J. Oxford.. • HEAD Association and PricewaterhouseCoopers (2006) Laat zien wat uw zorg heeft betekend. Alexander . H.!& Benchmarking in Dutch healthcare • Consumentenbond. In: Trosa. van (May. • Helgason. (2001).. in Dutch only) • Cowper. Copenhagen. Fort Worth. 4. Rotterdam. • Grotenhuis. Sdu Uitgevers.. In Dutch only) • Groot. • Customers Choice and PricewaterhouseCoopers (December. De bijdrage van benchmarken aan organisatieleren. in Dutch only) • Daft. vergelijken van prestaties als management tool. (ed.

Paris. Utrecht. • PricewaterhouseCoopers and Berenschot (Autumn. and De Loor. et al (2005) Benchmark CVA-ketens. Handreikingen voor benchmarking en het gebruik van benchmarks. G. • Kets de Vries.’ Healthcare Financial Management. and Metcalfe. Erasmus MC/Prismant. (Comparing stroke services. 1998) Toepassing benchmarkanalysemodel voor sector verpleging en verzorging gefaseerd en onder randvoorwaarden haalbaar. in Dutch only) • Mulder.R.. D. Main report on feasibility study into CVA chain benchmarking. Labour Market and Social Policy Occasional Papers no. (1997) ‘Benchmarking of public services in the United Kingdom and Sweden – Commentary’. in Dutch only) • Organisation for Economic Co-operation and Development (January. and Van Vijfeijken.S. Guide to benchmarking and the use of benchmarks. In: OECD/OCDE: Benchmarking. Een handreiking voor de ggz. Amersfoort. • Klages. CPS. (Guide to performance comparison in the public sector. Utrecht. • Keehley. 2005) Leren door benchmarken. 57. Erasmus MC/Prismant.R. San Francisco/London. E. H.Bibliography !' • Kaczmakre. Final report. 2005) Stroke services gespiegeld. (Application of benchmarking model in nursing and care feasible – phased and within set parameters. 2005) ‘Benchmarking supply expenses the devil’s in the definition: why hasn’t the healthcare industry been able to fix the disconnect between the supply chain and the revenue cycle? Maybe a generally accepted definition of “supply expense” would be a start. in Dutch only) . (June. The Hague. (October. in Dutch only) • Nieboer. and local agencies. in Dutch only) • Odenthal. 2006) Verschillen vergelijken. L. P. Evaluation and Strategic Management in the Public Sector. Amersfoort. (Benchmarking the CVA chain. (January. Jossey-Bass. (1996) Benchmarking for best practices in the public sector: Achieving performance breakthroughs in federal. Final report. et al. 2004) Handreiking Prestatievergelijking binnen de Openbare Sector. Eindrapportage. H. A guide for the mental healthcare sector. in Dutch only) • Nieboer A. • Ministerie van Binnenlandse Zaken en Koninkrijksrelaties (Dutch Ministry of the Interior and Kingdom Relations) (May. (1993) Organizations on the Couch: Perspectives on Organizational Behaviour and Change. M. 2002) ‘Improving the Performance of Health Care Systems: From Measures to Action (A Review of Experiences in Four OECD Countries)’. Hoofdrapport haalbaarheidsstudie benchmark CVA-ketens. Utrecht. Eindrapportage. A. (Learning through benchmarking.F. GGZ Nederland. M. (Comparing differences. et al. state. (April.

Utrecht. Brancherapportage. in Dutch only) • PricewaterhouseCoopers (September. 2003) Analysemodel. Instellingsspecifieke rapportage. instrumentarium en pilotbenchmark bieden solide basis voor integrale benchmark zorgkantoren. tools and pilot benchmark provide solid basis for integrated benchmark for healthcare administration agencies. Organisation-specific report. 2005) Medewerkers positiever over werkomstandigheden. Utrecht." Benchmarking in Dutch healthcare • PricewaterhouseCoopers and Berenschot (March. in Dutch only) • PricewaterhouseCoopers and IWS (April. 2002) Benchmarkonderzoek zorgkantoren fase II omvat ontwikkeling analysemodel. Met appendix. Phase II. in Dutch only) • PricewaterhouseCoopers (June. Almere. 2005) Continue benchmarkverpleeg. Utrecht. 2005) Testbenchmark gehandicaptenzorg 2004. in Dutch only) • PricewaterhouseCoopers (May. 2003) Gedragen kostprijsmodel gehandicaptenzorg vormt basis voor bruikbare en integrale kostprijsberekening. (Employees more positive about working conditions. (2004 test benchmark in care for the disabled sector.en verzorgingshuizen en benchmarkdata 2002 leiden tot eerste inzicht in parameters en kostprijzen. key figures and tools. Utrecht. in Dutch only) • PricewaterhouseCoopers (June. (Cost price model for nursing and care homes and 2002 benchmark data provide early insight in parameters and costs. In Dutch only) • PricewaterhouseCoopers (October. 2003) Kostprijsmodel verpleeg. kengetallen en instrumentarium. Utrecht. Utrecht. (Continuous benchmarking nursing and care homes. (Benchmark study of home care sector offers opportunities for government and organisations. Lidinstellingen LVT scoren bij medewerkerraadpleging 2004 hoger dan bij survey 2002. in Dutch only) . Including Appendix. (Benchmark study of healthcare administration agencies.en verzorgingshuizen. (Application of cost price model for disabled care in test benchmark provides insight in costs and relevant comparative data. Utrecht. in Dutch only) • PricewaterhouseCoopers (May. (Model. consisting of developing model. 1999) Benchmarkonderzoek thuiszorg biedt aanknopingspunten voor instellingen en overheid. LVT member organisation score higher in 2004 employee motivation survey than in 2002 EMS. 2004) Toepassing kostprijsmodel gehandicaptenzorg in testbenchmark biedt inzicht in kostprijzen en relevante spiegelinformatie. Industry report. Generiek eindrapport fase III. Amstelveen/Utrecht. (Cost price model in care for disabled sector serves as basis for useful and integrated costing. Generic final report Phase III. in Dutch only) • PricewaterhouseCoopers (January.

Almere/Utrecht. (Benchmarking vocational education: completion first phase. and Foekema.Bibliography " • PricewaterhouseCoopers (June.. 2006) Eerste fase benchmark middelbaar beroepsonderwijs afgerond. (Z-org industry report on 2004 benchmark study into home care. 2005) Brancherapport Z-org benchmarkonderzoek thuiszorg 2004. H. (2005) Management Tools and Trends 2005. optimum perspective as model for growth. (Individual report first benchmark healthcare administration agencies. Bain & Company. Utrecht. in Dutch only) • PwC Deutsche Revision (2005) Benchmarking Ablauf und Übersicht. in Dutch only) • PricewaterhouseCoopers (June. 2005) Benchmark Beroepsonderwijs. in German only) • Rigby. and Bilodeau.. Frankfurt. (Benchmarking vocational education: management information for strategic themes. (Initial benchmark results 2005. in Dutch only) • PricewaterhouseCoopers (November. (Benchmark execution and findings. Frissen. Stuurinformatie voor strategische thema’s. Utrecht. optimaal perspectief als groeimodel. D. in Dutch only) • PricewaterhouseCoopers (January.A. in German only) • PwC Deutsche Revision (September. Kostenmeting 2003 en cliëntensurvey 2004. B. Utrecht. . (Z-org industry report on benchmark study into healthcare to children between 0 and 4 years of age. 2005) Individueel rapport eerste benchmark zorgkantoren. in Dutch only) • PricewaterhouseCoopers (December. P. (Generic report first benchmark healthcare administration agencies. 2003 cost check and 2004 client survey. 2002) Benchmarkonderzoek 2000 verscherpt inzicht in prestaties en bedrijfsvoering thuiszorginstellingen. University of Tilburg (Care for innovation. in Dutch only) • Putters. in Dutch only) • PricewaterhouseCoopers (September. 2006) Erste Ergebnisse Benchmark 2005. Kostenmeting 2003 en cliëntensurvey 2004. K. Utrecht. Outcomes benchmark study at sector level. TNS NIPO/Tilburg School for Politics and Public Administration. (2006) Zorg om vernieuwing. (2000 benchmark study hones insight into performance and operations at home care organisations.H. 2006) Brancherapport Z-org benchmarkonderzoek jeugdgezondheidszorg 0-4-jarigen. Utrecht. Frankfurt. in Dutch only) • PwC Consulting (March. 2003 cost check and 2004 client survey. Resultaten benchmarkonderzoek op sectorniveau. Utrecht. 2005) Generiek rapport eerste benchmark zorgkantoren.

J.A. in: OECD. in Dutch only) • Waal. Doubleday. M. May 2002. Deventer. een strategie om concurrentievoordeel te behalen. Background paper. (Social enterprise in Dutch healthcare. De Vrind. A.• Pollitt. in Dutch only) • Vries de. Utrecht. de and Timmerman. Kluwer Management. Performance measurement in government . (1990) The Fifth Discipline: The Art & Practice of the Learning Organization. K. Paris. Narrow Your Focus.K.A.A. Hyperion Solutions Nederland BV.A. (1992) Becoming a Learning Organisation. in Dutch only) • Swieringa. . and Joss. (Benchmarking in nine steps. (Learning to communicate. de (2006) The Characteristics of a High Performance Organisation.M. • Steehouder. A. 2006. Addison Wesley. C.J. • Swinkels. • Treacy. (2004) ‘Towards a Model for Measuring the Performance of e-Procurement Initiatives in the Australian Public Sector: A Balanced Scorecard Approach’. Kluwer Bedrijfsinformatie. in Dutch only) • Waal. EMEA HC. A. Dominate Your Market. Cave. J.. in Dutch only) • Senge. de and Ardon.M. and Wierdsma. (1997) Benchmarking. and Boerkamp. April 14-15. et al. presentation May. P. A. (The time of external competition has come. Proceedings of the Australian Electronic Governance Conference. de (2005) ‘Het tijdperk van de externe concurrentie is aangebroken’.F. G.A. Association for Care of the Disabled in the Netherlands) (2007). de (2003) On the road to Nirvana. • Waal. • Vriend. A. November/December.J. • RVZ (1998) Maatschappelijk ondernemen in de zorg. A. New York. (1984) Leren Communiceren. A.P. J. M. (2002) ‘Hoe prestatiegedreven is uw zorginstelling?’ Zorginstellingen. Various benchmarking brochures and information packs. • PricewaterhouseCoopers. H. (How performance-driven is your healthcare organisation? In Dutch only) • Waal. Melbourne • Vereniging Gehandicaptenzorg Nederland (VGN.A.A critical overview’. M. Benchmarking Hospitals. and Van der Togt. and Wiersema. J. F. (1998) ‘Benchmarking: Stimulation or Control?’ Primavera Working Paper 1998-16. R. (1996) The Discipline of Market Leaders: Choose Your Customers. G. Wolters-Noordhoff. Groningen. (1995) Benchmarking in 9 stappen. (1994) ‘International benchmarking as a tool to improve public sector performance . M. Perseus Books. et al. Universiteit van Amsterdam Business School. (Benchmarking for a competitive edge.Issues and illustrations. • Vaidya.

aspx?id=10003&cNode=5K3B4O • PricewaterhouseCoopers (various years) Territory benchmarking examples.cfm?issueid=61 A.uow. Websites • • • • • • • • • • • • http://www.ittefaq. (1993) Strategic Benchmarking: How to Rate Your Company’s Performance against the World’s Best. ( 2006) Benchmarking in the Dutch Healthcare Sector. Wiley.webebi.shtml html http://hcro. Internal PricewaterhouseCoopers presentations • Alexander-De Jong. G.shtml http://www.html . Internal materials • Internal materials of the industry associations and http://www.H.allbusiness.•

V. Cliënt & Kwaliteit. Arcares Healthcare 2002 administration agencies Nursing and care homes 2002 Nursing and care homes 2003 119 VWS. Van Loveren & Partners B. LVT.V. NVVZ. BTN Berenschot. Economic Programs B..V. VWS CvZ VWS. Customers Choice. Van Loveren & Partners B. Arcares Nursing and care homes 2003 21 VWS. NIVEL Applicability outpatient care Pilot Home care 2002 106 NA 90 LVT. Prismant ATOS Second test Beleidsadvies en -onderzoek B..V.V. Cliënt & Kwaliteit. First Stichting Kwaliteit benchmark In EigeN huis (KIEN) NIVEL Second benchmark Developing model and tools ATOS First test Beleidsadvies en -onderzoek B.V.V. Economic Programs B... Customers Choice.. Arcares Healthcare 2003 administration agencies 7 CvZ . BTN. Wzf VWS. Prismant Customers Choice. Van Loveren & Partners B.C Benchmark studies Table C-1: Healthcare benchmarks with involvement of PricewaterhouseCoopers Sector Nursing and care homes Home care Report 1998 1999 Working with Berenschot Nature of the study Feasibility study Number of participants * NA 122 Commissioned by VWS.

GGD Nederland VGN ActiZ Research for First Beleid. IWS. NIVEL and TNO See second test Van Naem & Partners CvZ Z-org Arcares LVT LVT. care Continuou CBO. NIVEL and countrywide NIZW rollout Integrated countrywide VVT benchmark Treasury only Nursing. Belgium and Germany Countrywide rollout Third benchmark Second countrywide rollout Financial only Plan of action 32 82 75 26 NA 108 As yet unknown 30 67 67 NA Healthcare 2005 administration agencies Home care Nursing and care homes Child healthcare 0 4 years Child healthcare 0 19 years Care for the disabled 2005 2004/200 5 2006 2006 2007 NIVEL Desan."$ Benchmarking in Dutch healthcare Sector Nursing and care homes Care for the disabled Nursing and care homes Report 2004 2004/200 5 2004 Working with See second test Nature of the study First countrywide rollout Number of participants * 100 28 Commissioned by Arcares VGN Customers Test Choice. Economic Programs and Prismant Customers Choice Exploratory 3 data gathering in the Netherlands. Desan and home s care Crossover Periodicall healthcare (by y sector) Vocational education Vocational education Housing corporations 2005/200 6 2007 Organisations MBO Raad MBO Raad Kenniscentrum Eerste Beroepsonderwijs benchmark en Arbeidsmarkt Kenniscentrum Beroepsonder wijs en Arbeidsmarkt Follow-up benchmark Pilot 2007 15 Corporations .

All three authors work at PricewaterhouseCoopers. Desan • Financial: Van Naem & Partners (child healthcare 4-19 years) About the authors Robbert-Jan Poerstamper is the partner ultimately responsible for all healthcare benchmarks discussed in this report. KIEN.Benchmark studies "% * A number of benchmarks allowed for participation in a limited number of building blocks. NIVEL. Prismant. Van Naem & Partners • Client surveys: Cliënt & Kwaliteit. Aafke Veltman has been involved in benchmark studies since 2006. Anneke van Mourik . He has been involved in development and implementation of benchmarks since 1996. strategic positioning: Prismant. with analyses and reporting as her specific field of expertise in addition to benchmark development. IWS. Research voor Beleid • Operations/process management.van Herk has also been involved in healthcare benchmarks since 1996. TNO • Innovation: CBO • Measuring care complexity elderly care: Van Loveren & Partners • Care and treatment registration (time-keeping): Customers Choice • ICT and analysis: Economic Programs. Consultants and agencies we have worked with or currently work with are typically involved at the general set-up and creation stage of the benchmark. . while at the same time taking on responsibility for a specific building block or task: • General: Berenschot. NIZW • Employee surveys: ATOS.

Our worldwide network of people and countries has access to a vast amount of knowledge and experience that we share with each other. make surprising links. medium-sized and smaller national and international companies. take decisions and achieve objectives. We seek fresh approaches."& Benchmarking in Dutch healthcare *connectedthinking PricewaterhouseCoopers (www. over 4. We actively identify key trends in healthcare. ‘PricewaterhouseCoopers’ refers to the network of member firms of PricewaterhouseCoopers International provides industry-focused assurance.400 professionals draw on Connected Thinking to provide sector-specific services and innovative solutions for large. governments and not-for-profit organisations. We monitor local and global healthcare to help our customers understand the issues. a report setting out our views on future healthcare developments and trends. More than 146. each of which is a separate and independent legal entity. At PricewaterhouseCoopers in the Netherlands. putting our expertise to work in healthcare at both national and international level.000 people in 150 countries work collaboratively using Connected Thinking to develop fresh perspectives and practical advice. The healthcare sector group is the largest specialist groups within PricewaterhouseCoopers. are committed and involved and collaborate from strength. Among the results of our efforts is HealthCast 2020: Creating a Sustainable Future. tax and advisory services to build public trust and enhance value for its clients and their stakeholders. Healthcare is one of those pivotal industries. PricewaterhouseCoopers and healthcare PricewaterhouseCoopers has active policies for what it considers pivotal industries. with our customers and with their .

care and home care Exceptional Medical Expenses Act the Health Care Insurance Board community health services the mental healthcare association Dutch Association of Finance Managers in Healthcare the Dutch Healthcare Inspectorate healthcare to children Dutch Hospitals Association the Dutch Council for Public Health and Care Association for Care of the Disabled in the Netherlands nursing. care and home care care and treatment provided the Netherlands organisation for health research and development care and complexity module .D Dutch healthcare abbreviations and acronyms ActiZ AWBZ CVZ GGD GGZ HEAD IGZ JGZ NVZ RVZ VGN VVT ZBR ZonMw ZZP the Dutch association for nursing.

(1988) Out of the Crisis. McGraw Hill.A. van (May. 7.M. (1992) The benchmark book. 13. (1995) Understanding Management. Erasmus Universiteit. and Poerstamper. 10. (1993) Strategic Benchmarking: How to Rate Your Company's Performance against the World's Best. J. (1989) Benchmarking: The Search for Industry Best Practices that Lead to Superior Performance. (Social enterprise in Dutch healthcare. Watson. . 6. New York. Fort Jong. G. (1996) High Performance Benchmarking. 9. Bullivant. (Benchmarking in the public sector.. Daft. vergelijken van prestaties als management tool. 14. state. R. W. Rotterdam. Watson. and local agencies. M. and Kip J. ASQ Quality Press. Goudriaan. Bentlage F. 2005) Benchmarken in de Openbare Sector.. Edwards Deming. 38.J.. Gangelen. 11. Longman. Watson. (Benchmarking in the public sector: using performance comparison as management tool. 12. Sdu Uitgevers. San Francisco/London. et al. 8. de. (1998) De excellente overheidsorganisatie. Public Controlling Reeks. RVZ (1998) Maatschappelijk ondernemen in de zorg.B.H. Kluwer. R. Cambridge University Press.. Camp. 18. (The excellent government organisation. (1996) Benchmarking for best practices in the public sector: Achieving performance breakthroughs in federal. A. Hoogwout. (Read in its official 1998 Dutch translation) Groot. 15. in Dutch only) Government Papers II 2003/2004 29 200 VII nr. 2. Bentlage et al. Wiley.E Endnotes 1. In the 'Promoting Benchmarking within LED' project. De Groot et al. in Dutch only). 3. H. Harlow.. in Dutch only) Keehley. Background paper. How benchmarking contributes to organisational learning. 16. M. H. De bijdrage van benchmarken aan organisatieleren. R. (2004) Benchmarking in de publieke sector. R. J. 17. (1994) Benchmarking for Continuous Improvement in the Public Sector. Boelens J. Alexander . Spendolini. In Dutch only). and Harrington J. Harrington. 5. New York. 4. P. Amacon.

Pollitt.Issues and illustrations.A critical overview'. (1994) 'International benchmarking as a tool to improve public sector performance . (2005) Management Tools and Trends 2005. in: OECD. S. Cave. 28. Rigby and Bilodeau. 24. 32. 31. and Van der Togt. Performance measurement in government . 35. M. 23. et al. and local agencies. in Dutch only) De Groot et al. Rigby and Bilodeau. (1998) Benchmarking for competitive advantage. Bendell et al. 33. Conference on international benchmarking. P. (1995) Benchmarking in 9 stappen. One notable point is that writers on benchmarking typically refer to the quality of products or processes and barely touch upon the financial dimension. B. C. (1994) De Nederlandse kwaliteitsprijs en onderscheiding. Bendell. Keehley. Companies incorporate identified best practices to meet improvement targets. 20. and Bilodeau. J. 36. Bendell et al. (1996) Benchmarking for best practices in the public sector: Achieving performance breakthroughs in federal.. T. and Hes F. 21. Accenture study entitled Assessment of Benchmarking Within Government. T. 20-21 February 1997. quoted in Accenture press release dated 31 July 2006. Vries de. 22. D. J. Bain & Company. state. De Groot et al. Watson.19. International benchmarking experiences from OECD countries. Hardjono. 37. 27. Defined as comparing efficiency and effectiveness of a process or processes in one organisation to those in other organisations. Bendell et al. Helgason. quality of the job or the quality of corporate social responsibility. Results based on statements by 960 managers in the corporate and not-for-profit sectors. Paris. San Francisco/London. Quoted in our first benchmark proposal. London. (The Dutch Quality Award. et al. Copenhagen. and Joss. 34. Deventer. 39. 40. Watson. Defined as comparing processes and performance with internal and external benchmarks. Deventer. Watson. R. 30. Rigby. 26. (Benchmarking in nine steps. in Dutch only) . 29. 38. 25..

D. 57.F. et al. Jossey-Bass. (1998) 'Benchmarking: Stimulation or Control?' Primavera Working Paper 1998-16. 54. Senge. (1990) The Fifth Discipline: The Art & Practice of the Learning Organization. De Groot et al. G. De Groot et al. .A. Vaidya. 48. Universiteit van Amsterdam Business School. 63. 53. Labour Market and Social Policy Occasional Papers no. 51. with the results then sent to the organisations as feedback. De Vrind.M. Doubleday. 57. Grotenhuis. Het Financeele Dagblad. De Vries and Van der Togt. 58. 42. and Schon. Van Gangelen.P. Keehley et al.see Section 8. De Groot et al. 49. (2004) 'Towards a Model for Measuring the Performance of e-Procurement Initiatives in the Australian Public Sector: A Balanced Scorecard Approach'. but can be of tremendous help to learning organisations. Bendell et al. Swinkels. C. April 14-15. 61. F. P. 5 January 2007. 60. 56. De Groot et al. 64. M. Addison-Wesley. 46. 45. Bendell et al.R. H. 59. M. Melbourne. Paris. 62. Watson.41. Rijksuniversiteit Groningen. Argyris. Kets de Vries. (1978) Organizational Learning: A Theory of Action Perspective. (2001) Patterns of acculturation in technology acquisitions. 43. De Vries and Van der Togt. 2006 Accenture study. Our healthcare benchmarks typically scan such responses in full. Proceedings of the Australian Electronic Governance Conference.. 44. dissertation. 2002) 'Improving the Performance of Health Care Systems: From Measures to Action (A Review of Experiences in Four OECD Countries)'. 50.A. Organisation for Economic Co-operation and Development (January. K. and Boerkamp. 55. (1993) Organizations on the Couch: Perspectives on Organizational Behaviour and Change. 2006 Accenture study. 52. this list identifies the key features of excellent performance . Open questions do not lend themselves for scoring and comparison. In fact.J. 47. De Groot et al.J.

The only best practice to come out of the 2000 home care benchmark that also constituted best practice in 2004. 69. Evaluation and Strategic Management in the Public Sector. Handreikingen voor benchmarking en het gebruik van benchmarks. De Groot et al. in Dutch only) Odenthal and Van Vijfeijken. Amersfoort. Odenthal and Van Vijfeijken. H. De Vries and Van der Togt.R. M. S. 2005) 'Benchmarking supply expenses the devil's in the definition: why hasn't the healthcare industry been able to fix the disconnect between the supply chain and the revenue cycle? Maybe a generally accepted definition of "supply expense" would be a start. and Wiersema.S. 2006) Verschillen vergelijken. Klages. Quality.' Healthcare Financial Management. 75. (1996) The Discipline of Market Leaders: Choose Your Customers. . Dementia Care Mapping (DCM) as developed by Tom Kitford at the University of Bradford. 71. Watson. F. 72. Until recently this was based on Zorgbehoeftemeting OuderenZorg (ZOZ) as developed by Van Loveren & Partners . According to the Consumer Quality Index. 20-2.but now part of the care complexity module system. (ed. Evaluation and Strategic Management in the Public Sector. D.65. 77. and Metcalfe. and Samuels. H. a standard system used in the Netherlands that takes on board both the client's assessment of a particular item and the importance they assign to it. Cowper and Samuels. Cowper. J.Commentary'. Guide to benchmarking and the use of benchmarks. 81. L. 74. 68. 79. (October. Kaczmakre. OECD. 83. 82. 70.) Benchmarking. 78. (1997) 'Benchmarking of public services in the United Kingdom and Sweden . (1996) 'Performance benchmarking in the public sector: the United Kingdom experience'. In: Trosa. 33 No. a gauge for elderly care needs . In: OECD/OCDE: Benchmarking. Perseus Books. Vol. M. Jackson Jr (1994). Treacy. G. 67. Watson. Oxford. 66. and Van Vijfeijken. E. Odenthal. pp. C. (Comparing differences. 73. CPS. Dominate Your Market. New York.. Grayson.e. Narrow Your Focus.i.g. 76. Treacy and Wiersema. (January. 'Back to the basics of benchmarking'.5. Klages. 80.

Public Utility Research Center. Keehley et al. 86. (June. Keehley. Zimmerman. 87. For an overview of benchmarking phases. 102. Home care benchmark. 105. Some client characteristics that organisations cannot influence in any way are known to affect scores. organisations will also get to see their unweighted scores. E. Pilot benchmark for healthcare administration agencies. 2006) Water Benchmarking Support System: Survey of Benchmarking Methodologies (abstract). and De Loor. 2003 nursing and care home benchmark study. see Keehley et al.84. A guide for the mental healthcare sector. Keehley et al. Harrington and Harrington. et al. Een handreiking voor de ggz. 101. to give them a better handle on areas for improvement. 109. Pilot benchmark for healthcare administration agencies. 88. De Vries and Van der Togt. University of Florida. 92. GGZ Nederland. 2003 nursing and care home benchmark study. De Groot et al. 95. 96. (March. De Groot et al. 108. Preliminary finding in care for the disabled pilot benchmark. Keehley et al. Mulder.g. Keehley et al. In addition. 93. 91. et al. 103. S. 90. (Learning through benchmarking. De Groot et al. Morris. RAI = Resident Assessment Instrument. 107. 106. 110. 94. 89. 100. 98. Bullivant. a different text for above-average scores than for below-average scores. 99. Consistent finding in home care. Keehley et al. in Dutch only) . Berg. 85. 2003 nursing and care home benchmark study. Hirdes et al. Keehley. E. have developed indicators that Prismant has fleshed out into a benchmark instrument. nursing and care home benchmarks and in pilot benchmark for healthcare administration agencies. 97. Amersfoort. 2005) Leren door benchmarken. et al. 104. M.

com 117. HEAD Association and PricewaterhouseCoopers (2006) Laat zien wat uw zorg heeft betekend. Bedrijfseconomische Statistieken. (TNS) 114. Putters et al. The findings of the follow-up study should be reported in September 2007. Eindrapportage. Watson. 122. Healthcare-specific research has since been carried out and the number of clusters reduced to five. 1982. Issue 4. Watson. 45. 119. June Het Financieele Dagblad. (Showing the value of your care. Nieboer A. 121. 120. 29 December 2006 and 6 January 2007. www. Newsletter. Utrecht. author (with Robert Waterman) of In Search of Excellence. Erasmus MC/Prismant. 118. 2006 Accenture study. 123. 113. et al. ZN Journaal 2006 nr. (2005) Benchmark CVA-ketens. in Dutch only) . (Benchmarking the CVA chain. 115.111. Rigby and Bilodeau. Final report. Tom Peters. in Dutch only) 112. 116.