ABSTRACT
This paper presents the lessons learned to date in a safety-performance benchmarking project,
where the client funded research to develop measures that would drive improvement on two
concurrent hospital construction projects.
The study shows the development of the performance measurement regime that was
adopted and the complexity involved in developing effective feedback mechanisms for
supervisors and workers on site.
This work is still in progress and each week the research team and the project team gain
new insights into the difficulties that are faced in any attempt to transform the construction
workplace.
The process to date has been crudely modelled; however, it has to be recognised that such
models are not generic, rather they reflect the particular process on a given project.
KEYWORDS
Safety in construction, performance measurement, feedback
1 Multiplex Professor of Engineering Construction Innovation, Director ACCI, School of Civil and
Environmental Engineering, University of New South Wales, Australia.
2 Deputy Director ACCI, School of Civil and Environmental Engineering, University of New South Wales,
Australia.
3 Lecturer, School of Civil and Environmental Engineering, University of New South Wales, Australia.
4 Postgraduate research student, School of Civil and Environmental Engineering, University of New South
Wales, Australia.
INTRODUCTION
Safety and quality remain critical priorities in the drive to improve productivity and
efficiency in the construction industry, in Australia as well as overseas.
Larsson and Field (2002), in their analysis of the Victorian construction industry, provide
evidence of continued unacceptable risk exposure in terms of safety. Edwards and Nicholas
(2002), in their study of the UK construction industry, portray it as the most hazardous
industry. Similarly, recent studies by the authors have shown that the cost of quality-related
problems is of the same magnitude as the profitability of organisations in the sector
(Marosszeky et al. 2002, Thomas et al. 2002).
In the project reported in this paper the client, a regional hospital authority, identified
safety and quality performance as critical areas for improvement. The client had two projects
of a similar scale (>$100 million) being constructed concurrently and decided to fund the
research with a view to driving process improvement through comparison between the two
projects. Both projects were to be built by the same general contractor, and the client had the
contractor's agreement to cooperate in the research and to implement ideas that emerged from it. While
the original intent had been to conduct performance measurement in both safety and quality, the
lack of formal management processes in relation to quality made this task impractical;
this paper therefore presents the lessons learned in relation to safety performance measurement and
benchmarking.
The research method was essentially an iterative process involving study and analysis
of the subject of measurement (e.g. a process), identification of potential performance measures,
prioritising and accepting or discarding measures, and finally the development and refinement of
Key Performance Indicators and the feedback mechanism, and their implementation.
Letza (1996) formulated a flow chart for this iterative mechanism, as discussed later.
While the work is still in progress, a number of lessons have been learned in the process -
some of them surprising. This paper presents those lessons by placing the current project in
the context of past research, followed by a brief description of the research objectives and
methodology, and then providing a detailed description of how the process unfolded.
Figure 1: Iterative process of measure development – familiarisation/review, acceptance,
detailed refinement, implementation and trial of data collection, development of draft forms
of feedback, refinement and agreement.
PROCESS DESCRIPTION
The process of developing the framework essentially began with documentation analysis to
examine previous experience with performance measurement within the area of safety. A
preliminary framework for performance measurement was then designed, reviewed and
refined through an iterative process of joint stakeholder meetings from both projects and the
concurrent development of ideas. Data collection commenced once this was agreed. The
overall initial framework is summarised in Table 1.
There was a quick realisation that in some cases these scores were simplistic, in that it was
possible for both sites to achieve scores consistently close to 100%, even though normal
industry practice in these areas is much lower; for example, all targeted toolbox meetings
were held. At first both sites found it difficult to achieve the number of planned audits;
however, once the monthly performance scores were published the actual quickly converged
with the target. This supported the adage that if you measure performance, it will tend to
improve. This initial framework was subsequently developed and modified over the
following months on the basis of the process described in Figure 1. From the outset, as
exemplified in measure 5, we were looking to compare the performance of subcontractors in
order to generate some healthy competition and to create the basis for public recognition of
the best subcontractor each month.
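The two score types used in the initial framework, planned-versus-actual counts and the percentage of items closed out, amount to simple percentage calculations. The following is a minimal sketch of that arithmetic; the function names and all figures are fabricated for illustration and are not the project's actual records.

```python
# Sketch of the two KPI score types in the initial framework:
# "planned v. actual" (e.g. toolbox talks, task observations) and
# "% closed out" (e.g. hazards raised on safety walks).

def planned_vs_actual(planned: int, actual: int) -> float:
    """Score the proportion of planned activities actually held, capped at 100%."""
    if planned == 0:
        return 100.0  # nothing was planned, so nothing was missed
    return min(100.0, 100.0 * actual / planned)

def percent_closed_out(raised: int, closed: int) -> float:
    """Percentage of raised items closed out by the end of the period."""
    if raised == 0:
        return 100.0
    return 100.0 * closed / raised

# One fabricated month for one site:
print(planned_vs_actual(planned=8, actual=8))    # all targeted toolbox talks held
print(percent_closed_out(raised=40, closed=34))  # safety-walk items rectified
```

As the paper notes, scores like these can sit near 100% on both sites even where industry practice is much lower, which is why the framework had to be refined beyond such aggregate percentages.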
Table 1: Initial performance measurement framework (outcome indicators)
1. Toolbox Talks – Planned v. Actual (Nos.); Closed out items (%).
2. Safety Walks – Closed out items (%): safety walks generate lists of hazards that have to be
rectified; the % of items closed out by the end of the week (3 days later) was to be recorded.
3. Task Observations – Planned v. Actual (Nos.); Closed out items (%): the contractors
planned to conduct a certain number of task observations each week.
4. Site Safety Meter – Score on site: the site safety meter (SSM) was used to score the safety
environment; because of scepticism about this measure it was initially used only on the
leading deck, so that scores from early in the project could be compared with those later in
the project.
The following two examples describe the application of the framework in the elimination
of safety hazards on each site. The first is an examination of how effectively subcontractor
management was responding to hazards identified in site safety walks, and the second is in
the use of performance measures arising out of the application of the Site Safety Meter
(SSM) (Trethewy et al. 2000).
The second problem was that as we
probed more deeply and spoke of publishing data on the walls of the site sheds, everyone
started to look more closely at the scores and it was realised that some of the issues listed on
safety walks are potential hazards in work tasks yet to be done and hence they should not be
included. This led us to more carefully analyse the listed items.
The next problem was how to present the data. Our initial presentation (Figure 3:
Comparison of Normalised Repeat Items per 100 Workers in Months 1 and 2) showed the
number of items for each trade so that a comparison between trades could be made. This
presentation also showed changes over time, month by month. However, the head contractor
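The normalisation behind Figure 3, repeat items scaled per 100 workers so that trades with different crew sizes can be compared, can be sketched as follows. The trade names and counts are fabricated for illustration; they are not the project's data.

```python
# Repeat safety-walk items per trade, normalised per 100 workers so that
# large and small crews can be compared on the same scale.

def repeats_per_100_workers(repeat_items: int, workers: int) -> float:
    """Scale a raw repeat-item count to a rate per 100 workers."""
    return 100.0 * repeat_items / workers

# Fabricated month-1 figures: trade -> (repeat items, crew size)
month1 = {"formwork": (6, 45), "steel fixing": (3, 20), "services": (2, 60)}

for trade, (repeats, crew) in month1.items():
    rate = repeats_per_100_workers(repeats, crew)
    print(f"{trade}: {rate:.1f} repeat items per 100 workers")
```

On raw counts the formwork trade looks worst here, but once crew size is taken into account the smaller steel-fixing crew has the higher rate, which is exactly the kind of comparison the normalised presentation was meant to support.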
CONCLUSION
The research had started with the objective of detailed process-level performance
measurement and learning about effective feedback mechanisms, but even then the
researchers were surprised by the way the process unfolded. It is now evident that
stimulating improvement at a detailed process level is a far more difficult undertaking, and
that difficulties will be encountered at every step. Even the stakeholders (senior site
management) involved in the process of developing and refining performance measures do
not see the problems until feedback starts. As a result, generalised models such as that
proposed by Letza (1996) understate the
complexity of the task. For this reason, design and piloting of feedback should be concurrent
with development of performance indicators as far as possible. Similarly, while the feedback
dimensions noted previously from the work by Forza and Salvador (2000) were expected to
have a bearing, they took on a different meaning on this project. In their case dynamic
adjustment of performance feedback relates to changing organisational objectives, whereas in
this study the objectives within the process also changed, e.g. changing the objective from
positive to negative feedback or vice versa. Similarly "relevance" as defined by them relates
to optimum decision making whereas in this case it related to delivering the appropriate
"message". Consequently, a number of observations emerged during the study. These
include:
• Some production process anomalies become apparent only when detailed
feedback is developed.
• Once communication of detailed feedback starts, there is a desire to filter
information and make it more targeted. This in turn leads to changes in the type
and format of presentation. Even the choice of words becomes an issue in detailed
feedback.
• Detailed process level performance measurement and feedback encounters the
same type of resistance and suspicion that existed at the management level when
non-traditional performance measurement was first introduced. In this case, even
the timing of inspections was an issue for a while. Strong leadership and skilled
negotiation are prerequisites for success.
• In the drive to improve process reliability, there is no substitute for getting into
the specific detail of the errors that are to be avoided. While at a project level, say
safety environment of the entire site, broad measures raise awareness and lead to
overall improved performance, it was found that even high aggregate scores can
hide specific areas of weakness.
The purpose of measuring performance is to create feedback that will lead to improvement. It
is relatively easy to gather lots of performance data, however it takes a great deal of effort to
extract from the data useful trends and to identify where efforts for improvement should be
directed. Finally, it is surprisingly difficult to develop ways to present the data so that the
information leaps out in a way that will create a reaction that leads to improved performance.
The process described in this paper is currently being tested by implementation on site.
On completion of the trial, techniques such as time series analysis, linear regression, and
multiple regression will be used to examine the relationship between the framework and
improvements in safety. Furthermore, a survey instrument will be used to determine whether
some of the performance measures are more effective than others. Pending this detailed
analysis, anecdotal evidence suggests that the management of safety on the projects under
study is improving as a result of this work.
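The trend analysis foreshadowed above can be illustrated with a minimal ordinary-least-squares fit of a monthly KPI series against time, to test whether scores are improving over the trial. This is a sketch only; the score series is fabricated and none of the project's planned statistical analysis is reproduced here.

```python
# Fit a least-squares line to a monthly KPI series (y) against month index
# (x = 0, 1, 2, ...); a positive slope suggests improvement over the trial.

def linear_fit(ys):
    """Return (slope, intercept) of the least-squares line through (i, ys[i])."""
    n = len(ys)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return slope, intercept

scores = [62.0, 70.0, 74.0, 81.0, 85.0]  # fabricated monthly % closed out
slope, intercept = linear_fit(scores)
print(f"trend: {slope:+.1f} percentage points per month")
```

A fit like this on each measure's monthly series is one simple way to ask which performance measures are associated with improvement, ahead of the fuller time-series and multiple-regression analysis the study proposes.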
Acknowledgement: The authors gratefully acknowledge the funding and in-kind
support for this research provided by the Central Coast Area Health Service of New South
Wales, Australia. The authors are also grateful to the Walter Construction Group for their
significant contribution of time and resources to the project.
REFERENCES
Alarcon, L. F., Grillo, A., Freire, J. and Diethelm, S. (2001). Learning from Collaborative
Benchmarking in the Construction Industry, IGLC-9, Professional Activities Centre,
Singapore, 407-416.
Alarcon, L. F. (1997). Modelling Waste and Performance in Construction, Lean Construction
(Ed: Luis Alarcon), A. A. Balkema, Rotterdam, 51-66.
Ballard, G. (1997). Lookahead Planning: A Missing Link in Production Control, IGLC-5,
International Group for Lean Construction, 13-26.
Ballard, G. and G. Howell (1997). Implementing Lean Construction - Stabilizing Work
Flow, Lean Construction (Ed: Luis Alarcon), A. A. Balkema, Rotterdam, 101-110.
Cleeton, D. (1992). Is your feedback achieving change? Industrial and Commercial Training,
24 (9).
Diekmann, J. E., Balonick, J., Krewedl, M. and Troendle, L. (2003). Measuring Lean
Conformance, Proceedings 11th Annual Conference on Lean Construction, Virginia
Polytechnic Institute and State University, 84-91.
Edwards, D. J. and Nicholas, J. (2002). The State of Health and Safety in the UK
Construction Industry With a Focus on Plant Operators, Structural Survey, 20 (2), 78-87.
Ellis Jr., R. D. (1997). Identifying and Monitoring Key Indicators of Project Success, Lean
Construction (Ed: Luis Alarcon), A. A. Balkema, Rotterdam, 43-50.
El-Mashaleh, M., O'Brien, W. J. and London, K. (2001). Envelopment Methodology to
Measure and Compare Subcontractor Productivity at the Firm Level, IGLC-9,
Professional Activities Centre, Singapore, 305-322.
Forza, C. and Salvador, F. (2000). Assessing some distinctive dimensions of performance
feedback information in high performing plants, International Journal of Operations &
Production Management, 20(3), 359-385.
Gaarslev, A. (1997). TQM the Nordic Way: TQMNW, Lean Construction (Ed: Luis
Alarcon), A. A. Balkema, Rotterdam, 463-470.
Ghio, V. A. (1997). Development of Construction Work Methods and Detailed Production
Planning for On-site Productivity Improvement, IGLC-5, International Group for Lean
Construction, 149-156.
Jaselkis, E. (1996). Strategies For Achieving Excellence in Construction Safety Performance,
Journal of Construction Engineering & Management, 122(1), 61-70.
Karim, K., Davis, S., Marosszeky, M. and Naik, N. (2003). Project Learning through Process
Performance Measurement in Safety and Quality, CIOB Journal, December 2003.
Larsson, T. J. and Field, B. (2002). The Distribution of Occupational Injury Risks in the
Victorian Construction Industry, Safety Science, 40, 439-456.
Letza, S. R. (1996). The design and implementation of the balanced business scorecard,
Business Process Re-engineering & Management Journal, 2(3), 54-76.
Loo, R. (2003). A multi-level causal model for best practices in project management,
Benchmarking: An International Journal, 10(1), 29-36.
Marsh, T. W., Robertson, I. T., Duff, A. R., Phillips, R. A., Cooper, M.D. and Weyman, A.
(1995). Improving Safety Behaviour Using Goal Setting and Feedback, Leadership and
Organization Development Journal, 16(1), 5-12.
Mohamed, S. (2003). Scorecard Approach to Benchmarking Organisational Safety Culture
in Construction, Journal of Construction Engineering & Management, 129 (1), 80-88.
Marosszeky, M., Thomas, R., Karim, K., Davis, S. and McGeorge, D. (2002). Quality
Measurement tools for lean production from enforcement to empowerment, Proceedings
10th Conference of the International Group for Lean Construction (Ed: Carlos T.
Formoso & Glenn Ballard), Federal University of Rio Grande do Sul, Brazil, 87-99.
Ramirez, R. R., Alarcon, L. F. and Knights, P. (2003). Benchmarking Management Practices
in the Construction Industry, Proceedings 11th Annual Conference on Lean Construction,
Virginia Polytechnic Institute and State University, 553-566.
Saurin, T. A., Formoso, C. T., Guimaraes, L. B. M and Soares, A. C. (2002). Safety and
Production: An Integrated Planning and Control Model, Proceedings 10th Conference of
the International Group for Lean Construction (Ed: Carlos T. Formoso & Glenn Ballard),
Federal University of Rio Grande do Sul, Brazil.
Saurin, T. A., Formoso, C. T. and Guimaraes, L. B. M (2001). Integrating Safety into
Production Planning and Control Process: An Exploratory Study, IGLC-9, Professional
Activities Centre, Singapore, 335-348.
Santos, A. and Powell, J. A. (2001). Effectiveness of push and pull learning strategies in
construction management, Journal of Workplace Learning, 13 (2), 47-56.
Scott, S. and Harris, R. (1998). A methodology for generating feedback in the construction
industry, The Learning Organization, 5(3), 121-127.
Thomas, R., Marosszeky, M., Karim, K., Davis, S. and McGeorge, D. (2002). The
Importance of Project Cultures in Achieving Quality Outcomes in Construction,
Proceedings 10th Conference of the International Group for Lean Construction (Ed:
Carlos T. Formoso & Glenn Ballard), Federal University of Rio Grande do Sul, Brazil.
Tilley, P., Wyat, A. and Mohamed, S. (1997). Indicators of Design and Documentation
Deficiency, IGLC-5, International Group for Lean Construction, 137-148.
Trethewy, R., Cross, J., Marosszeky, M. and Gavin, I. (2000). Safety measurement, a
positive approach towards best practice, Journal of Occupational Health and Safety
Aust/NZ, 16(3).
Trethewy, R., Marosszeky, M. and Cross, J. (1999). Practices for improving contractor
management of safety in the Australian construction industry, The Journal of
Occupational Health and Safety, 15 (5), 423-432.