
THE BALANCED SCORECARD IN FACILITIES MANAGEMENT
for internal management and external benchmarking

Paul Coronel and Anne Evans
Directors, Benchmarking PLUS
Melbourne, August 1999

ACKNOWLEDGEMENT

This paper is the result of collaborative work between a number of people and the support of the AAPPA Board. In particular we wish to acknowledge the input of time and intellectual capital from the following heads of facilities management organisations, convened by Brian Fenn, who was the catalyst for the whole exercise. They each played a team leader role in particular aspects of development:
Brian Fenn, Queensland University of Technology
Sam Ragusa, Griffith University
Andrew Frowd, University of Wollongong
Neville Thiele, University of South Australia
David Spedding, Deakin University
Kelvin Crump, Facilities Management Services, TAFE, Queensland

We also acknowledge the input of a number of other facilities managers who participated in a workshop to help identify relevant performance measures for facilities management within a Balanced Scorecard framework.

CONTENTS
Introduction
Some characteristics of top performing organisations
How the Balanced Scorecard fits in
The essential characteristics of the Balanced Scorecard approach
How it can apply to FM in a tertiary education context (for both internal management and benchmarking purposes)
Linkages between organisational levels and functions
Establishing appropriate objectives for FM in the Balanced Scorecard context
Broad structure of a Balanced Scorecard
A sample of the KPIs in a Balanced Scorecard for facilities management
Linkages between KPIs at different levels
Characteristics of KPIs in a Balanced Scorecard: internal versus benchmarking versions
Conclusion
APPENDIX: Detailed format of Balanced Scorecards for facilities management

INTRODUCTION

Some characteristics of top performing organisations

It is clear from personal experience, and from consensus among observers of exceptional organisations, that top performing organisations have a number of aspects in common, including the following:
Customer focus: a clear understanding of their customers' identity and key needs, and an effective means of response at both the strategic and the operational level;
Leadership: particularly in the sense of defining what the organisation must get right, and communicating this clearly throughout the organisation;
Use of information to manage: a coherent performance reporting structure that helps integrate the actions of managers at different levels in pursuit of their key objectives;
A restless quest for improvement: they will do what is necessary to identify best practice and adapt it to fit their own organisation as soon as possible.

HOW THE BALANCED SCORECARD FITS IN

The Balanced Scorecard is an approach to setting up performance measurement structures that help an organisation develop some of the characteristics mentioned above. It was first featured in the Harvard Business Review early in 1992 (Kaplan, R.S. and Norton, D.P., "The Balanced Scorecard: Measures that Drive Performance", Harvard Business Review, January-February 1992). Since then it has been used in many countries and industries as the basis of a top-down reporting structure which knits together the desired strategic perspective of an organisation with its management actions at various levels. A smaller number of organisations have also used it as a framework for performance comparisons (performance benchmarking) with other organisations. It clearly makes sense to use similar measures for external comparisons to those one uses to manage inside the organisation.

The essential characteristics of the Balanced Scorecard approach are as follows:
performance measurement from four perspectives, which ensures the focus is not merely on short term cost / financial performance (see slide 2: the questions posed in each of the four perspectives help give the essence of what is being measured);
a review of the key objectives of the organisation. Existing objectives may need to be restated or modified to gain the desired clarity and balance between the four

perspectives of the Balanced Scorecard approach (our experience has shown that this is necessary more often than not, from both clarity and balance viewpoints);
linking each of these objectives to between one and three key performance measures, which together are used as the scorecard at the top of the organisation (see slide 3: a typical scorecard has between 12 and 24 key performance measures in all; having fewer risks too narrow an outlook, while having more risks confusion and lack of focus);
cascading; that is, connecting the key performance measures at the top level with similar measures at other levels in the organisation. These should be of direct relevance to managers in the various jobs at these other levels. Cascading thus supports the communication which goes with delegation and accountability. In this way the perspective, and consequently the actions, of managers in different parts of the organisation are more co-ordinated and focussed on achievement in accord with a balanced set of key objectives.

HOW IT CAN APPLY TO FACILITIES MANAGEMENT IN A TERTIARY EDUCATION CONTEXT (for both internal management and benchmarking purposes)

Linkages between organisational levels and functions

In my job I see a very wide variety of industry sectors and organisations. Two of the most complex, in terms of services delivered to the end user or customer, are public hospitals and local government organisations. Facilities management organisations in the tertiary education sector rank up there in my view. For this reason we propose a structure we have used elsewhere: a Top Level Balanced Scorecard with supporting Balanced Scorecards for major functions within facilities management (illustrated in slide 4). The four supporting scorecards shown are indicative; the precise number and focus of the supporting scorecards in any single facilities management organisation will depend on the range and magnitude of functions under the FM Manager's jurisdiction. It will also depend to a degree on the way in which that person has divided accountability for the various functions between the managers reporting to him/her.

Establishing appropriate objectives for FM in the Balanced Scorecard context

It goes without saying that facilities management supports the fundamental activities of the University. The objectives for FM should therefore be compatible with those of the parent organisation.

Balanced Scorecard for internal purposes (for use by individual FM Managers to help run their own organisations)

In this case the facilities management objectives, as a total set, will be unique and specific to the circumstances of the particular FM organisation and its parent body. Individual objectives may be the same as, or similar to, those of other FM organisations, but it is highly unlikely that this will be true of the whole set.

Balanced Scorecard for external purposes (for use by a number of FM Managers to benchmark quantitative performance measures and qualitative measures / practices between their organisations)

In this case the facilities management objectives must be generic in order to serve all the participating organisations and have a degree of relevance to each of them. The objectives provide a stimulus for choosing a balanced set of key performance indicators, but their role need go little further.

Broad structure of a Balanced Scorecard

The implications are shown in slides 5 and 6.

Balanced Scorecard for external purposes: for benchmarking purposes the Balanced Scorecard has a set of generic objectives and key performance indicators (KPIs).

Balanced Scorecard for internal purposes: for internal purposes the Balanced Scorecard has a set of objectives specific to the individual FM organisation, and the KPIs are used to measure the level of achievement against those objectives. In addition the Facilities Manager will probably wish to:
define targets for the managers responsible for each KPI;
document strategies and tasks by which the objectives will be achieved;
fix accountabilities among his/her people for each strategy and/or task.

These additional features MAY, but do not have to, be included in the scorecard structure. It is up to the individual manager. (Two of the leading people in our working party, Sam Ragusa and Andrew Frowd, have included some or all of these elements in scorecards for their own organisations. The two approaches differ somewhat from each other, as is to be expected.)

A SAMPLE OF THE KEY PERFORMANCE INDICATORS IN A BALANCED SCORECARD FOR FACILITIES MANAGEMENT

These are set out on the next two pages. Note that this is a sample only; the full version, attached at the end of this paper, is in much greater detail and includes the Top Level scorecard and the four supporting scorecards relating to Maintenance, Capital Works, Cleaning and Security.

Note in the sample that there is a mix of quantitative and qualitative measures. Note also that each of the quantitative measures is stated in specific terms. This is critically important when benchmarking, but also for internal purposes. To highlight what I mean, consider the contrast between a KPI stated as "satisfied customers", "we will satisfy our customers", or something similar, and the KPI as stated in the sample below: "score on customer satisfaction survey". The former is more difficult to report on internally and impossible to use for benchmarking. Unfortunately I have seen many instances of the former in my work over recent years.

In addition, in the full version the qualitative measures are derived from a series of questions with multiple choice answers. Each answer has a different score, so that an overall figure can be calculated; this serves both for benchmarking with others and for simple reporting of year to year trends for internal purposes.
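The scoring of multiple choice answers described above can be sketched as follows. This is an illustration only: the questions, answer choices and the simple linear 0..n scoring are assumptions, not the actual scheme used in the attached scorecards.

```python
# Illustrative sketch only: the questions, answer scales and scoring
# below are hypothetical, not the scheme published in the scorecards.

# Each qualitative question offers multiple-choice answers; each answer
# carries a score equal to its position in the list of choices.
QUESTIONS = {
    "SLA coverage": ["None", "Some", "Most", "All"],
    "Performance measured against SLAs": ["None", "Some", "Most", "All"],
    "Involvement in University planning": ["No input", "Some input", "Properly represented"],
}

def overall_score(answers):
    """Convert a set of multiple-choice answers into a single 0-100 figure."""
    total, maximum = 0, 0
    for question, choices in QUESTIONS.items():
        total += choices.index(answers[question])  # score = position of chosen answer
        maximum += len(choices) - 1                # best possible score per question
    return 100 * total / maximum

print(overall_score({
    "SLA coverage": "Most",
    "Performance measured against SLAs": "Some",
    "Involvement in University planning": "Properly represented",
}))  # 62.5
```

An overall figure of this kind can then be benchmarked between organisations or tracked year to year internally.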

Sample of the KPIs in a Balanced Scorecard for facilities management

Customer Perspective

Quantitative measures:
Score on customer satisfaction survey
Number of complaints per period (per EFT customer staff or per EFTSU)
% compliance with provisions of service level agreements
LTIFR (for customer staff / students)
Incident levels per period (for example security incidents, safety incidents)
Qualitative measures / practices:
Alignment of strategy with the parent organisation's strategy
Practices regarding use of SLAs

Financial Perspective

Quantitative measures:
Top level cost ratios (for example operating cost per EFTSU)
Breakdown of the above by function such as cleaning and security (cost per EFTSU, cost per sq metre)
Asset employed ratio (for example $000 asset value installed per EFTSU)
Budget balance ratio (for example budget variance $ per budget $)

Internal Business Perspective

Quantitative measures:
Lost Time Injury Frequency Rate (for FM staff & contractors)
Management overheads as % of direct service delivery costs (for example capital works and cleaning services)
Project time variance as % of original plan time
Project cost variance as % of original project cost (both relating to capital works)
Qualitative measures / practices:
Asset Management & Maintenance practices

Innovation & Learning Perspective (people, processes, technology)

Quantitative measures:
% budget spent on improved technology
Annual training days per FM staff member
% improvement in customer satisfaction index
% improvement in operating cost per EFTSU
Qualitative measures / practices:
Quality practices & process improvement techniques used
Incentives / rewards employed

Linkages between KPIs at different levels

The sample just given does not show an important element of choosing KPIs: the value of having linkages between KPIs at the top level and the same or similar KPIs at other levels of the facilities management organisation. It is one thing to have a set of detailed measures, but it is another to be able to take the helicopter view whenever necessary and see the linkages between that view and ground level. Decision making can be more strategic and yet maintain a practical monitoring and implementation connection with strategy.

As a simple example, the degree of customer satisfaction with facilities management is an aggregate of the satisfaction with its various elements: maintenance, security and so on. Hence if a Customer Satisfaction Score is derived by means of a customer survey for each of the important elements of facilities management, it can be summarised at the top level as an average or weighted average. Comparisons with other facilities management organisations are facilitated, as are internal management summary reporting, trend analysis, and resource allocation decisions.

Some other KPIs where I believe this approach is worthwhile:
Complaints per EFTSU per period
Operating cost per EFTSU
Trends in each of the above and in Customer Satisfaction Scores
Budget balance ratio
Injury rates
Annual training days per person.
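The roll-up of Customer Satisfaction Scores just described can be sketched as follows. The function names, survey scores and cost-share weights are invented for illustration; the paper does not prescribe particular weights.

```python
# Sketch of the roll-up described above: a top-level Customer Satisfaction
# Score as a weighted average of the subsidiary-scorecard scores.
# All figures below are illustrative assumptions, not data from the paper.

# Survey score (0-100) per FM function, weighted here by share of operating cost.
scores  = {"Maintenance": 72.0, "Cleaning": 81.0, "Security": 65.0}
weights = {"Maintenance": 0.5,  "Cleaning": 0.3,  "Security": 0.2}

def top_level_score(scores, weights):
    """Weighted average of the subsidiary-scorecard satisfaction scores."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights should sum to 1"
    return sum(scores[f] * weights[f] for f in scores)

print(top_level_score(scores, weights))  # 72*0.5 + 81*0.3 + 65*0.2 = 73.3
```

The same pattern supports trend analysis and benchmarking, since each participating organisation reports the same function-level scores.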

Characteristics of KPIs in a Balanced Scorecard: internal versus benchmarking versions

There are some important differences, which I would like to point out, between the design of a set of KPIs for collective benchmarking purposes and one for use by an individual facilities management organisation. These are shown in the table over the page.

KPIs for benchmarking vs KPIs for internal use

Number of KPIs: for collective benchmarking there is no limit to the number used; for an individual organisation there shouldn't be too many.

Relevance: individual benchmarking KPIs don't need to be relevant to all participants; for internal use, all KPIs need to be relevant.

Definition: benchmarking KPIs need absolutely specific definition; internally this is not quite so critical.

Approximation: for benchmarking, reasonable approximations are acceptable; internally, reasonable approximations may be acceptable.

Size differences: benchmarking KPIs must be structured to accommodate size differences between participating organisations (eg number of complaints per EFTSU); this is not needed for internal comparisons (hence one could just use number of complaints).

Targets: in benchmarking, mentioning targets is inappropriate (unless the targets themselves are being benchmarked); internally, setting targets can be beneficial (eg "Reduce complaints by 10%").

CONCLUSION

In this paper I have covered a number of important points in the design and use of the Balanced Scorecard approach, both for collaborative benchmarking between facilities management organisations and for management within an individual organisation.

The detailed scorecards which form the remainder of this paper are the result of considerable collaborative work between us and the individuals acknowledged at the beginning of the paper, most of whom are highly accomplished facilities managers. That said, there is no doubt room for further refinement.

Possible pathways for facilities managers

Benchmarking: the scorecards are set out primarily for benchmarking purposes, and it is hoped that they will provide a platform for an addition or adjunct to the excellent AAPPA

Benchmarking Survey already in use. If so, the performance indicators and qualitative practices questions will provide a template for those who wish to follow this route.

Internal management: even though the scorecards are set out primarily for benchmarking purposes, I trust that, using the points raised in this paper, facilities managers who are so inclined will be able to adapt the structure and tailor it for use in their own organisations. In this case each can select which of the performance indicators and customer satisfaction survey questions suit their current circumstances, and supplement them with others if desired. Before doing so they will wish to make the objectives more specific to the circumstances of their own institution, and they may wish to set targets and assign individual responsibilities for delivering against those targets and indicators.

Combination approach: of course it is possible to do both. In this case some or all of the performance indicators used for internal purposes, and the qualitative practices questions, will also be benchmarked externally, providing a more informative basis for self evaluation, planning and target setting.

I thank you for your attention today.


Slide 1

The Balanced Scorecard in Facilities Management:


for internal and benchmarking purposes

Paul Coronel
Benchmarking PLUS
Wellington, September 1999

Slide 2

The Balanced Scorecard Approach: Performance from Four Perspectives


Financial Perspective: How well do we deliver, financially?

Customer Service Perspective: How are we perceived by our customers?

Internal Services Perspective: What must we excel at?

Innovation & Learning Perspective: Can we continue to improve and create value?


Slide 3

Overview of Structure of Balanced Scorecard


Financial Perspective: Objectives, KPIs

Customer Service Perspective: Objectives, KPIs

Internal Services Perspective: Objectives, KPIs

Innovation & Learning Perspective: Objectives, KPIs

1 to 3 objectives per perspective; 1 to 3 KPIs per objective; hence 12 to 24 KPIs total


Slide 4

University's Objectives

(linkages)

High level FM objectives

(incorporated into / drive)

FM Balanced Scorecards:

Top level scorecard

Maintenance

Cleaning

Security

Capital Works


Slide 5

Structure of High Level Scorecards for Benchmarking

Financial Perspective: Objectives (generic), KPIs

Customer Service Perspective: Objectives (generic), KPIs

Internal Services Perspective: Objectives (generic), KPIs

Innovation & Learning Perspective: Objectives (generic), KPIs


Slide 6

Structure of High Level Scorecards for Internal Management

Financial Perspective: Objectives (specific), KPIs, Strategies

Customer Perspective: Objectives (specific), KPIs, Strategies

Internal Perspective: Objectives (specific), KPIs, Strategies

Innovation/Learning Perspective: Objectives (specific), KPIs, Strategies


Balanced Scorecard for management & benchmarking

Prepared for AAPPA Conference, September 1999

DETAILED FORMAT AND STRUCTURE OF BALANCED SCORECARDS FOR FACILITIES MANAGEMENT

On the succeeding pages the scorecard structure referred to in the presentation paper is set out in detail. There are five scorecards in all:
Top Level scorecard;
scorecard for Cleaning;
scorecard for Security;
scorecard for Maintenance;
scorecard for Capital Works.

Each contains the four perspectives - customer, financial, internal processes, and innovation and learning. Also each contains qualitative measures/practices as well as quantitative measures.

They are set out primarily with benchmarking in mind. However on the last sheet of this detailed layout there is a page of a scorecard shown in a format suitable for internal management purposes. To facilitate comparison between the alternative layouts, the subject matter (objectives and KPIs) shown in it corresponds closely with the subject matter in the first sheet of the scorecard for Maintenance which appears on page 16 of this section.

When commencing the first year of a co-operative benchmarking exercise it is common to reduce the total number of measures to make it easy for a greater number of organisations to participate. Even so, not all participants are usually expected to enter every single item of data required to complete the set. In succeeding years the measures are refined and extended in the light of the developing experience of the participants.

We trust the material shown in this detailed format will provide a basis for those organisations who wish to extend the scope of their benchmarking to get started.


TOP LEVEL SCORECARD

Customer Perspective
(How do our customers see us?)

Objective: Achieve highest possible level of customer satisfaction
Performance Indicator: Customer satisfaction index
Comments / Source of data:


This assumes there is a customer satisfaction survey for each group of services (perhaps during the feedback process with the client regarding performance against SLAs - see Qualitative Measures / Practices).

Services can & should be grouped, eg, Security, Maintenance, Cleaning; so that they form part of a separate identifiable survey and can be used in the respective scorecard.

Each survey should be simple, capable of being scored. (See the example in the paper given in the breakfast session).

To calculate the Customer satisfaction index shown in this (top level) scorecard, an average can be calculated from the score for each survey. In this way it would be linked to all subsidiary scorecards.

This would be very useful for both internal management and external benchmarking.

Performance Indicator: Number of complaints per EFTSU attending
Comments: Linked to all relevant subsidiary scorecards.

Performance Indicator: Capital Works time & budget performance index
Comments: A summary of some Capital Works Scorecard KPIs.

Objective: Achieve alignment with the University's direction and with our customers' needs.
Performance Indicator / Comments: See Qualitative Measures / Practices in this section. A numerical rating of answers against these questions is possible for benchmarking purposes. They are also good self assessment questions for internal management purposes.


Customer Perspective (Continued)

Qualitative Measures / Practices


(Please circle the answer which most applies)

Planning (representation and alignment)

To what extent do you believe that you have identified the key University planning forums?
Not at all / we have identified some / we have identified most of them / we know all of them

On what proportion of the relevant planning forums is FM invited to sit at the planning table?
None / some / most / all of them

How would you judge the level of involvement of FM in the development of the University's strategic plan?
No input / some input or involvement but should be asked for more / the FM function is properly represented and listened to

Please comment on the nature of the involvement


Mostly informal / Mostly formal - we are asked for specific information at certain steps in the process / Fairly even mix of both

Does the FM function have a long term / strategic plan, ie 3 years or more?
No / such a plan has been established for less than 5 years / for 5 years or more

How would you rate the degree of alignment between the FM strategic plan and the Universitys strategic plan?
(Eg: key objectives / strategies in the University's strategic plan are analysed for their implications for FM; the University's projections and analysis of trends are incorporated in medium to long term FM planning.) ANSWER: Low / medium / high degree of alignment

Do you receive feedback from senior University management (DVCs and/or PVCs) on FM's performance in regard to the above aspects?
No input.
Level of involvement: should be increased / decreased / about right.
Level of alignment: should be increased / about right.


Customer Perspective (Continued)


Planning (customer input)

Do you have a formal process to gain an understanding of the various customers' physical resource and service requirements?  Yes / No
Are service delivery options developed to address each customer group's requirements?  Yes / No
Are these service delivery options discussed with each customer group?  Yes / No
Are Service Charters / SLAs developed or modified as a result?  Yes / No
In the case of capital works, are capital works plans modified as a result?  Yes / No
Is agreement with the customer on the most appropriate delivery option and service level generally achieved?  Yes / No

Service Level Agreements / Service Charters

We have Service Charters / SLAs with our client groups


For none of our client groups / Some / Most / All

We measure our performance at least annually against the SLA


For none of the SLAs that are established / Some / Most / All

Our SLAs are set after some form of consultation with the client regarding their key needs
For none of the SLAs that are established / Some / Most / All

When we measure our performance against the SLAs, we use the client's rating.
For none of the SLAs that are established / Some / Most / All

Set up appropriate links to the relevant subsidiary scorecards eg, Cleaning, Security, Maintenance, Grounds


Financial Perspective
(How do we look to our financial stakeholders ?)
Objective: Obtain value for money and manage our budget

Performance Indicator: Annual net operating cost ($A) per EFTSU attending
Comments / Source of data: Net means any revenue (eg hiring of facilities) is deducted from operating costs. Link to subsidiary scorecards.

Performance Indicator: Assets employed ($A000) per EFTSU attending
Comments: High level measure of the intensity of asset utilisation.

Performance Indicator: Budget over-run as % of budget (operating)
Comments: Ability to manage the budget. Link to subsidiary scorecards (except Capital Works?).

Performance Indicator: Budget over-run as % of budget (capital)
Comments: Ditto. Link to Capital Works scorecard.

Objective: Obtain adequate funding for effective facilities management
Performance Indicator: A composite of Maintenance Index plus funding per unit for other services
Comments: Link to subsidiary scorecards - eg Maintenance Index of 1% to 1.5% of ARV, Cleaning cost per sq m or per EFTSU, Security cost per EFTSU attending.
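The cost and budget indicators in this perspective are simple ratios. The sketch below shows the arithmetic; all dollar figures and the EFTSU count are invented for illustration.

```python
# Minimal sketch of the financial KPIs above; the figures are illustrative
# assumptions, not data from any participating institution.

def net_operating_cost_per_eftsu(operating_cost, revenue, eftsu):
    """Net cost per EFTSU: revenue (eg hiring of facilities) is deducted first."""
    return (operating_cost - revenue) / eftsu

def budget_overrun_pct(actual, budget):
    """Budget over-run expressed as a percentage of budget."""
    return 100 * (actual - budget) / budget

# Example: $12.4m operating cost, $0.4m facilities-hire revenue, 15,000 EFTSU,
# against a $12.0m budget.
print(net_operating_cost_per_eftsu(12_400_000, 400_000, 15_000))  # 800.0
print(budget_overrun_pct(12_400_000, 12_000_000))
```

Stating each ratio this precisely (what is netted off, which denominator) is what makes the figures comparable between institutions.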

Qualitative Measures / Practices


Management information for FM

Hardly any manager believes they have a totally effective management information system. With this in mind can you please comment:

Within reason, our management information is:
Reliable / not reliable
Easily accessible / not
Up to date / takes a long time to arrive
Accurate enough for the purpose / not

The main problem is:
Within the FM function / not
Other departments / not
General / specific systems (please comment below)

Comments: (eg, the specific systems of most concern provide information on the level of backlog maintenance)


Internal Process Perspective


(What must we excel at?)
Objective: Effective Asset Management
Performance Indicator: Net annual value of commercial opportunities realised
Comments / Source of data:

Defined as revenue from usage of the Institution's assets by outside parties, less direct costs relating to such usage. Revenue includes the commercial value of free or discounted usage to charitable or other community groups.

Performance Indicator: Rating on Asset Management (self assessment)
Comments: See Maintenance & Capital Works Scorecard. The overall score would appear here.

Objective: Anticipate & adopt appropriate technology for FM
Performance Indicator: Rating on technology (self assessment)
Comments: See Qualitative Measures / Practices below.

Qualitative Measures / Practices


Technology

From your own knowledge, how appropriate is the facilities management technology that is used in your area of the University? Please consider and score each of the aspects below:
- cost: 0 1 2 3 4
- reliability: 0 1 2 3 4
- fitness for my purpose: 0 1 2 3 4
- keeps me competitive in my field: 0 1 2 3 4

(0 = I don't know / can't give a knowledgeable answer; 1 = totally inappropriate; 4 = totally appropriate)

Comments:


Innovation and Learning Perspective


(Can we continue to improve and create value?)
Objective: Improve key elements of Facilities Management
Performance Indicators:
Trend in Facilities Condition Index
Trend in Customer Satisfaction Index
Trend in total annual Operating Cost per EFTSU
Comments / Source of data:

All link to subsidiary scorecards:
- Operating Cost to Operating Cost elements, eg Security
- FCI to Maintenance or Capital
- Customer Satisfaction to each individual scorecard.

(In each of the KPIs proposed, trend means % movement between each of last 3 annual figures)
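The trend definition above (% movement between each of the last three annual figures) can be sketched as follows; the cost series is invented for illustration.

```python
# Sketch of the "trend" KPI defined above: year-on-year % movement
# between each of the last three annual figures. The series is invented.

def trend(annual_figures):
    """Return the % movement between each of the last three annual figures."""
    last3 = annual_figures[-3:]
    return [100 * (b - a) / a for a, b in zip(last3, last3[1:])]

# Example: four years of operating cost per EFTSU; only the last three count.
print(trend([1040.0, 1000.0, 950.0, 912.0]))  # [-5.0, -4.0]
```

Two negative movements here would read as a favourable trend for a cost KPI, and as an unfavourable one for a satisfaction index.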

Objective: Develop our people
Performance Indicators:
Annual training days per EFT facilities management person
Annual training cost per EFT facilities management person
Comments: Links to subsidiary scorecards. Source of data: Financial System; Staff Development Records.

Qualitative Measures / Practices


Knowledge and Skills

Have you identified the knowledge and skills required to optimise the contribution of FM to the University over the next several years?
No / Yes, informally / Yes, a formal analysis has been undertaken in all areas

Have the gaps between existing and required knowledge and skills been identified?
No / Yes, informally / Yes, a formal training needs analysis has been undertaken for all areas

Have you a plan for filling these gaps?


No / Yes, an informal plan / Yes, a formal plan has been developed and is being implemented


SCORECARD FOR CLEANING

Customer Perspective

Objective: Achieve highest possible level of customer satisfaction
Performance Indicator: Number of complaints per EFTSU attending


Comments / Source of data: With scores for this measure from other scorecards, summarises up to the Top Level Scorecard.

Performance Indicator: Customer satisfaction index

Summarises up to the Customer Satisfaction Index in the Top Level Scorecard, along with customer satisfaction scores from other scorecards.

The survey should be simple and capable of being scored. See Qualitative Measures / Practices below for example.

Performance Indicator: Percentage compliance with Service Level Agreements

SLAs either as part of contract specification or internal agreement with customers. Score provided by means of regular checks of service provided by Cleaning Supervisor.

See also Qualitative Measures / Practices below.

Objective: Be environmentally responsible

Performance Indicators: Waste volume per sq m; Waste volume per EFTSU

Waste is garbage which requires general (not special) handling from buildings, and bins along paths, etc (not landscaping waste).


Customer Perspective (Continued)

Qualitative Measures / Practices


Customer Satisfaction
Ratings: 1 = not acceptable; 2 = unsatisfactory; 3 = satisfactory; 4 = good; 5 = excellent
Sensitivity and understanding of customer needs
Competence and expertise displayed with the advice or service provided
Reliability of service provided
Timeliness / speed of response to service requests
Efforts made to solve problems and follow through
Feedback provided to customers on services delivered

Service Level Agreements
We have Service Charters / SLAs with our clients: None / Some / Most / All
We measure our performance at least annually against the SLA: For none of the SLAs that are established / Some / Most / All
Our SLAs are set after some form of consultation with the client regarding their key needs: For none of the SLAs that are established / Some / Most / All
We use the client's own rating of our performance at least annually against the SLAs: For none of the SLAs that are established / Some / Most / All
Regular checks of service provided against the provisions of the SLA are conducted by the Cleaning Supervisor or equivalent position: For none of the SLAs that are established / Some / Most / All

Waste Volume
We measure the volume of garbage which requires general (not special) handling from buildings, and bins along paths, etc (ie not landscaping waste): Whether measured / frequency / used for change process.


Financial Perspective


Performance Indicators: Annual net operating cost per EFTSU attending; ditto per square metre
Comments / Source of data:

Prepared for AAPPA Conference, September 1999

Net means any revenue is deducted from operating costs. Cost means the total all up including for example contracts, staff up to head of FM group, materials and consumables, waste removal, window cleaning, pest control, hazardous waste, recycling, sanitary bin service, grease trap service, landfill charges, cleaning of curtains and furnishings.

Budget over-run as % of budget (operating)

Measures the ability to manage the budget.

(With scores for these measures from other scorecards, summarises up to the Top Level Scorecard).
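The two financial KPIs above are simple ratios. A minimal sketch, not part of the paper and using invented figures:

```python
def net_operating_cost_per_eftsu(operating_cost, revenue, eftsu):
    """Net cost (total operating cost less any revenue) per EFTSU attending."""
    return (operating_cost - revenue) / eftsu

def budget_overrun_pct(actual, budget):
    """Budget over-run as a percentage of the operating budget."""
    return (actual - budget) / budget * 100

# Invented example figures for a cleaning function:
print(net_operating_cost_per_eftsu(1_200_000, 50_000, 10_000))  # 115.0
print(round(budget_overrun_pct(actual=1_200_000, budget=1_150_000), 2))  # 4.35
```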

Internal Process Perspective


Objective: Work safely
Performance Indicator: LTIFR (for cleaning staff)
Comments / Source of data: Industry standard, from OH&S data.

Objective: Minimise overheads
Performance Indicator: Ratio of cost of cleaning management to total cleaning cost
Comments: Cleaning management costs include staff and overhead costs for cleaning supervision, as well as a share of corporate staff and overheads up to head of FM function. This also applies if cleaning is contracted out.

Performance Indicator: Ratio of materials cost to total cleaning cost
Comments: Measures materials control and, to some degree, environmental responsibility.

Objective: Be environmentally responsible
Comments: See above and Customer Perspective.
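As a hedged sketch of these indicators (figures invented; LTIFR is conventionally expressed as lost time injuries per million hours worked, which should be confirmed against local OH&S practice):

```python
def ltifr(lost_time_injuries, hours_worked):
    """Lost Time Injury Frequency Rate: injuries per million hours worked
    (the usual industry-standard form; an assumption, not stated in the paper)."""
    return lost_time_injuries * 1_000_000 / hours_worked

def cost_ratio(component_cost, total_cleaning_cost):
    """Share of total cleaning cost taken by one component (0..1)."""
    return component_cost / total_cleaning_cost

print(ltifr(3, 600_000))            # 5.0
print(cost_ratio(60_000, 500_000))  # 0.12 (management share)
```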


Innovation and Learning Perspective
Objective: Improve key elements of Facilities Management
Performance Indicators: Trend in Customer Satisfaction Index; trend in LTIFR; trend in total operating cost per EFTSU (also per EFT and per square metre)
Comments / Source of data: Perhaps not all three cost ratios, just the per-EFTSU ratio. With scores for these measures from other scorecards, the cost and customer satisfaction measures summarise up to the Top Level Scorecard. In each of the KPIs proposed, trend means the % movement between each of the last three annual figures.
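The trend definition just given can be sketched as a small hypothetical helper (not from the paper):

```python
def trend(last_three_annual_figures):
    """% movement between each of the last three annual figures."""
    a, b, c = last_three_annual_figures
    return [(b - a) / a * 100, (c - b) / b * 100]

# e.g. cost per EFTSU over three years (invented figures)
print(trend([100.0, 110.0, 99.0]))
```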

Qualitative Measures / Practices


Benchmarking of Cleaning Standard - cleaning practice regularity
This simple rating system might need refining/expanding but could well be a start.

Level of service / frequency:

ITEM                        | 3 Star       | 4 Star      | 5 Star            | Other (state frequency)
Halls Common Areas vacuumed | weekly       | weekly      | daily             |
Offices vacuumed            | monthly      | fortnightly | weekly            |
Toilets cleaned             | daily        | daily       | daily             |
High Use Toilets cleaned    | daily        | twice daily | three times daily |
Internal Bins emptied       | twice weekly | daily       | daily             |
External Bins emptied       | twice weekly | daily       | daily             |
Cob webbing internal        | 12 monthly   | 3 monthly   | monthly           |
Windows cleaned - Internal  |              |             |                   |
Windows cleaned - External  |              |             |                   |


SCORECARD FOR SECURITY

Customer Perspective
Objective: Achieve highest possible level of customer satisfaction (see Qualitative Measures / Practices on the next page)
Performance Indicator 1: Satisfaction score on response to a simple survey
Comments / Source of data: Survey of a small representative sample of students and staff. Summarises to the Customer Satisfaction Index in the Top Level Scorecard, along with customer satisfaction scores from other scorecards.

Performance Indicator 2: Number of complaints per EFTSU attending
Comments: Summarises to the Top Level Scorecard, along with scores on this measure from other scorecards.

Objective: Make an impact on incident levels
Performance Indicator 3: Relative incident levels
Comments: Number of incidents per 1000 EFTSU on campus, divided by the number of equivalent incidents (eg theft) per 1000 population in the suburbs in the immediate environment of the campus. Source of data: Security System; Security Log Book; University Insurance Records; Finance Systems.
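KPI 3 is a ratio of two rates. A sketch with invented figures:

```python
def relative_incident_level(campus_incidents, eftsu,
                            suburb_incidents, suburb_population):
    """Campus incidents per 1000 EFTSU divided by equivalent suburb
    incidents per 1000 population (KPI 3 as defined above)."""
    campus_rate = campus_incidents / eftsu * 1000
    suburb_rate = suburb_incidents / suburb_population * 1000
    return campus_rate / suburb_rate

# Invented figures: 40 campus thefts among 10,000 EFTSU vs 250 thefts
# among 50,000 residents in the surrounding suburbs.
print(relative_incident_level(40, 10_000, 250, 50_000))  # 0.8
```

A value below 1 suggests the campus experiences fewer incidents, pro rata, than its surrounding suburbs.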

Qualitative Measures / Practices


Customer Satisfaction / perceptions
Rating scale: 1 = inadequate; 2 = satisfactory; 3 = more than satisfactory

Please give us your perceptions / experience with the following:
- Your personal safety/security within your workplace, eg classroom, laboratory, office: 1 / 2 / 3
- Your personal safety/security moving about the campus (walking along pathways; driving or cycling on campus roadways; using stairways, elevators, toilets, public areas): 1 / 2 / 3
- Your personal safety/security during evenings and weekends on campus: 1 / 2 / 3
- Security of personal belongings, eg money/valuables, books, vehicles, etc: 1 / 2 / 3
- Have you experienced a personal safety/security incident this year? Yes / No
- If yes, was the incident reported to the security service? Yes / No
- If yes, please give us your perception of the response by the security service: 1 / 2 / 3


Financial Perspective


Objective: Effective cost management
Performance Indicator 1: Annual operating cost per EFTSU attending
Comments / Source of data: Cost means the total, all up, including for example contracts and staff up to head of FM group. For benchmarking purposes, categorised into the following (respondents complete the section, or sections in the case of multiple campuses, which apply to them):
  Metropolitan Campus - 7 day, 24 hour service $ ; other span of hours $
  City Campus - 7 day, 24 hour service $ ; other span of hours $

Performance Indicator 2: Budget over-run as % of budget (operating)
Performance Indicator 3: Security cost per incident
Comments: The scores for the first two measures are summarised in the Top Level Scorecard, along with their equivalents from other scorecards. KPI 3 is intended to provide a rough judgement of the degree of security problem on campus, balanced against the resources devoted to it. Source of data: Security Log Book and the Finance System.
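KPI 3 is a single division; as a trivial sketch with invented figures:

```python
def security_cost_per_incident(annual_operating_cost, recorded_incidents):
    """KPI 3: a rough balance of resources devoted against the incident load."""
    return annual_operating_cost / recorded_incidents

print(security_cost_per_incident(300_000, 120))  # 2500.0
```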


Internal Process Perspective


Objective: Work safely
Performance Indicator 1: LTIFR (for security staff)
Comments / Source of data: Industry standard, from OH&S data.

Objective: Monitor intensity of security effort
Performance Indicator 2: Annual number of security/safety escorts per EFTSU attending
Performance Indicator 3: Annual number of recorded calls/requests to the security service for assistance, per EFTSU attending
Performance Indicator 4: Annual person hours of security service provided per EFTSU attending
Comments: Data from security log books.

Objective: Monitor the balance between various security services
Performance Indicator 5: Percentage of total annual person hours and operating costs applied to:
- Crime / incident investigation
- Securing/lockup of buildings and facilities
- Surveillance/mobile and foot patrols
- Attendance to personal safety requests
- Car parking control and regulation
- Escort duty
- Response to out-of-hours general inquiries
Comments: Data from analysis of security log books and apportionment of operating costs.

Objective: Monitor the mix of security incidents
Performance Indicator 6: Number of safety/security incidents (per 1000 EFTSU) on campus, by type of incident:
- Theft of personal property
- Theft of University property
- Damage to personal property
- Damage to University property
- Injury/assault to student/staff/visitor
- Harassment of student/staff/visitor
Comments: Data from analysis of the Security System, Security Log Book, University Insurance Records and Finance Systems. The data shown in the three rows above can be used to monitor the level and balance in the mix of security services, to maintain efficiency and effectiveness.


Innovation and Learning Perspective
Objective: Develop our people
Performance Indicator 1: Annual training days per EFT security person
Performance Indicator 2: Annual training cost per EFT security person
Comments / Source of data: Financial System; Staff Development Records.

Objective: Develop our services
Performance Indicator 3: % of annual total security budget committed to improvement of services or installation of new services
Comments: Data from annual budget.

Objective: Invest in appropriate technology
Performance Indicator 4: Cost per EFTSU attending of access control and other electronic security services
Comments: Data from annual budget.


SCORECARD FOR MAINTENANCE

Customer Perspective
Objective 1: Provide a safe environment

1.1 Reduce the backlog of safety-related maintenance
Performance Indicator: An improvement in the Safety Condition Index (SCI), defined as
    SCI = (Reduction in value of safety-related maintenance backlog) / (Total value of safety-related maintenance backlog at beginning of period)
Source of data: Maintenance management system; Finance system.

1.2 Reduce lost time injuries
Performance Indicator: Lost time hours of all staff divided by total staff hours available
Comments: Academic and general staff. Staff availability assumed to be 46 weeks. Source of data: as above, plus workers compensation records.

1.3 Reduce WH&S incidents
Performance Indicator: Reduction in WH&S incidents, defined as
    (No of incidents in last period - No of incidents in present period) / (No of incidents in last period)
Source of data: as above.
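Both reduction-style indicators above share the same shape. A sketch with invented figures; the SCI form is a reconstructed reading of the definition in the text:

```python
def safety_condition_index(backlog_value_start, backlog_value_end):
    """Fraction of the safety-related backlog cleared over the period
    (reconstructed reading of the SCI definition above)."""
    return (backlog_value_start - backlog_value_end) / backlog_value_start

def whs_incident_reduction(incidents_last_period, incidents_present_period):
    """Fractional reduction in WH&S incidents between periods."""
    return (incidents_last_period - incidents_present_period) / incidents_last_period

print(safety_condition_index(500_000, 400_000))  # 0.2
print(whs_incident_reduction(50, 40))            # 0.2
```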

Objective 2: Provide reliable services

2.1 Rapid response to breakdowns
Performance Indicator: % of response times achieved within published targets for various categories
2.2 Reduce complaints
Performance Indicator: Number of complaints received per semester, per EFTSU
Comments / Source of data for 2.1 and 2.2: Maintenance system / help desk data / quality system. For benchmarking purposes, the categories and targets would need to be defined and commonly accepted. Otherwise a measure such as average response time between call and return to service would need to be used.

Objective: Provide an aesthetically pleasing environment
Performance Indicator: Score on customer satisfaction survey
Comments: Annual survey conducted at each site. The survey should be site-specific to take account of local issues; some questions may be common across the system. The scoring system should be common, eg 4 = good, 5 = excellent on a 5-point scale.

Objective: Provide equity of service
Performance Indicator: Maintenance $ as % of ARV, by building or campus, measured (eventually) as a rolling 5-year average:
    Maintenance Index = (Expenditure on Maintenance) / (Asset Replacement Value of Asset Category)
Comments: A Facility Condition Index should also be established and measured simultaneously, to indicate the effectiveness of maintenance. Hence the need to establish a condition-based maintenance plan; see Qualitative Measures / Practices in the Internal Process Perspective.



Financial Perspective
Objective 5: Optimise the maintenance dollar

5.1 Gain adequate maintenance funding
Performance Indicator: Maintenance Index = (Maintenance Expenditure / Asset Replacement Value) x 100
Comments: It is suggested that the Maintenance Index should be between 1 and 1.5% (Source: APPA/AAPPA research 1980-1998; Dr Frank Bromilow (CSIRO); NCRB; FMA). The Maintenance Index should be assessed in conjunction with movement in the FCI.

5.2 Maintain an adequate value for the Facility Condition Index
Performance Indicator: FCI = 1 - (Total Backlog Maintenance / Institution ARV)
Comments: APPA/AAPPA research 1980-1998 concludes that an FCI of 0.9 or above is an indicator of a manageable backlog.

Source of data for 5.1 and 5.2: Maintenance management system and asset registers.
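The two funding-adequacy indicators can be sketched directly from their definitions (all dollar figures invented):

```python
def maintenance_index(maintenance_expenditure, asset_replacement_value):
    """Maintenance expenditure as a percentage of asset replacement value;
    a value between 1 and 1.5 (%) is the suggested range."""
    return maintenance_expenditure / asset_replacement_value * 100

def facility_condition_index(total_backlog, institution_arv):
    """FCI = 1 - backlog/ARV; 0.9 or above suggests a manageable backlog."""
    return 1 - total_backlog / institution_arv

mi = maintenance_index(3_000_000, 250_000_000)           # about 1.2
fci = facility_condition_index(20_000_000, 250_000_000)  # about 0.92
print(1.0 <= mi <= 1.5, fci >= 0.9)  # True True
```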


Internal Process Perspective


Objective: Adequate maintenance planning
Performance Indicator: Corrective maintenance expenditure divided by total maintenance expenditure
Source of data: Maintenance management system and asset registers.

Objective: Adequate maintenance systems / practices
Performance Indicator: See Qualitative Measures / Practices below
Comments: Completion of answers to the questions, and comparison of the scores achieved, provides a basis for benchmarking. A score would need to be set for each possible answer (eg for the first question below: Annually = 4; every 2 years = 3; every 3 years = 2; and so on).
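The answer-to-score mapping just described might be sketched as follows; the score values beyond the worked example in the text are invented:

```python
# Scores for the first practice question; values past "every 3 years"
# are assumptions, not from the paper.
FREQUENCY_SCORES = {
    "annually": 4,
    "every 2 years": 3,
    "every 3 years": 2,
    "less frequently": 1,
    "never": 0,
}

def practice_score(answers):
    """Total score for a respondent's answers to the practice questions."""
    return sum(FREQUENCY_SCORES[a.lower()] for a in answers)

print(practice_score(["Annually", "Every 2 years"]))  # 7
```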

Qualitative Measures / Practices


Asset Management - Facilities Condition Management

- A facilities condition assessment is conducted: Never / annually / every 2 years / every 3 years / less frequently
- Part of the assessment is some form of physical inspection: Yes / No
- A Facilities Condition Index is calculated: Annually / every 2 years / every 3 years / less frequently
- A prioritised list of work requirements is developed: No / for some assets / for most assets / totally comprehensive
- The prioritised list of work requirements is a key to determining backlog maintenance works: Yes / No
- The prioritised list of work requirements helps to determine minor capital works projects: Yes / No

For backlog maintenance:
- A cost estimate is derived for each item/project in the work requirements: No / for some items / for most items / totally comprehensive
- A recommended year of action is also identified: No / for some items / for most items / all items
- This results in a costed plan for infrastructure sustainability: No / for 1 year ahead / for 2 years / 3 years / longer term
- This plan has been endorsed by the leadership group at my University / Institute / establishment: Yes / No



Internal Process Perspective (continued)


Asset Management - Maintenance Routines and Decisions

- A maintenance management system is in place: For planned maintenance / corrective / backlog / deferred
- Routines are documented and delivered by the maintenance management system: Yes / No
- CM, PM and B&DA are accounted for separately: Yes / No
- A planned maintenance system is in place: None / some / most / for all systems
- There are documented procedures for evaluating outsourcing decisions: Yes / No

Innovation and Learning Perspective


Objective: Improve key elements of Facilities Management
Performance Indicators: Trend in total operating cost per EFTSU; trend in Facilities Condition Index; trend in Customer Satisfaction Index; trend in lost time injuries (see Performance Indicator 1.2 in the Customer Perspective)
Comments: These don't all necessarily have to be moving in the right direction for the result to be good. For example, rapidly recovering a poor FCI may impact severely on operating cost per EFTSU. (In each of the KPIs proposed, trend means the % movement between each of the last three annual figures.)

Objective: Improve skills of workforce
Performance Indicator: Annual training days per EFT maintenance person


SCORECARD FOR CAPITAL WORKS

Customer Perspective
Objective: Achieve highest possible level of customer satisfaction
Performance Indicator: Customer satisfaction index (see indicative survey questions below)
Comments / Source of data: This assumes there is a customer satisfaction survey for each project and that a client (a spokesperson for key users) has been identified. The survey should be simple and capable of being scored. The score for each survey (calculated as a % of the maximum possible score) can be summarised, along with other customer satisfaction scores, to an average score in the Top Level Scorecard for Facilities Management as a whole.

Objective: On-time completion
Performance Indicator: Average time overrun as % of planned project duration
Comments: A weighted average figure is calculated to take into account the different sizes of individual projects. Source of data: project budgets and schedules.
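One plausible reading of the weighted average, weighting by planned duration so that large projects dominate, is sketched below; the weighting basis and all figures are assumptions:

```python
def weighted_avg_overrun_pct(projects):
    """projects: (planned_duration, actual_duration) pairs, any time unit.
    Weighted by planned duration (an assumed weighting basis)."""
    total_planned = sum(planned for planned, _ in projects)
    total_overrun = sum(max(actual - planned, 0) for planned, actual in projects)
    return total_overrun / total_planned * 100

# Invented figures: a 10-week job 2 weeks late, a 40-week job 4 weeks late.
print(weighted_avg_overrun_pct([(10, 12), (40, 44)]))  # 12.0
```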

Customer Satisfaction Survey

- We were consulted at the planning stage about our requirements: Not at all / yes, but only partially satisfactory / satisfactory
- We were given realistic expectations of what the completed project would provide: Not at all / to some degree / yes we were
- We were kept adequately informed during the project: Not at all / to some degree / to an appropriate degree
- The completed structure has proven satisfactory: Not at all / in part / yes
- Any associated systems contain appropriate technology: None / some / most / for all systems / don't know
- Operating costs so far have proven to be within our expectations: Yes / No
- Remedial work has been unnecessary or of a very minor nature: Yes / No


Financial Perspective


Objective: Manage our budget effectively
Performance Indicators: Budget over-run as % of budget (capital budget); budget over-run as % of budget (operating budget)
Comments: Measure the ability to manage the budget; both are reported to the Top Level Scorecard.

Objective: Projects competitively tendered
Performance Indicator: % of capital works, by value, competitively tendered
Comments: Assesses the keenness to check costs and approaches against competitors.

Internal Process Perspective


Objective: Effective capital/asset management

Objective: Minimise management overheads
Performance Indicator: % management costs to capital works value, ie the % which the capital works group annual operating budget represents of the capital works completed over the same period

Objective: Optimise project cost performance
Performance Indicator: Budget over-run as % of budget (capital budget), separated into:
- sum of projects completed by in-house people
- sum of projects completed by contractors
Comments: Assesses some of the relative merits of in-house projects and placing projects with contractors.

Objective: Monitor portfolio for economies of scale
Performance Indicator: Average % management costs to capital works value, by size strata of projects:
- all projects less than $25,000
- all projects > $25,000 up to $100,000
- all projects > $100,000 up to $1,000,000
- all projects > $1,000,000
Comments: Highlights needs and opportunities to review and categorise project management practices for different-sized projects.
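The size-strata analysis can be sketched as below. The band labels and boundary handling at exactly $25,000 or $100,000 are assumptions inferred from the strata listed, and all project figures are invented:

```python
def stratum(value):
    """Assign a project to a size band (boundaries are assumed)."""
    if value < 25_000:
        return "< $25k"
    if value <= 100_000:
        return "$25k-$100k"
    if value <= 1_000_000:
        return "$100k-$1M"
    return "> $1M"

def mgmt_cost_pct_by_stratum(projects):
    """projects: (capital_value, management_cost) pairs.
    Returns average management cost as % of capital works value per band."""
    totals = {}
    for value, mgmt in projects:
        band = totals.setdefault(stratum(value), [0.0, 0.0])
        band[0] += mgmt
        band[1] += value
    return {k: m / v * 100 for k, (m, v) in totals.items()}

jobs = [(20_000, 3_000), (80_000, 6_400), (500_000, 25_000)]
print(mgmt_cost_pct_by_stratum(jobs))
# {'< $25k': 15.0, '$25k-$100k': 8.0, '$100k-$1M': 5.0}
```

A falling percentage as project size grows would indicate the economies of scale this objective is probing for.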



Innovation and Learning Perspective


Objective: Improve key elements of capital assets management
Performance Indicators: Trend in total facilities assets employed per EFTSU; trend in Customer Satisfaction Index; trend in LTIFR for capital works staff and contractors; trend in % management costs to capital works value
Comments: On a longer-term basis, these measure the degree to which the capital works function is effective in managing the capital assets, even though some of the measures may be indirect. (In each of the KPIs proposed, trend means the % movement between each of the last three annual figures.)

Objective: Improve skills of workforce
Performance Indicator: Annual training days per EFT capital works person


BALANCED SCORECARD FOR INTERNAL MANAGEMENT PURPOSES
MAINTENANCE FUNCTION


This corresponds to the first page of the Customer Perspective of the scorecard shown for the maintenance function on page 16. The key difference is that this layout provides for specific targets, responsibilities and deadlines to be documented on the same page.

Customer Perspective
Columns: Objective | Strategy / Target | KPIs | Actions by whom / by when

1. TO PROVIDE A SAFE ENVIRONMENT
1.1 Target: Reduce the backlog of safety-related maintenance by 20% p.a.
    KPI: Performance is measured by an improvement in the Safety Condition Index, defined as
    SCI = (Reduction in value of safety-related maintenance backlog) / (Total value of safety-related maintenance backlog at beginning of period)
1.2 Target: Reduce lost time injury as a % of total hours by 10%.
    KPI: Lost time hours of all staff (academic and general), divided by total staff hours available (46 weeks).
1.3 Target: Reduce WH&S incidents by 10%.
    KPI: Reduction in WH&S incidents = (No of incidents in last period - No of incidents in present period) / (No of incidents in last period)

2. TO PROVIDE RELIABLE SYSTEMS AND SERVICES
2.1 Target: Ensure all response times are met with a 95% confidence rate.
    KPI: % of response times achieved within published targets for various categories.
2.2 Target: Number of complaints received per semester.
    KPI: Number of complaints received per semester, per EFTSU.

3. TO PROVIDE AN AESTHETICALLY PLEASING ENVIRONMENT
3.1 Target: Perform regular occupancy evaluations and condition audits.
    KPI: Respondents giving a score of 4 or 5 (good or excellent) on a 5-point scale.
