
Asia-Pacific Journal of Operational Research
Vol. 30, No. 5 (2013) 1350013 (16 pages)
© World Scientific Publishing Co. & Operational Research Society of Singapore
DOI: 10.1142/S0217595913500139

FACILITY MANAGEMENT BENCHMARKING: AN APPLICATION
OF DATA ENVELOPMENT ANALYSIS IN HONG KONG

PHILIP Y. L. WONG
Department of Business Administration
Caritas Institute of Higher Education
18 Chui Ling Road, Tseung Kwan O, Hong Kong
pwong@cihe.edu.hk

STEPHEN C. H. LEUNG∗
Department of Management Sciences
City University of Hong Kong
Tat Chee Avenue, Kowloon, Hong Kong
mssleung@cityu.edu.hk

JOHN D. GILLEARD
CoreNet Global, Hong Kong
jgilleard@corenetglobal.org

Received 22 June 2011
Revised 26 March 2013
Accepted 31 March 2013
Published 16 July 2013

This paper proposes Data Envelopment Analysis (DEA) as a data analysis tool that
overcomes two common difficulties in facility management (FM) benchmarking: FM
performance benchmarking analysis is often unsophisticated, relying heavily on simple
statistical representation, and linking hard cost data with soft customer satisfaction
data is often problematic. A case study is presented to show that DEA can provide FM
personnel with an objective view of performance improvements. The case study
investigates the relative efficiency of nine facilities with the same goals and identifies
the most efficient facility; it is limited to nine buildings assessed on four input and nine
output criteria. The paper demonstrates that DEA-generated improvement targets can
be applied when formulating FM outsourcing policies, developing specifications and
planning FM strategy. FM benchmarking with DEA supports continuous improvement
in service efficiency and cost saving, which in turn helps to reduce utility costs and
pollution. The paper fills a gap in FM benchmarking research by applying DEA to both
soft and hard data simultaneously, and it points to future research on the tradeoff
between FM cost, service performance and reliability.

∗Corresponding author

Keywords: Facility management; benchmarking; data envelopment analysis.

1. Introduction
Facility management (FM) is defined as ". . . a profession that encompasses multiple
disciplines to ensure functionality of the built environment by integrating people,
place, process and technology" (IFMA, 2013). The FM profession aims to provide
building occupants with a pleasant and productive environment, under which
commercial occupants can concentrate their resources on their core business and
residential occupants can enjoy their living space. To achieve this objective, facility
managers are required to take a broad view of an organization's activities, drawing
not only on disciplines related to the built environment but also on others, e.g.,
business administration, architecture, and the behavioral and engineering sciences.
Most organizations expect continuous improvement from their FM service
providers to achieve year-on-year cost reductions and enhancements to service qual-
ity. Hence, collecting cost and performance related data such as cleaning, security,
maintenance, and energy is generally a routine activity. In the arena of FM, per-
formance benchmarking may be defined as a continuous and systematic approach
for measuring and comparing the work processes of one organization with those of
another by bringing an external focus to the internal FM activities, functions or
operations. Performance benchmarking may also be characterized as an improve-
ment process whereby organizations learn through measuring and comparing both
quantitative and qualitative aspects of their business.
The measurement of FM performance relies largely on the proper application of
statistical tools. Based on the collected data, histograms and line charts are usually
plotted to indicate trends, identify ups and downs and to compare past performance.
These graphical representations help improve clarity in the decision-making process
and allow for better management presentation. However, they typically assess one
facility with multiple criteria or a range of facilities with one criterion. Rarely is
further analysis carried out. On the other hand, Lingle and Schiemann (1996) argue
that organizations using a balanced performance measurement system as the basis
for effective management do better than those who do not. In addition, the pro-
cess of measuring performance is completely wasted unless the performance data
produced informs management’s action (Neely and Bourne, 2000). Hence, by illus-
trating cause and effect, facility managers are better placed to influence executives
to adopt improvement measures. The objectives of this paper are to provide a
general view of how Data Envelopment Analysis (DEA) can be applied to FM
operation benchmarking, to establish an Input–Process–Output system, and to measure
the overall efficiency of a portfolio of nine facilities by assessing their relative performance.
Traditionally, in most service industries, trend benchmarks are frequently deter-
mined with reference to past data. A variety of methods and statistics, including

standard errors of regression coefficients, t-ratios, R-squared values, the analysis
of variance (ANOVA) and the analysis of residuals, can be applied to evaluate
'goodness of fit'. In the FM sector, long-term time-series data tends to follow a
seasonal or cyclical pattern. Consequently, to better visualize and analyze trend
data, a more sophisticated forecasting method is needed. In addition, linking hard
cost data with soft customer satisfaction data is problematic. In this paper, we
propose to adopt the widely used decision tool DEA, which takes account of the
chosen, relevant factors that affect a unit's performance, to provide a complete and
comprehensive assessment of efficiency.

2. Framework of DEA Model for FM Benchmarking


DEA was first introduced by Charnes et al. (1978). It is a linear programming
based technique usually applied to determine the relative performance of
organizational units where multiple inputs and outputs make comparisons
difficult (Cooper et al., 2000). DEA combines many performance measures into
a single indicator of efficiency and thus helps FM units achieve their improvement
goals. The efficiencies assessed are relative in the sense that they reflect the scope for
resource conservation and output augmentation at one unit relative to other com-
parable units. Conventional single measures ignore the interactions and tradeoffs
among the various measures of performance. In the financial services sector, return
on investment (ROI) or other similar financial ratios are well recognized as good
measures of performance. Operations in manufacturing industries are often standardized,
with the quality and quantity of product outputs per unit of labor and time clearly
described and benchmarking fully adopted. An FM operation may be considered by end-users as
competent when the qualities of FM services provided to them exceed their expec-
tation. However, end-user satisfaction is difficult to measure absolutely, and may
not fully reflect a complete picture of service provision. Hence, facility managers
need to know whether the resources they utilize are productive.
The application of DEA may be illustrated with reference to a simplified ‘Input–
Process–Output’ system (Wong, 2005):

(1) Inputs: material, energy, etc.
(2) Transformation processes that consume these inputs to create or sustain something
of value.
(3) Outputs in the form of final services: customer satisfaction, cleanliness, etc.

This can be conceptualized by the following example. A facility manager studies an
income statement for the past month. He focuses on two elements: a benchmark
against which to compare the actual performance, and a feedback channel that allows
information on the variance to be communicated and acted upon, as illustrated in Fig. 1.
Fig. 1. Cybernetic feedback model (Source: Simmons, 1999).

An output benchmark is a formal representation of performance expectations.
With the preset standards at hand, a facility manager can assess how well inputs
have been transformed into outputs. However, a benchmark in itself is not adequate.
There must be a way of applying the data, i.e., outputs should be compared with
the standards and the resulting variance information used to change the inputs or
processes, thus ensuring that the required performance will be met. Therefore, the
second ingredient is a feedback channel coupled with an understanding of what
adaptations to inputs and processes are likely to improve the results. Simmons (1999) pointed out
that benchmarking is just like watching the speedometer when driving, “We need to
compare the information with highway speed limit sign posts (pre-set benchmark ) to
decide whether we should accelerate or slow down (process adjustment)”. Feedback
information can be used in many ways. For example, the facility manager of a call
centre can use the feedback information about an operator’s superior performance
to learn how others can do their job better.
Single-measure gap analysis is common among facility managers where bench-
marking subjects are typically confined to costs, profits in monetary terms and
other similar benchmarks. However, organizational performance is often evaluated
in terms of more complicated measures. Apart from completeness of comparison and
better consideration of subjects’ interactions and tradeoffs, Camp (1995) pointed
out the advantage of benchmarking by multiple measurements since absolute values
are rarely revealed in the benchmarking report. Camp (1995) continued by noting
that integrating multiple measures in an interactive manner requires benchmarking
techniques that are more sensitive than those normally applied.
As previously mentioned, the basic efficiency measure used in DEA is the ratio
of total outputs to total inputs. In general, inputs can include any resources utilized
by an organization, and the outputs can range from products to performance measures
and cost measures.
The methodology of applying DEA for FM benchmarking is:

(1) To specify the inputs and outputs for each member of the benchmarking
group;
(2) To define efficiency for each member as a weighted sum of outputs divided by
a weighted sum of inputs.


In calculating the numerical value for the efficiency of a particular member,
weights are chosen so as to maximize its efficiency, i.e., the benchmarking group is
put into the best-performance scenario. The common inputs for applying DEA in
FM benchmarking are general/administration expenditure and energy used. The
common outputs include the number of customers and the area/units served. The
efficiency maximization model for FM benchmarking is shown below:
\[
\begin{aligned}
\text{maximize}\quad & \frac{\sum_{r=1}^{s} u_r y_{rp}}{\sum_{i=1}^{m} v_i x_{ip}}\\[4pt]
\text{s.t.}\quad & \frac{\sum_{r=1}^{s} u_r y_{rj}}{\sum_{i=1}^{m} v_i x_{ij}} \le 1, \qquad j = 1, 2, \ldots, n,\\[4pt]
& u_r,\, v_i \ge 0, \qquad r = 1, 2, \ldots, s, \quad i = 1, 2, \ldots, m,
\end{aligned}
\]
where $p$ represents the selected unit among the $n$ units, $s$ is the number of outputs,
$m$ is the number of inputs, $u_r$ and $v_i$ are the variable weights, to be determined in
the solution of the model, for the $r$th output and the $i$th input, respectively, and
$y_{rj}$ and $x_{ij}$ are the known $r$th output measure and $i$th input measure, respectively,
of the $j$th unit.
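Although the paper works with the ratio form above, readers implementing the model may find it useful that this fractional program is commonly converted, via the standard Charnes–Cooper transformation, into an equivalent linear program by fixing the weighted input of the assessed unit at one. The sketch below is that standard reformulation, not a formulation stated in the paper itself:

\[
\begin{aligned}
\text{maximize}\quad & \sum_{r=1}^{s} u_r y_{rp}\\
\text{s.t.}\quad & \sum_{i=1}^{m} v_i x_{ip} = 1,\\
& \sum_{r=1}^{s} u_r y_{rj} - \sum_{i=1}^{m} v_i x_{ij} \le 0, \qquad j = 1, 2, \ldots, n,\\
& u_r,\, v_i \ge 0, \qquad r = 1, 2, \ldots, s, \quad i = 1, 2, \ldots, m.
\end{aligned}
\]

The optimal objective value equals the relative efficiency of the ratio model, and the linear form is what standard LP or spreadsheet solvers actually work with.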
In DEA calculations, the best performing unit is assigned an efficiency score
of unity, or 100%, and the scores of the other units vary between 0% and 100%
relative to the best performance. The following hypothetical example with adjusted
real data illustrates the basics of DEA. The data was extracted from an internal
benchmarking survey by a property investment company in Hong Kong in 2003.
Before illustrating a larger-scale benchmarking case, this paper studies a simple
benchmarking project of a property investment company in Hong Kong which owns
two office buildings, Building A and Building B, in the same district with comparable
services and lessee compositions. A facility manager is given the duty "to present
to the company executives how efficient the FM units of the two buildings are
among their peers". The manager was only able to collect the following data for
benchmarking Buildings A and B against 14 other comparable buildings in the same
district:

(1) Building services (BS) cost per square feet: BS cost includes the costs relating
to the services of electricity, air conditioning, plumbing and drainage, sea water
system (if applicable), fire, vertical transport as well as general cleaning.
(2) Rent per square feet.

Despite the lack of other conventional FM data for benchmarking, the facility
manager is convinced that meaningful information can still be drawn from the
benchmarking study with the use of DEA because:

(1) The BS cost represents a substantial operation cost of the whole building's
facilities.
(2) Though rent is largely determined by demand and supply within its market
segment, it reflects the competitiveness of the quality of FM services, assuming
that the property market is perfect with respect to information and market
competition.
(3) When assessing organizations' efficiency with DEA, financial evaluations are
not necessary; DEA only requires activity information (Homburg, 2001).

Table 1. BS cost and rent per square feet of 16 buildings for the year 2001.

Building    BS cost per square feet (HKD) per year    Rent per square feet (HKD)
A                4.92                                      19
B                6.17                                      18
1                5.42                                      17
2                6.33                                      17
3                6.25                                      20
4                4.58                                      16
5                5.25                                      18
6                4.83                                      19
7                6.33                                      17
8                6.00                                      19
9                5.42                                      18
10               6.25                                      19
11               4.92                                      16
12               5.58                                      18
13               5.75                                      18
14               5.25                                      19

Source: An internal benchmarking survey by a property investment company in
Hong Kong in 2003.

From Table 1, some statements concerning the relative efficiency of the buildings
can be made:

(1) The BS cost of Building A is lower, while the rent per square feet charged is
higher, than that of Building B. Clearly, if the input and output are representative,
Building A's FM unit is more efficient than Building B's.
(2) Building 4 has the lowest BS cost, with Building A among the lowest. These two
buildings may be considered the most productive from this limited aspect. However,
from the same table, it can be noted that the rent per square feet of Building 3 is
the highest among the 16 buildings.

The annual BS cost per square feet and the rent per square feet are plotted
for each building in Fig. 2. Buildings A, 3, 4, 6 and 14 form an "efficiency frontier",
so named because these buildings produce the most output for a given amount of
cost among the observed cases. Buildings close to the frontier are relatively efficient
and those inside the frontier are less efficient. The facility manager of Building B
may either become as efficient as Building A by decreasing its BS cost or, by
increasing the rent charged, become similar to Building 3. These possible
transformations of Building B's FM unit towards the efficient units on the frontier
demonstrate the basic idea of DEA.

Fig. 2. An efficient frontier identifies the benchmarks.

Table 2. Efficiency report for Building B.

                                            Observed measure (HKD)   Benchmark (HKD)   Potential improvement (HKD)
Output: Rent per square feet                         18                    20             2 (increase)
Input: BS cost per square feet per year            6.17                  4.58             1.59 (reduction)
As shown, a facility manager can develop an empirical efficient frontier as a
benchmark based on his own observations, even with limited data. However, DEA
users are always advised to collect more data on representative performance measures
and to incorporate these to refine the model and check for any breakthroughs on
the frontier with updated data. Schaffnit et al. (1997) showed that DEA can deal
with 291 benchmarking participants with five inputs and eight outputs.
In this case, DEA can indicate the exact targets for the inefficient units with
reference to the efficient ones diagrammatically. Facility managers can thus check
improvement progress over time from the diagram. Benchmarks may be given in
terms of inputs or outputs, as illustrated in Table 2.

3. Case Study
In this section, a case of a portfolio of nine buildings is studied to illustrate the
application of DEA with more input and output items. The case study refers to
an internal benchmarking survey undertaken by the FM department of a Hong
Kong development company (the Company). The nine buildings in this study
are located in different districts of Hong Kong. The buildings are used for the
operation and storage of machines and for back-office work. FM data on the nine
buildings for the year 2004 was collected as part of a cost analysis and customer
satisfaction survey. The objective of this case study is to examine the effectiveness
and efficiency of the Company's FM department in the management of the nine
buildings.

3.1. Description of the organization and its FM unit


The FM unit of the Company had previously outsourced all cleaning and security
services of the 14 buildings. The relationships between the Company and the service
providers are mainly 'compliance and control', centered upon the obligations set out
in the contract documents. Key Performance Indicators (KPIs), service levels and
the contract sum were determined mainly with reference to building floor area. FM
contracts were made at the beginning of each financial year and variations during
the contract period were difficult. Good FM planning is crucial to the control of FM
cost. However, the FM budget planning did not take customers' satisfaction into
account. This is not unusual in Hong Kong, where cost efficiency is often viewed
as paramount and takes precedence over customer-related issues. The Company, at
the time of the case study, was primarily driven by the need to reduce cost where
practical.
The buildings operate 24 hours a day. The Company's FM unit monitors the
outsourced service providers' performance, e.g.,

(1) Quality of work,
(2) Responsiveness and timely delivery of services,
(3) Cost control,
(4) Safety performance,
(5) Environmental compliance,
(6) Scheduling and planning of current and future work,
(7) Customers' satisfaction levels.

The FM unit also carries out customer satisfaction surveys on an annual basis.
The December 2004 survey covered overall satisfaction level by perception and
satisfaction levels of six other specific items:

(1) Security comfort level,
(2) Attitude of guard,
(3) Cleanliness of common areas,
(4) Cleanliness of office areas,
(5) Cleanliness of pantry,
(6) Cleanliness of washroom.

Inputs to the DEA model of the FM operation are the costs, measured in HKD, of
cleaning, security and maintenance of the security system, together with energy use
measured in kWh. Outputs
are the number of staff, office area, equipment area and customer satisfaction related
to six FM categories, i.e.,

(1) Cleanliness of common areas,
(2) Cleanliness of office area,
(3) Cleanliness of pantry,
(4) Cleanliness of washroom,
(5) Perceived attitude of guard,
(6) Overall building security.

For illustrative purposes nine buildings have been assessed using DEA with the
original data set for input and output criteria shown in Tables 3 and 4, respectively.
During the surveys, it was made clear to the participants that all satisfaction indices
of different areas were independent.
DEA calculations indicate the best performing unit with an assigned efficiency
score of unity or 100%. The scores of less efficient units vary between 0% and 99.9%
relative to the best performance. Table 5 indicates the DEA assessment for the nine
buildings.
In this analysis an input-oriented scenario was assumed, i.e., outputs are fixed at
a constant level and inputs are minimized to produce the assessed level of output.
Buildings 2, 3, 4, 5, 6 and 7 were assessed to be efficient, whereas buildings 1, 8 and
9 were assessed to be inefficient.
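The paper does not state which software was used for these calculations. As an illustration only, the sketch below sets up the input-oriented CCR envelopment model for the nine buildings in Python, using the data from Tables 3 and 4 and scipy.optimize.linprog; the scores it produces should be broadly comparable with Table 5, although exact figures can differ with the model variant (e.g., constant versus variable returns to scale) and with data rounding.

import numpy as np
from scipy.optimize import linprog

# Inputs from Table 3: cleaning cost (HKD), security-system maintenance fee (HKD),
# energy use (kWh) and security cost (HKD) for Buildings 1-9.
X = np.array([
    [489123, 20202,  9397038, 577678],
    [348415, 14270,  7820009, 341275],
    [370401, 13607,  8916469, 220560],
    [469902, 13299, 14762292, 468557],
    [494904, 30111,  8996130, 503141],
    [338221, 24795,  4229850, 234870],
    [294874,  9524,  7422949, 216810],
    [427122, 58113, 10373816, 735361],
    [533331, 25488, 12133160, 588405],
], dtype=float)

# Outputs from Table 4: equipment area, office area, number of staff,
# and the six customer satisfaction indices.
Y = np.array([
    [ 33655,  58035,  185, 1.0645, 1.1429, 1.1429, 1.0000, 0.8077, 0.6129],
    [149720,  33940,  139, 0.6765, 1.0000, 0.6765, 0.6000, 0.6111, 0.6563],
    [ 31000,  90870, 1022, 0.6286, 0.9000, 0.7826, 0.2131, 0.2203, 0.0001],
    [ 54010, 146055, 1140, 0.6347, 1.0714, 0.7500, 0.4930, 0.2714, 0.0001],
    [ 52950,  82440,  911, 0.9947, 1.1421, 1.3895, 1.0000, 0.7436, 0.6330],
    [ 21800,  22175,  339, 0.9184, 1.3542, 0.9388, 0.6000, 0.5000, 0.4286],
    [ 41720,  65465,  233, 0.4063, 0.6947, 0.7938, 0.5333, 0.5833, 0.4043],
    [ 28225,  54660,  398, 0.6288, 0.8779, 0.8864, 0.7328, 0.6667, 0.4615],
    [ 41175,  99840,  807, 0.9038, 1.2762, 0.7767, 0.2935, 0.0659, 0.1765],
], dtype=float)

n, m = X.shape      # 9 buildings, 4 inputs
s = Y.shape[1]      # 9 outputs

for p in range(n):
    # Input-oriented CCR envelopment model for building p:
    #   minimise theta
    #   subject to  sum_j lambda_j * x_ij <= theta * x_ip  (each input i)
    #               sum_j lambda_j * y_rj >= y_rp          (each output r)
    #               lambda_j >= 0
    # Decision vector: [theta, lambda_1, ..., lambda_n].
    c = np.concatenate([[1.0], np.zeros(n)])
    A_in  = np.hstack([-X[p].reshape(-1, 1), X.T])   # X^T lambda - theta * x_p <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])      # -Y^T lambda <= -y_p
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[p]]),
                  bounds=[(0, None)] * (1 + n),
                  method="highs")
    print(f"Building {p + 1}: input-oriented efficiency = {res.x[0]:.2f}")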
Table 6 shows the input targets found for buildings 1, 8 and 9. The DEA
results help inform the facility manager of how efficiently (or inefficiently) each aspect
of a building performs. For example, the efficiency of Building 1 can be improved
(relative to the efficient buildings) by reducing the cost of cleaning or the security-
system maintenance fee, or by improving energy efficiency.
Table 6 indicates cost-related FM performance benchmarks. Similarly, general
customer satisfaction for the nine buildings may also be assessed (Table 7). The
corresponding DEA results and efficiency targets are shown in Tables 8 and 9.

Table 3. Data for four input criteria.

Unit         Planned cleaning costs   Total maintenance fee      Energy use    Total security cost
             in HKD                   (security sys) in HKD      (kWh)         in HKD
Building 1   $489,123                 $20,202                     9,397,038    $577,678
Building 2   $348,415                 $14,270                     7,820,009    $341,275
Building 3   $370,401                 $13,607                     8,916,469    $220,560
Building 4   $469,902                 $13,299                    14,762,292    $468,557
Building 5   $494,904                 $30,111                     8,996,130    $503,141
Building 6   $338,221                 $24,795                     4,229,850    $234,870
Building 7   $294,874                 $9,524                      7,422,949    $216,810
Building 8   $427,122                 $58,113                    10,373,816    $735,361
Building 9   $533,331                 $25,488                    12,133,160    $588,405

Table 4. Data for nine output criteria.

Unit         Equipment area   Office area     Number     Satisfaction indices^a
             (square feet)    (square feet)   of staff   Security   Guard      Common areas   Office        Pantry        Washroom
                                                                    attitude   cleanliness    cleanliness   cleanliness   cleanliness
Building 1    33,655           58,035            185      1.0645    1.1429     1.1429         1.0000        0.8077        0.6129
Building 2   149,720           33,940            139      0.6765    1.0000     0.6765         0.6000        0.6111        0.6563
Building 3    31,000           90,870          1,022      0.6286    0.9000     0.7826         0.2131        0.2203        0.0001
Building 4    54,010          146,055          1,140      0.6347    1.0714     0.7500         0.4930        0.2714        0.0001
Building 5    52,950           82,440            911      0.9947    1.1421     1.3895         1.0000        0.7436        0.6330
Building 6    21,800           22,175            339      0.9184    1.3542     0.9388         0.6000        0.5000        0.4286
Building 7    41,720           65,465            233      0.4063    0.6947     0.7938         0.5333        0.5833        0.4043
Building 8    28,225           54,660            398      0.6288    0.8779     0.8864         0.7328        0.6667        0.4615
Building 9    41,175           99,840            807      0.9038    1.2762     0.7767         0.2935        0.0659        0.1765

^a The value ranges from 0 to 2.0. The more satisfied the customers are, the higher the value is.

Table 5. DEA calculation results with four inputs and nine outputs.

Unit         Input-oriented efficiency
Building 1   0.79
Building 2   1.00
Building 3   1.00
Building 4   1.00
Building 5   1.00
Building 6   1.00
Building 7   1.00
Building 8   0.78
Building 9   0.83

Table 6. Efficient input targets.

Unit         Total cleaning cost   Total maintenance fee (security system)   Energy use (kWh)   Total security cost
Building 1   −28%                  −21%                                      −21%               −56%
Building 8   −22%                  −80%                                      −22%               −68%
Building 9   −22%                  −39%                                      −17%               −56%

Table 7. General customer satisfaction on FM services.

Unit         Input-oriented efficiency
Building 1   0.841
Building 2   0.818
Building 3   0.791
Building 4   0.667
Building 5   1.284
Building 6   1.021
Building 7   0.604
Building 8   0.788
Building 9   0.578

Table 8. DEA calculation results with four inputs and four outputs.

Unit         Input-oriented efficiency
Building 1   0.73980
Building 2   1.00000
Building 3   1.00000
Building 4   1.00000
Building 5   1.00000
Building 6   1.00000
Building 7   1.00000
Building 8   0.75371
Building 9   0.84200


Table 9. Efficient input targets.

Unit         Total cleaning cost   Total maintenance fee (security system)   Energy use (kWh)   Total security cost
Building 1   −33%                  −26%                                      −26%               −57%
Building 8   −25%                  −80%                                      −25%               −69%
Building 9   −29%                  −52%                                      −16%               −50%

The company typically assesses perceived level of security, attitude of guard,
cleanliness of common areas, cleanliness of office space, cleanliness of pantry, and
cleanliness of the washroom. Each assessment is assumed to be independent, or
uncorrelated.
The DEA output indicates that buildings 1, 8 and 9 are relatively inefficient. Based
on the DEA calculations, the efficiency of the FM units may be improved by
allocating resources better to the items which customers consider more important.
With reference to Pan et al. (2010), there is no clear indication from the DEA
results in Tables 4–9 of whether the inefficiencies are due to pure technical
inefficiency or to scale inefficiency. Further research is suggested.
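One direction for such research, not computed in this study, is the standard decomposition described in DEA texts such as Cooper et al. (2000): run the same data through both the constant-returns-to-scale (CCR) and variable-returns-to-scale (BCC) models and take the ratio of the two scores as scale efficiency,

\[
\text{scale efficiency} = \frac{\theta_{\mathrm{CCR}}}{\theta_{\mathrm{BCC}}}, \qquad 0 < \text{scale efficiency} \le 1,
\]

so that a building with $\theta_{\mathrm{BCC}} = 1$ but $\theta_{\mathrm{CCR}} < 1$ would be judged technically efficient yet operating at an inappropriate scale.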

4. Discussion and Summary


The common index methodologies constructed from the data failed to rank the
performance of the buildings in the correct order, as defined by the occupants’
overall assessments. It is better to assess each of the several aspects separately
rather than rely only on a combined index (Humphreys, 2005). The same applies
to FM.
As pointed out by Pinder and Price (2005) and Wong (2005), DEA can be
applied as a means of screening FM benchmarking datasets and indicating items
to be improved. The results presented in this paper confirm that it can. One of the
advantages of using DEA in FM benchmarking is that it assists facility managers to
develop a clear picture of their facility operation: inputs and outputs. The picture
is important in planning improvement actions. The DEA-generated improvement
targets can be easily acquired with linear programming software and be applied in
formulating outsourcing policies, specifications development and FM planning.
Since the early 1990s, research in FM and benchmarking has stressed the
importance of objective measurement based on hard and soft data (Kincaid, 1994).
DEA is one of the few benchmarking tools which can take account of both hard and
soft data.
With reference to the investigation of feedback-collecting techniques for building
performance by Bordass and Leaman (2005), members of a benchmarking or study
group can readily identify their improvement targets with different feedback-collecting
techniques. DEA enhances the effectiveness of post-occupancy evaluation.


A financial incentive for the FM team could be attached to the achievement of the
performance targets.
To summarize, DEA models compare the efficiency of facilities with similar goals
and objectives. The relative efficiency is calculated as a ratio of weighted outputs to
weighted inputs. Owners of buildings or facilities may consider applying DEA to
improve performance, with reference to the following example using the Excel Solver
function of Microsoft Office:
PROPERTY Inc. owns and leases four Grade A office buildings located in different
locations in Hong Kong. The buildings are of different sizes and under different
management. PROPERTY wishes to determine which buildings operate efficiently.
To calculate the relative efficiency with DEA, PROPERTY needs to compare inputs
to outputs in each building. The inputs may be chosen as cleaning cost, number of
employees and energy cost. The outputs are gross revenue, number of customers and
customers' satisfaction rating. The input and output data are summarized in Table 10.
There are six decision variables:
$X_1, X_2, X_3$ = relative input weights for cleaning cost, number of employees
and energy cost, respectively;
$Y_1, Y_2, Y_3$ = relative output weights for gross revenue, number of customers
and customers' satisfaction rating, respectively.
To calculate the efficiency of Building D, we define the objective function as
\[
\text{Maximize efficiency} = \frac{354Y_1 + 88Y_2 + 7.5Y_3}{152X_1 + 48X_2 + 275X_3},
\]
subject to the condition that the efficiency of every building cannot be larger than 1:
\begin{align*}
604Y_1 + 89Y_2 + 7.3Y_3 &\le 213X_1 + 52X_2 + 650X_3 \quad \text{(efficiency of Building A)}\\
663Y_1 + 85Y_2 + 6.8Y_3 &\le 265X_1 + 65X_2 + 900X_3 \quad \text{(efficiency of Building B)}\\
375Y_1 + 94Y_2 + 9.1Y_3 &\le 157X_1 + 40X_2 + 200X_3 \quad \text{(efficiency of Building C)}\\
354Y_1 + 88Y_2 + 7.5Y_3 &\le 152X_1 + 48X_2 + 275X_3 \quad \text{(efficiency of Building D)}
\end{align*}
and all $X_i$ and $Y_r \ge 0$.

Table 10. Data for demonstration of DEA application with Excel Solver.

                                     Building A   Building B   Building C   Building D
Inputs
  Cleaning cost (in thousand)            213          265          157          152
  Number of employees                     52           65           40           48
  Energy cost (in thousand)              650          900          200          275
Outputs
  Gross revenue (in thousand)            604          663          375          354
  Number of customers                     89           85           94           88
  Customers' satisfaction rating         7.3          6.8          9.1          7.5
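The same example can also be set up outside Excel. The sketch below, which is not part of the original study, solves the equivalent linearized (Charnes–Cooper) form of the ratio model for all four buildings with Python's scipy.optimize.linprog; the array layout and the choice of solver are illustrative assumptions.

import numpy as np
from scipy.optimize import linprog

# Data from Table 10 (one row per building: A, B, C, D).
inputs  = np.array([[213, 52, 650],    # cleaning cost, employees, energy cost
                    [265, 65, 900],
                    [157, 40, 200],
                    [152, 48, 275]], dtype=float)
outputs = np.array([[604, 89, 7.3],    # gross revenue, customers, satisfaction
                    [663, 85, 6.8],
                    [375, 94, 9.1],
                    [354, 88, 7.5]], dtype=float)

n, m = inputs.shape      # 4 buildings, 3 inputs
s = outputs.shape[1]     # 3 outputs

for p, name in enumerate("ABCD"):
    # Linearized CCR model: maximise the weighted outputs of building p
    # with its weighted inputs fixed at 1 and no building's ratio exceeding 1.
    # Decision vector: [Y1, Y2, Y3, X1, X2, X3] (output weights, then input weights).
    c = np.concatenate([-outputs[p], np.zeros(m)])                  # minimise the negative
    A_ub = np.hstack([outputs, -inputs])                            # u'y_j - v'x_j <= 0 for all j
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), inputs[p]]).reshape(1, -1)  # v'x_p = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m), method="highs")
    print(f"Building {name}: relative efficiency = {-res.fun:.3f}")

The objective value returned for each building equals the best efficiency ratio it can achieve with its own most favorable weights, which is exactly what the ratio formulation above seeks.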


Few research studies have been carried out on the application of DEA to facilities
management performance. The current limitations of such applications are mainly
due to the mixing of two types of variables, those that are scale dependent (e.g., area
or number of staff) and those that are scale independent (e.g., satisfaction indices),
which may cause inaccurate estimates of efficiency. In addition, most rules of thumb
for applying DEA require that the number of units be considerably larger than the
number of variables.
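As a point of reference only (this guideline does not appear in the paper), one widely quoted rule of thumb for DEA suggests that the number of units $n$ should satisfy

\[
n \;\ge\; \max\{\, m \times s,\ 3(m + s) \,\},
\]

where $m$ is the number of inputs and $s$ the number of outputs. With the four inputs and nine outputs of the case study, this would call for at least 39 units, many more than the nine buildings assessed, which underlines the limitation noted above.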

5. Conclusions
This paper has provided a general view of the application of DEA to FM benchmarking.
We established an Input–Process–Output system and measured the overall
efficiency levels for a portfolio of facilities by assessing their relative performance.
In the case study, by applying DEA, the facility manager could easily identify
the toughest (or the most efficient) competitors. The gaps between inefficient units
and efficient counterparts were evaluated. A convincing proposal with areas and
means of performance improvement could be made and submitted to the executive
level. The results also indicated that DEA could work with soft and hard data of
FM with clear indications for improvements.
Unlike the case of a retail store distribution network, which can be improved by
either closing the less efficient stores or merging them with others in the same
service areas (Lau, 2012), facility managers seldom propose closing or merging
facilities, which often provide utility services. Facility managers can instead apply
DEA-generated improvement targets in formulating FM outsourcing policies,
specifications development, and FM strategy and planning.
Future research should focus on the determinants of the inputs and outputs of
FM efficiency, so that organizations may be able to take proactive measures
to improve performance.

References
Bordass, B and A Leaman (2005). Making feedback and post-occupancy evaluation
routine 3: Case studies of the use of techniques in the feedback portfolio. Build-
ing Research & Information, 33(4), 361–375.
Camp, RC (1995). Business Process Benchmarking: Finding and Implementing Best Prac-
tices. USA: ASQC Quality Press.
Charnes, A, WW Cooper and E Rhodes (1978). Measuring the efficiency of decision making
units. European Journal of Operational Research, 2, 429–444.
Cooper, WW, LM Seiford and K Tone (2000). Data Envelopment Analysis — A Compre-
hensive Text with Models, Applications, References and DEA-solver Software. Dor-
drecht: Kluwer Academic Publisher.
Homburg, C (2001). Using data envelopment analysis to benchmark activities. Interna-
tional Journal of Production Economics, 73, 51–58.
Humphreys, MA (2005). Quantifying occupant comfort: Are combined indices of the indoor
environment practicable? Building Research & Information, 33(4), 317–325.


International Facility Management Association (2013). Available from www.ifma.org/fmpedia/default.aspx.
Kincaid, DG (1994). Measuring performance in facility management. Facilities, 12(6),
17–20.
Lau, KH (2012). Distribution network rationalisation through benchmarking with DEA.
Benchmarking: An International Journal, 19(6), 668–689.
Lingle, JH and WA Schiemann (1996). From balanced scorecard to strategy gauge. Is
measurement worth it? Management Review, 85(3), 56–62.
Neely, AD and M Bourne (2000). Why measurement initiatives fail. Measuring Business
Excellence, 4(4), 3–6.
Pan, T, S Hung and W Lu (2010). DEA performance measurement of the national inno-
vation system in Asia and Europe. Asia-Pacific Journal of Operational Research,
27(3), 369–392.
Pinder, J and I Price (2005). Application of data envelopment analysis to benchmark
building outputs. Facilities, 23(11/12), 473–486.
Schaffnit, C, D Rosen and JC Paradi (1997). Best practice analysis of bank branches:
An application of DEA in a large Canadian bank. European Journal of Operational
Research, 98, 269–289.
Simmons, R (1999). Performance Measurement and Control Systems for Implementing
Strategy. New Jersey: Prentice Hall.
Wong, YLP (2005). Facility management benchmarking: Measuring performance using
multi-attribute decision tools. PhD thesis, The Hong Kong Polytechnic University,
Hong Kong.

Philip Y. L. Wong is a Senior Lecturer in the Department of Business Administration,
Caritas Institute of Higher Education. He earned his PhD in Facility Management
from The Hong Kong Polytechnic University and his MSc degree in Systems
Engineering & Engineering Management from The Chinese University of Hong Kong.
After completing his Bachelor of Engineering degree at University College London,
he worked as a traffic engineer. Philip is a member of the Royal Institution of
Chartered Surveyors (MRICS). His research interests include facility management,
benchmarking and decision-making tools.

Stephen C. H. Leung is an Associate Professor in the Department of Management
Sciences, City University of Hong Kong. He earned his PhD in Operational Research
and Management Science and his MPhil degree in Operational Research from City
University of Hong Kong. After completing his Bachelor of Science at The Baptist
University, he worked as a transport modeller. Stephen is a Fellow of The Chartered
Institute of Logistics and Transport and a Fellow of The Institute of Mathematics
and its Applications. His research interests include logistics and supply chain
management, operations management, RFID applications, transportation and
heuristic algorithms.

John D. Gilleard is the Director of Learning (Asia/Pacific) of CoreNet Global.
He was appointed Associate Dean of the Faculty of Construction and Land Use and
Professor and Head of the Building Services Engineering Department, The Hong Kong
Polytechnic University. John received his PhD in Computer-aided Architectural
Design from The University of Salford, his ME from The University of Liverpool
and his BSc from the University of Manchester. He was granted an IFMA Fellowship
in 2002. His current research interests include benchmarking, usability in the
workplace and the development of facility management in China.
