
The use of a multi-attribute tool for evaluating accessibility in buildings: the AHP approach
S. Wu, A. Lee, J.H.M. Tah, and G. Aouad
School of Construction and Property Management,
University of Salford, Salford, UK

Abstract
Purpose – The purpose of this article is to develop a quantitative building accessibility assessment
model for the construction industry.
Design/methodology/approach – The building accessibility assessment criteria are incorporated
in a hierarchy structure based on the relevant building regulations and British standards. The analytic
hierarchy process (AHP) is employed to determine the priority of the accessibility criteria. A review of
the application of AHP is included in the paper. Finally, a case scenario is used to illustrate the method.
Findings – This paper provides a methodology to prioritize the building accessibility criteria and to
indicate how well a building design meets accessibility requirements quantitatively.
Practical implications – A model is advocated for use by accessibility consultants
and building designers to establish a quantitative assessment for building accessibility. It can also be
used in the development of accessibility assessment software.
Originality/value – This paper presents a novel quantitative building accessibility assessment
model.
Keywords Analytical hierarchy process, Buildings
Paper type Research paper

Introduction
Current UK legislation regarding accessibility is governed by Building Regulations
and the Disability Discrimination Act (1995), which highlights the need for inclusive
environments for all people irrespective of any disability, including people with
physical, sensory and cognitive impairments. The essential requirements of the DDA were
introduced in the UK in September 2004. Since this date all service providers have been
required to make “reasonable adjustment” to the physical features of their premises to
overcome physical barriers to access (Sawyer and Bright, 2004). The new edition of
Part M of the building regulations came into force in May 2004. It was revised to take
account of the guidance given in BS 8300:2001, “Design of buildings and their
approaches to meet the needs of disabled people – code of practice” (ODPM, 2004).
At present, the accessibility of an existing building or new development is assessed
through access audit and access appraisals; this is in addition to the access officer from
the local planning office involved in the development of public facilities during the
planning stage:
.  Access audit. Establishes how well a particular building or environment performs in terms of access and ease of use by a wide range of potential users. The audit report should recommend access improvements, prioritise action and indicate where improvements can be made (Sawyer and Bright, 2004).
.  Access appraisal. An audit of the proposals for a new development, refurbishment or alteration. This involves making a detailed assessment of the proposed level of accessibility in a building using drawings, specifications and consultation with the architect or designer (Sawyer and Bright, 2004).

The process of both access appraisal and auditing involves a thorough site inspection, an assessment of the management and use of the building, and the preparation of a report that identifies accessible, user-friendly features as well as access problems. However, the assessment process is quite complex and often requires a very long checklist, with over a hundred design criteria, to be completed. Furthermore, accessibility design criteria are not limited to quantifiable requirements, such as the physical dimensions of a door or space; they also include a large number of subjective requirements, such as “edges of glazed doors should be clearly visible” and “door furniture should be distinguishable”. The assessment of such subjective requirements depends heavily on the experience of the assessor. As part of the access appraisal and audit, a prioritised recommendation is included. It is extremely difficult to set appropriate priorities under different scenarios when a large number of criteria are involved. It is also difficult to provide a clear indicator for the client or designer of the overall accessibility level of the building/environment at the end of the assessment process. This research aims to develop a quantitative indicator for the overall accessibility level of a building/environment. A number of multi-attribute methodologies have been investigated, such as goal programming, linear programming and the analytic hierarchy process (AHP). AHP was chosen because it supports the highly structured access design criteria and is capable of producing an appropriate priority for each criterion.

A review of analytic hierarchy process (AHP)


The AHP method was developed by Thomas Saaty more than two decades ago for
elucidating and resolving unstructured problems in the economic, social and
management sciences. As Saaty (1980) stated:
To be realistic our models must include and measure all important tangible and intangible,
quantitatively measurable, and qualitative factors.
Methodologically, it combines the basics of qualitative and quantitative research to
solve decision problems by justifying the decision-making process. It is described by
Partovi (1994) as:
A decision-aiding tool for dealing with complex, unstructured and multi-attribute decision.
Muralidhar et al. (1990) support the belief that AHP is particularly suited to
multi-criteria decision making. Apart from this, the high precision of relative priorities in
the calculations enhances the effectiveness of this technique.
AHP has been applied in industry to solve commercial decision
problems and address empirical research issues (Easley et al., 2000). Decisions today are
more complicated and difficult to make due to the greater number of impacts on them (e.g.
larger set of factors or criteria) and severe consequences resulting from poor decisions (De
Boer et al., 2001). The AHP method is expected to outperform other basic linear weighting
methods in dealing with imprecision in complex problems (De Boer et al., 1998).
AHP has been extensively applied in different areas, including marketing, finance, education, public policy, economics, medicine and sports, amongst others (Saaty et al., 1987; Saaty and Mu, 1997; Saaty and Nezhad, 1981; Saaty and Rush, 1987), because of its ease of use. New applications have been found in the fields of information and management (Byun, 2001; Forgionne and Kohli, 2001; Lai et al., 1999; Yang and Huang, 2000). AHP has been applied more recently in construction research (Li et al., 2000). Apart from its value in solving a decision problem (e.g. selection of a contractor), its usage has been extended to prioritise elements in a survey environment. For example, Tan and Lu (1993) applied AHP when prioritising the criteria and factors affecting the quality of construction engineering design projects. Its popularity comes from three main advantages, as described by Saaty (1980):
(1) It helps to decompose a complex and unstructured real world multiple criteria
decision making problem (or research problem) into a set of elements in terms of
variables organised in a multilevel hierarchical form that also determines the
overall priorities by quantifying information providers’ subjective judgements.
(2) It employs a pairwise comparison process by comparing two objects at a time to
formulate a judgement as to their relative weights. As this method exhaustively
compares one element with others, it can generate more useful information
available to validate the results.
(3) It measures the consistency level of each judgement matrix. Some researchers
refer to the consistency measure as the consistency test (Cheng and Li, 2001;
Leung and Cao, 2001). Specifically, with adequate measurements, the AHP is more
accurate (with fewer experimental errors) in achieving a higher degree of
consistency.

Despite the above advantages, the effectiveness of AHP as a decision tool has been the subject of much debate. For example, because it uses a scale value from 1 to 9 for pairwise comparisons, decision makers or respondents may need time to compare all of the paired elements, especially when the problem under study consists of many levels, where the elements of each level are further divided into many sub-elements. It is also argued that an indication of the consistency level is not necessary when the information providers are clear about what they want to rank.
A study by Easley et al. (2000) suggested that other pairwise comparison methods (except the ordinal paired comparison method) may be as accurate as AHP for making group decisions, but were either less difficult to use or allowed for no consistency measure of the response matrices. They suggested the paired hierarchical (ranking) method, which relaxes the imposed 1-9 scale constraint, and the folded normal AHP (FNAHP) (MacKay et al., 1996), which relaxes the reciprocal comparison required for computing the consistency ratio. Nevertheless, both methods rest on the basic assumption that consistency in responses must still be maintained, but it is pre-assured rather than post-measured (Easley et al., 2000). Moreover, owing to the prevalence of using a simple scale to rate a group of multiple criteria in questionnaire-based research, the answers may be misconstrued without paired comparison, which increases the value of the consistency measure. Further, a study by Cheng and Li (2001) concluded that the consistency measure is a critical component of AHP, and that it makes AHP more reliable and useful as a decision-making tool.
The accessibility of a building is determined by a large number of criteria, such as the size of the door, the size of the corridor, the attitude of staff, etc. Current assessment methods have to check every single criterion, but they do not highlight the relative importance of each criterion, although an experienced access consultant is aware of the priorities of the requirements.
Furthermore, the final assessment result is usually presented as a report, and does not provide a direct and quantifiable indication of the overall accessibility level of the building. It is difficult for the owner of an existing building to prioritise their resources to make reasonable adjustments, and it is also difficult for designers to understand the accessibility level of their new design. Therefore, a solution that prioritises the accessibility criteria and is able to provide a quantitative indicator is needed. In this paper, AHP is employed to tackle this problem.

AHP methodology
AHP is a hierarchical representation of a system. A hierarchy is an abstraction of the
structure of the system as a result of the decomposition of the complexity of the system
into different levels, which represent functional interactions of its elements and their
impacts on the entire system (Saaty, 1980). In order to develop the accessibility
indicator with the AHP, the following five steps are taken:
(1) Define the building accessibility indicator.
(2) Construct a hierarchy of criteria affecting the accessibility indicator.
(3) Employ a pair-wise comparison method for the criteria.
(4) Compute the consistency level to drop out the inconsistent responses.
(5) Compute relative weights of each criterion.

Step 1: define the accessibility indicator


The building accessibility indicator is a methodology for establishing the relative importance of the accessibility design criteria and providing a quantitative measure of the overall accessibility level of a building.

Step 2: construct the hierarchy of the accessibility design criteria


The British Standard (BS 8300:2001), the revised Part M of the building regulations and the provisions of the DDA (1995) have been reviewed to establish the detailed design criteria.
BS 8300:2001, “Design of buildings and their approaches to meet the needs of disabled people – code of practice”, gives detailed guidance on the design of domestic and non-domestic buildings. It draws on research commissioned by the Department of the Environment, Transport and the Regions in 1997 and 2001 and is the most comprehensive standard to date covering the needs of people with disabilities.
The new edition of Part M of the building regulations refers to new and existing
buildings being accessible and usable by people and takes account of the guidance
given in BS 8300:2001.
Based on these requirements, the accessibility design criteria are structured into the
following four levels to form an accessibility assessment decision hierarchy (Figure 1).
Figure 1. Accessibility criteria hierarchy for physical features

Level I: this is the objective or the overall goal of the accessibility assessment, which is to ensure the accessibility of a building and provide an indication of the overall accessibility level of a building. The name given to level I of the hierarchy is the “overall accessibility level”.
Level II: the second level represents the scope of the accessibility assessment. The access audit or appraisal usually covers two main areas, namely:
(1) Physical features.
(2) Access management issues.

Physical features include (Sawyer and Bright, 2004):
.  any feature arising from the design or construction of a building on the premises occupied by the service provider;
.  any feature on those premises or any approach to, exit from or access to such a building; and
.  any fixtures, fittings, furnishings, furniture, equipment or materials in or on such premises.

Access management issues deal with access to information, staff attitudes towards disability and accessibility issues, management procedures, and the maintenance and use of the building.
Level III: this level breaks level II down into more detailed elements, which relate to individual areas and specific functions of a facility. Table I shows the detailed items identified as level III items.
Level IV: this level details the key building components affecting the accessibility assessment; the fourth level of the hierarchy contains the detailed access criteria for each building component. In total, 42 components are included in the assessment criteria hierarchy (see Figure 1). It would be difficult to make pair-wise comparisons with many criteria at multiple levels (Tam and Tummala, 2001). Therefore, further detailed checklists are not included in the criteria hierarchy.

Table I. Level II and III accessibility criteria hierarchy

(A) Physical features
  (A.1) External environment – including approach, parking, transport links, routes, external ramps and steps
  (A.2) Entrance – including visibility, entry control, doors, thresholds and lobbies
  (A.3) Horizontal circulation – including ease of navigation, corridors, doors, directional information, internal surfaces
  (A.4) Vertical circulation – including internal steps and stairs, ramps, escalators and lifts
  (A.5) Facilities – including WCs, specific facilities
  (A.6) Communication and way finding – including communication systems, such as phones, lift voice announcers, signs, maps
  (A.7) Emergency egress – escape routes, refuges, alarms, fire-protected lifts, emergency lighting
(B) Access management
  (B.1) Access to information – publicity information material, formats, hearing enhancements
  (B.2) Attitudes of staff – training in disability and accessibility issues
  (B.3) Management policy and practices – including those relating to emergency egress
  (B.4) Maintenance issues – maintaining facilities and achieving accessibility
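To illustrate how this hierarchy can be represented for computation, the following minimal sketch (Python; the structure simply mirrors Table I, and the nested layout is ours rather than part of the published model) stores levels I to III as a data structure that the later AHP steps can traverse:

# A minimal sketch of the accessibility criteria hierarchy of Table I (levels I-III).
# The level IV building components are omitted here, as they are in the summary above.
accessibility_hierarchy = {
    "Overall accessibility level": {                 # Level I: the goal
        "Physical features": [                       # Level II
            "External environment", "Entrance", "Horizontal circulation",
            "Vertical circulation", "Facilities",
            "Communication and way finding", "Emergency egress",
        ],
        "Access management": [                       # Level II
            "Access to information", "Attitudes of staff",
            "Management policy and practices", "Maintenance issues",
        ],
    }
}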
Step 3: employ pair-wise comparison
Once the assessment criteria hierarchy has been constructed, the next step is to determine the priorities of the elements at each level (“element” here means every member of the hierarchy). However, before starting the AHP process, it is important to consider how the building is used, managed and operated. A building often contains many areas where different functions may affect the access requirements. The entire building, or parts of the building, can be classified according to its use. It is also helpful at this stage to understand how it can be improved to suit the needs of the users. There are four use classifications, described in Table II. Furthermore, an analysis of the building using this classification can help to give a more accurate picture of the specific access requirements of each individual area, so that appropriate judgements can be made in the following AHP comparison process.
To begin the AHP process, a set of comparison matrices of all elements in a level of the accessibility criteria hierarchy with respect to an element of the immediately higher level is constructed, so as to prioritise and convert individual comparative judgements into ratio scale measurements. The preferences are quantified using a nine-point scale; the meaning of each scale measurement is explained in Table III. The pairwise comparisons are given in terms of how much more important element A is than element B.

Table II. Use classification (Source: Sawyer and Bright, 2004)

1. Complete freedom of movement – free to enter, wander around and leave without needing to make any contact with assistance points.
   Environment type: shops, shopping centres, department stores, supermarkets, non-fee-paying museums and exhibitions.
   Information is provided remotely, usually by signs; an information point is helpful. Ensure appropriate levels of accessibility for all users.
2. Controlled entry/freedom of movement – there is a point of control, payment desk or security point; after passing that point, the user is allowed freedom of movement.
   Environment type: sports halls, fee-paying museums, art galleries, exhibition halls, libraries, some educational buildings.
   Provide remote information and good lighting in the reception area. Good management practices to provide assistance if required.
3. Free entry/controlled movement – there is a central entrance, and movement inside the building is restricted.
   Environment type: town halls, civic centres, major post offices, airports.
4. Controlled entry/controlled movement – security is a major issue and the type of visitor is restricted. The control facility needs to be fully accessible; the needs of the user can be assessed at the initial point of contact.
   Environment type: offices with car park control, research laboratories, some banks.
Table III. Scale of measurement in pairwise comparison (Source: Saaty, 1980)

Intensity of importance   Description
1                         Two compared elements are equally important
3                         Moderate importance of one element when compared with another element
5                         Strong importance of one element when compared with another element
7                         Very strong importance of one element when compared with another element
9                         Absolute importance of one element when compared with another element
2, 4, 6, 8                Intermediate values of the above non-zero numbers
Reciprocals               If an element i has been assigned one of the above numbers when compared with another element j, then j has the reciprocal value when compared with i

As the AHP approach is a subjective methodology, information and the priority weights of elements may be obtained from the decision-maker of the company using direct questioning or a questionnaire method. In the case example, the data were acquired through discussion with an accessibility expert, disabled users and the research team. Tables IV to VII show the results of the pairwise comparison matrices for both “physical features” and “management issues”.
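As an illustration of how such a comparison matrix can be assembled, the sketch below (Python with NumPy; the function name and data layout are ours, not part of the published model) fills in the reciprocal lower triangle from the upper-triangle judgements, following the reciprocity rule of Table III, and reproduces the management-issues matrix of Table VI:

import numpy as np

def pairwise_matrix(judgements, n):
    """Build an n x n AHP comparison matrix from upper-triangle judgements.

    judgements: dict {(i, j): value} with i < j, on Saaty's nine-point scale.
    The diagonal is 1 and the lower triangle holds the reciprocal values.
    """
    A = np.ones((n, n))
    for (i, j), value in judgements.items():
        A[i, j] = value
        A[j, i] = 1.0 / value
    return A

# Upper-triangle judgements for the four access management criteria (Table VI).
judgements = {(0, 1): 1/5, (0, 2): 1/3, (0, 3): 1/7,
              (1, 2): 3.0, (1, 3): 1/3,
              (2, 3): 1/5}
B = pairwise_matrix(judgements, 4)
print(B.round(4))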

Step 4: computing the consistency level


The pairwise comparisons generate a matrix of relative rankings for each level of the hierarchy. Table V shows the relative ranking of each element under physical features, and Table VII shows the relative ranking of each element under management issues.
The number of matrices depends upon the number of elements at each level. The order of the matrix at each level depends on the number of elements at the lower level that it links to. After all matrices are developed and all pairwise comparisons are obtained, the eigenvectors or relative weights (the degree of relative importance amongst the elements), the global weights, and the maximum eigenvalue (λmax) for each matrix are calculated in a spreadsheet.
The λmax value is an important validating parameter in AHP.

Table IV. Paired comparison matrix for physical features

      A1      A2      A3      A4      A5      A6      A7
A1    1.0000  5.0000  3.0000  3.0000  3.0000  2.0000  4.0000
A2    0.2000  1.0000  0.3333  0.3333  0.5000  0.3333  0.5000
A3    0.3333  3.0000  1.0000  1.0000  3.0000  2.0000  2.0000
A4    0.3333  3.0000  1.0000  1.0000  2.0000  2.0000  2.0000
A5    0.3333  2.0000  0.3333  0.5000  1.0000  1.0000  0.5000
A6    0.5000  3.0000  0.5000  0.5000  1.0000  1.0000  0.3333
A7    0.2500  2.0000  0.5000  0.5000  2.0000  3.0000  1.0000
Table V. Normalised matrix for physical features (Note: CR = 0.046)

                 A1      A2      A3      A4      A5      A6      A7
A1               0.0476  0.2381  0.1429  0.1429  0.1429  0.0952  0.1905
A2               0.0625  0.3125  0.1042  0.1042  0.1563  0.1042  0.1563
A3               0.0270  0.2432  0.0811  0.0811  0.2432  0.1622  0.1622
A4               0.0294  0.2647  0.0882  0.0882  0.1765  0.1765  0.1765
A5               0.0588  0.3529  0.0588  0.0882  0.1765  0.1765  0.0882
A6               0.0732  0.4390  0.0732  0.0732  0.1463  0.1463  0.0488
A7               0.0270  0.2162  0.0541  0.0541  0.2162  0.3243  0.1081
Average weight   0.0465  0.2952  0.0861  0.0903  0.1797  0.1693  0.1329
Position         7       1       6       5       2       3       4

Table VI. Paired comparison matrix for management issues

      B1      B2      B3      B4
B1    1.0000  0.2000  0.3333  0.1429
B2    5.0000  1.0000  3.0000  0.3333
B3    3.0000  0.3333  1.0000  0.2000
B4    7.0000  3.0000  5.0000  1.0000

Table VII. Normalised matrix for management issues (Note: CR = 0.044)

                 B1      B2      B3      B4
B1               0.5966  0.1193  0.1989  0.0852
B2               0.5357  0.1071  0.3214  0.0357
B3               0.6618  0.0735  0.2206  0.0441
B4               0.4375  0.1875  0.3125  0.0625
Average weight   0.5579  0.1219  0.2633  0.0569
Position         1       3       2       4

It is used as a reference index to screen information by calculating the consistency ratio (CR) of the estimated vector (Saaty, 2000), in order to validate whether the pairwise comparison matrix provides a completely consistent evaluation. The consistency ratio is calculated as per the following steps:
(1) Calculate the eigenvector or the relative weights and λmax for each matrix of order n.
(2) Compute the consistency index for each matrix of order n by the formula:

CI = (λmax − n)/(n − 1)   (1)

(3) The consistency ratio is then calculated using the formula:

CR = CI/RI   (2)

where RI is a known random consistency index obtained from a large number of simulation runs and varies depending upon the order of the matrix. Table VIII shows the value of the random consistency index (RI) for matrices of order 1 to 10, obtained by approximating random indices using a sample size of 500 (Saaty, 2000).
It was difficult to maintain consistency in judgement during these exercises as the size of the paired comparison matrix increased. The consistency ratio of the matrix for physical features is 0.046 and the size of the matrix is 7. This comparison matrix had to be completed twice in order to achieve a consistent judgement. Therefore, it is essential to group related criteria into several groups, with each group containing no more than ten sub-criteria.
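A minimal sketch of this consistency check is given below (Python with NumPy; the function name is ours). It uses the standard AHP approximation of normalising each column and averaging across rows to estimate the weights, estimates λmax from A·w/w, and applies equations (1) and (2) with the RI values of Table VIII. Applied to the management-issues matrix of Table VI it returns a consistency ratio of roughly 0.044, in line with Table VII:

import numpy as np

# Random consistency index (RI) for matrices of order n = 1 to 10 (Table VIII).
RI = [0.0, 0.0, 0.0, 0.52, 0.89, 1.11, 1.25, 1.35, 1.40, 1.45, 1.49]  # indexed by n

def ahp_weights_and_cr(A):
    """Approximate relative weights and consistency ratio for a comparison matrix A."""
    n = A.shape[0]
    w = (A / A.sum(axis=0)).mean(axis=1)    # normalise each column, average each row
    lambda_max = float(np.mean(A @ w / w))  # estimate of the principal eigenvalue
    ci = (lambda_max - n) / (n - 1)         # equation (1)
    cr = ci / RI[n]                         # equation (2); undefined for n < 3
    return w, cr

# The management issues matrix of Table VI (B1-B4).
B = np.array([[1.0, 0.2, 1/3, 1/7],
              [5.0, 1.0, 3.0, 1/3],
              [3.0, 1/3, 1.0, 0.2],
              [7.0, 3.0, 5.0, 1.0]])
weights, cr = ahp_weights_and_cr(B)
print(weights.round(4), round(cr, 3))       # CR of approximately 0.04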

Step 5: computing relative weights of each criterion


Saaty (1996) points out that “if there are more than two levels, the various priority vectors can be combined into priority matrices, which yield one final priority vector for the bottom level”. Local priority is the priority of a criterion relative to its parent. Global priority, also called final priority, is the priority relative to the goal. Table IX shows the local and global priority of each criterion.
As mentioned earlier, detailed checklists for each of the building components are not included in the criteria hierarchy, to reduce the number of pairwise judgements. In order to provide a quantitative measurement at this bottom level, a five-point rating (Liberatore et al., 1992; Liberatore, 1987) is introduced into the model: instead of pairwise comparison, a five-rating score of outstanding (O = 1), good (G = 2), average (A = 3), fair (F = 4) and poor (P = 5) is used to rate each of the building components. This also helps to decrease the unexpected bias that might occur in the decision-making process when there are a large number of sub-factors to be compared.
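To make the arithmetic behind Table IX explicit, the short sketch below (Python; the selected criteria and the ratings are illustrative only, not the full assessment) shows how a global weight is obtained as the product of the local weights along the path to the goal, and how the weighted ratings are summed into the overall indicator:

# Global weight of a bottom-level criterion = product of the local weights on its
# path to the goal, e.g. (A.2.2) Doors in Table IX.
local_weights = [0.6667, 0.2952, 0.2548]    # physical features -> entrance -> doors
global_weight_doors = 1.0
for w in local_weights:
    global_weight_doors *= w
print(round(global_weight_doors, 4))        # approximately 0.0501, as in Table IX

# Overall indicator = sum of (global weight x rating) over the assessed criteria.
# The ratings below are hypothetical assessor judgements on the five-point scale.
assessment = {
    "(A.2.2) Doors": (0.0501, 4),
    "(A.5.1) WCs": (0.0496, 2),
    "(B.1) Access to information": (0.1859, 2),
}
indicator = sum(weight * rating for weight, rating in assessment.values())
print(round(indicator, 4))                  # partial illustration, not the full 2.96 of Table IX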
In this paper, a demonstration exercise is conducted based on a building with the “complete freedom of movement” use classification. Table IX shows the global weights of all level IV criteria. The entrance doors (0.0501), WCs (0.0409) and reception area (0.0438) are the three most important physical-feature criteria for our scenario building. These results were acknowledged by the accessibility experts in the research team. The criteria “access to information” and “management policy and practices” under access management also have high global weights, 0.1859 and 0.0878 respectively; this is because these criteria are not broken down further into level IV criteria.

Table VIII. Average random index (RI) based on matrix size (Source: Saaty, 2000)

Size of matrix (n)   Random consistency index (RI)
1                    0
2                    0
3                    0.52
4                    0.89
5                    1.11
6                    1.25
7                    1.35
8                    1.40
9                    1.45
10                   1.49
Table IX. Local weight and global weight

Accessibility criteria                         Local weight  Global weight  Score
(A) Physical features                          0.6667        –
(A.1) External environment                     0.0465        –
(A.1.1) Car parking                            0.1931        0.0060         0.0239
(A.1.2) Setting down                           0.0970        0.0030         0.0150
(A.1.3) External routes                        0.2381        0.0074         0.0221
(A.1.4) External steps and stairs              0.4717        0.0146         0.0292
(A.2) Entrance                                 0.2952        –
(A.2.1) Design                                 0.0344        0.0068         0.0203
(A.2.2) Doors                                  0.2548        0.0501         0.2006
(A.2.3) Thresholds                             0.0483        0.0095         0.0475
(A.2.4) Lobbies                                0.1585        0.0312         0.0936
(A.2.5) Entry system                           0.0879        0.0173         0.1038
(A.2.6) Exits                                  0.1933        0.0380         0.1141
(A.2.7) Reception area                         0.2227        0.0438         0.2191
(A.3) Horizontal circulation                   0.0861        –
(A.3.1) Ease of navigation                     0.0691        0.0040         0.0119
(A.3.2) Corridors and passageways              0.3489        0.0200         0.1001
(A.3.3) Surfaces                               0.0874        0.0050         0.0151
(A.3.4) Handrails                              0.1599        0.0092         0.0459
(A.3.5) Internal doors                         0.3346        0.0192         0.0576
(A.4) Vertical circulation                     0.0903        –
(A.4.1) Internal steps and stairs              0.0704        0.0042         0.0212
(A.4.2) Internal ramps                         0.1436        0.0086         0.0259
(A.4.3) Escalators                             0.1304        0.0079         0.0157
(A.4.4) Platform lift and stair lift           0.1997        0.0120         0.0120
(A.4.5) Passenger lifts                        0.4559        0.0274         0.0000
(A.5) Facilities                               0.1797        –
(A.5.1) WCs                                    0.4139        0.0496         0.0992
(A.5.2) Changing facilities                    0.2117        0.0254         0.0761
(A.5.3) Rooms                                  0.0494        0.0059         0.0118
(A.5.4) Storage facilities                     0.0301        0.0036         0.0144
(A.5.5) Refreshment areas                      0.0716        0.0086         0.0429
(A.5.6) Counter and services desk              0.1155        0.0138         0.0692
(A.5.7) Assembly areas                         0.1077        0.0129         0.0258
(A.6) Communication and way finding            0.1693        –
(A.6.1) Information                            0.3523        0.0398         0.1591
(A.6.2) Wayfinding                             0.3062        0.0346         0.0691
(A.6.3) Lighting                               0.1621        0.0183         0.0915
(A.6.4) Colour and luminance contrast          0.1253        0.0141         0.0283
(A.6.5) Acoustic environment                   0.0541        0.0061         0.0183
(A.7) Emergency egress                         0.1329        –
(A.7.1) Evacuation plan                        0.0329        0.0029         0.0117
(A.7.2) Horizontal and vertical evacuation     0.1297        0.0115         0.0575
(A.7.3) Refuges                                0.0595        0.0053         0.0105
(A.7.4) Assistance                             0.0460        0.0041         0.0204
(A.7.5) Evacuation lifts                       0.1213        0.0107         0.0537
(A.7.6) Alarm system                           0.1867        0.0165         0.0331
(A.7.7) Emergency lighting                     0.0909        0.0081         0.0161
(A.7.8) Final exit                             0.3329        0.0295         0.0295
(B) Access management                          0.3333        –
(B.1) Access to information                    0.5579        0.1859         0.3719
(B.2) Attitudes of staff                       0.1219        0.0406         0.1219
(B.3) Management policy and practices          0.2633        0.0878         0.2633
(B.4) Maintenance issues                       0.0569        0.0190         0.0759

Note: The final score is 2.96
The exercise also demonstrated how to use the five-rating method to calculate the overall accessibility indicator for the whole building. The final indicator is 2.96, which means that this building meets about 60 per cent of the accessibility requirements.

Conclusion
New legislation and building regulations have made building accessibility even more important than before. A quantitative measure of building accessibility is needed to help clients and designers understand accessibility issues and manage their services to comply with the new requirements. This research proposed the “accessibility indicator”, a methodology using the AHP method to help determine the relative importance of all the accessibility criteria and to produce a quantitative measure as a result. It is not an absolute indicator of building accessibility, but aims to represent how well a building meets the specified accessibility requirements based on the legislation, building regulations and user requirements.
A total of 54 criteria were collected from the building regulations, British Standards and the accessibility literature. They have been structured into a hierarchy on which the AHP method can be performed. However, the set of criteria was still large, and it was difficult to maintain consistent judgements in the paired comparison exercise. The next step of the research is to establish a large set of data to provide basic indicators for different types of buildings under different scenarios, which users can use as a base to customise their criteria to suit their specific requirements. A computer tool also needs to be developed to manage the data and assist the comparison exercise.

References
Byun, D. (2001), “The AHP approach for selecting an automobile purchase model”, Information
and Management, Vol. 38 No. 5, pp. 289-97.
Cheng, E.W.L. and Li, H. (2001), “Information priority setting for better resource allocation using
analytic hierarchy process (AHP)”, Information Management & Computer Security, Vol. 9
No. 2, pp. 61-70.
De Boer, L., Labro, E. and Morlacchi, P. (2001), “A review of methods supporting supplier
selection”, European Journal of Purchasing and Supply Management, Vol. 7 No. 2, pp. 75-89.
De Boer, L., Van der Wegen, L. and Telgen, J. (1998), “Outranking methods in support of supplier
selection”, European Journal of Purchasing and Supply Management, Vol. 4 Nos 2/3,
pp. 109-18.
Disability Discrimination Act (1995), available at: www.hmso.gov.uk/acts/acts1995/ (accessed
January 2004).
Easley, R.F., Valacich, J.S. and Venkataramanan, M.A. (2000), “Capturing group preferences in a
multicriteria decision”, European Journal of Operational Research, Vol. 125, pp. 73-83.
Forgionne, G.A. and Kohli, R. (2001), “A multiple criteria assessment of decision technology
system journal quality”, Information and Management, Vol. 38 No. 7, pp. 421-35.
Lai, V.S., Trueblood, R.P. and Wong, B.K. (1999), “Software selection: a case study of the
application of the analytical hierarchical process to the selection of a multimedia authoring
system”, Information and Management, Vol. 36 No. 4, pp. 221-32.
Leung, L.C. and Cao, D. (2001), “On the efficacy of modeling multi-attribute decision problems
using AHP and Sinarchy”, European Journal of Operational Research, Vol. 132, pp. 39-49.
Li, H., Cheng, E.W.L. and Love, P.E.D. (2000), “Partnering research in construction”, Engineering, Construction and Architectural Management, Vol. 7 No. 1, pp. 76-92.
Liberatore, M.J. (1987), “An extension of the analytical hierarchy process for industrial R&D
project selection and resource allocation”, IEEE Transactions on Engineering
Management, Vol. 34 No. 1, pp. 12-18.
Liberatore, M.J., Nydick, R.L. and Sanchez, P.M. (1992), “The evaluation of research papers (or how to get an academic committee to agree on something)”, Interfaces, Vol. 22 No. 2, pp. 92-100.
MacKay, D.B., Bowen, W.M. and Zinnes, J.L. (1996), “A Thurstonian view of the analytic
hierarchy process”, European Journal of Operational Research, Vol. 89, pp. 427-44.
Muralidhar, K., Santhnam, R. and Wilson, R.L. (1990), “Using the analytic hierarchy process for
information system project selection”, Information and Management, February, pp. 87-95.
ODPM (2004), “Access to and the use of buildings – Part M of The Building Regulation 2000”,
The Office of the Deputy Prime Minister, London.
Partovi, F.Y. (1994), “Determining what to benchmark: an analytic hierarchy process approach”,
International Journal of Operations & Production Management, Vol. 14 No. 6, p. 25.
Saaty, T.L. (1980), The Analytic Hierarchy Process, McGraw-Hill, New York, NY.
Saaty, T.L. (1996), Multicriteria Decision Making, RWS Publications, Pittsburgh, PA.
Saaty, T.L. (2000), Fundamentals of Decision Making and Priority Theory, 2nd ed., RWS
Publications, Pittsburgh, PA.
Saaty, T.L. and Mu, E. (1997), “The Peruvian hostage crisis of 1996-1997: what should the
government do?”, Socio-economic Planning Sciences, Vol. 31 No. 3, pp. 165-72.
Saaty, T.L. and Nezhad, H.G. (1981), “Oil prices: 1985 and 1990”, Energy Systems and Policy,
Vol. 5, pp. 303-18.
Saaty, T.L. and Rush, M. (1987), “A new macroeconomic forecasting and policy evaluation
method using the analytic hierarchy process”, Mathematical Modelling, Vol. 9, pp. 219-31.
Saaty, T.L., Blair, A., Nachtmann, R. and Olson, J. (1987), “Forecasting foreign exchange rates:
an expert judgement approach”, Socio-economic Planning Sciences, Vol. 21 No. 6, pp. 363-9.
Sawyer, A. and Bright, K. (2004), The Access Manual, Blackwell Publishing, London.
Tam, M.C.Y. and Tummala, V.M.R. (2001), “An application of the AHP in vendor selection of a
telecommunications system”, Omega, Vol. 29 No. 2, pp. 171-82.
Tan, R.R. and Lu, Y. (1993), “On the quality of construction engineering design projects: criteria
and impacting factors”, International Journal of Quality & Reliability Management, Vol. 12
No. 5, pp. 18-37.
Yang, C. and Huang, J. (2000), “A decision model for IS outsourcing”, International Journal of
Information Management, Vol. 20 No. 3, pp. 225-39.


Corresponding author
S. Wu can be contacted at: s.wu@salford.ac.uk
