
Abstract

Purpose – The paper proposes a maturity assessment method to measure the state of maintenance
practices in a company.
Design/methodology/approach – The method assumes that a maintenance department is
evaluated in terms of its managerial, organizational and technological capabilities. By adopting
the method, a company can analyse the maturity level it has reached and classify the criticalities
in its maintenance processes; it can also benchmark itself against the best companies of a
reference sample.
Findings – The paper presents the method as a support to identify the levers to improve the maintenance
management system. The method is demonstrated on a company whose maturity is assessed before
benchmarking it against a sample of other manufacturing companies located in northern Italy.
Originality/value – The paper presents a scoring method for maturity assessment and a procedure
to use it in order to identify the criticalities in maintenance processes and to subsequently drive the
improvement of the maintenance management system. The paper should be useful to both
researchers and maintenance professionals interested in using new methods for long-term planning in
maintenance.
Keywords Manufacturing industries, Maintenance, Maintenance maturity, Maintenance processes,
Maintenance best practices, Survey
Paper type Research paper

1. Introduction
Investing to improve a maintenance management system is often a challenge
and the decision to invest is not straightforward, even if it is understood that the
profits and productivity of a company could be enhanced when maintenance
potentials are exploited (Pinjala et al., 2006). Even when proper maintenance
measurement systems (Parida and Kumar, 2006) are introduced in companies,
maintenance decisions are often limited in scope: indeed, the maintenance function is
normally bounded to a tactical and operational role and, correspondingly, the
economic and technical performances result from decision making done mostly
with the short and mid term in mind (Tsang, 1998).

The paper is the result of a research started in 2007 and developed through different
collaborations in Italy, Finland, France and Spain. The authors would like to thank all the
people who participated during these years, both for the conceptualization and initial tests of
the method (in alphabetical order, Adolfo Crespo Marquez, Juan F. Gomez Fernandez, Benoit
Iung, Erkki Jantunen, Eric Levrat) and its further use in empirical research (in alphabetical
order, Filippo De Carlo, Stefano Ierace, Giorgio Mossa, Alberto Regattieri). A special thanks
goes to Marco Garetti, who stimulated the research by providing the initial idea of the whole
concept later developed, and to all the people who participated in the empirical research either
as MSc students or contracted collaborators at Politecnico di Milano (in alphabetical order,
Francesco Di Leone, Danilo Elefante, Klodian Farruku, Matteo Gasparetti, Sergio Pizzolante,
Paolo Rosa).
Nevertheless, decision making on the long term is a winning
choice in order to effectively improve the maintenance management system: indeed,
the maintenance management strategic vision is an important issue to discuss (Murthy
et al., 2002).
Jonsson (1999), when reporting the results of a survey in Sweden on maintenance
strategy, proves that companies with perceived long-term maintenance plans
emphasize manufacturing capabilities more than other companies, thus enabling them
to achieve a competitive advantage in business. Swanson (2001), based on a
similar survey with manufacturing plants in the USA, focuses on proactive
maintenance as a strategic lever positively correlated with the long-term improvement
of equipment availability, product quality and production costs. Strategic maintenance
planning is thus worth studying (Al-Turki, 2011). Besides, maintenance should be
considered as a separate value chain to better understand its importance for the overall
business strategy (Pinjala et al., 2006). Maintenance management models (see for
instance Crespo Márquez et al., 2009; López Campos and Crespo Márquez, 2011) are
a relevant source to this end: they represent processes and practices a company can
conform to in order to implement the improvement strategy.
Plenty of models have been proposed in the past and recent literature. Looking at the
chronology of the proposals, it can be noticed that novel features have been added
from time to time (López Campos and Crespo Márquez, 2009). Indeed, considering a
sample of such models, it can be observed that novelties refer to managerial,
organizational or technological capabilities of the maintenance department.
The organizational capabilities assume a particular importance for the maintenance
management model of Pintelon and Gelders (1992), especially as concerns the
link between the maintenance function and other organizational functions. Besides,
Pintelon and Gelders (1992) underline the relevance of managerial capabilities,
acquired through the use of quantitative techniques, total productive maintenance
(TPM) and reliability centred maintenance (RCM) as practices to support
maintenance management decision making. According to Vanneste and Van
Wassenhove (1995), an integrated and structured approach is the solution for
improving maintenance: their integrated approach first envisages a joint exploitation of
the organizational and managerial capabilities, also considering the support of
technological solutions.
The technological capabilities are then addressed also by successive management
models. In this regard, it is worth mentioning Tsang (2002), since he is one of the
first authors to suggest e-Maintenance as part of the maintenance management
model. The e-Maintenance concept has been further developed since then, eventually
leading to the comprehensive definition of e-Maintenance as: a “maintenance support
which includes the resources, services and management necessary to enable proactive
decision process execution. This support includes e-technologies (i.e. ICT, web-based,
tether-free, wireless, infotronics technologies) but also e-Maintenance activities (that
is operations or processes) such as e-monitoring, e-diagnosis, e-prognosis, etc.”
(Muller et al., 2008). Discussions on the technological capabilities in a maintenance
management system have been inspired by this seminal definition (for instance,
further contributions are provided in Jantunen et al., 2008; Fumagalli et al., 2010).
Notwithstanding the advancement in technologies, the recent literature has
continued to emphasize the importance of the organizational and managerial
capabilities. Söderholm et al. (2007) underline that maintenance has a relevant role in
fulfilling the requirements of external stakeholders, interested in the values brought by
the maintenance activities, e.g. production managers and production operators.
An enlargement of scope – due to the consideration of the external stakeholders – is
then suggested, especially in order to drive the improvement of organizational
capabilities in the maintenance management system; in particular, cross-functional
correlation with production is considered as an important issue to this concern.
Assuming another perspective, Crespo Márquez (2007) proposes a model oriented to
the improvement of the operational reliability besides the life cycle cost of the physical
assets, thus leading to the enlargement of maintenance management towards asset life
cycle management, which is another relevant concept to improve managerial
capabilities. Knowledge management has also been considered as a lever, mainly for
improving the organizational capabilities, even if the computerization – hence, the
technological capabilities – becomes essential in order to effectively operate it; the
reader can see for example the incorporation of the tacit and explicit knowledge in a
computer database as discussed in Waeyenbergh and Pintelon (2002). Crespo
Márquez and Gupta (2006) provide an overall picture by proposing a maintenance
management framework made of three pillars as enablers to support organizational
improvements and managerial actions based on technological solutions. The pillars are
mentioned as: IT (including: computerized maintenance management system (CMMS),
e-Maintenance, condition monitoring technologies), maintenance engineering
techniques (including: RCM, TPM, maintenance policy optimization model, etc.),
organizational techniques (including: relationships, management techniques,
motivation, operators involvement, etc.). Amongst the three pillars, further insights on
the maintenance engineering pillar are provided in a later publication (Crespo
Márquez et al., 2009) that envisions the set of activities and engineering tools needed,
as good practice, for achieving proper managerial capabilities in order to keep and
improve maintenance effectiveness and efficiency. Last but not least, López Campos
et al. (2010) suggest the alignment of the maintenance management model to the
quality management standards, having a special concern for the quality in
maintenance processes: this is a relevant issue both for controlling the organizational
and managerial capabilities of the maintenance department and for driving the
development of technological solutions for its support.
All the above cited models can be considered powerful descriptive tools: according
to López Campos and Crespo Márquez (2009), at least they describe the
components of the maintenance management system (i.e. the declarative models), at
most they represent process-oriented descriptions normally providing also the
information flow between the components (i.e. the process-oriented models).
Conforming to such models – hence, to the good/best practices they provide – is a
pragmatic idea to improve the maintenance management system of a company,
according to “the best of the breed” at the state of the art. Nonetheless, this is not
enough in the experience of the authors. A company, in fact, normally needs to
assess the current quality of the maintenance management system before deciding on an
investment for its improvement. Nevertheless, to this end, just describing the
maintenance processes and checking the difference with existent maintenance
management models from literature is not deemed enough. A further insight is
needed.
This research postulates that it is necessary that the company classifies the
criticalities in its maintenance processes (in order to answer the question “which are
the most critical processes to invest on?”) and makes a benchmark with the best
companies to better drive the investment (to answer the question “which are the
processes to focus on, considering the best companies as a reference target?”). Under
this postulation, the paper proposes a maturity assessment method to identify the
practices to be improved in the different maintenance processes of a company.
Using the concept of maturity can, in general, be considered a way to assess tangibly –
normally by using a score – how a business process is conducted. Amongst the benefits of
maturity models, Volker et al. (2011) mention that they provide a normative description
of the good/best practices which, in the concern of this paper, becomes the rank of
maintenance practices. Based on such a rank, the maturity assessment method is intended,
first, to be used as a discussion tool that enables the interviewees (i.e.
maintenance managers) to reflect on the current status of the maintenance department:
this is essential before deciding an investment. The rank may also help
measure the distance of the company’s model with respect to the good/best practices
proposed by the maintenance management models in literature and/or by the best
companies. As a consequence, a classification of maintenance processes, from the most
to the least critical process, might be obtained: this is also advisable because, in general,
a criticality analysis helps in better deciding the target of an investment.
The method has been developed starting from the results of previous works. Its
concept was first presented in Garetti et al. (2007). Afterwards, empirical tests were
performed by assessing the maintenance practices in the process production sector
(Fumagalli et al., 2008), in the case of maintenance service providers in the
telecommunication networks (Gomez Fernandez et al., 2008) and for original
equipment manufacturers (OEMs) extending their products with maintenance services
(Macchi et al., 2010). In these contexts the maturity assessment method was used as a
tool for case study analysis and received positive feedback concerning its
potential to assess the maturity of maintenance processes. The present paper aims at
further extending the initial empirical evidence presented in Macchi et al. (2011),
where the method was indicated as a tool used in a survey on manufacturing companies,
run in the context of the observatory on “Technologies and Services for Maintenance”
of the School of Management of Politecnico di Milano (TeSeM, 2012[1]). The
maturity assessment method is now thoroughly presented in its background (section
2), its theoretical model (section 3) and its use through the analysis applied to a
company selected from the sample of companies of the TeSeM research (section 4).
The demonstration of its use is quite important because it provides proof that the
method helps companies focus on potential investments for improving their
maintenance management system. The concluding remarks envision the future
exploitations of the method, thanks to its potentials for empirical researches and
benchmarking projects (section 5).

2. A literature background on maturity assessment


Defining a rank of maintenance practices is a known but still open problem.
“Maturity” is a concept that has been introduced to this end, in order to assess
different maintenance management practices in a qualitative or a semi-quantitative
way. This does not mean that “maturity” is necessarily used to define a holistic
indicator to make the rank: a maturity assessment could be developed for analysing
different aims or process areas (PAs) (Garetti et al., 2007; Macchi et al., 2010; CMMI
Product Team, 2010) and several indicators – usually based on a scorecard – may be
adopted to this end. In the research presented in this paper, maintenance PAs are
assessed based on a set of indicators.
The design of such an assessment method has its theoretical roots in the
suggestions from literature, sourced looking at methods not necessarily designed for
analysing maintenance management in industry. In the following, the main existing
methods are first presented (section 2.1); afterwards, one of these methods is selected
as the reference background used to design the method proposed in this paper (section
2.2).

2.1 Overview of methods


Hain and Back (2009) present an extensive literature analysis on maturity models
which, even if applied in a specific scope (analysis of collaboration amongst
companies), enables a general understanding of what has to be done for developing
maturity assessment. From the methodological point of view, De Bruin et al. (2005)
propose an approach, based on a survey and a Delphi study, for testing the
effectiveness of a maturity assessment method. From an applicative point of view,
Kajko-Mattsson (2002) looks at the IT domain – which can be considered an original
field of interest for maturity assessment: she “specifies what a problem
management process should look like” and “structures each process into three
maturity levels (Initial, Defined, Optimal)”. Her proposal is interesting because it is
the result of an empirical study carried out in industries adopting corrective
maintenance on software products, thus it represents a good example of how a
maturity assessment method can be effectively moved to the analysis of maintenance
practices.
The above cited authors analysed (Hain and Back, 2009) or even used (Kajko-
Mattsson, 2002) the capability maturity model integration (CMMI) approach, originally
coming from the software engineering field. CMMI was developed by the Software
Engineering Institute at Carnegie Mellon University and it is one of the most frequently
mentioned models for building a maturity assessment method (for further details, see CMMI
Product Team, 2001; CMMI Product Team, 2010; Minzoni, 2004). In short, the CMMI
approach is derived from the capability maturity model (CMM) presented by Paulk et al.
(1993). CMM distinguishes PAs, where each area consists of a homogeneous set of key
practices for software engineering. This method achieved a great success and CMMI was
then proposed as a revised release, completed at the beginning of the 2000s, in order to
cover a range of activities and key practices related both to software development and
engineering. Since then, CMM(I)-based models can also be found in the areas of project
management, design, reliability, project reviews and supply chain management.
Related to the topic of the present research, then, it is worth mentioning the
maturity model proposed by Hauge and Mercier (2003) to establish a roadmap for
RCM. The roadmap is based on the CMM method: a stage of the RCM
implementation roadmap is presented as a list of key practices, representing the
activities that, when performed, lead to a set of goals in a specific PA; since the goals
are associated with maturity levels (MLs), when a group of goals is reached, the
corresponding ML is attained.
The pressures, actions, capabilities, enablers (PACE) framework is another
maturity model proposed by the Aberdeen Group (2006), and it is based on a scheme
which differs from previous proposals (i.e. it is not CMM(I)-based). Indeed, it is built
on the concept that a pressure on a firm (e.g. to maximize the availability) needs
actions (as preventive maintenance programs), which can be implemented thanks to
capabilities (as the real time monitoring) and enablers (as the CMMS for data
storage). According to this framework, a firm can compare its responses to pressures
with those provided by “best in class” companies, to make a benchmark. The
underlying idea is that the firm may focus on specific objectives in order to improve
its business and, subsequently, it may achieve different maturities as a result of the
actions, capabilities, enablers chosen to achieve responses to pressures (P).
Another method worth mentioning is presented by Cholasuke et al. (2004).
Their proposal is based on a maintenance maturity grid built upon two variables.
The variable named “good maintenance practices employment” considers a list of
factors in order to represent areas to work on to achieve successful
maintenance management. Then, a series of good practices are indicated for each
factor: e.g., in reference to the spare parts management factor, the adoption of Pareto
diagrams to control stock and the recording of spare parts’ costs are the good
practices that can be employed. On the basis of the list of factors, each firm achieves a
percentage of adopted practices with respect to the whole list. The second
variable is the “benefits gained from maintenance”: this is used to measure the
effectiveness of the maintenance management system. Then, a Student’s t-test of
significance is performed in order to identify those factors that actually influence the
effectiveness of the maintenance management system: the factors considered as
significantly related to the maintenance effectiveness are then defined.
Amongst the most recent references, Schuh et al. (2009) propose a method close to
the one presented in this paper: they adopt the idea to customize the use of the CMM
method for the analysis of maintenance in industrial firms. In particular, their maturity
assessment is aimed at supporting the improvement of maintenance organizations
through a diagnosis of maintenance practices. The maturity model, based on CMM, is
specifically adopted to develop an assessment procedure providing the MLs of
different maintenance PAs of the firm under analysis. The assessment of MLs is then
used, jointly with a pairwise comparison of the importance of the PAs, to prioritize the
improvements required for the maintenance organization. A case study is also shown
thus demonstrating the applicability of the method: as a concluding remark after this
initial empirical study, it is envisioned that the maturity assessment may be
periodically applied to assure continuous improvement of maintenance in a firm.
Comparing the above mentioned approaches, the PACE framework is quite promising
but it requires a better understanding of the “pressure” concept to distinguish different
levels of pressure that a firm can be subjected to. Since “pressure” is a different concept,
it can reasonably be treated separately from the “maturity” concept: this research has then
preferred leaving it separate (as in Macchi et al., 2011), keeping it an open issue for future
works. Cholasuke et al. (2004) propose a promising approach as well, because they
envision a relationship between good practices (strictly associated with the “maturity”
concept) and results achieved by maintenance management (as an outcome of the adoption of
good practices): future works should, indeed, aim at proving this relationship through
empirical evidence in industry. Nonetheless, Cholasuke et al. (2004) present a
simplified approach for ranking the maintenance practices, while CMMI (CMMI Product
Team, 2010) has a more structured approach. Furthermore, CMMI can be considered
positively also because it has proven attractive in several fields of application, thus
showing its flexibility for assessing the maturity in different PAs. The proofs in
maintenance related issues (such as in Hauge and Mercier, 2003; Schuh et al., 2009) are
the last but not least motivation for considering CMMI as reference method. Schuh et al.
(2009), in particular, provide an initial example in order to demonstrate that the quality of a
maintenance management system as a whole can be analysed through the adoption of a
CMM(I)-based maturity assessment method. Their approach is aligned with the research
interest of this and future works: their main limitation is that they adopt a firm-centric
view, while potential synergies, naturally present in maturity models
as normative descriptions, are not exploited to compare the maturity assessment, locally
done at a firm, with the relevant benchmarks present in the good/best practices
proposed by the maintenance management models of other companies (e.g. the best
companies in a sector).
2.2 The CMMI as a reference method
CMMI is analysed here in more depth, as it has been selected as the background for the development
of the maturity assessment method proposed in this paper.
The CMMI approach offers two alternative representations: the staged
representation (SR) and the continuous representation (CR). SR considers five MLs,
related to the whole activity being assessed. For each of them, a few PAs are defined
that have to be improved to reach the specific ML. Hence, if a generic firm is at level
3 of maturity, it has to improve a sub-set of predefined PAs, out of the whole, to reach
level 4. CR defines six capability levels (CLs) instead of MLs; a CL represents a
measure assigned to a single PA; in this way, firms are given the maximum flexibility
to choose which processes to aim at. In fact, herein, each PA has a different
CL, and the assessment of the whole of them makes up a so-called capability profile.
In short, CR offers the maximum flexibility for prioritizing process
improvements and aligning them with the business objectives, while in SR a
predefined path must be followed. SR is then more rigid, because it hypothesizes that,
to grow, a firm has “one best way” to travel along: once a predefined series
of goals is reached (i.e. the sub-set of predefined PAs needed to reach a specific ML),
the firm is able to step up to the upper level.
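To make the CR concept tangible, the following minimal sketch (in Python, with hypothetical PA names and levels that are not taken from CMMI or from this paper) represents a capability profile as a mapping from each PA to its own CL:

```python
# A capability profile under the continuous representation (CR): each PA
# carries its own capability level (CL), so improvements can be prioritized
# PA by PA. The PA names and levels below are illustrative assumptions.
capability_profile = {
    "work order management": 3,
    "maintenance planning and budgeting": 2,
    "computerized maintenance management system": 1,
}

# Under CR the weakest PA can be targeted directly, without following the
# predefined improvement path that the staged representation (SR) imposes.
weakest_pa = min(capability_profile, key=capability_profile.get)
print(f"Weakest PA: {weakest_pa} (CL {capability_profile[weakest_pa]})")
```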
This research postulates that the CR mode is preferable for maintenance maturity
assessment because of its road-mapping flexibility: constraining process improvement
in maintenance management to “one best way” is thus avoided by using CR.
Besides, choosing CR has the advantage of giving, in the maturity assessment, more
visibility to the analysis of single PAs. Hence, the assessment of investments
(even small ones) becomes possible, since stepping up the maturity can
be analysed by referring to each single PA, rather than to a set of PAs involved in a
ML. The maintenance maturity assessment method is presented in the remainder
as an application of the CMMI theory in the CR mode. It is worth pointing out that,
differently from the original CMMI terminology, the term “ML” is preferred to
“CL”, even if the concept still stands: thus, for instance, it will be stated
that maturity profiles (instead of, equivalently, capability profiles) are created for the
PAs of interest.

3. The theoretical model of the maturity assessment method


3.1 The method
The proposed approach postulates that a ML can be associated with a given PA,
according to how the processes included in the area are managed and executed.
To this end, a scorecard is proposed (Table I). This is used in order to
assess the current practices run in the processes pertaining to the PA: if the processes
in the PA adopt either good or best practices, a high ML is associated with the PA; the
highest ML is assigned if the practices used in the PA run according to a continuous
process improvement-like mode (ML5 in Table I); the lowest ML is assigned when
the practices are either weakly available or not performed at all (ML1 in Table I).
In order to define the PAs, the literature on maintenance management models –
reviewed in the introduction of this paper – provided inspiration. In particular, the
main sources can be considered: Pintelon and Gelders (1992), Vanneste and Van
Wassenhove (1995) for the PAs related to the organizational and managerial
capabilities of the maintenance department; Tsang (2002) and Muller et al. (2008) for
the PAs impacting on the technological capabilities; Crespo Márquez and Gupta
(2006) for the entire picture, integrating all three capabilities.
Table I. The scorecard defining the scale of maturity levels

ML5 (Optimizing): The process is managed by ensuring continuous improvement; causes of defects and problems in the processes are identified, taking actions in order to prevent problems from occurring in the future.
ML4 (Quantitatively managed): Process performance is measured, and causes of special variations are detected; quantitative analyses are conducted, indeed a good balance is reached between quantitative and qualitative analysis; process management is fulfilled thanks to organizational responsibilities and fully functional technical systems.
ML3 (Defined): The process is planned; semi-quantitative analyses are done periodically to define good practices/management procedures; process management depends on some specific constraints for the organizational responsibility or the technical systems.
ML2 (Managed): The process is partially planned; performance analysis is mostly dependent on individual practitioners’ experience and competences; process management is weak because of deficiencies in the organizational or in the technical systems.
ML1 (Initial): The process is weakly controlled, or not controlled at all.
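As a small illustration (a Python sketch; the labels come from Table I, while the rule for mapping a fractional index to the scale is an assumption of this sketch, not of the paper), the scorecard can be held as a simple lookup table so that any computed index can be reported on the ML1-ML5 scale:

```python
# The ML scale of Table I as a lookup table (labels from Table I).
ML_LABELS = {
    1: "Initial",
    2: "Managed",
    3: "Defined",
    4: "Quantitatively managed",
    5: "Optimizing",
}

def report(index_value: float) -> str:
    # Clamp to the ML1-ML5 range, then round to the nearest level
    # (the rounding rule is an assumption made for illustration).
    level = min(5, max(1, round(index_value)))
    return f"ML{level} ({ML_LABELS[level]})"

print(report(2.3))  # -> ML2 (Managed)
```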

In accordance with such capabilities, the analysis is then segmented thanks to different
indicators: this is a follow-up of the general idea that a proper segmentation should help classify the
criticalities in the maintenance processes. Indeed, the maintenance maturity is not
considered a holistic indicator; it is instead evaluated through three maturity
indexes, according to the original concept of Garetti et al. (2007). Hence, the PAs
are aggregated to measure the managerial, organizational and technological
capabilities of the maintenance department. As the final outcome, the maintenance
maturity is assessed both as a synthetic index – in the remainder, the general maturity
index (GMI), measuring the general ML – and as a set of component indexes, in
accordance with the need to measure the managerial, organizational and
technological capabilities of the maintenance department – respectively, MMI, OMI
and TMI:
. the management maturity index (MMI) is used to assess all the PAs concerned
with the planning and control cycle (i.e. ranging from the work order
management to the maintenance planning and budgeting);
. the organizational maturity index (OMI) refers to all those PAs concerned with
knowledge management and improvement of internal and external relationships
(i.e. within the maintenance “internal” structure and with all the parties
“external” to the maintenance department; both other enterprise functions – as
production – and third parties are considered as “external”); and
. the technological maturity index (TMI) takes into account all the PAs related to the
support provided by the CMMS/ERP, diagnostic and prognostic tools,
maintenance engineering tools, other ICT tools (e.g. to support workers’
mobility); the objective is not just to verify the presence of the tools but to
assess how the tools are effectively adopted in the company’s practice.
Each index – GMI, MMI, OMI or TMI – then results in a corresponding ML
according to the scorecard of Table I.
Figure 1 is a summary of the theoretical model of the maturity assessment method:
a firm is initially analysed by measuring the maturity profiles for the target processes
considered in the maintenance management model; afterwards, the analysis steps up a
level and so the aggregated profiles are calculated for the PAs (ten PAs); a further
aggregation enables the synthesis of the maturity profiles of the maintenance department
as a whole or in terms of its managerial, organizational and technological
capabilities. Accordingly, the criticalities in maintenance processes can be classified
starting from the maturity profiles at the most aggregate level and breaking down, for a
diagnosis of the root causes, until the weakest PAs/processes are identified.

[Figure 1. The theoretical model of the maturity assessment method. The maintenance capability comprises the managerial, organisational and technological capabilities; each capability contains a set of PAs and each PA contains a set of processes, with maturity profiles (as maturity indexes) calculated at every level to support a diagnosis based on maturity index calculation. Managerial capability: maintenance planning and budgeting; information sharing with third parties; work order management. Organisational capability: relationships with other enterprise functions; relationships with third parties (outsourcing); empowerment of maintenance personnel; maintenance engineering structure. Technological capability: monitoring, diagnostics and prognostics system; computerized maintenance management system; reliability and maintenance engineering system.]

3.2 The questionnaire


The maturity assessment method is practically implemented through a questionnaire.
This is the “tool” used in order to collect the answers from the interviewee
representing the firm under analysis (normally, the interviewee is the maintenance
manager, since he/she should have the best knowledge of the practices actually run in
the company). A closed set of answers – already prepared in the questionnaire – is
proposed to the interviewee for each question: in particular, the question contains a set
of practices referred to a target maintenance process and its answers are ranked
according to a normative description ranging from the initial/basic practice to the
good/best practice, in line with the scorecard presented in Table I (thus, from ML1 to
ML5). More than one question is proposed for each process/PA: Table II provides a
summary of the number of questions included in the questionnaire for each capability.
Each question is then assigned the score according to the answer of the interviewee
(thus, a level between ML1 and ML5). Based on the answers to the questionnaire, it is
then possible to calculate all the maturity indexes (as in Figure 1). The calculation
procedure used in this paper is a slight revision of what was postulated in Macchi et al.
(2011). It is worth synthesizing such a procedure in its basic steps (a code sketch follows the list):
(1) The output of the interview is a set of scores related to all the processes and
PAs. In order to have a unique score addressing a given capability, the specific
maturity index is calculated (i.e. TMI, OMI, MMI) as an average of the scores
obtained by the PAs pertaining to the capability.
(2) All the maturity indexes measuring the capabilities of the maintenance
department (i.e. TMI, OMI, MMI) are eventually synthesized in the GMI by
means of an additive model (GMI = (MMI + OMI + TMI)/3).
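The following minimal sketch, in Python, illustrates the two steps; the PA grouping follows Figure 1, while the scores (from ML1 to ML5) are hypothetical questionnaire outcomes introduced only for illustration:

```python
from statistics import mean

# Hypothetical PA scores (ML1-ML5) grouped by capability; the grouping
# follows Figure 1, the score values are illustrative only.
pa_scores = {
    "managerial": {
        "maintenance planning and budgeting": 2,
        "information sharing with third parties": 3,
        "work order management": 2,
    },
    "organisational": {
        "relationships with other enterprise functions": 3,
        "relationships with third parties (outsourcing)": 2,
        "empowerment of maintenance personnel": 3,
        "maintenance engineering structure": 2,
    },
    "technological": {
        "monitoring, diagnostics and prognostics system": 2,
        "computerized maintenance management system": 1,
        "reliability and maintenance engineering system": 1,
    },
}

# Step 1: each capability index (MMI, OMI, TMI) is the average of the
# scores obtained by the PAs pertaining to that capability.
mmi = mean(pa_scores["managerial"].values())
omi = mean(pa_scores["organisational"].values())
tmi = mean(pa_scores["technological"].values())

# Step 2: the additive model synthesizes the three indexes into the GMI.
gmi = (mmi + omi + tmi) / 3
print(f"MMI={mmi:.2f}, OMI={omi:.2f}, TMI={tmi:.2f}, GMI={gmi:.2f}")
```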
The adoption of this procedure was verified considering the capability of the maturity
assessment to fit the perceptions of the firms on the status of their maintenance
practices.
A first verification was applied according to the methodological advice of De
Bruin et al. (2005) and was carried out in the frame of the first annual research of the
Observatory TeSeM through two workshops, attended by 15 companies each, and
direct interviews with a subset of the companies out of the whole sample of the survey,
which ended in February 2012 (TeSeM, 2012). Summarizing the feedback gained both from
the workshops and the interviews, it can be said that about 85 per cent of the
companies reported that their perception was aligned with the results of the maturity
assessment, while the remaining 15 per cent considered the results valid but
wanted more insights on each single PA, in order to validate the credibility of the results
at the more aggregated level of managerial, organizational and technological capabilities
against the results at the more disaggregated level (i.e. within each single PA).

4. The demonstration of the maturity assessment method


A case study is herein adopted to demonstrate the method. The reference sample used
as a benchmark for the case study is initially presented in order to provide an
overview of the features therein represented (section 4.1); the implementation of the
method is then described in its logic (section 4.2) before proceeding with the
demonstration (section 4.3).

4.1 Overview of the reference sample for the benchmark


The companies (a sample of 45) were interviewed through a survey during the annual
research of the TeSeM Observatory. They are mostly of medium (22 per cent)
and large size (65 per cent of the sample) and they come from different sectors
(chemical, pharmaceutical, automotive and mechanical); the interviews
were all held in production sites located in northern Italy.

[Table II. Capabilities, process areas and questions included in the survey questionnaire. For each capability, the questionnaire covers its PAs: managerial capability (maintenance planning and budgeting; information sharing with third parties; work order management); organizational capability (relationships with other enterprise functions; relationships with third parties (outsourcing); empowerment of maintenance personnel; maintenance engineering structure); technological capability (monitoring, diagnostics and prognostics system; computerized maintenance management system; reliability and maintenance engineering system).]
Within the sample, the following features stand for the maintenance organization:
the number of maintenance workers ranges from 1 to 65; the maintenance budget is
normally determined either by the organizational unit to which the maintenance unit
refers or is directly defined by the top management; most of the companies in the
sample (53 per cent) do not have a business/functional unit with maintenance
engineering responsibilities. A large part of the sample (47 per cent) has an
interaction between the maintenance department and the production department for
monitoring and continuously improving the availability of the production system;
31 per cent of the sample limit the interaction between the maintenance and production
departments to the definition of the maintenance plan and corrective maintenance actions,
while 22 per cent of the sample do not have a structured cooperation between the
maintenance and production departments.
Considering the use of technologies in maintenance, thus keeping a perspective on
the technological capability, it is worth pointing out that: the CMMS is not so
frequently used (only by 55 per cent of the sample), being replaced by simpler tools,
such as spreadsheets and local databases; the majority of the companies using the
CMMS (that is, 58 per cent of the companies using the CMMS) does not update
the maintenance plan in the CMMS after its first release subsequent to the plant
installation, or updates it only after the occurrence of important events, such as a plant
revamping; analysts are only partially involved in a detailed analysis of the data collected
through the CMMS; in particular, a limited number of companies (46 per cent of the
companies using the CMMS) dedicate at least one person to data analysis; it is not
surprising then that, considering the overall sample, a large majority of companies
(75 per cent) does not adopt a software tool for failure data analysis.
Keeping now a perspective on the managerial capability of the maintenance
department, the sample reveals that: the RCM methodology is not used to support
management and, in some cases, is not even known (62 per cent); preventive
maintenance is defined based either on the operators’ experience or on the suppliers’
recommendations (43 per cent), while the majority of the companies (57 per cent)
use tools for quantitative analysis in order to define or redefine the best time for
preventive maintenance; last but not least, condition-based maintenance is mainly
operated through inspections as the primary asset evaluation method (59 per cent).
The features, herein reported to give a flavor of the reference sample, are aligned
with the measurement of the maintenance practices through the maturity assessment
method later presented (in section 4.3).

4.2 Implementation of the maturity assessment method


The maturity assessment method is implemented as a procedure of two phases
designed for the same purpose: to make a criticality analysis in order to identify the
weakest points and to subsequently drive the improvement of the maintenance
management system of a company. Nonetheless, the criticality analysis has two
different concerns according to the focus given at each phase.
The first phase is, in fact, company driven and answers the question
“which are the most critical processes to invest on?” having a focus only on the
company and its own processes. The second phase is benchmark driven: it aims at
providing an answer to the question “which are the processes to focus on, considering
the best companies as a reference target?”; henceforth it needs to build up the
reference sample for the benchmark, in order to obtain the answer. At the end of the
two phases, the company should have an idea of the different PAs/processes, and
practices therein, to be improved.
Next Table III summarizes objective, task and tools at each phase. It is worth
observing that the tools used in the procedure are quite simple. The scorecard (Table
I) is the basic and common tool adopted in order to assess in a measurable way the
state of practices in a company (both the company under analysis and any company in
the benchmark). The other tools are used as visual supports to help the company’s
decision maker reflect on the current status of the maintenance department and, in
particular, on the criticalities of the maintenance management system. Indeed, during
the first phase, the bar-charts are considered for visualizing the weakest capabilities/
PAs/processes, in order to finally stimulate a reflection on the needs to improve just
having a company perspective. During the second phase, the weakest
capabilities/PAs/processes result from positioning the company under analysis in the
reference sample chosen for the benchmark: the box plots are a useful tool to this end.
In particular, the decision maker reflects on the actual status of the company through
looking at the box plots, which stimulate further thinking on the improvement of the
management system.
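As an illustration of the second-phase visual support, the sketch below (Python with matplotlib; the benchmark values and the company's index values are hypothetical, and the rhomb/circle convention mirrors the one used in Figures 4-6) positions a company's GMI within a reference sample's box plot:

```python
import matplotlib.pyplot as plt

# Hypothetical GMI values (ML1-ML5) for a benchmark sample of firms,
# plus the company's "as-was" and "as-is" values; all figures illustrative.
benchmark_gmi = [2.1, 2.8, 3.0, 3.2, 2.5, 3.6, 2.9, 3.1, 2.2, 3.4]
company_as_was, company_as_is = 2.0, 3.0

fig, ax = plt.subplots()
ax.boxplot(benchmark_gmi, positions=[1], widths=0.5)

# Position the company in the reference sample: a rhomb for the "as-was"
# situation and a circle for the "as-is" situation, as in Figures 4-6.
ax.plot(1, company_as_was, "D", label="as-was")
ax.plot(1, company_as_is, "o", label="as-is")

ax.set_xticks([1])
ax.set_xticklabels(["GMI"])
ax.set_ylabel("Maturity level (ML1 to ML5)")
ax.set_ylim(1, 5)
ax.legend()
plt.show()
```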
The procedure is demonstrated in a case study in the next section.

4.3 Using the maturity assessment method in a case study


A demonstration of the use of the method is hereafter presented by applying the
assessment to a manufacturing company. The company, from now on called example-
company for reasons of confidentiality, is a manufacturer of aluminium and copper
components for electromechanical, pneumatic and oil-pressure equipment for the
automation sector. The analysis concerns a plant located in northern Italy, employing
52 persons, of whom 29 are production operators, also dedicated to basic
maintenance tasks; furthermore, the specific maintenance tasks are commissioned by
the company to third parties, namely maintenance service providers or OEMs offering
maintenance services for the after sales. A person – an engineer – is charged with
the maintenance management. The example-company is directly taken out from
the sample of 45 companies presented in section 4.1: the reference sample for the
benchmark, used during the second phase of the assessment procedure, is then
reduced to the remainder of 44 companies.
[Table III. Objective, task and tools at each phase of the maturity assessment procedure. Company driven criticality analysis: objective, to classify the criticalities in the maintenance processes; task, to assess the practices run in the processes of the company; tools, scorecard and bar-charts. Benchmark driven criticality analysis: objective, to make a benchmark with the best companies; task, to assess the distance of the company’s practices with respect to the good/best practices of maintenance management models; tools, scorecard and box plots.]

According to the maturity-based diagnostic process envisioned in the previous
Figure 1, Figure 2 now shows, as a starting point, the maturity profile at the CL: this is
the first outcome of the company driven criticality analysis (i.e. the first phase of the
assessment procedure). The bar-chart provides both the GMI and the component
indexes measuring the capabilities of the maintenance department, i.e. TMI, OMI and
MMI, of the example-company: this enables a general overview of the
weaknesses and strengths of the company in its maintenance practices. In particular,
it is worth pointing out that the example-company is not very mature: the GMI is
only close to ML2 which, according to the scorecard (Table I), means that “the
process is partially planned; performance analysis is mostly dependent on individual
practitioners’ experience and competences; process management is weak because of
deficiencies in the organizational or in the technical systems”. The technological
capability – as measured through the TMI – is the least mature capability for
maintenance operations.
An in-depth analysis is also carried out by breaking down the maturity indexes, to
deploy the maturity profiles of some selected PAs and, thus, to continue with the
maturity-based diagnostic process envisioned in the previous Figure 1. The PAs are
chosen starting from the weakest capability: by doing so, the maturity profile at CL
can be better understood thanks to a bundle of values that helps identify the main
criticalities (i.e. the strongest and weakest PAs for the capability). Figure 3 provides,
for the example-company, the maturity profile of the chosen PAs pertaining to the
technological capability.
In the example shown, the company resulted in the following bundle of values: the
lowest level is achieved for “reliability and maintenance engineering system”, other
PAs (i.e. the “CMMS” and “monitoring, diagnostics and prognostics system”) obtain
a higher level, even if still limited to a low maturity. These low values motivate
the low ML of the TMI. Indeed, this initial assessment led the company to reflect on
the actual status of the maintenance department and be aware, based on tangible
“measures”, of the weak practices run in the maintenance processes: hence, the
example-company had more elements to support the decision.

[Figure 3. Maturity profile of the PAs pertaining to the technological capability: computerized maintenance management system; reliability and maintenance engineering system; monitoring, diagnostics and prognostics system.]

The company, in fact,
actually decided to improve its practices through the investment in a CMMS, in order
to enhance the technological capability as first lever for the improvement. In the
expectation before the investment, this should have guaranteed a better control of
the maintenance activity, a proper information collection and maintenance plan
management, thus finally creating further possibilities for the use of data analysis
supporting tools for maintenance engineering activities.
As a concluding remark of the first phase of the assessment – that is, the company
driven criticality analysis – it can be asserted that the example-company reached a
proper awareness so to be convinced that an action should be done on the CMMS
lever to reduce the criticalities. The second phase – that is, the benchmark driven
criticality analysis – extends the understanding of the criticalities: in particular, at this
phase, the decision maker of the example-company can also reflect on the actual
status of its maintenance department, after discovering the position of the company in
the reference sample used for the benchmark.
In the remainder, the case study analysis highlights this second phase by looking
both at the “as-was” situation of the example-company (before the CMMS
implementation) and at the “as-is” situation (after the implementation). In particular, all
the graphs (the box plots reported in Figures 4-6) show the maturity indexes
of the example-company positioned in the reference sample: the “as-is” situation is
graphically indicated by circles, the “as-was” situation is drawn as rhombs.
As a first step of the second phase of the assessment, the sample distribution of the
companies used for the benchmark should be characterized: a box plot (Figure 4) is
used to characterize the general ML (the GMI) and the indexes related to
each capability of the maintenance department, i.e. OMI, TMI, MMI.
As visualized by the GMI’s box plot, the companies in the reference sample
achieve on average a level around ML3. This is due in great part to the low level of
maintenance practices in technological and managerial capabilities (see the low values
visualized by the TMI’s and MMI’s box plots). Conversely, the organizational
capability (i.e. measured by the OMI) is generally better; nonetheless, the sample is
quite dispersed and companies may also show very low maturities for the
organizational capability. As a second step of this phase, the example-company is
positioned in the
sample, see the rhombs and circles drawn in Figure 4: using the benchmark, the
company can now discover its positioning according to the reference sample.
It is now relevant to discuss the selection of the proper benchmark to be
adopted. In Figure 4, the sample of companies was characterized by similar
geographical locations of the production sites (i.e. northern Italy). Nevertheless,
benchmarking should take into account some explanatory variables influential
for the practices, to drive the creation of the proper sample and, thus, to make the
benchmark as homogeneous as possible. For this reason, it was decided to look
at a sub-sample of companies competing in the discrete manufacturing industry,
including only those companies taken from the automotive and mechanical sector:
next Figure 5 shows the box plots of the maturity indexes recalculated based on the
use of this sub-sample (25 companies, not including the example-company). This new
reference sample was considered better for achieving a relevant comparison for
the example-company, which is then positioned therein (rhombs and circles in the
box plots).
Figure 5 shows how, as a consequence of the investment in the CMMS, the
company is now aligned with the average technological ML reached by similar
companies of the sub-sample. Instead, before the CMMS implementation took place,
the company was clearly positioned at the bottom of the lower whisker of the box plot
of the technological capability: apart from the low absolute value of the related PAs –
revealed also during the first phase of the assessment – a relative weakness, compared
with companies in the manufacturing industry, came out. Indeed, this was another
reason that moved the decision maker to think about the investment in the
technological capability of the company.
The graphs in Figure 5 open the possibility to deepen the analysis on specific PAs:
not only maturity indexes at CL can be used for benchmarking, but also specific
comparisons amongst the maturities of PAs are advisable. Next Figure 6 supports this
further analysis.
In particular, the previous results highlighted that the technological capability was
critical, considering also the benchmark. Therefore, an in-depth analysis was also
done to make a diagnosis of the maturities measured at the CL. The subsequent
breakdown confirmed that the CMMS was the lever to be improved: the example-
company, in fact, was clearly at the bottom of the lower whisker of the box plot drawn
for the CMMS PA (see the rhombs in Figure 6).
In the case study, the example-company actually decided to invest in the
implementation of the CMMS and related technological practices in order to
eventually enhance the use of information for reliability and maintenance
engineering. In particular, the investment was strategically driven by the need to
achieve, at first, better control of the maintenance activities and, as a follow up, better
planning capability. This resulted in the major shift from a measured value of GMI
(see the circle for the GMI in Figure 5) equal to ML2 to a value equal to ML3 which,
according to the scorecard (Table I), represents a ML where “the process is planned;
semi-quantitative analyses are done periodically to define good practices/management
procedures; process management depends on some specific constraints for the
organizational responsibility or the technical systems”. Indeed, some constraints are
still limiting the full exploitation of the benefits gained with the CMMS
implementation: in particular, the managerial capability of the maintenance
department is still weak in the “as-is” situation (see the circle for the MMI in Figure
5).

5. Conclusions
The research presented in this paper started from the general need of manufacturing
companies to be supported in the identification of levers to improve their existing
maintenance management system. The solution proposed to this end was a
maintenance maturity assessment method. This was theoretically built upon a
scorecard inspired by the CMMI methodology and was further implemented as a
procedure made up of two phases, whose objectives are, first, to classify the
criticalities in maintenance processes and, thereafter, to make a benchmark with the
best companies in order to better drive the investment decision required for the
improvement. The method is now available and can be an aid to “measure” the state
of maintenance practices in a company.
The method makes it possible to answer two questions. The first question (which are the most
critical processes to invest on?) has also been addressed by other methods available in
literature. Nonetheless, it is worth mentioning that the scientific value added by the
proposal of this paper is the structuring of a multilevel maturity-based diagnostic
analysis based on the application of a consolidated methodology (CMMI) to the
capabilities and PAs considered for a maintenance department, synthesized from a
wide number of references from literature on maintenance management models. The
second question (which are the processes to focus on, considering the best companies
as a reference target?) instead led to an innovative concept: to use the
potentials of the maturity assessment methods to make a measurable benchmark on
maintenance practices. To the authors’ knowledge, in fact, structured approaches to
maintenance maturity assessment are beginning to appear in the literature, showing some
similarities with respect to the method proposed in this paper (see in particular the
above cited Schuh et al., 2009). Nonetheless, no one has exploited, for a
benchmarking purpose, the capability of maturity assessment methods to provide a
normative description as ranked orders of practices (from practices at low MLs to
practices at high MLs). The demonstration presented in this paper, and other
feedback already collected in the frame of the whole TeSeM research (extended to
four regions in Italy and over 100 interviewed companies), provided promising
indications of the industrial interest in such a kind of benchmarking activity.
Future research should exploit the maturity assessment method in at least three
ways. First, the method may be widely applied in empirical surveys aimed at
analyzing, and measuring, the state of maintenance practices. To this regard, it will be
interesting to make cross-comparisons of the state of practices between different
industries, sectors or company sizes. Second, further exploitation can naturally be
related to benchmarking projects and services, in order to support companies in
becoming aware of criticalities in their maintenance management system, before planning
investments for improvement. Last but not least, it will be also relevant to complete
the empirical methodology, combining the maintenance maturity assessment with the
analysis of typical key performance indicators (KPI) used to measure the maintenance
results. The main objective behind this research idea is to prove that
maturity assessment can be considered a way to measure a set of “leading” indicators
(the maturity indexes) which can be used to anticipate positive results measurable by
means of “lagging” indicators, such as those measured through KPIs commonly
adopted in maintenance.
Note
1. The method has been used during the first annual research of the Observatory TeSeM, a
permanent research structure of the School of Management of Politecnico di Milano which
integrates in a collaboration platform other Italian universities (at the moment, Politecnico
di Bari, Università degli Studi di Bergamo, Università degli Studi di Bologna, Università
degli Studi di Firenze) with the purpose of monitoring the state of the art of the maintenance
choices within small, medium and large companies located in different regions of Italy, in
the field of industrial plants, infrastructures and services.

References
Aberdeen Group (2006), “The asset management benchmark report-moving toward zero
downtime”, The Aberdeen Group, pp. 27-28, available at: http://aberdeen.com/Aberdeen-
Library/2852/RA_AssetMgnt_MOH_2852.aspx (accessed 25 June 2013).
Al-Turki, U. (2011), “A framework for strategic planning in maintenance”, Journal of Quality
in Maintenance Engineering, Vol. 17 No. 2, pp. 150-162.
Cholasuke, C., Bhardwa, R. and Antony, J. (2004), “The status of maintenance management in
UK manufacturing organizations: results from a pilot survey”, Journal of Quality in
Maintenance Engineering, Vol. 10 No. 1, pp. 5-15.
CMMI Product Team (2001), Capability Maturity Model Integration (CMMI), Version 1.1,
Carnegie Mellon University, Pittsburgh, PA.
CMMI Product Team (2010), CMMI-SVC, Version 1.3, Carnegie Mellon University, Pittsburgh, PA.
Crespo Márquez, A. (2007), The Maintenance Management Framework: Models and Methods for
Complex Systems Maintenance, Springer Verlag, London.
Crespo Márquez, A. and Gupta, J.N.D. (2006), “Contemporary maintenance management:
process, framework and supporting pillars”, Omega, Vol. 34 No. 3, pp. 313-326.
Crespo Márquez, A., Moreu de León, P., Gómez Fernández, J.F., Parra Márquez, C. and
López Campos, M. (2009), “The maintenance management framework: a practical view
to maintenance management”, Journal of Quality in Maintenance Engineering, Vol. 15
No. 2, pp. 167-178.
De Bruin, T., Rosemann, M., Freeze, R. and Kulkarni, U. (2005), “Understanding the main
phases of developing a maturity assessment model”, Proceedings of the Australasian
Conference on Information Systems (ACIS), Sydney, 30 November-2 December.
Fumagalli, L., Di Leone, F., Jantunen, E. and Macchi, M. (2010), “Economic value of
technologies in an eMaintenance platform”, Proceedings of the 1st IFAC Workshop
A-MEST’10, Advanced Maintenance Engineering, Services and Technology, Lisboa, 1-2
July, pp. 23-28.
Fumagalli, L., Elefante, D., Macchi, M. and Iung, B. (2008), “Evaluating the role of
maintenance maturity in the adoption of new ICT in the process industry”, Proceedings
of the 9th IFAC Workshop on IMS (Intelligent Manufacturing Systems), Szczecin, 9-10
October.
Garetti, M., Macchi, M., Terzi, S. and Fumagalli, L. (2007), “Investigating the organizational
business models of maintenance when adopting self diagnosing and self healing ICT systems
in multi site contexts”, Proceedings of the IFAC CEA (Conference on Cost Effective
Automation in Networked Product Development and Manufacturing), Monterrey, 2-5
October.
Gomez Fernandez, J.F., Fumagalli, L., Macchi, M. and Crespo Marquez, A. (2008), “A
scorecard approach to investigate the IT in the maintenance business models”,
Proceedings of the Annual 10th International Conference on The Modern Information
Technology in the Innovation Processes of the Industrial Enterprises, Prague, 12-14
November.
Hain, S. and Back, A. (2009), “State-of-the-art on maturity models for collaboration”, available
at: www.alexandria.unisg.ch/publications/advanced-search/214253 (accessed 15 October
2012).
Hauge, B.S. and Mercier, B.A. (2003), “Reliability centered maintenance maturity level
roadmap”, Proceedings of the Annual Reliability and Maintainability Symposium,
Tampa, FL, 27-30 January.
Jantunen, E., Adgar, A. and Arnaiz, A. (2008), “Actors and roles in E-maintenance”, The Fifth
International Conference on Condition Monitoring & Machinery Failure Prevention
Technologies (CM2008/MFPT2008 Conference), Heriot-Watt University, Edinburgh,
15-18 July.
Jonsson, P. (1999), “Company-wide integration of strategic maintenance: an empirical analysis”,
International Journal of Production Economics, Vols 60-61, pp. 155-164.
Kajko-Mattsson, M. (2002), “Problem management maturity within corrective maintenance”,
Journal of Software Maintenance and Evolution: Research and Practice, Vol. 14 No. 3,
pp. 197-227.
López Campos, M. and Crespo Márquez, A. (2009), “Review, classification and
comparative analysis of maintenance management models”, Journal of Automation,
Mobile Robotics and Intelligent Systems, Vol. 3 No. 3, pp. 110-115.
López Campos, M.A. and Crespo Márquez, A. (2011), “Modelling a maintenance management
framework based on PAS 55 standard”, Quality and Reliability Engineering International,
Vol. 27 No. 6, pp. 805-820.
López Campos, M.A., Gómez Fernández, J.F., González Díaz, V. and Crespo Márquez, A.
(2010), “A new maintenance management model expressed in UML”, in Briš,
Guedes Soares and Martorell (Eds), Reliability, Risk and Safety: Theory and
Applications, Taylor & Francis Group, London, ISBN 978-0-415-55509-8.
Macchi, M., Fumagalli, L., Pizzolante, S., Crespo Marquez, A. and Gomez Fernandez, J.F.
(2010), “Towards e-Maintenance: maturity assessment of maintenance services for new
ICT introduction”, Proceedings of the APMS 2010 Conference, Como, 11-13 October.
Macchi, M., Fumagalli, L., Rosa, P., Farruku, K. and Gasparetti, M. (2011), “Maintenance
maturity assessment: a method and first empirical results in manufacturing industry”,
Proceedings of the MPMM 2011 Conference, Luleå, 13-15 December.
Minzoni, M. (2004), “CMMI: optimize the process to improve the product”, available at:
www.mokabyte.it (accessed 21 November 2011).
Muller, A., Crespo Marquez, A. and Iung, B. (2008), “On the concept of e-maintenance: review
and current research”, Reliability Engineering and System Safety, Vol. 93 No. 8,
pp. 1165-1187.
Murthy, D.N.P., Atrens, A. and Eccleston, J.A. (2002), “Strategic maintenance management”,
Journal of Quality in Maintenance Engineering, Vol. 8 No. 4, pp. 287-305.
Parida, A. and Kumar, U. (2006), “Maintenance performance measurement (MPM): issues and
challenges”, Journal of Quality in Maintenance Engineering, Vol. 12 No. 3, pp. 239-251.
Paulk, M.C., Curtis, B., Chrissis, M.B. and Weber, C.V. (1993), Capability Maturity Model for
Software, Version 1.1, Carnegie Mellon University, Pittsburgh, PA.
Pinjala, S.K., Pintelon, L. and Vereecke, A. (2006), “An empirical investigation on the
relationship between business and maintenance strategies”, International Journal of
Production Economics, Vol. 104 No. 1, pp. 214-229.
Pintelon, L.M. and Gelders, L.F. (1992), “Maintenance management decision making”,
European Journal of Operational Research, Vol. 58 No. 3, pp. 301-317.
Schuh, G., Lorentz, B., Winter, C.P. and Gudergan, G. (2009), “The house of maintenance –
identifying the potential for improvement in internal maintenance organizations by means
of a capability maturity model”, Proceedings of the 4th World Congress on Engineering
Asset Management, Athens, 28-30 September.
Söderholm, P., Holmgren, M. and Klefsjö, B. (2007), “A process view of maintenance
and its stakeholders”, Journal of Quality in Maintenance Engineering, Vol. 13 No. 1,
pp. 19-32.
Swanson, L. (2001), “Linking maintenance strategies to performance”, International Journal of
Production Economics, Vol. 70 No. 3, pp. 237-244.
TeSeM (2012), “Osservatorio Tecnologie e Servizi per la Manutenzione”, available at:
www.tesem.net/english-site (accessed 15 October 2012).
Tsang, A. (1998), “A strategic approach to managing maintenance performance”, Journal of
Quality in Maintenance Engineering, Vol. 4 No. 2, pp. 87-94.
Tsang, A. (2002), “Strategic dimensions of maintenance management”, Journal of Quality in
Maintenance Engineering, Vol. 8 No. 1, pp. 7-39.
Vanneste, S.G. and Van Wassenhove, L.N. (1995), “An integrated and structured approach to
improve maintenance”, European Journal of Operational Research, Vol. 82 No. 2,
pp. 241-257.
Volker, L., Van der Lei, T.E. and Ligtvoet, A. (2011), “Developing a maturity model for
infrastructural asset management systems”, in Beckers, T. and Von Hirschhausen, C.
(Eds), Proceedings of the 10th Conference on Applied Infrastructure Research – Infraday
2011, TU Berlin, Berlin, 7-8 October.
Waeyenbergh, G. and Pintelon, L. (2002), “A framework for maintenance concept development”,
International Journal of Production Economics, Vol. 77 No. 1, pp. 299-313.

About the authors


Marco Macchi is a Researcher at Politecnico di Milano, where he currently teaches Modelling
of Production Systems and Logistics and Design and Management of Production Plants. He is
Vice Director of the Executive Master on Industrial Maintenance Management, scientific
coordinator of the Observatory on Technologies and Services for Maintenance, Secretary of
the IFAC Working Group on Advanced Maintenance Engineering, Services and Technology,
and Book Review Editor of Production Planning & Control. He is author or co-author of four
books and more than 100 papers at national and international level. His research interests are
in industrial engineering, with a special focus on maintenance management and industrial plant
automation. Marco Macchi is the corresponding author and can be contacted at:
marco.macchi@polimi.it
Luca Fumagalli holds a post-doc position at the Department of Management, Economics
and Industrial Engineering of Politecnico di Milano. He obtained a Master of Science in
Mechanical Engineering in 2006. He was a PhD student between January 2007 and December
2009 and obtained his PhD in 2010, with a thesis concerning ontology-based solutions for the
support of maintenance processes. His research interests are focused on innovations in
maintenance management. He has been a lecturer and teaching assistant in courses at
undergraduate and postgraduate level, such as Reliability Analysis, Maintenance
Management and Modelling of Production Systems.

