
Copyright 2003 Information Systems Audit and Control Association. All rights reserved. www.isaca.org.

The COBIT Maturity Model in a Vendor Evaluation Case
By Andrea Pederiva

The maturity model provided by the COBIT Management
Guidelines for the 34 COBIT IT processes is becoming
an increasingly popular tool to manage the timeless
issue of balancing risk and control in a cost-effective manner.
Control Objectives for Information and related Technology
(COBIT) is published by the IT Governance Institute (ITGI)
and Information Systems Audit and Control Foundation
(ISACF).
The COBIT Maturity Model is an IT governance tool used
to measure how well developed the management processes are
with respect to internal controls. The maturity model allows an
organization to grade itself from nonexistent (0) to optimized
(5). Such capability can be exploited by auditors to help management fulfill its IT governance responsibilities, i.e., exercise
effective responsibility over the use of IT just like any other
part of the business.1
A fundamental feature of the maturity model is that it
allows an organization to measure as-is maturity levels, and
define to-be maturity levels as well as gaps to fill. As a result,
an organization can discover practical improvements to the
system of internal controls of IT. However, maturity levels are
not a goal, but rather they are a means to evaluate the adequacy of the internal controls with respect to company business
objectives.
In volume 6, 2002, of the Information Systems Control
Journal, the article "Control and Governance Maturity Survey:
Establishing a Reference Benchmark and a Self-assessment
Tool," by Erik Guldentops, CISA, CISM, Wim Van
Grembergen, Ph.D., and Steven De Haes, discusses the results
of the 2002 ISACA survey on the maturity level of 15 COBIT
IT processes. According to the article, the survey's target processes
were selected a year earlier by interviewing a group of 20 IT and
senior experts.
The ISACA survey results can be used as a reference
benchmark and a self-assessment tool. The results of the
survey cover a broad range of countries, industries and size
groups, making them useful for numerous companies
worldwide.
In one engagement, this author was part of a team that used
the COBIT Maturity Model to benchmark four potential vendors,
and then compared its results to the ISACA survey results.
The process undertaken, as well as the lessons learned and the
results, is discussed in the remainder of this article.

Main Issues and Lessons Learned


At the beginning of this benchmarking effort, there were
two main issues:
• The need for a criterion to choose the processes to benchmark
• The need for a method to measure the vendors' maturity
levels with respect to the COBIT Maturity Model
The processes to benchmark were chosen by scoring the
COBIT IT processes on a risk-importance basis, from the point
of view of a potential customer. This task followed a logic
similar to the one in the risk assessment form of the COBIT
Implementation Tool Set.
The definition of a method to measure the maturity level
required more effort, in part because the method needed to be
precise and efficient enough to allow for interaction with the
potential vendors. A questionnaire and a ranking system
were developed to compute the maturity level from the
questionnaire results. While the approach was not unusual,
a few new ideas were used that proved to be valuable.
(These new ideas subsequently have been tested by other
AIEA2 colleagues.)
The method used is not strictly incremental and, therefore,
does not satisfy the COBIT Maturity Model's incremental
criterion;3 this has to be checked a posteriori.4
However, the method proved to be robust with respect to
the objective of benchmarking the four organizations under
examination, and the results were consistent with the knowledge
collected on the organizations during the benchmarking effort.
Moreover, it appears that the method can be further developed
into a strictly incremental approach.5
Finally, even though the two sets of results were computed
with different methods, the comparison between the benchmarking
results and the ISACA survey results provided a basis for an
overall discussion of the distribution of the strongest and
weakest areas.

Benchmarking Results
Figure 1 displays the results of the benchmarking effort as
compared with the 2002 ISACA survey results. The four
organizations were essentially software makers with an independent
consultancy branch; hence, the benchmark values refer to the
ISACA 2002 survey results for the Europe, Middle East
and Africa (EMEA) small6 IT service providers. The ISACA
value was computed as the weighted mean of the respondents'
distribution over the six possible maturity levels (0 to 5).
The benchmarking results reveal that three of the four
companies are essentially aligned with one another and with the
ISACA sample, with the exception of a few processes. However,
one company almost always has a maturity level far below the
other companies.
The results can be better understood in light of the following
facts:
• CO4 was the only participant in the benchmark that was not
ISO 9000-certified.
• The respondents are IT vendors with consultancy branches;
hence, IT processes (including AI5 Install and Accredit
Systems) are their core processes.
• The processes in which the ISACA sample has a score lower
than the benchmarking sample appear to be the typical core
processes of an application systems developer and vendor.
This suggests a possibly different composition of the
benchmarking sample and the ISACA sample for the EMEA
small IT service providers.
• CO3 is the IT subsidiary of a publicly owned group with a
strong quality and organizational culture, in part due to its
defense-related experience.
Figure 1: Benchmarking the Results

[Chart not reproduced: maturity scores from 0.00 to 5.00 for the four companies and the ISACA benchmark across nine processes: PO5 Manage the IT Investment, PO10 Manage Projects, AI1 Identify Solutions, AI2 Acquire and Maintain Application Software, AI5 Install and Accredit Systems, AI6 Manage Changes, DS5 Ensure Systems Security, DS10 Manage Problems and Incidents, and M1 Monitor the Processes.]
Benchmark Method
The method used in the benchmark is based on a questionnaire
derived from the COBIT Maturity Model, and it relies on
a scenario concept, i.e., every maturity level is considered
to be a scenario. A maturity level scenario includes the description
of the organization and the internal controls of a company
satisfying the requirements of a specific maturity level.
The questionnaire is intended to capture the compliance of
the IT organization under investigation with the diverse scenarios
describing each maturity level. Based on the questionnaire
results, an algorithm computes a compliance vector that
describes the compliance of the organization with every scenario.
Then, it uses the vector to compute the maturity level as a
weighted average of the organization's compliance with respect
to each scenario.
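
To make the computation concrete, the following is a minimal Python sketch of the two steps just described; the function names are illustrative and not taken from the article.

```python
# Minimal sketch of the scoring pipeline: average the per-statement
# compliance values into a per-scenario compliance, then take a
# weighted average of the levels. Function names are illustrative.

def scenario_compliance(statement_values):
    """Average the per-statement compliance values (0, 0.33, 0.66 or 1)
    to obtain the compliance of one maturity level scenario."""
    return sum(statement_values) / len(statement_values)

def summary_maturity_level(compliance_vector):
    """Weighted average of levels 0 to 5, weighted by the normalized
    compliance of the organization with each level's scenario."""
    total = sum(compliance_vector)
    return sum(level * c / total for level, c in enumerate(compliance_vector))
```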

The Questionnaire
To arrange the questionnaire, the maturity level descriptions
of the COBIT Maturity Model were studied. It was concluded
that the descriptions of the maturity levels could be viewed as
sets of atomic statements, each of which can be true or false,
or partially true or partially false. The examination led to the
realization that a compliance value could be computed for
each maturity level by collecting and then combining a
compliance value for each statement.
Based on this concept, the maturity level descriptions were
split into separate statements, and all statements in the maturity
level descriptions were listed separately in the questionnaire.7
Figure 2 displays an example of how the questionnaire
statements were derived from the maturity model of process
PO10 Manage Projects.
Figure 2: Questionnaire Construction for PO10 Manage Projects (Maturity Level 0 and Part of Level 1)

Maturity level description, 0 Nonexistent: Project management techniques are not used and the organization does not consider business impacts associated with project mismanagement and development project failures.

Derived questionnaire statements:
• The organization does not use project management techniques.
• The organization does not report on project mismanagement effects or development project failures.

Maturity level description, 1 Initial/Ad hoc: The organization is generally aware of the need for projects to be structured and it is aware of the risks of poorly managed projects. The use of project management techniques and approaches within IT is a decision left to individual IT managers. Projects are generally poorly defined.

Derived questionnaire statements:
• The organization is generally aware of the need for projects to be structured.
• The organization is aware of the risks of poorly managed projects.
• The use of project management techniques and approaches within IT is a decision left to individual IT managers.
• Projects are generally poorly defined.
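
As an illustration of this derivation, the statements of figure 2 can be encoded as a plain mapping from each maturity level to its list of atomic statements; the variable name below is hypothetical.

```python
# Hypothetical encoding of figure 2: each maturity level of PO10 maps to
# the atomic statements derived from its description (level 0 and part
# of level 1 only, as in the figure).
PO10_STATEMENTS = {
    0: [
        "The organization does not use project management techniques.",
        "The organization does not report on project mismanagement "
        "effects or development project failures.",
    ],
    1: [
        "The organization is generally aware of the need for projects "
        "to be structured.",
        "The organization is aware of the risks of poorly managed projects.",
        "The use of project management techniques and approaches within "
        "IT is a decision left to individual IT managers.",
        "Projects are generally poorly defined.",
    ],
}
```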

To obtain a compliance value for each statement, the following
question was asked: "With respect to your organization,
how much do you agree with the following statements?" Four
possible answers were provided: not at all, a little, quite a lot
and completely.
The answers were mapped to the compliance values
0, 0.33, 0.66 and 1 (figure 3).8
Figure 3: Compliance Level Numeric Values

Agreement with Statement    Compliance Value
Not at all                  0
A little                    0.33
Quite a lot                 0.66
Completely                  1
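
In code, the scale of figure 3 is a simple lookup table (the dictionary name is illustrative):

```python
# The four-point agreement scale of figure 3 mapped to compliance values.
ANSWER_VALUE = {
    "not at all": 0.0,
    "a little": 0.33,
    "quite a lot": 0.66,
    "completely": 1.0,
}
```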

The Algorithm
When the questionnaire is complete, each maturity level
has a set of statements, each with its own compliance
value of 0, 0.33, 0.66 or 1. For example, figure 4 shows the
fully compiled questionnaire for the level 3 maturity model of
process PO10 Manage Projects, with the compliance values
for each statement.

Figure 4: Questionnaire for the Level 3 Maturity Model of Process PO10

Each of the 11 statements below was rated on the scale of figure 3 (not at all, a little, quite a lot, completely); the resulting compliance values sum to a total of 8.63 for level 3.
1. The IT project management process and methodology have been formally established and communicated.
2. IT projects are defined with appropriate business and technical objectives.
3. Stakeholders are involved in the management of IT projects.
4. The IT project organization and some roles and responsibilities are defined.
5. IT projects have defined and updated schedule milestones.
6. IT projects have defined and managed budgets.
7. IT project monitoring relies on clearly defined performance measurement techniques.
8. IT projects have formal post-system implementation procedures.
9. Informal project management training is provided.
10. Quality assurance procedures and post-system implementation activities have been defined, but are not broadly applied by IT managers.
11. Policies for using a balance of internal and external resources are being defined.

The compliance value for the scenario can be computed as
the average of the compliance values of its statements. In the
case of maturity level 3, it is equal to 8.63/11 = 0.78.
Working in the same way on the other maturity levels, one
can compute a compliance value for all the maturity levels
from 0 to 5. Figure 5 provides an example of the possible
results of this computation.

Figure 5: Computation of the Maturity Level Compliance Values

Maturity  Sum of statements'     Number of maturity    Maturity level
level     compliance values (A)  level statements (B)  compliance value (A/B)
0         0.00                   2.00                  0.00
1         0.00                   9.00                  0.00
2         3.00                   6.00                  0.50
3         8.63                   11.00                 0.78
4         6.97                   9.00                  0.77
5         6.31                   8.00                  0.79

A key idea was to see the compliance values as a description of
the contribution of each maturity level scenario to the
overall maturity level of the organization. To this end, the
compliance values were normalized so that they sum to 1, as
shown in the example in figure 6.

Figure 6: Computation of the Normalized Compliance Vector

Level   Not normalized          Normalized compliance
        compliance values (A)   values [A/Sum(A)]
0       0.00                    0.000
1       0.00                    0.000
2       0.50                    0.176
3       0.78                    0.275
4       0.77                    0.272
5       0.79                    0.277
Total:  2.84                    1

Finally, the summary maturity level for the process was
computed by combining the normalized compliance values for
each maturity level, as shown in figure 7.

Figure 7: Computation of the Summary Maturity Level

Level   Normalized compliance   Contribution
(A)     values (B)              (A*B)
0       0.000                   0.00
1       0.000                   0.00
2       0.176                   0.35
3       0.275                   0.83
4       0.272                   1.09
5       0.277                   1.38
Total maturity level:           3.65
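
The whole computation can be checked with a short Python sketch that reproduces the worked example of figures 5 through 7 using the article's own numbers.

```python
# Sums of statement compliance values (A) and statement counts (B) for
# levels 0 through 5, taken from figure 5.
sums = [0.00, 0.00, 3.00, 8.63, 6.97, 6.31]
counts = [2, 9, 6, 11, 9, 8]

# Figure 5: per-level compliance values A/B
# (0.00, 0.00, 0.50, 0.78, 0.77, 0.79 after rounding).
compliance = [a / b for a, b in zip(sums, counts)]

# Figure 6: normalize the vector so it sums to 1
# (0.000, 0.000, 0.176, 0.275, 0.272, 0.277 after rounding).
total = sum(compliance)
normalized = [c / total for c in compliance]

# Figure 7: weighted average of the levels; the per-level contributions
# (0.00, 0.00, 0.35, 0.83, 1.09, 1.38) sum to the summary maturity level.
summary = sum(level * w for level, w in enumerate(normalized))
print(round(summary, 2))  # 3.65
```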

Lessons Learned
Because of its construction criteria, the questionnaire is
aligned completely with the maturity model and is fairly detailed
with respect to the maturity requirements. This proved
useful in supporting subsequent discussions aimed at identifying
the key points that were enabling or preventing the
organization from reaching a given maturity level.
The experience also suggested that, to exploit the benefits of
the maturity level paradigm, it is useful to allow a company
to grade itself in the full range from 0 to 5, rather than
having to choose a position in the coarse grid (0, 1, 2, ... 5).
As a suggestion, when performing a benchmarking effort, first
discuss the maturity requirements without showing the maturity
level to which the statements belong. This reduces the bias
that can arise when respondents know how their answers affect
the final result.
Further work must be done to assemble a strictly incremental
approach to the measurement of the maturity level, as
required by the COBIT Maturity Model. However, for comparison
purposes, the method presented here proved to be efficient,
providing robust results and facilitating the subsequent discussions
on the use of the COBIT Maturity Model as an effective
management tool.


To make the method applicable beyond comparison, e.g., for
planning improvements (as-is and to-be levels, gap analysis), it must
manage partial compliance at the lower levels. That is not an issue
for the comparison purposes of the method presented here, but for
improvement planning one wants to see consistency at the lower
levels before evaluating the contributions at the higher levels.
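
As a rough sketch of such a check, in the spirit of endnote 4, one could verify a posteriori that the scenario of the integer level just below the summary score is nearly fully satisfied before trusting a rating such as 3.5. The 0.9 threshold below is an illustrative assumption, not a value from the article.

```python
# A posteriori consistency check: a summary score of, e.g., 3.5 is only
# trusted if the level 3 scenario itself is nearly fully satisfied.
def meets_base_level(compliance_vector, summary, threshold=0.9):
    base = int(summary)  # e.g., 3 for a summary score of 3.5
    return compliance_vector[base] >= threshold
```

In the worked example above, the summary score is 3.65 while the level 3 compliance is only 0.78, so this check would flag the result for discussion rather than confirm it.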

Summary
The COBIT Management Guidelines do not suggest any special methodology to measure the maturity level of the IT
processes, and many approaches can be followed. However, to
use the COBIT Maturity Model as an effective management
tool, companies must develop an efficient methodology to
measure the maturity level of their IT processes. The results of
a benchmarking effort based on the COBIT Maturity Model and
the method used to measure a maturity level for the IT processes have been presented. Although the method is not strictly
incremental, as required by the Maturity Model, it proved to be
an efficient tool to measure the maturity level for comparison
purposes. It also provided ideas that can be developed and used
to build a strictly incremental method.

Endnotes
1 Hardy, Gary; "Make Sure Management and IT Are on the
Same Page: Implementing an IT Governance Framework,"
Information Systems Control Journal, volume 3, 2002
2 Associazione Italiana Information Systems Auditors (AIEA)
is the ISACA chapter in Milan, Italy.
3 COBIT Management Guidelines, IT Governance Institute and
Information Systems Audit and Control Foundation, 2000, p. 100
4 For example, one has to check a posteriori whether a company
that the method rates at, say, 3.5 really satisfies all the
conditions to meet maturity level 3.
5 This author has recently engaged in another benchmarking
effort involving some 20 small Italian banks. This effort
involves a strictly incremental approach.
6 Staff of more than 150 people, but fewer than 1,500 people.
7 When the descriptions were split into distinct statements,
some statements partially lost their context and became less
understandable; for this reason, some statements were modified
for the sake of clarity or to avoid ambiguities.
8 It was discovered that equivalent questions and sets of possible
answers could be used; for example, the following alternative was
discussed: "How do you rank the following statements in the range
true, partially true, not completely false, false?" Some alternatives
of this kind were considered, and the question described in the
article was chosen.

Andrea Pederiva
is a manager at Deloitte & Touche in Treviso, Italy. He has
developed extensive experience in the field of management and
control of information systems, with specific reference to the
development and assurance of internal controls for IT organizations.
He has experience in privacy, IT auditing, quality management,
IT security, project management and project risk management.
