
2015 International Conference on Science in Information Technology (ICSITech)

Assessment to COBIT 4.1 Maturity Model Based on Process Attributes and Control Objectives

Teduh Dirgahayu
Department of Informatics
Universitas Islam Indonesia
Yogyakarta, Indonesia
teduh.dirgahayu@uii.ac.id

Dwiyono Ariyadi
Master Program in Informatics
Universitas Islam Indonesia
Yogyakarta, Indonesia
ayick19@gmail.com

Abstract—COBIT 4.1 defines a number of IT processes, each of which consists of several activities. A process is controlled by several control objectives. All processes share the same maturity attributes. COBIT 4.1 however does not define any assessment method to determine the maturity of IT processes. This paper reviews current assessment methods and identifies their drawbacks. It then proposes an assessment method for the COBIT 4.1 maturity model whose calculation is based on process attributes and control objectives. The assessment presumes that when the control objectives are met, the corresponding activities must have been executed. Data are collected using a questionnaire given to selected key functions. The maturity level is calculated using a simple average. The assessment method is illustrated with a case study of process Manage Data (DS11). It allows one to identify the aspects of a process that need improvement.

Keywords—COBIT 4.1; assessment; maturity; process attributes; control objectives

I. INTRODUCTION

Nowadays most enterprises rely heavily on information technology (IT) to enable their operations. IT is viewed as a strategic means to achieve the enterprises' business objectives. Improvements to enterprise IT are hence necessary to ensure that IT aligns with the enterprise's business. To do so, management should understand the status of its enterprise IT in order to make the right decisions on which improvements to conduct. This status indicates how enterprise IT is being managed.

COBIT 4.1 is a framework for enterprise IT governance that was released by the IT Governance Institute (ITGI) in 2007 [1]. In 2012, COBIT 5 was released. We argue that COBIT 4.1 is still relevant to discuss because COBIT 5 expands and integrates COBIT 4.1 with other IT governance frameworks [2].

In COBIT 4.1, the status of enterprise IT is assessed using a maturity model that indicates the reliability degree of enterprise IT. In other areas, maturity models are commonly used to measure processes or artifacts, e.g. in software development processes [3], business process management [4][5], information security [6][7], and e-government [8][9][10].

In COBIT 4.1, the governance of enterprise IT is defined in 34 processes within four domains. Each process consists of a number of control objectives to provide assurance that the process's objective is achieved. Control objectives can be policies, procedures, practices, or organizational structures. Processes are considered the smallest units of enterprise IT governance. The result of a process assessment is indicated as a level of maturity.

All processes share common attributes that represent aspects of those processes, e.g. procedures, tools, and responsibility. A process is mature when all those aspects are mature. These attributes are thus also called maturity attributes [1].

COBIT 4.1 does not suggest any method to assess process maturity. Several studies have proposed assessment methods [11][12][13]. Those methods however do not explicitly use the process attributes in the assessment of process maturity. Hence their results cannot identify which aspects of a process should be improved.

The objectives of this paper are twofold. The first objective is to review current assessments against the COBIT 4.1 maturity model and to identify their potential drawbacks. The second objective is to propose an alternative assessment based on COBIT 4.1 process attributes and control objectives to overcome those potential drawbacks.

This paper is further structured as follows. Section II describes the COBIT 4.1 maturity model. Section III reviews current assessments and identifies their potential drawbacks. Section IV presents an alternative assessment that explicitly considers process attributes and control objectives. The assessment is illustrated with a case study. Section V gives the conclusions and identifies future work.

II. COBIT 4.1 MATURITY MODEL

COBIT 4.1 considers that an enterprise IT process comprises a number of related activities. E.g., in domain Delivery and Support (DS), process Manage Data (DS11) consists of five activities: (i) translate data storage and retention arrangements into procedures; (ii) define, maintain and implement procedures to manage the media library; (iii) define, maintain and implement procedures for secure disposal of media and equipment; (iv) back up data according to scheme; and (v) define, maintain and implement procedures for data restoration [1].
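For concreteness, the structure just described can be written down as a small data model. The following Python sketch is ours, not part of COBIT 4.1; it merely records process DS11 with its activities and control objectives (the latter are listed in the next section):

# Illustrative only: a minimal data model of the COBIT 4.1 notions used in
# this paper. The class and field names are our own choices.
from dataclasses import dataclass

@dataclass
class Process:
    pid: str                       # process identifier, e.g. "DS11"
    name: str                      # process name, e.g. "Manage Data"
    activities: list[str]          # related activities comprising the process
    control_objectives: list[str]  # control objectives governing the process

ds11 = Process(
    pid="DS11",
    name="Manage Data",
    activities=[
        "translate data storage and retention arrangements into procedures",
        "define, maintain and implement procedures to manage the media library",
        "define, maintain and implement procedures for secure disposal of media and equipment",
        "back up data according to scheme",
        "define, maintain and implement procedures for data restoration",
    ],
    control_objectives=["DS11.1", "DS11.2", "DS11.3", "DS11.4", "DS11.5", "DS11.6"],
)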




An activity execution involves one or more functions within the enterprise, as indicated in a RACI chart. This chart indicates which units shall be responsible (R) and accountable (A) for that process, and which units shall be consulted (C) and informed (I) about that process.

A process is controlled by a number of control objectives that can be policies, procedures, practices, or organizational structures. E.g., process DS11 has six control objectives: DS11.1 business requirements for data management; DS11.2 storage and retention arrangements; DS11.3 media library management system; DS11.4 disposal; DS11.5 backup and restoration; and DS11.6 security requirements for data management [1].

The performance of a process is measured using key performance indicators (KPIs). Their results reflect the achievement of the process goals. E.g., process DS11 defines the KPIs: (i) percent of user satisfaction with availability of data; (ii) percent of successful data restorations; and (iii) number of incidents where sensitive data were retrieved after media were disposed. KPI scores can only be known when the process includes monitoring and measurement [1].

In an enterprise, a process evolves from non-existent to a process with optimized capability. As mentioned earlier, processes are the smallest units of IT governance; hence the maturity model is used to measure process maturity. Table 1 depicts the COBIT 4.1 generic maturity model. Each process adapts this generic model into its own specific maturity model, e.g. levels 0-2 of process DS11 as shown in Table 2.

TABLE 1. GENERIC MATURITY MODEL [1]

Level | Status | Description
0 | Non-existent | Process is not applied at all.
1 | Initial/ad hoc | Process is ad hoc and disorganized.
2 | Repeatable but intuitive | Process follows regular patterns.
3 | Defined process | Process is documented and communicated.
4 | Managed and measurable | Process is monitored and measured.
5 | Optimized | Process follows best practices and is automated.

TABLE 2. MATURITY LEVELS OF PROCESS DS11 (EXCERPT) [1]

Level | Description
0 | Data are not recognized as corporate resources and assets. There is no assigned data ownership or individual accountability for data management. Data quality and security are poor or non-existent.
1 | The organization recognizes a need for effective data management. There is an ad hoc approach for specifying security requirements for data management, but no formal communications procedures are in place. No specific training on data management takes place. Responsibility for data management is not clear. Backup/restoration procedures and disposal arrangements are in place.
2 | The awareness of the need for effective data management exists throughout the organization. Data ownership at a high level begins to occur. Security requirements for data management are documented by key individuals. Some monitoring within IT is performed on data management key activities (e.g., backup, restoration, disposal). Responsibilities for data management are informally assigned for key IT staff members.

A process has six maturity attributes, i.e. awareness and communication (AC); policies, plans and procedures (PPP); tools and automation (TA); skills and expertise (SE); responsibility and accountability (RA); and goal setting and measurement (GSM). These attributes represent a process from different aspects [1].

COBIT 4.1 includes a maturity attribute table that results from combining the maturity model and the attributes, as depicted in Table 3. Due to space limitations, the contents of the table are not shown here; the full table can be found in [1]. Table 4 shows examples of the contents of the maturity attribute table. Attribute descriptions are not process-specific.

TABLE 3. MATURITY ATTRIBUTES TABLE (CONTENTS OMITTED) [1]

Level | AC | PPP | TA | SE | RA | GSM
1 | ... | ... | ... | ... | ... | ...
2 | ... | ... | ... | ... | ... | ...
3 | ... | ... | ... | ... | ... | ...
4 | ... | ... | ... | ... | ... | ...
5 | ... | ... | ... | ... | ... | ...

TABLE 4. MATURITY LEVELS OF AWARENESS AND COMMUNICATION [1]

Level | Awareness and Communication (AC)
1 | Recognition of the need for the process is emerging. There is sporadic communication of the issues.
2 | There is awareness of the need to act. Management communicates the overall issues.
3 | There is understanding of the need to act. Management is more formal and structured in its communication.
4 | There is understanding of the full requirements. Mature communication techniques are applied and standard communication tools are in use.
5 | There is advanced, forward-looking understanding of requirements. Proactive communication of issues based on trends exists, mature communication techniques are applied, and integrated communication tools are in use.

The maturity model is used to assess the current (as-is) and the expected (to-be) status of enterprise IT. The gap between the current and expected statuses indicates where improvements are needed, including their priorities [14].

III. REVIEW OF CURRENT MATURITY ASSESSMENTS

A maturity survey has been conducted to establish a reference benchmark [15]. It indicated that the maturity of enterprise IT processes was between 2.0 and 2.5. The assessment focused on the 15 most important processes. Unfortunately, the assessment method is not explained.

Several maturity assessment methods have been proposed [11][12][13]. The method in [11] uses a questionnaire and a ranking system. This method however does not take into account the evolving nature of a process as suggested by the COBIT 4.1 maturity model. Instead, it measures the "compliance" of an IT process with each maturity level.


Questionnaire statements are derived straightforwardly from the maturity level descriptions. E.g., the questionnaire statements for process DS11 listed in Table 5 are derived from the descriptions in Table 2. Possible answers to these statements are provided on a Likert scale as "not at all" (compliance value = 0.00), "a little" (0.33), "quite a lot" (0.66), and "completely" (1.00). The maturity level is then calculated as a weighted average of those compliance values.
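This calculation can be sketched as follows. The sketch is ours, not taken from [11]: the exact weighting scheme is defined in [11]; here we assume, for illustration only, that each level's average compliance is normalized over all levels and weighted by its level number.

# A minimal sketch of the maturity calculation in [11] as we read it.
# Assumption (ours): each level's average compliance is normalized over
# all levels and weighted by the level number.
LIKERT = {"not at all": 0.00, "a little": 0.33, "quite a lot": 0.66, "completely": 1.00}

def maturity(answers_per_level: dict) -> float:
    # answers_per_level maps a maturity level (0-5) to the list of Likert
    # answers given to that level's questionnaire statements.
    compliance = {lvl: sum(LIKERT[a] for a in ans) / len(ans)
                  for lvl, ans in answers_per_level.items()}
    total = sum(compliance.values())
    # Weighted average: each level contributes in proportion to its compliance.
    return sum(lvl * c / total for lvl, c in compliance.items())

# Hypothetical answers for the DS11 statements of levels 0-2 (cf. Table 5):
answers = {0: ["a little"] * 3, 1: ["quite a lot"] * 5, 2: ["a little"] * 5}
print(round(maturity(answers), 2))  # 1.0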
TABLE 5. QUESTIONS FOR PROCESS DS11 LEVELS 0-2 [11]

Level | No. | Questionnaire statement
0 | 0.1 | Data are not recognized as corporate resources and assets.
0 | 0.2 | There is no assigned data ownership or individual accountability for data management.
0 | 0.3 | Data quality and security are poor or non-existent.
1 | 1.1 | The organization recognizes a need for effective data management.
1 | 1.2 | There is an ad hoc approach for specifying security requirements for data management, but no formal communications procedures are in place.
1 | 1.3 | No specific training on data management takes place.
1 | 1.4 | Responsibility for data management is not clear.
1 | 1.5 | Backup/restoration procedures and disposal arrangements are in place.
2 | 2.1 | The awareness of the need for effective data management exists throughout the organization.
2 | 2.2 | Data ownership at a high level begins to occur.
2 | 2.3 | Security requirements for data management are documented by key individuals.
2 | 2.4 | Some monitoring within IT is performed on data management key activities (e.g., backup, restoration, disposal).
2 | 2.5 | Responsibilities for data management are informally assigned for key IT staff members.

In Table 5, the statements for each level do not address the process attributes in their entirety. E.g., level 2 assesses the maturity of attributes AC (question 2.1), PPP (2.2 and 2.3), RA (2.3 and 2.5), and GSM (2.4). Attributes TA and SE are however not assessed.

Furthermore, the control objectives are not considered in their entirety. E.g., level 2 assesses the control objectives of data management requirements (questions 2.1, 2.2, 2.3, and 2.5) and of backup, restoration, and disposal (2.4). Storage and the media library are not considered at all. This incomplete consideration of process attributes and control objectives might result in unclear indications of which attributes or control objectives need improvement.
The assessment method in [13] adopts the COBIT 4.1 maturity model to define its maturity metrics. These metrics are used to assess the maturity of activity execution, assigned responsibility, documents, and KPI monitoring. The maturity levels of activity execution inherently include the COBIT 4.1 process maturity attributes. The assigned responsibility assesses the compliance of a process with its corresponding RACI chart. The maturity of documents and of KPI monitoring is assessed based on the percentage of documents available to support the process and the percentage of the process's KPIs monitored, respectively. The metrics for these assessments are depicted in Table 6.

TABLE 6. METRICS FOR IT PROCESS MATURITY ASSESSMENT [13]

Level | Assigned responsibility | Documents in place | KPIs monitored
0 | No RACI relationships assigned. | 0% | 0%
1 | 25% of RACI relationships assigned. | 20% | 20%
2 | More than 26% of RACI relationships assigned; 25% or less of the identified relationships are in line with COBIT. | 40% | 40%
3 | More than 26% of RACI relationships assigned; 26-74% of the identified relationships are in line with COBIT. | 60% | 60%
4 | More than 51% of RACI relationships assigned; 51-99% of the identified relationships are in line with COBIT. | 80% | 80%
5 | 100% of RACI relationships assigned; 100% of the identified relationships are in line with COBIT. | 100% | 100%
This method defines maturity at three levels, i.e. the activity, process, and enterprise levels. At the activity level, the assessment considers the activity execution and the assigned responsibility. At the process level, the assessment calculates the average of the maturity scores of all underlying activities, plus the document and KPI maturities. At the enterprise level, the assessment calculates the average of the maturity scores of all processes. The method assumes that all metrics have the same weight.

This assessment method applies a more comprehensive approach by considering process activities, RACI charts, supporting documents, and KPIs. Nevertheless, it does not include maturity attributes and control objectives explicitly. Similar to [11], this assessment might result in unclear indications of which attributes or control objectives to improve.
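The aggregation just described admits a direct sketch. The function names below are ours, and we read "plus the document and KPI maturities" as two additional equally weighted scores in the process-level average; [13] gives the authoritative definition.

# A sketch of the equal-weight aggregation in [13] as described above.
# Activity scores are assumed to already combine activity execution and
# assigned responsibility.
def process_maturity(activity_scores, document_maturity, kpi_maturity):
    # Process level: average over all activity scores plus the document
    # and KPI maturities, each entering with the same weight.
    scores = list(activity_scores) + [document_maturity, kpi_maturity]
    return sum(scores) / len(scores)

def enterprise_maturity(process_scores):
    # Enterprise level: average over all process maturities.
    return sum(process_scores) / len(process_scores)

# Hypothetical process with three activities:
print(process_maturity([3.0, 2.0, 2.0], document_maturity=2.0, kpi_maturity=1.0))  # 2.0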
The metric for KPI monitoring in Table 6, however, is not defined properly. According to the COBIT 4.1 maturity model, KPI scores can only be known when a process includes monitoring and measurement (i.e. maturity level 4). Thus it should not be assessed as an increasing percentage, from no KPI monitoring (0% at level 0) to full KPI monitoring (100% at level 5).

An attempt to align the COBIT 4.1 maturity model to the ISO/IEC 15504:2003 measurement scale is proposed in [12]. ISO/IEC 15504 is a standard for assessing and improving software development processes [16]. It provides minimum requirements to ensure assessment consistency and repeatability. The maturity levels of each process attribute in COBIT 4.1 are mapped to the rating scale in ISO/IEC 15504, i.e. level 1 to "none" (0-15%), level 2 to "partially" (16-50%), level 3 to "largely" (51-85%), and levels 4 and 5 to "fully" (86-100%).


In this way, COBIT 4.1 process maturity could be assessed using process assessment methods that comply with the requirements specified in ISO/IEC 15504. This attempt, however, does not consider the COBIT 4.1 control objectives of enterprise IT processes.
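The alignment proposed in [12] reduces to a small lookup. The sketch below merely records the mapping given above; the table name is ours.

# The level-to-rating alignment proposed in [12], written as a lookup table
# from COBIT 4.1 maturity level to (ISO/IEC 15504 rating, percentage band).
ISO15504_RATING = {
    1: ("none", "0-15%"),
    2: ("partially", "16-50%"),
    3: ("largely", "51-85%"),
    4: ("fully", "86-100%"),
    5: ("fully", "86-100%"),
}

print(ISO15504_RATING[3])  # ('largely', '51-85%')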
IV. ASSESSMENT BASED ON PROCESS ATTRIBUTES AND CONTROL OBJECTIVES

In COBIT 4.1, IT process maturity is characterized by maturity attributes. Therefore, these attributes should be used in assessment, gap analysis, and improvement planning [1]. The use of maturity attributes gives us clear indications of which aspects of a process should be improved.

A process consists of a number of activities and is controlled by control objectives. We consider control objectives as requirements that a process must satisfy. COBIT 4.1 suggests that an IT process audit is necessary to check whether the control objectives are met [1]. We consider activities as the means by which the control objectives are met. An activity execution does not attest that its control objectives are met. On the contrary, when the control objectives are met, we can presume that the activities must have been executed. Therefore we propose an assessment method that is based on control objectives rather than activities. In the following, we describe the assessment method and illustrate its use with process DS11 as a case study.
In the case study, we assess process DS11 as defined in COBIT Quickstart, which is a simplified version of COBIT 4.1 for smaller organizations [17]. Process DS11 of COBIT Quickstart has only three control objectives, i.e. DS11.4, DS11.5, and DS11.6. The study has been conducted in a public education institution in East Java, Indonesia.

Our assessment uses a questionnaire whose statements are developed for each combination of maturity attribute and control objective. The possible answers directly reflect the maturity levels of the control objective. Table 7 shows an example questionnaire statement and its possible answers for attribute AC and control objective DS11.6. The questionnaire is distributed to selected respondents, i.e. key functions in the enterprise as indicated in the RACI chart of the process. The answers are then calculated as in Table 8.

TABLE 7. QUESTION FOR AWARENESS AND COMMUNICATION OF DS11.6

DS11.6 How aware is management of data security?
0 | The organization is not aware of data security.
1 | The organization recognizes the need for data security.
2 | There is awareness of data security and of the need to act accordingly. There is a forum to communicate this issue.
3 | There is an understanding of data security. There is formal communication from management to act effectively w.r.t. data security.
4 | The need for and action w.r.t. data security management have been understood by the organization. Periodically, an internal forum is conducted to discuss relevant issues.
5 | Data security management has been applied by the organization. There are external forums to find solutions to data security issues/problems.

TABLE 8. CALCULATION OF PROCESS MATURITY LEVEL

Control objectives | AC | PPP | TA | SE | RA | GSM
DS11.4 | ... | ... | ... | ... | ... | ...
DS11.5 | ... | ... | ... | ... | ... | ...
DS11.6 | ... | ... | ... | ... | ... | ...
Attribute maturity | 1.33 | 1.67 | 2.00 | 2.00 | 1.00 | 1.00
Process maturity | 1.50

The answers for each combination of maturity attribute and control objective (e.g. AC and DS11.6) from all respondents are averaged and put into the corresponding cell of the table. Attribute maturity is then calculated as the average of the control objective maturities for that particular attribute, i.e. column-wise in Table 8. Finally, process maturity is calculated as the average of all attribute maturities, which here is 1.50. By rounding this number to the nearest integer, we can say that process DS11 of the case study has maturity level 2 (repeatable but intuitive).

This assessment allows us to identify the aspects of a process that need improvement. From Table 8, we can see that attributes RA and GSM score the lowest and thus need to be prioritized. The improvement could refer to the description in the maturity attribute table for the expected (to-be) maturity level. Supposing that the expected level is 3 (defined process), the improvement could be "the organization shall assign responsibility for data management".
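The whole calculation is a pair of simple averages. The sketch below reproduces Table 8; note that the individual cell scores are hypothetical (the paper reports only the resulting attribute maturities) and were chosen so that the column averages match.

# A sketch of the proposed calculation. The per-cell scores below are
# hypothetical; only the resulting attribute maturities come from Table 8.
ATTRIBUTES = ["AC", "PPP", "TA", "SE", "RA", "GSM"]
CONTROL_OBJECTIVES = ["DS11.4", "DS11.5", "DS11.6"]

# scores[co][attr]: average of all respondents' answers for that cell.
scores = {
    "DS11.4": {"AC": 1, "PPP": 1, "TA": 2, "SE": 2, "RA": 1, "GSM": 1},
    "DS11.5": {"AC": 1, "PPP": 2, "TA": 2, "SE": 2, "RA": 1, "GSM": 1},
    "DS11.6": {"AC": 2, "PPP": 2, "TA": 2, "SE": 2, "RA": 1, "GSM": 1},
}

# Attribute maturity: column-wise average over the control objectives.
attribute_maturity = {a: sum(scores[co][a] for co in CONTROL_OBJECTIVES)
                         / len(CONTROL_OBJECTIVES) for a in ATTRIBUTES}

# Process maturity: average of all attribute maturities, then rounded.
process_maturity = sum(attribute_maturity.values()) / len(ATTRIBUTES)

print({a: round(m, 2) for a, m in attribute_maturity.items()})
# {'AC': 1.33, 'PPP': 1.67, 'TA': 2.0, 'SE': 2.0, 'RA': 1.0, 'GSM': 1.0}
print(process_maturity, "-> level", round(process_maturity))  # 1.5 -> level 2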
V. CONCLUSIONS

This paper has presented a method for assessing enterprise IT processes against the COBIT 4.1 maturity model based on process attributes and control objectives. The use of process attributes allows us to identify specifically which aspects need to be improved. We choose to use control objectives, instead of activities, since we presume that the activities must have been executed when the control objectives are met.

Our future work will apply this assessment method to more case studies and draw practical lessons from those experiences. Moreover, we will investigate a method to validate the recommendations indicated by the assessment method.

REFERENCES

[1] IT Governance Institute, COBIT 4.1, 2007.
[2] ISACA, COBIT 5, https://cobitonline.isaca.org/about, 2014.
[3] Software Engineering Institute, CMMI® for Development, Version 1.3, Technical Report CMU/SEI-2010-TR-033, 2010.
[4] M. Röglinger, J. Pöppelbuß, and J. Becker, "Maturity models in business process management," Business Process Management Journal, 18(2), pp. 328-346, 2012.
[5] T. De Bruin and M. Rosemann, "Towards a business process management maturity model," Proc. of European Conf. on Information Systems, 2005.
[6] M. Siponen, "Towards maturity of information security maturity criteria: six lessons learned from software maturity criteria," Information Management & Computer Security, 10(5), pp. 210-224, 2002.


[7] V.A. Canal, ISM3 1.0: Information Security Management Maturity Model, Institute for Security and Open Methodologies, 2004.
[8] K.V. Andersen and H.Z. Henriksen, "E-government maturity models: extension of the Layne and Lee model," Government Information Quarterly, 23(2), pp. 236-248, 2006.
[9] D.Y. Kim and G. Grant, "E-government maturity model using the capability maturity model integration," J. of Systems and Information Technology, 12(3), pp. 230-244, 2010.
[10] G. Valdés, M. Solar, H. Astudillo, M. Iribarren, G. Concha, and M. Visconti, "Conception, development and implementation of an e-government maturity model in public agencies," Government Information Quarterly, 28(2), pp. 176-187, 2011.
[11] A. Pederiva, "The COBIT maturity model in a vendor evaluation case," Information Systems Control Journal, 3, 2003.
[12] A. Walker, T. McBride, G. Basson, and R. Oakley, "ISO/IEC 15504 measurement applied to COBIT process maturity," Benchmarking: An International Journal, 19(2), pp. 159-176, 2012.
[13] M. Simonsson, P. Johnson, and H. Wijkström, "Model-based IT governance maturity assessments with COBIT," Proc. of European Conf. on Information Systems, 2007.
[14] IT Governance Institute, Board Briefing on IT Governance, 2nd ed., 2003.
[15] E. Guldentops, W. van Grembergen, and S. de Haes, "Control and governance maturity survey: establishing a reference benchmark and a self-assessment tool," Information Systems Control Journal, 6, 2002.
[16] ISO, ISO/IEC 15504-2:2003, Information technology – Process assessment – Part 2: Performing an assessment, 2003.
[17] IT Governance Institute, COBIT Quickstart, 2nd ed., 2007.
