
Using Enterprise Architecture for COBIT 5

Process Assessment and Process Improvement

Gonçalo Rodrigues Cadete

Thesis to obtain the Master of Science Degree in

Information Systems and Computer Engineering

Supervisors: Prof. Miguel Leitão Bignolas Mira da Silva / Prof. Steven De Haes

Examination Committee
Chairperson: Prof. Nuno João Neves Mamede
Supervisor: Prof. Miguel Leitão Bignolas Mira da Silva
Member of the Committee: Prof. José Manuel Nunes Salvador Tribolet

November 2015
Acknowledgments
First and foremost, I would like to express my sincere appreciation to Prof. Miguel Mira da Silva, for
his steadfast guidance and valuable advice.

Likewise, I would also like to thank Prof. Steven De Haes for his challenging questions, which helped
keep the focus along the way. And, surely, one should not forget his contributions to the creation of
the COBIT 5 framework, on whose shoulders this work builds.

This work would not have come to light without Prof. José Tribolet’s inspirational calls to promote a
systemic approach to the governance and management of organizations. We hope that this work
provides a small contribution to that objective.

Sincere thanks are due to Prof. Artur Caetano, Prof. Nelson Gama, Bruno Horta Soares, and Marco
Vicente, who offered their time and advice to help enrich this work, making its path an easier and
more interesting endeavour.

Finally, I would like to express my gratitude to the public sector organizations which promptly hosted
the field studies, both military (second iteration) and civil (third and fourth iterations). Their
contributions remain anonymous in this dissertation, for confidentiality purposes. They are to be
praised for being on the front lines of the struggle to promote a systemic approach to the governance
and management of organizations.

Resumo
O quadro de referência COBIT 5 promove a Arquitectura Empresarial como área processual relevante
para a criação e manutenção dos facilitadores da governação e da gestão dos Sistemas de
Informação. Estes facilitadores são, por sua vez, instrumentais para garantir o alinhamento entre as
necessidades das partes interessadas e as soluções sistémicas que endereçam essas mesmas
necessidades.

A eficácia e a eficiência das iniciativas de avaliação e melhoria dos processos de governação e
gestão poderão ser negativamente afectadas por uma capacitação deficiente da área de Arquitectura
Empresarial. De facto, a falta de auto-conhecimento organizacional obriga as partes interessadas a
empregarem uma quantidade desproporcionada de recursos na sincronização activa de informação e
de conhecimento, requerida para suportar as actividades típicas de recolha e de validação de
evidências – entre outras. Como consequência deste desperdício, ficarão assim disponíveis menos
recursos para as actividades de optimização dos facilitadores da governação e da gestão.

Este trabalho propõe, como forma de melhorar os resultados das iniciativas de avaliação e melhoria
de processos, a utilização de descrições arquitecturais que integram a lógica dos processos COBIT 5,
recorrendo para tal às extensões standard do ArchiMate - de forma a facilitar a adopção da solução.

A solução arquitectural aqui proposta foi desenvolvida ao longo de quatro iterações, usando o modelo
processual da metodologia Design Science Research Methodology (DSRM). Esta abordagem iterativa
permitiu que se obtivessem melhorias incrementais em cada um dos ciclos de desenho, no que
concerne à definição dos objectivos, ao desenho dos artefactos e às técnicas de avaliação utilizadas.

A solução proposta foi demonstrada usando três casos reais, no contexto de duas organizações do
sector público, usando cenários do tipo ex-ante e ex-post. Para as fases de avaliação deste trabalho,
foram utilizadas as demonstrações, formulários de avaliação, bem como entrevistas e sessões de
grupo, de forma a avaliar a eficácia, a consistência contextual e a qualidade estrutural da solução de
Arquitectura Empresarial proposta.

Palavras-Chave: COBIT, Enterprise Architecture, governance of enterprise IT, TOGAF, ArchiMate,
business and IT alignment, design science, design science research methodology.

Abstract
The COBIT 5 best practice framework promotes Enterprise Architecture (EA) as a key process for
helping to guide the creation and maintenance of governance and management enablers. These
governance and management enablers are seen as instrumental for providing assurance regarding
the alignment between the business stakeholders’ needs and the information system solutions.

Deficiencies in the required EA capabilities may hinder the effectiveness and efficiency of process
assessment and process improvement initiatives. Indeed, the lack of organizational self-awareness
forces the assessment stakeholders to engage a disproportionate amount of resources in wasteful
evidence collection, evidence validation, and other interactive synchronization activities, thus leaving
less resources for performing the actual optimization of the organizational enablers.

In order to improve the outcomes of COBIT 5 process assessment and process improvement
initiatives, we propose to integrate the COBIT 5 process rationale in the EA representations, using the
standard ArchiMate extensions in order to promote easy adoption.

We designed the EA solution using four iterations of the Design Science Research Methodology
(DSRM) process model. This iterative learning approach enabled incremental improvements on each
design cycle, regarding the definition of the solution’s objectives, the artifacts’ design, and the
evaluation techniques.

We demonstrate the proposal by applying it in three field studies, in the context of two public sector
organizations, using both ex-ante and ex-post scenarios. For the evaluation of this work we used the
demonstrations, evaluation forms, as well as interviews and group sessions, in order to evaluate the
goal efficacy, the environment consistency, and the structural quality of the proposed EA solution.

Keywords: COBIT, Enterprise Architecture, governance of enterprise IT, TOGAF, ArchiMate, business
and IT alignment, design science, design science research methodology.

Contents

Acknowledgments .................................................................................................................................. i

Resumo.................................................................................................................................................. iii

Abstract .................................................................................................................................................. v

List of Figures ........................................................................................................................................ x

List of Tables ........................................................................................................................................ xii

List of Acronyms ................................................................................................................................ xiii

Glossary .............................................................................................................................................. xiv

1 Introduction .................................................................................................................................... 1

1.1 Motivation for Developing EA Capabilities ............................................................................. 1

1.2 Research Problem ................................................................................................................. 2

1.3 Research Methodology.......................................................................................................... 3

2 Related Work .................................................................................................................................. 5

2.1 Research Scope .................................................................................................................... 5

2.2 Objectives and Requirements ............................................................................................... 6

2.3 Stakeholders .......................................................................................................................... 6

2.4 COBIT 5 ................................................................................................................................. 7


2.4.1 APO03 Manage Enterprise Architecture Process .................................................... 7
2.5 TOGAF .................................................................................................................................. 9
2.5.1 Mapping TOGAF ADM Components to COBIT 5 Practices ..................................... 9
2.5.2 Using TOGAF for Enabling COBIT 5 ...................................................................... 10
2.5.3 TOGAF Modelling and Content Framework ........................................................... 10
2.6 ArchiMate............................................................................................................................. 12
2.6.1 ArchiMate Extensions ............................................................................................. 12
2.7 DSRM Iterations .................................................................................................................. 13

2.8 Solution Objectives and Requirements ............................................................................... 13

3 Research Problem ....................................................................................................................... 15

3.1 Problem Context .................................................................................................................. 15

3.2 Problem Statement .............................................................................................................. 16

3.2.1 Value of the Solution ............................................................................................... 16
4 Theoretical Background .............................................................................................................. 19

4.1 Architecture Principles ......................................................................................................... 19

4.2 Architectural Method ............................................................................................................ 20


4.2.1 Using COBIT 5 with ITIL and Other GEIT-related Initiatives .................................. 20
4.3 Architectural Models and Artifacts ....................................................................................... 20

5 First DSRM Iteration .................................................................................................................... 21

5.1 Solution Proposal ................................................................................................................ 21


5.1.1 Solution Statement ................................................................................................. 21
5.1.2 Solution Objectives and Requirements .................................................................. 21
5.1.3 Proposed Artifacts .................................................................................................. 22
5.2 Demonstration ..................................................................................................................... 23

5.3 Evaluation ............................................................................................................................ 23


5.3.1 Lessons learned ..................................................................................................... 23
5.4 Communication .................................................................................................................... 24

5.5 Conclusion ........................................................................................................................... 24

6 Second DSRM Iteration ............................................................................................................... 25

6.1 Solution Proposal ................................................................................................................ 25


6.1.1 Constructs Mapping ................................................................................................ 25
6.1.2 Viewpoint Template ................................................................................................ 27
6.1.3 Guidelines for using the Viewpoint Template ......................................................... 27
6.2 Demonstration ..................................................................................................................... 28

6.3 Evaluation ............................................................................................................................ 29


6.3.1 Selecting the evaluators and validating the ratings ................................................ 30
6.3.2 Evaluation ratings ................................................................................................... 31
6.3.3 Evaluation rating results ......................................................................................... 33
6.3.4 Lessons learned ..................................................................................................... 34
6.4 Communication .................................................................................................................... 35

6.5 Conclusion ........................................................................................................................... 36

7 Third DSRM Iteration ................................................................................................................... 37

7.1 Solution Proposal ................................................................................................................ 37

7.2 Demonstration ..................................................................................................................... 38

7.3 Evaluation ............................................................................................................................ 38

7.4 Communication .................................................................................................................... 39

7.5 Conclusion ........................................................................................................................... 40

8 Fourth DSRM Iteration ................................................................................................................. 41

8.1 Solution Proposal ................................................................................................................ 41


8.1.1 ArchiMate Constructs ............................................................................................. 42
8.1.2 Integrating the Assessment and Improvement Perspectives ................................. 42
8.1.3 Modelling Capability and Capability Improvement ................................................. 43
8.2 Demonstration ..................................................................................................................... 47

8.3 Evaluation ............................................................................................................................ 47

8.4 Communication .................................................................................................................... 48

8.5 Conclusion ........................................................................................................................... 49

9 Conclusion ................................................................................................................................... 51

9.1 Research Communication ................................................................................................... 51

9.2 Contributions ....................................................................................................................... 53


9.2.1 Contributions to COBIT 5 and EA practice ............................................................. 54
9.2.2 Contributions to the COBIT 5 and EA knowledge base ......................................... 54
9.3 Future Work ......................................................................................................................... 55

Bibliography ......................................................................................................................................... 57

Appendixes ............................................................................................................................................ 1

Appendix A: First DSRM Iteration - Constructs Mapping................................................................. 1

Appendix B: First DSRM Iteration - Viewpoints ............................................................................. 12

Appendix C: Third DSRM Iteration – Viewpoints ........................................................................... 14

Appendix D: Fourth DSRM Iteration – Viewpoints ......................................................................... 31

List of Figures
Main Body Figures

Figure 1-1: The DSRM Process Model. [4] ............................................................................................. 3

Figure 2-1: GEIT opportunity space. ....................................................................................................... 5

Figure 6-1: Generic ArchiMate template, for viewpoints used in COBIT 5 Process Performance
Assessments. ........................................................................................................................................ 27

Figure 6-2: ArchiMate viewpoint, showing an instantiation of the viewpoint template for the APO02
process. ................................................................................................................................................. 29

Figure 6-3: Ratings regarding the agreement level with the research problem approach. ................... 34

Figure 6-4: Ratings regarding the agreement level with solution’s usefulness claims. ......................... 34

Figure 7-1: Ratings regarding the agreement level with the research problem approach. ................... 39

Figure 7-2: Ratings regarding the agreement level with solution’s usefulness claims. ......................... 39

Figure 8-1: The Implementation Life Cycle, taken from “COBIT 5 Implementation” [7] ........................ 42

Figure 8-2: Process improvement structure, instantiated for Level 0 to Level 1, and for process APO12
Manage Risk. ......................................................................................................................................... 44

Figure 8-3: Ratings regarding the agreement level with the research problem approach. ................... 48

Figure 8-4: Ratings regarding the agreement level with solution’s usefulness claims. ......................... 48

Appendix Figures

Appendix Figure 1: Goals Cascade viewpoint. ..................................................................................... 12

Appendix Figure 2: Enabling Process Performance viewpoint. ............................................................ 13

Appendix Figure 3: COBIT 5 Goals and Principles ............................................................................... 14

Appendix Figure 4: Principle 1 (Meeting Stakeholder Needs) .............................................................. 15

Appendix Figure 5: Enterprise Goals .................................................................................................... 16

Appendix Figure 6: IT-related Goals ..................................................................................................... 16

Appendix Figure 7: IT-related goals and base practices, for process APO12 Manage Risk ................ 17

Appendix Figure 8: Enabling Processes ............................................................................................... 18

Appendix Figure 9: Enabling Processes Stakeholders, Goals and Requirements View, for process
APO12 Manage Risk ............................................................................................................................. 19

Appendix Figure 10: Process Capability Level 1 Performed process, for process APO12 Manage Risk
............................................................................................................................................................... 20

Appendix Figure 11: Activities View, for process APO12 Manage Risk ................................................ 21

Appendix Figure 12: Work Products (Inputs and Outputs) View for APO12 Manage Risk ................... 22

Appendix Figure 13: Work Products (Outputs) View for APO12 Manage Risk ..................................... 23

Appendix Figure 14: Generic Work Products (GWPs), for process APO12 Manage Risk.................... 24

Appendix Figure 15: Process Capability Level 2 Managed Process, for process APO12 Manage Risk
............................................................................................................................................................... 25

Appendix Figure 16: Process Capability Level 3 Established process, for process APO12 Manage Risk
............................................................................................................................................................... 26

Appendix Figure 17: Process Capability Level 4 Predictable process, for process APO12 Manage Risk
............................................................................................................................................................... 27

Appendix Figure 18: Process Capability Level 5 Optimizing process, for process APO12 Manage Risk
............................................................................................................................................................... 28

Appendix Figure 19: Generic Work Products for Process Capability Level 2: Managed Process, for
process: APO12 Manage Risk .............................................................................................................. 29

Appendix Figure 20: Generic Work Products for Process Capability Level 3: Established process, for
process: APO12 Manage Risk .............................................................................................................. 30

Appendix Figure 21: Process Capability Improvement and GEIT View, for APO12 Manage Risk ....... 31

Appendix Figure 22: Process Capability Improvement View, for process APO12 Manage Risk .......... 32

Appendix Figure 23: Process Capability Improvement View, Level 0 to Level 1, for process APO12
Manage Risk .......................................................................................................................................... 33

Appendix Figure 24: Process Capability Improvement View, Level 1 to Level 2, for process APO12
Manage Risk .......................................................................................................................................... 34

Appendix Figure 25: Process Capability Improvement View, Level 2 to Level 3, for process APO12
Manage Risk .......................................................................................................................................... 35

Appendix Figure 26: Process Capability Improvement View, Level 3 to Level 4, for process APO12
Manage Risk .......................................................................................................................................... 36

Appendix Figure 27: Process Capability Improvement View, Level 4 to Level 5, for process APO12
Manage Risk .......................................................................................................................................... 37

List of Tables
Table 2-1 - Mapping TOGAF ADM components to COBIT 5 practices ................................................... 9

Table 2-2: Solution objectives, before the theoretical background analysis. ......................................... 13

Table 5-1: Solution objectives and requirements, for the first DSRM iteration. ..................................... 22

Table 6-1: COBIT 5 to ArchiMate Ontological Mapping ......................................................................... 26

Table 6-2: Evaluating the Solution’s Usefulness. .................................................................................. 32

Table 6-3: CBI2015 conference review analysis. .................................................................................. 35

Table 8-1: Conceptual mapping between the COBIT 5 Implementation Life Cycle phases and the EA
artifacts. ................................................................................................................................................. 45

List of Acronyms
Acronym | Term | Sources
ACF | Architecture Content Framework (TOGAF context) | [1]
ADM | Architecture Development Method (TOGAF context) | [1]
COBIT | Control Objectives for Information and related Technology | [2]
DS | Design Science | [3]
DSRM | Design Science Research Methodology | [3] [4] [5]
EA | Enterprise Architecture | [6]
GEIT | Governance of Enterprise IT | [7]
IS | Information system(s) | [8]
IT | Information technology(ies) | [8]
ITIL | IT Infrastructure Library | [9]
TIPA | Tudor’s ITSM Process Assessment | [10]
TOGAF | The Open Group Architecture Framework | [1]

Glossary
Architecture: A formal description of a system, or a detailed plan of the system at component level, to guide its implementation (source: ISO/IEC/IEEE 42010:2011 [11]). The structure of components, their inter-relationships, and the principles and guidelines governing their design and evolution over time. Sources: [1] [11]

Architecture framework: A conceptual structure used to develop, implement, and sustain an architecture. Source: [1]

Artifact (in the context of design science): Artifacts may include constructs, models, methods, and instantiations. They may also include social innovations or new properties of technical, social, or informational resources. Note: in short, this definition includes any designed object with an embedded solution to an understood research problem [4]. Source: [4]

Artifact (in the context of TOGAF): An architectural work product that describes an aspect of the architecture. Artifacts are generally classified as catalogs (lists of things), matrices (showing relationships between things), and diagrams (pictures of things). Note: artifacts will form the content of the Architecture Repository [1]. Source: [1]

COBIT 5: Formerly known as Control Objectives for Information and related Technology (COBIT); now used only as the acronym in its fifth iteration. A complete, internationally accepted framework for governing and managing enterprise information and technology (IT) that supports enterprise executives and management in their definition and achievement of business goals and related IT goals. Scope note: earlier versions of COBIT focused on control objectives related to IT processes, management and control of IT processes, and IT governance aspects. Source: [2]

Capability: An ability that an organization, person, or system possesses. Capabilities are typically expressed in general and high-level terms and typically require a combination of organization, people, processes, and technology to achieve. For example: marketing, customer contact, or outbound telemarketing. Source: [1]

Concern(s): The key interests that are crucially important to the stakeholders in a system, and determine the acceptability of the system. Concerns may pertain to any aspect of the system’s functioning, development, or operation, including considerations such as performance, reliability, security, distribution, and evolvability. Source: [1]

Constructs: The conceptual vocabulary of a domain [12]. Note: constructs provide the vocabulary and symbols used to define problems and solutions [5]. Sources: [12] [5]

Design Science: Design science creates and evaluates IT artifacts intended to solve identified organizational problems. Note: such artifacts may include constructs, models, methods, and instantiations. They may also include social innovations or new properties of technical, social, or informational resources; in short, this definition includes any designed object with an embedded solution to an understood research problem. Source: [4]

Design Science Research Methodology (DSRM): A methodological guideline for effective Design Science research. Note: in this work we use the process model developed by Peffers et al. [4]; other process models may be found in [12]. Sources: [4] [12]

Enterprise: The highest level (typically) of description of an organization, which typically covers all missions and functions. An enterprise will often span multiple organizations. Source: [1]

Enterprise Architecture (in the context of the organization of the enterprise, or of a formal description of such organization): The architecture of an enterprise; see the definitions of the terms “architecture” [1] [11] and “enterprise” [1]. Sources: [1] [11]

Enterprise Architecture (in the context of a discipline or process area): Discipline or process area that aims to establish and maintain a common architecture consisting of business process, information, data, application and technology architecture layers, for effectively and efficiently realizing enterprise and IT strategies by creating key models and practices that describe the baseline and target architectures. It achieves these objectives by representing the different building blocks that make up the enterprise and their inter-relationships, as well as the principles guiding their design and evolution over time, enabling a standard, responsive and efficient delivery of operational and strategic objectives. Source: [13]

Framework: A structure for content or process that can be used as a tool to structure thinking, ensuring consistency and completeness. Source: [1]

Governance: Governance ensures that stakeholder needs, conditions and options are evaluated to determine balanced, agreed-on enterprise objectives to be achieved; setting direction through prioritization and decision making; and monitoring performance and compliance against agreed-on direction and objectives. Source: [2]

Governance of enterprise IT: A governance view that ensures that information and related technology support and enable the enterprise strategy and the achievement of enterprise objectives. It also includes the functional governance of IT, i.e., ensuring that IT capabilities are provided efficiently and effectively. Source: [2]

Information: An asset that, like other important business assets, is essential to an enterprise’s business. It can exist in many forms: printed or written on paper, stored electronically, transmitted by post or electronically, shown on films, or spoken in conversation. Source: [2]

Information system: Interrelated components working together to collect, process, store, and disseminate information to support decision making, coordination, control, analysis, and visualization in an organization. Source: [8]

Instantiations: The operationalization of constructs, models, and methods. Source: [12]

Information technology: All the hardware and software technologies a firm needs to achieve its business objectives. Source: [8]

Management: Management plans, builds, runs and monitors activities in alignment with the direction set by the governance body to achieve the enterprise objectives. Source: [2]

Methods: A set of steps used to perform a task – how-to knowledge. Source: [12]

Models: A set of propositions or statements expressing relationships between constructs. Source: [12]

Risk: The combination of the probability of an event and its consequence (ISO/IEC 73). Source: [2]

Stakeholder (in the context of the enterprise): Anyone who has a responsibility for, an expectation from, or some other interest in the enterprise, e.g., shareholders, users, government, suppliers, customers and the public. Source: [2]

Stakeholder (in the context of enterprise architecture): An individual, team, or organization (or classes thereof) with interests in, or concerns relative to, the outcome of the architecture. Different stakeholders with different roles will have different concerns. Source: [1]

View: The representation of a related set of concerns. A view is what is seen from a viewpoint. An architecture view may be represented by a model to demonstrate to stakeholders their areas of interest in the architecture. A view does not have to be visual or graphical in nature. Source: [1]

Viewpoint: A definition of the perspective from which a view is taken. It is a specification of the conventions for constructing and using a view (often by means of an appropriate schema or template). A view is what you see; a viewpoint is where you are looking from: the vantage point or perspective that determines what you see. Source: [1]

1 Introduction
Information systems (IS) and related information technologies (IT) are implemented and managed for
the purpose of improving the effectiveness and efficiency of organizations [5]. The automation and
intelligence capabilities provided by IS are becoming increasingly relevant for achieving strategic goals
and supporting the operational excellence of enterprises [8].

Indeed, IT has become pervasive in enterprises, enabling business objectives in many areas, namely:
supporting efficiency gains through automation of key business processes, facilitating remote
collaboration, providing business innovation and competitive advantage through the production and
delivery of digital goods, enabling greater customer intimacy by providing knowledge about the
customer’s habits, needs, and preferences, and supporting business decisions by providing timely and
high quality information and knowledge [7] [8].

Therefore, enterprises increasingly recognize information and related technologies as critical business
assets that need to be governed and managed effectively [7] [2] [14]. Other key drivers for
implementing and maintaining excellence in governance of enterprise IT (GEIT) are the need to
comply with regulatory requirements [7] [2], as well as to address information-related and IT-related
risks [15] [16] [17] [18] [19].

1.1 Motivation for Developing EA Capabilities


The COBIT 5 framework [2] can be instrumental in providing a good practice approach for
implementing GEIT initiatives, in order to maximize the value from IT investments, manage IT-related
risks and achieve compliance [7].

The framework itself provides specific guidance for implementing governance using COBIT 5 [7]. This
means that each successful COBIT 5 initiative may facilitate its own future path, by enhancing the
capabilities required for supporting governance and management.

In particular, COBIT 5 recommends a set of processes that are instrumental in guiding the creation
and maintenance of GEIT enablers. Among these processes, the APO03 Manage Enterprise
Architecture process is proposed as one of the key GEIT initiative enablers [7]. This recommendation
should come as no surprise; indeed, Enterprise Architecture (EA) is instrumental for providing holistic
organizational self-awareness [20], including relevant entities and their relationships, cutting across
business domains, as well as bridging business and technology divides [21] [6] [22] [11] [1] [23].
Therefore, EA provides shared inter-domain knowledge and shared viewpoints, which enable different
stakeholders to conduct effective conversations for engaging in compliance assessments [24] [25], risk
management [15] [16], and change initiatives [26] [27] [28] [14] [29] [30] [13].

As related guidance for the APO03 process, the COBIT 5 framework recommends the TOGAF
architecture framework, an Open Group standard [1]. The ArchiMate architecture modelling language
[31], which is another Open Group standard, provides a good match for TOGAF [32], enabling the
analysis and visualization of inter-related architectures by providing views and viewpoints for
addressing stakeholders’ concerns.

Therefore, the COBIT 5 framework and the TOGAF standard are synergistic sources of requirements
regarding the enhancement of EA capabilities for enabling governance initiatives.

1.2 Research Problem


During the years 2014 and 2015, the author was engaged in GEIT initiatives using COBIT 5, which
required assessing the performance of governance and management processes and providing
recommendations on improving organizational capabilities. The experience thus gained through
COBIT 5 practice demonstrated, we argue, how valuable EA capabilities can be for GEIT initiatives.
Indeed, the lack of adequate organizational self-awareness and interactive synchronization tools forces
the assessment stakeholders to engage a disproportionate amount of resources in the auditing and
consulting activities, especially during the evidence collection and evidence validation iterations.
Besides the efficiency penalty, resource-constrained initiatives will also suffer from effectiveness
hindrances, because fewer resources will be made available for performing the actual assessment,
documenting exceptions and gaps, communicating the assessment conclusions, and optimizing the
improvement recommendations and roadmaps [18] [17].

However, the current TOGAF standard does not provide specific architectural building blocks for
COBIT 5. Note that TOGAF version 9.1 (currently the latest TOGAF release) was published in 2011,
i.e., before the initial 2012-2013 COBIT 5 framework specifications were published.

Generally, the research problem may thus be defined as a search for a solution with the purpose of
providing adequate EA capabilities, in order to assist process assessment and process improvement
initiatives using COBIT 5.

In this work, we propose an EA approach that integrates COBIT with EA principles, methods, and
models, using the ArchiMate standard as the architecture modeling language to describe the EA. In
order to maximize the solution’s effectiveness, we propose to embed the rationale of the COBIT best
practices (formerly control objectives [33]) directly in the EA models.

Note that this work is not focused on finding optimal solutions for detailed analysis (such as
dependency analysis) and specific engineering optimization efforts. Instead, it seeks to provide
standards-based EA instruments which may serve as a basis for improving self-awareness and
interactive synchronization efforts, for the benefit of GEIT stakeholders engaged in the typical auditing
and consulting activities. Nevertheless, the high-level views and viewpoints which we propose may
serve a basis for providing architectural representations on top of which such detailed work may be
designed and performed.

1.3 Research Methodology
In this thesis work, we propose to use the Design Science Research Methodology (DSRM) [5] [4] [3] in
order to guide the construction and evaluation of the architectural IS artifacts that will be developed.
DSRM incorporates principles, practices and a process model which are adequate [3] to conduct
design science (DS) research in applied research disciplines, such as engineering research and –
more recently [5] [4] – information systems research, whose cultures value incrementally effective
solutions. The design science paradigm seeks to create and evaluate “what is effective” [5] in the
problem space.

Regarding the architectural IS artifacts that we propose in this work, we will use a definition of artifacts
that includes instantiations (implemented and prototype systems), constructs, models, and methods
[5] that enable Enterprise Architecture capabilities [1]. We will design and evaluate the artifacts not
simply by their own intrinsic features, but by their effectiveness in a specific context and in order to
achieve a defined goal: the capacitation of the COBIT 5 process APO03 [13] [30], in the context of
governance of enterprise IT initiatives [7] [15] [16] [24] [29] [25], for the typical auditing and consulting
activities for process assessment and process improvement.

In this work, we define constructs as artifacts that “provide the vocabulary and symbols used to define
problems and solutions” [5], enabling the construction of models or representations of the problem
domain.

The DSRM process model includes six activities (see Figure 1-1): problem identification and
motivation, definition of the objectives for a solution, design and development, demonstration,
evaluation, and communication.

Figure 1-1: The DSRM Process Model. [4]

Note that the DSRM process model includes process iteration paths that allow for cycling between
generation-related activities (i.e. define objectives and design & development) and testing-related
activities (i.e. demonstration, evaluation, and communication). Indeed, design science is an iterative
search process, where heuristic-based search iterations seek to produce feasible and effective
solutions [1].

In order to obtain frequent and valuable feedback for the design process, we executed four iterations
of the generation and testing cycle. With this approach we were able to collect frequent feedback,
achieve better risk control for the design process and, ultimately, obtain more valuable outcomes from
this work.
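To make the generate/test cycling concrete, the sketch below enumerates the six DSRM activities and the four iterations executed in this work. It is a minimal illustration of the process structure only, under the assumption that lessons learned in one cycle feed the next; it is not tooling that was used in the research.

```python
# Minimal sketch of the DSRM process structure used in this work:
# six activities (see Figure 1-1) executed over four generate/test iterations.
DSRM_ACTIVITIES = [
    "Identify Problem & Motivate",
    "Define Objectives of a Solution",
    "Design & Development",
    "Demonstration",
    "Evaluation",
    "Communication",
]

def run_dsrm(iterations: int = 4) -> None:
    lessons_learned: list[str] = []
    for cycle in range(1, iterations + 1):
        for activity in DSRM_ACTIVITIES:
            # Feedback gathered in one cycle feeds the objectives and design
            # of the next cycle (the process iteration paths in Figure 1-1).
            print(f"Iteration {cycle}: {activity} "
                  f"(carrying {len(lessons_learned)} lessons learned)")
        lessons_learned.append(f"lessons from iteration {cycle}")

if __name__ == "__main__":
    run_dsrm()
```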

For each generate/test cycle iteration, we used the COBIT and TOGAF guidelines and best practices
for identifying, (re)defining, and managing the solution objectives and requirements, as well as for
further developing the desired EA capabilities. Note that according to the DSRM, “a methodology to
support research within a specific stream in IS might incorporate elements specific to the context of
that research” and that “with respect to specific activities in the research process, future researchers
may enhance the DSRM, for example, by developing subsidiary processes” [25].

2 Related Work
We begin our analysis of related work in a systematic way, by first trying to establish the search space
boundaries.

2.1 Research Scope


We will kick-start our search endeavour by recognizing and asserting that:

Statement 2-1: EA is a key enabler for GEIT using COBIT 5.

The COBIT 5 process APO03 Manage enterprise architecture is key for helping to guide the
creation and maintenance of governance and management enablers. [3]

The previous assertion allows for the definition of the opportunities search space (see Figure 2-1): we
will search for design opportunities regarding the improvement of EA capabilities, pondering the
enabling value for governance of enterprise IT.

In the remainder of this section we will aim to identify the success factors, as well as the solution
requirements, that may facilitate the achievement of an adequate capability level for the APO03
COBIT 5 process.

Figure 2-1: GEIT opportunity space.

Along the search process, we will identify and define relevant EA capability needs and gaps, regarding
governance of enterprise IT. Then we will seek to bridge these gaps with the help of an effective
solution.

2.2 Objectives and Requirements
Our related work search should provide relevant references and creative ideas for each activity of the
DSRM process model. In practice, we want to elicit objectives, requirements, and generating/testing
best practices in order to conduct and enrich the systematic design and development process.

For the initial stages of the first iteration of the DSRM process, namely the “Identify Problem &
Motivate” activity and the first execution of “Define Objectives of a Solution” activity (see Figure 1-1),
we used the following approach:

- Search for relevant stakeholders’ needs and requirements, using globally recognized state-of-the-art
standards and best practice frameworks. This search strategy aims to establish a solid foundation for
future acceptance and adoption of the solution. We followed these steps:
  o Use the COBIT 5 framework as the starting point for best practice.
  o Prioritize further search efforts in COBIT-related guidance, namely TOGAF.
  o Define the main design variables:
    - Define the design target: who are the relevant stakeholders?
    - Define the objectives and requirements: fundamentally, what do the stakeholders need, want, and require?
- Search for relevant state-of-the-art scientific and professional literature, which may provide (partial)
solutions for similar/related problems.
  o Assess the scope and applicability.
  o Define the objectives and requirements: adopt, tailor, or extend some of the proposals.
- Finally, identify and systematize the design opportunities. This search outcome provides the
foundational basis for the following DSRM activities:
  o Activity “Identify Problem & Motivate”: we identify the gaps and motivations.
  o Activity “Define Objectives of a Solution”: we provide the design opportunity rationale for the first generation/test iteration.

2.3 Stakeholders
We can define several classes of stakeholders, depending on the relevant scope and concerns:

- Enterprise stakeholders;
- Enterprise Architecture (EA) stakeholders;
- Governance of enterprise IT (GEIT) stakeholders;
- APO03 Manage enterprise architecture process stakeholders.

In this work we are directly concerned with enabling GEIT through capacitation of the APO03 process,
in order to assist process assessment and process improvement activities. Therefore we will focus
primarily on implementation of GEIT [7] and the enabling process APO03 [13], thus on the guidance
provided by related COBIT 5 best practices.

We may now state our stakeholder focus requirement:

Requirement 2-1: Stakeholder focus.

Focus on GEIT and APO03 related stakeholders.

2.4 COBIT 5
In the introductory section we presented the COBIT 5 rationale for promoting EA, noting that the
APO03 Manage Enterprise Architecture process is proposed as a key process for helping to guide the
creation and maintenance of governance and management enablers [3]. The purpose of the APO03
process is to “represent the different building blocks that make up the enterprise and their
inter-relationships as well as the principles guiding their design and evolution over time, enabling a
standard, responsive and efficient delivery of operational and strategic objectives” [30]. Therefore, the
COBIT 5 framework provides the business case for improving EA capabilities, for the purpose of
improving GEIT and thus enabling value creation.

2.4.1 APO03 Manage Enterprise Architecture Process


A general description of the EA stakeholders’ needs can be found in the APO03 process description
and process purpose statement [13]. From these statements we can identify and define the following
capability needs, as well as the corresponding requirements:

Capability 2-1: Models and practices for architectural descriptions.

Enable the creation of key models and practices that describe the baseline and target
architectures. [17]

Capability 2-2: Establish a common architecture.

Establish a common architecture consisting of business process, information, data, application and
technology architecture layers, for effectively and efficiently realizing enterprise and IT strategies.
[17]

These capabilities can be related to the following requirements:

Requirement 2-2: Shared vocabulary.

Define requirements for taxonomy. [17]

Requirement 2-3: Methods and standards.

Define requirements for standards, guidelines, and procedures. [17]

Requirement 2-4: Templates and tools.

Define requirements for templates and tools. [17]

Requirement 2-5: Architectural representations for all building blocks.

Represent the different building blocks that make up the enterprise and their inter-relationships, as
well as the principles guiding their design and evolution over time, including those related to the
COBIT 5 principles and practices.

Note that we have included, in the above definition, the COBIT 5-related motivational layer as a
required building block, with the goal of “enabling a standard, responsive and efficient delivery of
operational and strategic objectives” [13].

Capability 2-3: EA development method and architecture services.

The TOGAF standard is recommended by COBIT 5, as related guidance for EA. At the core of
TOGAF is the Architecture Development Method (ADM), which maps to several COBIT 5
practices. Some TOGAF components map to the COBIT 5 practice of providing enterprise
architecture services. [17]

2.5 TOGAF
The APO03 Manage Enterprise Architecture process promotes TOGAF [1] as related guidance for an
EA framework. The TOGAF Architecture Development Method (ADM), as well as other TOGAF
components, map to the APO03 process key management practices [13].

The current TOGAF 9.1 specification recognizes the need for a tailored approach for COBIT, by stating
that TOGAF tailoring “may include adopting elements from other architecture frameworks, or
integrating TOGAF methods with other standard frameworks, such as ITIL, CMMI, COBIT, PRINCE2,
PMBOK, and MSP” [1].

Therefore we can conclude that TOGAF is helpful in establishing capabilities that facilitate the
implementation of GEIT initiatives and thus the achievement of the governance objectives and goals.
However the standard TOGAF approach is generic, in the sense that it is not specifically tailored to the
COBIT 5 rationale.

2.5.1 Mapping TOGAF ADM Components to COBIT 5 Practices


As part of the analysis phase, it is important to understand how the TOGAF ADM phases, guidelines,
and techniques map to the COBIT 5 practices, in order to see how TOGAF can be used to enable
GEIT.

Table 2-1: Mapping TOGAF ADM components to COBIT 5 practices

TOGAF ADM components | COBIT 5 practices
Phase A, Architecture Principles | APO03.01 Develop the enterprise architecture vision
Phases B, C, D | APO03.02 Define reference architecture
Phase E | APO03.03 Select opportunities and solutions
Phases F, G | APO03.04 Define architecture implementation
Requirements Management, Architecture Principles, Stakeholder Management, Business Transformation Readiness Assessment, Risk Management, Capability-Based Planning, Architecture Compliance, Architecture Contracts | APO03.05 Provide enterprise architecture services
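Where it helps to automate traceability between ADM deliverables and COBIT 5 practices, the mapping in Table 2-1 can also be kept in a machine-readable form. The sketch below simply encodes the table; the dictionary structure and the helper function are illustrative assumptions of ours, not part of TOGAF or COBIT 5.

```python
# Illustrative sketch: the Table 2-1 mapping captured as a Python dictionary.
# Keys are TOGAF ADM components; values are the COBIT 5 APO03 practices they support.
ADM_TO_COBIT5 = {
    "Phase A, Architecture Principles": "APO03.01 Develop the enterprise architecture vision",
    "Phases B, C, D": "APO03.02 Define reference architecture",
    "Phase E": "APO03.03 Select opportunities and solutions",
    "Phases F, G": "APO03.04 Define architecture implementation",
    "Requirements Management, Architecture Principles, Stakeholder Management, "
    "Business Transformation Readiness Assessment, Risk Management, "
    "Capability-Based Planning, Architecture Compliance, Architecture Contracts":
        "APO03.05 Provide enterprise architecture services",
}

def practices_for(adm_component: str) -> list[str]:
    """Return the COBIT 5 practices mapped to a given ADM component (substring match)."""
    return [practice for components, practice in ADM_TO_COBIT5.items()
            if adm_component.lower() in components.lower()]

if __name__ == "__main__":
    print(practices_for("Risk Management"))
    # ['APO03.05 Provide enterprise architecture services']
```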

From the mapping presented in Table 2-1, we conclude that the TOGAF standard can be instrumental
in covering the relevant COBIT 5 practices, generally providing a good fit for the EA methodology
needs. We now have a sound basis for stating the following requirement:

Requirement 2-6: Use the TOGAF ADM for architecture development.

Use the TOGAF ADM as a methodology for developing the enterprise architecture. [17]

2.5.2 Using TOGAF for Enabling COBIT 5


As noted in the introduction to this section, the TOGAF 9.1 specification already anticipates tailoring
towards other standard frameworks, including COBIT [1]. TOGAF is therefore helpful in establishing
capabilities that facilitate the implementation of GEIT initiatives, and thus the achievement of the
governance objectives and goals. However, the standard TOGAF approach is generic, in the sense
that it is not specifically tailored to the COBIT 5 rationale.

Requirement 2-7: Tailor TOGAF for COBIT 5.

Tailor the TOGAF ADM for COBIT 5 stakeholders’ specific needs. [17] [18]

Some of the TOGAF components which are relevant for COBIT 5 stakeholders are presented and
analysed in the following section.

2.5.3 TOGAF Modelling and Content Framework


TOGAF provides guidelines for the types of artifacts that are relevant for architectural conversations,
i.e., for facilitating understanding and cooperation between stakeholders at different levels [1]. The
TOGAF recommendations concerning the need to represent the motivational rationale are especially
important in the COBIT 5 context.

Requirement 2-8: Stakeholder concerns, views, and viewpoints.

Develop viewpoints and views of the architecture that show how the concerns and requirements
are going to be addressed. [18]

In the TOGAF context, several types of artifacts should be considered. These artifacts should be part
of an architecture repository.

Requirement 2-9: Artifacts and architectural repository.

An artifact is an architectural work product that describes an aspect of the architecture. Artifacts
are generally classified as catalogues (lists of things), matrices (showing relationships between
things), and diagrams (pictures of things). Artifacts will form the content of the Architecture
Repository. [18]

Requirement 2-10: Content concepts: catalogues, matrices, and diagrams.

A TOGAF architecture is based on defining a number of architectural building blocks within
architecture catalogues, specifying the relationships between those building blocks in architecture
matrices, and then presenting communication diagrams that show in a precise and concise way
what the architecture is. [18]

Furthermore, the artifacts should be tailored for GEIT using COBIT 5:

Requirement 2-11: Artifacts tailored for COBIT 5.

TOGAF deliverables may be replaced or extended by a more specific set, defined in any other
framework that the architect considers relevant. [18]

The TOGAF specification states that the “benefits of using this (motivation) extension are as follows:
highlights misalignment of priorities across the enterprise and how these intersect with shared
services (e.g., some organizations may be attempting to reduce costs, while others are attempting to
increase capability); shows competing demands for business services in a more structured fashion,
allowing compromise service levels to be defined” [1].

Requirement 2-12: Represent the COBIT 5 motivational rationale.

The motivation extension is intended to allow additional structured modelling of the drivers, goals,
and objectives that influence an organization to provide business services to its customers. The
scope of this extension is as follows [18]:

- Driver: shows factors generally motivating or constraining an organization;
- Goal: shows the strategic purpose and mission of an organization;
- Objective: shows near to mid-term achievements that an organization would like to attain.

The standards also recommend that templates or schemas be defined, for promoting re-use, sharing
and adoption:

Requirement 2-13: Modelling viewpoints: templates and tools.

An architecture description should be encoded in a standard language, to enable a standard
approach to the description of architecture semantics and their re-use among different tools.

A viewpoint should be developed, visualized, communicated, and managed using a tool. Standard
viewpoints (i.e., templates or schemas) should be developed, so that different tools that deal in the
same views can interoperate, the fundamental elements of an architecture can be re-used, and the
architecture description can be shared among tools. [18]
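As a minimal illustration of this requirement, a viewpoint template can first be captured as a small, tool-neutral data structure before being implemented in an ArchiMate-compatible modelling tool. The field names and example values below are assumptions made for this sketch only; they are not taken from TOGAF or ArchiMate.

```python
from dataclasses import dataclass

# Illustrative, tool-neutral sketch of a viewpoint template (Requirement 2-13).
# Field names and example values are assumptions for this sketch only.
@dataclass
class ViewpointTemplate:
    name: str
    stakeholders: list[str]        # who the views built from this template address
    concerns: list[str]            # the concerns the views must cover
    allowed_constructs: list[str]  # restricted to standard ArchiMate constructs

process_assessment_viewpoint = ViewpointTemplate(
    name="COBIT 5 process performance assessment (illustrative)",
    stakeholders=["Assessor", "Process owner"],
    concerns=["Process capability", "Evidence of base practices and work products"],
    allowed_constructs=["Business Process", "Business Object", "Business Role", "Goal"],
)
```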

2.6 ArchiMate
The TOGAF and ArchiMate specifications are both developed by The Open Group, and their
development efforts are becoming increasingly coordinated [34] [32] [35] [36] [37]. The ArchiMate
standard is also widely supported by modelling tool providers [38] [39] [40] [41] [42] [43] [44] [45],
which makes the language an attractive option for fast and easy adoption.

Requirement 2-14: Use ArchiMate as a visual modelling language for EA.

Use the ArchiMate standard for modelling the EA.

2.6.1 ArchiMate Extensions


The language provides mechanisms for extending the core language, through adding attributes to
ArchiMate concepts and relationships, as well as through specialization of concepts and relationships
[31]. Extending the ArchiMate language can be useful for optimizing the ontological fit of the
architectural representations. Business concepts like organizational competencies [46] and key
performance indicators [47] can be modeled using such an approach. Also, ontology-related
techniques can be used to assist the modeling of enterprise architectures through the analysis and
validation of models [48].

However, in this work we aim to maximize the value of the COBIT 5 window of opportunity. We therefore based our artifacts on constructs that are an integral part of globally adopted specifications, namely the ArchiMate standard, thus lowering implementation costs and facilitating fast and widespread adoption with popular ArchiMate-compatible modeling tools. Other proposals have used the standards-based modeling approach that we adopt in this work, namely for modeling ITIL [49] [9], ITIL process assessments [50] [10], as well as security and risk [51].

Requirement 2-15: Use standard ArchiMate constructs.

Use standard ArchiMate constructs for modelling the COBIT 5 rationale.

2.7 DSRM Iterations


Note that the DSRM process model is iterative. It is therefore important to keep track of all relevant feedback as work progresses through the different process model activities and cycle iterations. Namely, we seek to elicit lessons learned at each stage and incorporate this knowledge as feedback for later stages. In other words, the lessons learned in one stage become relevant related work for later stages.

2.8 Solution Objectives and Requirements


The goal of the proposed thesis work is to provide a reference EA approach that effectively supports governance initiatives that use COBIT. From the related work analysis, we can identify the solution objectives for the proposed EA approach, as presented in Table 2-2. These solution objectives and requirements will be developed further in the “Theoretical Background” section.

Table 2-2: Solution objectives, before the theoretical background analysis.

SO1: Provide a tailored EA approach for GEIT, based on the TOGAF framework.
  Objective’s rationale: Enable fast and easy adoption, standards-based.
  Related requirements: Requirement 2-2, Requirement 2-3, Requirement 2-6.

SO2: Integrate the COBIT 5 rationale in the EA principles, methods, and models.
  Objective’s rationale: Provide an effective EA approach for enabling GEIT using COBIT 5.
  Related requirements: Requirement 2-5, Requirement 2-7, Requirement 2-10, Requirement 2-12, Requirement 2-1.

SO3: Use standard ArchiMate constructs to describe the EA.
  Objective’s rationale: Enable fast and easy adoption, standards-based.
  Related requirements: Requirement 2-13, Requirement 2-14, Requirement 2-15.

SO4: Provide templates for relevant viewpoints.
  Objective’s rationale: Enable effective architectural conversations, addressing all the relevant stakeholders’ concerns.
  Related requirements: Requirement 2-4, Requirement 2-8, Requirement 2-9, Requirement 2-11, Requirement 2-13.

3 Research Problem
In the previous section, we identified the stakeholders’ needs and reviewed the state-of-the-art regarding the problem space. In this section we define the specific research problem that we have addressed in this work.

In order to define the research problem, we will provide definitions for the problem context and the
problem statement:

 The problem context:


o Clearly stating the problem setting is important, in order to remove ambiguity
regarding the problem space.
o The problem statement assumes the problem context as its workspace. This enables
a more compact formulation for the problem statement.
o A clear and precise definition of the problem context will also help ensure that the
correct conditions will be set for the demonstration scenarios (i.e. the field studies)
and the evaluation activities.
 The problem statement:
o The problem statement will be key in understanding the value of the proposed solution
- and thus justify its development and later usage.
o The problem statement, along with the solution’s objectives, is the basis for the
development and evaluation of the artifacts that will provide the proposed solution.

Note that the definition of the full set of solution objectives will be deferred to the DSRM iteration
sections, after a review of the relevant theoretical background. This staged approach will enable us to
carefully ponder – and take into account – globally recognized standards and best practice
frameworks, which will be valuable in identifying, defining, and justifying the solution’s objectives and
requirements.

3.1 Problem Context


From the analysis of the related work, made in the previous section, we have concluded that some
qualified EA approaches, especially those based on TOGAF and ArchiMate, may be helpful in
establishing and maintaining capabilities that facilitate the implementation of GEIT initiatives and thus
the achievement of governance objectives.

So now we can formulate the problem context statement, as:

Statement 3-1: The problem context.

COBIT 5 identifies the need - and recommends guidance – for qualified EA approaches like
TOGAF, in order to enable GEIT.

3.2 Problem Statement
The notion that EA is instrumental as a communication tool between business and IT stakeholders is well established [21] [7] [13] [22]. EA practices help promote dialogue, in order to foster shared meaning and promote alignment of the enterprise’s means and ends [6].

Furthermore, the COBIT 5 framework explicitly recommends [7] the APO03 Manage Enterprise
Architecture process as an enabler for governance of enterprise IT. The framework also recommends
the TOGAF standard as related guidance [13] for enterprise architecture.

However, the TOGAF and the ArchiMate standards are generic specifications, as far as COBIT 5
practice is concerned, in the sense that they are not specifically tailored to the COBIT 5 rationale.

In other words – borrowing from engineering terminology – we need an enabling adapter to compensate for the likely impedance mismatch; otherwise, a significant amount of energy may be wasted. More precisely, the ontological mismatch between the COBIT and EA domains implies an enabler performance risk: the threat of missing the expected targets for benefits and costs, for the governance of enterprise IT in general and for governance initiatives in particular.

Therefore, we can formulate the problem statement, as:

Statement 3-2: The problem statement.

The COBIT 5 framework and the TOGAF 9.1 related guidance do not, by themselves, provide a
specific EA approach that helps bridge the gap between the desired COBIT 5 EA enabler
performance goals and the actual EA practice, namely for COBIT 5 process assessment and
process improvement initiatives.

Specifically, the TOGAF 9.1 standard does not provide specific enabling support for COBIT 5
governance initiatives, namely a tailored EA methodology and COBIT-specific building blocks.

3.2.1 Value of the Solution


In simple terms we may argue that, without an adequate EA approach, we risk missing the desired
alignment between governance motivations (i.e. governance goals and strategy) and actual EA
outcomes (i.e. enabler performance and strategy execution).

On the upside, we may argue that by adopting and customizing an adequate EA approach, enterprises
may optimize the value of implementing governance of enterprise IT initiatives using COBIT 5 [7],
namely in the context of COBIT 5 process assessment and process improvement initiatives.

Risks related to higher governance costs

The lack of adequate EA tools for enabling governance implies the use of less efficient approaches; in
such a scenario, a significant amount of additional effort may be spent, for each governance initiative,
in wasteful collection and validation activities, as well as in burdensome and demotivating
misunderstanding-related corrections. We can argue that these wasteful activities need not have
occurred if EA had been adequately managed on an ongoing basis, with repeatable and well-
understood methodologies that promoted shared knowledge, and used common and up-to-date
architectural descriptions.

Note that we should not underestimate the complexity of the tasks involved; the COBIT 5 and TOGAF 9.1 frameworks are not, by any reasonable measure of effort, conceptual models that are easy to understand and implement. Someone new to these subject matters should expect a steep learning curve, both to master the knowledge domains and to take full advantage of their application potential. Indeed, it should come as no surprise that the high complexity of the frameworks may be invoked as a showstopper for their use in governance-related initiatives - especially in enterprises with low levels of governance maturity and capability.

Therefore, we can argue that there is value in designing and using a solution that helps to handle the
conceptual and practical complexities, thus facilitating adoption of COBIT 5 for GEIT initiatives.

Risks related to loss of governance benefits

Generic EA approaches, i.e. those that do not explicitly take the COBIT 5 rationale into account, risk being ineffective when used as governance enablers. Indeed, if the COBIT 5 control objectives are not embedded in the EA models and practices, subsequent architecture change initiatives risk “forgetting” the COBIT 5 motivational rationale, namely when analysing opportunities and solutions and when making decisions regarding implementation changes. As time goes by, it is reasonable to expect that such gaps between strategy and execution will become larger, thus seriously degrading the expected governance benefits.

4 Theoretical Background
In the “Related Work” section we identified and defined relevant EA capabilities and requirements for enabling governance of enterprise IT. Those EA capability needs and EA requirements represent statements of need, which the proposed solution should address. The related work review process also allowed us to define the design research problem that we want to solve.

Now that we have defined the problem with the help of state-of-the-art references, we further aim to understand the state-of-the-art regarding the solution space. In this section, therefore, we review additional theoretical background that will help us design a solution for the stated problem.

4.1 Architecture Principles


Architecture principles are, arguably, the cornerstones of EA [23]. According to the TOGAF framework, architecture principles “are a set of principles that relate to architecture work. They reflect a level of consensus across the enterprise, and embody the spirit and thinking of existing enterprise principles” [1]. Therefore, we should identify and define the following solution requirement:

Requirement 4-1: Provide representations for enterprise principles and architecture


principles.

The ArchiMate models and viewpoint templates should provide architectural representations for
enterprise principles and architectural principles.

In particular, for the context of GEIT using COBIT 5, we should also identify and define the following
solution requirement:

Requirement 4-2: Provide representations for COBIT 5 principles.

The ArchiMate models and viewpoint templates should provide architectural representations for
the COBIT 5 framework principles.

Note that these solution requirements are relevant for achieving the solution objective “SO2: Integrate
the COBIT 5 rationale in the EA principles, methods, and models” (see Table 2-2).

4.2 Architectural Method
We have already established TOGAF as the framework of choice for the proposed solution. The TOGAF specification provides guidance [1] on how to establish and maintain EA capabilities, stating that, as “with any business capability, the establishment of an enterprise Architecture Capability can be supported by the TOGAF Architecture Development Method (ADM)” [18]. We should therefore define the following solution requirement:

Requirement 4-3: Establish and maintain EA capabilities using the ADM.

The establishment and maintenance of an enterprise Architecture Capability should be supported by the TOGAF Architecture Development Method (ADM). [18]

4.2.1 Using COBIT 5 with ITIL and Other GEIT-related Initiatives


Note that any GEIT change initiative entails a potential for architectural change, to be addressed with
the ADM. In order to ensure coordination of all governance and management initiatives with EA
impact, we should define the requirement:

Requirement 4-4: GEIT change initiatives should use the ADM.

The implementation of GEIT change initiatives should be supported by the TOGAF Architecture
Development Method (ADM) and described in the Architecture Landscape at the Strategic
Architecture level. [18]

This requirement ensures that GEIT and EA activities are adequately synchronized and that the
architectural descriptions regarding the GEIT initiative are represented at the adequate abstraction
level, thus enabling integration (both of the rationale and the architectural descriptions) with other
strategic, governance and management initiatives that use EA methodologies, such as ITIL-related
[49] and TIPA-related [50].

4.3 Architectural Models and Artifacts


As a source of good practice for modelling in ArchiMate, we have considered the practical guidance offered in the “Mastering ArchiMate” book by Gerben Wierda [51]. Examples of recommended practice range from the simple (and effective) inclusion of an element’s type name - e.g. “(Business Process)” - in the element’s label, to more elaborate guidelines for modelling risk, security, and capability.

5 First DSRM Iteration
In the first DSRM iteration, we designed and tested a proof-of-concept EA proposal, in order to
demonstrate and evaluate the feasibility of the scientific project as a whole, as well as to collect early
feedback for the following DSRM iterations.

The proof-of-concept proposal is composed of:

 The solution statement (of intent), which aims to address the problem statement;
 The initial set of solution objectives and requirements, which are work products of the first
execution of the DSRM “Define Objectives of a Solution” research activity;
 Draft artifacts, corresponding to draft outcomes of the first execution of the DSRM “Design &
Development” research activity.

5.1 Solution Proposal


We begin by discussing and defining the solution statement (of intent), which aims to address the
problem statement (see Section 3.2).

5.1.1 Solution Statement


The solution statement is formulated in the same context as the problem statement: COBIT 5 identifies
the need - and recommends guidance – for qualified EA approaches like TOGAF, in order to enable
governance of enterprise IT (see Statement 3-1).

The solution statement is a counterpart for the problem statement (see Statement 3-2) and has the
following definition:

Solution statement 5-1

Provide an Enterprise Architecture (EA) approach which:

 Is based on TOGAF and tailored for GEIT initiatives using COBIT 5, in order to assist in
process assessment and process improvement activities;
 Integrates the COBIT 5 rationale in the EA principles, methods, and models; and
 Uses standard ArchiMate constructs for describing the EA.

5.1.2 Solution Objectives and Requirements


The full set of initial solution objectives and requirements is presented in Table 5-1.

Table 5-1: Solution objectives and requirements, for the first DSRM iteration.

SO1: Provide a tailored EA approach for GEIT, based on the TOGAF framework.
  Objective’s rationale: Enable fast and easy adoption, standards-based.
  Related requirements: Requirement 2-2, Requirement 2-3, Requirement 2-6, Requirement 4-3, Requirement 4-4.

SO2: Integrate the COBIT 5 rationale in the EA principles, methods, and models.
  Objective’s rationale: Provide an effective EA approach for enabling GEIT using COBIT 5.
  Related requirements: Requirement 2-5, Requirement 2-7, Requirement 2-11, Requirement 2-12, Requirement 2-1, Requirement 4-1, Requirement 4-2, Requirement 4-4.

SO3: Use standard ArchiMate constructs to describe the EA.
  Objective’s rationale: Enable fast and easy adoption, standards-based.
  Related requirements: Requirement 2-13, Requirement 2-14, Requirement 2-15.

SO4: Provide templates for relevant viewpoints.
  Objective’s rationale: Enable effective architectural conversations, addressing all the relevant stakeholders’ concerns.
  Related requirements: Requirement 2-4, Requirement 2-8, Requirement 2-9, Requirement 2-10, Requirement 2-13, Requirement 4-4.

Each solution objective is associated with a corresponding list of related requirements – derived from
analysis of the related work (see Section 2.8) and the theoretical background (see Section 4). The
objectives’ effectiveness rationale is presented in the column “Objective’s rationale”.

5.1.3 Proposed Artifacts


The proof-of-concept ArchiMate artifacts served as modeling prototypes, in order to demonstrate and evaluate the feasibility of the project as a whole, as well as to collect early feedback.

The proof-of-concept artifacts are:

 A Constructs Mapping table set (see Appendix A: First DSRM Iteration - Constructs Mapping),
which maps the relevant COBIT 5 concepts to the corresponding ArchiMate constructs, thus
demonstrating the ontological fit of the proposed solution. The mapping is defined by the
following tables:
o The COBIT 5 framework end goals;
o The COBIT 5 Goals Cascade;
o Pain Points and Trigger Events.
 Viewpoints for addressing the relevant stakeholders’ concerns (see the illustrative sketch after this list):
o The Goals Cascade viewpoint (see Appendix Figure 1: Goals Cascade viewpoint.): this template shows how the COBIT 5 “Meeting stakeholders’ needs” principle cascades down to the enablers - in particular to the enabling processes.
o The Enabling Process Performance viewpoint (see Appendix Figure 2: Enabling Process Performance viewpoint.): this viewpoint shows the main concepts and relationships involved in a process performance assessment and uses - thereby providing links to - concepts from the Goals Cascade viewpoint.
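To make the cascade logic concrete, the following minimal sketch represents the COBIT 5 goals cascade (stakeholder needs, enterprise goals, IT-related goals, and enabler goals) as a simple linked data structure. The individual goal texts are illustrative placeholders rather than entries from the official COBIT 5 goal catalogues.

```python
# Minimal sketch of the COBIT 5 goals cascade as a linked data structure.
# The goal texts below are illustrative placeholders; a real model would use
# the official COBIT 5 goal catalogues and an ArchiMate modelling tool.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Goal:
    name: str
    supports: List["Goal"] = field(default_factory=list)  # goals one level up the cascade


need = Goal("Stakeholder need: realise benefits from IT-enabled investments")
enterprise_goal = Goal("Enterprise goal: portfolio of competitive products and services", [need])
it_goal = Goal("IT-related goal: alignment of IT and business strategy", [enterprise_goal])
process_goal = Goal("Enabler (process) goal: APO02 strategy is defined and communicated", [it_goal])


def trace(goal: Goal) -> List[str]:
    """Walk the cascade upwards, from an enabler goal to the stakeholder need."""
    chain = [goal.name]
    while goal.supports:
        goal = goal.supports[0]
        chain.append(goal.name)
    return chain


if __name__ == "__main__":
    for step in trace(process_goal):
        print(step)
```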

5.2 Demonstration
For the purposes of demonstration, the proof-of-concept viewpoints were instantiated for the APO02 Manage Strategy process. Note that, for the remaining COBIT 5 processes, the corresponding viewpoints would have - mutatis mutandis - a very similar structure [30] [13].

5.3 Evaluation
For this proof-of-concept stage, an interim dissertation report was produced, presented, discussed,
and evaluated at Instituto Superior Técnico, in a public session, as part of the formal evaluation
process for the “Master Project in Information and Software Engineering” course. The proof-of-concept
proposal was thus validated.

5.3.1 Lessons learned


The evaluation discussion brought to light a major concern regarding the practical value of the work as a whole, namely how real-world organizations should be approached in order to maximize the value of the proposed solutions and foster adoption.

As a consequence, the following DSRM iterations used a more pragmatic approach to the evaluation
process, targeting evaluation criteria which were more directly related to the solutions’ usefulness. In
particular, we replaced the proposed evaluation approach based on the Wand and Weber Method [52]
[53] (ontological expressiveness) and on the Moody and Shanks Framework [54] (quality), with a new
approach that explicitly included goal efficacy evaluation criteria.

Also, three more DSRM iterations were performed as field studies in real-world organizations, thus broadening the initial statement of work and making it more pragmatic (a theoretical case study proposal was dropped).

The field studies provided practical experience, both formal (i.e. evaluation forms) and informal (interviews, group sessions), on how to approach and motivate the organizations’ stakeholders. In particular, the evaluation forms used in the field studies included questions regarding the degree of confidence of the evaluators.

5.4 Communication
As stated in the previous section, an interim dissertation report was produced, presented, discussed,
and evaluated at Instituto Superior Técnico, in a public session, as part of the formal evaluation
process for the “Master Project in Information and Software Engineering” course.

5.5 Conclusion
The proof-of-concept DSRM iteration provided the foundational work necessary to demonstrate and
validate the feasibility and scientific correctness of the thesis work, and provide feedback for the
following DSRM iterations.

The main lessons learned concern conducting the following DSRM iterations with the goal of
maximizing the value of the proposed solutions and ultimately fostering adoption in real world
organizations.

6 Second DSRM Iteration
For the second DSRM iteration, the generating and testing work was based on the following
guidelines:

 Field study:
o The study was performed in a military organization setting, in order to demonstrate
and evaluate a new solution design (see Section 5.3.1);
o We have also used a secondary group for experimental control purposes.
 Design focus (agile approach):
o Focus on artifacts for assisting process assessment activities;
o Focus on the process capability level 1 rationale, i.e. process performance. Note that
this capability level relates directly to the goals cascade methodology, therefore
bringing the business-IT alignment perspective into play.
 Demonstration and evaluation:
o Use an ex-ante demonstration and evaluation strategy, in order to get early field study
feedback for this interim iteration.
 Evaluation criteria:
o We have explicitly included goal efficacy evaluation criteria (see Section 5.3.1).

With these guidelines we have sought to incorporate the lessons learned from the previous DSRM
iteration.

6.1 Solution Proposal


The proposed solution, designed in order to model the COBIT 5 process performance indicators and
the related assessment context, consists of the following IS artifacts:

 Constructs mapping: we propose a mapping between the relevant COBIT 5 process performance assessment concepts and the standard ArchiMate constructs;
 Model: we propose an ArchiMate viewpoint template that addresses the specific concerns of the stakeholders engaged in COBIT 5 process performance assessments;
 Method: we propose general guidelines for applying the proposed viewpoint template, in order to tailor the solution for the specific setting of each enterprise.

6.1.1 Constructs Mapping


The proposed ontological mapping between the COBIT 5 process performance assessment concepts and the ArchiMate constructs is presented in Table 6-1. The column “COBIT 5 concept description” contains definitions and explanations taken from the relevant COBIT 5 publications [2] [25] [13] [30]. The column “ArchiMate concept description” contains definitions and explanations taken from the ArchiMate specification [31]. The semantic matching between the contents of these two columns provides the justification for the proposed ontological mapping.

Table 6-1: COBIT 5 to ArchiMate Ontological Mapping

COBIT 5 concept: IT-related goal
  COBIT 5 concept description: A statement describing a desired outcome of enterprise IT in support of enterprise goals. An outcome can be an artefact, a significant change of a state or a significant capability improvement.
  ArchiMate concept description: A goal is defined as an end state that a stakeholder intends to achieve.

COBIT 5 concept: Stakeholder
  COBIT 5 concept description: Anyone who has a responsibility for, an expectation from or some other interest in the enterprise— e.g., shareholders, users, government, suppliers, customers and the public.
  ArchiMate concept description: A stakeholder is defined as the role of an individual, team or organization (or classes thereof) that represents their interests in, or concerns relative to, the outcome of the architecture.

COBIT 5 concept: Process
  COBIT 5 concept description: Generally, a collection of practices influenced by the enterprise’s policies and procedures that takes inputs from a number of sources (including other processes), manipulates the inputs and produces outputs (e.g., products, services).
  ArchiMate concept description: A business process is defined as a behavior element that groups behavior based on an ordering of activities. It is intended to produce a defined set of products or business services.

COBIT 5 concept: Activity
  COBIT 5 concept description: The main action taken to operate the process. Activities describe a set of necessary and sufficient action-oriented implementation steps to achieve a practice; consider the inputs and outputs of the process; are non-prescriptive and need to be adapted and developed into specific procedures.
  ArchiMate concept description: A business process is defined as a behavior element that groups behavior based on an ordering of activities. It is intended to produce a defined set of products or business services.

COBIT 5 concept: Process performance assessment result (assessment output, process attribute rating)
  COBIT 5 concept description: Assessment output: all of the tangible results from an assessment. Process attribute rating: a judgment of the degree of achievement of the characteristic for the assessed process. Note: The proposed model represents only the process attribute PA 1.1 Process Performance.
  ArchiMate concept description: An assessment is defined as the outcome of some analysis of some driver. An assessment may reveal strengths, weaknesses, opportunities, or threats for some area of interest. These outcomes need to be addressed by adjusting existing goals or setting new ones, which may trigger changes to the enterprise architecture.

COBIT 5 concept: Process purpose statement
  COBIT 5 concept description: A description of the overall purpose of the process. The high-level measurable objectives of performing the process and the likely outcomes of effective implementation of the process.
  ArchiMate concept description: A goal is defined as an end state that a stakeholder intends to achieve.

COBIT 5 concept: Process description
  COBIT 5 concept description: An overview of what the process does and a high-level overview of how the process accomplishes its purpose.
  ArchiMate concept description: It is a description that expresses the intent of a representation; i.e., how it informs the external user. Meaning is defined as the knowledge or expertise present in a business object or its representation, given a particular context.

COBIT 5 concept: Process outcomes
  COBIT 5 concept description: An observable result of a process.
  ArchiMate concept description: A goal is defined as an end state that a stakeholder intends to achieve.

COBIT 5 concept: Base practice
  COBIT 5 concept description: An activity that, when consistently performed, contributes to achieving a specific process purpose. Base practices are the activities or tasks required to achieve the required outcome for the process. They are specified in the COBIT PAM at a high level without specifying how they are carried out.
  ArchiMate concept description: A requirement is defined as a statement of need that must be realized by a system. Requirements model the properties of these elements that are needed to achieve the “ends” that are modeled by the goals. In this respect, requirements represent the “means” to realize goals.

COBIT 5 concept: Process metrics
  COBIT 5 concept description: At each level of the goals cascade, hence also for processes, metrics are defined to measure the extent to which goals are achieved. Metrics can be defined as ‘a quantifiable entity that allows the measurement of the achievement of a process goal. Metrics should be SMART—specific, measurable, actionable, relevant and timely’.
  ArchiMate concept description: A requirement is defined as a statement of need that must be realized by a system. Requirements model the properties of these elements that are needed to achieve the “ends” that are modeled by the goals. In this respect, requirements represent the “means” to realize goals.

COBIT 5 concept: Inputs and Outputs
  COBIT 5 concept description: The process work products/artefacts considered necessary to support operation of the process.
  ArchiMate concept description: A business object is defined as a passive element that has relevance from a business perspective. Sometimes, business objects represent actual instances of information produced and consumed by behavior elements such as business processes.
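For completeness, the mapping of Table 6-1 can also be captured in a machine-readable form, for example to drive scripted checks or tool import pipelines. The sketch below is merely illustrative and is not part of the proposed artifacts; the construct names follow the ArchiMate 2.x business layer and motivation extension.

```python
# Illustrative only: the ontological mapping of Table 6-1 expressed as a Python
# dictionary, so that a script could look up the ArchiMate construct proposed
# for a given COBIT 5 concept.
COBIT5_TO_ARCHIMATE = {
    "IT-related goal": "Goal",
    "Stakeholder": "Stakeholder",
    "Process": "Business Process",
    "Activity": "Business Process",
    "Process performance assessment result": "Assessment",
    "Process purpose statement": "Goal",
    "Process description": "Meaning",
    "Process outcomes": "Goal",
    "Base practice": "Requirement",
    "Process metrics": "Requirement",
    "Inputs and Outputs": "Business Object",
}


def archimate_construct(cobit_concept: str) -> str:
    """Return the ArchiMate construct proposed for a COBIT 5 concept."""
    return COBIT5_TO_ARCHIMATE[cobit_concept]


if __name__ == "__main__":
    print(archimate_construct("Base practice"))  # prints: Requirement
```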

6.1.2 Viewpoint Template
The proposed viewpoint template (see Figure 6-1) represents the main concepts and relationships related to COBIT 5 process performance assessments, according to the COBIT 5 PAM [30] and COBIT 5 Enabling Processes [13] publications.

The template is generic (note the “[GENERIC]” notation in Figure 6-1), meaning that it may be used for any of the 37 COBIT 5 governance and management processes.

Figure 6-1: Generic ArchiMate template, for viewpoints used in COBIT 5 Process Performance Assessments.

6.1.3 Guidelines for using the Viewpoint Template


In order to use the template, it should first be instantiated for all relevant COBIT 5 processes (i.e. all the processes under performance assessment), by replacing the “[GENERIC]” notation with the process name (e.g. “APO02 Manage Strategy”).

The viewpoint for individual processes may then be tailored to the specific enterprise setting, using the
following steps:

 Represent all instances for groups and aggregates: some of the ArchiMate template elements may represent more than one instance, according to the COBIT 5 PAM [30] and COBIT 5 Enabling Processes [13] specifications, such as IT-related goals, stakeholders, process outcomes, process outcome metrics, base practices, activities, inputs, and outputs. Also note that the COBIT 5 guidelines allow for some flexibility in adding other instances (e.g. other IT-related goals) that the enterprise wishes to adopt, represent, and manage. The template only represents one such instance. In practice, the ArchiMate modeling tools should provide an easy way to duplicate these elements and tailor the copies according to the instance characteristics (see the illustrative sketch after this list).
 Relate the COBIT 5 rationale with the IS/IT implementation: the tailored viewpoints should be integrated in the EA repository, using the functionalities provided by the ArchiMate modeling tool adopted by the enterprise. Once they are made available to the tool users, the assessment stakeholders may establish the relationship links between the viewpoint elements and the ArchiMate elements that represent the IS/IT implementation. We suggest that the ArchiMate association relationship (i.e. the semantically weakest relationship) be used for this purpose, although other relationship types may be used, in order to enforce more specific relationship semantics [31].
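As an illustration of the instantiation step, the minimal sketch below shows how the “[GENERIC]” placeholder could be replaced programmatically for a process under assessment. The element labels and the plain-text representation of the template are hypothetical simplifications; in practice, the instantiation and tailoring are performed inside an ArchiMate modelling tool.

```python
# Hypothetical sketch: instantiate the generic viewpoint template for a COBIT 5
# process under assessment by replacing the "[GENERIC]" marker in element labels.
# Real templates live inside an ArchiMate modelling tool, not in plain strings.
from typing import List

GENERIC_TEMPLATE: List[str] = [
    "[GENERIC] process",           # Business Process element
    "[GENERIC] process purpose",   # Goal element
    "[GENERIC] base practice",     # Requirement element
    "[GENERIC] work product",      # Business Object element
]


def instantiate(template: List[str], process_name: str) -> List[str]:
    """Create a process-specific copy of the generic viewpoint element labels."""
    return [label.replace("[GENERIC]", process_name) for label in template]


if __name__ == "__main__":
    for label in instantiate(GENERIC_TEMPLATE, "APO02 Manage Strategy"):
        print(label)
```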

6.2 Demonstration
During the group sessions and interviews, for demonstration purposes, we presented and discussed
an instantiation of the generic template, which consists of a viewpoint for the APO02 Manage Strategy
process, as shown in Figure 6-2.

We have also presented and discussed the corresponding APO02 process performance indicator
tables, as specified in the COBIT 5 PAM [30] and COBIT 5 Enabling Processes [13] publications, in
order to justify the ontological rationale for the construct mappings.

The demonstration viewpoint was built and presented with the following demonstration criteria, in
order to support the evaluation activities:

 All concepts from the generic template were represented in the viewpoint, in order to help
demonstrate the model’s completeness;
 All 3 IT-related goals, 5 process outcomes, and 6 management base practices, defined for the
APO02 Manage Strategy process, were represented in the viewpoint;
 For simplicity and ease of graphical interpretation of the demonstration artifact, the viewpoint
presents the following simplifications:

o Process outcome metrics are shown in the view, but only for the first process outcome (APO02-O1).
o The viewpoint shows only single instances for stakeholders, activities, inputs, and outputs. Note that in practice modelling tools [45] are used to manage the full model richness, providing adequate graphical interfaces for easing the usage complexities.

Figure 6-2: ArchiMate viewpoint, showing an instantiation of the viewpoint template for the APO02 process.

6.3 Evaluation
For the evaluation activities, we used the following approach:

 Selection of the evaluators:


o Select a primary test group, as representatives of the solution’s target stakeholders;
o Select a secondary test group, for evaluation control purposes (as discussed below),
with lower EA maturity than the primary group;
o Selection pre-requisites: all of the evaluators were volunteers and had prior (practical or academic) knowledge of at least one of the three key evaluation subject matters: COBIT 5, ArchiMate, and managing IS/IT assessments;

o Post-assessment validation: at the end of the assessment activities, each evaluator was asked to rate the overall quality of the demonstration and evaluation activities, as well as the evaluation form, in terms of clarity, understandability, and representability of personal opinion. The validation of each evaluation form depended on the evaluator’s agreement with the following statement: “I felt comfortable in providing the evaluation ratings. They represent my current opinion.”
 Evaluation sessions: for the testing sessions, we conducted:
o Group sessions: we conducted group sessions (focus groups) with the practitioners from the primary group, as well as a separate group session with the students from the secondary group. The agenda for these sessions included a formal presentation which covered the subject matters and the demonstration, an informal debate, detailed instructions for filling out the evaluation form, and the actual evaluation assessment using an evaluation form (questionnaire);
o Interviews: we conducted individual interviews with two members of the primary
group, who had relatively higher expertise levels in the relevant subject matters (i.e.
COBIT 5, ArchiMate, and managing IS/IT assessments).
o The evaluation forms were anonymous and were classified only by group type (primary or secondary), for the purpose of comparing the primary and secondary ratings.

6.3.1 Selecting the evaluators and validating the ratings


For the primary test group, we have selected a group of 10 enterprise IT practitioners composed of
military officers, having a mix of both business and IS/IT experience and knowledge, namely:

 70% (i.e. 7 officers) had both business and IS/IT working experience;
 10% (i.e. 1 officer) had mainly business working experience; and
 20% (i.e. 2 officers) had mainly IS/IT working experience.

The purpose of this primary test group is to represent a practitioner’s group that is relatively mature, in
terms of EA capabilities and managing IS/IT assessments, as well as motivated to improve existing EA
capabilities. Note that military officers are practitioners who understand and value, we argue, the
importance of adequate blueprints which provide shared knowledge and views on top of which
situational awareness and cooperative action can be leveraged, for informing strategic, operational
and tactical decisions. Military officers are surely no strangers to cartographical practice, skills and
capabilities. Hence, in a military environment, the value of EA can be easily grasped using
cartographical analogies for enterprise IS/IT blueprints, principles, and methodologies.

Given that the selected primary group is expected to be somewhat biased towards higher usefulness ratings (relative to the market average), we selected a secondary test group in order to obtain lower expected ratings, both in terms of rating the research problem approach and in rating the solution’s usefulness. Together, we hope that the two sets of ratings (i.e. primary and secondary) will provide a broader representation of the target market and thus a more informed basis for discussing the generalization potential, as well as the expected usefulness and fitness of the proposed solution in other organizational settings.

For the secondary test group, we selected a group of 15 IS/IT master’s students working towards a major in Enterprise Information Systems. At the time of the evaluation sessions, these students were preparing their first IS/IT field assessment using COBIT 5, as an academic exercise in the context of the course “Organization and IT Function Management”. Only 1 of the 15 members had business-related experience. The informal group feedback, provided during the debate period of the demonstration and evaluation session, suggested that the secondary group was less enthusiastic than the primary group regarding agreement with the research problem approach and the solution’s usefulness claims. The formal ratings, provided by the evaluation form results (see Section 6.3.3), are consistent with the informal feedback received during the evaluation session. It is also interesting to note that the initial pool of evaluation volunteers was composed of 19 candidates, of which only the selected 15 provided valid evaluation forms (see “Post-assessment validation” above). On the other hand, all 10 evaluation volunteers from the primary group provided valid evaluation forms, which may be indicative of a higher degree of confidence in evaluating the proposed solution.

In order to provide a basis for discussing and generalizing the evaluation results, we asked the
evaluators to consider the following contextual requests:

 Generalization request: in order to test for usability, we chose two specific naturalistic settings
(i.e. a military setting and an academic IS/IT assessment exercise), thus using real users with
real systems and real problems (practitioner or academic). For generalization purposes, the
evaluation form (questionnaire) included the following introductory request: “When rating the
following evaluation statements, the Evaluator is required to generalize his/her evaluation by
generalizing in two dimensions: for a generic enterprise setting for COBIT 5 assessments; and
for a generic COBIT 5 process, i.e. any of the 37 processes of the COBIT 5 Process
Reference Model.”
 Settings assumptions: for the purposes of evaluation, we asked the evaluators to assume that
the enterprise had put in place a reasonable level of enabling capabilities for performing IS/IT
Assessments, namely adequate capabilities regarding the governance system, policies, and
management of human resources [3] [18].

6.3.2 Evaluation ratings


We asked the evaluators to rate the usefulness of the proposed solution, which was designed to help
integrate the COBIT 5 Assessment rationale in Enterprise Architecture models, using ArchiMate as the
visual modeling language.

Following the DSRM methodology, each generating and testing iteration should (re)define the
solution’s objectives and requirements, as well as the design and development of the IS artifacts [25]
[26]. Therefore, we have evaluated the levels of agreement regarding two dimensions: agreement with
the solution’s objectives rationale (related to the research problem approach) and agreement with the

solution’s usefulness claims. These two separate sets of ratings may thus provide feedback for
different DSRM activities, in the context of future research iterations:

 Agreement with the solution’s objectives rationale: provides feedback for the DSRM activity
“Define Objectives of a Solution”;
 Agreement with the solution’s usefulness claims: provides feedback for the DSRM activity
“Design & Development”.

We used the same rating scale for all statement evaluations, with the following four agreement levels:
“Strongly Disagree”, “Disagree”, “Agree”, and “Strongly Agree”.

Regarding the research problem approach (related to the solution’s objectives rationale), we asked the
evaluators to rate their agreement level with the following statements (see Figure 6-3):

 “Enterprise Architecture facilitates Assessments”;
 “The Assessment criteria should be included in the architectural diagrams”;
 “ArchiMate is useful for providing architectural diagrams”.

After providing ratings for the statements above, the evaluators then provided ratings for the solution’s
usefulness claims, classified along the system dimensions and evaluation criteria presented in Table
6-2 [27].

Table 6-2: Evaluating the Solution’s Usefulness.

System dimension: Environment / Utility. Evaluation criteria: Consistency with organization; Fit with organization.
Statement (claim): The Solution is useful for facilitating architectural conversations between the Assessment stakeholders, enabling a shared understanding of the assessment rationale (“why”) and providing a link for system implementation representations (“what”). Therefore, the Solution is useful to speed up the initial assessment activities (planning, data collection, and data validation).

System dimension: Goal. Evaluation criterion: Efficacy.
Statement (claim): The Solution is useful for facilitating architectural conversations between the Assessment stakeholders, enabling a shared understanding of the assessment rationale (“why”) and providing a link for system implementation representations (“what”). Thus, it may be used to improve the effectiveness of the assessment, by speeding up the initial assessment activities (planning, data collection, and data validation) and thus providing more resources for the value-added assessment activities (performing the actual assessment, documenting exceptions and gaps, and communicating the assessment results and conclusions).

System dimension: Environment / Utility. Evaluation criteria: Consistency with people; Understandability; Ease of use.
Statement (claim): The Solution is useful for providing an architectural representation of the Assessment rationale and providing a link to the system implementation. The graphical notation is easy to understand and the template is easy to use in practice.

System dimension: Structure. Evaluation criterion: Completeness [54].
Statement (claim): The Solution is complete, meaning that it provides a template for representing all the key concepts required for process performance assessments: stakeholders, assessment result, process purpose, process outcomes, base practices, inputs, and outputs.

System dimension: Structure / Correspondence with another model. Evaluation criterion: Homomorphism [52] [53].
Statement (claim): The Solution provides a model which conforms to the modeled assessment framework, presenting an adequate ontological mapping between the assessment concepts and the ArchiMate constructs.

6.3.3 Evaluation rating results
Overall, the primary group provided ratings dominated by the “Strongly Agree” level, whereas the
secondary group ratings were dominated by “Agree” rating levels. The results obtained (see Figure 6-3
and Figure 6-4) are aligned with the author’s expectations, due to the way the demonstration and
evaluation activities were designed and conducted:

 The primary group evaluators gave higher ratings than the secondary group regarding the solution’s usefulness evaluation criteria (see Figure 6-4).
o This result is consistent with the informal feedback exchanged during the discussions and interviews;
o This result was somewhat expected, due to the relatively higher EA maturity levels of the primary group members (see Section 6.3.1, Selecting the evaluators and validating the ratings);
o This result is also consistent with the higher ratings given to the agreement with the research problem approach (see Figure 6-3).
 The secondary group’s results are overall high, in absolute terms (i.e. adding the “Agree” and “Strongly Agree” ratings).
o However, not all evaluators in the secondary group agreed with the solution’s usefulness claims, which may be indicative of adoption issues that need to be addressed, especially if we want to target market segments with lower EA maturity.
o It is interesting to note that the ArchiMate usefulness ratings (see Figure 6-3) were not very reassuring for the secondary group. In a future DSRM iteration it would be interesting to further investigate this issue, which may impact adoption.

Note that we have performed an ex-ante evaluation, meaning that the relatively high ratings that were obtained should be interpreted with caution, given that we demonstrated and evaluated a preliminary version of the artefact [55].

Figure 6-3: Ratings regarding the agreement level with the research problem approach.

Figure 6-4: Ratings regarding the agreement level with solution’s usefulness claims.

Finally, it is interesting to note that the ratings regarding the research problem approach (see Figure 6-3) provide valuable feedback for the next DSRM iteration (for discussing and redefining the solution’s objectives), and also help explain the usefulness rating differences that were observed between the primary group (higher EA maturity) and the secondary group (lower EA maturity).

6.3.4 Lessons learned


We have identified potential adoption issues regarding the current ArchiMate version, which may be
addressed in future work.

6.4 Communication
For the purposes of communication, a paper was submitted to the CBI 2015 conference, reporting on the second DSRM iteration. The paper was evaluated by three reviewers, one of whom gave a low rating. The rationale of this reviewer - flawed, in our opinion - provided information regarding adoption barriers, thus providing feedback for the following DSRM iterations. The main objection of the negative review is analyzed in Table 6-3.

Table 6-3: CBI2015 conference review analysis.

Reviewer evaluation: «First, the idea of modelling COBIT using EA modelling languages is not a new one. This is according to the reviewer, who has sound experience with EA and IT Governance projects, usually done in enterprises that use COBIT and do EAM, especially when using dedicated EAM tools. To achieve this, tool vendors (e.g. BOC for ADOit) offer reference models that contain modelled COBIT content ready for use»
Analysis: The reviewer recognizes that vendors offer COBIT-related contents in their EA tools. However, the BOC tool is not based on a standards-based EA approach.
Lessons learned: In future communications, stress the importance of standards-based solutions, and the importance of using a scientific approach like DSRM for designing and validating solutions.

Reviewer evaluation: «The task of the architect is to create the integration between the COBIT and the EA models. The task of the auditor is to know COBIT and search for gaps. For this she/he does not need a model of COBIT content (especially the processes, roles/responsibilities and indicators), but basically just the relations between goal indicators and customer processes»
Analysis: The reviewer seems to assume that auditors are the main (or only) stakeholders in COBIT 5 process assessments, which contradicts the COBIT 5 framework. The reviewer seems to assume that a holistic view of the organization is not required for performing process assessments, which contradicts the COBIT 5 principles. The reviewer also seems to assume that all the relevant assessment information is readily available for the auditors, and that EA artifacts are not useful for active synchronization between all the relevant stakeholders.
Lessons learned: In future communications, stress the following points: 1) Auditors are not the only stakeholders in COBIT 5 process assessments; 2) The importance of active synchronization tools, for fostering communication and reaching agreement regarding the organizational status; 3) Auditors are stakeholders of EA; 4) «The architecture vision describes how the new capability will meet enterprise goals and strategic objectives and address stakeholder concerns when implemented» [13].

Reviewer evaluation: «The assumption that without a completely modelled COBIT model an audit is harder, is not quite true according to my experience»
Analysis: This assertion seems to contradict the previous assertions, where a case is made for the commercial and operational value of EA artifacts which integrate the COBIT 5 rationale in the EA descriptions.

6.5 Conclusion
In the second DSRM iteration we have evaluated both the research problem approach and the
solution’s usefulness claims, using a field study. For this purpose, we have used a naturalistic setting
and an ex-ante demonstration and evaluation approach, thus complementing the theoretical validation
provided by the first DSRM iteration.

Also, interesting feedback was collected from a reviewer who fundamentally opposed the problem and
solution approaches that we have followed in the first two iterations. Indeed, one conference reviewer
presented an interpretation of COBIT 5 process assessments that seems to contradict the COBIT 5
framework - especially the COBIT 5 holistic principle and the role of EA in assisting the implementation
of this principle. This occurrence is indicative, we argue, of misunderstandings regarding the holistic
approach of COBIT 5 – especially among scholars and practitioners who are used to working with
earlier versions of COBIT. Therefore, in future communication efforts, we should stress the relevance
and implications of the holistic rationale, as well as the importance of active synchronization tools, for
fostering communication and reaching agreement regarding the organizational status. Also, we should
stress the importance of standards-based solutions and of the use of scientific approaches (like DSRM) to build them.

The evaluation ratings that were obtained in this iteration, regarding both the research problem
approach and the solution’s usefulness claims, were considered sufficiently reassuring to envisage
broadening the scope for the following iterations.

7 Third DSRM Iteration
For the third DSRM iteration, the generating and testing scope was considerably changed:

 Field study: (diversified scope – a different public organization)


o The field study was performed in the information systems department of a public
organization with nationwide scope and reach.
 Design focus: (enlarged scope – all capability levels)
o As for the previous iterations, focus on artifacts for assisting process assessment activities;
o But now we address all five process capability levels.
 Demonstration and evaluation: (ex-post and ex-ante setting, enlarged scope of four
processes)
o This organization had already performed a process assessment based on COBIT 5,
so we were able to demonstrate and evaluate parts of the proposed solution using an
ex-post setting.
o We have prepared ex-post demonstration and evaluation artifacts up to the maximum
capability level previously assessed in the organization – level 3. Therefore the
artifacts related to the capability levels 4 and 5 were demonstrated and evaluated in
an ex-ante setting. However, note that the capability levels 2 to 5 share, fundamentally, the same ontological structure, therefore allowing ex-ante arguments to be made for the upper levels (i.e. levels 4 and 5) based on the experimental outcomes for the lower levels (i.e. levels 2 and 3).
o We have demonstrated the proposed solution using four COBIT 5 processes, selected
by a goals cascade exercise.
 Evaluation criteria: (same as for the second iteration)
o We have kept the same evaluation approach that was used in the second iteration,
therefore allowing for a comparison between the second and third iterations’
experimental outcomes.

7.1 Solution Proposal


For the third iteration, besides extending the scope to encompass all five capability levels, the
capability level 1 viewpoints (i.e. for process performance assessments) were refined based on the
feedback collected from the previous iterations:

 IT-related goals were removed from the process performance viewpoint (see Appendix Figure
2) and the IT goals and base practices cascade were detailed in a new viewpoint (see
Appendix Figure 7);
 Work products and activities were modelled using ArchiMate concepts from the motivation extension, in order to allow for a more explicit representation of the entities that represent capability evidence and capability motivations, together with their corresponding relations (see Appendix Figure 4, Appendix Figure 10, Appendix Figure 11, Appendix Figure 12, and Appendix Figure 13);
 A separate viewpoint was introduced, representing base practices, activities, and detailed
activities (see Appendix Figure 11);
 The importance of stakeholders’ needs and responsibilities was stressed (see Section 6.5), by introducing a new viewpoint (see Appendix Figure 9) which relates stakeholder concepts and the associated goals and requirements cascade.

7.2 Demonstration
For demonstration purposes, we performed interactive modelling sessions with the process owners of
four COBIT 5 processes. This approach enabled the process owners (who later performed the
evaluations) to become acquainted with concrete ArchiMate modelling tools and techniques, as well
as the activities and costs involved in creating and maintaining EA capabilities for managing COBIT 5
process assessments. In order to avoid presenting redundant information, only artifacts related to one
of the processes are shown in the appendixes (see Appendix C: Third DSRM Iteration – Viewpoints).

This ex-post approach was also made possible by the fact that the process owners had previously been engaged in COBIT 5 process assessments. These assessments had been assisted with the help of several architectural representations, both ArchiMate and non-ArchiMate based. This means that the evaluators were able to draw conclusions based on their previous assessment and modelling experiences.

7.3 Evaluation
For the third iteration, we have used the same approach and evaluation criteria as for the second
iteration, which allows for comparison between the two sets of evaluation ratings:

 Overall, the two sets of results are not too dissimilar, even though the demonstration and
evaluation scope was broadened for the third iteration;
 However, the following differences can be observed:
o Research problem approach ratings (see Figure 7-1): the civil (third iteration) setting
shows higher agreement levels concerning the assumptions “EA is useful” and “The
assessment criteria should be included in the diagrams”. However these evaluators
provided lower agreement ratings concerning the usefulness of the modelling
language (ArchiMate).
o Solution usefulness ratings (see Figure 7-2): the civil (third iteration) setting shows higher agreement levels concerning the “goal efficacy” and the “utility for people” criteria. However, the structural quality ratings (i.e. “completeness” and “homomorphism”) were lower in this third iteration - which is consistent with the lower observed ArchiMate usefulness-related ratings.

Figure 7-1: Ratings regarding the agreement level with the research problem approach.

Figure 7-2: Ratings regarding the agreement level with solution’s usefulness claims.

7.4 Communication
The outcomes from both the second and third iterations provide material for a scientific paper on using
EA for assisting COBIT 5 process assessment activities. Therefore a paper on this subject is being

prepared for submission to the CAiSE conference (http://caise2016.si/, the paper submission deadline
is the 30th November 2015).

7.5 Conclusion
In this DSRM iteration we observed maximum evaluation ratings regarding “goal efficacy” and “utility
for the people” (i.e. providing architectural representations which are easy to understand and easy to
use). Therefore we feel comfortable in broadening the work scope for the next DSRM iteration.

However, the ratings related to the usefulness of the ArchiMate language and the structural quality of the artifacts were not as high. This may indicate that the ArchiMate modelling language can be improved. These results are consistent with known criticism regarding the current language expressiveness shortcomings - e.g. regarding modelling the capability concept [51] [31].

8 Fourth DSRM Iteration
For the fourth DSRM iteration, the generating and testing work focused on addressing COBIT 5
process improvement activities:

 Field study:
o The field study was performed in the same setting (i.e. organization and stakeholders)
as the third iteration, allowing for direct comparison between the two sets of ratings.
 Design focus:
o Complementing the process assessment solution proposal, we now addressed
process improvement EA needs and requirements.
o An additional solution requirement was defined for the design: links should be
provided between the process assessment descriptions and the process improvement
descriptions, in order to leverage synergies between all related activities.
o As in the previous iteration, all five process capability levels were addressed.
 Demonstration and evaluation:
o The demonstration for this iteration was performed after the third iteration demonstration. However, we used a single evaluation form for recording the ratings of both the third and fourth iterations, in order to allow a more meaningful comparison between the two sets of ratings.
 Evaluation criteria: (same as for the second iteration)
o We have kept the same evaluation approach that was used in the second and third
iterations, therefore allowing for a direct comparison between the second, third, and
fourth iterations’ experimental outcomes.

8.1 Solution Proposal


COBIT 5 provides guidance on how to implement GEIT initiatives [7], based on a continual improvement life cycle that is tailored to suit the enterprise’s specific needs.

According to COBIT 5 best practice, improvement initiatives should be driven using a programme and
project approach, up to the point when the acquired capacity has become embedded in the ongoing
business activity [7].

Therefore, our solution proposal seeks to address the fundamental concepts involved in such an
improvement approach.

In this work we will focus on process improvement initiatives. Note, however, that the focus on
improving enabling processes does not contradict the COBIT 5 holistic principle; indeed, the COBIT 5
body of knowledge provides links between all enablers. Therefore, GEIT initiatives may use process
improvement initiatives as a convenient entry point for enhancing all GEIT enablers.
8.1.1 ArchiMate Constructs
In order to model the programme and project concepts, we may use constructs from the ArchiMate
Implementation and Migration Extension [31].

Note that we have not used these constructs in any of the previous three DSRM iterations, which were
focused on EA solutions for performing process assessments. This does not mean that we could not
have modelled the rationale of specific assessment projects with these constructs; it just means that
the improvement rationale is intrinsically related to the concept of change (i.e. a dynamic perspective),
whereas the assessment rationale concerns getting a snapshot of the status quo at a specific point in
time (i.e. a static perspective).
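
As a minimal illustration (not part of the solution artefacts), the ArchiMate 2.1 Implementation and Migration Extension constructs can be listed together with the programme- and project-related notions they may represent in an improvement initiative; the pairing shown below is an assumption for illustration, not normative ArchiMate or COBIT 5 guidance.

# Hypothetical sketch (Python): the ArchiMate 2.1 Implementation and Migration
# Extension constructs, paired with the programme/project notions they may
# represent in a COBIT 5 process improvement initiative. Pairing is illustrative.
EXTENSION_CONSTRUCTS = {
    "Work Package": "the improvement programme or project itself",
    "Deliverable": "a tangible outcome produced by the improvement project",
    "Plateau": "a relatively stable state, e.g. the as-is or to-be capability level",
    "Gap": "the difference between two plateaus, identified by the gap analysis",
}

if __name__ == "__main__":
    for construct, usage in EXTENSION_CONSTRUCTS.items():
        print(f"{construct}: {usage}")
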
8.1.2 Integrating the Assessment and Improvement Perspectives


One of the objectives for the design of the solution is to enable the integration of the process
assessment and process improvement perspectives. In order to address this objective, the solution
that we have designed and developed in this iteration is based on the COBIT 5 implementation
guidelines (see Figure 8-1).
Figure 8-1: The Implementation Life Cycle, taken from "COBIT 5 Implementation" [7]

Indeed, in Phase 2 of the Implementation Life Cycle we are concerned with assessing the current
state. This approach inspired the idea of including an assessment construct in the improvement
viewpoints, thereby providing a COBIT 5 perspective on the relation between assessments and
improvements, which the EA descriptions may help enforce.

8.1.3 Modelling Capability and Capability Improvement


A significant hindrance had to be removed in order to proceed with the solution design: the current
ArchiMate standard does not provide a construct, nor related recommendations, for modelling the
"capability" concept, which is central to process assessment and process improvement according to
COBIT 5.

Indeed, the ArchiMate standard explicitly recognizes this shortcoming, proposing this concept for
inclusion in a future version of the language [31].

This problem is also discussed in Gerben Wierda's book "Mastering ArchiMate" [51].

We have solved this modelling problem by first defining the following objectives and requirements:

1. Objective: Design a future-proof solution. This objective relates to the following requirement:
1.1. Do not map the capability concept directly to an ArchiMate construct (as a future version of
the language may introduce this construct). Instead, represent relevant aspects related to the
COBIT 5 capability concept, which may later be linked to a future capability-specific construct;
2. Objective: Represent concepts which may be associated with all relevant aspects of the COBIT 5
Implementation Life Cycle phases. This objective relates to the following requirements:
2.1. Provide representations that may be mapped to the phase “What are the drivers?”
2.2. Provide representations that may be mapped to the phase “Where are we now?”
2.3. Provide representations that may be mapped to the phase “Where do we want to be?”
2.4. Provide representations that may be mapped to the phase “What needs to be done?”
2.5. Provide representations that may be mapped to the phase “How do we get there?”
2.6. Provide representations that may be mapped to the phase “Did we get there?”
2.7. Provide representations that may be mapped to the phase “How do we keep the momentum
going?”
3. Objective: Promote the business-IT alignment rationale. This objective relates to the following
requirement:
3.1. Provide representations that relate the motivational aspects with the implementation aspects.

Note that the COBIT 5 PAM [30] defines five capability levels (Levels 1 to 5, above Level 0, Incomplete).
However, from a COBIT 5 improvement perspective, nothing recommends against modelling the
capability improvement initiatives from any one level to the next using similar conceptual structures.
This structural similarity enables us to split the problem into smaller problems that share a similar
structure: we therefore chose to model the improvement initiative from level N to level N+1 in a
consistent manner, for N = {0, 1, 2, 3, 4}.
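
As an illustration only (not part of the original design artefacts), the sketch below shows how this structural similarity could be exploited: a single routine generates, for any transition from level N to level N+1, the same kinds of elements that Figure 8-2 instantiates for Level 0 to Level 1. The function, element kinds, and labels are hypothetical and merely illustrate the consistent per-level structure.

# Hypothetical sketch (Python): generate a consistent improvement structure for
# each capability level transition N -> N+1 of a given COBIT 5 process. The
# element kinds and labels are illustrative, not prescribed by COBIT 5 or ArchiMate.
def improvement_structure(process: str, n: int) -> dict:
    """Return illustrative elements describing the improvement from level n to n+1."""
    return {
        "driver (Driver)": f"Achieve capability level {n + 1} for {process}",
        "goal (Goal)": f"{process} improved from level {n} to level {n + 1}",
        "assessment (Assessment)": f"Gap analysis: {process} currently at level {n}",
        "initiative (Work Package)": f"Improvement initiative {process} L{n} to L{n + 1}",
        "deliverable (Deliverable)": f"Work products and evidence required for level {n + 1}",
        "baseline (Plateau)": f"{process} at capability level {n}",
        "target (Plateau)": f"{process} at capability level {n + 1}",
        "gap (Gap)": f"Difference between the level {n} and level {n + 1} plateaus",
    }

if __name__ == "__main__":
    for n in range(5):  # N = {0, 1, 2, 3, 4}
        structure = improvement_structure("APO12 Manage Risk", n)
        print(structure["initiative (Work Package)"])
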

The resulting process improvement structure is represented in Figure 8-2, instantiated for Level 0 to
Level 1, and for process APO12 Manage Risk.

Figure 8-2: Process improvement structure, instantiated for Level 0 to Level 1, and for process APO12 Manage
Risk.

Note that this structure meets the first design objective, by providing representations for capability
concept aspects, using all the available ArchiMate constructs from the Implementation and Migration
Extension; by using all such constructs we intended to maximize the expressiveness of the
descriptions, thus allowing for more accurate conceptual mappings.

Note also that the third design objective was met: the motivational aspects are represented as:

• The process improvement driver, i.e. achieving a certain process capability level;
• The process improvement goal, which is realized by the improvement initiative deliverables;
• The process assessment outcomes, which relate to the gap analysis and to the previously
described motivational aspects of the process improvement initiative (i.e. the improvement
driver and the improvement goal); note that this element provides a bridge between the
process assessment rationale and the process improvement rationale.

In order to meet the second objective, Table 8-1 provides a proposal for a conceptual mapping
between the COBIT 5 Implementation Life Cycle phases and the EA artifacts.

Table 8-1: Conceptual mapping between the COBIT 5 Implementation Life Cycle phases and the EA artifacts.
(Columns: COBIT 5 Implementation Life Cycle phases; EA artifact, instantiated for Level 0 to Level 1 and for process APO12 Manage Risk; Notes and guidelines for practical usage.)

What are the drivers?
• Continual improvement phase: recognize the need to act [7].
• This driver element may be linked to the concept of Stakeholder Drivers (see Appendix Figure ).
• A change driver is an internal or external event, condition or key issue that serves as a stimulus for change [7].

Where are we now?
• Continual improvement phase: assess current state [7].
• Management needs to know its current capability and where deficiencies may exist. This is achieved by a process capability assessment of the as-is status of the selected processes. [7]
• Note that the assessment element provides a bridge between the process assessment rationale and the process improvement rationale.

Where do we want to be?
• Continual improvement phase: define target state. [7]
• This phase sets a target for improvement followed by a gap analysis to identify potential solutions. [7]

What needs to be done?
• Continual improvement phase: build improvements. [7]
• This phase plans feasible and practical solutions by defining projects supported by justifiable business cases and developing a change plan for implementation.

How do we get there?
• Continual improvement phase: implement improvements. [7]
• This phase provides for the implementation of the proposed solutions into day-to-day practices and the establishment of measures and monitoring systems to ensure that business alignment is achieved and performance can be measured. [7]

Did we get there?
• Continual improvement phase: operate and measure [7].
• This phase focuses on sustainable transition of the improved governance and management practices into normal business operations and monitoring achievement of the improvements using the performance metrics and expected benefits. [7]

How do we keep the momentum going?
• Continual improvement phase: monitor and evaluate [7].
• All motivational elements are relevant in order to describe and demonstrate the will to sustain the change.
• This phase reviews the overall success of the initiative, identifies further governance or management requirements and reinforces the need for continual improvement. It also prioritizes further opportunities to improve GEIT. [7]

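As an illustrative aid only (not part of the original proposal), the mapping of Table 8-1 could also be kept in machine-readable form, for example to verify that every Implementation Life Cycle phase is covered by at least one EA representation (requirements 2.1 to 2.7). The phase names below follow COBIT 5 Implementation [7]; the associated element kinds are an assumption for illustration.

# Hypothetical sketch (Python): Table 8-1 as a machine-readable mapping, used to
# verify that each COBIT 5 Implementation Life Cycle phase (requirements 2.1-2.7)
# has at least one associated EA representation. Element kinds are illustrative.
PHASE_TO_EA_ELEMENTS = {
    "What are the drivers?": ["Driver"],
    "Where are we now?": ["Assessment", "Plateau (baseline)"],
    "Where do we want to be?": ["Goal", "Plateau (target)", "Gap"],
    "What needs to be done?": ["Work Package", "Deliverable"],
    "How do we get there?": ["Work Package", "Deliverable"],
    "Did we get there?": ["Assessment"],
    "How do we keep the momentum going?": ["Driver", "Goal", "Assessment"],
}

def uncovered_phases(mapping: dict) -> list:
    """Return the phases that have no EA representation mapped to them."""
    return [phase for phase, elements in mapping.items() if not elements]

if __name__ == "__main__":
    missing = uncovered_phases(PHASE_TO_EA_ELEMENTS)
    print("All phases covered" if not missing else f"Missing: {missing}")
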
8.2 Demonstration
For demonstration and evaluation purposes, we used the artifacts presented in Appendix D: Fourth
DSRM Iteration – Viewpoints. The artifacts were instantiated for one of the COBIT 5 processes that
had been previously assessed, namely the APO12 Manage Risk process.

Note that the ArchiMate assessment constructs are shared by both the process assessment and the
process improvement viewpoints, thereby linking the assessment rationale with the improvement
rationale (compare e.g. Appendix Figure 10 with Appendix Figure 23).

8.3 Evaluation
For the fourth iteration, we have used the same approach and evaluation criteria as for the third
iteration, which allows for direct comparison between the ratings regarding process assessment (third
iteration) and those regarding process improvement (fourth iteration):

• Each and every rating obtained for the process improvement solution is either equal to or higher
than the corresponding rating obtained for the process assessment solution;
• Higher ratings were obtained for the following evaluation items, for the process improvement
solution:
o Research problem approach ratings (see Figure 7-1 and Figure 8-3):
- "ArchiMate is useful for providing diagrams";
o Solution usefulness ratings (see Figure 7-2 and Figure 8-4):
- "Utility for the enterprise".

Figure 8-3: Ratings regarding the agreement level with the research problem approach.

Figure 8-4: Ratings regarding the agreement level with the solution's usefulness claims.

8.4 Communication
The outcomes from the fourth iteration provide material for a scientific paper on using EA for assisting
COBIT 5 process improvement activities. Therefore, a paper on this subject is being prepared for
submission to the ECIS conference (http://www.ecis2016.eu; the paper submission deadline is
27 November 2015).

8.5 Conclusion
In the fourth (and last) DSRM iteration we observed high evaluation ratings regarding the solution's
usefulness, both for efficacy (i.e. goal efficacy) and for utility (i.e. people and enterprise) measures.

However, these ratings contrast with the relatively lower ratings regarding the expressiveness of the
modelling artifacts based on the ArchiMate language. The same conclusion can be drawn when
considering the field study ratings as a full set, i.e. when considering the second, third, and fourth
iterations as a whole.
9 Conclusion
In this work we have designed and validated an integrated EA solution for assisting COBIT 5 process
assessments and process improvement initiatives.

We have applied the DSRM methodology, which enables an agile approach to the generating and
testing activities, thus allowing for the incorporation of frequent feedback during the design process.
This feedback was instrumental for improving the solution design in an incremental and controlled
manner, as well as for identifying and overcoming adoption barriers.

By providing a link between the process assessment and process improvement artifacts, the solution
enables integration between several architectural aspects:

• The enterprise motivational aspects, modelled with constructs taken from the ArchiMate
Motivation Extension;
• The assessment aspects, modelled with constructs taken from the ArchiMate Motivation
Extension, as well as core ArchiMate concepts;
• The improvement aspects, modelled with constructs taken from both the ArchiMate
Implementation and Migration Extension and the ArchiMate Motivation Extension, as well as
core ArchiMate concepts;
• The ArchiMate framework core concepts, belonging to the business, application, and
technology layers.

This means that the enterprise may adopt a single consolidated EA database that integrates the core
layers with both the COBIT 5 PAM [30] rationale and the COBIT 5 GEIT process improvement [7]
rationale.

This integration feature may be leveraged to enable synergies and to avoid duplication of work
regarding the creation and maintenance of EA capabilities for GEIT purposes, which make use of
process assessment and process improvement activities.

9.1 Research Communication


This work enabled a total of five research communication opportunities, whose outcomes, feedback
(i.e. internal) value, and communication (i.e. external) value are summarized in the following list of
research communication contributions:

• First DSRM iteration:
o Outcome: an interim dissertation report was produced, presented, discussed, and
evaluated at Instituto Superior Técnico, in a public session, as part of the formal
evaluation process for the "Master Project in Information and Software Engineering"
course.
o Design feedback value: as a consequence of the feedback received, the work scope
was directed more towards applied research, focused on real-world field studies. Also,
the sought contributions, to the practice and to the knowledge base, were biased
towards enabling adoption success and overcoming adoption barriers.
o Communication value: the report which was produced has since been requested to
help inform other master thesis works performed at Instituto Superior Técnico.
• Second DSRM iteration:
o Outcome: a paper was submitted to the CBI 2015 conference, reporting on the
second DSRM iteration.
o Design feedback value: the paper was evaluated by three reviewers, one of whom
gave a low rating. The rationale of this reviewer (flawed, in our opinion) provided
valuable information regarding adoption barriers, thus providing feedback for the
following DSRM iterations.
o Communication value: due to the low rating provided by one of the three reviewers,
the communication efficacy of this second iteration was hindered.
• Third DSRM iteration:
o Expected outcome: the second and third iterations provide material for a scientific
paper on using EA for assisting COBIT 5 process assessment activities. Therefore, a
paper on this subject is being prepared for submission to the CAiSE conference
(http://caise2016.si/; the paper submission deadline is 30 November 2015).
o Expected design feedback value and communication value: we expect that the
reviewer comments and ratings will provide feedback for future work. We also expect
to promote the adoption of EA solutions for COBIT 5 process assessment initiatives,
besides providing a concrete solution for the stated purpose.
• Fourth DSRM iteration:
o Expected outcome: the fourth iteration provides material for a scientific paper on using
EA for assisting COBIT 5 process improvement activities. Therefore, a paper on this
subject is being prepared for submission to the ECIS conference
(http://www.ecis2016.eu; the paper submission deadline is 27 November 2015).
o Expected design feedback value and communication value: we expect that the
reviewer comments and ratings will provide feedback for future work. We also expect
to promote the adoption of EA solutions for COBIT 5 process improvement initiatives,
besides providing a concrete solution for the stated purpose.
• Final dissertation report:
o Outcome: this dissertation report.
o Expected design feedback value: we expect that the dissertation examination
committee will provide valuable feedback for informing future work, namely the PhD
thesis work in which the author is currently engaged, which also relates to EA
research problems and solutions.
o Expected communication value: as was the case for the interim report produced at the
end of the first DSRM iteration, we hope that this dissertation report will help inform
other thesis works, namely the PhD thesis work in which the author is currently
engaged, which also relates to EA research problems and solutions.

9.2 Contributions
In this section we summarize the dissertation contributions to the practice and to the knowledge base.

The final contributions are to be understood as the outcomes of the third and fourth DSRM iterations,
respectively for the purposes of process assessment initiatives and process improvement initiatives. All
the other outcomes presented in this work are to be taken as interim contributions.

We start by defining the reasonable generalization scope, which stems from the reach and limitations
of the field studies that were performed, as well as from the evidence presented in this work:

• Field study scope:
o We have performed three field studies in two large (Portuguese-scale) public sector
organizations. However, note that these organizations are different in nature: one is
military and the other is non-military (civil).
o Therefore, we may make an argument regarding the likely applicability of the proposed
solution to other large public sector organizations.
• Goal efficacy scope:
o We have formally evaluated the goal efficacy using the following assertions:
- Process assessment initiatives: «the Solution is useful for improving the
effectiveness of Process Assessment initiatives»;
- Process improvement initiatives: «the Solution is useful for improving the
effectiveness of Process Improvement initiatives».
o The context of these assertions is limited to COBIT 5 GEIT initiatives.
• Agreement with the solution approach:
o The evaluators provided a high level of agreement regarding the assumptions:
- EA is useful for process assessment initiatives;
- EA is useful for process improvement initiatives.
o However, in the second DSRM iteration we found a counter-example instance, where
a reviewer fundamentally opposed the thesis assumptions.
o Therefore, based on the evidence that was presented, we cannot recommend
generalizing the solution to settings where key stakeholders fundamentally oppose
the thesis claims, i.e. that using EA solutions may assist in performing process
assessment or process improvement initiatives using COBIT 5.

It is important to clarify that this work is not focused on finding optimal solutions for detailed analysis
(such as dependency analysis) and other specific lower-level engineering optimization efforts. Instead,
it seeks to provide standards-based EA instruments, primarily for stakeholder-to-stakeholder (i.e.
human) communication, which may serve as a basis for improving self-awareness and interactive
synchronization efforts, for the benefit of GEIT stakeholders engaged in the typical auditing and
consulting activities. Nevertheless, the high-level views and viewpoints which we propose may serve
as a basis for architectural representations on top of which such detailed work may be designed and
performed.

Bearing in mind the above caveats, we may now describe the main contributions presented in this
dissertation.

9.2.1 Contributions to COBIT 5 and EA practice


This thesis work provided the following contributions to the COBIT 5 and EA practice:

• A set of EA viewpoints, for use in COBIT 5 process assessment and process improvement
initiatives, with the following characteristics:
o Their efficacy was validated in two large public sector organizations;
o Their constructs are based on ArchiMate (version 2.1);
o They may be integrated with other ArchiMate architectural descriptions that use
standard ArchiMate constructs;
o They were developed using a DSRM process model, DSRM-related evaluation
techniques, and taking into account objectives and requirements derived from COBIT 5
and TOGAF best practice.

These viewpoints are presented in "Appendix C: Third DSRM Iteration – Viewpoints" and
"Appendix D: Fourth DSRM Iteration – Viewpoints".

9.2.2 Contributions to the COBIT 5 and EA knowledge base


This thesis work provided the following contributions to the COBIT 5 and EA knowledge base:

• New knowledge:
o The formal evaluation ratings provide new sources of knowledge, which confirm that
organizations indeed appreciate the value of EA in enabling GEIT using COBIT 5.
o However, we found one occurrence of strong opposition concerning the usefulness
claims of EA for assisting COBIT 5 process assessments. Although we believe that
the stated argument is flawed, this occurrence may indicate that work needs to be
done regarding the promotion of the COBIT 5 holistic principle and the engagement of
auditors in EA activities.
o The evaluation ratings regarding the usefulness of the ArchiMate (version 2.1)
constructs were consistently lower than the usefulness ratings regarding the
viewpoints which were based on these same constructs. This may indicate that there
is room for improvement regarding the expressiveness of the ArchiMate language.
• Dissemination of knowledge:
o The demonstrations and group sessions enabled the dissemination of COBIT 5
knowledge and EA knowledge in two large public sector organizations.
o This dissemination of knowledge is relevant for the following reasons:
- These organizations are currently building their initial capabilities, both in
terms of COBIT 5 and of EA;
- The high evaluation ratings obtained in this work may encourage other similar
organizations to adopt the COBIT 5 framework for GEIT, as well as to improve
EA capabilities and adopt EA solutions.

Finally, this thesis work also enabled five research communication opportunities, as described in
Section 9.1.

9.3 Future Work


Based on the outcomes of this work, we may point to the following opportunities for related future
work:

• Conduct further DSRM work, integrating more aspects of the COBIT 5 framework, as well as
related guidance (e.g. ITIL and ISO 27000);
• Extend the scope and/or depth of the thesis work, e.g. by:
o Demonstrating and evaluating the proposed (or an enhanced) EA solution in more
public sector organizations;
o Demonstrating and evaluating the proposed (or an enhanced) EA solution in private
sector organizations, possibly comparing the results with those obtained in the
public sector domain;
o Demonstrating and evaluating the proposed (or an enhanced) EA solution in small or
medium-sized organizations, possibly comparing the results with those obtained in
larger organizations.
• Understand the shortcomings of the current ArchiMate version regarding its expressiveness
for GEIT use cases, and provide recommendations for a future version of the standard.
Bibliography

[1] The Open Group, "TOGAF Version 9.1," 2011.
[2] ISACA, "COBIT 5: A Business Framework for the Governance and Management of Enterprise IT," 2012.
[3] A. Hevner and S. Chatterjee, Design Research in Information Systems, Springer, 2010.
[4] K. Peffers, T. Tuunanen, M. A. Rothenberger and S. Chatterjee, "A Design Science Research Methodology for Information Systems Research," Journal of Management Information Systems, vol. 24, no. 3, 2007.
[5] A. Hevner, S. March, J. Park and S. Ram, "Design Science in Information Systems Research," MIS Quarterly, vol. 28, no. 1, March 2004.
[6] B. Cameron, N. Malik et al., "A Common Perspective on Enterprise Architecture," The Federation of Enterprise Architecture Professional Organizations (FEAPO), 2013.
[7] ISACA, "COBIT 5 Implementation," 2012.
[8] K. Laudon and J. Laudon, Management Information Systems, Pearson, 2012.
[9] P. Bernard, Foundations of ITIL 2011 Edition, Van Haren Publishing, 2012.
[10] B. Barafort, V. Betry, S. Cortina, M. Picard, M. St-Jean, A. Renault and O. Valdés, ITSM Process Assessment Supporting ITIL, J. Chittenden, Ed., Van Haren Publishing, 2009.
[11] ISO/IEC/IEEE, "42010:2011 Systems and software engineering — Architecture description," 2011.
[12] V. Vaishnavi and W. Kuechler, "Design Science Research in Information Systems," 2013.
[13] ISACA, "COBIT 5: Enabling Processes," 2012.
[14] ISACA, "COBIT 5: Enabling Information," 2013.
[15] ISACA, "COBIT 5 for Risk," 2013.
[16] ISACA, "COBIT 5 for Information Security," 2012.
[17] NIST - National Institute of Standards and Technology, "Framework for Improving Critical Infrastructure Cybersecurity - Version 1.0," 2014.
[18] ISACA, "Implementing the NIST Cybersecurity Framework," 2014.
[19] NIST - Joint Task Force Transformation Initiative, "Managing Information Security Risk: Organization, Mission, and Information System View, NIST Special Publication 800-39," 2011.
[20] J. Tribolet, J. Pombinho and D. Aveiro, "Organizational Self-Awareness: A Matter of Value," in Organization Design and Engineering: Coexistence, Cooperation or Integration, Palgrave Macmillan, 2014.
[21] M. Lankhorst et al., Enterprise Architecture at Work: Modelling, Communication and Analysis, third ed., Springer-Verlag, 2013.
[22] J. A. Zachman, "A framework for information systems architecture," IBM Systems Journal, vol. 26, no. 3, pp. 276-292, 1987.
[23] D. Greefhorst and E. Proper, Architecture Principles: The Cornerstones of Enterprise Architecture, Springer-Verlag, 2011.
[24] ISACA, "COBIT 5 for Assurance," 2013.
[25] ISACA, "COBIT Assessor Guide: Using COBIT 5," 2013.
[26] A. Uhl and L. A. Gollenia, Business Transformation Essentials, Gower, 2013.
[27] R. Abraham, J. Tribolet and R. Winter, "Transformation of Multi-level Systems – Theoretical Grounding and Consequences for Enterprise Architecture Management," in Advances in Enterprise Engineering VII, Springer, 2013.
[28] A. Uhl and L. A. Gollenia, Business Transformation Management Methodology, Gower, 2012.
[29] ISACA, "COBIT Self-assessment Guide: Using COBIT 5," 2013.
[30] ISACA, "COBIT Process Assessment Model (PAM): Using COBIT 5," 2013.
[31] The Open Group, "ArchiMate 2.1 Specification," 2013.
[32] H. Jonkers, I. Band, D. Quartel, H. Franken, M. Adams, P. Haviland and E. Proper, "Using the TOGAF 9.1 Architecture Content Framework with the ArchiMate 2.0 Modeling Language," The Open Group, 2012.
[33] E. Guldentops, "Where Have All the Control Objectives Gone?," ISACA Journal, vol. 4, 2011.
[34] W. Estrem, S. Gonzalez and S. Thorn, A Practitioner's Guide to Using the TOGAF Framework and the ArchiMate Language, The Open Group, 2014.
[35] S. Thorn, Viewpoints Mapping - TOGAF Framework and ArchiMate Modeling Language Harmonization, The Open Group, 2014.
[36] S. Thorn, Glossaries Comparison - TOGAF Framework and ArchiMate Modeling Language Harmonization, The Open Group, 2014.
[37] W. Estrem and S. Gonzalez, Content Metamodel Harmonization: Entities and Relationships, The Open Group, 2014.
[38] Archi, "Archi - The Free ArchiMate Modelling Tool," [Online]. Available: http://www.archimatetool.com/.
[39] Avolution, "Abacus Suite," [Online]. Available: http://www.avolution.com.au/.
[40] BiZZdesign, "BiZZdesign Architect," [Online]. Available: http://www.bizzdesign.com/.
[41] Corso, "Archimate 2.1 for IBM Rational System Architect," [Online]. Available: http://www.corso3.com/.
[42] HP, "HP Enterprise Maps," [Online]. Available: http://www.hp.com/.
[43] Orbus Software, "Official Website," [Online]. Available: http://www.orbussoftware.com/.
[44] Software AG, "ARIS Alfabet and Aris 9.7 Cloud," [Online]. Available: http://www.softwareag.com/.
[45] The Open Group, "ArchiMate 2 Tool Certification Register," [Online]. Available: http://www.opengroup.org/certifications/archimate/ts-register. [Accessed 09 01 2015].
[46] P. Mota, "Management of Organizational Competencies (master thesis report)," Instituto Superior Técnico, Portugal, 2011.
[47] A. Reis, "Key Performance Indicators Representation in ArchiMate Framework (master thesis report)," Instituto Superior Técnico, Portugal, 2012.
[48] A. Morais, "Aplicação de Ontologias à Representação e Análise de Modelos de Arquitetura Empresarial (master thesis report)," Instituto Superior Técnico, Portugal, 2013.
[49] M. Vicente, "Enterprise Architecture and ITIL (master thesis report)," Instituto Superior Técnico, Portugal, 2013.
[50] N. Silva, "Modeling a Process Assessment Framework in ArchiMate (master thesis report)," Instituto Superior Técnico, Portugal, 2014.
[51] G. Wierda, Mastering ArchiMate, Second Edition, R&A, The Netherlands, 2014.
[52] Y. Wand and R. Weber, "On the ontological expressiveness of information systems analysis and design grammars," Information Systems Journal, vol. 3, no. 4, 1993.
[53] P. Fettke and P. Loos, "Ontological evaluation of reference models using the Bunge-Wand-Weber Model," in Americas Conference on Information Systems (AMCIS), Tampa, FL, USA, 2003.
[54] D. Moody and G. Shanks, "Improving the quality of data models: empirical validation of a quality management framework," Information Systems, vol. 28, pp. 619-650, 2003.
[55] P. Johannesson and E. Perjons, An Introduction to Design Science, Springer Cham Heidelberg, 2014.
[56] J. Venable, J. Pries-Heje and R. Baskerville, "A Comprehensive Framework for Evaluation in Design Science Research," in 7th International Conference, DESRIST 2012, Las Vegas, NV, USA, 2012.
[57] N. Prat, I. Comyn-Wattiau and J. Akoka, "Artifact Evaluation in Information Systems Design-Science Research - a Holistic View," in PACIS 2014 Proceedings, Paper 23, 2014.
[58] H. Jonkers, I. Band and D. Quartel, "ArchiSurance Case Study," The Open Group, 2012.
[59] P. Johannesson and E. Perjons, An Introduction to Design Science, Springer Cham Heidelberg, 2014.

Appendixes
Appendix A: First DSRM Iteration - Constructs Mapping
Appendix Table 1: The COBIT 5 framework end goals.

COBIT®5 concept: The Governance Objective: Value Creation
COBIT®5 concept description: Enterprises exist to create value for their stakeholders. Consequently, any enterprise—commercial or not—will have value creation as a governance objective.
ArchiMate concept description: A goal is defined as an end state that a stakeholder intends to achieve.
ArchiMate notation: "Create optimal value from IT" (Goal)

COBIT®5 concept: Implement/Improve/Assure GEIT based on COBIT®5 adoption
COBIT®5 concept description: COBIT®5 framework goal: to help create optimal value from IT. COBIT 5 provides a comprehensive framework that assists enterprises in achieving their objectives for the governance and management of enterprise IT. The goals cascade is important because it allows the definition of priorities for implementation, improvement and assurance of governance of enterprise IT based on (strategic) objectives of the enterprise and the related risk.
ArchiMate concept description: A goal is defined as an end state that a stakeholder intends to achieve.
ArchiMate notation: "Implement/Improve/Assure GEIT based on COBIT 5" (Goal)

COBIT®5 concept: Principle 1: Meeting Stakeholder Needs
COBIT®5 concept description: The COBIT 5 framework is built on five basic principles. Principle 1: Meeting Stakeholder Needs—Enterprises exist to create value for their stakeholders by maintaining a balance between the realisation of benefits and the optimisation of risk and use of resources. COBIT 5 provides all of the required processes and other enablers to support business value creation through the use of IT. Because every enterprise has different objectives, an enterprise can customise COBIT 5 to suit its own context through the goals cascade, translating high-level enterprise goals into manageable, specific, IT-related goals and mapping these to specific processes and practices.
ArchiMate concept description: A principle is defined as a normative property of all systems in a given context, or the way in which they are realized.
ArchiMate notation: "Principle 1: Meeting Stakeholder Needs" (Principle)

COBIT®5 concept: Principle 2: Covering the Enterprise End-to-end
COBIT®5 concept description: The COBIT 5 framework is built on five basic principles. Principle 2: Covering the Enterprise End-to-end—COBIT 5 integrates governance of enterprise IT into enterprise governance: – It covers all functions and processes within the enterprise; COBIT 5 does not focus only on the 'IT function', but treats information and related technologies as assets that need to be dealt with just like any other asset by everyone in the enterprise. – It considers all IT-related governance and management enablers to be enterprisewide and end-to-end, i.e., inclusive of everything and everyone—internal and external—that is relevant to governance and management of enterprise information and related IT.
ArchiMate concept description: A principle is defined as a normative property of all systems in a given context, or the way in which they are realized.
ArchiMate notation: "Principle 2: Covering the Enterprise End-to-end" (Principle)

COBIT®5 concept: Principle 3: Applying a Single, Integrated Framework
COBIT®5 concept description: The COBIT 5 framework is built on five basic principles. Principle 3: Applying a Single, Integrated Framework—There are many IT-related standards and best practices, each providing guidance on a subset of IT activities. COBIT 5 aligns with other relevant standards and frameworks at a high level, and thus can serve as the overarching framework for governance and management of enterprise IT.
ArchiMate concept description: A principle is defined as a normative property of all systems in a given context, or the way in which they are realized.
ArchiMate notation: "Principle 3: Applying a Single, Integrated Framework" (Principle)

COBIT®5 concept: Principle 4: Enabling a Holistic Approach
COBIT®5 concept description: The COBIT 5 framework is built on five basic principles. Principle 4: Enabling a Holistic Approach—Efficient and effective governance and management of enterprise IT require a holistic approach, taking into account several interacting components. COBIT 5 defines a set of enablers to support the implementation of a comprehensive governance and management system for enterprise IT. Enablers are broadly defined as anything that can help to achieve the objectives of the enterprise. The COBIT 5 framework defines seven categories of enablers: Principles, Policies and Frameworks; Processes; Organisational Structures; Culture, Ethics and Behaviour; Information; Services, Infrastructure and Applications; People, Skills and Competencies.
ArchiMate concept description: A principle is defined as a normative property of all systems in a given context, or the way in which they are realized.
ArchiMate notation: "Principle 4: Enabling a Holistic Approach" (Principle)

COBIT®5 concept: Principle 5: Separating Governance From Management
COBIT®5 concept description: The COBIT 5 framework is built on five basic principles. Principle 5: Separating Governance From Management—The COBIT 5 framework makes a clear distinction between governance and management. These two disciplines encompass different types of activities, require different organisational structures and serve different purposes. COBIT 5's view on this key distinction between governance and management is: – Governance: ensures that stakeholder needs, conditions and options are evaluated to determine balanced, agreed-on enterprise objectives to be achieved; setting direction through prioritisation and decision making; and monitoring performance and compliance against agreed-on direction and objectives. In most enterprises, overall governance is the responsibility of the board of directors under the leadership of the chairperson. Specific governance responsibilities may be delegated to special organisational structures at an appropriate level, particularly in larger, complex enterprises. – Management: Management plans, builds, runs and monitors activities in alignment with the direction set by the governance body to achieve the enterprise objectives. In most enterprises, management is the responsibility of the executive management under the leadership of the chief executive officer (CEO).
ArchiMate concept description: A principle is defined as a normative property of all systems in a given context, or the way in which they are realized.
ArchiMate notation: "Principle 5: Separating Governance From Management" (Principle)

Appendix Table 2: The COBIT 5 Goals Cascade.

COBIT®5 concept: Stakeholder
COBIT®5 concept description: Appendix H - Glossary. Stakeholder: Anyone who has a responsibility for, an expectation from or some other interest in the enterprise — e.g., shareholders, users, government, suppliers, customers and the public. Internal stakeholders: Board; Chief executive officer (CEO); Chief financial officer (CFO); Chief information officer (CIO); Chief risk officer (CRO); Business executives; Business process owners; Business managers; Risk managers; Security managers; Service managers; Human resource (HR) managers; Internal audit; Privacy officers; IT users; IT managers; etc. External stakeholders: Business partners; Suppliers; Shareholders; Regulators/government; External users; Customers; Standardisation organisations; External auditors; Consultants; etc.
ArchiMate concept description: A stakeholder is defined as the role of an individual, team or organization (or classes thereof) that represents their interests in, or concerns relative to, the outcome of the architecture.
ArchiMate notation: "name" (Stakeholder)

COBIT®5 concept: Stakeholder driver
COBIT®5 concept description: Stakeholder needs are influenced by a number of drivers, e.g., strategy changes, a changing business and regulatory environment, and new technologies.
ArchiMate concept description: A driver is defined as something that creates, motivates, and fuels the change in an organization.
ArchiMate notation: "name" (Driver)

COBIT®5 concept: Stakeholder needs
COBIT®5 concept description: Stakeholder needs drive the governance objective of value creation: Benefits Realization; Risk optimisation; Resource Optimisation. Enterprises have many stakeholders, and 'creating value' means different—and sometimes conflicting—things to each of them. Governance is about negotiating and deciding amongst different stakeholders' value interests. By consequence, the governance system should consider all stakeholders when making benefit, risk and resource assessment decisions. For each decision, the following questions can and should be asked: For whom are the benefits? Who bears the risk? What resources are required?
ArchiMate concept description: A driver is defined as something that creates, motivates, and fuels the change in an organization.
ArchiMate notation: "name" (Driver)

COBIT®5 concept: Governance objective of value creation
COBIT®5 concept description: Stakeholder needs drive the governance objective of value creation: Benefits Realization; Risk optimisation; Resource Optimisation. The goals cascade is important because it allows the definition of priorities for implementation, improvement and assurance of governance of enterprise IT based on (strategic) objectives of the enterprise and the related risk. COBIT 5 defines 17 generic (enterprise) goals, as shown in figure 5, which includes the following information: the BSC dimension under which the enterprise goal fits; enterprise goals; the relationship to the three main governance objectives—benefits realisation, risk optimisation and resource optimisation ('P' stands for primary relationship and 'S' for secondary relationship, i.e., a less strong relationship). Appendix H - Glossary. Value creation: The main governance objective of an enterprise, achieved when the three underlying objectives (benefits realisation, risk optimisation and resource optimisation) are all balanced.
ArchiMate concept description: A goal is defined as an end state that a stakeholder intends to achieve.
ArchiMate notation: "name" (Goal)

COBIT®5 concept: Risk (NOTE: check also risk scenarios)
COBIT®5 concept description: The goals cascade is important because it allows the definition of priorities for implementation, improvement and assurance of governance of enterprise IT based on (strategic) objectives of the enterprise and the related risk. Appendix H - Glossary. Risk: The combination of the probability of an event and its consequence (ISO/IEC 73).
ArchiMate concept description: A driver is defined as something that creates, motivates, and fuels the change in an organization.
ArchiMate notation: "name" (Driver)

COBIT®5 concept: Enterprise Goals
COBIT®5 concept description: Stakeholder needs can be related to a set of generic enterprise goals. These enterprise goals have been developed using the balanced scorecard (BSC) dimensions, and they represent a list of commonly used goals that an enterprise may define for itself. Although this list is not exhaustive, most enterprise-specific goals can be mapped easily onto one or more of the generic enterprise goals. COBIT 5 defines 17 generic (enterprise) goals, as shown in figure 5, which includes the following information: the BSC dimension under which the enterprise goal fits; enterprise goals; the relationship to the three main governance objectives—benefits realisation, risk optimisation and resource optimisation ('P' stands for primary relationship and 'S' for secondary relationship, i.e., a less strong relationship).
ArchiMate concept description: A goal is defined as an end state that a stakeholder intends to achieve.
ArchiMate notation: "name" (Goal)

COBIT®5 concept: IT-related Goals
COBIT®5 concept description: Achievement of enterprise goals requires a number of IT-related outcomes, which are represented by the IT-related goals. IT-related stands for information and related technology, and the IT-related goals are structured along the dimensions of the IT balanced scorecard (IT BSC). COBIT 5 defines 17 IT-related goals.
ArchiMate concept description: A goal is defined as an end state that a stakeholder intends to achieve.
ArchiMate notation: "name" (Goal)

COBIT®5 concept: Enabler Goals
COBIT®5 concept description: Enablers include processes, organisational structures and information, and for each enabler a set of specific relevant goals can be defined in support of the IT-related goals.
ArchiMate concept description: A goal is defined as an end state that a stakeholder intends to achieve. A requirement is defined as a statement of need that must be realized by a system. (See the paper "Where have all the CO gone?"; CO -> control requirements, i.e. practices. See also COBIT 5: Enabling Processes, figure 13.)
ArchiMate notation: "name" (Goal); "name" (Requirement)

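As an illustration only (not part of the original constructs mapping), the goals cascade captured in Appendix Table 2 can be read as an ordered chain of COBIT 5 concepts, each modelled with an ArchiMate motivation construct. The sketch below encodes that reading; the ordering and pairings simply restate the table above.

# Hypothetical sketch (Python): the COBIT 5 goals cascade of Appendix Table 2 as
# a chain of (COBIT 5 concept, ArchiMate construct) pairs, in cascade order.
GOALS_CASCADE = [
    ("Stakeholder", "Stakeholder"),
    ("Stakeholder drivers", "Driver"),
    ("Stakeholder needs", "Driver"),
    ("Governance objective of value creation", "Goal"),
    ("Enterprise goals", "Goal"),
    ("IT-related goals", "Goal"),
    ("Enabler goals", "Goal / Requirement"),
]

if __name__ == "__main__":
    for concept, construct in GOALS_CASCADE:
        print(f"{concept} -> modelled as ArchiMate {construct}")
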
Appendix Table 3: Pain Points and Trigger Events.

COBIT®5 concept: Pain point
COBIT®5 concept description: Examples of some of the typical pain points for which new or revised governance or management of IT enablers can be a solution (or part of a solution), as identified in COBIT 5 Implementation, are:
• Business frustration with failed initiatives, rising IT costs and a perception of low business value
• Significant incidents related to IT risk, such as data loss or project failure
• Outsourcing service delivery problems, such as consistent failure to meet agreed-on service levels
• Failure to meet regulatory or contractual requirements
• IT limiting the enterprise's innovation capabilities and business agility
• Regular audit findings about poor IT performance or reported IT quality of service problems
• Hidden and rogue IT spending
• Duplication or overlap between initiatives or wasting resources, such as premature project termination
• Insufficient IT resources, staff with inadequate skills or staff burnout/dissatisfaction
• IT-enabled changes failing to meet business needs and delivered late or over budget
• Board members, executives or senior managers who are reluctant to engage with IT, or a lack of committed and satisfied business sponsors for IT
• Complex IT operating models
ArchiMate concept description: A driver is defined as something that creates, motivates, and fuels the change in an organization.
ArchiMate notation: "name" (Driver)

COBIT®5 concept: Trigger
COBIT®5 concept description: Events in the enterprise's internal and external environment can signal or trigger a focus on the governance and management of IT. Examples from chapter 3 in the COBIT 5 Implementation publication are:
• Merger, acquisition or divestiture
• A shift in the market, economy or competitive position
• A change in the business operating model or sourcing arrangements
• New regulatory or compliance requirements
• A significant technology change or paradigm shift
• An enterprisewide governance focus or project
• A new CEO, CFO, CIO, etc.
• External audit or consultant assessments
• A new business strategy or priority
ArchiMate concept description: A driver is defined as something that creates, motivates, and fuels the change in an organization.
ArchiMate notation: "name" (Driver)

Appendix B: First DSRM Iteration - Viewpoints

Appendix Figure 1: Goals Cascade viewpoint.

Appendix Figure 2: Enabling Process Performance viewpoint.

Appendix C: Third DSRM Iteration – Viewpoints

Appendix Figure 3: COBIT 5 Goals and Principles

Appendix Figure 4: Principle 1 (Meeting Stakeholder Needs)

Appendix Figure 5: Enterprise Goals

Appendix Figure 6: IT-related Goals

Appendix Figure 7: IT-related goals and base practices, for process APO12 Manage Risk

Appendix Figure 8: Enabling Processes

Appendix Figure 9: Enabling Processes Stakeholders, Goals and Requirements View, for process APO12 Manage Risk

Appendix Figure 10: Process Capability Level 1 Performed process, for process APO12 Manage Risk

Appendix Figure 11: Activities View, for process APO12 Manage Risk

Appendix Figure 12: Work Products (Inputs and Outputs) View for APO12 Manage Risk

Appendix Figure 13: Work Products (Outputs) View for APO12 Manage Risk

Appendix Figure 14: Generic Work Products (GWPs), for process APO12 Manage Risk

Appendix Figure 15: Process Capability Level 2 Managed Process, for process APO12 Manage Risk

Appendix Figure 16: Process Capability Level 3 Established process, for process APO12 Manage Risk

Appendix Figure 17: Process Capability Level 4 Predictable process, for process APO12 Manage Risk

Appendix Figure 18: Process Capability Level 5 Optimizing process, for process APO12 Manage Risk

Appendix Figure 19: Generic Work Products for Process Capability Level 2: Managed Process, for process APO12 Manage Risk

Appendix Figure 20: Generic Work Products for Process Capability Level 3: Established process, for process APO12 Manage Risk

Appendix D: Fourth DSRM Iteration – Viewpoints

Appendix Figure 21: Process Capability Improvement and GEIT View, for APO12 Manage Risk

Appendix Figure 22: Process Capability Improvement View, for process APO12 Manage Risk

Appendix Figure 23: Process Capability Improvement View, Level 0 to Level 1, for process APO12 Manage Risk

Appendix Figure 24: Process Capability Improvement View, Level 1 to Level 2, for process APO12 Manage Risk

Appendix Figure 25: Process Capability Improvement View, Level 2 to Level 3, for process APO12 Manage Risk

Appendix Figure 26: Process Capability Improvement View, Level 3 to Level 4, for process APO12 Manage Risk

Appendix Figure 27: Process Capability Improvement View, Level 4 to Level 5, for process APO12 Manage Risk