
DATA GOVERNANCE MATURITY ASSESSMENT TOOL: A DESIGN SCIENCE APPROACH

Philippe Marchildon, Simon Bourdeau, Pierre Hadaya, Aldrin Labissière

De Boeck Supérieur | « Projectics / Proyéctica / Projectique »

2018/2 n° 20 | pages 155 to 193


ISSN 2031-9703
ISBN 9782807392373
DOI 10.3917/proj.020.0155
Article available online at: https://www.cairn.info/revue-projectique-2018-2-page-155.htm

Electronic distribution by Cairn.info for De Boeck Supérieur.

© De Boeck Supérieur. All rights reserved for all countries.
© De Boeck Supérieur | Téléchargé le 16/08/2022 sur www.cairn.info (IP: 114.10.25.230)




DATA GOVERNANCE MATURITY ASSESSMENT TOOL: A DESIGN SCIENCE APPROACH

HERRAMIENTA DE EVALUACIÓN DE LA MADUREZ DE GOBERNABILIDAD DE DATOS: UN ENFOQUE DE LA CIENCIA DEL DISEÑO

OUTIL D’ÉVALUATION DE LA MATURITÉ DE LA GOUVERNANCE DES DONNÉES : UNE APPROCHE FONDÉE SUR LA SCIENCE DU DESIGN

Philippe Marchildon, Ph.D.1




Professor, Department of Management and Technology
École des sciences de la gestion (ESG)
Université du Québec à Montréal (UQAM)

Simon Bourdeau, Ph.D., PMP


Professor, Department of Management and Technology
École des sciences de la gestion (ESG)
Université du Québec à Montréal (UQAM)

Pierre Hadaya, Ph.D.


Professor, Department of Management and Technology
École des sciences de la gestion (ESG)
Université du Québec à Montréal (UQAM)

Aldrin Labissière, M.Sc.


Ministère de l’Immigration, de la Diversité et de l’Inclusion du Québec
Montreal (Quebec), Canada

1. Corresponding author: marchildon.philippe@uqam.ca

proyéctica / projectics / projectique – n° 20 155



ABSTRACT

Nowadays, data have become strategic assets by allowing organizations to uncover unforeseen patterns and develop sharper insights about their customers and partners as well as the markets and environments in which they operate. To properly manage their data, organizations rely on a data governance framework (DGF) that defines the processes, policies, practices and structures necessary to orchestrate and optimize the collection, storage, use and dissemination of data as organizational assets. Yet, most organizations fail to implement a DGF adapted to their needs since they ignore the level of maturity of their data management practice and thus, do not know where to start when implementing a DGF. To help organizations evaluate their operations against data governance best practices as well as identify key gaps and develop, deploy and/or improve their DGF accordingly, the present paper develops, using a design science research approach, a data governance maturity assessment tool. Our proposed artifact, which includes 11 dimensions and 72 questions, allows organizations to assess where they stand in terms of data governance and, in turn, to better define and prioritize the goals, content and activities of their data governance initiatives.

Keywords: data governance, data management, design science, maturity assessment tool

RESUMEN

Hoy en día, los datos se han convertido en activos estratégicos al permitir a las organizaciones descubrir patrones imprevistos y desarrollar perspectivas más precisas sobre sus clientes y socios, así como sobre los mercados y entornos en los que operan. Para gestionar adecuadamente sus datos, las organizaciones se basan en un marco de gobernanza de datos (DGF, por sus siglas en inglés) que define los procesos, políticas, prácticas y estructuras necesarias para orquestar y optimizar la recolección, almacenamiento, uso y difusión de datos como activos de la organización. Sin embargo, la mayoría de las organizaciones no logran implementar un DGF adaptado a sus necesidades, ya que ignoran el nivel de madurez de su práctica de gestión de datos y, por lo tanto, no saben por dónde empezar cuando implementan un DGF. Para ayudar a las organizaciones a evaluar sus operaciones en relación con las mejores prácticas de gobernabilidad de datos, así como a identificar las brechas claves y desarrollar, implementar y/o mejorar en consecuencia su DGF, el presente documento desarrolla, con un enfoque de ciencia del diseño, una herramienta de evaluación de la madurez de gobernabilidad de datos. Nuestro artefacto propuesto, que incluye 11 dimensiones y 72 preguntas, permite a las organizaciones evaluar su posición en términos de gobernabilidad de datos y, a su vez, definir y priorizar mejor los objetivos, contenidos y actividades de sus futuras iniciativas de gobernabilidad de datos.

Palabras clave: gobernabilidad de datos, gestión de datos, ciencia del diseño, herramienta de evaluación de madurez




RÉSUMÉ

De nos jours, les données sont devenues des atouts stratégiques puisqu’elles permettent désormais aux organisations de découvrir de nouvelles tendances ainsi que de développer une connaissance plus approfondie de leurs clients et de leurs partenaires, ainsi que de l’environnement et des marchés dans lesquels elles opèrent. Pour gérer correctement leurs données, les organisations s’appuient sur un cadre de gouvernance des données (CGD) qui définit les processus, les politiques, les pratiques et les structures nécessaires pour orchestrer et optimiser la collecte, le stockage, l’utilisation et la diffusion de données en tant qu’atouts organisationnels. Cependant, la plupart des organisations ne parviennent pas à mettre en place un CGD adapté à leurs besoins puisqu’elles ignorent le niveau de maturité de leurs pratiques de gestion des données et ne savent donc pas par où commencer lors de la mise en œuvre d’un CGD. Afin d’aider les organisations à évaluer leurs opérations par rapport aux meilleures pratiques en gouvernance des données et de les aider à développer, déployer et/ou améliorer leur CGD en fonction de leurs principales lacunes, le présent article développe, à l’aide d’une approche de la recherche en science du design, un outil d’évaluation de la maturité de la gouvernance des données. L’artefact proposé, qui comprend 11 dimensions et 72 questions, permet aux organisations d’évaluer où elles se situent en matière de gouvernance des données et, ainsi, de mieux définir et hiérarchiser les objectifs, le contenu et les activités de leurs initiatives en gouvernance des données.

Mots-clés : gouvernance des données, gestion des données, science du design, outil d’évaluation de la maturité


INTRODUCTION
In today’s digital age, data have become strategic and valuable assets by
allowing organizations to uncover unforeseen patterns and develop sharper
insights about their customers and partners as well as the markets and environments in which they operate (DalleMule & Davenport, 2017; Khatri &
Brown, 2010; Lee, Madnick, Wang, Wang, & Zhang, 2014). Indeed, data are
used to improve organizational products/services and customer service as well as to support strategic decision-making and business intelligence (Alhassan, Sammon, & Daly, 2018; Fleckenstein & Fellows, 2018; Khatri & Brown, 2010; Ross, Beath, & Quaadgras, 2013). However, for data to be potent assets, they need to be of
good quality and well-managed (Fleckenstein & Fellows, 2018; Gregory, 2011;
Redman, 2013). This is no easy feat since data are ubiquitous and organiza-
tions rely on as much timely and precise data as possible to make effective and
efficient decisions (Davenport, 2013; Fleckenstein & Fellows, 2018).
To properly improve and maintain the quality of their data, organizations
should put in place a data governance framework (DGF) that encompasses the
processes, policies, practices and structures necessary to orchestrate their
people, processes and technologies and optimize the collection, storage, use and dissemination of data (DAMA International, 2014; Ladley, 2012; Soares, 2010, 2012). A DGF is about “the formulation of policy to optimize, secure, and
leverage information as an enterprise asset by aligning the objectives of mul-
tiple functions” (Soares, 2014, p. 3). It has to do with the decision rights and
responsibilities regarding the management of data assets within an organi-
zation (Otto, 2011) and rests on three pillars: people, process and technology
(Soares, 2014). Accordingly, a DGF allows an organization to know where its
data are, how they are used, where and when they are combined with other
data or assets and, ultimately, what they are worth.
Yet, most organizations fail to deploy a DGF adapted to their needs. One
of the main reasons for such failures is that most organizations ignore their
level of data governance maturity. By data governance maturity we mean the
extent to which an organization has developed and deployed the (1) processes,
(2) policies, (3) practices and (4) structures necessary to optimize the collec-
tion, storage, use and dissemination of its data (as an organizational asset). As
such, by not knowing where they stand in terms of data governance, organi-
zations do not know where to start when implementing a DGF (Fleckenstein
& Fellows, 2018). To solve this issue, various approaches for developing and
deploying a DGF have been proposed (e.g. DAMA International, 2009; Ladley,
2012; Soares, 2010; Thomas, 2006). For instance, the Capability Maturity Model
Integration (CMMI) Institute has developed the Data Management Maturity
(DMM) model, which defines the fundamental business processes and spe-
cific capabilities necessary for the management of an organization’s data
assets (CMMI Institute, 2014). In addition, by proposing five levels of matu-
rity for each of the processes and capabilities it identifies (i.e., performed,
managed, defined, measured, optimized), the DMM model presents a gra-
dated or evolutionary path to implement a DGF. Nevertheless, most of these approaches are incomplete as they do not provide the necessary tools to help
organizations accurately assess their own level of maturity in terms of data
governance. Take for example the DMM model discussed previously. It provides concrete examples of requirements and activities that correspond to
each level of data governance maturity, but it does not provide any tools to help
organizations assess their own level of maturity with regard to these requirements and activities. As such, it is still difficult for organizations to evaluate
their own data governance processes, practices, policies and structures.
To help organizations (1) evaluate their operations against the best prac-
tices of data governance, (2) identify key gaps, and (3) develop, deploy and/or
improve their DGF accordingly, the present paper develops, using a design
science research (DSR) approach (Hevner, March, Park, & Ram, 2004; Peffers,
Tuunanen, Rothenberger, & Chatterjee, 2007), a data governance maturity
(DGM) assessment tool. DSR differs from traditional behavioral science approaches, which are used to explain human/organizational phenomena as
well as to develop theories (Hevner et al., 2004; March & Smith, 1995). DSR
aims to create and evaluate artifacts and tools to solve problems identified in organizations (Peffers et al., 2007). As data management is one of the most important
issues in the current technological landscape (Luftman et al., 2015) and hav-
ing established the lack of key tools to help organizations assess their level
of maturity when developing and deploying a DGF, our proposed DGM assess-
ment tool, which includes 11 dimensions and 72 questions, should allow organizations to assess their level of maturity in terms of data governance and, in turn, to better define and prioritize the goals, content and activities of
their data governance initiatives.
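Since the artifact is described at this point only as a questionnaire with 11 dimensions and 72 questions, a small sketch may help illustrate how such an instrument can be scored. The snippet below is purely illustrative and rests on assumptions the paper does not state: the dimension names shown, a 1-to-5 response scale, and a simple per-dimension average are all hypothetical, not the authors' published scoring method.

```python
from statistics import mean

def dimension_scores(responses: dict[str, list[int]]) -> dict[str, float]:
    """Average the 1-5 question scores within each assessment dimension.

    Hypothetical scoring rule for illustration only; the authors' tool
    may aggregate its 72 questions differently.
    """
    for dim, answers in responses.items():
        if not all(1 <= a <= 5 for a in answers):
            raise ValueError(f"answers in '{dim}' must be on the 1-5 scale")
    return {dim: round(mean(answers), 2) for dim, answers in responses.items()}

# Example with two hypothetical dimensions and four questions each:
scores = dimension_scores({
    "Data Quality Management": [3, 4, 2, 3],
    "Stewardship": [1, 2, 2, 1],
})
```

An organization scoring 3.0 on one dimension and 1.5 on another immediately sees where to focus its efforts, which is the use the paper envisions for the tool.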
The remainder of this article is structured as follows. First, we present a
brief literature review on data governance and maturity models. The following
section presents each step and related outputs of the methodology we used to
create and evaluate our data governance maturity assessment tool. The paper
concludes by highlighting the study’s contributions, limitations, and directions
for future research.

LITERATURE REVIEW2
Data governance is a growing trend in today’s business environment and it
has generated substantial interest from both practitioners and researchers
(Alhassan, Sammon, & Daly, 2016; Logan, Popkin, & Faria, 2016). It is based on
the idea that data are a valuable organizational asset which must be maintained
(Otto, 2011). Accordingly, the overarching goal of data governance is to ensure
“strong participation across the organization for critical decisions affecting
the data assets” (CMMI Institute, 2014, p. 43) and to warrant an effective over-
sight of data management practices. Hence, data governance aims to specify,
in the form of guidelines and rules for data management, who within an orga-
nization can make what decisions regarding the manipulation of data, and what tasks these decisions entail. Data governance can thus be
seen as the exercise of decision-making and authority for everything related
to organizational data (Thomas, 2006). It involves “the formulation of policy to
optimize, secure, and leverage information as an enterprise asset by aligning
the objectives of multiple functions” (Soares, 2014, p. 3) and it is often implemented in the
form of a companywide data governance framework (DGF) that assigns deci-
sion-related rights and duties to particular individuals to encourage desirable
behaviors regarding data use across an organization (Begg & Caira, 2012;
Khatri & Brown, 2010; Otto, 2011). A good DGF aligns the data-related processes, policies, practices and structures with the organizational mission,
strategy, values, norms and cultures. While a good DGF will allow organiza-
tions to improve and maintain the quality of their data, a bad DGF will engen-
der negative consequences that will be felt in several business areas and at
various hierarchical levels of the organization (Khatri & Brown, 2010).
Over time, several DGFs have been proposed. For instance, Khatri and Brown (2010) proposed a DGF structured around five decision areas: (1) Data
Principles to clarify the role of data as an asset, (2) Data Quality to establish

2. A literature review on data governance tools as well as the maturity models was conducted. Methodological
recommendations suggested by different researchers were followed (Rowe, 2014; Templier & Paré, 2015; Webster
& Watson, 2002). Keywords related to the field of study, namely data governance, governance framework, maturity
models, tools, assessment and evaluation were used in combination. In the hopes of identifying both academic and
professional articles, several databases were consulted: JSTOR, ACM Digital Library, ABI ProQuest aggregator and
Google Scholar. The inclusion criteria used were: governance framework and assessment tools. Results of all these
searches did not permit us to identify a single scientific article presenting such a tool. For that reason, our focus then
shifted to the tools developed in the professional field.


the prerequisites for the expected use of the data, (3) Metadata to define the
semantics of the data, (4) Data Access to determine the prerequisites and
rules for accessing data, and (5) Data Lifecycle to establish the definition, pro-
duction, retention and archiving of data.
The Data Governance Institute (DGI) has also proposed a DGF, which is
articulated around six key data governance areas: (1) Policy, Standards and
Strategy (i.e., formal data policies supported by cross-functional data stew-
ards), (2) Data Quality (i.e., data quality criteria and monitoring systems),
(3) Privacy, Compliance and Security (i.e., privacy, compliance and security
data programs mandated by management), (4) Architecture and Integration
(i.e., data needs tied to architecture and integration challenges), (5) Data
Warehouses and Business Intelligence (i.e., data warehouses and BI pro-
grams), and (6) Management Support (i.e., data programs focusing on getting
managerial support) (Thomas, 2006).
Yet, to this day, the most comprehensive DGF is certainly the one proposed
by Soares (2010) which is articulated around 11 data governance competences
or categories:
1. Data Risk Management and Compliance. A method by
which risks are identified, qualified, quantified, avoided,
accepted, mitigated, or transferred out;
2. Value Creation. A process by which data assets are qual-
ified and quantified to enable the business to maximize
the value created by data assets;
3. Organizational Structures and Awareness. The level of
mutual responsibility between business and IT, and the
© De Boeck Supérieur | Téléchargé le 16/08/2022 sur www.cairn.info (IP: 114.10.25.230)

© De Boeck Supérieur | Téléchargé le 16/08/2022 sur www.cairn.info (IP: 114.10.25.230)


recognition of fiduciary responsibility to govern data at
different levels of management;
4. Stewardship. A quality-control discipline designed to
ensure the custodial care of data for asset enhancement,
risk mitigation, and organizational control;
5. Policy. The written articulation of desired organizational
behavior;
6. Data Quality Management. The methods used to mea-
sure, improve, and certify the quality and integrity of pro-
duction, test, and archival data;
7. Information Lifecycle Management. A systematic policy-
based approach for information collection, use, reten-
tion, and deletion;
8. Information Security and Privacy. The policies, practices,
and controls used by an organization to mitigate risk and
protect data assets;
9. Data Architecture. The architectural design of struc-
tured and unstructured data systems and applications
that makes data available to appropriate users;
10. Classification and Metadata. The methods and tools used
to create common semantic definitions for business and
IT terms, data models, and repositories;

11. Audit Information Logging and Reporting. The organizational processes for monitoring and measuring data
value and risks as well as the effectiveness of data gov-
ernance.
Developing and deploying a DGF is a demanding effort; to succeed, it is essential to identify the strategic objectives underlying the data governance initiatives. According to Tallon et al. (2013), organizations will generally deploy governance frameworks for two main reasons. First, to maximize
the value of their data by ensuring that it is reliable, secure and available for
operations and business decision-making. Second, to protect their data from
human errors, inappropriate uses, or mishaps of any kind. When developing
a DGF, organizations need to address several key questions regarding the
management of their data. For example, how does the organization manage
data? How are the data used to support decision-making? Which data sources
are used and how are they managed? Which employees are responsible for
extracting, evaluating and reporting the data? Do employees know and under-
stand the data, sources and measures well enough? (Tallon et al., 2013). Then,
when the time comes to deploy their DGF, organizations may adopt either a big bang approach, which tackles all the data governance processes, policies, practices and structures at once, or a progressive, phase-by-phase approach, where the key processes, policies, practices and/or structures of the DGF are selected and prioritized based on the organization’s data
governance maturity level (CMMI Institute, 2014; Fleckenstein & Fellows,
2018; Khatri & Brown, 2010; Soares, 2010). According to several research-
ers and practitioners, adopting the second approach is wiser as it yields the
best chances of success (CMMI Institute, 2014; Fleckenstein & Fellows, 2018;
Khatri & Brown, 2010; Soares, 2010). Hence, when developing and deploying
a DGF, one of the most important challenges for an organization is to determine its level of data governance maturity, as this level reveals its strengths and weaknesses and helps prioritize the DGF elements to be developed and
deployed.
The concept of maturity is based on the Capability Maturity Model (CMM) developed in the late 1980s by the Software Engineering Institute (SEI) to offer a methodology for developing and optimizing software development processes.
Within this particular model, the notion of maturity refers to the degree to
which an organization has explicitly and consistently deployed processes
that are documented, managed, measured, controlled, and continually
improved (CMMI Institute, 2014). Usually, the higher the organizational maturity, the higher the degree of formality and optimization of the related processes, practices and/or structures. Accordingly, the concept of maturity allows organizations to gradually introduce the best practices related to
a specific domain (e.g., data governance) and to continuously improve spe-
cific processes and practices. To do so, organizations must develop an action
plan and deploy initiatives to “mature” from one level to another. Several
authors have identified the advantages of maturity models as they provide
structured frameworks for defining (1) a starting point for comparing oneself,
(2) an assessment of the current situation, (3) a prioritization of actions to be
undertaken, and (4) an evaluation methodology to record changes in the level
of maturity (CMMI Institute, 2014; PMI, 2003).


The original CMM identifies five levels of maturity:


1. Performed (level 1). Processes are performed in an ad
hoc fashion without planning. Processes are typically not
applied across all organizational areas and are primarily reactive. Foundational improvements may exist but are limited, and no standard or common approach to managing or sharing organizational data exists.
2. Managed (level 2). Processes are planned and executed in accordance with a policy. Processes are monitored and controlled by relevant stakeholders. Processes are not used
in all organizational functions. There is an appearance of
data governance in the organization with a data integra-
tion platform and a concern for data quality.
3. Defined (level 3). Standardized processes are consistently
followed and specific processes are tailored according to
organizational guidelines. The organization has defined
a DGF where business rules are captured by a data
management service.
4. Measured (level 4). Process metrics are defined and used,
and process performance is managed across the orga-
nization. The organization has defined and implemented
a DGF with a data governance center and a data quality
approach.
5. Optimized (level 5). Process performance is optimized,
improvements are identified and best practices are
shared. Quantitative objectives for business processes
improvements have been established and are continually
revised to adjust for changing business objectives. The
organization is confident, agile and intelligent. The infor-
mation is shared in full transparency.
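The five levels above form an ordinal scale, which makes gap analysis straightforward once each dimension of a DGF has been rated. The sketch below is an illustration, not part of the CMM or of the authors' tool: the dimension names are invented for the example, and the "lowest level first" ranking rule is an assumption.

```python
# Ordinal encoding of the five CMM maturity levels described above.
CMM_LEVELS = {"performed": 1, "managed": 2, "defined": 3,
              "measured": 4, "optimized": 5}

def weakest_dimensions(assessment: dict[str, str], n: int = 3) -> list[str]:
    """Return the n dimensions rated at the lowest maturity levels,
    i.e. natural candidates to prioritize in a phased DGF rollout."""
    return sorted(assessment, key=lambda d: CMM_LEVELS[assessment[d].lower()])[:n]

# Hypothetical assessment of four dimensions:
priorities = weakest_dimensions({
    "Policy": "defined",
    "Data Architecture": "performed",
    "Stewardship": "managed",
    "Value Creation": "measured",
}, n=2)
```

Ranking dimensions this way mirrors the progressive, phase-by-phase deployment strategy the paper discusses, in which the weakest areas are tackled first.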
Recently, the CMMI Institute adapted the original CMM to the domain of
data governance. This new model, labelled data management maturity (DMM)
model, was developed using the same principles and structure of the original
model. However, instead of providing guidelines for software development processes like the original CMM, the new DMM model provides a set
of best practices to help organizations evaluate their own capabilities in terms
of data governance (CMMI Institute, 2014). This new DGF, which includes
the notion of maturity, is articulated around three data governance areas:
(1) Governance management which encompasses the activities facilitating
collaborative decision making and effectively implementing the “building, sustaining and compliance functions of governance bodies” (p. 43), (2) Business
glossary which encompasses the practices supporting a shared understanding of the ever-expanding business terms as well as supporting the prioritization and development of new data terms, and (3) Metadata management
which encompasses the practices supporting the architecting, planning, pop-
ulating and management of the metadata repository within the organization.
Other models similar to the DMM model have also been developed to guide
organizations in the appraisal of their data governance capabilities (DAMA International, 2009; Soares, 2010). Unfortunately, even though they include the notion of maturity and provide a gradated or evolutionary path to develop
and deploy an adequate DGF, these models, including the one from the CMMI,
do not provide any tool to help organizations accurately assess their own level
of data governance maturity.
Taken as a whole, the literature on data governance has several strengths
and one important shortcoming. On the plus side, this literature provides a
definition of what data governance is and of its purpose. It also identifies
the key dimensions that a good DGF should encompass. In addition, it high-
lights the importance of the maturity concept in any data governance initiative
as it provides a sound gradated or evolutionary path to properly implement a
DGF. On the minus side, past efforts do not provide the necessary tools to help
organizations accurately assess their own level of maturity in terms of data
governance. In the next section of this paper, we explain how we used a DSR
approach to develop and test such a tool.

RESEARCH METHOD AND RESULTS


A DSR approach is adopted here to develop and test our new artifact: a data
governance maturity (DGM) assessment tool. Specifically, the six steps of the
DSR approach proposed by Peffers et al. (2007) were followed: (1) Problem
identification and motivation, (2) Definition of the objectives for a solu-
tion, (3) Design and development, (4) Demonstration, (5) Evaluation, and
(6) Communication. These steps were chosen since they are clear and simple and also because they cover the key requirements of DSR as stated by Hevner et al. (2004). In addition, throughout steps 1, 4 and 5, members of the
industry and experienced domain experts (e.g., IT security analysts, informa-
tion resources experts, business analysts, IT managers) were consulted and
asked to provide key suggestions, comments and feedback in order to help us
develop our new artifact. More precisely, members of the industry and experi-
enced domain experts were consulted to clarify the artifact’s objectives and to
assess its usefulness, relevancy, reliability, validity and effectiveness (Gregor
& Hevner, 2013; Hevner et al., 2004). They were identified via personal con-
tacts and semi-structured interviews were used to collect their suggestions,
comments and feedback during these steps. For each interview, research
notes were taken and then transcribed immediately after each interview. The
following paragraphs detail what was done and the key findings of each step.

Step 1 – Problem identification and motivation


During step 1, exploratory interviews were conducted with eleven members
of the industry3 to assess (1) whether data governance was a preoccupation

3. Amongst these eleven members of the industry, 3 were project managers, 1 was an IT security analyst, 4 were
business analysts, 1 was a marketing analyst and 3 were managers. They had, on average, thirteen years of work
experience.


for them, (2) what were the challenges they faced regarding data governance,
(3) whether they knew what were their organization’s strengths and weak-
nesses regarding their data governance processes, policies, practices and
structures, (4) if they knew which data governance initiatives they should pri-
oritize and what each should address, and (5) if an DGM assessment tool could
help them develop and deploy a DGF tailored to their needs.
Amongst key findings from this step, transcripts from the exploratory
interviews indicate that all respondents agreed that data were now considered
strategic assets and thus that data governance had become a central organi-
zational preoccupation. Respondents also mentioned that several data gover-
nance rollout projects failed in the past. Specifically, respondents explained
that, although data governance projects were needed and important for their
organization, these projects failed because they were misaligned with the
organization’s needs and context and because business units had diverging
views regarding the issues these projects should address as well as how
and when these projects should be conducted. In addition, respondents men-
tioned that leaders of data governance projects had a hard time determining
their projects’ goals and contents, as well as prioritizing their data governance
initiatives since they did not know exactly what their organization’s strengths
and weaknesses in terms of data governance were. In other words, respon-
dents mentioned that leaders of data governance projects did not know about
their organization’s level of data governance maturity, which, in turn, seriously
undermined their ability to identify the data governance processes, policies,
practices and structures to be developed and deployed.
To help them develop, deploy and improve the data governance processes, policies, practices and structures of their organization, respondents mentioned that they used various data governance methodologies (e.g. Ladley, 2012; Soares, 2010, 2014) as a guide. However, respondents indicated that these methodologies did not provide any tool to assess their organization’s maturity level in terms of data governance. Respondents said that such a tool would be more than welcome as it would help them pinpoint the strengths and weaknesses of their actual DGF. As such, respondents saw our DSR effort as a way to obtain such a tool and a means by which they could adequately define and prioritize the data governance content to be developed and deployed within their organization.
Finally, respondents also mentioned that most of the data governance
strategies put in place by their organization were outdated and maladapted to
actual needs. They mentioned that this situation was due to the fact that their
organization had no tool to evaluate the performance of their current data gov-
ernance strategies.

Step 2 – Define the objectives of a Solution


During step 2, the findings from our interviews conducted in Step 1 and the knowledge gathered during a review of the relevant literature were used to infer the objectives of our artifact. In broad terms, we wanted to develop an artifact to help organizations assess their own level of data governance maturity. Specifically, our objective was threefold. First, our artifact should help organizations know, before the realization of their data governance initiatives, which data governance processes, policies, practices and/or structures should be developed and prioritized. Second, our artifact should also help
organizations evaluate, after the implementation of data governance initia-
tives, if those initiatives have allowed them to evolve in terms of data gover-
nance maturity. Third, our artifact should be aligned to already existing data
governance maturity frameworks (CMMI Institute, 2014; Soares, 2010) and
data governance methodologies (Ladley, 2012; Soares, 2014). With this aim
and objective in mind, we elected to design a data governance maturity (DGM)
assessment tool.

Step 3 – Design and Development


During step 3, because no such tool had been previously developed, we used a
“general solution” strategy to develop the first version of the DGM assessment
tool (Iivari, 2015, p. 111). To do so, we anchored our efforts on Soares’ (2010) com-
prehensive DGF that identifies 11 data governance dimensions. We elected to
anchor our efforts on this particular DGF since the 11 dimensions it includes are
sufficiently detailed to be manageable and useful for practitioners. In addition,
these 11 dimensions overlap with the six key data governance areas proposed
by the Data Governance Institute (Thomas, 2006) as well as the three data gov-
ernance areas proposed by the CMMI (CMMI Institute, 2014). Accordingly, we
were confident that our DGM assessment tool would not omit any important
aspects of data governance and that it would be compatible with existing data
governance methodologies. However, since none of the DGFs identified in the academic and practitioner literatures, including Soares’ (CMMI Institute, 2014; Soares, 2010), provides questions to evaluate the data governance categories, dimensions or areas they identify, these questions were developed here. In the first version, 55 questions were developed to assess the 11 data governance dimensions of Soares’ (2010) comprehensive DGF.
To develop these questions and their respective scale, existing data
governance methodologies and frameworks were used. More specifically,
we first analyzed the definitions and conceptualizations of the various data
governance categories, dimensions and areas provided in already existing
data governance methodologies and frameworks (CMMI Institute, 2014;
Fleckenstein & Fellows, 2018; Lam, 2011; Soares, 2010; Thomas, 2006) and
then used this information to develop our own questions to assess each of
Soares’s (2010) 11 data governance dimensions. Accordingly, the questions
developed attempted to capture the essence of each dimension by focusing
on its key characterizing elements. In addition, the scale of each question was
developed to reflect the five levels of maturity defined by the CMMI (CMMI
Institute, 2014). In doing so, we made sure that each level of our developed scales would not only reflect these five levels of maturity but also the essence of each dimension and question. Accordingly, the original response scale of each question comprised five possible answers, each representing a certain level of maturity for the specific dimension: (1) Performed, (2) Managed, (3) Defined, (4) Measured or (5) Optimized.


Once the questions and scales of our DGM assessment tool were completed, we crystallized the tool in a Microsoft Word document that took the form of a questionnaire. We also developed a Microsoft
Excel spreadsheet to compile and present the results for each respondent on
an intuitive dashboard. To do so, several Microsoft Excel formulas were cre-
ated to calculate a cumulative score for each of the 11 data governance dimen-
sions. Specifically, the cumulative score of each dimension was obtained by
averaging the score of all questions it comprised. Figure 1 provides a screen-
shot of the Microsoft Excel dashboard we developed.

Figure 1. Data Governance Maturity Dashboard
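The dashboard’s scoring rule described above can be sketched as follows. This is a minimal illustration in Python rather than Excel; the dimension names come from Table 1, but the answer values (1 to 5, from Performed to Optimized) are hypothetical:

```python
from statistics import mean

# Hypothetical answers: each question is scored from 1 (Performed) to
# 5 (Optimized) on the CMMI-based maturity scale, grouped by dimension.
responses = {
    "Data Policies and Rules": [3, 4, 2, 3],
    "Data Quality Management": [2, 2, 3, 1, 2, 3, 2],
}

def dimension_scores(answers):
    """Cumulative score of each dimension = average of its question
    scores, mirroring the Excel dashboard's averaging formulas."""
    return {dim: round(mean(scores), 2) for dim, scores in answers.items()}

print(dimension_scores(responses))
# {'Data Policies and Rules': 3.0, 'Data Quality Management': 2.14}
```

In the tool itself these averages are computed by Excel formulas; the sketch only makes the averaging rule explicit.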

Step 4 – Demonstration


During step 4, our DGM assessment tool was presented to and used by five
domain experts4 to get their impressions and feedback on the usability and
relevance of the tool. For each demonstration, an email comprising our DGM
assessment tool (i.e., Microsoft Word document) was sent to each respon-
dent. In this email, the respondent was asked to read and complete the ques-
tionnaire by referring to his own organization and to note his comments and/
or suggestions for improvements. A semi-structured interview was con-
ducted with each respondent once he completed the questionnaire. The first
part of this semi-structured interview focused on the dimensions, questions
and scales of the DGM questionnaire (i.e. the data governance dimensions as
well as their respective questions and corresponding scales). During this part
of the interview, open-ended questions intended to assess the clarity of the
questions and concepts included in the tool. Comments, remarks, sugges-
tions and questions raised by the respondent were discussed and duly noted.
Then, the second part of the semi‑structured interview focused on the rel-
evancy and usability of the DGM assessment tool (Patton, 2002). During this
part of the interview, open-ended questions were used to assess the tool’s
ease of use, compatibility, ergonomics, relative advantage and demonstrability.

4. Amongst these five domain experts, 1 was an IT security analyst, 2 were IT managers, 1 was an IT consultant and 1 was an IT project manager. They had, on average, seventeen years of work experience.


Once again, comments, remarks, suggestions and questions raised by the respondent were discussed and duly noted.
After each demonstration, based on the information collected, we made the
necessary modifications to our DGM assessment tool. The most important are
noted here. First, several respondents had difficulty understanding the answer
choices of each question that relate to each of the five levels of maturity (i.e.,
performed, managed, defined, measured or optimized) of our scales or grasp-
ing the differences that existed between these answers. As such, we rewrote
every question and related answer choices of the questionnaire in such a man-
ner that each question is now assessed on a five‑point Likert scale. In addi-
tion, a short description for each data governance dimension was added to the
tool to help the respondents better understand the essence of each dimen-
sion while the wordings of some questions were modified or complemented to
reflect the terminology used in practice.
Second, because some dimensions covered a larger number of themes or
were judged more important than others, some questions were added to these
dimensions. For instance, seven questions were added to the Data Lifecycle
Management dimension and four questions were added to the Data Security
and Confidentiality dimension. Appendix 1 presents the final version of our
DGM assessment tool, refined after the Demonstration and Evaluation steps,
which includes 11 dimensions and 72 questions (see Table 1).

Data Governance Maturity Dimensions5                  Nb. of questions

Data Risk Management and Compliance                   6
Data Value Creation                                   6
Data Organizational Structure and Awareness           5
Data Policies and Rules                               4
Data Stewardship                                      6
Data Quality Management                               7
Data Lifecycle Management                             12
Data Security and Confidentiality                     9
Data Architecture                                     6
Data Classification and Metadata                      5
Archiving Information Audits and Reporting            6
Total                                                 72 questions

Table 1. Questions per Data Governance Maturity Dimension
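As a simple arithmetic check, the question counts reported in Table 1 do sum to the 72 questions of the final tool. A short sketch, with the counts copied from the table:

```python
# Question counts per data governance dimension, as reported in Table 1.
questions_per_dimension = {
    "Data Risk Management and Compliance": 6,
    "Data Value Creation": 6,
    "Data Organizational Structure and Awareness": 5,
    "Data Policies and Rules": 4,
    "Data Stewardship": 6,
    "Data Quality Management": 7,
    "Data Lifecycle Management": 12,
    "Data Security and Confidentiality": 9,
    "Data Architecture": 6,
    "Data Classification and Metadata": 5,
    "Archiving Information Audits and Reporting": 6,
}

print(sum(questions_per_dimension.values()))  # 72
```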

Lastly, some respondents mentioned that they would have liked to get the
results of their efforts right after completing the questionnaire, without any
waiting time. There was a delay because, once the questionnaire was completed by a respondent, the author responsible for the demonstration had to extract the respondent’s answers from the Microsoft Word document, transfer them into the Microsoft Excel spreadsheet, generate the DGM dashboard and

5. A description of each dimension is provided in Appendix 1, as well as the detailed questions and the scales used for each of them.


then send it back to the respondent. A future version of our DGM assessment
tool will overcome this technical limitation.
In a DSR project, knowing when to stop generating subsequent iterations of
the artifact during the demonstration step is a difficult and somewhat subjective decision, as enhancements could be carried out forever. In the development of our DGM assessment tool, the decision to stop this iterative process
was taken after five demonstrations since we felt we had reached theoretical
saturation at that point.

Step 5 – Evaluation
During step 5, we asked another set of five experts6 to assess the data gov-
ernance maturity of their organization using our tool. Contrary to the previous step, which focused on improving the tool through various iterations, our
objective here was to evaluate the tool’s effectiveness and contribution. The
evaluation of each of the five respondents was completed in the presence of one of the authors, who observed how the tool was used, provided direct assistance and collected immediate feedback. Once the respondent had completed
the questionnaire, a semi-structured interview (Patton, 2002) was conducted
to evaluate the quality and clarity of the instructions and questions as well as
the relevancy, ease of use, usability, effectiveness and added value of the tool.
During the evaluation step, the domain experts highlighted four key benefits
of the DGM assessment tool. First, all of the experts agreed on the fact that the
tool cannot, and should not, be used by only one individual or a limited group
of experts. Indeed, it is virtually impossible for one person in a large organization to have the full body of knowledge and expertise to properly complete all of the questions of the assessment tool. The only situation where this might be possible would be in a small or medium-sized enterprise. Most importantly, the experts underlined the fact that the tool should not be completed
only by a limited group of experts without the involvement of business rep-
resentatives of the various units or departments. Indeed, it is very important
that IT and data experts use the DGM assessment tool jointly with business
unit representatives to: (1) obtain precise and extensive information; (2) sensitize the business units to the various dimensions and challenges of data governance; and (3) serve as a learning tool for business units. One of the experts
mentioned that “the tool could allow different departments in his organization
to make a diagnosis of their situation, to compare themselves to other depart-
ments as well as to become aware of the whole organization situation in terms
of data governance”.
Second, many of the experts mentioned that the tool can and should be
used at different levels of analysis. Indeed, it would be possible to use the tool
to assess the data governance maturity level of an organization as a whole. On
the other hand, organizations would also benefit from using the tool to assess
the maturity levels of their various organizational subunits/departments

6. Amongst these five domain experts, 1 was an IT manager, 1 was an IT consultant, 1 was university professor and
2 were senior business analysts. They had, on average, fifteen years of work experience.


in order to highlight disparities and encourage subunits/departments to exchange amongst themselves. As mentioned by one of the experts “this tool
could serve as a basis for discussion between business units as well as the
development and sharing of common data management practices”.
Third, in terms of usability, relevancy and clarity, the experts who evaluated the final version of the tool agreed that the tool’s questions and response scales were clear and relatively easy to understand (see Appendix 1). As one
respondent mentioned: “What I like about your tool is that it can be under-
stood by both data specialists and IT people as well by business people. This
kind of tool can help bridge the gap between technical people and business
people. It allows to speak a common language about data governance. It is a
bridging tool that fosters collaboration”. Most of the experts also found the
DGM assessment tool relatively easy to use, not too long and well balanced.
As reported by one of the experts “the tool offers a great trade-off between
exhaustivity because it covers the main aspects of a data governance frame-
work and length because it is just long enough not to lose the respondents’
interests or simply lose them”.
Fourth, in terms of effectiveness, all experts commented that the data
governance maturity score provided by our tool was a good representation of
their own assessment of their organization’s real level of maturity in terms of
data governance. As mentioned by one of the experts “your assessment tool
serves as a sounding board for all the services and departments as it high-
lights how each service or department is doing regarding data governance
as well as how the overall organization is doing. Such diagnostic serves both
to support communication between departments, to develop a shared under-
standing as well as to guide our future investments in data governance”.

Another respondent reported that “the assessment tool, which is easy to use
and to understand by most business people, covers all the main elements
which should be covered by an organization who wishes to have an efficient
data governance framework in place. Also, it can help technical and business
people to better understand each other”. Based on these comments, we are
confident that our tool can be used to properly assess an organization’s level
of maturity in terms of data governance.
In addition to these four benefits, two of the experts mentioned that they were not confident in the answers they gave to some of the questions. Although they are data management experts, they hesitated to answer some questions because of a lack of knowledge and/or expertise. Indeed, as mentioned above, it is virtually impossible for one person in a large organization to know every detail about the organization’s data governance framework. They thus suggested adding a question to assess and capture the respondents’ level of confidence regarding their answers on each of the 11 dimensions. We complied with this request. For instance, for the Data
Policies and Rules dimension, the following question was added: How do you
evaluate your confidence level in answering the above questions related to the
Data policies and rules dimension? 1. Not at all confident, 2. Little confident,
3. Neutral, 4. Somewhat confident, 5. Very confident. If you have any com-
ments or questions related to the Data policies and rules dimension, please
specify them.
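One illustrative use of this added confidence question, when compiling a respondent’s results, would be to flag the dimensions whose scores should be re-validated jointly with business-unit representatives. The sketch below is our own assumption, not part of the tool itself; the threshold and field names are hypothetical:

```python
# Illustrative record of one respondent's compiled results. The
# "confidence" field holds the answer to the added question
# (1 = Not at all confident ... 5 = Very confident). The threshold
# and field names are assumptions, not prescribed by the tool.
CONFIDENCE_THRESHOLD = 3

dimension_results = [
    {"dimension": "Data Policies and Rules", "score": 3.0, "confidence": 2},
    {"dimension": "Data Stewardship", "score": 2.5, "confidence": 4},
]

def flag_low_confidence(results, threshold=CONFIDENCE_THRESHOLD):
    """Return the dimensions whose maturity scores the respondent
    was unsure about, so they can be revisited with other stakeholders."""
    return [r["dimension"] for r in results if r["confidence"] < threshold]

print(flag_low_confidence(dimension_results))
# ['Data Policies and Rules']
```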


DISCUSSION
Nowadays, data have become a central organizational asset and are often con-
sidered as important, or even more important (Bean, 2018), than financial and
human resources. Yet, for data to be considered a strategic asset, they must
be valid, precise and available in a timely manner (Fleckenstein & Fellows,
2018; Gregory, 2011; Redman, 2013). To do so, an organization must develop
and deploy a data governance framework (DGF) that orchestrates the people,
processes and technologies needed to optimize the collection, storage, use
and dissemination of data as organizational assets (DAMA International, 2014;
Ladley, 2012; Soares, 2010, 2012).
For most organizations, and especially larger ones, implementing a
DGF may be challenging since they do not always know the level of matu-
rity of their data management practices. Accordingly, managers responsi-
ble for data governance initiatives in such organizations do not always know
what their DGF should address as well as how and when it should be imple-
mented (Fleckenstein & Fellows, 2018). To overcome these issues, different
methods and approaches for developing and deploying a DGF have been pro-
posed (e.g. DAMA International, 2009; Ladley, 2012; Soares, 2010; Thomas,
2006). However, even though these methods and approaches could guide
organizational managers in the development and implementation of DGF,
they do not provide any tool to support managers in the evaluation of their
organization’s level of data governance maturity. As such, it is still difficult
for organizations to evaluate their own data governance processes, prac-
tices, policies and structures and determine where they stand in terms of data
governance.
To partially fill this gap and propose a practical solution to this problem,
the aim of this study was to develop, using a design science research (DSR)
approach (Hevner et al., 2004; Peffers et al., 2007), a data governance maturity (DGM) assessment tool that could help organizations know, before the realization of their data governance initiatives, which data governance processes, policies, practices and/or structures should be developed and prioritized as
well as evaluate, after the implementation of data governance initiatives,
if those initiatives have allowed the organization to evolve in terms of data
governance maturity. Furthermore, our artifact needed to be aligned with
already existing data governance maturity frameworks (CMMI Institute, 2014;
Soares, 2010) and data governance methodologies (Ladley, 2012; Soares,
2014). When compared to existing data governance maturity models, meth-
ods and frameworks (e.g. CMMI Institute, 2014; Soares, 2010, 2014; Thomas,
2006), one of the interesting contributions of the DGM assessment tool we
propose is that it operationalizes the concept of data governance maturity
and makes it more accessible to managers willing to assess their organiza-
tion’s or business units’ level of data governance maturity and, in turn, help
them better define and prioritize the goal, content and activities of their data
governance initiatives. As such, our tool allows managers to have a fair and
realistic portrait of the maturity level of their organization and/or business
unit in terms of data governance and, in turn, reduce the number of failures


associated with this type of initiative. Also, our DGM assessment tool, by being
easy to use and reusable, may facilitate the communication between tech-
nical and business people, sensitize all employees to the aspects and chal-
lenges tied to data governance as well as serve as a learning tool regarding
data governance. Furthermore, it could be used to evaluate technological
risks (Flyvbjerg & Budzier, 2011) and/or serve as a tool for internal auditing
practices (Gramling, Maletta, Schneider, & Church, 2004). Lastly, by including
all key dimensions of well-established DGF, our DGM assessment tool could
also be used by small or large organizations from the public or private sec-
tors to build trust with present or future business partners that understand
the value of good data.
This study is not without limitations and these should be taken into con-
sideration when using our DGM assessment tool in practice or in future
studies. First and foremost, changing the DGM assessment tool’s questions and related scales, as recommended by the experts consulted during the demonstration phase, somewhat changed the nature of our intended artifact. Specifically, it undermined our tool’s capacity to position an organization at a precise level of data governance maturity. Indeed, maturity levels,
as described in the DMM model, for example, represent discrete categories
or steps where data governance is performed in markedly different man-
ners. Hence, by moving away from our original approach, our tool no longer
distinguishes between these key differences that characterize each level of
maturity. However, our tool still allows organizations to (1) know, before the
realization of their data governance initiatives, which data governance processes, policies, practices and/or structures should be developed and prioritized as well as (2) evaluate, after the implementation of data governance
initiatives, if those initiatives have allowed the organization to evolve in terms
of data governance. To overcome this limitation, we recommend that future
research should aim to refine our DGM assessment tool by developing new
questions and scales that better correspond to the notion of maturity used in maturity models currently available on the market. Second, we only interviewed one respondent per organization throughout the various development
steps of our artifact. Yet, as mentioned by some of our respondents, it is virtually impossible for one person, especially in a large organization, to know every
detail of their organization’s data governance framework and thus for that
person to answer confidently and provide valuable feedback for all the ques-
tions of our questionnaire. To overcome this limitation, we recommend that
future research aims to refine our DGM assessment tool in order to allow
multiple respondents from the same organization to collectively complete the
evaluation of their organization’s data governance practices. Third, we did not
attempt to define and evaluate design principles throughout the development
of our artifact. Specifically, we did not do so because it was not congruent with
the methodology we used. To overcome this limitation, we recommend that
future research should aim to adopt a design science research methodology
that explicitly integrates the definition and evaluation of design principles
within its framework.


CONCLUSION
At the beginning of this paper we set out to design an artifact that would help
organizations assess their own level of data governance maturity. Specifically,
our objective regarding this artifact was threefold. First, our artifact needed to
help organizations know, before the realization of their data governance initiatives, which data governance processes, policies, practices and/or structures should be developed and prioritized. Second, our artifact needed to help
organizations evaluate, after the implementation of their data governance ini-
tiatives, if those initiatives allowed them to evolve in terms of data governance
maturity. Third, our artifact needed to be aligned with already existing data
governance maturity frameworks (CMMI Institute, 2014; Soares, 2010) and
data governance methodologies (Ladley, 2012; Soares, 2014). To do so, we fol-
lowed a DSR approach and the six steps methodology proposed by Peffers
et al. (2007). We anchored our work on Soares’ (2010) comprehensive DGF and the concept of maturity as defined by the CMMI Institute. Also, several
domain experts (e.g., IT security analysts, information resources experts,
business analysts, IT managers, etc.) were consulted and asked to provide key
suggestions, comments and feedback to help us design our artifact. In the end,
our research effort allowed us to develop and test a data governance matu-
rity (DGM) assessment tool that includes 11 dimensions and 72 questions. We
are confident that this artifact will allow organizations to assess their level of
maturity in terms of data governance and, in turn, to better define and prior-
itize the goals, content and activities of their data governance initiatives. As
such, we hope that our artifact in the form of a DGM assessment tool will help
organizations to implement a DGF that is tailored to their respective needs
and context. Lastly, and more importantly, we feel that, by using a design sci-
ence research approach, we were able to produce a contribution to knowledge
that is both different from and complementary to what previous behavioral studies have done with regard to data governance. Accordingly, we firmly believe that
behavioral and design science studies should not each be confined to different
topics or different fields of research but should rather be used in conjunction
to solve all kinds of organizational issues.

REFERENCES
Alhassan, I., Sammon, D. & Daly, M. (2016). Data governance activities: An analysis of the
literature. Journal of Decision Systems, 25(sup1), 64-75.
Alhassan, I., Sammon, D. & Daly, M. (2018). Data governance activities: A comparison between
scientific and practice-oriented literature. Journal of Enterprise Information Management, 31(2),
300-316.
Bean, R. (2018). How Big Data and AI Are Driving Business Innovation in 2018. Retrieved from https://
sloanreview.mit.edu/article/how-big-data-and-ai-are-driving-business-innovation-in-2018/
Begg, C. & Caira, T. (2012). Exploring the SME quandary: Data governance in practise in the small to
medium-sized enterprise sector. The Electronic Journal Information Systems Evaluation, 15(1), 3-13.
CMMI Institute. (2014). Data Management Maturity (DMM) Model. CMMI Institute.
Dallemule, L., & Davenport, T. H. (2017). What’s your data strategy? Harvard Business Review,
95(3), 112-121.


DAMA International. (2009). The DAMA Guide to the Data Management Body of Knowledge. Technics Publications. https://www.dama.org/content/body-knowledge
DAMA International. (2014). DAMA-DMBOK2 Framework. https://www.dama.org/sites/default/
files/download/DAMA-DMBOK2-Framework-V2-20140317-FINAL.pdf.
Davenport, T. H. (2013). Analytics 3.0. Harvard Business Review, 91(12), 64-72.
Fleckenstein, M. & Fellows, L. (2018). Data Governance. In Modern Data Strategy (pp. 63-76).
Springer.
Flyvbjerg, B. & Budzier, A. (2011). Why your IT project may be riskier than you think. Harvard
Business Review, 89(9), 23-25.
Gramling, A. A., Maletta, M. J., Schneider, A. & Church, B. K. (2004). The role of the internal
audit function in corporate governance: A synthesis of the extant internal auditing literature and
directions for future research. Journal of Accounting Literature, 23, 194-244.
Gregor, S. & Hevner, A. R. (2013). Positioning and presenting design science research for
maximum impact. MIS Quarterly, 37(2), 337-356.
Gregory, A. (2011). Data governance—Protecting and unleashing the value of your customer data
assets. Journal of Direct, Data and Digital Marketing Practice, 12(3), 230-248.
Hevner, A. R., March, S. T., Park, J. & Ram, S. (2004). Design science in information systems
research. MIS Quarterly, 28(1), 75-105.
Iivari, J. (2015). Distinguishing and contrasting two strategies for design science research.
European Journal of Information Systems, 24, 107-115.
Khatri, V. & Brown, C. V. (2010). Designing data governance. Communications of the ACM, 53(1),
148-152.
Ladley, J. (2012). Data Governance—How to Effectively Design, Deploy, and Sustain an Effective
Data Governance Program. Burlington, MA: Morgan Kaufmann.
Lam, V. (2011). Seven Steps to Effective Data Governance. New York: Information Builders.
https://www.whitepapers.em360tech.com/wp-content/files_mf/white_paper/wp_iway_7steps.pdf
Lee, Y., Madnick, S. E., Wang, R. Y., Wang, F. & Zhang, H. (2014). A cubic framework for the chief
data officer: Succeeding in a world of big data. MIS Quarterly Executive, 13(1), 1-13.
Logan, D., Popkin, J. & Faria, M. (2016). First Gartner CDO Survey: Governance and Analytics Will
Be Top Priorities in 2016. Retrieved from Gartner Research.
Luftman, J., Derksen, B., Dwivedi, R., Santana, M., Zadeh, H. & Rigoni, E. (2015). Influential IT
management trends: An international study. Journal of Information Technology, suppl. Special
Issue: Mobile Platforms and Ecosystems, 30(3), 293-305.
March, S. T. & Smith, G. F. (1995). Design and natural science research on information technology.
Decision Support Systems, 15(4), 251-266.
Otto, B. (2011). Organizing data governance: Findings from the telecommunications industry and consequences for large service providers. Communications of the AIS, 29(3), 45-66.
Patton, M. Q. (2002). Qualitative Research & Evaluation Methods. Thousand Oaks, CA: Sage
Publications.
Peffers, K., Tuunanen, T., Rothenberger, M. A. & Chatterjee, S. (2007). A design science research
methodology for information systems research. Journal of Management Information Systems,
24(3), 45-77.
PMI. (2003). Organizational Project Management Maturity Model (OPM3). Newtown Square, PA: Project Management Institute.
Redman, T. C. (2013). Data’s credibility problem. Harvard Business Review, 91(12), 84-88.
Ross, J. W., Beath, C. M. & Quaadgras, A. (2013). You may not need big data after all – How lots of little data can inform everyday decision making. Harvard Business Review, 91(12), 90-98.
Rowe, F. (2014). What literature review is not: Diversity, boundaries and recommendations. European Journal of Information Systems, 23(3), 241-255. doi:10.1057/ejis.2014.7
Soares, S. (2010). The IBM Data Governance Unified Process: Driving Business Value with IBM
Software and Best Practices. Boise, ID: MC Press.
Soares, S. (2012). Big Data Governance: An Emerging Imperative. Boise, ID: MC Press.
Soares, S. (2014). Data Governance Tools: Evaluation Criteria, Big Data Governance, and Alignment
with Enterprise Data Management. Boise, ID: MC Press.


Tallon, P. P., Ramirez, R. V. & Short, J. E. (2013). The information artifact in IT governance: Toward
a theory of information governance. Journal of Management Information Systems, 30(3), 141-178.
Templier, M. & Paré, G. (2015). A framework for guiding and evaluating literature reviews. Communications of the AIS, 37, 113-137.
Thomas, G. (2006). The DGI Data Governance Framework. Orlando, FL: The Data Governance
Institute.
Webster, J. & Watson, R. T. (2002). Analyzing the past to prepare for the future: Writing a literature
review. MIS Quarterly, 26(2), xiii-xxiii.

Philippe MARCHILDON has been an assistant professor at the School of Management


of the University of Quebec at Montreal (ESG UQAM) since 2016. He holds a Ph.D. in management of technology from ESG UQAM. His main research interests
include: information system evolution, IT-enabled organizational transformation
and barriers to change, IT adoption and post-adoption, creation and co‑creation
of IT value, and theory building.

Simon BOURDEAU has been a professor at the School of Management of the University


of Quebec at Montreal (ESG UQAM) since 2012. He is also a member/researcher
in various research groups and centers, including the CEFRIO, the CIRANO and
the GReSI (HEC Montréal). He holds a Ph.D. in information systems from HEC
Montréal. His research interests include: IS project management, project team dynamics, operational risks, strategic planning and innovation. Since 2013, he has also been a certified LEGO® Serious Play™ facilitator and uses this methodology in teaching and research, as well as in private and public organizations.

Pierre HADAYA is a professor at the School of Management of the University of


Quebec at Montreal (ESG UQAM). His research focuses on strategic planning
and management, organizational transformation and its governance, business
architecture as well as the strategic alignment of IT. As Co-Founder of ASATE
Group Inc., Dr. Hadaya also collaborates with organizations striving to transform
themselves so they can develop a competitive advantage.

Aldrin LABISSIÈRE works as a business analyst at the Ministère de l’Immigration,


de la Diversité et de l’Inclusion. He holds a master’s degree (M.Sc.) in information
technology from the School of Management of the University of Quebec at
Montreal (ESG UQAM).


Appendix 1 – Data Governance Maturity


Assessment Tool
(adapted from Soares (2010))

Dimension #1 – Data Risk Management and Compliance


Data Risk Management and Compliance refers to the processes by which data security risks are identified, qualified, quantified, rejected, accepted, and/or mitigated in an organization. The volume of data in all business domains is continually increasing, as is the number of laws governing the compliance of these data. Hence, risk management and compliance must be aligned with the data governance framework (DGF).
Question 1.1
In your organization, is the DGF aligned with risk management and compliance?
There is … (insert here) … alignment between the DGF and risk management and
compliance.
1. … no …
2. … weak …
3. … moderate …
4. … strong …
5. … complete …
Question 1.2
In your organization, how important is the role of the stakeholder responsible for
risk management on the data governance board?
The stakeholder responsible for risk management plays … (insert here) … role on
the data governance board.
1. … no … (i.e., there is no stakeholder responsible for risk management on the
data governance board)
2. … small …
3. … moderate …
4. … important …
5. … essential …
Question 1.3
In your organization, does the DGF provide tangible benefits to risk management?
The DGF provides … (insert here) … tangible benefits to risk management.
1. … no …
2. … few …
3. … some …
4. … several …
5. … many …
Question 1.4
In your organization, are managers responsible for risk management convinced of
the benefits identified in the DGF?
Managers responsible for risk management are … (insert here) … convinced of the
benefits that the DGF provides to risk management.
1. … not …
2. … little …
3. … partly …
4. … mostly …
5. … totally …

proyéctica / projectics / projectique – n° 20 175


PHILIPPE MARCHILDON, SIMON BOURDEAU, PIERRE HADAYA, ALDRIN LABISSIÈRE

Question 1.5
In your organization, are metrics used to monitor the performance of the DGF in
relation to risk management?
(Insert here) … metrics are used to monitor the performance of the DGF in relation
to risk management.
1. No …
2. Few …
3. Some …
4. Several …
5. All of the required …
Question 1.6
In your organization, how do you rate the quality of the metrics used to monitor
the performance of the DGF in relation to risk management?
1. Poor
2. Fair
3. Good
4. Very good
5. Excellent
How do you evaluate your confidence level in answering the above questions
related to the Data Risk Management and Compliance dimension?
1. Not at all confident
2. Little confident
3. Neutral
4. Somewhat confident
5. Very confident
If you have any comments or questions related to the Data Risk Management and
Compliance dimension, please specify them: ______________________________
___________________________________________________________________
Dimension #2 – Data Value Creation
Data value creation refers to the process by which data assets are qualified and
quantified to enable an organization to maximize the value created by these data
assets. It captures the importance given to the use of data in terms of creating
value for the organization.
Question 2.1
In your organization, are stakeholders from different business and IT domains
involved in the data governance board?
There are key stakeholders from … (insert here) … business and IT domains involved in the data governance board.
1. … no …
2. … few …
3. … some …
4. … several …
5. … all …
Question 2.2
Does the DGF provide any key organizational benefits?
The DGF provides … (insert here) … key organizational benefits.
1. … no …
2. … few …
3. … some …
4. … several …
5. … many …

Question 2.3
In your organization, are the stakeholders engaged towards the benefits of data governance?
Stakeholder engagement towards the benefits of data governance …
1. … does not exist.
2. … is very limited.
3. … is partial.
4. … is almost complete.
5. … is complete.
Question 2.4
In your organization, are business cases used to justify specific data governance
initiatives?
Business cases are … (insert here) … used to justify specific data governance initiatives.
1. … not …
2. … rarely …
3. … sometimes …
4. … often …
5. … always …
Question 2.5
In your organization, are metrics used to monitor the performance of the DGF?
(Insert here) … metrics are used to monitor the performance of the DGF.
1. No …
2. Few …
3. Some …
4. Several …
5. All of the required …
Question 2.6
In your organization, how do you rate the quality of the metrics used to monitor
the performance of the DGF?
1. Poor
2. Fair
3. Good
4. Very good
5. Excellent
How do you evaluate your confidence level in answering the above questions
related to the Data Value Creation dimension?
1. Not at all confident
2. Little confident
3. Neutral
4. Somewhat confident
5. Very confident
If you have any comments or questions related to the Data Value Creation
dimension, please specify them: _________________________________________
___________________________________________________________________


Dimension #3 – Data Organizational Structure and Awareness


Organizational structures and awareness refers to the level of mutual
accountability between IT and organizational departments and the awareness of
the responsibility of governing data at different levels of management.
Question 3.1
In your organization, do senior managers support the DGF and consider data as an organizational asset?
… senior managers support the DGF and consider data as an organizational asset.
1. No …
2. Few …
3. Some …
4. Several …
5. All …
Question 3.2
Are members of the organization aware of how to treat data as an organizational
asset?
… members of the organization are aware of how to treat data as an organizational
asset.
1. No …
2. Few …
3. Some …
4. Several …
5. All …
Question 3.3
Are the activities of the organization impacted by the data governance board?
The activities of the organization are … (insert here) … impacted by the data
governance board.
1. … not …
2. … little …
3. … moderately …
4. … strongly …
5. … very strongly …
Question 3.4
In your organization, is the data governance steward involved in organizational
activities?
The data governance steward is … (insert here) … involved in organizational
activities.
1. … not …
2. … very little …
3. … moderately …
4. … strongly …
5. … very strongly …
Question 3.5
In your organization, is there a data governance charter?
1. There is no data governance charter.
2. The organization started working on the creation of a data governance
charter.
3. The organization has partly completed the creation of a data governance
charter.
4. The organization has completed the creation of a data governance charter.
5. There is a data governance charter and it is in effect.


How do you evaluate your confidence level in answering the above questions
related to the Organizational structures and awareness dimension?
1. Not at all confident
2. Little confident
3. Neutral
4. Somewhat confident
5. Very confident
If you have any comments or questions related to the Organizational structures
and awareness dimension, please specify them: ___________________________
___________________________________________________________________

Dimension #4 – Data Policies and Rules


Data policies and rules refers to the data quality discipline that keeps data
safe for asset updates, risk mitigation and organizational control. It captures
an organization’s ability to properly manage all data, to ensure the quality
of metadata definition as well as to manage and implement the rules of
confidentiality of the DGF. By all data, we mean data from all sources (e.g.,
transactional, social media, machine-generated data, IoT) and in all content
formats (structured, semi-structured and unstructured).
Question 4.1
In your organization, is the data governance board involved in defining the data
policies of the organization?
The data governance board is … (insert here) … involved in the definition of the
data policies.
1. … not …
2. … very little …
3. … moderately …
© De Boeck Supérieur | Téléchargé le 16/08/2022 sur www.cairn.info (IP: 114.10.25.230)

© De Boeck Supérieur | Téléchargé le 16/08/2022 sur www.cairn.info (IP: 114.10.25.230)


4. … strongly …
5. … very strongly …
Question 4.2
In your organization, are the data governance policies documented?
Data governance policies are … (insert here) … documented.
1. … never …
2. … rarely …
3. … often …
4. … usually …
5. … always …
Question 4.3
In your organization, are the data governance policies reviewed by the governance
committee?
Data governance policies are … (insert here) … reviewed by the governance
committee.
1. … never …
2. … sometimes …
3. … often …
4. … usually …
5. … always …


Question 4.4
In your organization, is the DGF considered when strengthening organizational
practices and policies?
When strengthening organizational practices and policies, the DGF is … (insert
here) … considered.
1. … never …
2. … sometimes …
3. … often …
4. … usually …
5. … always …
How do you evaluate your confidence level in answering the above questions
related to the Data policies and rules dimension?
1. Not at all confident
2. Little confident
3. Neutral
4. Somewhat confident
5. Very confident
If you have any comments or questions related to the Data policies and rules
dimension, please specify them: _________________________________________
___________________________________________________________________

Dimension #5 – Data Stewardship


Data stewardship is a data quality discipline that keeps data safe for asset
updates, risk mitigation and organizational control. It captures the efforts made by
an organization to properly monitor compliance with the rules established during
the implementation of the DGF.
Question 5.1
Does your organization use data steward tools to ensure the quality of the data?
(Insert here) … data steward tools are used to ensure the quality of the data.
1. No …
2. Few …
3. Some …
4. Several …
5. All of the required …
Question 5.2
In your organization, how do you rate the quality of the data steward tools used to
ensure the quality of the data?
1. Poor
2. Fair
3. Good
4. Very good
5. Excellent
Question 5.3
In your organization, do business domains have their own data steward?
… business domains have their own data steward.
1. No …
2. Few …
3. Some …
4. Most …
5. All …


Question 5.4
In your organization, are data stewards responsible for defining the attributes of
the data under their area of responsibility?
Data stewards are … (insert here) … responsible for defining the attributes of the
data under their area of responsibility.
1. … not …
2. … very little …
3. … partly …
4. … mostly …
5. … completely …
Question 5.5
In your organization, are data stewards responsible for creating and/or monitoring
the metrics tied to the data under their area of responsibility?
Data stewards are … (insert here) … responsible for creating and/or monitoring the metrics tied to the data under their area of responsibility.
1. … not …
2. … very little …
3. … partly …
4. … mostly …
5. … completely …
Question 5.6
In your organization, do the business domains recognize the importance of the
data stewardship program?
The stewardship program is recognized and respected by … (insert here) …
business domains.
1. … no …
2. … few …
3. … some …
4. … most …
5. … all …
How do you evaluate your confidence level in answering the above questions
related to the Data stewardship dimension?
1. Not at all confident
2. Little confident
3. Neutral
4. Somewhat confident
5. Very confident
If you have any comments or questions related to the Data stewardship
dimension, please specify them: _________________________________________
___________________________________________________________________


Dimension #6 – Data Quality Management


Data quality management refers to the methods that measure, improve, and certify the quality and integrity of all production, test, and archived data. By all data, we mean data from all sources (e.g., transactional, social media, machine-generated data, IoT) and in all content formats (structured, semi-structured and unstructured).
Data quality management captures the solutions put in place in the organization
to ensure data quality.
Question 6.1
In your organization, are the quality metrics approved by the Data Governance
Board?
… metrics are approved by the Data Governance Board.
1. No …
2. Few …
3. Some …
4. Most …
5. All …
Question 6.2
In your organization, are data quality issues documented?
Data quality issues are … (insert here) … documented.
1. … never …
2. … rarely …
3. … often …
4. … usually …
5. … always …
Question 6.3
In your organization, is there a clear agreement between technical and business
domains on managing data quality issues?
There is … (insert here) … agreement between technical and business domains on
managing data quality issues.
1. … no…
2. … a weak …
3. … a moderate …
4. … a strong …
5. … a complete …
Question 6.4
In your organization, is there a process for managing data quality issues?
1. There is no process for managing data quality issues.
2. The development of the process for managing data quality issues has just
started.
3. The development of the process for managing data quality issues is partly
complete.
4. The development of the process for managing data quality issues is complete.
5. The process for managing data quality issues is complete and deployed.


Question 6.5
In your organization, how often are data quality metrics used by the technical
domains?
Data quality metrics are … (insert here) … used by the technical domains.
1. … never …
2. … rarely …
3. … occasionally …
4. … often …
5. … always …
Question 6.6
In your organization, how often are data quality metrics used by the business
domains?
Data quality metrics are … (insert here) … used by the business domains.
1. … never …
2. … rarely …
3. … occasionally …
4. … often …
5. … always …
Question 6.7
In your organization, is there a process to take corrective action and/or improve
data quality metrics based on feedback from the data governance board?
1. There is no process to take corrective action and/or improve data quality metrics based on feedback from the data governance board.
2. The development of the process to take corrective action and/or improve data
quality metrics based on feedback from the data governance board has just
started.
3. The development of the process to take corrective action and/or improve data
quality metrics based on feedback from the data governance board is partly
complete.
4. The development of the process to take corrective action and/or improve
data quality metrics based on feedback from the data governance board is
complete.
5. The process to take corrective action and/or improve data quality metrics
based on feedback from the data governance board is complete and deployed.
How do you evaluate your confidence level in answering the above questions
related to the Data quality management dimension?
1. Not at all confident
2. Little confident
3. Neutral
4. Somewhat confident
5. Very confident
If you have any comments or questions related to the Data quality management
dimension, please specify them: _________________________________________
___________________________________________________________________


Dimension #7 – Data Lifecycle Management


Data lifecycle management is a systematic approach to collecting, using, storing and deleting all data. By all data, we mean data from all sources (e.g., transactional, social media, machine-generated data, IoT) and in all content formats (structured, semi-structured and unstructured). Data lifecycle management evaluates the policies put in place to determine the types of documents to be digitized as well as the methods and timing of storage and archiving.
Question 7.1
In your organization, are metrics used to evaluate the collection, use, storage and
deletion of data from all sources?
(Insert here) … metrics are used to evaluate the collection, use, storage and
deletion of data from all sources.
1. No …
2. Few …
3. Some …
4. Several …
5. All of the required …
Question 7.2
In your organization, how do you rate the quality of the metrics used to evaluate
the collection, use, storage and deletion of data from all sources?
1. Poor
2. Fair
3. Good
4. Very good
5. Excellent
Question 7.3
In your organization, are metrics used to evaluate the collection, use, storage and
deletion of data in structured and semi-structured formats?
(Insert here) … metrics are used to evaluate the collection, use, storage and
deletion of data in structured and semi-structured formats.
1. No …
2. Few …
3. Some …
4. Several …
5. All of the required …
Question 7.4
In your organization, how do you rate the quality of the metrics used to evaluate
the collection, use, storage and deletion of data in structured and semi-structured
formats?
1. Poor
2. Fair
3. Good
4. Very good
5. Excellent
Question 7.5
In your organization, are metrics used to evaluate the collection, use, storage and
deletion of data in unstructured format?
(Insert here) … metrics are used to evaluate the collection, use, storage and
deletion of data in unstructured format.
1. No …
2. Few …
3. Some …
4. Several …
5. All of the required …


Question 7.6
In your organization, how do you rate the quality of the metrics used to evaluate
the collection, use, storage and deletion of data in unstructured format?
1. Poor
2. Fair
3. Good
4. Very good
5. Excellent
Question 7.7
In your organization, is there a policy on the types of documents that can be
scanned?
1. There is no policy on the types of documents that can be scanned.
2. The development of the policy on the types of documents that can be scanned
has just started.
3. The development of the policy on the types of documents that can be scanned
is partly complete.
4. The development of the policy on the types of documents that can be scanned
is complete.
5. The policy on the types of documents that can be scanned is complete and
deployed.
Question 7.8
In your organization, is there a policy on archiving electronic information from all
data sources?
1. There is no policy on archiving electronic information from all data sources.
2. The development of the policy on archiving electronic information from all
data sources has just started.
3. The development of the policy on archiving electronic information from all
data sources is partly complete.
4. The development of the policy on archiving electronic information from all data sources is complete.
5. The policy on archiving electronic information from all data sources is
complete and deployed.
Question 7.9
In your organization, is there a policy on archiving electronic information in
structured and semi-structured formats?
1. There is no policy on archiving electronic information in structured and semi-
structured formats.
2. The development of the policy on archiving electronic information in
structured and semi-structured formats has just started.
3. The development of the policy on archiving electronic information in
structured and semi-structured formats is partly complete.
4. The development of the policy on archiving electronic information in structured and semi-structured formats is complete.
5. The policy on archiving electronic information in structured and semi-
structured formats is complete and deployed.


Question 7.10
In your organization, is there a policy on archiving electronic information in
unstructured format?
1. There is no policy on archiving electronic information in unstructured format.
2. The development of the policy on archiving electronic information in
unstructured format has just started.
3. The development of the policy on archiving electronic information in
unstructured format is partly complete.
4. The development of the policy on archiving electronic information in unstructured format is complete.
5. The policy on archiving electronic information in unstructured format is
complete and deployed.
Question 7.11
In your organization, is there a policy on archiving documents based on value
creation vectors (e.g., improved system performance and reduced archiving
costs)?
1. There is no policy on archiving documents based on value creation vectors.
2. The development of the policy on archiving documents based on value
creation vectors has just started.
3. The development of the policy on archiving documents based on value
creation vectors is partly complete.
4. The development of the policy on archiving documents based on value
creation vectors is complete.
5. The policy on archiving documents based on value creation vectors is
complete and deployed.
Question 7.12
In your organization, are the content management systems automated?
The content management systems are …
1. … not automated at all.
2. … very little automated.
3. … partly automated.
4. … mostly automated.
5. … completely automated.
How do you evaluate your confidence level in answering the above questions
related to the Data lifecycle management dimension?
1. Not at all confident
2. Little confident
3. Neutral
4. Somewhat confident
5. Very confident
If you have any comments or questions related to the Data lifecycle management
dimension, please specify them: _________________________________________
___________________________________________________________________


Dimension #8 – Data Security and Confidentiality


Data security and confidentiality refers to the policies, practices, and control
rules used in the organization to mitigate risks and protect all data assets. By all
data, we mean data from all sources (e.g., transactional, social media, machine-
generated data, IoT) and in all content formats (structured, semi-structured and
unstructured). Data security and confidentiality captures the policies implemented
as part of the DGF to ensure the security and confidentiality of the data.
Question 8.1
In your organization, is there a data governance security and privacy policy for the
data in structured and semi-structured formats?
1. There is no data governance security and privacy policy for the data in
structured and semi-structured formats.
2. The development of the data governance security and privacy policy for the
data in structured and semi-structured formats has just started.
3. The development of the data governance security and privacy policy for the
data in structured and semi-structured formats is partly complete.
4. The development of the data governance security and privacy policy for the
data in structured and semi-structured formats is complete.
5. The data governance security and privacy policy for the data in structured and
semi-structured formats is complete and deployed.
Question 8.2
In your organization, is there a data governance security and privacy policy for the data in unstructured format?
1. There is no data governance security and privacy policy for the data in
unstructured format.
2. The development of the data governance security and privacy policy for the
data in unstructured format has just started.
3. The development of the data governance security and privacy policy for the
data in unstructured format is partly complete.
4. The development of the data governance security and privacy policy for the
data in unstructured format is complete.
5. The data governance security and privacy policy for the data in unstructured
format is complete and deployed.
Question 8.3
In your organization, is the IT Security Director/Specialist involved in the Data
Governance Board?
The IT Security Director/Specialist is … (insert here) … involved in the Data
Governance Board.
1. … not …
2. … very little …
3. … moderately …
4. … generally …
5. … always …
Question 8.4
Is your organization subject to confidentiality regulations?
Your organization is … (insert here) … confidentiality regulations.
1. … not subject to any …
2. … subject to a few …
3. … subject to some …
4. … subject to several …
5. … subject to many …


Question 8.5
Does your organization fail confidentiality audits?
Your organization … (insert here) … fails confidentiality audits.
1. … very often …
2. … quite often …
3. … occasionally …
4. … rarely …
5. … never …
Question 8.6
In your organization, is sensitive data encrypted?
Sensitive data is … (insert here) … encrypted.
1. … never …
2. … rarely …
3. … often …
4. … generally …
5. … always …
Question 8.7
In your organization, is unencrypted sensitive data used to develop or test systems?
Unencrypted sensitive data is … (insert here) … used to develop or test systems.
1. … always …
2. … very often …
3. … occasionally …
4. … rarely …
5. … never …
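Question 8.7 concerns the use of unencrypted sensitive data in development and test systems. A common mitigation, not prescribed by the assessment tool itself, is to pseudonymize sensitive fields before data reaches non-production environments. The sketch below is a hypothetical illustration: the field names and the keyed-hash approach are assumptions, chosen so that test data stays consistent across records while remaining non-identifying.

```python
import hashlib
import hmac

# Fields treated as sensitive in this hypothetical example.
SENSITIVE_FIELDS = {"email", "ssn", "phone"}

def mask_record(record, secret_key, fields=SENSITIVE_FIELDS):
    """Return a copy of `record` with sensitive fields replaced by a
    keyed hash: deterministic (same value -> same pseudonym), but not
    reversible without the key."""
    masked = dict(record)
    for field in fields:
        if field in masked:
            digest = hmac.new(secret_key, str(masked[field]).encode(), hashlib.sha256)
            masked[field] = digest.hexdigest()[:16]  # short, stable pseudonym
    return masked

customer = {"id": 42, "email": "alice@example.com", "ssn": "123-45-6789"}
masked = mask_record(customer, secret_key=b"test-env-key")
print(masked["id"])      # non-sensitive fields pass through unchanged
print(masked["email"])   # sensitive fields are replaced by a pseudonym
```

Because the masking is deterministic under a fixed key, joins between masked tables still work, which is often why organizations prefer pseudonymization over random test data.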
Question 8.8
In your organization, do administrators, subcontractors or third parties have
access to unencrypted sensitive data?
Administrators, subcontractors or third parties … (insert here) … have access to
unencrypted sensitive data.
1. … always …
2. … very often …
3. … occasionally …
4. … rarely …
5. … never …
Question 8.9
In your organization, are systems used to monitor access to sensitive data by
users with privileges (such as database administrators)?
Systems are … (insert here) … used to monitor access to sensitive data by users
with privileges.
1. … not …
2. … rarely …
3. … moderately …
4. … generally …
5. … always …
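Question 8.9 asks whether access to sensitive data by privileged users is monitored. A minimal sketch of such monitoring is an access-logging wrapper around sensitive-data accessors; the function names and in-memory trail below are hypothetical (a real deployment would persist the trail to tamper-evident storage).

```python
import functools
import logging

audit_log = logging.getLogger("sensitive_access")

def audited(user_role):
    """Decorator that records every call to a sensitive-data accessor,
    tagging it with the caller's role (e.g., a database administrator)."""
    def wrapper(func):
        @functools.wraps(func)
        def inner(*args, **kwargs):
            audit_log.info("role=%s accessed %s", user_role, func.__name__)
            inner.calls.append((user_role, func.__name__))
            return func(*args, **kwargs)
        inner.calls = []  # in-memory trail, for illustration only
        return inner
    return wrapper

@audited(user_role="dba")
def read_salary(employee_id):
    # Hypothetical accessor to a sensitive field.
    return {"employee_id": employee_id, "salary": "<redacted>"}

read_salary(7)
print(read_salary.calls)  # [('dba', 'read_salary')]
```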
How do you evaluate your confidence level in answering the above questions
related to the Data security and confidentiality dimension?
1. Not at all confident
2. Slightly confident
3. Neutral
4. Somewhat confident
5. Very confident
If you have any comments or questions related to the Data security and
confidentiality dimension, please specify them: ____________________________
___________________________________________________________________



Data governance maturity assessment tool: A design science approach

Dimension #9 – Data Architecture


Data architecture refers to the design of structured, semi-structured and
unstructured data systems and applications, which enables the availability and
distribution of data to the appropriate users.
Question 9.1
In your organization, is the architecture of the structured and semi-structured
data systems based on established standards?
The architecture of the structured and semi-structured data systems is … (insert
here) … on established standards.
1. … not based …
2. … based on few …
3. … based on some …
4. … based on several …
5. … completely based …
Question 9.2
In your organization, is the architecture of the unstructured data systems based
on established standards?
The architecture of the unstructured data systems is … (insert here) … on
established standards.
1. … not based …
2. … based on few …
3. … based on some …
4. … based on several …
5. … completely based …
Question 9.3
In your organization, is there a process established to strengthen compliance with
architectural standards in data governance?
1. There is no process to strengthen compliance with architectural standards in
data governance.
2. The development of the process to strengthen compliance with architectural
standards in data governance has just started.
3. The development of the process to strengthen compliance with architectural
standards in data governance is partly complete.
4. The development of the process to strengthen compliance with architectural
standards in data governance is complete.
5. The process to strengthen compliance with architectural standards in data
governance is complete and deployed.
Question 9.4
Have your organizational needs to optimize the architecture of the structured and
semi-structured data systems been identified and defined?
1. No organizational needs to optimize the architecture of the structured and
semi-structured data systems have been identified and defined.
2. The identification and definition of the organizational needs to optimize the
architecture of the structured and semi-structured data systems has just
started.
3. The identification and definition of the organizational needs to optimize the
architecture of the structured and semi-structured data systems is partly
complete.
4. The identification and definition of the organizational needs to optimize the
architecture of the structured and semi-structured data systems is complete.
5. The identification and definition of the organizational needs to optimize the
architecture of the structured and semi-structured data systems is complete
and deployed.


Question 9.5
Have your organizational needs to optimize the architecture of the unstructured
data systems been identified and defined?
1. No organizational needs to optimize the architecture of the unstructured
data systems have been identified and defined.
2. The identification and definition of the organizational needs to optimize the
architecture of the unstructured data systems has just started.
3. The identification and definition of the organizational needs to optimize the
architecture of the unstructured data systems is partly complete.
4. The identification and definition of the organizational needs to optimize the
architecture of the unstructured data systems is complete.
5. The identification and definition of the organizational needs to optimize the
architecture of the unstructured data systems is complete and deployed.
Question 9.6
In your organization, do specific data domains (e.g., customers, providers, and
products) have their own system of record (i.e., a single system for backing up and
retrieving information to ensure data integrity and traceability of changes)?
… specific data domains have their own system of record.
1. No …
2. Few…
3. Some …
4. Several …
5. All …
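Question 9.6 concerns whether each data domain has a single system of record. One hypothetical way to make such assignments explicit and machine-checkable is a domain-to-system registry that rejects a second authoritative system for the same domain; the class and system names below are illustrative assumptions.

```python
class SystemOfRecordRegistry:
    """Maps each data domain (customers, providers, products, ...) to
    exactly one authoritative system; duplicate assignments are rejected
    so data integrity and traceability have a single anchor point."""

    def __init__(self):
        self._registry = {}

    def assign(self, domain, system):
        if domain in self._registry:
            raise ValueError(f"{domain!r} already has a system of record: "
                             f"{self._registry[domain]!r}")
        self._registry[domain] = system

    def system_for(self, domain):
        return self._registry.get(domain)

registry = SystemOfRecordRegistry()
registry.assign("customers", "CRM")
registry.assign("products", "ERP")
print(registry.system_for("customers"))  # CRM
```

Attempting `registry.assign("customers", "DataLake")` would raise a `ValueError`, surfacing the governance conflict instead of silently creating a second source of truth.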
How do you evaluate your confidence level in answering the above questions
related to the Data architecture dimension?
1. Not at all confident
2. Slightly confident
3. Neutral
4. Somewhat confident
5. Very confident
If you have any comments or questions related to the Data architecture
dimension, please specify them: _________________________________________
___________________________________________________________________

Dimension #10 – Data Classification and Metadata


Classification and metadata refers to methods and tools used to create common
semantic definitions for IT and business terms, data warehouses, and data
models.
Question 10.1
In your organization, is there a data dictionary for key organizational terms?
The dictionary comprises … (insert here) … key terms.
1. … no … (i.e., there is no dictionary)
2. … few …
3. … some …
4. … most …
5. … all …
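Questions 10.1 and 10.2 concern a data dictionary with agreed definitions for key organizational terms. A minimal sketch of such a dictionary, with each entry carrying a definition and the business domain that stewards it, is shown below; the structure, terms and domain names are hypothetical, not prescribed by the tool.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DictionaryEntry:
    term: str
    definition: str
    owning_domain: str  # business domain that stewards the term

# Hypothetical entries; frozen dataclasses keep agreed definitions immutable.
data_dictionary = {
    e.term: e
    for e in [
        DictionaryEntry("customer",
                        "A party that has purchased at least one product.",
                        "Sales"),
        DictionaryEntry("active_customer",
                        "A customer with a purchase in the last 12 months.",
                        "Sales"),
    ]
}

print(data_dictionary["customer"].owning_domain)  # Sales
```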


Question 10.2
In your organization, do business domains agree on the terms and definitions
included in the data dictionary?
… of the business domains agree on the terms and definitions included in the data
dictionary.
1. None…
2. Few …
3. Some …
4. Most …
5. All …
Question 10.3
In your organization, do business domains agree on a common definition of the
term “Metadata”?
… of the business domains agree on a common definition of the term “Metadata”.
1. None …
2. Few …
3. Some …
4. Most …
5. All …
Question 10.4
In your organization, is a data warehouse used to store technical metadata?
A data warehouse is … (insert here) … used to store technical metadata.
1. … never …
2. … rarely …
3. … often…
4. … generally …
5. … always …
Question 10.5
Have your organizational needs to support technical, business and operational
metadata been identified and defined?
… organizational needs to support technical, business and operational metadata
have been identified and defined.
1. No …
2. Few …
3. Some …
4. Several …
5. All …
How do you evaluate your confidence level in answering the above questions
related to the Classification and metadata dimension?
1. Not at all confident
2. Slightly confident
3. Neutral
4. Somewhat confident
5. Very confident
If you have any comments or questions related to the Classification and metadata
dimension, please specify them: _________________________________________
___________________________________________________________________


Dimension #11 – Archiving Information Audits and Reporting


Archiving information audits and reporting refers to organizational processes
which monitor and measure the data value, risks and effectiveness of data
governance.
Question 11.1
In your organization, is there a strategy to reduce the number of non-regulatory
changes made to databases?
1. There is no strategy to reduce the number of non-regulatory changes made to
databases.
2. The development of a strategy to reduce the number of non-regulatory
changes made to databases has just started.
3. The development of a strategy to reduce the number of non-regulatory
changes made to databases is partly complete.
4. The development of a strategy to reduce the number of non-regulatory
changes made to databases is complete.
5. The strategy to reduce the number of non-regulatory changes made to
databases is complete and implemented.
Question 11.2
In your organization, are internal controls that certify reports for financial and
compliance purposes used?
(Insert here) … internal controls that certify reports for financial and compliance
purposes are used.
1. No …
2. Few …
3. Some …
4. Several …
5. All of the required …
Question 11.3
In your organization, how do you rate the quality of the internal controls used to
certify reports for financial and compliance purposes?
1. Poor
2. Fair
3. Good
4. Very good
5. Excellent
Question 11.4
In your organization, are systems for monitoring database changes made by users
used?
Systems for monitoring database changes made by users are…
1. … non-existent.
2. … rarely used.
3. … often used.
4. … generally used.
5. … always used.
Question 11.5
In your organization, are changes made to critical data audited?
Changes made to critical data are … (insert here) … audited.
1. … never …
2. … rarely …
3. … often …
4. … usually …
5. … always …
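Questions 11.4 and 11.5 concern monitoring and auditing changes made to critical data. A minimal sketch of a before/after audit trail is shown below; the in-memory store and field names are hypothetical (production systems would write the trail to durable, append-only storage).

```python
import datetime

class AuditedTable:
    """Wraps a record store and logs who changed which field,
    with before/after values and a UTC timestamp."""

    def __init__(self, records):
        self.records = records
        self.audit_trail = []

    def update(self, key, field, new_value, user):
        old_value = self.records[key].get(field)
        self.records[key][field] = new_value
        self.audit_trail.append({
            "when": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "user": user,
            "key": key,
            "field": field,
            "before": old_value,
            "after": new_value,
        })

table = AuditedTable({"acct-1": {"limit": 1000}})
table.update("acct-1", "limit", 5000, user="jdoe")
print(table.audit_trail[0]["before"], "->", table.audit_trail[0]["after"])  # 1000 -> 5000
```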


Question 11.6
In your organization, how do you rate the quality of the audit checks on changes
made to critical data?
1. Poor
2. Fair
3. Good
4. Very good
5. Excellent
How do you evaluate your confidence level in answering the above questions
related to the Archiving information audits and reporting dimension?
1. Not at all confident
2. Slightly confident
3. Neutral
4. Somewhat confident
5. Very confident
If you have any comments or questions related to the Archiving information
audits and reporting dimension, please specify them: _______________________
___________________________________________________________________
