DATA GOVERNANCE MATURITY ASSESSMENT TOOL: A DESIGN SCIENCE APPROACH
Keywords: data governance, data management, design science, maturity assessment tool
LITERATURE REVIEW²
Data governance is a growing trend in today’s business environment and it
has generated substantial interest from both practitioners and researchers
(Alhassan, Sammon, & Daly, 2016; Logan, Popkin, & Faria, 2016). It is based on
the idea that data is a valuable organizational asset which must be maintained
(Otto, 2011). Accordingly, the overarching goal of data governance is to ensure
“strong participation across the organization for critical decisions affecting
the data assets” (CMMI Institute, 2014, p. 43) and to guarantee effective oversight
of data management practices. Hence, data governance aims to specify, in the
form of guidelines and rules for data management, who within an organization
can make what decisions regarding the manipulation of data, and what tasks
these decisions entail. Data governance can thus be seen as the exercise of
decision-making and authority for everything related
2. A literature review on data governance tools and maturity models was conducted. Methodological
recommendations suggested by different researchers were followed (Rowe, 2014; Templier & Paré, 2015; Webster
& Watson, 2002). Keywords related to the field of study, namely data governance, governance framework, maturity
models, tools, assessment and evaluation, were used in combination. To identify both academic and professional
articles, several databases were consulted: JSTOR, ACM Digital Library, the ABI ProQuest aggregator and Google
Scholar. The inclusion criteria used were: governance framework and assessment tools. None of these searches
identified a scientific article presenting such a tool. For that reason, our focus then shifted to the tools developed
in the professional field.
the prerequisites for the expected use of the data, (3) Metadata to define the
semantics of the data, (4) Data Access to determine the prerequisites and
rules for accessing data, and (5) Data Lifecycle to establish the definition, pro-
duction, retention and archiving of data.
The Data Governance Institute (DGI) has also proposed a DGF, which is
articulated around six key data governance areas: (1) Policy, Standards and
Strategy (i.e., formal data policies supported by cross-functional data stew-
ards), (2) Data Quality (i.e., data quality criteria and monitoring systems),
(3) Privacy, Compliance and Security (i.e., privacy, compliance and security
data programs mandated by management), (4) Architecture and Integration
(i.e., data needs tied to architecture and integration challenges), (5) Data
Warehouses and Business Intelligence (i.e., data warehouses and BI pro-
grams), and (6) Management Support (i.e., data programs focusing on getting
managerial support) (Thomas, 2006).
Yet, to this day, the most comprehensive DGF is certainly the one proposed
by Soares (2010), which is articulated around 11 data governance competences
or categories:
1. Data Risk Management and Compliance. A method by
which risks are identified, qualified, quantified, avoided,
accepted, mitigated, or transferred out;
2. Value Creation. A process by which data assets are qual-
ified and quantified to enable the business to maximize
the value created by data assets;
3. Organizational Structures and Awareness. The level of
mutual responsibility between business and IT, and the
3. Amongst these eleven members of the industry, 3 were project managers, 1 was an IT security analyst, 4 were
business analysts, 1 was a marketing analyst and 3 were managers. They had, on average, thirteen years of work
experience.
for them, (2) what challenges they faced regarding data governance, (3) whether
they knew their organization’s strengths and weaknesses regarding its data
governance processes, policies, practices and structures, (4) whether they knew
which data governance initiatives they should prioritize and what each should
address, and (5) whether a DGM assessment tool could help them develop and
deploy a DGF tailored to their needs.
Amongst key findings from this step, transcripts from the exploratory
interviews indicate that all respondents agreed that data were now considered
strategic assets and thus that data governance had become a central organi-
zational preoccupation. Respondents also mentioned that several data gover-
nance rollout projects failed in the past. Specifically, respondents explained
that, although data governance projects were needed and important for their
organization, these projects failed because they were misaligned with the
organization’s needs and context and because business units had diverging
views with regard to the issues these projects should address as well as how
and when these projects should be conducted. In addition, respondents mentioned
that leaders of data governance projects had a hard time determining their
projects’ goals and contents and prioritizing their data governance initiatives,
since they did not know exactly what their organization’s strengths and
weaknesses in terms of data governance were. In other words, respondents
mentioned that leaders of data governance projects did not know their
organization’s level of data governance maturity, which, in turn, seriously
undermined their ability to identify the data governance processes, policies,
practices and structures to be developed and deployed.
To help them develop, deploy and improve the data governance processes,
policies, practices and structures of their organization, respondents men-
maturity. Specifically, our objective was threefold. First, our artifact should
help organizations know, before the realization of their data governance
initiatives, which data governance processes, policies, practices and/or structures
should be developed and prioritized. Second, our artifact should also help
organizations evaluate, after the implementation of data governance initiatives,
whether those initiatives have allowed them to evolve in terms of data governance
maturity. Third, our artifact should be aligned with existing data governance
maturity frameworks (CMMI Institute, 2014; Soares, 2010) and data governance
methodologies (Ladley, 2012; Soares, 2014). With these objectives in mind, we
elected to design a data governance maturity (DGM) assessment tool.
Once the questions and scales of our DGM assessment tool were completed,
we crystallized the tool in a Microsoft Word document that took the form of
a questionnaire. We also developed a Microsoft
Excel spreadsheet to compile and present the results for each respondent on
an intuitive dashboard. To do so, several Microsoft Excel formulas were cre-
ated to calculate a cumulative score for each of the 11 data governance dimen-
sions. Specifically, the cumulative score of each dimension was obtained by
averaging the score of all questions it comprised. Figure 1 provides a screen-
shot of the Microsoft Excel dashboard we developed.
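As a rough illustration of the scoring logic just described, the short sketch below averages the 1-5 scores of the questions within each dimension to obtain the cumulative dimension scores shown on the dashboard. The dimension names, question identifiers and sample answers are hypothetical; the published tool implements this logic with Microsoft Excel formulas rather than code.

```python
# Minimal sketch of the per-dimension scoring described above.
# Dimension names, question identifiers and answers are illustrative only;
# the actual tool computes these averages with Microsoft Excel formulas.
from collections import defaultdict
from statistics import mean

# Each entry: (dimension, question identifier, score on the 1-5 scale).
answers = [
    ("Data Risk Management and Compliance", "1.5", 3),
    ("Data Risk Management and Compliance", "1.6", 4),
    ("Value Creation", "2.3", 2),
    ("Value Creation", "2.4", 3),
    ("Value Creation", "2.5", 4),
]

scores_by_dimension = defaultdict(list)
for dimension, _question_id, score in answers:
    scores_by_dimension[dimension].append(score)

# Cumulative score of a dimension = average of the scores of its questions.
for dimension, scores in scores_by_dimension.items():
    print(f"{dimension}: {mean(scores):.2f} / 5")
```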
Step 4 – Demonstration
4. Amongst these five domain experts, 1 was an IT security analyst, 2 were IT managers, 1 was an IT consultant and
1 was an IT project manager. They had, on average, seventeen years of work experience.
Lastly, some respondents mentioned that they would have liked to get the
results of their efforts right after completing the questionnaire, without any
waiting time. There was a delay because, once the questionnaire was completed
by a respondent, the author responsible for the demonstration had to extract
the respondent’s answers from the Microsoft Word document, transfer them
into the Microsoft Excel spreadsheet, generate the DGM dashboard and then
send it back to the respondent. A future version of our DGM assessment
tool will overcome this technical limitation.
5. A description of each dimension is provided in appendix 1, as well as the detailed questions and the scales used
for each of them.
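Purely as an assumption on our part (the paper does not specify how the limitation described above will be addressed), one way a future version could remove the manual Word-to-Excel transfer is to collect responses in a machine-readable file and compute the dashboard scores directly, for example:

```python
# Hypothetical sketch of an automated scoring step; the file name and column
# names ("dimension", "score") are assumptions, not part of the published tool.
import csv
from collections import defaultdict
from statistics import mean

def dimension_scores(path: str) -> dict:
    """Read rows of (dimension, question, score) and average the scores per dimension."""
    scores = defaultdict(list)
    with open(path, newline="") as handle:
        for row in csv.DictReader(handle):
            scores[row["dimension"]].append(int(row["score"]))
    return {dimension: mean(values) for dimension, values in scores.items()}

# Example: dimension_scores("responses.csv") would return the dashboard values
# as soon as the questionnaire is completed, with no manual transfer step.
```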
In a DSR project, knowing when to stop generating subsequent iterations of
the artifact during the demonstration step is a difficult and somewhat subjective
decision, as enhancements could be carried out indefinitely. In the development
of our DGM assessment tool, the decision to stop this iterative process was
taken after five demonstrations, since we felt we had reached theoretical
saturation at that point.
Step 5 – Evaluation
During step 5, we asked another set of five experts6 to assess the data gov-
ernance maturity of their organization using our tool. Contrary to the previ-
ous step which focused on improving the tool through various iterations, our
objective here was to evaluate the tool’s effectiveness and contribution. The
evaluation of each of the five respondents was completed in the presence of
one of the authors to observe how the tool was used, provide direct assistance
and collect immediate feedback. Once the respondent had completed
the questionnaire, a semi-structured interview (Patton, 2002) was conducted
to evaluate the quality and clarity of the instructions and questions as well as
the relevance, ease of use, usability, effectiveness and added value of the tool.
During the evaluation step, the domain experts highlighted four key benefits
of the DGM assessment tool. First, all of the experts agreed that the tool
cannot, and should not, be used by only one individual or a limited group
of experts. Indeed, it is virtually impossible for one person in a large organi-
6. Amongst these five domain experts, 1 was an IT manager, 1 was an IT consultant, 1 was a university professor and
2 were senior business analysts. They had, on average, fifteen years of work experience.
DISCUSSION
Nowadays, data have become a central organizational asset and are often
considered as important as, or even more important than, financial and human
resources (Bean, 2018). Yet, for data to be considered a strategic asset, they must
be valid, precise and available in a timely manner (Fleckenstein & Fellows,
2018; Gregory, 2011; Redman, 2013). To this end, an organization must develop
and deploy a data governance framework (DGF) that orchestrates the people,
processes and technologies needed to optimize the collection, storage, use
and dissemination of data as organizational assets (DAMA International, 2014;
Ladley, 2012; Soares, 2010, 2012).
For most organizations, and especially larger ones, implementing a
DGF may be challenging since they do not always know the level of matu-
rity of their data management practices. Accordingly, managers responsible
for data governance initiatives in such organizations do not always know
what their DGF should address, or how and when it should be implemented
(Fleckenstein & Fellows, 2018). To overcome these issues, different methods
and approaches for developing and deploying a DGF have been proposed
(e.g., DAMA International, 2009; Ladley, 2012; Soares, 2010; Thomas, 2006).
However, even though these methods and approaches can guide organizational
managers in the development and implementation of a DGF, they do not provide
any tool to support managers in the evaluation of their organization’s level of
data governance maturity. As such, it is still difficult
for organizations to evaluate their own data governance processes, prac-
tices, policies and structures and determine where they stand in terms of data
associated with this type of initiative. Also, our DGM assessment tool, by being
easy to use and reusable, may facilitate communication between technical
and business people, sensitize all employees to the aspects and challenges
tied to data governance, and serve as a learning tool in this regard. Furthermore,
it could be used to evaluate technological risks (Flyvbjerg & Budzier, 2011)
and/or serve as a tool for internal auditing practices (Gramling, Maletta,
Schneider, & Church, 2004). Lastly, by including all key dimensions of
well-established DGFs, our DGM assessment tool could also be used by small
or large organizations from the public or private sectors to build trust with
present or future business partners who understand the value of good data.
This study is not without limitations and these should be taken into con-
sideration when using our DGM assessment tool in practice or in future
studies. First, and foremost, changing the DGM assessment tool’s ques-
tions and related scales as recommended by the experts consulted during
the demonstration phase, somewhat changed the nature of our intended arti-
fact. Specifically, it undermines our tool capacity to position an organiza-
tion at a precise level of data governance maturity. Indeed, maturity levels,
as described in the DMM model, for example, represent discrete categories
or steps where data governance is performed in markedly different man-
ners. Hence, by moving away from our original approach, our tool no longer
distinguishes between these key differences that characterize each level of
maturity. However, our tool still allows organizations to (1) know, before the
realization of their data governance initiatives, which data governance pro-
cesses, policies, practices and/or structure should be developed and prior-
itized as well as (2) evaluate, after the implementation of data governance
initiatives, if those initiatives have allowed the organization to evolve in terms
CONCLUSION
At the beginning of this paper we set out to design an artifact that would help
organizations assess their own level of data governance maturity. Specifically,
our objective regarding this artifact was threefold. First, our artifact needed to
help organizations know, before the realization of their data governance
initiatives, which data governance processes, policies, practices and/or
structures should be developed and prioritized. Second, our artifact needed to
help organizations evaluate, after the implementation of their data governance
initiatives, whether those initiatives allowed them to evolve in terms of data
governance maturity. Third, our artifact needed to be aligned with existing data
governance maturity frameworks (CMMI Institute, 2014; Soares, 2010) and
data governance methodologies (Ladley, 2012; Soares, 2014). To do so, we
followed a DSR approach and the six-step methodology proposed by Peffers
et al. (2007). We anchored our work on Soares’ (2010) comprehensive DGF
and the concept of maturity as defined by the CMMI Institute. Also, several
domain experts (e.g., IT security analysts, information resources experts,
business analysts, IT managers, etc.) were consulted and asked to provide key
suggestions, comments and feedback to help us design our artifact. In the end,
our research effort allowed us to develop and test a data governance matu-
rity (DGM) assessment tool that includes 11 dimensions and 72 questions. We
are confident that this artifact will allow organizations to assess their level of
maturity in terms of data governance and, in turn, to better define and prior-
itize the goals, content and activities of their data governance initiatives. As
such, we hope that our artifact in the form of a DGM assessment tool will help
organizations to implement a DGF that is tailored to their respective needs
REFERENCES
Alhassan, I., Sammon, D. & Daly, M. (2016). Data governance activities: An analysis of the
literature. Journal of Decision Systems, 25(sup1), 64-75.
Alhassan, I., Sammon, D. & Daly, M. (2018). Data governance activities: A comparison between
scientific and practice-oriented literature. Journal of Enterprise Information Management, 31(2),
300-316.
Bean, R. (2018). How Big Data and AI Are Driving Business Innovation in 2018. Retrieved from https://
sloanreview.mit.edu/article/how-big-data-and-ai-are-driving-business-innovation-in-2018/
Begg, C. & Caira, T. (2012). Exploring the SME quandary: Data governance in practise in the small to
medium-sized enterprise sector. The Electronic Journal Information Systems Evaluation, 15(1), 3-13.
CMMI Institute. (2014). Data Management Maturity (DMM) Model. CMMI Institute.
Dallemule, L. & Davenport, T. H. (2017). What’s your data strategy? Harvard Business Review,
95(3), 112-121.
DAMA International. (2009). The DAMA Guide to the Data Management Body of Knowledge. Technics Publications.
https://www.dama.org/content/body-knowledge
DAMA International. (2014). DAMA-DMBOK2 Framework. https://www.dama.org/sites/default/
files/download/DAMA-DMBOK2-Framework-V2-20140317-FINAL.pdf.
Davenport, T. H. (2013). Analytics 3.0. Harvard Business Review, 91(12), 64-72.
Fleckenstein, M. & Fellows, L. (2018). Data Governance. In Modern Data Strategy (pp. 63-76).
Springer.
Flyvbjerg, B. & Budzier, A. (2011). Why your IT project may be riskier than you think. Harvard
Business Review, 89(9), 23-25.
Gramling, A. A., Maletta, M. J., Schneider, A. & Church, B. K. (2004). The role of the internal
audit function in corporate governance: A synthesis of the extant internal auditing literature and
directions for future research. Journal of Accounting Literature, 23, 194-244.
Gregor, S. & Hevner, A. R. (2013). Positioning and presenting design science research for
maximum impact. MIS Quarterly, 37(2), 337-356.
Gregory, A. (2011). Data governance—Protecting and unleashing the value of your customer data
assets. Journal of Direct, Data and Digital Marketing Practice, 12(3), 230-248.
Hevner, A. R., March, S. T., Park, J. & Ram, S. (2004). Design science in information systems
research. MIS Quarterly, 28(1), 75-105.
Iivari, J. (2015). Distinguishing and contrasting two strategies for design science research.
European Journal of Information Systems, 24, 107-115.
Khatri, V. & Brown, C. V. (2010). Designing data governance. Communications of the ACM, 53(1),
148-152.
Ladley, J. (2012). Data Governance—How to Effectively Design, Deploy, and Sustain an Effective
Data Governance Program. Burlington, MA: Morgan Kaufmann.
Lam, V. (2011). Seven Steps to Effective Data Governance. New York: Information Builders.
https://www.whitepapers.em360tech.com/wp-content/files_mf/white_paper/wp_iway_7steps.pdf
Lee, Y., Madnick, S. E., Wang, R. Y., Wang, F. & Zhang, H. (2014). A cubic framework for the chief
data officer: Succeeding in a world of big data. MIS Quarterly Executive, 13(1), 1-13.
Logan, D., Popkin, J. & Faria, M. (2016). First Gartner CDO Survey: Governance and Analytics Will
Tallon, P. P., Ramirez, R. V. & Short, J. E. (2013). The information artifact in IT governance: Toward
a theory of information governance. Journal of Management Information Systems, 30(3), 141-178.
Templier, M. & Paré, G. (2015). A framework for guiding and evaluating literature reviews.
Communication of the AIS, 37, 113-137.
Thomas, G. (2006). The DGI Data Governance Framework. Orlando, FL: The Data Governance
Institute.
Webster, J. & Watson, R. T. (2002). Analyzing the past to prepare for the future: Writing a literature
review. MIS Quarterly, 26(2), xiii-xxiii.
Question 1.5
In your organization, are metrics used to monitor the performance of the DGF in
relation to risk management?
(Insert here) … metrics are used to monitor the performance of the DGF in relation
to risk management.
1. No …
2. Few …
3. Some …
4. Several …
5. All of the required …
Question 1.6
In your organization, how do you rate the quality of the metrics used to monitor
the performance of the DGF in relation to risk management?
1. Poor
2. Fair
3. Good
4. Very good
5. Excellent
How do you evaluate your confidence level in answering the above questions
related to the Data Risk Management and Compliance dimension?
1. Not at all confident
2. Slightly confident
3. Neutral
4. Somewhat confident
5. Very confident
If you have any comments or questions related to the Data Risk Management and
Compliance dimension, please specify them: ______________________________
___________________________________________________________________
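As a purely illustrative aside, a single item of this kind could be encoded as follows; the mapping of option labels to the scores 1-5 mirrors the option numbering above but is our assumption, not part of the published instrument.

```python
# Hypothetical encoding of one questionnaire item (modelled on Question 1.5).
# The label-to-score mapping follows the option order shown above; it is an
# assumption for illustration, not taken from the published tool.
RISK_METRICS_OPTIONS = {
    "No": 1,
    "Few": 2,
    "Some": 3,
    "Several": 4,
    "All of the required": 5,
}

def score_option(label: str) -> int:
    """Return the 1-5 score associated with the selected option label."""
    return RISK_METRICS_OPTIONS[label]

# A respondent selecting "Some" contributes a score of 3 to the
# Data Risk Management and Compliance dimension average.
print(score_option("Some"))
```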
Question 2.3
In your organization, are the stakeholders engaged towards the benefits of data
governance?
Stakeholder engagement towards the benefits of data governance …
1. … does not exist.
2. … is very limited.
3. … is partial.
4. … is almost complete.
5. … is complete.
Question 2.4
In your organization, are business cases used to justify specific data governance
initiatives?
Business cases are… (insert here) … used to justify specific data governance
initiatives.
1. … not …
2. … rarely …
3. … sometimes …
4. … often …
5. … always …
Question 2.5
In your organization, are metrics used to monitor the performance of the DGF?
(Insert here) … metrics are used to monitor the performance of the DGF.
1. No …
2. Few …
3. Some …
4. Several …
5. All of the required …
Question 2.6
How do you evaluate your confidence level in answering the above questions
related to the Organizational structures and awareness dimension?
1. Not at all confident
2. Slightly confident
3. Neutral
4. Somewhat confident
5. Very confident
If you have any comments or questions related to the Organizational structures
and awareness dimension, please specify them: ___________________________
___________________________________________________________________
Question 4.4
In your organization, is the DGF considered when strengthening organizational
practices and policies?
When strengthening organizational practices and policies, the DGF is … (insert
here) … considered.
1. … never …
2. … sometimes …
3. … often …
4. … usually …
5. … always …
How do you evaluate your confidence level in answering the above questions
related to the Data policies and rules dimension?
1. Not at all confident
2. Slightly confident
3. Neutral
4. Somewhat confident
5. Very confident
If you have any comments or questions related to the Data policies and rules
dimension, please specify them: _________________________________________
___________________________________________________________________
Question 5.4
In your organization, are data stewards responsible for defining the attributes of
the data under their area of responsibility?
Data stewards are … (insert here) … responsible for defining the attributes of the
data under their area of responsibility.
1. … not …
2. … very little …
3. … partly …
4. … mostly …
5. … completely …
Question 5.5
In your organization, are data stewards responsible for creating and/or monitoring
the metrics tied to the data under their area of responsibility?
Data stewards are … (insert here) … responsible for creating and/or monitoring
the metrics tied to the data under their area of responsibility
1. … not …
2. … very little …
3. … partly …
4. … mostly …
5. … completely …
Question 5.6
In your organization, do the business domains recognize the importance of the
data stewardship program?
The stewardship program is recognized and respected by … (insert here) …
business domains.
1. … no …
2. … few …
3. … some …
Question 6.5
In your organization, how often are data quality metrics used by the technical
domains?
Data quality metrics are … (insert here) … used by the technical domains.
1. … never …
2. … rarely …
3. … occasionally …
4. … often …
5. … always …
Question 6.6
In your organization, how often are data quality metrics used by the business
domains?
Data quality metrics are … (insert here) … used by the business domains.
1. … never …
2. … rarely …
3. … occasionally …
4. … often …
5. … always …
Question 6.7
In your organization, is there a process to take corrective action and/or improve
data quality metrics based on feedback from the data governance board?
1. There is no process to take corrective action and/or improve data quality
metrics based on feedback from the data governance board
2. The development of the process to take corrective action and/or improve data
quality metrics based on feedback from the data governance board has just
started.
3. The development of the process to take corrective action and/or improve data
Question 7.6
In your organization, how do you rate the quality of the metrics used to evaluate
the collection, use, storage and deletion of data in unstructured format?
1. Poor
2. Fair
3. Good
4. Very good
5. Excellent
Question 7.7
In your organization, is there a policy on the types of documents that can be
scanned?
1. There is no policy on the types of documents that can be scanned.
2. The development of the policy on the types of documents that can be scanned
has just started.
3. The development of the policy on the types of documents that can be scanned
is partly complete.
4. The development of the policy on the types of documents that can be scanned
is complete.
5. The policy on the types of documents that can be scanned is complete and
deployed.
Question 7.8
In your organization, is there a policy on archiving electronic information from all
data sources?
1. There is no policy on archiving electronic information from all data sources.
2. The development of the policy on archiving electronic information from all
data sources has just started.
3. The development of the policy on archiving electronic information from all
Question 7.10
In your organization, is there a policy on archiving electronic information in
unstructured format?
1. There is no policy on archiving electronic information in unstructured format.
2. The development of the policy on archiving electronic information in
unstructured format has just started.
3. The development of the policy on archiving electronic information in
unstructured format is partly complete.
4. The development of the policy on archiving electronic information in
unstructured format is complete.
5. The policy on archiving electronic information in unstructured format is
complete and deployed.
Question 7.11
In your organization, is there a policy on archiving documents based on value
creation vectors (e.g., improved system performance and reduced archiving
costs)?
1. There is no policy on archiving documents based on value creation vectors.
2. The development of the policy on archiving documents based on value
creation vectors has just started.
3. The development of the policy on archiving documents based on value
creation vectors is partly complete.
4. The development of the policy on archiving documents based on value
creation vectors is complete.
5. The policy on archiving documents based on value creation vectors is
complete and deployed.
Question 7.12
In your organization, are the content management systems automated?
Question 8.5
Does your organization fail confidentiality audits?
Your organization … (insert here) … fails confidentiality audits.
1. … very often …
2. … quite often …
3. … occasionally …
4. … rarely …
5. … never …
Question 8.6
In your organization, is sensitive data encrypted?
Sensitive data is … (insert here) … encrypted.
1. … never …
2. … rarely …
3. … often …
4. … generally …
5. … always …
Question 8.7
In your organization, is unencrypted sensitive data used to develop or test systems?
Unencrypted sensitive data is … (insert here) … used to develop or test systems.
1. … always …
2. … very often …
3. … occasionally …
4. … rarely …
5. … never …
Question 8.8
In your organization, do administrators, subcontractors or third parties have
access to unencrypted sensitive data?
Administrators, subcontractors or third parties … (insert here) … have access to
unencrypted sensitive data.
Question 9.5
Have your organizational needs to optimize the architecture of the unstructured
data systems been identified and defined?
1. No organizational needs to optimize the architecture of the unstructured data
systems have been identified and defined.
2. The identification and definition of the organizational needs to optimize the
architecture of the unstructured data systems has just started.
3. The identification and definition of the organizational needs to optimize the
architecture of the unstructured data systems is partly complete.
4. The identification and definition of the organizational needs to optimize the
architecture of the unstructured data systems is complete.
5. The identification and definition of the organizational needs to optimize the
architecture of the unstructured data systems is complete and deployed.
Question 9.6
In your organization, do specific data domains (e.g., customers, providers, and
products) have their own system of record (i.e., a single system for backing up and
retrieving information to ensure data integrity and traceability of changes)?
… specific data domains have their own system of record.
1. No …
2. Few…
3. Some …
4. Several …
5. All …
How do you evaluate your confidence level in answering the above questions
related to the Data architecture dimension?
Question 10.2
In your organization, do business domains agree on the terms and definitions
included in the data dictionary?
… of the business domains agree on the terms and definitions included in the data
dictionary.
1. None…
2. Few …
3. Some …
4. Most …
5. All …
Question 10.3
In your organization, do business domains agree on a common definition of the
term “Metadata”?
… of the business domains agree on a common definition of the term “Metadata”.
1. None …
2. Few …
3. Some …
4. Most …
5. All …
Question 10.4
In your organization, is a data warehouse used to store technical metadata?
A data warehouse is … (insert here) … used to store technical metadata.
1. … never …
2. … rarely …
3. … often …
4. … generally …
5. … always …
Question 10.5
Question 11.6
In your organization, how do you rate the quality of the critical data changes audit
checks?
1. Poor
2. Fair
3. Good
4. Very good
5. Excellent
How do you evaluate your confidence level in answering the above questions
related to the Archiving information audits and reporting dimension?
1. Not at all confident
2. Slightly confident
3. Neutral
4. Somewhat confident
5. Very confident
If you have any comments or questions related to the Archiving information
audits and reporting dimension, please specify them: _______________________
___________________________________________________________________