

Information Systems Management, 31:276–295, 2014
Copyright © Taylor & Francis Group, LLC
ISSN: 1058-0530 print / 1934-8703 online
DOI: 10.1080/10580530.2014.958023

A Critical Success Factor Framework for Information Quality Management
Saša Baškarada and Andy Koronios
School of Information Technology and Mathematical Sciences, University of South Australia, Mawson Lakes, Australia

Effective Information Quality Management (IQM) is still relatively poorly understood. As such, there is a gap in the literature regarding Critical Success Factors (CSFs) for IQM. This article presents 11 CSFs for effective IQM as identified through seven case studies and one ethnographic study. The CSF framework for IQM presented in this article adds to extant IS theory, and may aid practitioners in the development of more effective IQM strategies.

Keywords: critical success factors; information quality management; data quality; information quality; case study; ethnography; interpretivism; constant comparative analysis; grounded theory approach

INTRODUCTION

Many organizations are now starting to assess and improve the quality of their information. Most organizations depend on quality information for everyday business operations (Al-Hakim, 2008; Gelle & Karhu, 2003; Hwang & Cappel, 2002; Shchiglik & Barnes, 2004; Sherer & Alter, 2004; Xu & Koronios, 2004/2005), and some are even starting to recognize its potential contributions toward achieving a strategic competitive advantage (Friedman, 2008). As a result, interest in Information Quality Management (IQM) research and practice is growing globally (Friedman & Bitterer, 2009; Madnick, Wang, Lee, & Zhu, 2009). Studies have highlighted the importance of Information Quality (IQ) to a wide range of domains, including Enterprise Resource Planning (ERP; Haug, Arlbjørn, & Pedersen, 2009; Vosburg & Kumar, 2001; Xu, Nord, Brown, & Nord, 2002; Xu, Nord, Nord, & Lin, 2003), Business Intelligence (BI; Payton & Handfield, 2003; Watson, 2009; Watson, Wixom, Buonamici, & Revak, 2001), Supply Chain Management (SCM; Ivert & Jonsson, 2010; Moon & Ngai, 2008; Sari, 2008; Saura, Frances, Contrí, & Blasco, 2008), Data Warehousing (DW; Ma, Chou, & Yen, 2000), advanced data mining/analytics (Baškarada, 2011; Bose, 2009; Brown & Kros, 2003; Nemati & Barko, 2003; Solomon, 2005), Health Information Systems (HIS; Civan & Pratt, 2007; Poston, Reynolds, & Gillenson, 2006), and product data management (Silvola, 2009).

Awareness of IQ issues is growing globally and across all industries. In a survey of more than 140 companies in various industries and geographic regions, Gartner asked respondents to estimate the impact of poor IQ on their organizations (Friedman, 2009). The survey found that poor IQ has a negative financial impact on most organizations, with average estimated annual losses of $8.2 million. Some organizations even indicated annual losses as high as $100 million. Furthermore, IQ is also a growing area of research, as evidenced by the launch of the International Journal of Information Quality (Inderscience) and the ACM Journal of Data and Information Quality (ACM JDIQ), and by the introduction of IQ tracks in several key IS conferences.

Even though quality is a subjective concept, several researchers and practitioners have derived a range of IQ dimensions from theory (Price & Shanks, 2004; Wand & Wang, 1996), identified them through empirical studies (Price, Neiger, & Shanks, 2008; Wand & Wang, 1996), or have deductively proposed them based on a priori judgments (Redman, 1996). However, there has only been limited research on Critical Success Factors (CSFs) for IQM. Through identification of relevant CSFs, this study aims to contribute to IS theory and practice by providing detailed guidance on how organizations may develop more effective IQM strategies. Given that the quality of unstructured information is still an under-researched area, it is considered out of scope in this study. As such, this article focuses on effective IQM of structured data sets. In brief, this article aims to answer the following research questions: (1) What are the CSFs for effective information quality management? and (2) How do they relate to each other?

Address correspondence to Saša Baškarada, School of Information Technology and Mathematical Sciences, Building D, Mawson Lakes Campus, Mawson Lakes Boulevard, Mawson Lakes SA 5095, Australia. E-mail: baskarada@gmail.com

Color versions of one or more of the figures in the article can be found online at www.tandfonline.com/uism.

LITERATURE REVIEW

Total Quality Management

The literature provides several definitions of quality (Hardie & Walsh, 1994), including "the capacity to satisfy wants"

(Edwards, 1968, p. 37). While Shewhart (1931) and Crosby (1979) argued that quality is represented by a difference between the preferred state and the actual state (thus, mainly defining quality as conformance to requirements), proposed dimensions of product quality nowadays include: Performance, features, reliability, conformance, durability, serviceability, aesthetics, and perceived quality (Garvin, 1987). On the other hand, Feigenbaum (1986) and Juran (1974) defined quality in terms of customer satisfaction, where customer expectations can be seen as the preferred state, and the concrete product or service received as the actual state. Nevertheless, it has been argued that the philosophies of Deming (1982), Juran (1974), and Crosby (1979) "provide fundamental principles on which total quality is based" (Motwani, 2001, p. 293). As such, Total Quality Management (TQM) has been defined as "a set of systematic activities carried out by the entire organization to effectively and efficiently achieve company objectives so as to provide products and services with a level of quality that satisfies customers, at the appropriate time and price" (JUSE, 2007, p. 2). TQM is one of the most widely applied quality management approaches, and it is mainly focused on customer satisfaction, participatory management, and results-orientation (Milakovich, 2004). While Dean and Bowen (1994) identified customer focus and continuous improvement as the main principles of TQM, others have argued that process improvement is usually at the heart of any TQM initiative (Hackman & Wageman, 1995).

Information Quality

Despite the fact that the importance of quality assurance has been well understood in the software engineering community for decades (Chirinos, Losavio, & Matteo, 2004; Paulk, Curtis, Chrissis, & Weber, 1993; Phan, 2001), IQM is still a relatively immature domain. Also, while it has been argued that data only has meaning when it is put into context, thus becoming information (Lillrank, 2003), many researchers and practitioners use the terms Data Quality (DQ) and Information Quality (IQ) interchangeably (Baškarada & Koronios, 2013; Madnick et al., 2009). Wang and Strong (1996) adopted the "fitness for use" (p. 6) definition for IQ and identified 15 relevant dimensions from the user's perspective (Table 1). Using a two-stage survey, they developed a framework comprising four categories and 15 dimensions that have now become generally accepted in the literature (Madnick et al., 2009). Additionally, other researchers have developed a range of IQ metrics (Pipino, Lee, & Wang, 2002; Pipino, Wang, Kopcso, & Rybolt, 2005; Redman, 1996) and assessment instruments (Baškarada, 2010; Lee, Strong, Kahn, & Wang, 2002; Price et al., 2008), which can be used to assess different aspects of IQ. Wang (1998) drew an analogy between traditional manufacturing of tangible products and the manufacturing of Information Products (IPs). As such, he argued that the quality of IPs may be directly affected by the quality of processes employed in the IS—a view also held by Ballou and Pazer (1985). In that view, data may become corrupted during collection, analysis, storage, integration, and retrieval (Dasu & Johnson, 2003). There are, however, some fundamental differences between IPs and physical products (Ballou, Wang, Pazer, & Tayi, 1998). Moreover, ensuring the quality of information is much more difficult than is the case with manufactured products (Tayi & Ballou, 1998). For instance, it is very difficult to measure information and its quality, given that information has no physical properties to measure (Ballou et al., 1998; Redman, 1995). Additionally, information can be consumed indefinitely, without being depleted (Paradice & Fuerst, 1991; Wang, 1998). Furthermore, many IQ dimensions would not make sense if applied to physical products (Wang, 1998). According to Redman (1995), most useful information is novel or unique. Thus, the uses of such novel information may only be partially known (Tayi & Ballou, 1998).

Critical Success Factors

The concept of CSFs, initially proposed in the early 1960s (Daniel, 1961), has been defined as "the areas in which good performance is necessary [and sufficient] to ensure attainment of goals" (Rockart, 1979). As such, CSFs should not be confused with performance indicators (Freund, 1988). From a Project Management (PM) perspective, CSFs have been described as "characteristics, conditions, or variables that can have a significant impact on the success of the project when properly sustained, maintained, or managed" (Milosevic & Patanakul, 2005, p. 183). From an organizational perspective, CSFs can be derived for a number of hierarchical levels, including industry, company/organization, sub-organization/department, individual manager, and so on (Bullen & Rockart, 1981). In the context of ISs, CSFs are frequently incorporated in studies on IS planning (Boynton & Zmud, 1984; Peffers, Gengler, & Tuunanen, 2003). However, despite the apparent simplicity of the concept, "[t]here is no universal procedure for CSF data collection and analysis" (Bergeron & Begin, 1989, cited in Peffers et al., 2003). As such, relevant studies have used a variety of research methods, including case studies/interviews (Akhavan, Jafari, & Fathian, 2006; Boynton & Zmud, 1984; Bullen & Rockart, 1981; Lu, Huang, & Heng, 2006), surveys (Sabherwal & Kirs, 1994), and a combination of the two approaches (Guynes & Vanecek, 1996).

The most important PM CSFs include a project receiving support from senior management, having clear and realistic objectives, producing an efficient plan (Fortune & White, 2006), identifying key stakeholders, and understanding their requirements and expectations (Aaltonen, 2011; Jepsen & Eskerod, 2009; Pemsel & Widén, 2010). Learning across projects has also been identified as critical to continuous improvement of PM practices (Lewis, 2000); this not only includes learning from past mistakes, but also learning from successful projects that can serve as future role models (Cao & Hoffman, 2011).

Numerous researchers have, over the years, proposed a wide range of CSFs for TQM (Ahire, Golhar, & Waller, 1996; Anderson, Rungtusanthanam, Schroeder, & Devaraj, 1995; Black & Porter, 1996; Blackburn & Rosen, 1993; Bowen & Lawler, 1992; Camp, 1989; Davenport, 1993; Flynn, Schroeder, & Sakakibara, 1994; Lawler, 1994; Mann & Kehoe, 1994; Oakland, 1993; Oliver, 1988; Powell, 1995; Saraph, Benson, & Schroeder, 1989; Zeitz, Johannesson, & Ritchie, 1997). Motwani (2001) conducted a comparative analysis of six of those studies (Ahire et al., 1996; Black & Porter, 1996; Flynn et al., 1994; Powell, 1995; Saraph et al., 1989; Zeitz et al., 1997), synthesizing similar constructs to arrive at seven factors (see Table 2). Ten factors for effective TQM identified in Juran's Quality Handbook (Juran & De Feo, 2010) include:

TABLE 1
Information Quality Dimensions (Wang & Strong, 1996)

Believability: Data are accepted or regarded as true, real, and credible.
Accuracy: Data are correct, reliable, and certified free of error.
Objectivity: Data are unbiased (unprejudiced) and impartial.
Reputation: Data are trusted or highly regarded in terms of their source or content.
Value-Added: Data are beneficial and provide advantages from their use.
Relevancy: Data are applicable and helpful for the task at hand.
Timeliness: The age of the data is appropriate for the task at hand.
Completeness: Data are of sufficient breadth, depth, and scope for the task at hand.
Amount: The quantity or volume of available data is appropriate.
Interpretability: Data are in appropriate language and units and the data definitions are clear.
Understandability: Data are clear without ambiguity and easily comprehended.
Consistency: Data are always presented in the same format and are compatible with previous data.
Conciseness: Data are compactly represented without being overwhelming.
Accessibility: Data are available or easily and quickly retrievable.
Access Security: Access to data can be restricted and hence kept secure.
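Several of these dimensions can be operationalized as simple column-level measurements over a structured data set. The following is a minimal sketch only; the record set, field names, and date-format rule are hypothetical, and real IQ assessments are task-dependent:

```python
# Toy profiling sketch: two of the Wang & Strong (1996) dimensions
# (completeness, consistency) operationalized as column-level ratios.
# The records, field names, and format rule below are invented examples.
import re

records = [
    {"well_id": "W-001", "daily_bbl": 120.5, "date": "2014-01-01"},
    {"well_id": "W-002", "daily_bbl": None,  "date": "2014-01-01"},
    {"well_id": "W-003", "daily_bbl": 98.0,  "date": "01/02/2014"},
]

def completeness(rows, field):
    """Share of records with a non-null value for `field`."""
    return sum(r[field] is not None for r in rows) / len(rows)

def consistency(rows, field, pattern):
    """Share of non-null values matching an agreed representation."""
    values = [r[field] for r in rows if r[field] is not None]
    return sum(bool(re.fullmatch(pattern, v)) for v in values) / len(values)

completeness(records, "daily_bbl")                  # 2 of 3 records populated
consistency(records, "date", r"\d{4}-\d{2}-\d{2}")  # 2 of 3 dates in ISO form
```

Note that such ratios only approximate the corresponding dimensions; dimensions such as believability or relevancy are inherently judgment-based and resist this kind of mechanical measurement.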

TABLE 2
Critical Success Factors for Total Quality Management (Motwani, 2001, p. 297)

Top management commitment: allocate budget and resources; control through visibility; monitor progress; planning for change.
Quality measurement and benchmarking: zero-defects conformance; use SPC for process control; cost of quality; proportion of defects; percentage of products needing rework; defective rate relative to competitors.
Process management: unit cost; production goals; reduce material handling; design for manufacturability; reduce cycle time; reduce setup time; productivity.
Product design: number of new products introduced; time taken from design to first sale; fitness of use; design quality.
Employee training and empowerment: training resources; training management; cross-training employees; training/retraining budget.
Vendor quality management: reduce inventory; supplier relations; number of suppliers; inventory turnover; inventory accuracy; implement kanban; material cost; material availability.
Customer involvement and satisfaction: delivery dependability; operators involved/value-added labor; customer service training budget; prompt handling of complaints; number or percent of complaints; number or percent of orders that are delivered late; broad distribution channels; number of contacts with customers; consumer surveys; time to respond to questions/complaints; responsive repairs; percentage of repeat business.
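Many of the performance measures in Table 2 reduce to simple conformance ratios (proportion of defects, percentage of rework, percentage of late orders). A minimal illustration, with invented counts:

```python
# Toy conformance ratios for a few Table 2 measures; all counts are invented.
def proportion(count, total):
    """Share of items exhibiting a property (defects, rework, lateness)."""
    return count / total if total else 0.0

inspected, defective, reworked = 500, 12, 30
orders, late_orders = 800, 40

defect_rate = proportion(defective, inspected)  # 12/500 = 0.024
rework_rate = proportion(reworked, inspected)   # 30/500 = 0.06
late_rate = proportion(late_orders, orders)     # 40/800 = 0.05
```

Tracked over time, such ratios support the benchmarking and continuous-improvement measures the table lists.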

Customers and quality have top priority; create a performance excellence system; do strategic planning for quality; benchmark best practices; engage in continuous innovation and process improvement; offer training in managing for quality, the methods and tools; create an organization-wide assurance focus; project by project, create multifunctional teams; empower employees; and build an adaptable and sustainable organization (pp. 37–38).

Based on a literature review of nine sources, Xu and Al-Hakim (2005) identified 25 factors affecting DQ of Accounting Information Systems (AIS). They also conducted a large-scale survey, finding that the perceived performance was consistently lower than the expected importance of all factors. The 10 most important factors identified include: Input controls, nature of the AIS, top management commitment, middle management commitment, user focus, personal competency, internal controls, teamwork (communication), management of changes, and employee relations. Some of the other domains to which the CSF construct has been applied include: Service Oriented Architectures (SOA; Lee, Shim, & Kim, 2010), Business Performance Management (BPM; Ariyachandra & Frolick, 2008), Information Technology Infrastructure Library (ITIL) implementation (Pollard & Cater-Steel, 2009), and system integration (Mendoza, Pérez, & Grimán, 2006).

METHODOLOGY

The aim of this study was to inductively build theory by answering broad research questions rather than to deductively propose and test narrowly specified hypotheses. The research design was aimed at achieving a suitable combination of theory and practice, so that the research results could eventually be used by practitioners (Walsham, 1993). As such, the qualitative interpretive research paradigm was adopted in this study.

Qualitative research, an ongoing process of interpretation and analysis, has been defined as "any kind of research that produces findings not arrived at by means of statistical procedures or other means of quantification" (Strauss & Corbin, 1994, p. 17). On the other hand, interpretive research seeks to explain phenomena in terms of the meanings they hold for people (Orlikowski & Baroudi, 1991) by starting from the proposition that our knowledge of the social reality is a construction by human actors (Walsham, 1993).

A total of 49 in-depth, semi-structured interviews were conducted in seven large organizations (see Table 3). Based on recommendations from the literature (Stake, 1995; Yin, 2008), a predefined case study protocol was followed; however, the researchers were able to refocus their questions, or prompt for more information, based on the answers provided by the interviewees. Each subject matter expert was interviewed face-to-face, and each interview generally lasted between 30 and 60 minutes. The subject matter experts included representatives from operational, tactical, and strategic levels, who represented various roles within the case-study organizations. Most interviews were recorded (a small number of interviewees did not agree to being recorded) and subsequently transcribed. In addition to the interviews conducted, a wide range of documents provided by the case-study participants were also examined. These included business process documentation, use-case descriptions, policies, procedures, architectural frameworks, system documentation, and the like.

Theoretical sampling (as opposed to statistical sampling) was used to guide the data collection. Theoretical sampling originated with the development of grounded theory (Glaser & Strauss, 1967), and its goal is to gain a deeper understanding of the cases and to facilitate the development of theories. It "implies that the researcher decides what data will be gathered next and where to find them on the basis of provisionary theoretical ideas" (Boeije, 2002, p. 393). This approach was preferred, as it allowed the researchers to explore issues that arose from the analysis of, and reflection on, previous data. Accordingly, the theory was continuously modified as a consequence of the ongoing research. The scope of data collection was determined by theoretical saturation—the amount of data collected per case study was considered sufficient at the point when additional data ceased to provide any new relevant insights.

Constant Comparative Analysis (CCA) constituted the core of the data analysis (Boeije, 2002). CCA has its origins in grounded theory (Glaser & Strauss, 1967). It is an inductive method used to discover latent patterns in qualitative data provided by multiple participants. While the literature provides little specific guidance on how exactly to apply it (Boeije, 2002), we followed three sequential stages: Open coding, axial coding, and selective coding (Glaser & Strauss, 1967). Open coding breaks down the data into codes and categories. Axial coding explores and explains the relationships between those codes and categories. Selective coding identifies the central category, relates other codes and categories to that core, and explains them in terms of that core. The IQ dimensions presented in Table 1 were used as a theoretical lens to guide the data analysis. As such, they provided the analysis framework and contextualized the coding by guiding the development of codes and categories with either positive or negative effects on any of the IQ dimensions shown in Table 1. The coding was iteratively performed by the lead researcher and reviewed by the second author. Following each review, any coding-related disagreements were discussed by the researchers, and the codebook was updated accordingly (Hruschka et al., 2004). While no quantitative intercoder reliability assessments were performed, intercoder reliability was not considered problematic, as the number of coding-related disagreements reduced, and intercoder consensus increased, with each iteration. Initially, each interview transcript and document was analyzed independently of any other data. Subsequently, within-case and cross-case analysis (Yin, 2008) provided the means for triangulation of data sources, which allowed the researchers to seek conflicting data. When conflicting data were found, the research participants

TABLE 3
Profiles of the Case Study Organizations and the Interviewees

Case A (Defense): A multi-billion dollar defense organization focused on design, construction, and maintenance of large systems. Interviewees: integrated logistic support manager, project manager, systems support manager, logistic engineering manager, product manager, data manager, systems support engineer, database administrator, business analyst, information systems manager, innovation manager.

Case B (Energy): A multi-billion dollar independent oil and gas exploration and production company that manages more than 15,000 wells, produces more than 150,000 barrels of oil and more than 50 million cubic feet of natural gas each day. Interviewees: chief information officer, enterprise architect, information quality manager, information security manager, supervisor of operations, process analyst, information quality analyst, data modeler.

Case C (Government): A state government department providing telephone, networking, and data management services to various state agencies. Interviewees: chief technology officer, chief financial officer, database administrator, enterprise services administrator, quality manager, business intelligence developer.

Case D (Telecom): A multi-billion dollar telecommunications company with more than 10 million customers. Interviewees: enterprise information architect, information quality manager, information management development manager, information management operations manager, information quality analyst.

Case E (IT Services): A multi-billion dollar information management company that develops business intelligence and marketing databases and provides information technology outsourcing services for a number of Fortune 500 companies. Interviewees: senior ICT manager, data center manager.

Case F (Transport): A multi-billion dollar rail transport company that operates several thousand trains and has more than 200,000 employees. Interviewees: TQM team leader (information management).

Case G (Telecom): A multi-billion dollar telecommunications company with more than 5 million customers, offering services for mobile, landline, and IP-based voice and data communication. Interviewees: head of data governance, head of security, head of data management, data architect, enterprise architect, solution architect, ICT system owner, information quality manager (data warehouse), business object owner, IT specialist.

were again contacted for follow-up explanation. Alternatively, any such conflicts were further investigated in subsequent case studies. Thus, relevant constructs (i.e., the CSFs) were built in an iterative and incremental fashion.

Open coding produced 80 codes, which were aggregated into 11 key categories (the CSFs) as described above. The codebook was then visually represented in MS Visio (see Figure 1). Given that we started the data collection and analysis with a specific research question in mind (i.e., "What are the CSFs for information quality management?"), "information quality management" was selected as the core category. This is in contrast to the traditional grounded theory approach to analysis, where the theoretical core only emerges (or is identified) during selective coding.

In addition to the seven case studies, data collection and analysis also included an ethnographic study of another large, multi-billion-dollar telecommunication organization. Ethnography, which requires the scientist to spend a considerable amount of time in the field, originates from social and cultural anthropology research (Myers, 1999). As a result, it is particularly well suited to exploring organizational contexts of IS (Crabtree, Nichols, O'Brien, Rouncefield, & Twidale, 2000; Myers, 1999). Ethnography, which has been described as "the most in-depth or intensive research method possible" (Myers, 1999, p. 6), is actually an umbrella term for a range of different analytic frameworks (Crabtree et al., 2000). Key differences between case study research and ethnography include the amount of time spent in the field as well as the extent of participant observation

[Figure 1 is an MS Visio diagram of the codebook, mapping the 80 open codes onto the core category (information quality management) and its key categories, including: security management (access control, authentication, authorization, audit trail); storage management (backup and recovery, redundant storage, archival and retrieval management); IQ assessments/monitoring (IQ metrics, data profiling, statistical process control, IQ benchmarking); training; IQ risk management; IQM culture and maturity; information architecture management (enterprise, information, application, and physical tiers); IQM governance; continuous IQ improvement; information product lifecycle management; IQ requirements management; and strategic IQM.]

FIG. 1. Open coding.
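The roll-up of open codes into categories depicted in Figure 1 can be illustrated programmatically. The codebook and coded fragments below are a small hypothetical subset for illustration, not the study's actual 80-code codebook:

```python
# Toy illustration of rolling open codes up into categories (the CSFs).
# Code and category names are drawn loosely from Figure 1; the fragments
# are invented stand-ins for coded interview excerpts.
codebook = {
    "access control": "security management",
    "audit trail": "security management",
    "backup and recovery": "storage management",
    "data profiling": "IQ assessments / monitoring",
    "IQ metrics": "IQ assessments / monitoring",
}

def categorize(coded_fragments, codebook):
    """Group coded interview fragments by their parent category."""
    grouped = {}
    for fragment, code in coded_fragments:
        grouped.setdefault(codebook[code], []).append(fragment)
    return grouped

fragments = [
    ("only authorized persons can enter data", "access control"),
    ("they log every time a record is viewed", "audit trail"),
    ("we profile key tables monthly", "data profiling"),
]
result = categorize(fragments, codebook)
# Two fragments fall under "security management" and one under
# "IQ assessments / monitoring".
```

In the study itself this aggregation was, of course, an interpretive judgment made iteratively by the researchers rather than a mechanical lookup.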

involved in data collection (Myers, 1999). Similarities include the fact that both ethnographic as well as case-study research lead to theoretical rather than to statistical generalizations (Lee & Baskerville, 2003; Myers, 1999; Yin, 2008). The lead researcher spent approximately 1000 hours at the company, providing information quality/analytics related consulting services to two departments (complaints and risk/compliance) over a period of six months. As such, relevant data were primarily collected through interviews, workshop/meeting participation, and informal social contacts. Ethnographic data is interpreted through the scientist's theoretical context, and the analysis usually involves the coding of notes, which are then compared to identify patterns and themes (Crabtree et al., 2000; Glaser & Strauss, 1967). Following recommendations from the literature (Myers, 1999), the lead researcher continuously wrote up and analyzed written notes, thus producing regular research memos. Similar to the case study data, ethnographic memos (comprising relevant codes) were regularly reviewed by the second author and incorporated into the codebook.

RESULTS

This section further elaborates on the axial and selective coding by explaining the relationships between the codes and the categories (the CSFs) as well as between the categories (the CSFs) and the theoretical core (information quality management). The order in which the CSFs are discussed is arbitrary and does not imply any sequential ordering or ranking. It is worth noting that while each organization studied had an IQM program in place, some organizations were more mature in their approach to IQM than others. As a result, more-mature organizations were able to offer evidence for the CSFs, whereas less-mature organizations were able to provide examples of yet unresolved IQ issues, discuss their potential causes, and explain how they were planning to overcome them.

Information Security Management

Even though Security Management does not feature as a CSF for TQM in the literature (see Table 2), given that access security is a key IQ dimension (see Table 1), it is perhaps unsurprising that Information Security Management emerged as one of the CSFs for effective IQM. An interviewee noted:

We had an issue last year where some vendor . . . on the internet had somehow managed to get hold of a whole bunch of call details and made it available for purchase. Of course, that is illegal, that's private information.

Information security management requires that organizations have effective access controls in place, which must ensure that

all users are appropriately authenticated as well as authorized with the least set of privileges they require. An interviewee noted:

We . . . have restricted entry where only authorized persons can enter data into the system.

Furthermore, user accounts and access privileges, which should only be granted with appropriate management approval, should be regularly reviewed. At the same time, any potential role-based conflicts of interest should be proactively identified and controlled through relevant separation-of-duty access rules. For instance, IS developers should not have access to the production environment. Also, audit trails (logs of users' activities on the IS) should be analyzed (e.g., for exceptions) and periodically reviewed. An interviewee noted:

Stringent security audits are performed by the IRS because they do not want to lose the public trust. Making sure that only authorized people can view social security information. They log every time a social security number is viewed.

Such audits may positively contribute to access security, particularly when controls are well publicized and regularly enforced. Another interviewee stated:

We have log analyzers to find out any fraudulent activities.

In order to be able to ensure appropriate levels of access, any potentially sensitive/confidential information (e.g., intellectual property, commercial-in-confidence, private information, etc.) should be identified as well as appropriately classified and labeled. An interviewee noted:

We have to deal with intellectual property issues for suppliers. . . . We also have our internal restrictions of commercial-in-confidence classification.

Another interviewee noted:

We transmit some of our HR information over the public internet; that's encrypted. E-mail attachments are also occasionally encrypted with WinZIP encryption.

Furthermore, confidentiality (non-disclosure) agreements should be used when sensitive information is communicated to third parties. Finally, any sensitive information should be destroyed in a secure manner—for example, hard copies and optical media (CDs or DVDs) should be shredded to a level that prevents reconstruction of information (or burnt); sensitive information on hard drives should be deleted by means of dedicated software algorithms that prevent reconstruction. An interviewee noted:

We occasionally use paper shredders. We don't throw everything in the bin. There is also a specific process for the disposal of IT assets. . . . All computers are sanitized before they are disposed of.

Storage Management

Similar to Information Security Management, Storage Management does not have a corresponding CSF in the TQM literature (see Table 2). This fact may be explained in terms of the accessibility IQ dimension (see Table 1), which has traditionally largely been assumed for tangible products and services. However, given the intangible nature of information as well as many contemporary online services (e.g., the cloud), information accessibility cannot just be assumed; it has to be ensured. Effective storage management plays an essential part in ensuring continuous accessibility of information. Thus, organizations need to make sure that information is stored in dedicated areas (e.g., data centers), which have physical access control procedures in place, are environmentally appropriate (the environmental conditions should be monitored in real-time),
environmental conditions should be monitored in real-time),
Security policies (documenting relevant roles and respon- and are assured through suitable disaster management plans.
405 sibilities) as well as classified documents should be regu- An interviewee noted: 455
larly reviewed and updated as required (e.g. commercial-in-
confidence classification may be removed after a certain period We have a secure data center. We also have disaster recovery
of time). Also, a security incident reporting process should be processes, and we ship our data out daily.
in place. An interviewee noted:
Given that preventing and/or predicting most disasters is essen-
410 We use a ticketing system for all our security incidents. That way we tially impossible, disaster recovery comprises reactive and
can track them to resolution, and we can analyze trends and generate corrective controls that may help ensure accessibility after a nat- 460
reports.
ural or a human made disaster. Backup and recovery, which are
Additionally, all sensitive information should be either transmit- a subset of disaster recovery, may help ensure accessibility in
ted over a secure channel, or encrypted before it is transmitted the case where some information is accidentally deleted or is
415 over a public channel. An interviewee noted: perhaps lost due to system failure. Thus, critical information
should be regularly backed up, and those backups should be 465
We can’t just e-mail customer information. E-mail is not secure. stored off-site. Furthermore, the physical security of the back-
We always have to keep customer information within our network.
ups should be appropriately maintained, and backups should
Similarly, any portable devices should automatically encrypt periodically be restored to a test machine. They should also
any locally stored data, and any classified, hard-copy IPs should be appropriately labeled, including the date of backup, backup
420 be transported in secure containers. An interviewee noted: level, sensitivity level, and so forth. An interviewee noted: 470
We have to follow very stringent security regulations. Different types Twice a year, we take all of our tape storage to an off-site recovery
of classification have different handling requirements. center and we test it.
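The access-control practices described in this section (least-privilege authorization, separation of duties, and an analyzable audit trail) can be sketched as follows. This is a minimal illustration; the role names, privileges, and helper functions are assumptions made for the example, not features of the case-study systems.

```python
from datetime import datetime, timezone

# Illustrative role-to-privilege mapping following the least-privilege
# principle: each role is granted only the privileges it strictly requires.
ROLE_PRIVILEGES = {
    "data_entry_clerk": {"create_record"},
    "analyst": {"read_record"},
    "dba": {"read_record", "update_record", "delete_record"},
    # Separation of duties: developers get no production privileges at all.
    "developer": set(),
}

audit_trail = []  # In practice this would be a secured, tamper-evident log.

def authorize(user, role, privilege):
    """Grant access only if the role carries the privilege, and record
    every decision in the audit trail for periodic review."""
    allowed = privilege in ROLE_PRIVILEGES.get(role, set())
    audit_trail.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "role": role,
        "privilege": privilege,
        "allowed": allowed,
    })
    return allowed

def exceptions(trail):
    """A trivial detective check over the log: all denied access attempts."""
    return [entry for entry in trail if not entry["allowed"]]
```

For example, a developer requesting `update_record` in production is denied, and the denial leaves a log entry that a periodic exception review would surface.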
Another interviewee noted:

Some data are stored redundantly for disaster recovery purposes.

However, another interviewee argued:

Redundancy is the number one enemy of information quality!

As such, any requirements for replication and, thus, redundant storage of information should also be identified. An interviewee noted:

As I speak, there are about seven or eight different information systems that have got some . . . overlapping data.

Redundant storage usually involves master data, which is often stored in multiple transactional systems. Another interviewee elaborated:

The process for capturing information worked, but if any changes to the information were later applied, then they weren't necessarily updated in the central system. That's because information has been stored redundantly.

Another aspect of storage management is that some information may need to be archived for longer periods of time. Any such requirements should be clearly identified and documented, and any such information should be cataloged. Furthermore, the media used for archiving as well as the file formats should have appropriate longevity. Archival processes may help to ensure that information which is not currently/frequently needed is removed from operational systems (e.g., in order to improve performance). On the other hand, any such information should be readily accessible from the archival systems/databases. An interviewee noted:

Sometimes we may want to remove some data from the production system, say because they slow the database down and we don't need access to them anymore, but we may not necessarily want to completely delete them, say because we may need them for auditing purposes at a later stage. Then, we may decide to archive them.

On the other hand, some information may need to be periodically destroyed. Any such actions should only be performed by authorized personnel following clearly defined and documented rules for information disposal. An interviewee noted:

There is a whole documented process by which databases may be decommissioned. It's not like someone just goes and drops the database.

This may be of particular importance with regard to privacy policies and government legislation, where customers may have the right to review, modify, and delete their personal information. Furthermore, periodic review and destruction of information that is no longer needed may minimize any IQM-related costs.

Information Quality Requirements Management

Information Quality Requirements Management roughly corresponds with the "customer involvement and satisfaction" and "product design" CSFs for TQM (see Table 2). Identifying, analyzing, prioritizing, and documenting all key stakeholders, including information suppliers, stewards, consumers, and owners, is a prerequisite for effective IQM. An interviewee noted:

It's not just a data problem. It's a process and customer requirement problem. . . . Who decides on what data you need to collect? . . . How do you establish what you need to capture in the first place?

Another interviewee noted:

We know who our big users are; that's a good thing. So we listen to them.

Once all the key stakeholders have been identified and their requirements have been gathered, it is then possible to model those requirements in conceptual, logical, and physical data models. An interviewee noted:

We go through conceptual, logical, and physical modeling. We work with the third normal form.

This equally applies to cases where the relevant IS is commercially purchased, as opposed to custom developed. An interviewee noted:

For our new system, we did have a big effort to define what the requirements were, even though it was a purchased system.

Conceptual data modeling involves the analysis of organizational structure and information requirements, leading to the identification and definition of relevant entity classes and their attributes and relationships. An interviewee noted:

As a part of our business requirements management is to develop a matrix of all . . . the different types of work that we can perform, so for each cell in the matrix we need to determine the information requirements needed to do that job. Who are the users of that information, the key stakeholders?

Ensuring that definitions are clear may reduce duplication as well as aid with the subsequent interpretability of the data. An interviewee noted:

While the systems held similar sorts of information, the attributes were named differently.

It has been shown that the quality of conceptual models is influenced by several factors, including the extent of ongoing/periodic model reviews, the extent of usage of standard (or best practice) models, the timeliness of the model, and the degree of model abstraction (Fettke, 2009). A conceptual schema, which represents the semantics of an organization, can then be used as a basis for the development of a logical data model. Logical data models are usually normalized to, at least, the Third Normal Form and, thus, they may aid with consistent representation by helping to prevent insertion, update, and deletion anomalies.
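The point about anomalies can be made concrete with a small sketch. In the denormalized layout below (the record layout and names are invented for illustration), a customer's address is repeated on every order row, so an address change must touch many rows; a third-normal-form-style split stores each fact exactly once.

```python
# Denormalized: the customer's address is repeated on every order row.
orders_denormalized = [
    {"order_id": 1, "customer": "Acme", "address": "1 North Tce", "amount": 100},
    {"order_id": 2, "customer": "Acme", "address": "1 North Tce", "amount": 250},
]

# 3NF-style split: each fact is stored exactly once.
customers = {"Acme": {"address": "1 North Tce"}}
orders = [
    {"order_id": 1, "customer": "Acme", "amount": 100},
    {"order_id": 2, "customer": "Acme", "amount": 250},
]

def move_customer_denormalized(name, new_address):
    """Every order row must be found and updated; missing one silently
    leaves the data inconsistent (an update anomaly)."""
    for row in orders_denormalized:
        if row["customer"] == name:
            row["address"] = new_address

def move_customer_normalized(name, new_address):
    """One write, and every order now reflects the new address."""
    customers[name]["address"] = new_address
```

The normalized form also avoids insertion anomalies (a new customer can be recorded before any order exists) and deletion anomalies (deleting the last order no longer deletes the address).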
An interviewee noted:

We purchased an enterprise data model last year. We looked at several different models . . . and the one that most closely represented the way our business viewed our industry we ended up purchasing and customizing it to really make it [our] model. . . . It's in the third normal form; we are mapping that to existing data structures and we are deriving new data structures from it.

Another interviewee noted:

We use the logical model to document the business requirements in a form that we can give to the developers.

Physical data models may contribute toward consistency by defining data types, and toward completeness—and to some very limited degree toward accuracy—by defining validation rules and by implementing Database Management System (DBMS) integrity constraints. An interviewee noted:

Most of the systems that we run have mandatory fields defined. Most of them also have foreign key checks built into the software.

However, another interviewee noted:

There are controls in the system that will prevent you from entering an invalid code, not necessarily a wrong code, but an invalid one.

Organizations often use Commercial Off the Shelf (COTS) products, which are usually based on generic data models that permit limited modification. In such cases, organizations may still be able to map their logical models to the third-party physical models; however, it may be difficult (or impossible) to modify validation rules and DBMS integrity constraints, leading to potential negative impacts on accuracy, completeness, and the like. An interviewee noted:

Because we are not a software development shop, we have a lot of applications that we have purchased off the shelf, and we don't have a lot of control over those applications. They are sold as monolithic applications that have everything in them. . . . Now we have data that doesn't match across systems.

Another interviewee noted:

We have so many applications here, which we do not develop ourselves, so we do not have a grip on mapping between the logical model and the physical model.

The above discussion mainly dealt with information requirements. However, as stakeholders are the ones judging the quality of that information, their input to any IQM efforts is also critical. As such, before any IQ assessment or improvement efforts can be initiated, relevant IQ requirements first need to be identified from a statistically valid representative sample of the critical stakeholders. This includes the identification of the key IQ dimensions, relevant business rules, and minimum desirable levels of IQ. Several interviewees noted:

We have developed information quality goals . . . we have identified the relevant dimensions.

Every project I've worked on has had extensive documentation of business rules.

Most our problems have to do with wrong definitions of business rules.

In terms of the management of such business rules, an interviewee noted:

We are trying to externalize the business rules from applications to a business rules management system, real-time decision engine, not only to standardize them, but also to put them in the hands of the business.

Any conflicting IQ requirements should be identified and appropriately managed. An interviewee noted:

Coverage and accuracy somewhat conflict. So we already have an inherent problem in that our clients expect high coverage, and then expect consistency in our data, and therefore accuracy is not always as high as it could be. So, we have to balance that very carefully.

Any such IQ requirements also need to be verified and validated as well as mapped to relevant IPs. Finally, there should be regular reviews and updates of IQ requirements as well as effective communication of such requirements to all relevant stakeholders.

Training

This CSF roughly corresponds with the "employee training and empowerment" CSF for TQM (see Table 2). An interviewee explained the importance of employee training as follows:

There are two main reasons for data quality problems. First, the integration between the systems is not in place or is not working properly. And the second is failures in the processes; people are not working according to the rules. Process issues can be resolved relatively easily with training.

Thus, relevant training needs should be identified and documented, and training workshops should be regularly conducted. In addition to formal training, mentoring programs should ensure on-the-job professional development. This equally applies to training related to information management (e.g., information entry, access, analysis, storage, etc.), information security management, IQ assessments, process improvements, and the like. An interviewee illustrated the importance of formal training as follows:

I did my master's thesis a few years ago on developing a balanced scorecard for data management. That's perhaps why I took the initiative with information quality.

In one of the case-study organizations, IQ skills and competencies requirements had been defined by job family. In line with these requirements, IQ-related training was mandatory for new employees. Typically, one team member would receive training by the IQM manager, and then that team member would subsequently train his or her colleagues. Ideally, an IQ helpdesk should augment on-the-job mentoring. Such a helpdesk is different from a traditional IT helpdesk as it should be staffed by business/IQ/Information Management (IM) subject matter experts rather than by IT specialists. As such, it
may provide assistance with locating/accessing information elements/products as well as advice on how to record, present, and secure IPs, and the like. An interviewee noted:

We have very experienced subject matter experts who have been working with the systems for a number of years. If anyone else has any information management or business process problems, they usually call the subject matter experts.

Membership and active participation in professional associations was also valued and encouraged in most of the case-study organizations. A couple of interviewees noted:

Our company provides financial support for memberships in professional organizations. We also attend conferences as part of our professional development.

We go to conferences of our suppliers, and we talk about the functionality, we give presentations, and we talk to people.

Finally, any training provided needs to be periodically evaluated in order to assess its effectiveness as well as to improve the relevant methods and the curriculum. An interviewee noted:

We always ask for feedback on the training provided. This includes issues around the quality of the delivery as well as the relevancy of the material covered. We take all feedback seriously and update our courses accordingly.

Information Quality Management Governance

Information Quality Management Governance roughly corresponds with the "top management commitment" CSF for TQM (see Table 2). However, while being popular with practitioners as well as academics, governance is an abstract concept with a number of overlapping definitions. For instance, corporate governance has been defined as "a set of mechanisms through which outside investors protect themselves against expropriation by the insiders" (La Porta, Lopez-de-Silanes, Shleifer, & Vishny, 2000, p. 4). On the other hand, project governance has been defined as "the structure through which the objectives of the project are set, and the means of attaining those objectives are determined, and the means of monitoring performance are determined" (Turner, 2006, p. 93). In terms of information governance, Gartner Research defined it as "the specification of decision rights and an accountability framework to encourage desirable behavior in the valuation, creation, storage, use, archival and deletion of information" (Bell, Logan, & Friedman, 2008, p. 4). An interviewee defined governance as follows:

The governance process is to resolve, and get everybody agreeing on, what should be done.

Formal and structured management of IQM efforts is critical to their success. All the above definitions highlight the need for a hierarchical framework with clear definitions of relevant roles and responsibilities. An interviewee noted:

We have a quality group that's looking . . . at the organizational quality effort. We have DBAs who manage data integrity. We have security people who look at information security.

Without such a framework, organizations may find it difficult to implement IQM in a systematic way. An interviewee noted:

We don't have any written down processes; we don't have any software systems that do that for us. . . . I guess the thing is that we do it, but I can't show you a piece of paper that says that.

Having such a governance framework may also reduce duplication of effort. A couple of interviewees noted:

There are another two guys working on the same stuff in other departments, but I don't know what they are doing at the moment. I guess we have to get closer together.

There are many data quality and data management teams. There will now be a reorganization project which will try to integrate all those different groups.

An IQM governance framework should include preventive, detective, and corrective IQ controls, which can be manual, semi-automated, or fully automated. These should be documented in the relevant IQM policy and procedures, which should be developed and communicated through a governance board. The governance board can also be used to communicate IQ expectations/targets and the current state of IQ within the organization. An interviewee noted:

I have to bring people together in the data governance board, so that they can understand the big picture. . . . We need a central point that people can go to and get the guidelines.

Preventive controls are aimed at preventing poor quality information from entering the IS. Data validation checks (application-level validation as well as DBMS constraints) provide a basic example of preventive controls. The provision of staff training in IM also constitutes a preventive control as it is aimed at increased compliance with processes, policies, and procedures. Preventive controls form the first level of protection against poor quality information. An interviewee noted:

When we build processes, we are always looking to build in controls to ensure the quality of the transactions.

However, due to the inherent complexity associated with developing and managing IS, it is likely that not all required preventive controls will be identified and implemented. In addition, there is a possibility that any such controls may not be consistently implemented throughout the organization; a particular control may be implemented only in a subset of the required business processes or applications. Also, preventive controls may not be correctly designed and, as such, may not function as intended. Finally, because such controls take time to develop and implement, and given the dynamic nature of organizations, any such controls can only lag the continuously evolving information requirements.
Thus, preventive controls alone cannot be considered sufficient protection against poor quality information. In terms of semi- and fully-automated controls, an interviewee noted:

Somebody actually has to code those into the application, and that all adds to the cost, and the ongoing maintenance. So, the question is, when do you invest in those things? In the application, so that the person who enters that data gets an error message, or do you put in a check where each user gets a performance report at the end of the month saying your performance with regards to data entry is B+ because you made 30 mistakes.

Due to the limitations identified above, it may be assumed that some poor quality information could pass through the preventive controls and, thus, may enter the IS. As a result, there is a requirement for detective and, ultimately, corrective controls. Detective controls are aimed at detecting poor quality information in the IS. Data profiling tools and scripts, which can be used to detect inconsistent or incomplete data elements, provide a basic example of detective controls. Corrective controls are aimed at correcting poor quality information. Corrective controls are the most difficult to implement and, in particular, to automate because of the inherent difficulty associated with estimating the true value. As such, they usually involve a significant manual component (i.e., human intervention). An interviewee noted:

We have written cleansing applications, which look for inconsistencies, which a human then checks manually. It is not really possible to make an automated tool, because it is hard to guess which one is the correct one.

While most IQM activities should be carried out continuously, occasionally there may be requirements for specific IQM projects, especially in less-mature organizations. In line with the above definition of project governance, IQM project management involves the application of standard project management principles to any non-continuous IQM activities, with the aim of effectively managing IQM resources. For instance, a dedicated project manager (IQ champion) should be responsible for any such projects; relevant roles, responsibilities, and authorities should be defined, documented, communicated, and enforced; the project scope should be clearly defined; any project constraints should be identified; and the like. An interviewee noted:

My basic task is to help drive IQ projects. . . . Some of the IQ projects that we have done, have mostly been one-off efforts, where we have looked at a specific functional area and compared the data we have received from a system to what we have aggregated in the data warehouse. . . . We are looking to productionalize those checks, to continually run those in the future at some agreed-upon frequency.

Furthermore, the audit trail described under Information Security Management may be extended from access security to other IQ dimensions, thus becoming an IQ audit trail. This may include the logging of user activities related to the creation, modification, and destruction of key IPs; as such, it may help ensure IQ-related accountabilities. An interviewee noted:

We have created an error list with 10–15 criteria. That's being checked monthly, and it gets linked to the people producing those failures.

Additional IQ-related accountabilities, rewards, and incentives may, for instance, be included in regular job performance reviews. An interviewee noted:

We are looking at introducing some information quality reward structures, because people need to be motivated. That's one way of changing the culture.

Information Quality Risk Management

While Information Quality Risk Management does not have a corresponding CSF in the TQM literature (see Table 2), it should play a central role in any mature IQM program. An interviewee noted:

We perform, every two years, a risk assessment for our department. . . . Information quality risks are typically identified there.

Risk is usually defined in terms of the likelihood of an event occurring and the potential negative consequences of such an event (Kendrick, 2009). However, as such estimates are often subjective, reaching consensus among stakeholders may be difficult. Additionally, while care has to be taken not to ignore risks (Kutsch & Hall, 2010), managers may not always be prepared to expend limited resources on planning for events which may not materialize (preferring to deal with problems as they emerge). An interviewee noted:

There is no structured approach. The data quality efforts are mainly based on gut feeling. I guess we'll have to undertake more structured risk management. We were asked by top management to show where the main risks lie.

As the quote above indicates, without a structured risk management approach to IQM, organizations are basing their efforts on "gut feeling." In any but the simplest cases, the "gut feeling" approach to IQM is virtually guaranteed to result in inefficient and ineffective use of resources. An interviewee noted:

All data are not equal. You don't have to apply the same level of rigor to everything. . . . Financial data are some of the most critical. Integrity of financial data is required by law.

As such, IQ risks to business goals (these may include financial risks, risks to reputation, regulatory risks, etc.) should be identified, documented, analyzed, categorized, prioritized, and mitigated/controlled. As discussed in the previous section, relevant preventive, detective, and corrective controls should be enforced through the IQM Governance framework. The key risks identified should be continuously monitored, and the risk register should be periodically reviewed (as new risks may emerge at any time). As a result, effective IQ Risk Management should allow organizations to focus their IQM efforts on the most critical IPs, thus increasing IQM efficiency and effectiveness.
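One lightweight way to operationalize such prioritization is a likelihood-impact matrix over the risk register. The items, scores, and threshold below are illustrative assumptions, not values from the case studies.

```python
# Score each risk on likelihood and impact (1 = low, 5 = high) and rank by
# their product, so IQM effort flows to the most critical items first.
risk_register = [
    {"item": "financial master data", "likelihood": 3, "impact": 5},
    {"item": "marketing mailing list", "likelihood": 4, "impact": 2},
    {"item": "archived project logs", "likelihood": 2, "impact": 1},
]

def prioritize(register):
    """Return the register sorted by descending risk score (likelihood x impact)."""
    return sorted(register, key=lambda r: r["likelihood"] * r["impact"], reverse=True)

def top_risks(register, threshold=10):
    """Items at or above the threshold warrant immediate mitigation/controls."""
    return [r["item"] for r in prioritize(register)
            if r["likelihood"] * r["impact"] >= threshold]
```

With these sample scores, the financial master data (3 × 5 = 15) outranks everything else, mirroring the interviewee's observation that all data are not equal.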
A CSF FRAMEWORK FOR IQM 287

We . . . have a priority matrix telling us which data source has higher Another aspect of information product lifecycle management—
priority in comparison with another one. configuration management—involves the application of stan-
dard configuration management processes to the creation and
Information Product Lifecycle Management modification of any IPs. Effective IP configuration manage-
870 Information Product Lifecycle Management roughly corre- ment requires identification and naming of relevant config- 920
sponds with the “process management” and “vendor quality uration items, establishment of configuration baselines, and
management” CSFs for TQM (see Table 2). Managing infor- use of change control boards to ensure that any changes
mation as a product as well as effectively managing the life- to IPs and IS do not create any IQ problems. Furthermore,
cycles of critical IPs is critical to effective IQM. One of the change requests (problem reports) should be formally initi-
875 interviewees illustrated the importance of this CSF as follows: ated, reviewed, approved and tracked to closure; Configuration 925
Status Accounting (CSA) should ensure the ability to record and
There was no governance, no documentation, no development guide- report on the configuration baselines associated with each con-
lines, and no version control. Now we have all this data, and scripts
figuration item at any moment of time. Formal audits should
that transform the data, and nobody can handle it. Data flows every
way possible, and there are countless ways data is transformed; even be regularly performed in order to assess compliance with the
880 the same data. configuration management plan. An interviewee noted: 930

One of the aspects of this CSF includes identifying and docu- When we go and change the functionality of our data, we are sub-
ject to full configuration management processes to ensure that the
menting the information flow within the organization as well as integrity of our systems and the data are not undermined by any
between the organization and any external parties (i.e., infor- changes that were made.
mation product supply chain management). An interviewee
885 noted: Various relationships between the raw data elements and IPs, 935
including dependencies, associations, aggregations, and compo-
System X, actually, only uses the data that’s recorded in other sys- sitions should also be appropriately documented and managed.
tems. Now, the problem that we have is that if you suddenly have
development in your target system that doesn’t take into account that
Given that IPs comprise a set of raw or synthesized data ele-
you are linked to it, then the interacting systems will have problems. ments (e.g., a report detailing average product sales per month),
documenting and appropriately managing the rules and pro- 940
890 Effectively managing information product supply chains may cedures used to construct such IP is critical. An interviewee
involve establishment of Service Level Agreements (SLAs) which may specify IQ requirements. An interviewee noted:

Service level agreements mainly revolve around transit times and the expectation of getting the information. They are mostly void of the integrity of the information. I think the SLAs could be improved.

Relevant information product supply chain management processes should also aim to minimize any manual copying of information, while preferably maximizing real-time information capture. An interviewee noted:

The data that come in from our contractors comes in form of hard copy . . . comes to us on paper, and this is then entered into the system. There is obviously an opportunity there to pick up electronic data instead.

Any data migration (i.e., Extract Transform Load [ETL]) between IS or databases (Abiteboul et al., 2005), which should be minimized as much as possible (e.g., only undertaken in the case of system/database upgrades), should follow standard system/software development methodologies and include extensive testing. Furthermore, if possible, all data should be migrated at the same time, so that any duplicate storage and use can be minimized in the production systems. An interviewee noted:

We migrated our . . . data in stages. Each stage involved significant testing and produced a large amount of documentation. We also had external consultants come in to audit everything before the cutoff.

noted:

What you see when you look at System X is the rendered version of the elements of the database. . . . All reports are documented. . . . Any changes follow standard change management processes.

In addition, the way IPs are represented should be standardized and simplified. For instance, all IPs should have a similar look and feel; this could be achieved through the use of templates. Any such templates would need to be categorized, cataloged, and mapped to the individual IPs. An interviewee noted:

All forms look pretty much the same. Once you have seen one form, the rest look pretty much the same. The structure is almost the same. The layout is the same; the buttons are all in the same place.

Metadata, which is usually defined as data about data, may include things like data models, access/modification timestamps, definitions, business rules, access-based descriptors (e.g., creator/owner), and even IQ confidences. Such metadata should be appropriately organized and stored in a metadata registry, which should be managed separately from the transactional and the master data. Several interviewees noted:

Our whole system is just metadata driven. Metadata is also used to drive reporting, and is also used in the rule engine.

The biggest problem is that we don't have any metadata repository.

Some metadata may exist in pockets, but we do not have a data architecture with metadata.
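The metadata registry described above can be made concrete with a minimal sketch. This is an illustrative example only, not something used in the study: it assumes Python, and every name in it (MetadataEntry, registry, violates_rule, the customer.age element) is hypothetical. It shows a registry entry holding the kinds of metadata listed in the text (a definition, an owner, a modification timestamp, an IF-THEN style business rule, and an IQ confidence), kept separately from the transactional data it describes.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Callable

@dataclass
class MetadataEntry:
    """One registry record for a single information element (illustrative)."""
    definition: str                 # business definition of the element
    owner: str                      # access-based descriptor (creator/owner)
    modified: datetime              # last modification timestamp
    rule: Callable[[object], bool]  # IF-THEN style business rule (IF value THEN valid)
    iq_confidence: float            # assessed confidence in the element's quality

# The registry is managed separately from transactional and master data.
registry = {
    "customer.age": MetadataEntry(
        definition="Customer age in whole years",
        owner="crm_team",
        modified=datetime(2014, 1, 1),
        rule=lambda v: isinstance(v, int) and 0 <= v <= 120,
        iq_confidence=0.95,
    ),
}

def violates_rule(element: str, value: object) -> bool:
    """Check a transactional value against the business rule registered for it."""
    return not registry[element].rule(value)

print(violates_rule("customer.age", 34))  # a valid value
print(violates_rule("customer.age", -5))  # a rule violation
```

Keeping the rule next to the definition in one registry is what allows reporting and rule engines to be "metadata driven," in the sense the interviewees describe.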
288 S. BAŠKARADA AND A. KORONIOS

We have never had a good business metadata strategy—an Excel sheet here, a Word document there—but now we need a better solution.

Information Architecture Management

While Information Architecture Management does not have a corresponding CSF in the TQM literature (see Table 2), appropriately managing the information architecture of an organization is critical to effective IQM. A couple of interviewees noted:

We have single islands in the IT landscape. . . . Data is stored in different files, in different servers, and if you need some information you need to go to the right people and know who is responsible for this information.

It is not about the system, it is about the data. And if we can establish a data architecture then systems do not matter.

In order to facilitate the establishment of a single version of the truth, this CSF involves the mapping of relationships between the business processes, the logical data model, software, and hardware. A couple of interviewees noted:

You have to look at the whole chain of the information flow . . . and we record it in a database. Then you have in the database all application that are relevant to the company, you have all the interfaces between the applications, and you have all the elements which are transferred. . . . We have now added a hardware level to the model, so you can in fact identify what hardware each application is using. Now we are adding another layer describing the business activities that generate and use the data.

Systems and data, these days, business processes, are just so closely linked, so that they influence each other. For instance, if you look at your business processes, you may find that you are collecting data that you actually don't use. Or if you do the mapping between the business processes and the data stores, that can be quite revealing.

Aligning business process and workflow models with the logical enterprise information model should effectively combine any heterogeneous information sources under a single query interface. Additionally, the logical enterprise information model should be mapped to the software layer (i.e., different physical model instances), which in turn should be mapped to the hardware layer. An interviewee noted:

We are using . . . Rational Data Architect to map and keep the alignment between the derived data structures and the conceptual model. . . . It is a derived model, but the mapping is intact. So if we make any changes to the enterprise logical model, we'll be able to tell the downstream impact on any physical repository, any service, data structure, any of those.

Achieving such mappings should facilitate horizontal and vertical integration (i.e., it should assist with application interoperability and management reporting). Finally, while vertical integration is usually addressed through Online Analytical Processing (OLAP) and other BI tools and techniques, horizontal integration is usually much more reliant on effective Master Data Management (MDM). An interviewee noted:

Master data is the information that is shared across the enterprise. . . . Our long-term goal is to have one system of record for every piece of data.

Absent "one system of record for every piece of data," consistency may be aided through periodic synchronization of any redundant data sets; this process should include rules for identification and correction of any inconsistencies. An interviewee noted:

There is still no consensus about where certain data elements should be mastered. . . . We have redundant systems working in parallel, with nightly consistency checks.

As previously discussed, identifying inconsistencies is usually relatively straightforward (given the correct mapping of data elements); however, automated correction is usually much more difficult (or even impossible) to implement due to the difficulty associated with choosing the preferred value.

Information Quality Assessment/Monitoring

Information Quality Assessment/Monitoring roughly corresponds with the "quality measurement and benchmarking" CSF for TQM (see Table 2). Before any IQ improvements can be attempted, the current state of IQ first needs to be assessed. And, in order to assess the current state of IQ, qualitative and quantitative metrics need to be developed and applied. Such IQ metrics, or Key Performance Indicators (IQ-KPIs), which should target the key IQ risks (as identified through the Information Quality Risk Management CSF), should be named and appropriately described. In addition, measurement procedures should describe how each IQ-KPI is calculated, the target values for each IQ-KPI (as identified through the Information Quality Requirements Management CSF), the units of measurement, as well as the measurement schedule. While simpler IQ-KPI measurement procedures may be based on automated checks of business rule violations, more complex IQ-KPIs may require elaborate manual audits. With respect to automated metrics, each IQ-KPI may comprise one or more business rules. An interviewee noted:

Each of the KPIs has more than one rule associated with it; each KPI would have between 10 and 20 rules.

Such IQ business rules should be extracted from operational systems and ideally centrally managed in a Business Rule Management System (BRMS). More mature organizations may be able to dynamically generate business rules based on historical data. An interviewee noted:

We have developed a rule base . . . comprising business rules, which can be tested. We have a tester, which shows whether the rules are being followed. . . . We have 1000 or so rules . . . the rules are in the form IF-THEN.

IQ tools may be used to detect (and sometimes correct) incomplete, inconsistent, or duplicate data elements. Commercial

profiling tools may be used to perform domain and type assessments; calculate various statistics, including minimum, maximum, and median values; discover any dependencies within or between tables; identify any duplicates; and so on. However, an interviewee noted:

Information profiling is the simplest form of assessment. . . . It really does not help you identify anything other than "you have a problem with the quality," but you have no way of really categorizing what the nature is and what the impact is.

Automated IQ-KPIs are limited to the checks of coherence (or internal consistency) of information. Any checks of correspondence to any real-world entities or events require manual audits. For instance, accuracy of information cannot be automatically assessed (based on business rule validations). A couple of interviewees noted:

Verification is even harder, because that's not only "Is it a valid code?" but "Is it right for this customer?" And that's really difficult to do without just calling up the customer.

Somebody goes down and eyeballs the asset, and then reviews the data that is stored in the databases themselves against: (a) What should be there, and (b) what the system says is there.

Such manual audits should be based on statistically valid random samples of relevant information elements or IPs; the quality of IPs should be assessed by aggregating the IQ-KPI results of relevant information elements. Furthermore, IQ-KPIs should be continuously monitored over time. A couple of interviewees noted:

We have generated several KPIs, which are measured on a monthly basis. We track the monthly trends and we also have targets, which we are trying to reach.

Some of the information quality projects that we've done, have mostly been one off efforts. We are looking to productionalize those checks, to continually run those in the future at some agreed upon frequency, weekly, monthly, quarterly, whatever.

When feasible, continuous monitoring of IQ-KPIs should be incorporated into statistical control charts (or Shewhart charts), so that any special-cause variation may be identified. An interviewee noted:

We started using statistical XmR charts . . . we study the trends and any time there is a deviation . . . we get alerts. We have also developed a web site for monitoring. . . . Monday has a particular pattern, Tuesday has a pattern, and then holidays are a particular pattern.

In addition to the above-discussed assessments, surveys should be used to assess information consumers' subjective perceptions of IQ. Furthermore, IQ levels should be benchmarked internally within the organization (e.g., between different offices) as well as externally against industry leaders or international standards and regulations; this may involve the development of an IQ balanced scorecard and an IQ dashboard. Finally, the state of IQ should regularly be reported to all relevant stakeholders.

Continuous Information Quality Improvement

Continuous Information Quality Improvement roughly corresponds with the "process management" CSF for TQM (see Table 2). After IQ assessments have been performed (as illustrated in the Information Quality Assessment/Monitoring CSF), and after any IQ problems have been identified, organizations should aim to improve the quality of the relevant IPs; improving the quality of IPs is the ultimate goal of any IQM effort. The initial step requires the identification of the root causes of any IQ problems. An interviewee noted:

We are going out there, first of all seeing how bad it is. . . . Then we back into what are the reasons behind those problems.

Such Root-Cause-Analysis (RCA) may employ Ishikawa diagrams and should consider human, process/organizational, and technological factors. Causal factor charts may be used to describe the events leading up to IQ problems and the conditions surrounding such events, and any links between the root-causes and IQ problems should be documented and backed up by clear evidence. An interviewee noted:

We also search for the root causes of problems . . . and then we write change requests.

Before proceeding with any process improvements, it is critical to estimate the costs associated with poor IQ and corresponding improvement initiatives, as well as any potential benefits or cost savings that may result from any process improvements. Several interviewees noted:

You have to weigh the benefits of what's really the importance of this field vs. the cost to get it filled in correctly.

The guess from the management was that we lose too much money due to poor information quality. Then the question is: Does it make sense to improve the information quality, or is the effort to improve going to cost more than you can save?

We have very high quality; we sometimes even think it's too high, because it costs too much effort to reach the quality on our side.

Based on a cost–benefit analysis, we haven't so far been performing any data enrichment from external sources.

In addition to incremental process improvements, Business Process Reengineering (BPR) may help address root-causes of IQ problems by radically rethinking how information is collected, stored, analyzed, and the like at the business process level. In any case, process improvements should follow Deming's Plan, Do, Check, Act (PDCA) cycle (Deming, 1982). A couple of interviewees noted:

Data quality is mainly a function of the business processes.

It is not only an IT project; it is also a business process reengineering project. So we want to improve and optimize processes, and support them with IT.

As any process improvement efforts may at best reduce any future IQ problems, organizations may also wish to cleanse

some of their currently "dirty" information. Information cleansing is closely related to IQ assessments (see the Information Quality Assessment/Monitoring CSF)—identifying an IQ problem in many cases requires the knowledge of the "correct" value (this is not always the case; e.g., business rule violations and inconsistencies do not require the knowledge of the "correct" value). Given that all but the most simplistic IQ assessments are based on random sampling, reactive information cleansing usually has limited feasibility. In addition, standardized procedures should cater for the reporting and handling of any IQ problems; ad hoc approaches are not usually very effective. An interviewee noted:

These problems go round and round, and nobody documents them, so the topic comes up every six months. Same issues trying to be resolved all the time.

More mature organizations use an IQ help desk to log IQ problem reports and to track them to resolution. Using a help desk, business users may report any potential problems and request an information quality specialist to further investigate any issues. A couple of interviewees noted:

We log help desk issue tickets to make sure we are tracking them until they are resolved. We are keeping those and prioritizing those periodically.

I mostly analyze the problems that get reported to me by the business users. I then . . . [write] change requests to fix the problem and manage the whole process involved.

Finally, if possible, organizations should aim to strategically align their IQM efforts with long-term organizational objectives and potential future IQ requirements. However, as organizational strategies are often quite abstract, strategic IQM may not always be possible or desirable. An interviewee noted:

In telecommunications, there is no corporate strategy. . . . This is an industry problem that we cannot execute on a strategy more than three days in advance. If we are really in an industry where strategies don't work, do we then need architects, or data governance? How can you plan for an unstable environment?

Continuous Information Quality Management Improvement

Continuous IQM Improvement also roughly corresponds with the "process management" CSF for TQM (see Table 2). However, while the Continuous IQ Improvement CSF aims to improve IM processes (i.e., processes dealing with the collection, storage, access, analysis, etc., of information), this CSF is targeted at improving IQM processes. In other words, this CSF aims to continuously improve the way all other CSFs are implemented and, thus, the IQM maturity of the organization. Since many organizations may find it difficult to critically self-evaluate, external subject matter experts may be brought in to review existing IQM processes. An interviewee noted:

I engage external quality assurance people and when they are in I ask: "What am I doing wrong?", "What is the vendor doing wrong?", "What are the people doing wrong?" I reach out and I engage external experts to come in and look at what we are doing to assess that.

Continuous IQM improvement deals with defining qualitative and quantitative IQM metrics or Key Performance Indicators (IQM-KPIs), and using them to continuously monitor the effectiveness of organizational IQM efforts. Similar to IQ-KPIs, IQM-KPIs should be named and appropriately described. In addition, measurement procedures should describe how each IQM-KPI is calculated, the target values for each IQM-KPI, units of measurement, as well as the measurement schedule. For instance, such IQM-KPIs may be targeted at measuring the rate of IQ improvements as well as at evaluating what could be termed the organizational IQM culture—for example, the consistency at which IQM processes are applied in the organization, the rate of participation in relevant professional bodies, contribution of the organization to professional and academic IQM literature (e.g., number of articles published), involvement with academia, IQM training/workshops provided by members of the organization, and so on. When feasible, continuous monitoring of IQM-KPIs should be incorporated into statistical control charts (or Shewhart charts), so that any special-cause variation may be identified. Internal benchmarking should be used to compare the effectiveness of IQM practices within the organization (e.g., between different departments or offices), and external benchmarking should be used to compare organizational IQM practices against world leading organizations, best practices, and standards; this may involve the development of an IQM balanced scorecard and an IQM dashboard. The IQM dashboard for top management should indicate the levels of internal and external IQM benchmarking. An interviewee noted:

We have a department called revenue assurance, and they asked us to show them how we measure data quality. We gave them the information, and they benchmarked us against two other operators in Europe. They compared our data quality management practices and the error rates found. . . . They now want to expand the number of operators benchmarked, to get a clearer picture.

As with any incremental and iterative process improvements, any changes aimed at IQM improvements should follow Deming's PDCA cycle.

DISCUSSION

This section presents the second iteration of axial and selective coding (this time performed at the CSF level). While all of the CSFs discussed in the previous section are by definition critical (necessary and sufficient) for effective IQM, they may not necessarily all be equally important. If that is the case, then it should be possible to rank the CSFs in the order that organizations should ideally aim to implement them. However, given that most of the study participants (interviewees) had specific subject matter expertise that was a subset of the CSFs presented in the previous section, any attempt to rank or prioritize the CSFs based on the responses received from the

interviewees would be seriously flawed. In other words, each of the interviewees attributed the greatest significance to his or her area of expertise (e.g., security, architectures, modeling, etc.); very few interviewees had expertise in all the relevant areas. The general lack of across-the-board IQM subject matter expertise presents a methodological challenge, as any attempts at quantitative comparison or ranking of the CSFs (e.g., through subject matter expert questionnaires) would most probably lead to biased results. One way to overcome this limitation is to identify CSF inter-dependencies through the second iteration of axial and selective coding (this time performed at the CSF level). In other words, if some CSFs can be identified as prerequisites of other CSFs, then such dependencies may allow us to generate a CSF framework for IQM. Such a framework may then be used to guide organizations in planning and implementing an IQM program. This section presents the CSF framework for IQM (see Figure 2), explains the CSF inter-dependencies, and suggests a logical sequence in which the CSFs should be implemented.

[FIG. 2. CSF Framework for IQM. Box labels: Training; Continuous IQM Improvement; IQM Governance; IQ Requirements Management; IQ Risk Management; IQ Assessment/Monitoring; Continuous IQ Improvement; IP Lifecycle Management; Storage Management; Information Architecture Management; Information Security Management.]

The arrows in Figure 2 indicate directional CSF dependence. Dotted lines indicate groupings of closely related CSFs. Given that IQM Governance among others deals with the establishment of the IQM policy and procedures, roles and responsibilities, accountabilities, and the like, and as most of the other CSFs are either directly (top inner box representing quality assurance processes) or indirectly (bottom inner box representing operational processes) managed by it, it is suggested that organizations should start their IQM programs by addressing this CSF first. This is consistent with the importance of management commitment identified by Motwani (2001) and Xu and Al-Hakim (2005). The arrow pointing from IQM Governance to the box directly below it indicates that the four CSFs within the box (IQ Requirements Management, IQ Risk Management, IQ Assessment/Monitoring, and Continuous IQ Improvement) are directly managed by IQM Governance. Next, as IQ risks to the business objectives cannot be effectively identified without the knowledge of the relevant IQ requirements, following the establishment of a preliminary IQM Governance capability, organizations should focus on implementing IQ Requirements Management followed by IQ Risk Management. This is consistent with the findings in the literature, which stress the importance of identifying key stakeholders and understanding their requirements and expectations (Aaltonen, 2011; Jepsen & Eskerod, 2009; Pemsel & Widén, 2010). IQ Risk Management then directs the implementation of IQ Assessment/Monitoring by allowing organizations to target the most critical IPs (Borek, Woodall, & Parlikad, 2011); organizations do not need to monitor the quality of all their information. The IQ Assessment/Monitoring CSF (Juran & De Feo, 2010) then feeds the IQ gap (target – actual levels of IQ) to Continuous IQ Improvement for the purpose of process improvement (Baškarada, 2010; Motwani, 2001); Continuous IQ Improvement (i.e., process improvement) cannot effectively be implemented without IQ Assessment/Monitoring. Next, Continuous IQ Improvement can lead to process improvements in any of the four CSFs in the bottom inner box (as such, the bottom inner box is indirectly managed by IQM Governance). These four CSFs (IP Lifecycle Management, Storage Management, Information Architecture Management, and Information Security Management) are highly inter-dependent, as they all deal with operational processes (i.e., the business as usual collection, storage, analysis, communication, use, and the like of information). Changes to any of these four CSFs may result in follow-on effects on any of the other three. For instance, Continuous IQ Improvement may lead to changes to the IP supply chain (dealt with under the IP Lifecycle Management CSF; Wang, 1998). This change may in turn have an impact on Storage Management (e.g., the IP may now be managed by an external provider), on Information Architecture Management (e.g., relevant application interfaces may need to be updated), and on Information Security Management (e.g., communication to the external provider, say over the internet, may require encryption). Continuous IQM Improvement (Baškarada, 2010; Juran & De Feo, 2010) continuously monitors the effectiveness of the quality assurance processes in the top inner box and identifies any opportunities for IQM process improvement, which are then implemented through

IQM Governance. Similarly, Continuous IQM Improvement may also have an effect on Training (Juran & De Feo, 2010; Motwani, 2001), which in turn maintains and/or improves the capabilities of all 10 CSFs in the box below it.

LIMITATIONS

Several limitations concerning the qualitative interpretive research methodology applied in this study need to be acknowledged. From the positivist perspective, it may be argued that case and ethnographic studies are specific to particular organizations and, thus, may not be generalizable to the wider population (Gable, 1994). Additional potential limitations also include subjective interpretations as well as the inability to control dependent variables and manipulate independent variables (Lee, 1989). However, since IS phenomena are time, context, and interpretation dependent (Guba & Lincoln, 1982), case and ethnographic studies are also specific to a particular time, place, and context. It has also been argued that "a strong case can be made that external validity is enhanced more by many heterogeneous small experiments than by one or two large experiments" (Cook & Campbell, 1979, p. 80). Furthermore, such studies aim towards analytical generalization, as opposed to statistical generalization. In other words, where statistical generalization aims to make an inference about a population on the basis of empirical data collected from a sample, analytical generalization—as applied in this article—is made to theory and not to population (Myers, 1999; Walsham, 1993; Yin, 2008). In addition, while subjectivity needs to be acknowledged, interpretive research in general seeks to explain phenomena in terms of the meanings they hold for people (Orlikowski & Baroudi, 1991) by starting from the proposition that our knowledge of reality is a social construction by human actors (Walsham, 1993). Nevertheless, as is the case with any research, there is a need to further test the findings identified in this article in other organizations/contexts.

CONCLUSION AND FUTURE WORK

Considering the significance that IQ plays in the contemporary economy and society as well as the gap in the literature on CSFs for IQM, this study makes a major contribution to IS practice and theory. Based on 49 interviews in seven case studies and one six-month-long ethnographic study, this article identified 11 CSFs for effective IQM. The CSFs discussed include: Information Security Management, Storage Management, IQ Requirements Management, Training, IQM Governance, IQ Risk Management, Information Product Lifecycle Management, Information Architecture Management, IQ Assessment/Monitoring, Continuous IQ Improvement, and Continuous IQM Improvement. The resulting CSF framework for IQM explained the CSF inter-dependencies and suggested a logical sequence in which the CSFs should be implemented. Practitioners concerned with the development of holistic and effective IQM strategies may benefit from the CSF framework for IQM presented in this article by using it to develop a structured, process-oriented, IQM methodology. Academics interested in IQM may wish to further test the construct validity of the overall framework as well as of individual CSFs.

AUTHOR BIOS

Saša Baškarada, PhD, is a systems scientist and an adjunct research fellow in the School of Information Technology and Mathematical Sciences at the University of South Australia. He has published on a wide range of topics, including information quality management, information systems semiotics, e-collaboration, operations management, leadership, organizational learning, and research methods.

Andy Koronios, professor, is the head of the School of Information Technology and Mathematical Sciences at the University of South Australia. Andy holds academic qualifications in Electrical Engineering and Computing and Education, and a PhD from the University of Queensland. Andy has research interests in electronic commerce, information quality, computer security, and the management and strategic exploitation of information. He is also the research program leader of the Centre of Integrated Engineering Asset Management (CIEAM), a federally funded center for system integration in engineering asset management.

REFERENCES

Aaltonen, K. (2011). Project stakeholder analysis as an environmental interpretation process. International Journal of Project Management, 29, 165–183.
Abiteboul, S., Agrawal, R., Bernstein, P., Carey, M., Ceri, S., Croft, B., . . . Gawlick, D. (2005). The Lowell database research self-assessment. Communications of the ACM, 48(5), 111–118.
Ahire, S. L., Golhar, D. Y., & Waller, M. A. (1996). Development and validation of TQM implementation constructs. Decision Sciences, 27, 23–56.
Akhavan, P., Jafari, M., & Fathian, M. (2006). Critical success factors of knowledge management systems: A multi-case analysis. European Business Review, 18(2), 97–113.
Al-Hakim, L. (2008). Surgical disruption: Information quality perspective. International Journal of Information Quality, 2(2), 192–204.
Anderson, J. C., Rungtusanthanam, M., Schroeder, R., & Devaraj, S. (1995). A path analytic model of a theory of quality management underlying the Deming management method: Preliminary empirical findings. Decision Sciences, 26(5), 637–658.
Ariyachandra, T. R., & Frolick, M. N. (2008). Critical success factors in business performance management—Striving for success. Information Systems Management, 25(2), 113–120.
Ballou, D. P., & Pazer, H. L. (1985). Modeling data and process quality in multi-input, multi-output information systems. Management Science, 31(3), 150–162.
Ballou, D. P., Wang, R. Y., Pazer, H., & Tayi, G. H. (1998). Modeling information manufacturing systems to determine information product quality. Management Science, 44(4), 462–484.
Baškarada, S. (2010). Information quality management capability maturity model. Wiesbaden, Germany: Vieweg+Teubner.
Baškarada, S. (2011). How spreadsheet applications affect information quality. Journal of Computer Information Systems, 51(3), 77–84.
Baškarada, S., & Koronios, A. (2013). Data, Information, Knowledge, Wisdom (DIKW): A semiotic theoretical and empirical exploration of the hierarchy and its quality dimension. Australasian Journal of Information Systems, 18(1), 5–24.

Bell, T., Logan, D., & Friedman, T. (2008). Key issues for establishing information governance policies, processes and organization. Stamford, CT: Gartner Research.
Bergeron, F., & Begin, C. (1989). The use of critical success factors in evaluation of information systems: A case study. Journal of Management Information Systems, 5(4), 111–124.
Black, S. A., & Porter, J. L. (1996). Identification of critical success factors of TQM. Decision Sciences, 27(1), 1–21.
Blackburn, R., & Rosen, B. (1993). Total quality and human resource management: Lessons learned from Baldrige Award-winning companies. Academy of Management Executive, 7(2), 49–66.
Boeije, H. (2002). A purposeful approach to the constant comparative method in the analysis of qualitative interviews. Quality and Quantity, 36, 391–409.
Borek, A., Woodall, P., & Parlikad, A. K. (2011). A risk management approach to improving information quality for operational and strategic management. Paper presented at the Proceedings of the 18th EUROMA Conference: Exploring Interfaces.
Bose, R. (2009). Advanced analytics: Opportunities and challenges. Industrial Management & Data Systems, 109(2), 155–172.
Bowen, D. E., & Lawler, E. E. I. (1992). The empowerment of service workers: What, why, how, and when. Sloan Management Review, 33(3), 31–39.
Boynton, A., & Zmud, A. (1984). An assessment of critical success factors. Sloan Management Review, 25(4), 17–27.
Brown, M. L., & Kros, J. F. (2003). Data mining and the impact of missing data. Industrial Management & Data Systems, 103(8), 611–621.
Bullen, C. V., & Rockart, J. F. (1981). A primer on critical success factors. Cambridge, MA: Center for Information Systems Research, Sloan School of Management.
Camp, R. C. (1989). Benchmarking: The search for industry best practices that lead to superior performance. Milwaukee, WI: ASQ Quality Press.
Cao, Q., & Hoffman, J. J. (2011). A case study approach for developing a project performance evaluation system. International Journal of Project Management, 29, 155–164.
Chirinos, L., Losavio, F., & Matteo, A. (2004). Identifying quality-based requirements. Information Systems Management, 21(1), 15–26.
Civan, A., & Pratt, W. (2007). Information systems and healthcare XXII: Characterizing and visualizing the quality of health information. Communications of the Association for Information Systems, 20, 226–259.
Cook, T. D., & Campbell, D. T. (1979). Quasi-experimentation: Design & analysis issues for field settings. Boston, MA: Houghton Mifflin Company.
Crabtree, A., Nichols, D. M., O'Brien, J., Rouncefield, M., & Twidale, M. B. (2000). Ethnomethodologically informed ethnography and information system design. Journal of the American Society for Information Science, 51(7), 666–682.
Crosby, P. (1979). Quality is free. New York, NY: McGraw-Hill.
Daniel, D. R. (1961). Management information crisis. Harvard Business Review, September–October, 111–121.
Dasu, T., & Johnson, T. (2003). Exploratory data mining and data cleaning. New York, NY: Wiley-Interscience.
Davenport, T. H. (1993). Process innovation: Reengineering work through information technology. Boston, MA: Harvard Business School Press.
Dean, J. W., & Bowen, D. E. (1994). Management theory and total quality: Improving research and practice through theory development. The Academy of Management Review, 19(3), 392–418.
Deming, W. E. (1982). Out of the crisis. Cambridge, MA: MIT Press.
Edwards, C. D. (1968). The meaning of quality. Quality Progress.
Feigenbaum, A. V. (1986). Total quality control (3rd ed.). New York, NY: McGraw-Hill.
Fettke, P. (2009). How conceptual modeling is used. Communications of the Association for Information Systems, 25, 571–592.
Flynn, B. B., Schroeder, R. G., & Sakakibara, S. (1994). A framework for quality management research and associated measurement instrument. Journal of Operations Management, 11, 339–366.
Fortune, J., & White, D. (2006). Framing of project critical success factors by a systems model. International Journal of Project Management, 24, 53–65.
Freund, Y. P. (1988). Planner's guide: Critical success factors. Strategy & Leadership, 16(4), 20–20.
Friedman, T. (2008). Case study: Aera Energy's comprehensive focus on data quality generates competitive advantage. Stamford, CT: Gartner Research.
Friedman, T. (2009). Findings from primary research study: Organizations perceive significant cost impact from data quality issues. Stamford, CT: Gartner Research.
Friedman, T., & Bitterer, A. (2009). Magic quadrant for data quality tools. Stamford, CT: Gartner Research.
Gable, G. G. (1994). Integrating case study and survey research methods: An example in information systems. European Journal of Information Systems, 3(2), 112–126.
Garvin, D. (1987). Competing on the eight dimensions of quality. Harvard Business Review, 65(6), 101–109.
Gelle, E., & Karhu, K. (2003). Information quality for strategic quality planning. Industrial Management & Data Systems, 103(8), 633–643.
Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory. London, UK: Weidenfeld and Nicolson.
Guba, E. G., & Lincoln, Y. S. (1982). Epistemological and methodological bases of naturalistic inquiry. Educational Communication and Technology Journal, 30, 233–253.
Guynes, C. S., & Vanecek, M. T. (1996). Critical success factors in data management. Information & Management, 30(4), 201–209. doi:10.1016/0378-7206(95)00053-4
Hackman, J. R., & Wageman, R. (1995). Total quality management: Empirical, conceptual, and practical issues. Administrative Science Quarterly, 40, 309–342.
Hardie, N., & Walsh, P. (1994). Toward a better understanding of quality. International Journal of Quality & Reliability Management, 11(4), 53–63.
Haug, A., Arlbjørn, J. S., & Pedersen, A. (2009). A classification model of ERP system data quality. Industrial Management & Data Systems, 109(8), 1053–1068.
Hruschka, D. J., Schwartz, D., St. John, D. C., Picone-Decaro, E., Jenkins, R. A., & Carey, J. W. (2004). Reliability in coding open-ended data: Lessons learned from HIV behavioral research. Field Methods, 16(3), 307–331. doi:10.1177/1525822x04266540
Hwang, M., & Cappel, J. J. (2002). Data warehouse development and management: Practices of some large companies. The Journal of Computer Information Systems, 43(1), 3–6.
Ivert, L. K., & Jonsson, P. (2010). The potential benefits of advanced planning and scheduling systems in sales and operations planning. Industrial Management & Data Systems, 110(5), 659–681.
Jepsen, A. L., & Eskerod, P. (2009). Stakeholder analysis in projects: Challenges in using current guidelines in the real world. International Journal of Project Management, 27, 335–343.
Juran, J. M. (1974). Quality control handbook (3rd ed.). New York, NY: McGraw-Hill.
Juran, J. M., & De Feo, J. A. (2010). Juran's quality handbook (6th ed.). New York, NY: McGraw-Hill.
JUSE. (2007). The Deming prize guide. Retrieved from http://www.juse.or.jp/e/deming/pdf/demingguide2007_01.pdf
Kendrick, T. (2009). Identifying and managing project risk (2nd ed.). New York, NY: Amacom.
Kutsch, E., & Hall, M. (2010). Deliberate ignorance in project risk management. International Journal of Project Management, 28, 245–255.
La Porta, R., Lopez-de-Silanes, F., Shleifer, A., & Vishny, R. (2000). Investor protection and corporate governance. Journal of Financial Economics, 58(1–2), 3–27.
Lawler, E. E. I. (1994). Total quality management and employee involvement: Are they compatible? Academy of Management Executive, 8(1), 68–76.
Lee, A. S. (1989). A scientific methodology for MIS case studies. MIS Quarterly, 13(1), 33–50.
Lee, A. S., & Baskerville, R. L. (2003). Generalizing generalizability in information systems research. Information Systems Research, 14(3), 221–243.
Lee, J. H., Shim, H.-J., & Kim, K. K. (2010). Critical success factors in SOA implementation: An exploratory study. Information Systems Management, 27(2), 123–145.
Lee, Y., Strong, D., Kahn, B., & Wang, R. (2002). AIMQ: A methodology for information quality assessment. Information Management, 40, 133–146.
Lewis, J. P. (2000). The project manager's desk reference. New York, NY: McGraw-Hill.
Lillrank, P. (2003). The quality of information. International Journal of Quality and Reliability Management, 20(6), 691–703.
Lu, X.-H., Huang, L.-H., & Heng, M. S. H. (2006). Critical success factors of inter-organizational information systems—A case study of Cisco and Xiao Tong in China. Information & Management, 43(3), 395–408. doi:10.1016/j.im.2005.06.007
Ma, C., Chou, D. C., & Yen, D. C. (2000). Data warehousing, technology assessment and management. Industrial Management & Data Systems, 100(3), 125–134.
Madnick, S. E., Wang, R. Y., Lee, Y. W., & Zhu, H. (2009). Overview and framework for data and information quality research. ACM Journal of Data and Information Quality, 1(1), 2:1–2:22.
Mann, R., & Kehoe, D. (1994). An evaluation of the effects of quality improvement activities on business performance. International Journal of Quality and Reliability Management, 11(4), 29–44.
Mendoza, L. E., Pérez, M., & Grimán, A. (2006). Critical success factors for managing systems integration. Information Systems Management, 23(2), 56–75.
Milakovich, M. E. (2004). Rewarding quality and innovation: Awards, charters, and international standards as catalysts for change. In M. A. Wimmer (Ed.), Knowledge management in electronic government (pp. 80–90). Berlin, Germany: Springer.
Milosevic, D., & Patanakul, P. (2005). Standardized project management may increase development projects success. International Journal of Project Management, 23, 181–192.
Moon, K. L., & Ngai, E. W. T. (2008). The adoption of RFID in fashion retailing: A business value-added framework. Industrial Management & Data Systems, 108(5), 596–612.
Motwani, J. (2001). Critical factors and performance measures of TQM. The TQM Magazine, 13(4), 292–300.
Myers, M. D. (1999). Investigating information systems with ethnographic research. Communications of the Association for Information Systems, 2(23), 1–20.
Nemati, H. R., & Barko, C. D. (2003). Key factors for achieving organizational data-mining success. Industrial Management & Data Systems, 103(4), 282–292.
Oakland, J. S. (1993). Total quality management (2nd ed.). Oxford, UK: Butterworth-Heinemann.
Oliver, N. (1988). Employee commitment and total quality control. International Journal of Quality and Reliability Management, 7(1), 21–29.
Orlikowski, W. J., & Baroudi, J. J. (1991). Studying information technology in organizations: Research approaches and assumptions. Information Systems Research, 2(1), 1–28.
Paradice, D. B., & Fuerst, W. L. (1991). An MIS data quality methodology based on optimal error detection. Journal of Information Systems, 5(1), 48–66.
Paulk, M. C., Curtis, B., Chrissis, M. B., & Weber, C. V. (1993). Capability maturity model for software, Version 1.1. Pittsburgh, PA: Software Engineering Institute/Carnegie Mellon University.
Payton, F. C., & Handfield, R. (2003). Data implementation and outsourcing challenges: An action research project with Selectron. Communications of the Association for Information Systems, 12, 633–648.
Peffers, K., Gengler, C. E., & Tuunanen, T. (2003). Extending critical success factors methodology to facilitate broadly participative information systems planning. Journal of Management Information Systems, 20(1), 51–85.
Pemsel, S., & Widén, K. (2010). Creating knowledge of end users' requirements: The interface between firm and project. Project Management Journal, 41(4), 122–130.
Phan, D. D. (2001). Software quality and management: How the world's most powerful software makers do it. Information Systems Management, 18(1), 56–68.
Pipino, L., Lee, Y., & Wang, R. (2002). Data quality assessment. Communications of the ACM, 45(4), 211–218.
Pipino, L., Wang, R., Kopcso, D., & Rybolt, W. (2005). Developing measurement scales for data-quality dimensions. In R. Y. Wang, E. M. Pierce, S. E. Madnick, & C. W. Fisher (Eds.), Information quality (pp. 37–52). Armonk, NY: M. E. Sharpe.
Pollard, C., & Cater-Steel, A. (2009). Justifications, strategies, and critical success factors in successful ITIL implementations in U.S. and Australian companies: An exploratory study. Information Systems Management, 26(2), 164–175.
Poston, R. S., Reynolds, R. B., & Gillenson, M. L. (2006). Technology solutions for improving accuracy and availability of healthcare records. Information Systems Management, 24(1), 59–71.
Powell, T. C. (1995). Total quality management as competitive advantage: A review and empirical study. Strategic Management Journal, 16, 15–37.
Price, R., Neiger, D., & Shanks, G. (2008). Developing a measurement instrument for subjective aspects of information quality. Communications of the Association for Information Systems, 22(3), 49–74.
Price, R., & Shanks, G. (2004). A semiotic information quality framework. Paper presented at the Proceedings of the IFIP International Conference on Decision Support Systems (DSS2004), Prato, Italy.
Redman, T. C. (1995). Improve data quality for competitive advantage. Sloan Management Review, 36(2), 99–107.
Redman, T. C. (1996). Data quality for the information age. Boston, MA: Artech House.
Rockart, J. F. (1979). Chief executives define their own data needs. Harvard Business Review, March–April, 22–24.
Sabherwal, R., & Kirs, P. (1994). The alignment between organizational critical success factors and information technology capability in academic institutions. Decision Sciences, 25(2), 301–330. doi:10.1111/j.1540-5915.1994.tb00805.x
Saraph, J. V., Benson, P. G., & Schroeder, R. G. (1989). An instrument for measuring the critical factors of quality management. Decision Sciences, 20(4), 457–478.
Sari, K. (2008). Inventory inaccuracy and performance of collaborative supply chain practices. Industrial Management & Data Systems, 108(4), 495–509.
Saura, I. G., Frances, D. S., Contrí, G. B., & Blasco, M. F. (2008). Logistics service quality: A new way to loyalty. Industrial Management & Data Systems, 108(5), 650–668.
Shchiglik, C., & Barnes, S. J. (2004). Evaluating website quality in the airline industry. The Journal of Computer Information Systems, 44(3), 17–25.
Sherer, S. A., & Alter, S. (2004). Information system risk and risk factors: Are they mostly about information systems? Communications of the Association for Information Systems, 14, 29–64.
Shewhart, W. A. (1931). Economic control of quality of manufactured product. New York, NY: Van Nostrand.
Silvola, R. (2009). Product data management practices in high-tech companies. Industrial Management & Data Systems, 109(6), 758–774.
Solomon, M. D. (2005). It's all about the data. Information Systems Management, 22(3), 75–80.
Stake, R. E. (1995). The art of case study research. Thousand Oaks, CA: Sage Publications.
Strauss, A., & Corbin, J. (1994). Grounded theory methodology. In N. K. Denzin & Y. S. Lincoln (Eds.), Handbook of qualitative research (pp. 273–286). Thousand Oaks, CA: Sage Publications.
Tayi, G. K., & Ballou, D. P. (1998). Examining data quality. Communications of the ACM, 41(2), 54–57.
Turner, J. R. (2006). Towards a theory of project management: The nature of the project governance and project management. International Journal of Project Management, 24(2), 93–95. doi:10.1016/j.ijproman.2005.11.008
Vosburg, J., & Kumar, A. (2001). Managing dirty data in organizations using ERP: Lessons from a case study. Industrial Management & Data Systems, 101(1), 21–31.
Walsham, G. (1993). Interpreting information systems in organisations. Chichester, UK: John Wiley & Sons.
Wand, Y., & Wang, R. Y. (1996). Anchoring data quality dimensions in ontological foundations. Communications of the ACM, 39(11), 86–95.
Wang, R. Y. (1998). A product perspective on total data quality management. Communications of the ACM, 41(2), 58–65.
Wang, R. Y., & Strong, D. M. (1996). Beyond accuracy: What data quality means to data consumers. Journal of Management Information Systems, 12(4), 5–34.
Watson, H. J. (2009). Business intelligence—Past, present, and future. Communications of the Association for Information Systems, 25, 487–510.
Watson, H. J., Wixom, B. H., Buonamici, J. D., & Revak, J. R. (2001). Sherwin-Williams' data mart strategy: Creating intelligence across the supply chain. Communications of the Association for Information Systems, 5(9), 1–27.
Xu, H., & Al-Hakim, L. (2005). Criticality of factors affecting data quality of accounting information systems: How perceptions of importance and performance can differ. In R. Y. Wang, E. M. Pierce, S. E. Madnick, & C. W. Fisher (Eds.), Information quality (pp. 197–214). New York, NY: M. E. Sharpe.
Xu, H., & Koronios, A. (2004/2005). Understanding information quality in e-business. The Journal of Computer Information Systems, 45(2), 73–82.
Xu, H., Nord, J. H., Brown, N., & Nord, G. D. (2002). Data quality issues in implementing an ERP. Industrial Management & Data Systems, 102(1), 47–58.
Xu, H., Nord, J. H., Nord, G. D., & Lin, B. (2003). Key issues of accounting information quality management: Australian case studies. Industrial Management & Data Systems, 103(7), 461–470.
Yin, R. K. (2008). Case study research. Thousand Oaks, CA: SAGE Publications, Inc.
Zeitz, G., Johannesson, R., & Ritchie, J. E. J. (1997). An employee survey measuring total quality management practices and culture. Group and Organization Management, 22(4), 414–444.