Journal of Decision Systems, 2016
VOL. 25, NO. S1, 463–475
http://dx.doi.org/10.1080/12460125.2016.1187384

Business Intelligence (BI) system evolution: a case in a healthcare institution

Ehsanur Rahman Safwan, Rob Meredith and Frada Burstein
Faculty of Information Technology, Monash University, Caulfield East, Australia

ABSTRACT
Business Intelligence (BI) systems are an important part of many organisations' IT portfolios. While the evolutionary nature of other kinds of decision support technology has been noted, there is little research investigating the evolutionary nature of BI systems. This paper presents a case study of a BI system development in a large Australian healthcare institution and uses evolutionary theories from decision support systems (DSS) to understand the system evolution observed. The paper concludes that the theories describing evolution in DSS can be effectively applied to BI as well. BI practitioners and developers should be aware of evolutionary triggers, as well as the different kinds of evolution that can affect BI system evolution.

KEYWORDS
Business intelligence; data warehousing; system evolution; healthcare; case study

1. Introduction
Business Intelligence (BI) systems are a significant element of many organisations’ IT port-
folios. BI systems provide reporting and analytical capabilities to enterprises through inte-
grating data collected from a range of internal and external sources, and providing users
with data visualisation tools to improve organisational decision-making. Arnott and Pervan
(2014) note that BI and business analytics (BA) have been a top priority for global CIOs since
the early 2000s, while Gartner (2015) predict that they will continue to be a top priority well
into the future.
BI systems have high visibility within organisations. BI systems have the capability to
break down organisational information barriers and influence decisions that have a strategic
impact on enterprises. BI systems, in part, help to make sense of organisational data, con-
tributing to better-informed decision-making. While successful implementation is important
for any IT initiative, the consequences of BI failure are especially problematic given its high
profile and impact across multiple business units.
One research approach to understanding how to increase the likelihood of a successful
BI implementation is a critical success factors (CSF) analysis. Arnott (2008) conducted a
meta-review of a number of studies including McBride (1997), Poon and Wagner (2001),
Salmeron and Herrero (2005), Sammon and Finnegan (2000), and Wixom and Watson (2001)
that investigated the success of BI, Data Warehouse, and Executive Information Systems (EIS).

CONTACT  Rob Meredith  Rob.Meredith@monash.edu



One of the common success factors noted was the importance of an evolutionary develop-
ment approach.
To date, there are few studies that have specifically investigated BI system evolution, its
causes, patterns, and management by the development team. Arnott and Pervan (2005,
p. 71) argue that BI is the 'contemporary' descendant of EIS, both of which are variants of DSS.
Arnott and Pervan (2014), however, distinguish BI from earlier EIS, arguing that BI is intended for wider adoption throughout an organisation than just the executive suite, which was the focus of EIS. They highlight differences in technology as well, including web-based interfaces,
dashboard-style reporting and the use of business performance measurement techniques
such as Balanced Scorecard.
This paper argues that even though BI systems end up being large-scale systems used by decision-makers at many levels in an organisation, their evolutionary nature does not allow them to be initially developed as large-scale systems. BI developers face the same issues of unclear and changing requirements that past DSS and EIS developers faced. A BI system can start off as a system developed to support narrowly focused decision tasks, with the scope gradually increasing as the system evolves. New functionality is added as a result of system use and other internal and external triggers that cause system evolution.
This paper describes one such system undergoing evolution. We draw on the DSS evo-
lutionary framework of Arnott (2004) to describe the tempo, aetiology and triggers of evolu-
tion in the case. In doing so, we demonstrate that BI evolution shares similar features with
DSS evolution, lending support to Arnott and Pervan’s (2005) assertion that BI belongs to
the ‘family tree’ of DSS. This paper provides an argument that Arnott’s (2004) evolutionary
framework can be applied by researchers investigating BI as well as DSS, and highlights a
number of evolutionary concepts industry practitioners need to be aware of when devel-
oping BI systems.
The paper is organised as follows. The next section provides a brief review of the literature
on DSS evolution, followed by a review of the limited extant literature on the evolution of
BI systems. The subsequent section presents the case of BI development at a large Australian
healthcare organisation, followed by discussion and conclusions for understanding BI sys-
tems evolution in practice and research.

2.  Evolutionary development of DSS


The idea of evolution has been central to DSS theory and research since Keen’s (1980) adap-
tive framework for DSS development. Keen argued that DSS can only properly provide deci-
sion support where the system responds and adapts to shifting decision support requirements
as the decision-maker’s understanding of the decision task is influenced and changed by
the DSS. The evidence for evolution as a critical success factor for DSS was first noted by
Meador and Ness (1974), Ness (1975), and Courbon, Grajew, and Tolovi (1978), but Keen’s
(1980) framework described the interactions between the system, the user, and the builder,
as well as the decision task and the organisational environment.
Decision-making as a task, and therefore decision support as a design objective, is par-
ticularly complex. Sprague (1980) notes that there is no single approach to decision-making,
and the conditions under which decision-makers operate are constantly changing. Keen
(1980) showed that not only are the initial decision support requirements of users necessarily
poorly understood at the outset of a DSS development project, they shift and change as a
direct result of the provision of decision support by the system. In short, the decision-maker
learns about the decision problem, and the decision questions that need answering change
as the user works through the decision-making process. The DSS design therefore has to
adapt as a direct result of its successful use. This means that development approaches such
as the traditional ‘waterfall’ model of the systems development lifecycle that are predicated
on a clearly explicated specification of design requirements prior to implementation (and
therefore use) are inappropriate (Weinberg, 1991).

2.1.  Arnott (2004) DSS evolution framework


Arnott’s (2004) framework of DSS evolution (see Figure 1 below) is based on three concepts
from the theory of evolutionary biology: aetiology (or evolutionary trigger), lineage and
tempo. He argued that DSS can evolve with different tempos: continuous evolution, punc-
tuated equilibrium, and quantum evolution. He suggested that the lineage of a DSS can
evolve at two different levels. Individual DSS applications can evolve at the micro level, giving
rise to evolution in functionality – so-called ‘within-application’ evolution. DSS can also evolve
into new ‘species’ where entirely new DSS applications arise from the development and use
process. Arnott refers to this macro-level evolution as ‘between-application’ evolution.
Arnott (2004) suggested that system use is not the only cause of DSS evolution. He
describes two categories of evolutionary triggers, or aetiology: cognitive causal factors
related to the users and their interaction with the system; and environmental causal factors
resulting from factors external to the user, developer and the system itself. Table 1 lists
examples of these two kinds of evolutionary triggers given by Arnott (2004).

Figure 1. Arnott’s (2004) framework of DSS evolution, with aetiology on the vertical axis, lineage on the
horizontal, and tempo populating the cells of the matrix.

Table 1. Aetiology of DSS evolution from Arnott (2004).

Cognitive causal factors       Environmental causal factors
System use                     Technology change
Analyst interaction            Personnel change
Peer interaction               Internal organisational change
Consultant interaction         Merger and acquisition
Training course                Industry changes
'Idle' thought                 Coevolution
Arnott combines these three concepts, aetiology, lineage and tempo, into a single integrated framework, mapping the kinds of evolutionary tempos that might be observed against aetiology and lineage, as shown in Figure 1.
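To make these three concepts concrete, the following minimal Python sketch (our illustration, not part of Arnott's framework) shows one way a researcher coding case data might represent an evolutionary event in terms of aetiology, lineage and tempo. The enumeration values are taken directly from Table 1 and Figure 1; the class, field and example values are hypothetical.

from __future__ import annotations

from dataclasses import dataclass
from datetime import date
from enum import Enum


class Aetiology(Enum):
    # Cognitive causal factors (Table 1)
    SYSTEM_USE = "system use"
    ANALYST_INTERACTION = "analyst interaction"
    PEER_INTERACTION = "peer interaction"
    CONSULTANT_INTERACTION = "consultant interaction"
    TRAINING_COURSE = "training course"
    IDLE_THOUGHT = "'idle' thought"
    # Environmental causal factors (Table 1)
    TECHNOLOGY_CHANGE = "technology change"
    PERSONNEL_CHANGE = "personnel change"
    INTERNAL_ORGANISATIONAL_CHANGE = "internal organisational change"
    MERGER_AND_ACQUISITION = "merger and acquisition"
    INDUSTRY_CHANGES = "industry changes"
    COEVOLUTION = "coevolution"


class Lineage(Enum):
    WITHIN_APPLICATION = "within-application"    # micro-level change in functionality
    BETWEEN_APPLICATION = "between-application"  # macro-level emergence of new applications


class Tempo(Enum):
    CONTINUOUS = "continuous evolution"
    PUNCTUATED_EQUILIBRIUM = "punctuated equilibrium"
    QUANTUM = "quantum evolution"


@dataclass
class EvolutionEvent:
    """One observed evolutionary cycle, coded against the framework."""
    description: str
    observed: date
    triggers: list[Aetiology]
    lineage: Lineage
    tempo: Tempo


# Hypothetical example of coding a single event from a case.
dashboard_event = EvolutionEvent(
    description="Dashboard capability added for executive use",
    observed=date(2014, 1, 1),  # illustrative date only
    triggers=[Aetiology.SYSTEM_USE, Aetiology.ANALYST_INTERACTION],
    lineage=Lineage.WITHIN_APPLICATION,
    tempo=Tempo.PUNCTUATED_EQUILIBRIUM,
)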
Despite the nature of BI as a decision support tool, and the noted importance of evolu-
tionary approaches to DSS development, there are few studies that specifically investigate
the nature of evolution in BI systems development. Where studies do look at BI development
(Bara et al., 2009; Gangadharan & Swami, 2004; Olszak & Ziemba, 2007), they do not adopt
a full evolutionary view. While many do incorporate iterative aspects in the development
process, iteration per se is not evolution. Conversely, in papers such as Yeoh and Koronios
(2010) where evolution is noted as a CSF for BI development, the concept is not expanded
to look at evolutionary characteristics (such as those in the Arnott (2004) framework), nor
applied empirically.
Given the lack of empirical research investigating the evolution of BI systems development
in practice, the remainder of the paper presents a case study of BI systems development in
a large Australian hospital and applies the Arnott (2004) framework of DSS evolution to
provide a description of the evolutionary nature of the project, as well as demonstrate the
applicability of Arnott's DSS framework to BI systems.

3.  A case study of BI system evolution in healthcare


In this section we present a single case of a BI development project at a large Australian
teaching hospital. We adopted an interpretivist approach (Myers, 1997) with the unit of
analysis being the development of a single BI system (including all of the components and
applications that together formed the system) in the hospital. Semi-structured interviews
were used, which allowed a balance between pre-determined questions to investigate spe-
cific theoretical concepts, and the broader sense making actions of the interviewees. This
allowed a rich understanding of the evolution of the system. The case was selected oppor-
tunistically through the authors’ professional network of contacts.
The hospital, together with two other partner institutions, each with numerous departments, had its data spread across these institutions. The data resided in several thousand databases. The hospital did not have any established data governance strategy that would ensure the safeguarding of this critical hospital and medical data. An effort to secure the data was therefore initiated in 2011 by the hospital's finance department. The spread of data
across so many databases affected the progress of the development team in the hospital’s
attempt to develop the BI platform and is a continuing issue for the organisation. We inter-
viewed six members of the development team in Q2 and Q3 of 2015 who were involved
with the development of the BI system. These included a department head, an information
architect, a business solution designer, a database administrator/application developer and
the acting director of their applications and knowledge management (AKM) department.
Most of the interviewees had between one and five years of work experience in the industry, and had tertiary qualifications in IT.
The BI system was developed to support several decision domains: hospital management,
clinical auditing, clinical decision support, and clinical research. The system underwent a
number of evolutionary cycles, and its changing structure is portrayed through the timeline
presented in Figure 2.
Figure 2. BI system timeline, with the case data collection period shaded. See Table 2 below for explanations of acronyms and project-specific terms.

Table 2. Identified stages of BI system evolution at Hospital A.

1. BI initiation: the first stage, when development of the BI system was first undertaken by the development team under the finance department.
2. REASON platform development: this stage started when the Research Analytics and Operations (REASON) platform project was undertaken for development by the Information Development Division (IDD).
3. Project office establishment: the start of the collaboration between the development team and the project office that brought changes to the system. The project office had a continuous effect on change due to the different projects being initiated at the hospital.
4. Dashboard development: the fourth stage of development was an interactive dashboard capability developed for hospital executive use.
5. Cohort Discovery Tool (CDT) development: the CDT is a knowledge discovery tool using advanced search techniques, added to the BI system portfolio for identifying cohorts among the hospital patients.
6. REDCap procurement: the Research Electronic Data Capture (REDCap) application was added as a backend data collection tool for the BI system. REDCap was a software solution designed for rapid development of data collection tools for use in clinical and research data collection (Harris et al., 2009).
7. System split: a major split of the system, with one branch focused on healthcare management and the other on clinical decision support (including research and clinical auditing).

The development of the BI system was an on-going effort as the system evolved, with
several project stages shown in Table 2.
In the following section, we apply the Arnott (2004) DSS evolution framework to the
development process of the BI system at the hospital.

3.1. Aetiology
The BI system evolved with the addition of functionality and support for new decision areas.
Arnott’s (2004) framework describes two kinds of evolutionary triggers: cognitive and
environmental. Both were observed in the case. It was further observed that multiple evo-
lutionary cycles were caused by a combination of different triggers. The evolutionary triggers
are discussed in the context of the observed micro- and macro-level system evolution in the
following sections. The interpretation of the case provides an understanding of the complex
nature of the observed BI system evolution.

3.2.  Lineage – within-application evolution


There was a significant amount of small-scale, within-application evolution of the system as
functionality and design were modified throughout the development process. We outline
here some of the causes and effects of these micro-evolutionary changes.

3.2.1.  System use as an evolutionary causal factor


In line with Keen’s (1980) observations of adaptive development for DSS, usage of the system
was itself a key driver of evolutionary pressure. Use of the system acted as a stimulus for the
users as they realised what was potentially possible with the system and placed pressure on
the developers to adapt the system accordingly. The project’s information architect reflected
on the triggering of changes resulting from system use as follows:
Absolutely, once they realised what was available, it was like oh look! Something shiny. So they would say "right, so how do we get access to that", "How do I use this", "What does this data mean". So you constantly fill the enquiries about the information in the platform and helping them to be able to integrate it with their systems. So one week they might be looking at blood donations, next week might be looking at pathology results, and so it's a constant evolving process.
According to the developers, training activities that engaged the users and educated them about the system were essential. This was because the system had different user groups who were neither aware of the existence of the system nor had any comprehension of its capabilities. Although system use was a significant trigger, other evolutionary causal factors were also identified.

3.2.2.  Changes to the data architecture


The data architecture of the system evolved because of both cognitive and environmental
factors. The data architecture design was subject to change based on the ‘idle’ thought of
the development team, especially the operating sponsor, who aimed to reduce the number
of databases from the several thousand prior to the project to a more manageable number.
The project’s information architect commented on the data architecture:
When we started there was reporting services, 800 or 900 reports looking at the system, there
was no standards, no easy way to get to the data and very manual process were in place to load
data into it, and it was a nightmare. A big spaghetti mess of data.
Environmental factors included adoption of new versions of the data management soft-
ware, and new metadata management software being implemented by the development
team. The data architecture was also affected by between-application evolution, where the
system was split in two to support hospital management on the one hand and clinical deci-
sion-making on the other, as described below in Section 3.3.
3.2.3.  Increasing data sources


Due to the large number of departments in the hospital and the commensurate complexity
in data sources, it was not possible to acquire the data from all possible sources at the
beginning of the development. The development team faced hurdles from some depart-
ments who considered their data was not shareable. The project’s business solution designer
stated:
I think what from my experience I felt like in healthcare industries people work in silos, and they
think their data is very valuable to them, and it cannot be shared.
Getting access to data from both internal and external stakeholders was also a challenge
due to multiple factors such as the location of data, size of the data, bureaucracy, and organ-
isational politics. The project’s operating sponsor said:
We had to push quite hard to get the data from that system, and it took probably a year of
pushing and negotiation but we eventually got access to it.
The development team also had issues trying to get access to data from the vendor of
the hospital information system who stored the data for the hospital. The sponsor said:
We would say we want to get access to data, and they would say what are you trying to do?
Because they are trying to sell you something extra.
Along with the data architecture, the data sources feeding it also evolved. Environmental factors caused these changes as well: a merger between the hospital and two other institutions made new data sources available.

3.2.4.  Evolution in the number of reports


One of the effects of the evolution of the system was that the number of reports increased,
primarily through the influence of the users. Often they would demand ad hoc reports, which
would then be incorporated into the system design. Through training and interaction with the analysts, the users' understanding of how they could use the system, and of what additional information they could receive, changed.

3.2.5.  Evolution in visualisation tools and techniques


The system evolved for hospital management use through the addition of a dashboard tool
from a renowned BI vendor. This evolution was driven by both cognitive and environmental
factors. As the users used the system, they desired to have more control over generating
reports and checking the status of hospital KPIs, and through discussion with the analysts,
the demand for dashboard features emerged. The chosen dashboard interface
had the ability to provide greater control to the users to choose how they wanted to view
the information. The project’s operating sponsor said:
So some people just want summary lines for certain things, and others want summary lines and
detailed data behind it that you can drill in, cross references and things.
Simultaneously, the push from technology vendors demonstrating advanced features in
dashboard format triggered the demand for the functionality.
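The 'summary lines' versus 'drill in' behaviour the sponsor describes can be illustrated with a minimal sketch. The ward names, figures and function names below are hypothetical, serving only to show the kind of roll-up and drill-down a dashboard layer performs over detail-level records.

from collections import defaultdict

# Hypothetical detail-level records behind a hospital KPI dashboard.
admissions = [
    {"ward": "Cardiology", "month": "2014-03", "admissions": 120, "avg_los_days": 4.2},
    {"ward": "Cardiology", "month": "2014-04", "admissions": 131, "avg_los_days": 4.0},
    {"ward": "Oncology",   "month": "2014-03", "admissions": 98,  "avg_los_days": 6.1},
    {"ward": "Oncology",   "month": "2014-04", "admissions": 104, "avg_los_days": 5.8},
]

def summary_lines(records):
    """Roll detail rows up to one summary line per ward (the 'summary only' view)."""
    totals = defaultdict(int)
    for row in records:
        totals[row["ward"]] += row["admissions"]
    return dict(totals)

def drill_down(records, ward):
    """Return the detail rows behind a summary line (the 'drill in' view)."""
    return [row for row in records if row["ward"] == ward]

if __name__ == "__main__":
    print(summary_lines(admissions))           # e.g. {'Cardiology': 251, 'Oncology': 202}
    print(drill_down(admissions, "Oncology"))  # monthly detail behind the Oncology line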
Another addition to the BI portfolio was the cohort discovery tool (CDT). The CDT
was an in-house knowledge discovery and information retrieval tool that allowed the users
to ask questions to the platform about a particular topic, and the system would retrieve all
existing information the hospital had on that particular topic or query, and present the
results in a predetermined format. The tool had two different interfaces. One was a simple
interface with a limited subset of data commands navigated through point and click proce-
dures. Another was a more advanced interface that involved interaction through a scripting
language. The complex interface provided broader functionality compared to the simple
interface but required technical skills to operate. This tool targeted the data managers, cli-
nicians, and researchers who required access to data, but were unaware of what data were
available. The CDT arose out of interaction between the development team members and
reflecting on how to push data out to the clinical users in different ways.
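As an illustration of the contrast between the CDT's two interfaces, the sketch below is hypothetical: the patient records, field names and helper functions are invented, and stand in for the real tool's limited point-and-click commands on the one hand and its scripting interface on the other.

# Hypothetical patient records; field names are illustrative only.
patients = [
    {"id": 1, "age": 67, "diagnosis": "diabetes",     "last_hba1c": 8.9},
    {"id": 2, "age": 45, "diagnosis": "hypertension", "last_hba1c": None},
    {"id": 3, "age": 72, "diagnosis": "diabetes",     "last_hba1c": 7.1},
]

def simple_cohort(diagnosis=None, min_age=None):
    """Simple interface: a fixed set of filters exposed as parameters
    (the point-and-click style of query)."""
    cohort = patients
    if diagnosis is not None:
        cohort = [p for p in cohort if p["diagnosis"] == diagnosis]
    if min_age is not None:
        cohort = [p for p in cohort if p["age"] >= min_age]
    return [p["id"] for p in cohort]

def scripted_cohort(predicate):
    """Advanced interface: the user supplies an arbitrary expression,
    trading ease of use for broader functionality."""
    return [p["id"] for p in patients if predicate(p)]

if __name__ == "__main__":
    print(simple_cohort(diagnosis="diabetes", min_age=65))       # [1, 3]
    print(scripted_cohort(lambda p: p["diagnosis"] == "diabetes"
                          and (p["last_hba1c"] or 0) > 8.0))     # [1]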

3.2.6.  Key findings for within-application evolution


The evolution of functionality within the system was caused by both cognitive and environ-
mental factors. In addition to the users demanding changes resulting from system use, the
development team also played an active role in contributing to the system evolution, espe-
cially through idle thought and peer interaction. Some of the evolutionary cycles were
affected by both cognitive and environmental factors simultaneously, such as the evolution
of the data architecture and the acquisition of the dashboard tool. Evolution of the system was complex, driven by multiple causes.


In terms of lineage, the difference between within-application and between-application
evolutionary cycles sits on a spectrum, rather than representing distinct classes of evolu-
tionary types.

3.3.  Lineage – between-application evolution


There were three major macro-level evolutionary events in the life of the BI system. The first
occurred at the end of 2011 with the development of the REASON platform. The platform
was developed to support two major application areas: hospital management and clinical
decision-making. The second came with the acquisition of a tool known as Research
Electronic Data Capture, or REDCap. This tool allowed the hospital to rapidly deploy a number of data collection tools throughout the organisation and was widely used by staff from a range of departments. REDCap collected data for clinical auditing and clinical research. The third major evolutionary event occurred in 2015 with a split in the system, again along the lines of hospital management and clinical decision support.

Table 3. Evolution in the hospital BI system.

Cognitive causal factors – within-application evolution:
  Data architecture (trigger: idle thought)
  Number of screens (triggers: system use; training course; analyst interaction)
  Dashboard (triggers: system use; analyst interaction)
  CDT (triggers: peer interaction; idle thought)
Cognitive causal factors – between-application evolution: none observed
Environmental causal factors – within-application evolution:
  Data architecture (triggers: technology change; internal organisational change)
  Data source (triggers: merger and acquisition; internal organisational change)
  Dashboard (trigger: technology change)
Environmental causal factors – between-application evolution:
  REASON platform (triggers: industry changes; internal organisational change)
  REDCap (triggers: technology change; industry changes)
  System split (trigger: internal organisational change)
Triggers for these splits were environmental, as shown in Table 3. The changes were a
result of a combination of industry changes and internal organisational change. The major
organisational change was the creation of an information development division (IDD) in
2011.
At the hospital, clinical quality management was approached through clinical audits that
were performed in different departments and driven by industry changes. The inclusion of
REDCap increased the scope of the BI system to include clinical auditing. Along with this
change in technology, a number of institutions joined the hospital in a consortium, using
REDCap as a platform for research and for creating data collection tools. This not only facilitated research within the consortium, but also led to the creation of audit tools supporting clinical quality management.
A particularly interesting observation from the case is that the development pressures
of supporting the two application areas – hospital management and clinical decision
support – led to the major system split in 2015. Supporting both decision types through
the same system was complex, and could not be handled by a single development team.
The system branched out mainly due to internal organisational change as the development
team was split, and a new department was created to handle decision support for hospital
management, while the existing development team focused on clinical decision support. It
was noted at the time that the need for different development teams was not due to
technology, but rather the distinct difference between the two different classes of decisions
supported, with different skills needed in information requirements gathering.

3.3.1.  Key findings for between-application evolution


Between-application evolution was caused by environmental triggers rather than cognitive triggers in this particular case. Among the environmental triggers, internal organisational change and technological change were the two significant drivers.
With regard to supporting both clinical decision-making and hospital management, an
arrangement for catering for both within a single BI system supported by a single develop-
ment team was not successful. There were communication issues between the development
team and the users who required reports for hospital management, and the development
team were overwhelmed trying to manage both application areas. This highlights a key
challenge for BI developers in enterprises. Large scale BI systems supporting disparate user
groups may not be manageable by the same team. The operational focus of clinical decision support versus the financial and strategic focus of hospital management decision support was too distinct for one team to manage. This suggests that
operational business intelligence and strategic business intelligence may require different
skill sets, and as a consequence, different teams to effectively deliver the different kinds of
BI solutions.
Table 3 summarises the evolution of the hospital BI system in terms of the concepts of lineage and aetiology from Arnott's (2004) evolution framework. As is evident from the table, the BI system did not exhibit any between-application evolution caused by cognitive evolutionary triggers. We do not believe that this is necessarily due to the absence of any such evolution occurring, but rather as a result of the fact that participants in the study were
developers rather than system users. Although the participants were able to offer some
commentary on cognitive causal factors, the secondary nature of this evidence means that
some cognitive triggers may not have been reported to the researchers. This highlights the
need for a close working relationship between developers and users to facilitate evolutionary
development. Without such a close relationship, it is possible that changes in the users' understanding of the task, leading to new system requirements, may not be captured.
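To show how the case data line up with the framework, the short sketch below encodes the between-application rows of Table 3 and checks them for cognitive triggers. The coding is ours, the trigger labels come from Table 1, and the empty result mirrors the gap discussed above.

# Between-application evolutionary events from Table 3, coded as (event, triggers).
between_application_events = [
    ("REASON platform", {"industry changes", "internal organisational change"}),
    ("REDCap",          {"technology change", "industry changes"}),
    ("System split",    {"internal organisational change"}),
]

# Cognitive causal factors from Table 1.
cognitive_triggers = {
    "system use", "analyst interaction", "peer interaction",
    "consultant interaction", "training course", "'idle' thought",
}

# Which macro-level events were (at least partly) cognitively triggered? None, per Table 3.
cognitively_triggered = [
    event for event, triggers in between_application_events
    if triggers & cognitive_triggers
]
print(cognitively_triggered)  # [] - no cognitive triggers were reported for macro-level evolution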

3.4.  Evolutionary tempo of the BI system


Evolution of the BI system was subject to varying tempos, as functionalities were added or new applications emerged over time. To understand the project tempo, as part of the semi-structured interviews, three simple graphs based on Arnott (2004) were presented to the development team, as shown in Figure 3. Graph (a) represents continuous evolution, graph (b) punctuated equilibrium, and graph (c) quantum evolution. Participants were asked which of the graphs best represented their overall impression of the evolutionary tempo of the BI system.

Figure 3. Evolutionary tempo, from Arnott (2004).
The majority of the development team members suggested that the system underwent
a combination of punctuated equilibrium and quantum evolution. One of the development
team members said:
There are definitely incremental jumps where you know new things happen where it goes
straight up in time, and then that results in some increased functionality and everybody gets
very engaged and that continues to evolve, and then it becomes very static again and it needs
another jump for something else to occur.
The member suggested that there were several stable periods in the development of the
BI system and radical change such as quantum evolution did not take place often. Stability
came after the effect of an evolution cause had subsided, and before any new evolution was
triggered through other factors. Periods of stability were also witnessed due to some obsta-
cles that slowed down or hindered evolution. Factors such as lack of sufficient resources
often stopped the development team from providing the necessary functionalities that
would lead to evolution. They had to wait until they could acquire funds or allocate technical
work to external contractors.
Based on the understanding of the development team, the evolutionary tempo of the
overall BI system could be categorised as a hybrid between punctuated equilibrium and
quantum evolution, where punctuated equilibrium was more dominant than quantum
evolution.
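One way to picture this hybrid tempo is to track a rough proxy for system scope over time and flag large period-to-period jumps. The yearly figures and the threshold in the sketch below are invented purely to illustrate the punctuated pattern and are not drawn from the case.

# Invented yearly proxy for system scope (e.g. number of reports/tools); illustrative only.
scope_by_year = {2011: 5, 2012: 6, 2013: 18, 2014: 19, 2015: 40}

def label_periods(series, jump_threshold=5):
    """Label each year-to-year change as a 'jump' (punctuated change) or 'stable'."""
    years = sorted(series)
    labels = {}
    for prev, curr in zip(years, years[1:]):
        delta = series[curr] - series[prev]
        labels[curr] = "jump" if delta >= jump_threshold else "stable"
    return labels

print(label_periods(scope_by_year))
# {2012: 'stable', 2013: 'jump', 2014: 'stable', 2015: 'jump'} - long stable stretches
# broken by sudden increases, consistent with punctuated equilibrium.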

3.4.1.  Key findings for evolutionary tempo


The hybrid nature of evolution, where the pace of evolutionary change varied significantly over the life of the project, is of significance to BI practitioners. Periods of stability are not cause for complacency, and these periods should perhaps be used to consolidate resources in preparation for the next wave of evolutionary cycles. This could help to make the pace of change when evolution does occur more consistent, and therefore more manageable. It also means that factors blocking evolutionary cycles, such as a lack of resources, occur less frequently.

4. Conclusion
In this paper, we have presented a case study of BI systems development in a healthcare context. We have taken Arnott's (2004) framework for DSS evolution, shown in Figure 1, and applied it to the case. The motivation for doing so was twofold. First, given the lack of empirical
research investigating the nature of BI evolution, we wanted to test the proposition that
Arnott’s framework was an effective theory for understanding evolutionary phenomena in
a BI-specific context. In regard to this first objective, we believe that we have demonstrated
that the framework can, indeed, be a useful lens for research and analysis of evolution in a
BI context. Understanding the nature of evolutionary development of these critical enterprise
systems should be a key agenda for DSS and BI researchers.
The second motivation was to understand and explain the evolutionary events that
occurred in the case to better understand the development challenges faced by BI
practitioners.
Evolution of the system was observed as a result of both of Arnott’s aetiological types:
cognitive and environmental factors. Cognitive factors came from both system users as well
as the development team as they increased the functionality of the system. Macro-level,
between-application evolution was caused by several environmental triggers. Among the environmental factors, three causes were dominant: industry changes, internal organisational change, and technology change. Overall, internal organisational change and technology change had an impact on both within- and between-application evolution. Cognitive
factors did not have any significant impact on the evolution of applications, but did affect
the functionality of the system.
The case demonstrates the importance of the contribution of users to the evolution of the system. In terms of DSS, the important role of the user was noted at least as early as
Keen’s (1980) framework for adaptive development. However, the case also demonstrates
that the wide range of evolutionary triggers noted by Arnott (2004) applies to BI, and
that developers need to be cognisant of the need to look beyond just user-driven triggers
when anticipating evolutionary cycles.
Given the absence of user input in this study, there was a lack of evidence for cognitive
triggers forcing between-application evolution. This lack of evidence could be due to the
participants in the study consisting of development team members, rather than users of the
system. If true, this suggests a possible lack of awareness on the part of the developers as
to the shift in user understanding that may have taken place, highlighting the challenge for
developers in identifying the need for new adaptive development resulting from cognitive
factors. A close relationship between developers and users is essential. Developers cannot
assume they have an independent understanding of the system requirements.
The case also highlights the fact that periods of stability in a BI system’s life are not nec-
essarily a result of ‘all being well’. Blocking factors such as a lack of resources can mean that
evolutionary pressures are building up while further development is held back. Evolutionary
tempo can vary significantly over time, and developers need to remain vigilant even during
periods of apparent stability.
A key contribution of this work, therefore, is in alerting BI developers to the applicability
of Arnott’s (2004) DSS evolution framework to BI development. In particular, it provides
developers with a tool for better understanding, predicting and managing evolution within
a BI project.
As a final concluding remark, we note that there are few case studies of BI development
in a healthcare setting. While there is a body of literature on healthcare DSS more broadly,
the case highlights the difference between the use of BI for more traditional kinds of organisational decision support for hospital management, and its use for clinical decision support.
This paper therefore is an initial contribution to the body of literature reporting empirical
data on healthcare BI development and use.

Disclosure statement
No potential conflict of interest was reported by the authors.

References
Arnott, D. (2004). Decision support systems evolution: Framework, case study and research agenda.
European Journal of Information Systems, 13, 247–259.
Arnott, D. (2008). Success Factors for Data Warehouse and Business Intelligence Systems. Paper presented
at the 19th Australasian Conference on Information Systems, Christchurch.
Arnott, D., & Pervan, G. (2005). A critical analysis of decision support systems research. Journal of
Information Technology, 20, 67–87.
Arnott, D., & Pervan, G. (2014). A critical analysis of decision support systems research revisited: The
rise of design science. Journal of Information Technology, 29, 269–293.
Bara, A., Botha, I., Diaconita, V., Lungu, I., Velicanu, A., & Velicanu, M. (2009). A model for business
intelligence systems’ development. Informatica Economica, 13, 99–108.
Courbon, J. C., Grajew, J., & Tolovi, J. (1978). Design and implementation of interactive decision support
systems: An evolutive approach. Unpublished Manuscript. Institute d’Administration des Enterprises,
Grenoble, France.
Gangadharan, G. R., & Swami, S. N. (2004). Business Intelligence Systems: Design and Implementation
Strategies. Paper presented at the 26th International Conference on Information Technology
Interfaces.
Gartner. (2015). Flipping to digital leadership: insights from the 2015 Gartner CIO agenda report.
Gartner Executive Programs. Retrieved 19 May, 2016, from http://www.gartner.com/imagesrv/cio/
pdf/cio_agenda_insights2015.pdf
Harris, P. A., Taylor, R., Thielke, R., Payne, J., Gonzalez, N., & Conde, J. G. (2009). Research electronic
data capture (REDCap)—A metadata-driven methodology and workflow process for providing
translational research informatics support. Journal of Biomedical Informatics, 42, 377–381.
Keen, P. G. W. (1980). Adaptive design for decision support systems. SIGMIS Database, 12, 15–25.
McBride, N. (1997). The rise and fall of an executive information system: A case study. Information
Systems Journal, 7, 277–287.
Meador, C. L., & Ness, D. N. (1974). Decision support systems: An approach to corporate planning. Sloan
Management Review, 15, 51–68.
Myers, M. D. (1997). Qualitative research in information systems. MIS Quarterly, 21, 241–242.
Ness, D. N. (1975). Interactive Systems: Theories of Design. Paper presented at the Joint Wharton/ONR
Conference on Interactive Information and DSS, The Wharton School: University of Pennsylvania.
Olszak, C. M., & Ziemba, E. (2007). Approach to building and implementing business intelligence
systems. Interdisciplinary Journal of Information, Knowledge, and Management, 2, 134–148.
Poon, P., & Wagner, C. (2001). Critical success factors revisited: Success and failure cases of information
systems for senior executives. Decision Support Systems, 30, 393–418.
Salmeron, J. L., & Herrero, I. (2005). An AHP-based methodology to rank critical success factors of
executive information systems. Computer Standards and Interfaces, 28, 1–12.
Sammon, D., & Finnegan, P. (2000). The ten commandments of data warehousing. Database for Advances
in Information Systems, 31, 82–91.
Sprague, R. H. J. (1980). A framework for the development of decision support systems. MIS Quarterly,
4, 1–26.
Weinberg, R. S. (1991). Prototyping and the systems development life cycle. Information System
Management, 8, 47–53.
Wixom, B. H., & Watson, H. J. (2001). An empirical investigation of the factors affecting data warehousing success. MIS Quarterly, 25, 17–41.


Yeoh, W., & Koronios, A. (2010). Critical success factors for business intelligence systems. Journal of
Computer Information Systems, 50, 23–32.
