
Enterprise Architecture Implementation Model:
Measurement from Experts and Practitioner Perspectives

Nur Azaliah A. Bakar1, Harihodin S.2 and Nazri Kama3
Advanced Informatics School
Universiti Teknologi Malaysia
Kuala Lumpur, Malaysia
azzaliya@gmail.com1; harihodin@utm.my2; mdnazri@utm.my3

Abstract— Enterprise Architecture (EA) is a vital element for an organisation to ensure the viability of its functions. However, many organisations face problems in EA implementation due to the complexity of EA frameworks, the rigidity of business functions and chaotic IT structures. Various suggestions for better EA implementation made in previous studies have yet to be verified by EA practitioners in real case scenarios. Therefore, this paper aims to measure the influential factors in the EA implementation process from both expert and practitioner perspectives. The EA implementation model was formulated based on 27 factors from six categories (IP-Internal Process, LG-Learning and Growth, AS-Authority Support, CS-Cost, TC-Technology and TM-Talent Management), which were gathered from previous studies and from case studies of Malaysian Public Sector organisations. To measure this model, a survey questionnaire was administered to selected EA experts and practitioners with the intention of identifying any differences between the theoretical and practical aspects of EA implementation. Findings reveal that there is no significant difference in the level of agreement between the EA experts and practitioners except for three factors: IP6-Rules and Process, AS5-Political Influence and CS1-Financial Resources. Hence, it can be concluded that experts and practitioners share the same opinion on the factors that influence the EA implementation process.

Keywords— enterprise architecture; public sector; EA implementation; EA measurement

I. INTRODUCTION

Enterprise Architecture (EA) is a hierarchical approach used to align business and information technology (IT) by integrating the information systems, processes, organisational units and people in an organisation. The aim is to further enhance the various IT systems in the public sector in order to provide better services to citizens and businesses [1-3]. A robust IT system architecture facilitates better communication between government and citizens [4]. EA also translates the organisational vision and mission into operational reality and leverages current technology to improve the public sector service delivery system [5, 6].

However, many organisations face problems in EA implementation due to the complexity of the EA frameworks, the rigidity of business functions and the messy IT infrastructure itself. Although various solutions for better EA implementation have been suggested in previous studies, they have yet to be verified by EA practitioners in real case scenarios. Therefore, this paper aims to measure the influential factors in the EA implementation process from both expert and practitioner perspectives. The paper is organised as follows: the next section discusses issues in EA implementation and the proposed model for EA implementation. This is followed by the research methodology applied in this study. Next is the results and discussion section, where the descriptive and statistical analyses are explained. Finally, the paper ends with the conclusion and proposed future works.

II. ISSUES IN EA IMPLEMENTATION

From the literature, there are three main issues identified in the EA implementation process: firstly, the complexity of existing EA frameworks and methodologies [7]; secondly, the rigidity of organisation business functionality [8]; and thirdly, the disorganised IT structure [9].

A. Complexity of existing EA frameworks and methodologies

Despite the large number of architectural frameworks available, most of them are complicated to understand. Studies report that existing EA frameworks focus on technology and business process solutions but do not address the challenges of EA development, implementation and adoption in the organisation [7, 10]. In addition, organisations have previously tended to develop and implement EA on a large scale, and this has increased the risk of failure [11]. As an alternative, it has been suggested to build the EA incrementally, but the drawback of this approach is that it takes time and discipline to ensure it progresses well [12]. As a consequence, EA initiatives take longer to complete, and may later be halted or, even worse, terminated.

978-1-5090-0751-6/16/$31.00 ©2016 IEEE 1


B. Rigidity of organisation business functionality

Many organisations have difficulties in implementing an effective EA function due to the inflexibility and complexity of their business and IT structures [8]. A study by Roeleven and Broer [13] reveals that more than 66 per cent of EA programs in the Netherlands did not fulfil expectations, and this has resulted in a longer time spent on the EADI process. The Gartner Group predicted that 40 per cent of all EA programs would be terminated owing to failure to demonstrate sufficient value to the business [14]. In the United States, most of the Federal EA programs also produced unsatisfactory results, and some have not produced any results at all [15]. Meanwhile, in the Malaysian scenario, the implementation of EA is still at an infancy level [16-18] and none of the organisations has operationalised their EA.

C. Disorganised IT structures

There is no doubt that chaotic and disorganised IT structures cause problems in the EA implementation process. In the literature, poor architecture means nebulous, incorrect, or ill-defined design requirements [9]. This has caused a lack of alignment between business activities and IT, as well as a rising number of informal or undefined processes for the management or delivery of architectural projects [19]. Hence, this impacts the EA implementation process, because the organisation has to redefine and assess all its business processes, which may require an additional implementation period.

III. PROPOSED EA IMPLEMENTATION MODEL

This study proposes an EA Implementation Model with the aim of identifying the influential factors in the EA implementation process. The underpinning theory of this study is based on the Balanced Scorecard (BSC) by Kaplan and Norton [20], supported by the common processes in EA implementation from various EA frameworks/methodologies.

BSC is a strategic planning and management system that is widely applicable to organisations of any size or type of business. It consists of a set of measures to assess how the organisation is progressing toward meeting its strategic goals. Originally, BSC consisted of four perspectives: financial, customer, internal business process, and learning and growth. This study adopted another BSC measurement, intended for non-profit organisations (such as the public sector), introduced by Kaplan and Norton in 2001 [21], which consists of internal process, learning and growth, authority support and cost.

Findings from the preliminary study, literature review and multiple case studies conducted in the Malaysian Public Sector (MPS) revealed 27 influential factors in the EA implementation process. These factors are grouped into six categories: the initial internal process, learning and growth, authority support and cost categories, plus two new categories, technology and talent management. Details of the categories and factors are described in Fig. 1.

The EA implementation processes were derived from existing EA frameworks/methodologies proposed by both academia and industry. Eight EA frameworks/methodologies were analysed in this study in order to understand the common EA implementation process: Enterprise Architecture Planning (EAP), the Open Group Architecture Framework (TOGAF), Federal Enterprise Architecture (FEA), Extended Enterprise Architecture (E2A), the State of Arizona's Enterprise Architecture, Methodology for AGency ENTerprise Architecture (MAGENTA), the Enterprise Architecture Process Model (EAPM) and the Malaysian Public Sector 1Government Enterprise Architecture (1GovEA). These frameworks/methodologies were selected because they have a clear EA implementation process that can be analysed in this research context. However, the Zachman framework was not selected for the review because it is a descriptive enterprise architecture and does not provide a process for creating its artifacts [22, 23].

From the EA implementation process analysis, in general all these frameworks/methodologies share a common flow in implementing EA. The process starts from 1) initiate, 2) assess and plan, 3) analyse and define, 4) design and develop, 5) implement and operate, and 6) maintain and review. It is an iterative cycle whereby, if there is any new or later update to the EA, all the processes start again at step 2. Fig. 1 depicts the proposed research framework of this study.

Fig. 1. Proposed EA Implementation Model (the six factor categories — Internal Process, Learning and Growth, Authority Support, Cost, Technology and Talent Management — surrounding the six-step EA implementation process cycle: 1-Initiate, 2-Assess & Plan, 3-Analyse & Define, 4-Design & Develop, 5-Implement & Operate, 6-Maintain & Review)

IV. RESEARCH METHODOLOGY

A questionnaire-based survey was conducted to measure the weightage of each factor in the proposed EA Implementation Model. The rationale for choosing this instrument is to get a broad view of coordination in the field, covering a larger number of respondents consisting of both experts and practitioners. The respondents identified for this study were EA practitioners and EA experts worldwide.
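The common six-step cycle distilled in Section III can be expressed as a tiny state machine. The sketch below is purely illustrative: the phase labels follow the text, but the function and its names are ours, not part of any cited framework.

```python
# The six common phases identified across the analysed EA frameworks
EA_PHASES = [
    "1-Initiate",
    "2-Assess & Plan",
    "3-Analyse & Define",
    "4-Design & Develop",
    "5-Implement & Operate",
    "6-Maintain & Review",
]

def next_phase(current, ea_updated=False):
    """Return the phase that follows `current` in the iterative cycle."""
    if ea_updated:
        return EA_PHASES[1]        # any new or updated EA restarts at step 2
    i = EA_PHASES.index(current)
    if i == len(EA_PHASES) - 1:    # after "Maintain & Review"...
        return EA_PHASES[1]        # ...the cycle iterates from step 2, not step 1
    return EA_PHASES[i + 1]
```

The design point this captures is that the loop never returns to "1-Initiate": once an EA exists, every change re-enters the cycle at the assessment step.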



A. Sampling Population and Frame

According to Saunders [24], a sampling frame can be obtained from an existing list of potential samples; however, if no suitable list exists, researchers have to compile their own sampling frame with consideration for its validity and reliability. For this study, since there is no official body in charge of the EA domain, it is impossible to determine the finite size of the EA experts and practitioners population. Hence, the researcher compiled her own list of worldwide EA experts and practitioners. The list consists of authors of all published peer-reviewed articles in the EAI area, authors of industry reviews in EA implementation, and those who practice EA implementation. The selected respondents must comply with the stated criteria, which are:

1. The respondent must be an EA consultant/service provider OR EA academic/researcher OR EA practitioner; AND
2. The respondent must have more than five years' experience, knowledge and expertise in the EA implementation area.

As a result, this study was able to identify 273 potential respondents who meet the defined criteria. According to Creswell [25], where possible it is best to select as large a sample as possible from the population, because the larger the sample, the smaller the potential sampling error. Hence, this study includes the entire identified population as its sampling frame.

B. Questionnaire

A questionnaire was designed to determine the demographic status and the level of each influential factor for EA implementation. The questionnaire starts with the Research Information, Research Definition and Questionnaire Explanation sections. This is followed by Section A, which consists of nine questions on the respondent's profile, such as role, years of working experience, knowledge level in EA, EA training attended, EA certification obtained, years of EA experience, years of organisation involvement in EA, and the EA implementation status in the organisation. Section B of the questionnaire asks for the respondent's opinion on the EAI Assessment Criteria, whereby there are 27 factors categorised into six groups (IP-Internal Process, LG-Learning and Growth, AS-Authority Support, CS-Cost, TC-Technology and TM-Talent Management). The survey items were measured using a five-point Likert scale, ranging from 1 (strongly disagree) to 5 (strongly agree).

The reliability of the instrument was analysed using the Cronbach's alpha method, with reliability coefficients for IP-Internal Process (α=0.736), LG-Learning and Growth (α=0.642), AS-Authority Support (α=0.753), CS-Cost (α=0.674), TC-Technology (α=0.844) and TM-Talent Management (α=0.952). According to Hair et al. [26], a reliability coefficient above 0.60 indicates that the instrument has achieved acceptable reliability.

C. Data Collection and Analysis

This study applied self-administered questionnaires to collect the data. Self-administered questionnaires refer to a data collection technique in which the respondents read the survey questions and record their responses without the presence of a trained interviewer. The evidence also suggests that people are more likely to give honest answers to self-administered questions than to interview questions [27]. Manual and online questionnaires were used as the survey media. For the manual questionnaires, the researcher travelled to the respondents' location and met a representative, who delivered the survey questionnaires to the respondents, collected the completed questionnaires and returned them to the researcher [27, 28]. This method was applied for respondents residing in the Putrajaya and Klang Valley regions. Those located outside Putrajaya, Klang Valley or Malaysia were given the online questionnaire, which was made available on the web for two months (October and November 2015).

Once the questionnaires were gathered, the data analysis process began. It started with preparing the data for analysis, where the data were filtered, coded and categorised accordingly. For this study, the codes and categories were readily prepared based on the proposed model. The next step was the data screening process, whereby missing data and outliers were identified and treated. Statistical analysis was then performed using the Statistical Package for Social Sciences for Windows (SPSS, v21.0). The Cronbach's alpha test was first applied to evaluate the reliability of the Likert-type scale questions, with the aim of ensuring that the questions under each variable all measure the same underlying attribute. Descriptive statistics (frequencies) were used to describe the respondents' demography and the mean scores for the 27 factors defined in six categories (IP-Internal Process, LG-Learning and Growth, AS-Authority Support, CS-Cost, TC-Technology and TM-Talent Management). Then, the Mann-Whitney test was executed to test whether there were any significant differences in factor mean values between the two groups of EA experts and EA practitioners.

V. RESULTS AND DISCUSSION

This section explains the results of the questionnaire, consisting of descriptive analysis and statistical analysis.

A. Descriptive survey results

A total of 177 questionnaires were returned and coded. Of the total responses, 40.26% are experts while 59.74% are practitioners. Almost all of the respondents answering this questionnaire are from the professional executive level and above. The majority are IT Executives (49.35%), followed by IT Managers (18.18%) and IT Consultants (10.39%). The rest of the list consists of various IT roles, which indicates that this survey successfully reached the targeted respondents. Table I shows the details of the respondents' roles in their organisations.
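The reliability step described above (Cronbach's alpha over the Likert items of each category) can be sketched outside SPSS. The matrix below is hypothetical data for a three-item category, not the study's responses:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (respondents x items) matrix of Likert scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)      # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of respondents' totals
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical answers of four respondents to one three-item category;
# by the Hair et al. threshold, alpha above 0.60 counts as acceptable.
scores = [[4, 5, 4], [3, 3, 3], [5, 5, 4], [2, 3, 2]]
alpha = cronbach_alpha(scores)
```

Perfectly correlated items give α = 1, while uncorrelated items drive α toward 0, which is why a low coefficient signals that a category's questions are not measuring one underlying attribute.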



TABLE I. DISTRIBUTION OF RESPONDENTS BASED ON ROLE IN ORGANISATION

Role in organisation          Number of respondents (N=177)   Percentage (%)
IT Executive                  87                              49.35%
IT Manager                    32                              18.18%
IT Consultant                 18                              10.39%
Chief Information Officer     11                              6.49%
IT Programmer                 5                               2.60%
IT Solution Provider/Vendor   5                               2.60%
Professor                     5                               2.60%
Analyst                       2                               1.30%
Chief Architect               2                               1.30%
Enterprise Architect          2                               1.30%
ERP Consultant                2                               1.30%
IT Director                   2                               1.30%
IT Researcher                 2                               1.30%

The respondents were also asked about their years of working experience; Fig. 2 depicts the details. As shown, the years of working experience were grouped into five main groups: more than 20 years (14%), 16 to 20 years (17%), 11 to 15 years (28%), 5 to 10 years (16%) and less than 5 years (25%). The finding shows that the distribution across the groups is balanced, which helps ensure the survey conducted is reliable and thorough.

Fig. 2. Distribution of respondents based on years of working experience

Meanwhile, Fig. 3 shows the distribution of EA experts by region. As shown, Europe (58%) dominates the number of EA experts, followed by Asia (19%) and North America (9%).

Fig. 3. Distribution of EA experts by world regions (Europe 58%, Asia 19%, North America 9%, Africa 6%, South America 5%, Australasia 3%)

On analysing EA knowledge among the respondents, the results show that of the EA experts, 45.16% rated their knowledge excellent, 38.71% good and 16.13% average. Meanwhile, of the EA practitioners, 10.87% rated their knowledge good, 43.48% average, 43.48% fair and 2.17% poor. Fig. 4 shows the distribution of respondents based on level of EA knowledge.

Fig. 4. Distribution of respondents based on level of EA knowledge

B. Analysis on influential factors in EA Implementation

Besides general information on the participants and their organisations (such as experience with EA, personal role, and organisation status in EA implementation), the questionnaire contained statements covering the factors influencing EA implementation in the organisation. For each statement, answers were given on a 5-point Likert scale ranging from Strongly Disagree (1) to Strongly Agree (5). Hence, this section provides the analysis results for the factors mentioned. This analysis has two objectives:

1. To investigate whether the proposed set of EA implementation influential factors is agreed by both EA experts and EA practitioners; and
2. To analyse whether there is any significant difference in the EA implementation influential factors agreed by EA experts and EA practitioners.

1) Measurement of overall EA implementation influential factors

To achieve the first objective, the mean value of each defined criterion was computed. Findings show that for all factors, the mean score is between 3.805 and 4.532. This indicates that the level of agreement ranges from Neutral (3) and Agree (4) to Strongly Agree (5). When these factors were arranged in order of magnitude, Strategic Planning scored the highest mean value with 4.532, followed by Business Driven Approach at 4.429, Rules and Process (4.403), Implementation Roadmap (4.286) and Governance (4.247). Surprisingly, all of these factors are from the Internal Process (IP) category. This indicates that it is vital to clearly define all internal processes and disseminate them across the organisation, because they have a strong influence on the success of EA implementation.

The next most important factor is Non-Financial Resources (4.195), ranked at number 6, which refers to the number of enterprise architects, EA consultations and expertise sharing.
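The expert-versus-practitioner comparison in objective 2 rests on the Mann-Whitney U test. A minimal NumPy sketch of the U statistic and its normal-approximation z-score is below; the Likert data are made up, and sigma omits the tie correction, so the z values will not exactly match SPSS's output:

```python
import numpy as np

def mann_whitney(group1, group2):
    """U statistic and approximate z for two independent samples.

    Ties share their average rank; sigma omits the tie correction,
    so z is only approximate for heavily tied Likert responses.
    """
    x, y = np.asarray(group1, float), np.asarray(group2, float)
    n1, n2 = len(x), len(y)
    pooled = np.concatenate([x, y])
    order = np.argsort(pooled)
    ranks = np.empty(len(pooled))
    ranks[order] = np.arange(1, len(pooled) + 1)
    for v in np.unique(pooled):            # average the ranks of tied values
        tie = pooled == v
        ranks[tie] = ranks[tie].mean()
    u1 = ranks[:n1].sum() - n1 * (n1 + 1) / 2   # U for the first group
    mu = n1 * n2 / 2
    sigma = np.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    return u1, (u1 - mu) / sigma

# Hypothetical 5-point Likert answers for one factor (not the study's data)
u, z = mann_whitney([5, 4, 5, 4, 4], [4, 3, 4, 3, 4])
```

Under the normal approximation, |z| > 1.96 would reject the null hypothesis of equal group distributions at the 0.05 level.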



However, the Financial Resources factor (4.117) only ranked at number 10. This points out that a large amount of money invested in EA does not ensure its success; what matters most is that the organisation is equipped with all kinds of resources.

The least influential factors in the EA implementation process are the EA Repository factor from the Technology (TC) category (3.805), the Central Funding factor from the Cost (CS) category (3.831) and EA Technology Support from Technology (TC) (3.857). This shows that EA technology does not have a huge influence on the EA implementation process; hence, an organisation can apply any EA technology it prefers and that suits its needs. According to the survey, having a centralised EA team also does not contribute much to ensuring the success of EA implementation. This is because every EA is unique and tailored to the organisation's business function; therefore, every organisation ideally should have its own EA team. Table II shows the details of the overall EA implementation influential factors mean scores.

TABLE II. OVERALL EA IMPLEMENTATION INFLUENTIAL FACTORS MEAN SCORES

Category                Factors                     Code  Mean ± SD       Rank
[IP] INTERNAL PROCESS   Business Driven Approach    IP1   4.429 ± 0.6772  2
                        Strategic Planning          IP2   4.532 ± 0.6195  1
                        Implementation Roadmap      IP3   4.286 ± 0.8407  4
                        Governance                  IP4   4.247 ± 0.8136  5
                        Organisation Value          IP5   4.091 ± 0.8612  12
                        Rules and Process           IP6   4.403 ± 0.7119  3
[LG] LEARNING AND       Assessment                  LG1   4.052 ± 0.7591  15
GROWTH                  Documentation               LG2   4.078 ± 0.8548  13
                        Learning Culture            LG3   4.143 ± 0.7731  8
                        Skill of Architect          LG4   3.922 ± 0.9565  21
                        Training and certification  LG5   3.922 ± 0.8998  22
                        Community of Practice       LG6   3.935 ± 0.8166  20
[AS] AUTHORITY SUPPORT  Stakeholder Support         AS1   4.169 ± 0.7678  7
                        Stakeholder Benefit         AS2   4.143 ± 0.8226  9
                        EA Recognition              AS3   4.104 ± 0.8043  11
                        Mandate                     AS4   4.065 ± 0.8935  14
                        Political Influence         AS5   4.013 ± 0.7694  17
                        Stakeholder Understanding   AS6   4.000 ± 0.9032  18
[CS] COST               Financial Resources         CS1   4.117 ± 0.9028  10
                        Non-financial Resources     CS2   4.195 ± 0.7786  6
                        Central Funding             CS3   3.831 ± 0.9234  26
[TC] TECHNOLOGY         Practical EA Technology     TC1   3.922 ± 0.9426  23
                        EA Technology Support       TC2   3.857 ± 1.0094  25
                        EA Repository               TC3   3.805 ± 1.0887  27
[TM] TALENT             Talent Management Plan      TM1   4.026 ± 0.9028  16
MANAGEMENT              Centralised Enterprise      TM2   3.870 ± 0.9508  24
                        Architect team
                        Retention Program           TM3   3.974 ± 0.8732  19

2) Measurement of the significant difference between experts and practitioners

For the second objective, the mean analyses for the two different groups (EA experts and practitioners) were computed. The purpose of the comparison between EA experts and EA practitioners is to investigate whether the factors proposed in academia are the same as what happens in real EA implementation. At the same time, this study aims to validate the claims made by the EA practitioners on the EA implementation scenario and relate them to the existing EA book of knowledge (EABOK) regarding EA implementation. In order to conduct the tests, the following hypotheses were set up.

To test for a significant difference between the EA expert and EA practitioner means:
H0: μ1 = μ2, i.e. there is no significant difference between the two means
H1: μ1 ≠ μ2, i.e. there is a significant difference between the two means

Prior to the statistical test, a normality test was performed to identify the characteristics of the data. The results show that the data are not normally distributed; therefore, a non-parametric two-sample test, the Mann-Whitney test, was selected. Table III shows the test results on the difference of means between the EA experts and EA practitioners groups. Using a significance level of 0.05, there were no significant differences between the groups of EA experts and EA practitioners in the EA implementation influential factors mean scores, except for three factors: IP6-Rules & Process, AS5-Political Influence and CS1-Financial Resources. The difference in results may be due to the different scenarios and experiences faced by the respondents from these two groups. Nevertheless, overall it can be concluded that the remaining 24 EADI assessment criteria proposed are agreed by both EA experts and EA practitioners.

TABLE III. RESULT OF EA IMPLEMENTATION INFLUENTIAL FACTORS MEAN SCORES BETWEEN EA EXPERTS AND EA PRACTITIONERS

CODE  Mann-      Wilcoxon W  Z       Asymp. Sig.  Exact Sig.   Result
      Whitney U                      (2-tailed)   (2-tailed)
IP1   693.500    1774.500    -.226   .821         .841         Not Significant
IP2   680.000    1761.000    -.396   .692         .749         Not Significant
IP3   616.000    1112.000    -1.100  .271         .272         Not Significant
IP4   666.000    1747.000    -.528   .597         .611         Not Significant
IP5   711.000    1792.000    -.023   .982         .993         Not Significant
IP6   494.500    990.500     -2.426  .015         .015         Significant
LG1   641.500    1722.500    -.810   .418         .442         Not Significant
LG2   659.500    1155.500    -.605   .545         .538         Not Significant
LG3   702.000    1783.000    -.124   .901         .902         Not Significant
LG4   565.000    1646.000    -1.612  .107         .111         Not Significant
LG5   690.500    1186.500    -.250   .803         .817         Not Significant
LG6   591.500    1672.500    -1.342  .180         .188         Not Significant
AS1   634.000    1130.000    -.884   .377         .385         Not Significant
AS2   643.000    1139.000    -.782   .434         .444         Not Significant
AS3   663.500    1159.500    -.551   .582         .601         Not Significant
AS4   586.500    1082.500    -1.410  .159         .163         Not Significant
AS5   522.500    1603.500    -2.145  .032         .032         Significant
AS6   662.000    1743.000    -.570   .569         .575         Not Significant
CS1   500.500    1581.500    -2.351  .019         .019         Significant
CS2   701.000    1782.000    -.132   .895         .900         Not Significant
CS3   618.000    1699.000    -1.089  .276         .287         Not Significant
TC1   658.000    1739.000    -.602   .547         .556         Not Significant
TC2   629.000    1125.000    -.913   .361         .362         Not Significant
TC3   631.500    1127.500    -.902   .367         .380         Not Significant
TM1   539.500    1620.500    -1.917  .055         .057         Not Significant
TM2   683.000    1179.000    -.330   .742         .764         Not Significant
TM3   603.500    1684.500    -1.232  .218         .226         Not Significant

VI. CONCLUSION AND FUTURE WORKS

The main contribution of this paper is in measuring the factors that influence the implementation of EA in an
organisation. Findings reveal that all proposed factors from the preliminary study, literature and case studies are consistent and agreed by all respondents. Overall, there are 27 factors identified from six categories: internal process, learning and growth, authority support, cost, technology and talent management. Results also indicate that there is no significant difference in the factors' level of agreement between the EA experts and practitioners except for three factors: IP6-Rules & Process, AS5-Political Influence and CS1-Financial Resources. The differences may be due to the different perspectives and scenarios in which the EA implementation process took place.

In the future, we aim to develop an assessment mechanism to evaluate the EA implementation process based on the factors identified. However, we are aware of the limitations of this study; in particular, the data mostly originated from the MPS context only. Hence, more work is needed to validate the proposed factors across different contexts. As such, we hope that future researchers will improve and evolve these factors, so that we benefit from the cumulative nature of academic knowledge.

ACKNOWLEDGMENT

The research is financially supported by Research University Grant Q.K130000.2538.11H23, Universiti Teknologi Malaysia (UTM).

REFERENCES

[1] S. Al-Nasrawi and M. Ibrahim, "An enterprise architecture mapping approach for realizing e-government," in Third International Conference on Communications and Information Technology (ICCIT), 2013, pp. 17-21.
[2] A. Ojo, T. Janowski, and E. Estevez, "Improving Government Enterprise Architecture Practice--Maturity Factor Analysis," in 45th Hawaii International Conference on System Science (HICSS), 2012, pp. 4260-4269.
[3] I. Shaanika and T. Iyamu, "Deployment of Enterprise Architecture in The Namibian Government: The Use of Activity Theory to Examine the Influencing Factors," The Electronic Journal of Information Systems in Developing Countries, vol. 71, 2015.
[4] A. Kaushik and A. Raman, "The new data-driven enterprise architecture for e-healthcare: Lessons from the Indian public sector," Government Information Quarterly, vol. 32, pp. 63-74, 2015.
[5] M.-M. Saarelainen and V. Hotti, "Does Enterprise Architecture Form the Ground for Group Decisions in eGovernment Programme? Qualitative Study of the Finnish National Project for IT in Social Services," in 15th IEEE International Enterprise Distributed Object Computing Conference Workshops (EDOCW), 2011, pp. 11-17.
[6] V. Weerakkody, M. Janssen, and K. Hjort-Madsen, "Integration and enterprise architecture challenges in e-government: a European perspective," International Journal of Cases on Electronic Commerce (IJCEC), vol. 3, pp. 13-35, 2007.
[7] L. A. Kappelman and J. A. Zachman, "The enterprise and its architecture: ontology & challenges," Journal of Computer Information Systems, vol. 53, pp. 87-95, 2013.
[8] B. Van der Raadt, M. Bonnet, S. Schouten, and H. Van Vliet, "The relation between EA effectiveness and stakeholder satisfaction," Journal of Systems and Software, vol. 83, pp. 1954-1969, 2010.
[9] D. M. Mezzanotte Sr and J. Dehlinger, "Enterprise Architecture and Information Technology: Coping with Organizational Transformation Applying the Theory of Structuration," in Proceedings of the International Conference on Software Engineering Research and Practice (SERP), 2015, p. 231.
[10] A. Mitra, L. Zhurakovskaya, and A. Gupta, "The Enterprise Transformation Architecture (ETA)," BPTrends, 2015. Available: http://www.bptrends.com/
[11] J. W. Ross, P. Weill, and D. C. Robertson, Enterprise Architecture as Strategy: Creating a Foundation for Business Execution. Harvard Business Press, 2006.
[12] P. Andersen, A. Carugati, and M. G. Sorensen, "Exploring Enterprise Architecture Evaluation Practices: The Case of a Large University," in 48th Hawaii International Conference on System Sciences (HICSS), 2015, pp. 4089-4098.
[13] S. Roeleven and J. Broer, "Why Two Thirds of Enterprise Architecture Projects Fail," ARIS Expert Paper, 2009.
[14] Gartner, "Gartner predicts that by 2012 40% of today's enterprise architecture programs will be stopped (0.8 probability)," Gartner Enterprise Architecture Summit, 2009.
[15] S. B. Gaver, "Why Doesn't the Federal Enterprise Architecture Work?," Technology Matters, Inc., 2010.
[16] Z. Md Dahalin, R. Abd Razak, H. Ibrahim, N. I. Yusop, and M. K. Hashim, "Enterprise architecture adoption and implementation in Malaysia," 2011.
[17] R. Abd Razak, "An exploratory study of enterprise architecture practices in Malaysia," Communications of the IBIMA, vol. 3, pp. 133-137, 2008.
[18] R. Abd Razak, Z. Md. Dahalin, R. Dahari, S. S. Kamaruddin, and S. Abdullah, "Enterprise Information Architecture (EIA): Assessment of Current Practices in Malaysian Organizations," in 40th Annual Hawaii International Conference on System Sciences, 2007, pp. 219a-219a.
[19] A. MacCormack, R. Lagerstrom, and C. Y. Baldwin, "A Methodology for Operationalizing Enterprise Architecture and Evaluating Enterprise IT Flexibility," Harvard Business School Finance Working Paper, no. 15-060, 2015.
[20] R. S. Kaplan and D. P. Norton, "The balanced scorecard: Measures that drive performance," Harvard Business Review, vol. 70, no. 1, pp. 71-79, 1992.
[21] R. S. Kaplan and D. P. Norton, "Transforming the balanced scorecard from performance measurement to strategic management: Part I," Accounting Horizons, vol. 15, pp. 87-104, 2001.
[22] T. Ylimäki and V. Halttunen, "Method engineering in practice: A case of applying the Zachman framework in the context of small enterprise architecture oriented projects," Information Knowledge Systems Management, vol. 5, pp. 189-209, 2005.
[23] O. O. Oduntan and N. Park, "Enterprise viability model: Extending enterprise architecture frameworks for modeling and analyzing viability under turbulence," Journal of Enterprise Transformation, vol. 2, pp. 1-25, 2012.
[24] M. N. Saunders, M. Saunders, P. Lewis, and A. Thornhill, Research Methods for Business Students, 5th ed. Pearson Education India, 2011.
[25] J. W. Creswell, Research Design: Qualitative, Quantitative, and Mixed Methods Approaches. Sage Publications, 2013.
[26] J. Hair, W. Black, B. Babin, and R. Anderson, Multivariate Data Analysis, 7th ed. Prentice Hall, 2010.
[27] D. A. Dillman, J. D. Smyth, and L. M. Christian, Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method. John Wiley & Sons, 2014.
[28] W. Zikmund, B. Babin, J. Carr, and M. Griffin, Business Research Methods, 9th ed. Cengage Learning, 2012.

