You are on page 1of 32

Measuring the Performance of Information Systems: A Functional Scorecard

JERRY CHA-JAN CHANG AND WILLIAM R. KING
is an Assistant Professor in the Department of MIS in the College of Business, University of Nevada, Las Vegas. He has a B.S. in Oceanography from National Ocean University, Taiwan, an M.S. in Computer Science from Central Michigan University, an MBA from Texas A&M University, and an M.S. in MoIS and a Ph.D. in MIS from the University of Pittsburgh. His research interest includes performance measurement, IS strategy, management of IS, group support systems, human-computer interaction, organizational learning, and strategic planning. His work has appeared in Information & Management, Decision Support Systems, DATABASE, Communications of the ACM, and Journal of Computer Information Systems, and several major IS conference proceedings.
JERRY CHA-JAN CHANG

holds the title University Professor in the Katz Graduate School of Business at the University of Pittsburgh. He has published more than 300 papers and 15 books in the areas of Informafion Systems, Management Science, and Strategic Planning. He has served as Founding President of the Association for Informafion Systems (AIS), President of TIMS (now INFORMS), and Editor-in-Chief of MIS Quarterly. He was instrumental in the creation of INFORMS and of tbe Information Systems Research pumdX. He recently received the Leo Lifetime Excepfional Achievement Award by AIS.
WILLIAM R. KING

This study develops an instrument that may be used as an information systems (IS) functional scorecard (ISFS). It is based on a theoretical input-output model of the IS funcfion's role in supporting business process effecfiveness and organizafional performance. Tbe research model consists of three system output dimensions—systems perfonnance, informafion effectiveness, and service performance. The "updated paradigm" for instrument development was followed to develop and validate the ISFS instrument. Construct validafion of the instrument was conducted using responses from 346 systems users in 149 organizafions by a combinafion of exploratory factor analysis and structural equation modeling using LISREL. The process resulted in an instrument that measures 18 unidimensional factors within the three ISFS dimensions. Moreover, a sample of 120 matched-paired responses of separate CIO and user responses was used for nomological validation. The results showed that the ISFS measure reflected by the instrument was positively related to improvements in business processes effectiveness and organizafional performance. Consequently, the instrument may be used for assessing IS performance, for guiding information technology investment and sourcing decisions, and as a basis for further research and instrument development.
ABSTRACT: KEY WORDS AND PHRASES: functional scorecard, information systems performance measurement, instrument development, structural equation modeling. Journal of Management Information 5}'.r(em.s/Summer 2005, Vol. 22, No. 1, pp. 85-115. © 2005 M.E. Sharpe. Inc. 0742-1222 / 2005 $9.50 + 0.00.

86

JERRY CHA-JAN CHANG AND WILLIAM R. KING

(IS) function's performance has long been an important issue to IS executives. This interest is evident from the prominence of this issue in the various IS "issue" studies [12, 13, 34,49,72] as well as the popularity of annual publications such as ComputerWorld Premier 100 and InformationWeek 500, which involve the use of surrogate metrics to assess overall IS functional performance (ISFP). Executives routinely seek evidence of returns on information technology (IT) investments and soureing decisions—both types of choices that have become more substantial and a competitive necessity. As the unit that has major responsibilities for these decisions, the IS function is usually believed to be an integral part of achieving organizational success. Yet the overall performance of the IS function has proved to be difficult to conceptualize and to measure. As the outsourcing of IS subfunctional areas such as data centers and "help desks" has grown into the outsourcing of the entire IS function, there is an ever-growing need for formal performance assessment [61]. This will permit the establishment of baseline measures to use in judging outsourcing success. So, the issue of an overall IS functional metric, which is, and has been, high on IS executives' priorities, is becoming even more important. Although there has been a good deal of research on IS efficiency, effectiveness, and success at various levels of analysis, overall functional-level performance is one of the least discussed and studied. According to Seddon et al. [88], only 24 out of 186 studies between 1988 and 1996 can be classified as focusing on the IS functional level. Nelson and Cooprider's [70] work epitomizes this need. Moreover, while there exist metrics and instruments to assess specific IS subfunctions and specific IS subareas, such as data center performance, productivity and data quality, typically these measures cannot be aggregated in any meaningful way. This limits their usefulness as the bases for identifying the sources of overall performance improvements or degradations. As an anonymous reviewer of an earlier version of this paper said, "The critical issue is that the performance of the IS function is now under the microscope and decisions to insource/outsource and spend/not spend must be made in a structured context." The objective of this research is to develop such an instrument—a "scorecard"— for evaluating overall ISFP.
ASSESSING THE INFORMATION SYSTEM

The Theoretical Bases for the Study
THE DEFINITION OF THE

"IS FUNCTION" that is used here includes "all IS groups and departments within the organization" [84]. This definition is broad enough to include various structures for the IS function, from centralized to distributed, yet specific enough to include only the formal IS function that can be readily identified. Figure 1 shows the modified input-output (I/O) model that is the theoretical basis for the study. The model in Figure 1 has been utilized as a basis for other IS research studies [63,113]. It incorporates a simple input-output structure wherein the IS func-

MEASURING THE PERFORMANCE OF INFORMATION SYSTEMS

87

IS Functional Performance Resources - hardware - software - human resources - integrated managerial and technical capabilities IS Function Outputs IS Function - systems - information - services Business Process EfTectiveness \ Organizationai Performance

Figure 1. Theoretical Input-Output Performance Model

tion uses resources to produce IS performance, which in turn influences both business process effectiveness and organizational performance. The resources utilized by the IS function are shown in Figure 1 to be hardware, software, human resources, and integrated managerial and technical capabilities [14, 15, 36], The IS function is shown to produce systems, information, and services [56, 92], which collectively affect the organization in a fashion that is termed IS functional performance (ISFP), which is to be assessed through an ISfunctional scorecard (ISFS), the development of which is the objective of this study. In the theoretical model, IS outputs are also shown as significant enablers and drivers of business process effectiveness, since IS are often the basis for business process operations and redesign [94, 113], ISFP also is shown to infiuence business process effectiveness, and both influence overall organizational performance [113], Although it is not the primary purpose of this study to directly address the business process effectiveness and organizational performance elements of Figure 1, data were collected on these elements of the model for purposes of nomological validation of the "scorecard" that is being developed. The model of Figure 1 is based on streams of research in IS capabilities, IS effectiveness/success, IS service quality, IS functional evaluation, and IS subfunctional assessment.

IS Capabilities
IS capabilities are integrated sets of hardware, software, human skills, and management processes that serve to translate financial investments in IS into IS performance [17, 23,42, 83,99, 111, 113], For instance, an IS strategic planning capability might consist of well-trained planners, computer-based planning models, knowledgeable

88 JERRY CHA-JAN CHANG AND WILLIAM R. IS Service Quality Recognizing the importance of the services provided by tbe IS function. They concluded that IS success should be a multidimensional measure and recommended additional research to validate tbe model. the SERVQUAL measure. KING technical people. and such measures have been developed at a variety of levels using a number of different perspectives.104]. However. 46. as depicted in Figure 1. adequate planning budgets. originally developed in marketing [74]. However. Other researchers have since tested and expanded their model [7. learning outcomes (e. the controversy over SERVQUAL in marketing [27] has carried over into IS [52. 79].. Wells [112] studied existing and recommended performance measures for the IS function and identified six important goals/issues. They proposed an IS function performance evaluation model to help organizations select and prioritize IS performance dimensions and to determine assessments for each dimension. IS Functional Evaluation Only a few studies directly address the comprehensive evaluation of tbe performance of the IS function. it is directly applicable only to one of the three major outputs of the IS function.. [96]) and other contexts using economic approaches (e.. Both studies focused on top management's perspective of ISFP and did not offer any specific measures. They concluded that their original model was valid and suggested that "service quality" be incorporated as an important dimension of IS success. suggesting that more research needs to be conducted to measure IS service quality. [105]). IS Subfunctional Assessment Measuring IS subfunctional performance has been important to IS practitioners and academics. and a well-formulated and specified planning process.g.g. a . has been adapted to measure IS service quality [75. service (e. [52]). measurements have been made of the effects of IS on users (e. Saunders and Jones [84] developed and validated 11 IS function performance dimensions through a three-round Delphi study. IS Effectiveness/Success DeLone and McLean [30] categorized over 100 IS "dependent variables" into six categories and developed an IS success model to describe the relationships between the categories.g..g. Proponents of this measure sometimes advocate its use as proxy for ISFP. [1]). [16]).110].. DeLone and McLean [31] have updated the model based on a review of research stemming from their original work. No one has developed a validated metric.g. e-business (e. For instance.

g. users represent the largest group. there has been no paucity of interest in IS assessment. Thus. Wbat type of data are being used for judgments of effectiveness? 7. However. 88]. [10]). financial perspecfive (e. From whose perspective is effectiveness being assessed? 2. it was designed according to guidelines from the organizational effectiveness literature. On wbat domain of activity is the assessment focused? 3. Tbe IS function [84]. These guidelines were developed in response to problems plaguing organizational effecfiveness research as described by Steers [95]. Although there are many other stakeholders for the IS function. a business process viewpoint (e. or in the development of measures. or "Gestalt" [41].. a social science perspective (e. The ]VIethodological Basis for the Study To ENSURE THE APPROPRIATENESS OF THE STUDY at the IS functional level. These guidelines have also been adopted by IS researchers to clarify conceptual developments in examining IS funcfional effecfiveness [69.g. So. Products and services provided by tbe IS function. tbe aggregated evaluation of individual users' assessments forms a quite comprebensive picture of tbe ISFP. and their efficacy in ufilizing IS products and services directly affects the organizafion's bottom line. [98]). Identify strengths and weaknesses. Wbat level of analysis is being used? 4. ranging from quarterly to annually.. track overall effectiveness. an "IT value" approaeb [24]. Periodically. and probably others. Cameron and Whetton [19] developed seven basic guidelines that are listed in the lefthand column of Table I. Tberefore. Organizafional users of IS services and systems are the primary stakeholder for the IS function [92]. Past performance measures. there is a great need for a comprehensive measure of IS performance that will provide a configural. [80]). Subjective. Implementafion of Cameron and Whetton's [19] Guidelines Guidelines 1. Cameron [18] later demonstrated the usefulness of these guidelines in a study of 29 organizations. view of an organizafion's formal IS acfivities and facilitate decision making and functional improvement. the ISFS developed here is defined as organizational IS users' percepfion of the performance for all of the aspects of the IS function that they have personally experienced. The implementafions of Cameron and Whetton's [19] guidelines for this study are shown in the dghthand column of Table 1.MEASURING THE PERFORMANCE OF INFORMATION SYSTEMS 89 Table 1.g. perceptual data from individual. What is tbe purpose for judging effectiveness? 5. Wbat time frame is being employed? 6. What is tbe referent against which effectiveness is judged? Implementations Organizational users of IS services and systems.. .

and so on. The Domain and Operationalization of the IS Performance Construct USERS' PERCEPTION OF IS ACTIVITIES derive from their use of the IS "products" and the services provided by the IS function. A cross-section mail survey is appropriate to obtain a large sample for analysis and to ensure the generalizability of the resulting instrument. • Service performance: Assesses the user's experience with services provided by the IS function in terms of quality and flexibility [38].90 JERRY CHA-JAN CHANG AND WILLIAM R. In order to develop a measurement instrument with good psychometric properties. A model of the ISFS construct. However. because it is designed to assess people's perceptions of the overall IS function rather than to capture users' attitudes toward a specific system. • Systems performance: Assesses the quality aspects of systems such as reliability. not of IS departments" [87. is presented in Figure 2. The definitions of the three basic output-related dimensions are given below. KING Despite its focus on users. Domain of the ISFS Construct The domain of ISFP used in this study reflects the theory of Figure 1 and the models suggested by Pitt et al. • Information effectiveness: Assesses the quality of information in terms of the design. [75] and Delone and McLean [31]. and the various impacts that systems have on the user's work. use. the "updated paradigm" that emphasizes establishing the unidimensionality of measurement scales [40. 244]. 89] was followed. These "consequences" . The services provided by the IS function include activities ranging from systems development to help desk to consulting. Operationalization of Constructs Two sets of constructs were operationalized in this study—the three-dimensional ISFS construct and the constructs related to the consequences of ISFP. this approach is different from the popular "user satisfaction" measures [6. Therefore. 35]. "Systems" encompass all IS applications that the user regularly uses. system and information quality are "attributes of applications. using LISREL notation.8. they are not sufficient to reflect the effectiveness of the entire IS function. IS research has traditionally separated the effect of systems and information as two distinct constructs [30]. operation. and value [108] provided by information as well as the effects of the information on the user's job. ease of use. p. The information can be generated from any of the systems that the user makes use of. response time.

The three output dimensions of Figure 1 are the basis for three ISFS dimensions. Mirani and King [67]. Kraemer et al. were used to assess nomological validity. e-commerce.MEASURING THE PERFORMANCE OF INFORMATION SYSTEMS 91 Systems Performance (SYSP) Information Effectiveness (INFOE) Service Performance (SERVP) Figure 2. [59]. Systems Performance Measures of systems performance assess the quality aspects of systems and the various effects that IS have on the user's work. as shown in Figure 1.. Empirical studies listed under the categories "system quality" and "individual impact" in DeLone and McLean's [30] IS Success Model were reviewed to collect the measure used in those studies. Whenever possible. Davis [29]. Some new measures were also developed from reviews of both practitioner and research literatures to reflect developments that have occurred subsequent to the development of the measures from which most items were obtained (e.g. Doll and Torkzadeh [35]. instruments developed by Baroudi and Orlikowski [8]. Ryker and Nath [81]. enterprise resource planning [ERP]. and Torkzadeh and Doll [103] were also reviewed and . Three-Dimensional Model of ISFS constructs (business process effectiveness and organizational performance). etc). In addition. Saarinen [82]. Goodhue and Thompson [43]. previously developed items that had been empirically tested were used or adopted to enhance the validity and reliability of the instrument under development.

Therefore. since the focus of their instrument is on quality of information. and services were incorporated to expand the item pools for each dimension. [38]. customer relationship management [39]. In addition.71]. Service Performance Measures of service performance assess each user's experience witb the services provided by tbe IS function in terms of the quality and flexibility of the services. The entire IS-SERVQUAL instrument is included in this dimension for comprehensiveness. 66]. . some new items were developed. Information Effectiveness Measures of information effectiveness assess the quality of the information provided by IS as well as the effects of the information on the user's job. 44. In addition to utilizing existing items to measure these constructs. the 118 measures developed by Wang and Strong make up the majority of items in this dimension. the emergence of innovations that have come into use since most of the prior instruments were developed prompted tbe inclusion of new items to measure the IS function's performance in seven new areas: ERP [51]. 102]. and help desks [20]—were also reviewed and included to ensure tbe comprehensiveness of measures for this dimension. 31 new items gleaned from the practitioner and research literatures to reflect potential user assessments of IS function's contribution to those areas in terms of systems. literature on three areas of IS functional services that were not explicitly covered by the service quality literature—training [60. However. electronic business [9.65.64. KING included for more updated measures published subsequent to DeLone and McLean's original review. Wang and Strong [109] developed a more comprehensive instrument that encompasses all measures mentioned in DeLone and McLean's review. Table 2 shows the subconstructs for each dimension that resulted from tbe g-sort process.55]. 68] to reduce the number of items and to ensure the content validity of the ISFS instrument. knowledge management [45. electronic commerce [22. Although DeLone and McLean's [30] "information quality" provided a good source for existing measures. 97]. In total. supply chain management [37]. 100]. New measures were also incorporated to augment the IS-SERVQUAL items based on the more comprehensive view of service performance proposed by Fitzgerald et al.92 JERRY CHA-JAN CHANG AND WILLIAM R. in order to ensure coverage of measures on the effects of information on the user's job. Instrument Development A total of 378 items were initially generated. information centers [11. and organizational learning [86. The last round of g-sort resulted in the identification of subconstructs with multiple items for each of the three dimensions. information. 47. 33. Multiple rounds of Q-sorting and item categorization were conducted [29.

To avoid common-source bias. A packet consisfing of one "Consequences of ISFP" instrument and three ISFS instruments was sent to tbe CIOs of these companies. it is unlikely that the users would be able to consciously bias tbe results due to tbe focus of the analysis (variance explanafion) and the length and complexity of the ISFS instrument. Although it is possible tbat tbe CIO migbt distribute the ISFS survey to "friendly" users and potenfially bias the responses. Data for the ISFS instrument were collected from IS users. and 32 items for the three dimensions." Tbe final version of the instrument is in the Appendix. The CIO is deemed to be suitable for receiving the packet because the topic of this research would be of great interest to bim or her.hoovers.100 medium-to-large companies with annual sales over $250 million was randomly selected from Hoover's Online (www. Survey Design and Execution A sample of 2. were specified. Sub-ISFS Constructs from g-Sort Systems performance Effect on job Effect on external constituencies Effect on internal processes Effect on knowledge and learning Systems features Ease of use Informafion effectiveness Intrinsic quality of information Contextual quality of information Presentational quality of information Accessibility of information Reliability of information Flexibility of information Usefulness of information Service performance Responsiveness Reliability Service provider quality Empathy Training Flexibility of services Cost/benefit of services The g-sorts resulted in an ISFS instrument that consists of 42.MEASURING THE PERFORMANCE OF INFORMATION SYSTEMS 93 Table 2. familiarity with IS.com) and InformafionWeek 500. and so on. The characteristics of desirable user-respondents in terms of various functional areas. tberefore increasing the potential for participation. respectively. and organizational CIOs were asked to respond to a "Consequences of ISFP" survey which was used as a basis for establishing nomological validity. . 36. All items were measured using a Likert-type scale ranging from 1 (hardly at all) to 5 (to a great extent) with 0 denofing "not applicable. data were collected from two types of respondents in each of the sampled organizafions. The CIO is also an appropriate respondent to the "Consequence of ISFP" survey because he or she is at a high-enough position to provide meaningful responses concerning consequences. The CIO was asked to respond to the "Consequences of ISFP" survey and to forward ISFS instruments to three IS users.

The respondents are distributed across all functional areas.94 JERRY CHA-JAN CHANG AND WILLIAM R. Therefore. banking/finance (10.2 percent for the CIO survey. and 6.6 percent). with accounting and finance. we first use exploratory factor analysis to determine the number of factors. construct validity. Two analyses were conducted to assess possible nonresponse bias. and nomological validity. f-tests of company size in terms of revenue. and manufacturing and operations being the top three. then use confirmatory factor analysis iteratively to eliminate items that loaded on multiple factors to establish unidimensionality. INSTRUMENT VALIDATION REQUIRES THE EVALUATION Content Validity Content validity refers to the extent to which the measurement items represent and cover the domain of the construct [54].7 percent of the respondents are at the upper-management level and 39. and medicine/health (7. For the ISFS instrument. reliability. indicating that the returned surveys were responded to by individuals at the desired level. it can be concluded that there was no nonresponse bias in the sample and that the relatively low percentage response rate does not degrade the generalizability of the ISFS instrument [57]. In addition. 47. This resulted in a response rate of 7. followed by wholesale/retail (13.7 percent). 5. At the conclusion of data collection in 2001. 346 usable ISFS instruments and 130 "Consequences of ISFS" surveys were received. Sample Demographics The participating companies represent more than 20 industries with nearly a quarter of the companies in manufacturing (24.1 percent matchedpair responses. letters were sent to the CIOs who returned the CIO survey soliciting additional user participation. 62] also showed no significant differences.9 percent were at the middle management level. with 120 companies having responses from at least one IS user and the CIO.8 percent). More than 80 percent of the respondents have titles that are at the upper-management level. The range of annual sales was between $253 million and $45. Following Segars's [89] process for instrument validation. sales and marketing. with an average of $4 billion for the sample. KING Two rounds of reminders were sent to initial nonrespondents to improve the response rate. where appropriate.6 percent for the ISFS questionnaire.352 billion. r-tests of 30 items randomly selected from the three ISFS dimensions (10 items each) between the early (first third) and late (last third) respondents [4. For the "Consequences of ISFP" surveys. It is established by showing that the "items are .9 percent of the respondents hold the title of CIO. Instrument Validation of content validity.8 percent). net income. 46. and number of employees between responding and nonresponding companies showed no significant differences.

0 that explained 70.MEASURING THE PERFORMANCE OF INFORMATION SYSTEMS 95 a sample of a universe" of the investigator's interest and by "defining a universe of items and sampling systematically within this universe" [26.6 percent. 90]. Three separate exploratory factor analyses were conducted using principal components with varimax rotation as the extraction method. Another factor in "information effectiveness" had only three items. exploratory factor analyses were first conducted for items within each dimension to determine the factors.8 percent. six for information effectiveness. with the development of new items when necessary. There were seven. Therefore. Churchill [25] recommended specifying the domain of the construct followed by generating a sample of items as the first two steps in instrument development to ensure content validity. 49]. and then as a collective network" [91. Since it is one of the original "ease-of-use" items from Davis [29]. The initial items were refined through a series of Q-sorts and a pilot test. and sample items should come from existing instruments. respectively. and 69. Although the subconstructs of the three basic dimensions described earlier were identified during the g-sort. These development procedures ensured the content validity of the instruments. This method of analysis provides the fullest evidence of measurement efficiency and avoids problems caused by excessive error in measurement [2. Segars and Grover suggest that "measured factors be modeled in isolation. It would be "just identified" for confirmatory factor analysis and was only analyzed in conjunction with other factors in the same dimension. Review of the items showed that most factors loaded very closely to the subconstructs identified by the Q-sort. 53.6 percent of variance for systems performance. domain development was guided by theories in both organizational effectiveness and IS research. In total. and service performance. Gerbing and Anderson suggest that "confirmatory factor analysis affords a stricter interpretation of unidimensionality" [40. 17 measurement models were analyzed. it was included into the factor that contains the rest of the "ease-of-use" items. seven. p. those factors needed to be empirically tested. 186] than other commonly used methods. Each model went through an iterative modification process to improve its model . and five factors with eigenvalues greater than 1. then in pairs. This is acceptable. p. and five for service performance. Domain development should be based on existing theories. 148]. since the items for each dimension were clearly separated in the instrument into sections with opening statements describing the nature of the items in the sections. 68. p. 3. To establish unidimensionality. In this study. This process resulted in six factors for systems performance. Unidimensionality and Convergent Validity Unidimensionality requires that only a single trait or construct is being measured by a set of measures and "is the most critical and basic assumption of measurement theory" [50. information effectiveness. One factor in "systems performance" had only one item. p. 58]. the items that loaded on the same factor were then analyzed with confirmatory factor analysis using LISREL—with two exceptions. Items from existing instruments formed the overwhelming majority of the item pool.

The chi-square and significant factor loadings provide direct statistical evidences of both convergent validity and unidimensionality [91]. and 5.50. 93]. This composite reliability is "a measure of internal consistency of the construct indicators. 89].96 JERRY CHA. Second. However. This suggests that even though all scales (except one) were reliable in measuring their respective constructs. all three dimensions were combined and tested for model fit (Figure 6). depicting the degree to which they 'indicate' the common latent (unobserved) construct" [48.5 to indicate that the variance explained by the construct is larger than measurement error. a composite reliability for each factor can be calculated [5. after every measurement model completed its modification process. items with standardized factor loading below 0.JAN CHANG AND WILLIAM R. With all cross-loading items eliminated.45 were eliminated [78] one at a time. First. After the full measurement models were purified. p. Tbe final. Another measure of reliability is the average variance extracted (AVE). items with cross-loadings in the full model were dropped. second-order measurement models for tbe three ISFS dimensions are presented in Figures 3. pairs of models within each dimension were tested iteratively to identify and eliminate items with cross-loadings. second-order models tbat reflect the subconstructs within each ISFS dimension were tested. With each of the three ISFS dimensions properly tested independently. Tbe construct reliability and AVE of all dimensions and subconstructs are presented in Table 3. Despite the low AVE. discriminant validity can be established by comparing the model fit of an unconstrained model that estimates the correlation between a pair of constructs and a . which reflects the overall amount of variance that is captured by the construct in relation to the amount of variance due to measurement error [48. This process was conducted iteratively by making one modification at a time until either good model fit was achieved or no modification was suggested. Table 3 indicates that all subconstructs showed good composite reliability except the IS training scale. 4. showed remarkably good fit for such high complexity. The value of AVE should exceed 0. there are some scales with an AVE below 0. all factors within the same dimension were tested in a full measurement model. some of them were less capable of providing good measures of their own construct. shown in Figure 6. 612]. this modification was only implemented when theories suggested that the two items should be correlated. The complete ISFS model. Discriminant Validity Discriminant validity refers to tbe ability of the items in a factor to differentiate themselves from items tbat are measuring other factors. error terms between pairs of items were allowed to correlate based on a modification index. KING fit. However. In structural equation modeling (SEM). those scales were retained to ensure the comprehensiveness of the ISFS instrument. Reliability In assessing measures using confirmatory factor analysis. Following Segars and Grover's [90] procedure. Again.

55 "|<-0.improve work quality |<-0.36-*' 3«^ V ^ Seu2 .customer service Sie4 .79»| Sik2 . Tests of all possible pairs of subconstructs witbin each dimension were conducted.reduce cycle time Sii7 .knowledge transfer Ssc7 .19 W-0.62.84^ Sie3 .individual learning Sik5 .-reduce process cost "0.fast response time l.manage external partner f<-0.group problem solving ).90. The difference in model fit is evaluated by the cbi-square difference (witb one degree of freedom) between the models.easy to ure 0.09 I Bfi 1 Sij7 .share infonnation Sie5 .easy to leam d 0.82>|Ssc4-reliable -accessible Figure 3.49«s I 0.23 W-0.038.^1 |<-0.give confidence l. constrained model that fixes the correlation between the constructs to unity.retain value customer I Sie6 .increase awareness Sij8 .54< .87. RMSEA (root mean square error of approximation) = 0.30 0.51'«0.easy to become skillful l<-0.12 .easier to do job 0.60 ± 0. = 411.43-< 1.enhance problem solving |<-0.cost-effective Ssc8 .26 0. Full Second-Order Measurement Model for Systems Perfomiance Notes: f = 618. the results are .customer satisfaction 0.select qualified suppliers |<-0. d.25 J<-0.06 0.increase productivity ^ Sij6 . GH (goodness-of-fit index) = 0.06 |<-0.88*1 Sii6 .f.74 Sii5.68.18 0.-M Ssc 1 .31\ 0.MEASURING THE PERFORMANCE OF INFORMATION SYSTEMS 97 Sijl .decision participation 0./? = 0.other area information Sikl .improve perfonnance Sij3 .33 I /. AGFI (adjusted goodness-of-fit index) = 0.improve decision Sij4 .04 1 n < 0.7 [<-0.gO»| Ssc9 .71": Siel .05 Sie2 .responsive to changes ).group decision making Sik3 .18 ~f<-0.flexible '0.86>| SijS .09 Sij9 .00.76*1 Seu 1 . |<-0. Discriminant validity is demonstrated when the unconstrained model has a significantly better fit tban the constrained model.

making decisions J<-0.easily integrat^ -easily updated Iui2 . indicating that each scale captures a construct that is significantly unique and independent of other constructs.77»|lcql .13 85>| Iui4 .78 1. This model consists of the rightmost portions of .20.give competitive edge 11ui7 .improve efficiency 79.0.understandable i.85>|lril-reliable -verifiable i.23 |<-0.f = 156. As shown. An operationalization of the theoretical model of Figure 1 that considers the two important consequences of ISFP was used. p. Nomological Validity A nomological network that specifies "probable (hypothetical) linkages between the construct of interest and measures of other constructs" [85. AGFI = 0.interpretable 0.27 _]<-0.identifying problems 0.033.1 •0.43 Iai4 .up-to-date J<-0.88>llp<ll -well-organized -well-defined ail -available .30 .83 .27 |<-0. \ Iui6 .3H 0. RMSEA = 0.94.easily changed 80>j Ifi3 . all chi-square differences are significant atp < 0. presented in Table 4.73.received in timely nianner|<-0.98 JERRY CHA-JAN CHANG AND WILLIAM R. GFI = 0.38 [<-0.defining problems 0. d.37 .7(»| Iiq2 .92.001. 14] further clarifies the ISFS construct and provides an additional basis for construct validation.81 | Iui3 . This provides evidence of discriminant validity. p = 0.40 _J<-0.66' j Ifi2 .48 |<-0.24 _|<-0. Full Second-Order Measurement Model for Information Effectiveness Notes: f = 216.76>j Iai3 .important -relevant i. KING y | liql .00.10 |<-0. Figure 4.

polite people 0.73 Figure 5.84 95>| Ssp3 .09.16 0.45 0.sufficient service variet71<-0.53 0.sufficient people 72^ 0.responds timely |<-0.6: 0.understand needs Stg3 .71 ^ Stgl .! ^-0. the positive relationship should hold in this study.037.sufficient service capacity[<-0. Organizational Performance Although a positive relationship between IS effectiveness and business performance has been suggested.pleasant to work with \4-0. [21] empirically showed a significant positive relationship between IS and business performance. Using "user information satisfaction" and "strategic impact of IS" as surrogate IS effectiveness measures and several perceptual measures as business performance. Full Second-Order Measurement Model for Service Performance Notes: x^ = 139.46»| Stg2 .reliable people •^ SepS ..f = 94. Chan et al.extends external services r<-0.^ n SfsS .22 -timely completion \<-0.56* 0.57 i. p = 0. RMSEA = 0.become better user 7i\ Sspl . d.useful training programs 0.84 0.dependable people 0.43 0. Figure 1 that relate ISFP to business process effectiveness and to organizational performance.66.95.86 _^ Sfsl .show respect Ssp4 .68.00.82 0.variety of training 0.52.79^ Srl6 .61 Sfs3 .MEASURING THE PERFORMANCE OF INFORMATION SYSTEMS 99 I jj)f»| Srpl.41 Sfs4 . .^1 1 ^ Scbl .^ 0.93. AGFI = 0. Since ISFS is posited as a more comprehensive measure of IS performance. the evidence of such an effect has proved to be elusive [58].16 0. GFI = 0.93 0.cost-effective services r<-0.19 Srl2 .

77. w . sales revenue. 101.90. Tbe literature has focused on assessing tbe extent to which the IS improves the organization's return on investment (ROI). = 2.65 O.094. d. 77. market share. RMSEA = 0.039. 106. and customer relations [21. GFI = 0. customer satisfaction.. This construct captures the IS function's contribution to the overall performance of the organization.00. operational efficiency.164.51 -^^^terpersonal quality of service provider IS training 0.113].79.-:yC^j__Contextual quality of infonnation 0. competitiveness.~ '—~ Accessibility of information Flexibility of infonnation Usefulness of information Responsiveness of services Intrinsic quality of service provider 0. AGFI = 0. p = 0. The Complete ISFS Model Notes: y} = 3.91 Rexibility of services Figure 6. KING Impact on job Impact on external constituencies Impact on internal processes Impact on knowledge and leaming Systems usage characteristics Intrinsic systems quality ity of information Reliability of information .100 JERRY CHA-JAN CHANG AND WILLIAM R. . Since subjective measures of those variables have been .76->C[^_ft«sentational quality of i n f o r m a t i o i i _ ^ '•85.f.

63 0.85 0. marketing services.58 0. Based on Porter and Millar [76] and Devenport [28].MEASURING THE PERFORMANCE OF INFORMATION SYSTEMS 101 Table 3. management processes.92 0. Validity and Reliability of the Measures Used in Nomological Analysis.66 0. Xia [113] developed a 39-item instrument to assess executives' perception of the extent to which IT improved the effecfiveness of six value-chain activifies.79 0.95 0.48 0. .59 0.88 0.77 0.89 0. Reliability of Measurement Factors in ISFS Dimensions Factor names Systems performance Impact on job Impact on external constituencies Impact on internal processes Impact on knowledge and learning Systems usage cbaracteristics Intrinsic systems quality Information effectiveness Intrinsic quality of information Reliability of information Contextual quality of information Presentational quality of information Accessibility of information Flexibility of information Usefulness of information Service performance Responsiveness of services Intrinsic quality of service provider Interpersonal quality of service provider IS training Flexibility of services Reliability 0. supplier relations.56 0.62 0.91 0. and customer relations.89 0. Improvements to tbe value-cbain activities through IT are captured in this construct.33 0.37 considered to be acceptable in the literature [32.92 0.85 0.93 0.79 0. seven items tbat assess the CIO's perception of IS's contribution to improving the organization's performance in those areas were used.80 0.75 0.56 0. Data analysis resulted in six factors: production operations. product development. the IS function should also have an effect on organizational perfonnance through its impact on tbe effectiveness of business processes.57 0.88 0.66 0.79 0.56 0.49 0. as shown in Figure 1. IS have traditionally been implemented to improve the efficiencies of internal operations.69 AVE 0.80 0. Business Processes Effectiveness Aside from directly affecting organizational performance.87 0. Items representing those six factors were generated for tbis construct.68 0.84 0.and interfunctional business processes [94]. 107].89 0.82 0.66 0.81 0.63 0. Although all scales in the "Consequences of ISFP" survey were from previously tested instruments.73 0. This use of IT has more recently been applied in redesigning both intra.

the purpose here is to demonstrate the expected association between the constructs.60*** 78. Construct validation was assessed by exploratory factor analysis using principal components with oblique rotation as the extraction method.52*** 46.60*** 52. Therefore. Table 5 presents the results of the analyses.34*** 60.31*** 47. tests were conducted to ensure the reliability and validity of those constructs.66*** 47.31*** Service performance Factor2 23. Reliability was evaluated using Cronbach's alpha.52*** Factor4 46.45*** Factor6 49.97*** 55.36*** 52. the nomological network was supported. there were significant positive correlations between the ISFS construct and the two consequence constructs.97*** 58.03*** 65. Items with low corrected item-total correlation.31*** Factor5 66.31*** Factor5 30.68*** Factor7 67.27*** 47.93*** 48. . The average of all items for each construct was used to avoid problems that may occur due to differences in measurement scales.47*** Information effectiveness Factor2 63.73*** 53.55*** 59.84*** Factor3 51.76*** 73.42*** 75.80*** 47. Factor2 Factor3 Factor4 Factor5 Factor6 37.96*** Factor4 64. Chi-Square Differences Between Factors Chi-square differences Factor 1 Systems performance Factor2 39. Although both constructs had two factors extracted. Although correlation is not sufficient to establish causal relationships.82*** 48. As shown in Table 6.84*** 75. Therefore.42*** 64.21*** 40. Items in the final measurement models were used to create an overall score for the ISFS construct.61*** Factor3 31. There was also significant positive correlation between business processes effectiveness and organizational performance. all items within each construct were retained and used to create an overall score for the construct. as shown in Tables 5 and 6.23*** 65. were dropped. the two factors were significantly correlated in both cases.22*** 48.33*** Factor4 26.68*** 70.69*** ***p< 0.00*** 38.89*** 73. indicating low internal consistency for the items.87*** 67. KING Table 4.66*** 28.16*** Factor5 65.65*** Factor6 62.001.93*** since a different sample was used.61*** Factor3 80. Table 6 shows the correlation among the constructs.67*** 78.04*** 57.102 JERRY CHA-JAN CHANG AND WILLIAM R.

75. 24. All scales have high reliability. other variables might have been analyzed [57].01 level (two-tailed). The sample size. 84]). Thus.04 Cronbach's alpha 0. Each dimension contains several unidimensional subconstructs. some limitations to the instrument need to be pointed out. and JVIanagerial Uses THE ISFS INSTRUMENT IS A COMPREHENSIVE ONE that has been designed to measure the performance of the entire IS function. Of course.g.750" 0. Evidence from discriminant validity analyses showed that each scale is measuring a construct that is different from the other constructs.48 70. Results. 98]) as well as various subfunctional "levels" that have previously been measured (e.g. Correlation of Constructs in1 the NomologicalNetwork Constructs Organizational performance ISFS Business processes 0.214" Organizational performance 0. Further studies will need to explore and improve these items.860 Constructs Business processes effectiveness Organizational performance Table 6. Despite this.MEASURING THE PERFORMANCE OF INFORMATION SYSTEMS 103 Table 5.. each of which is measured by at least two items. way. is "borderline" for the number of variables relative to the number of observations that are involved in the SEM analysis. but limited..205" ** Correlation is significant at the 0. The instrument consists of three major dimensions: systems performance. and service performance. The ISFS may therefore be thought of as a preliminary step that can guide future research and enhance practice in a significant. these items were retained for comprehensiveness or theoretical soundness. [31. 16. 80. We also note that two subconstructs—"IS training" and "flexibility of services"—were borderline with respect to reliability. Limitations.759 0. [10. information effectiveness. The ISFS integrates aspects of various philosophical approaches that have been taken to developing IT metrics (e. some caution should be taken until it is revalidated. Although this is common practice. The nonresponse bias analysis was conducted by comparing early responders to late responders and in terms of organizational size-related variables for responders and nonresponders. while large. The comprehensiveness of the ISFS instrument was demonstrated by its consideration of "all" . Reliability and Validity of Nomological Constructs Number of factors extracted 2 2 Number of items retained 6 7 Variance explained (percent) 63. especially for "matched-pair" survey studies.

When used within an organization. as represented by tbe subconstructs. ultimately. analysis.JAN CHANG AND WILLIAM R. Comparing postinvestment or outsourcing IS perfonnance to the baseline would provide a more objective evaluation of the efficacy of the actions taken. the ISFS instrument can offer valuable insights for internal benchmarking. the ISFS also allows the IS function to pinpoint specific areas that need improvements and to track both these areas and overall performance over time. with the intensifying scrutiny on IT investment. At this latter level. The resulting instrument is not only comprehensive enough to cover all aspects of ISFP but also sensitive enough to pinpoint specific areas tbat need attention. the ISFS instrument can be very useful to establish a baseline on the current status of ISFP. To be effective. thus providing the basis for "continuous improvement" [73]. Because of the use of data from a cross-sectional field survey for validation. such an indication must be further assessed in light of the overall "Gestalt" of dimensions. KJNG IS activities as reflected in the hierarchical structure of 18 unidimensional subconstructs within the three dimensions. As a result. in terms of both functional areas and organizational levels. will allow the organization to develop follow-up actions to maximize IS performance and. in turn. It has already been used in each of these ways in a number of organizations that participated in the study. Thus. the fact that one dimension may be "low" does not tell the whole story. there is no assurance that the "subscores" have the same degree of nomological validity. Comparing the results of ISFS instruments from different divisions or areas would help identify areas of IS excellence and facilitate the transfer of knowledge to other areas. This allows IS managers to use this instrument to assess their strengths and weaknesses in those subconstructs and dimensions. When used in large organizations witb decentralized IS functions. In this sense. the instrument can be used in various ways—as an overall evaluative tool. The result of later assessments should be compared to earlier evaluations to detect changes that would indicate improvements or degradation in the IS function's performance and in specific performance areas. Since such an analysis is beyond the scope of this study. Of course. In addition. or in evaluating specific subareas. This. we leave it to others who may wish to concern themselves with this issue. to improve organizational performance. The average scores for each subconstruct or dimension are the indicators of the IS function's performance for the specific subarea or dimension. it also provides means of identifying the specific performance areas. since the dimensions are not independent. the goal of developing a measure to assess the performance of the IS function was successfully achieved in this study. This would ensure appropriate representations of the diverse users in the organization. The ISFS instrument should be a useful tool for organizations to use in continuously monitoring the performance of their IS func- . One additional caveat may be useful. the ISFS instrument is applicable to a variety of industries. the instrument should be administered to a range of systems users. The nomological validation was performed in terms of the overall ISFS score.104 JERRY CHA. as a "Gestalt" of areas that may be tracked over time. that may need improvement. Overall. and outsourcing. 
the ISFS instrument should be administered repeatedly at a fixed interval between quarterly and annually. Thus.

14. and Gerging. Managing information tecbnology investment risk: A real options perspective. G. 247-260. Organizational Effectiveness: A Comparison of Multiple Models.A.. J. 7. J. 225-242.. B. Management Science. 38. Brancheau.J. 4 (April 1987).. D.. MIS Quarterly. 12. A short-form measure of user information satisfaction: A psychometric evaluation and notes on use. Information Systems Research.. Developing a 3-D model of information systems success. O'Brien. Psychological Bulletin. 9. Y.. 10. 530-545.C. S.C Key issues in infonnation systems management: 1994-95 SIM Delphi results.. 44-59. 396-402.E. 4. and Neo. Bonner. Atlanta: Association for Information Systems. 19. 4 (Spring 1988). 16. Proceedings of Seventeenth International Conference on Information Systems.L.. A. An approach for confirmatory measurement and structural equation modeling of organizational properties. An examination of the validity of two models of attitude. A. J. P. Journal of Management Information Systems. M. 4 (December 2002).C. Structural equation modeling in practice: A review and recommended two-step approach. S. Monro. Brynjolfsson. Sloan Management Review. . The productivity paradox of information technology. 3 (September 1990). 8. andYoo. Minneapolis.P. MIS Quarterly. M. dissertation. Carr. Cameron. Anderson. as well as in studies that seek to complement the ISFS througb otber analyses. J. Sanders (eds. CL. and Powell. L. J. and Wetherbe. W.M. 13. 4. T. M.A. 404415. P. 32. E. and De Serre. 261-277. 3 (Spring 1997). REFERENCES 1.MEASURING THE PERFORMANCE OF INFORMATION SYSTEMS 105 tion and for researchers to use in studies that require ISFP as a dependent or independent construct. S. J.. Broadbent. B. Bagozzi.. Garrity and G. 18. Martin. 6.S. Ph. Rivard. 12 (December 1993).). Benaroch. Journal of Marketing Research. 1 (January 1986). 174-194. MIS Quarterly. 1998. 323-359. Janz. 1 (Summer 2000). 87-112. Broadbent. I. In J. 3 (August 1977). Development of a tool for measuring and analyzing computer user satisfaction. M.S. Bailey. 43-84. 36. R. 5.E.S.. 17. Firm context and patterns of IT infrastructure capability. Estimating nonresponse bias in mail surveys.W. 2 3 ^ 5 .J. 525-541. Key issues in information systems management. Some conclusions about organizational effectiveness.A. Baroudi. Cameron.J. 20. A comparative study of distributed learning environments on learning outcomes. 14. and Overton. M. Multivariate Behavioral Research. Cameron and D. Communications of the ACM. 11. 1-14.. Weill.. Jarvenpaa.L. pp. 29. D. 17. In K. 33. Workflow management issues in e-business.I.D.L. Byrd. K.). Armstrong. 3 (May 1988). K. Basu. PA: Idea Group. 77-92.. and Orlikowski. In E. Hershey. M.C. 19. 2 (June 1996). and Turner. Bergeron.D. 103.W. 2 (Fall 2002). 5 (May 1983). Journal of Management Information Systems. 1 (March 1987).. 67-77. 20. T. Alavi. Investigating the support role of the information center.N.C. T.. J. Journal of Management Information Systems.S. 1996. and A. and Pearson. J. 1 (March 2002). D. F. Anderson.. New York: Academic Press. 1999. Information Systems Success Measurement. 1983. pp.. Levy. Ballantine. Marakas. P. A study of organizational effectiveness and its predictors. J. and Kumar. Measuring the flexibility of information technology infrastructure: Exploratory analysis of a construct. 11.C.. 16. and Whetton.). 3 (July 1981). DeGross. pp. Whetton (eds. J. Managing service quality at the IS help desk: Toward the development and testing of TECH-QUAL. 167-208. 
Management by Maxim: How business and IT managers can create IT infrastructers. Brancheau. Management Science. a model of IS technical support service quality. 2. 15. 13. 13. Srinivasan (eds. 46-59. 14. and Wetherbe.. Management Science. University of Minnesota.S. 3. Information Systems Research. and Weill. 411^23. A.

H.. Fan. 1 (Summer 2001) 185-214. 1 (Summer 2002). Implementation team responsiveness and user evaluation of customer relationship management: A quasi-experimental design study of social exchange theory. 2 (June 1997). Problems in Human Assessment. 13. 58. 4 (Spring 2003). J.). 1 (January 1994)..106 JERRY CHA-JAN CHANG AND WILLIAM R. and Torkzadeh. Devaraj. N. and Anderson.. D. R. Doll.L. D. P. 41. Grewal. 1 (February 1979). J.H. 5. Strategic Management Journal. 1 (March 2003). 125-150. S. Churchill. E. J. 33. 1967. 32. Shaping up for e-commerce: Institutional enablers of the organizational assimilation of Web technologies. 14. 175-194. A. 35. 16. Chan. 1 (March 1992). G.J. 28. A.. C. 316-333.. R. 8.R. and Segars. 65-90.D. 19. and strategic alignment. Fitzgerald. L. Information Systems Research. London: Chartered Institute of Management Accountants. 64-73.L.H. and Meehl.J. S. Journal of Management Information Systems. Capturing flexibility of information technology infrastructure: A study of resource characteristics and their measure. information systems strategic orientation. 319-340.. 25. Malhotra. Brignall. A. 3 (September 1984). The DeLone and McLean model of information systems success: A ten-year update. pp. F. 12.. R. Journal of Management Information Systems 12 2 (Fall 1995). M. 8. An updated paradigm for scale development incorporating unidimensionality and its assessment. Pacini. 22. Dess. and Kohli. D. 135-148. S.J. V. Journal of Management Information Systems. MIS Quarterly. 2 (May 1988).R. Dickson. and Ridings. and Sambamurthy. T. L. MIS Quarterly. Journal of Marketing Research. Fan.C.. 39.. 1993. Journal of Management Information Systems. 9-30. 2 (Fall 2002).. and McLean.H.. Barclay. V. Chatterjee. D. Duncan.. and Sambamurthy. 37-57.. R. 265-273. D. Information Systems Research. 60-95. 31. G.. and Kauffman. 7 ^ 2 . A paradigm for developing better measures of marketing constructs. 42. Johnston. In D. 40. and Voss. Antecedents of B2C channel satisfaction and preference: Validating e-commerce metrics. 25. MIS Quarterly. 34. 17. Y. 259-274. W.B.J. perceived ease of use. The measurement of end-user computing satisfaction. The shareholder-wealth and tradingvolume effects of information technology infrastructure investments. 23.E. 3 (Spring 1998). Journal of Marketing. R. Chircu. Chatterjee. and Whinston. 29. 19. Perceived usefulness. Boston: Harvard Business School Press. G. E. W. Information systems success: The quest for the dependent variable.. 26. Silvestro. J. Glazer. Journal of Management Information Systems. M. MIS Quarterly. R.. 26. 186-192. Journal of Marketing Research.W. 1-22.G. 3. 47-70. New York: McGraw-Hill.. Davenport.A. DeLone.J.. . Measuring the knower: Towards a theory of knowledge equity. 1993. 27.M. Journal of Management Information Systems. 125-131.J. 36.H. SERVPERF versus SERVQUAL: Reconciling performance-based and perceptions-minus-expectations measurement of service quality.. Davis. Decentralized mechanism design for supply chain organizations using an auction market. and Robinson. Information Systems Research. DeLone. Process Innovation: Reengineering Work Through Information Technology. Huff. M. and Taylor. and McLean. 3 (September 2002). Messick (eds. KING 21. G. Limits to value in electronic commerce-related IT investments. Business strategic orientation.M. Gerbing. 3 (September 1989). A. Gold. Cronin. California Management Review. S.... A. 2 (June 2002).2 (Fall 2000). 13.B. Construct validity in psychological tests. and Wetherbe. R. 

Appendix. ISFS Instrument

What Is the IS Function?

THIS QUESTIONNAIRE IS DESIGNED TO ASSESS the performance of the information systems (IS) function in your organization. As a user of some information systems/technology, you have your own definition of what the IS function means to you. The IS function includes all IS individuals, groups, and departments within the organization with whom you interact regularly, and it is the performance of "your" IS function that should be addressed here.

Effectiveness of Information

The following statements ask you to assess the general characteristics of the information that IS provides to you. Please try to focus on the data and information itself in giving the response that best represents your evaluation of each statement. Each statement is rated from 1 ("Hardly at all") to 5 ("To a great extent"); if a statement is not applicable to you, circle 0 (N/A).

The extent that the information is:
Interpretable
Understandable
Complete
Clear
Concise
Accurate
Secure
Important
Relevant
Usable
Well organized
Well defined
Available
Accessible
Up-to-date
Received in a timely manner
Reliable
Verifiable
Believable
Unbiased

The extent that the information:
Can be easily compared to past information.
Can be easily maintained.
Can be easily integrated.
Can be easily updated.
Can be easily changed.
Can be used for multiple purposes.
Meets all your requirements.

The following statements ask you to assess the outcome of using the information that IS provides to you. Each statement is rated from 1 ("Hardly at all") to 5 ("To a great extent"); if a statement is not applicable to you, circle 0 (N/A).

The extent that:
The amount of information is adequate.
It is easy to identify errors in information.
It is useful for defining problems.
It is useful for identifying problems.
It is useful for making decisions.
It improves your efficiency.
It improves your effectiveness.
It helps you discover new opportunities to serve customers.
It gives your company a competitive edge.

IS Service Performance

The following statements ask you to assess the performance of services provided by the IS department or function. Please circle the number that best represents your evaluation of each statement, from 1 ("Hardly at all") to 5 ("To a great extent"). If a statement is not applicable to you, circle the number 0 (N/A).

The extent that the IS function:
Responds to your service requests in a timely manner.
Completes its services in a timely manner.
Provides a sufficient variety of services.
Has sufficient people to provide services.
Has sufficient capacity to serve all its users.
Can provide emergency services.
Is dependable in providing services.
Extends its systems/services to your customers/suppliers.
Gives you individual attention.
Has your best interest at heart.

The extent that the:
Training programs offered by the IS function are useful.
Variety of training programs offered by the IS function is sufficient.
Training programs offered by the IS function are cost-effective.
IS function's services are valuable.
IS function's services are helpful.
IS function's services are cost-effective.

The extent that IS people:
Provide services for you promptly.
Are willing to help you.
Are helpful to you.
Are dependable.
Are reliable.
Are polite.
Are sincere.
Are pleasant to work with.
Show respect to you.
Instill confidence in you.
Have the knowledge and skill to do their job well.
Understand your specific needs.
Solve your problems as if they were their own.
Are efficient in performing their services.
Are effective in performing their services.
Help to make you a more knowledgeable computer user.

Systems Performance

The following statements ask you to assess the extent that systems produce various outcomes for you. The term "systems" does not refer to the information itself; rather, it refers to the capability to access, manipulate, produce, and present information to you (e.g., to access databases or to develop a spreadsheet). Please circle the number that best represents your evaluation of each statement, from 1 ("Hardly at all") to 5 ("To a great extent"). If a statement is not applicable to you, circle 0 (N/A).

The extent that systems:
Make it easier to do your job.
Improve your decisions.
Improve your job performance.
Give you confidence to accomplish your job.
Increase your productivity.
Improve the quality of your work product.
Enhance your problem-solving ability.
Facilitate your learning.
Increase your awareness of job-related information.
Increase your participation in decisions.
Facilitate knowledge utilization.
Facilitate knowledge transfer.
Facilitate collaborative problem solving.
Facilitate collective group decision making.
Facilitate collective group learning.
Contribute to innovation.
Provide you information from other areas in the organization.
Enhance information sharing with your customers/suppliers.
Improve customer service.
Improve customer satisfaction.
Help retain valued customers.
Help you manage relationships with external business partners.
Help you select and qualify desired suppliers.
Help you manage inbound logistics.
Speed product delivery.
Reduce process costs.
Reduce cycle times.
Streamline work processes.
Improve management control.

The following statements ask you to assess general characteristics of the information systems that you use regularly. Please circle the number that best represents your evaluation of each statement, from 1 ("Hardly at all") to 5 ("To a great extent"). If a statement is not applicable to you, circle 0 (N/A).

The extent that:
Systems have fast response time.
Systems are accessible.
Systems are reliable.
Systems are flexible.
Systems are well integrated.
Systems are easy to use.
System use is easy to learn.
It is easy to become skillful in using systems.
Systems are responsive to meet your changing needs.
System downtime is minimal.
Systems are cost-effective.
Systems meet your expectation.
Your company's intranet is easy to navigate.
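
The instrument itself does not prescribe how item responses are aggregated into dimension scores. As a minimal illustration only, the sketch below assumes each dimension is scored as the unweighted mean of its item ratings, with 0 ("N/A") responses treated as missing and excluded; the function and variable names are hypothetical, not part of the ISFS instrument.

```python
# Hypothetical scoring sketch for ISFS responses (an assumption, not the
# authors' published scoring procedure).
from statistics import mean

def dimension_score(ratings):
    """Mean of the 1-5 ratings for one dimension; 0 (N/A) items are excluded."""
    valid = [r for r in ratings if 1 <= r <= 5]
    return mean(valid) if valid else None  # None if every item was marked N/A

# Example: one respondent's ratings for five "Effectiveness of Information"
# items, where 0 marks an item circled as N/A.
information_effectiveness = [4, 5, 3, 0, 4]
print(dimension_score(information_effectiveness))  # 4.0
```

Under this assumption, a respondent's three dimension scores (systems performance, information effectiveness, and service performance) would each be computed the same way from their respective item blocks.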