

ISBN 1-84544-101-X ISSN 1367-3270


Volume 9 Number 1 2005

Journal of Knowledge Management
Technology in knowledge management
Guest Editor: Eric Tsui

www.emeraldinsight.com
Table of contents

Access this journal online 2

Guest editorial
The role of IT in KM: where are we now and where are we heading? 3
Eric Tsui

Note from the publisher 146

Articles
Integrating knowledge management technologies in organizational business processes: getting real time enterprises to deliver real business performance 7
Yogesh Malhotra

Balancing business process with business practice for organizational advantage 29
Laurence Lock Lee

The inseparability of modern knowledge management and computer-based technology 42
Clyde W. Holsapple

Understanding computer-mediated interorganizational collaboration: a model and framework 53
Lei Chi and Clyde W. Holsapple

Linking social network analysis with the analytic hierarchy process for knowledge mapping in organizations 76
Jay Liebowitz

A knowledge-based system to support procurement decision 87
H.C.W. Lau, A. Ning, K.F. Pun, K.S. Chin and W.H. Ip

The "global" and the "local" in knowledge management 101
Joseph G. Davis, Eswaran Subrahamanian and Arthur W. Westerberg

Knowledge management systems: finding a way with technology 113
John S. Edwards, Duncan Shaw and Paul M. Collier

Connected brains 126
Paul Iske and Willem Boersma

VOL. 9 NO. 1 2005, p. 1, Emerald Group Publishing Limited, ISSN 1367-3270 | JOURNAL OF KNOWLEDGE MANAGEMENT | PAGE 1
Guest editorial
The role of IT in KM: where are we now and
where are we heading?
Eric Tsui

Eric Tsui, Innovation Services, CSC Australia and Department of Industrial and Systems Engineering, The Hong Kong Polytechnic University (eric.tsui@polyu.edu.hk).

Abstract
Purpose – To provide a summary of the major trends in the evolution of knowledge management (KM) technologies in the last five years.
Design/methodology/approach – Drawing from a range of literature published in the academic and industry arenas, including the articles accepted in the special issue, the author also applied his own personal experience and practice knowledge in the field to summarize the three major trends in the use
Findings – First, KM is becoming more and more process-centric and relevant technologies are
gradually being aligned to support process-based KM activities. Second, there is the emergence of
personal networks and applications. Third, knowledge sharing and capturing are becoming more
instantaneous (i.e. on-demand and just-in-time).
Practical implications – KM is becoming more and more just-in-time. Large-scale KM programmes still
prevail but, in future, the technical infrastructure and information content of these programmes also need
to support ad hoc, spontaneous but intensive intra- and inter-organizational collaborations.
Originality/value – While most articles on KM technologies tend to focus on individual
technique(s)/system(s), this paper provides a succinct and high-level summary of the evolution of KM
technologies from a commercial and practical perspective.

Keywords Computers, Knowledge management, Social networks, Portals


Paper type Research paper

1. Introduction
In the last five to six years, we have seen plenty of knowledge management (KM) projects come
and go. Many of these projects were successful and organizations are still leveraging benefits
from their KM systems. However, it is also fair to say that a considerable proportion of KM
projects/initiatives have failed. In retrospect, many of the KM projects that commenced in the
past were primarily driven by the adoption of technologies: search engines, retrieval and classification tools, e-collaboration tools, portals and content management systems. One of the lessons learnt from these failures is that technology alone should not be the primary driver for any KM project/initiative, and that an appropriate balance of technology, process, people and content is instrumental to the continued success of any KM deployment. Technology, however, can act as a "catalyst" (i.e. an accelerator) for the introduction and initial buy-in of a KM program but, in order to be successful, this accelerated adoption has to be aligned with a defined KM strategy and supported by a change program.

Regarding the technologies for supporting KM, as mentioned above, during the 1990s these technologies tended to be discrete, distinct from each other and not aligned with defined business processes. When implemented, a user might have had to operate separate systems in order to accomplish his/her task (e.g. locating company procedures/methodologies, discussing with peers and sharing material with peers).

First and foremost, the Guest Editor would like to thank Rory Chase, Editor-in-Chief of the Journal of Knowledge Management, for offering him the privilege of guest editing this special issue, the very first one on technology for the journal. He was also allowed a generous timeframe to compile this special issue, for which he feels very grateful. The following persons have acted as reviewers for papers submitted to this special issue and their assistance is deeply appreciated: Paul Iske, Frada Burstein, Karl Wiig, Robert Smith, John Debenham, Simeon Simoff, Jay Liebowitz, Bill Martin, Ralph Miller, Jeanette Bruno, John Gordon, Kevin Johnson, Patti Anklam, Yuefan Li, Ian Watson, Geoff Webb, Igor Hawryszkiewycz, Donmeng Zhu, John Edwards, Joseph Davis, Clyde Holsapple, Chris Lueg, Brian Garner, Zili Zhang, Yogesh Malhotra, Ingrid Slembek, and Dickson Lukose.

VOL. 9 NO. 1 2005, pp. 3-6, Emerald Group Publishing Limited, ISSN 1367-3270 | JOURNAL OF KNOWLEDGE MANAGEMENT | PAGE 3

2. Evolution of KM technologies
Over the last five years, there have been two significant changes in the landscape of KM technologies. First, due to advancements in open standards, these technologies have
become far more interoperable and less platform dependent. As a consequence, many of
these technologies are now componentized and can be embedded seamlessly into other
enterprise applications. For example, a search engine can be incorporated as part of an
e-collaboration suite and a portal usually provides a document management component.
The second change is the bundling of the market offerings by the vendors of commercial KM
technologies. KM solutions in the marketplace today are likely to be a collection of
complementary technologies that aim at execution of a specific process (e.g. collaborative
product development), a solution (e.g. problem resolution and service support by a contact
center) or a particular industry (e.g. wealth management portal in financial services). This
change is brought about by the consolidation of vendors in the market as well as the
realization that embedding knowledge in processes is a critical success factor in nearly all
KM initiatives (Eppler et al., 1999; Seely, 2002).
So where are KM technologies heading in the next five years? Three generalizations can be made. First, given the enormous focus on business process management (BPM) in recent years and the increasing emphasis on process-based knowledge management, we should expect an increasing alignment of KM technologies/solutions with process management tools. The location of relevant information, re-useable assets and stakeholders (e.g. subject matter experts, sponsors, partners etc.) should be automatic when a process is initiated (Lewis, 2002). Furthermore, business processes will become less and less structured in the future; they may last only a few weeks, spread across organizations, and users will be given powerful tools to create, adjust and dismantle their processes (Seely, 2002). Second, there is the emergence of personal networks in society. These personal networks manifest in the form of personal knowledge grids (where an individual can coordinate an array of resources to support the capturing and sharing of knowledge at the personal level), social networks (for which there are already abundant tools to help identify the concentration and flow of knowledge), and personal applications (software applications developed/selected by an individual to support his/her daily work tasks; these applications can operate independently as well as in conjunction with enterprise KM applications) (Tsui, 2002). Third, KM will become more and more "on-demand" (or "just-in-time"). Large-scale, long-term KM programs will still exist but, at the same time, organizations realize that they need to become more agile and adaptive in order to capitalize on strategic opportunities (Lewis, 2002; Morey, 2001; Davenport and Glaser, 2002; Snowden, 2002; Snowden, 2003). Increasingly, KM technologies will operate on infrastructures, both technical and content-wise, that support the rapid deployment of relevant tools and systems for ad hoc, intensive and inter-organizational collaborations. Some of these tools are now available and gaining popularity in the market, notably peer-to-peer (P2P) collaboration tools (Tsui, 2002), information retrieval and filtering tools, personal voice over IP (VoIP) communication systems, and taxonomy tools that group relevant e-mail messages, documents and contact names on desktops.
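To make the last point concrete, the sketch below shows one way such a desktop taxonomy tool might group related items by keyword overlap. All item names, keywords and the similarity threshold are invented for illustration and do not describe any particular product.

```python
# Minimal sketch of keyword-overlap grouping, as a desktop taxonomy tool
# might cluster e-mails, documents and contacts. Items, keywords and the
# 0.25 similarity threshold are all hypothetical.
items = {
    "email: RFP response draft": {"rfp", "proposal", "pricing"},
    "doc: pricing_model.xls":    {"pricing", "cost", "rfp"},
    "contact: Jane (legal)":     {"contract", "legal"},
    "doc: contract_v2.doc":      {"contract", "legal", "rfp"},
}

def jaccard(a, b):
    """Similarity of two keyword sets: |intersection| / |union|."""
    return len(a & b) / len(a | b)

# Greedy single-pass clustering: place each item in the first group
# whose seed keywords it resembles; otherwise start a new group.
groups = []  # list of (seed_keywords, [item_names])
for name, keywords in items.items():
    for seed, members in groups:
        if jaccard(seed, keywords) >= 0.25:
            members.append(name)
            break
    else:
        groups.append((set(keywords), [name]))
```

Here the RFP e-mail and the pricing spreadsheet fall into one group, and the legal contact and contract document into another; a production tool would use richer text features, but the grouping principle is the same.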

3. Papers in the special issue


Altogether nine papers have been accepted for publication in this special issue. The
collective themes of these papers certainly reflect and reinforce the observations on the past
and current trend of KM technologies. These papers critically explore the changing role of
technologies for KM, the embodiment of knowledge in business process management and
execution, social network analyses and artificial intelligence reasoning systems, the nature
of inter-organizational collaborations, and the fusion between local and global knowledge in
multi-national corporations.
Chi and Holsapple presented a comprehensive framework that identifies the various types of inter-organizational collaboration systems. While most existing work addresses the technological and relationship aspects, their model focuses on the types of knowledge sharing, participative decision-making and conflict governance that underpin inter-organizational collaborations. The findings of the paper shed light on a deeper understanding of the processes and learning involved in such collaborations and, ultimately, lead to better alignment and choice of technologies to support different types of inter-organizational collaborations.

Through the use of a case study on the project management and R&D efforts of a large
multi-national corporation, Davis explores the challenges and solutions involved in fostering
knowledge sharing and management aspects at the global and local levels of the
organization. Some of the key issues include the generation of knowledge at the local level,
the verification and passing on such knowledge from the local to the global level, and
maintaining the local-global divide yet ensuring that the knowledge is up to date and not
duplicated. In his study, he has found four generic roles in the organization that are
instrumental to the success in maintaining the local-global divide of knowledge.
Edwards, Shaw and Collier conducted workshops with ten organizations to ascertain their
KM initiatives/systems. Their key findings from these workshops are that only three of the ten organizations have a technology-driven KM program and that, even so, organizations are utilizing general IT tools (e.g. e-mail, bulletin boards, information databases) to support KM initiatives rather than KM-specific technologies. Other issues unveiled include the tension
between decentralization and centralization of IT decisions and the contrast between
providing (pushing) information to users and users requesting (pulling) information from
databases and repositories.
The role of technologies in KM has always been a debatable topic, both in academia and
industry. As mentioned earlier, the general perception is that technology was a driver in
many of the KM projects in the late 1990s but nowadays organizations are treating the
process and people aspects as critical success factors in any KM initiatives. Holsapple,
again, presented another paper examining the critical role of computer-based technology
(CBT) in KM. He argued that both the inclusive and exclusive perspectives, which separate knowledge from information, completely ignore or under-estimate the contributions of CBT to KM. He further proposed a third perspective, which subdivides the representation and processing of various types of knowledge by a computer system. Through this new
perspective, which is further substantiated by observations with several renowned
e-business/commerce systems, one can gain a stronger appreciation of how CBT can
add value to KM.
The majority of KM systems in use focus on capturing, searching, and distributing knowledge (e.g. search engines, portals, collaboration systems, intellectual capital reporting tools). Tools/systems that foster the accumulation of social capital are rare. Iske and Boersma remedy this imbalance by outlining a proprietary question-answering system that not only facilitates the encapsulation of core knowledge into a repository but also links users with subject matter experts, thereby reducing ongoing help desk support costs and time. Furthermore, the authors also discuss the cultural issues and impact of the deployment of their system, as well as develop a detailed quantitative model that computes and ranks the importance of knowledge in the entire knowledge value chain of an organization.
A knowledge audit is a technique often applied by organizations to ascertain what knowledge the organization already has and what else is needed to accomplish corporate objectives. Social network analysis (SNA) is a key step in any knowledge audit. Liebowitz describes the analytic hierarchy process (AHP), a technique that can be used to measure the requirements and preferences of an individual or department in the network. Findings from using the AHP technique can, for example, be used to better channel relevant information to individuals, help understand decision-making in an organization, and contribute to better process design and management. A comprehensive list of SNA tools is also outlined in his paper.
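As a rough illustration of the AHP technique mentioned above, the following sketch derives priority weights from a pairwise-comparison matrix using the row geometric-mean approximation to Saaty's eigenvector method. The matrix values and the "knowledge source" framing are hypothetical and are not taken from Liebowitz's paper.

```python
import math

def ahp_priorities(pairwise):
    """Derive AHP priority weights from a pairwise-comparison matrix
    using the row geometric-mean approximation."""
    n = len(pairwise)
    gmeans = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Hypothetical example: rank three knowledge sources by importance.
# pairwise[i][j] = how much more important source i is than source j
# (Saaty's 1-9 scale; reciprocals fill the lower triangle).
matrix = [
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
]
weights = ahp_priorities(matrix)  # normalized priorities summing to 1
```

In a knowledge-mapping exercise, weights like these could be combined with SNA centrality scores to decide where information flows should be strengthened first.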
Information technology can accomplish a lot more than merely storing and retrieving data. Over the decades, advancements in artificial intelligence and other information processing techniques have led to the verification and generalization of stored data, as well as the discovery of new actionable knowledge. Lau et al. demonstrated, via a research prototype, the use of a hybrid neural network and online analytical processing (OLAP) algorithm to capture and process procurement data and generate recommendations for suitable supplier(s) in an online supply chain network. This application clearly demonstrates and reinforces the trend of embedding data and discovering new knowledge in business processes that spread across multiple organizations.
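The Lau et al. prototype itself is not reproduced here; as a toy sketch of the general idea, the code below aggregates procurement records per supplier (an OLAP-style roll-up) and then ranks suppliers with a single fixed neuron. All supplier names, attributes, weights and data are invented for illustration.

```python
from collections import defaultdict
import math

# Hypothetical procurement records: (supplier, on_time 0/1, defect_rate, unit_cost)
orders = [
    ("A", 1, 0.01, 9.5),
    ("A", 1, 0.03, 9.8),
    ("B", 0, 0.02, 8.9),
    ("B", 1, 0.05, 9.1),
]

# OLAP-style roll-up: average each measure per supplier.
agg = defaultdict(lambda: [0.0, 0.0, 0.0, 0])
for supplier, on_time, defects, cost in orders:
    a = agg[supplier]
    a[0] += on_time; a[1] += defects; a[2] += cost; a[3] += 1
features = {s: (a[0] / a[3], a[1] / a[3], a[2] / a[3]) for s, a in agg.items()}

def score(on_time, defects, cost, w=(2.0, -30.0, -0.1), b=0.0):
    """Single-neuron scorer: weighted sum of features through a sigmoid.
    The weights here are fixed by hand; a real system would learn them."""
    z = w[0] * on_time + w[1] * defects + w[2] * cost + b
    return 1.0 / (1.0 + math.exp(-z))

ranking = sorted(features, key=lambda s: score(*features[s]), reverse=True)
```

The roll-up step stands in for the OLAP component and the scorer for the neural component; the point is only that aggregated process data can feed a learned recommendation directly inside a cross-organizational procurement process.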
In the pursuit of becoming a real time enterprise (RTE), many organizations have invested heavily in KM and related technologies. Unfortunately, based on experience gained in the last ten years, most of these organizations have failed and very few have become truly real time enterprises. Malhotra critically explored the dichotomy between the technology-push and strategy-pull approaches in organizational investments in KM technologies. With justifications and case studies, the paper outlined the reasons why the form and function of an RTE, and the contrast between the ends and means of achieving performance outcomes, hold the key to understanding why some organizations are so successful in harnessing embedded knowledge in their business models and processes, both at the intra- and inter-organizational levels.
Turning again to business processes, Lock Lee focused on the possible synergy between business processes and business practice in an organizational setting. Business processes are normally centrally defined and structured. Business practice, on the other hand, occurs at the operational level and involves a lot of tacit knowledge. Furthermore, on many occasions, complex decisions are also needed when executing a process. Lock Lee proposed a two-cycle model of interaction based on communities of practice to foster an appropriate mix of process and practice to achieve optimal organizational performance. Once again, this paper reinforces the importance of embedding and sharing knowledge in business processes, and sheds light on how to align existing KM tools to support and enhance decision-making in organizations at the process level.

References
Davenport, T.H. and Glaser, J. (2002), ‘‘Just-in-time delivery comes to knowledge management’’,
Harvard Business Review, Vol. 80 No. 7, pp. 5-9.
Eppler, M.J., Siefried, P.M. and Ropnack, A. (1999), "Improving knowledge-intensive processes through an enterprise knowledge medium", SIGCPR '99, pp. 222-30.

Lewis, B. (2002), ‘‘On demand KM: a two-tier architecture’’, IT Professional, Vol. 4 No. 1, pp. 27-33.
Morey, D. (2001), ‘‘High-speed knowledge management: integrating operations theory and knowledge
management for rapid results’’, Journal of Knowledge Management, Vol. 5 No. 4, pp. 322-8.
Seely, C.P. (2002), ‘‘Igniting knowledge in your business processes’’, KM Review, Vol. 5 No. 4, pp. 12-15.
Snowden, D. (2002), ‘‘Just-in-time knowledge management: part 1’’, KM Review, Vol. 5 No. 5, pp. 14-17.

Snowden, D. (2003), ‘‘The knowledge you need, right when you need it’’, KM Review, Vol. 5 No. 6,
pp. 24-7.
Tsui, E. (2002), "Technologies for personal and peer-to-peer knowledge management", CSC Leading Edge Forum Technology Grant Report, May, available at: www.csc.com/aboutus/lef/mds67_off/index.shtml#grants

Integrating knowledge management
technologies in organizational business
processes: getting real time enterprises to
deliver real business performance
Yogesh Malhotra

Dr Yogesh Malhotra serves on the Faculty of Management Information Systems at Syracuse University and has taught in the executive education programs at the Kellogg School of Management and Carnegie Mellon University. He is the founding chairman of BRINT Institute, LLC, the New York based, internationally recognized research and advisory company. His corporate and national knowledge management advisory engagements include organizations such as Philips (The Netherlands), United Nations (New York City Headquarters), Intel Corporation (USA), National Science Foundation (USA), British Telecom (UK), Conference Board (USA), Maeil Business Newspaper and TV Network (South Korea), Ziff Davis, Government of Mexico, Government of The Netherlands, and the Federal Government of the USA. He can be contacted at: www.yogeshmalhotra.com; e-mail: yogesh.malhotra@brint.com

Abstract
Purpose – To provide executives and scholars with a pragmatic understanding of integrating knowledge management strategy and technologies in business processes for successful performance.
Design/methodology/approach – A comprehensive review of theory, research, and practices on knowledge management develops a framework that contrasts existing technology-push models with proposed strategy-pull models. The framework explains how the "critical gaps" between technology inputs, related knowledge processes, and business performance outcomes can be bridged for the two types of models. Illustrative case studies of real-time enterprise (RTE) business model designs for both successful and unsuccessful companies are used to provide real world understanding of the proposed framework.
Findings – Suggests the superiority of strategy-pull models, made feasible by new "plug-and-play" information and communication technologies, over the traditional technology-push models. The critical importance of strategic execution in guiding the design of enterprise knowledge processes, as well as the selection and implementation of related technologies, is explained.
Research limitations/implications – Given the limited number of cases, the framework is based on real world evidence about companies most popularized for real time technologies by some technology analysts. This limited sample helps in understanding the caveats in analysts' advice by highlighting the critical importance of strategic execution over the selection of specific technologies. However, the framework needs to be tested with multiple enterprises to determine the contingencies that may be relevant to its application.
Originality/value – The first comprehensive analysis relating knowledge management and its integration into enterprise business processes for achieving the agility and adaptability often associated with "real time enterprise" business models. It constitutes critical knowledge for organizations that must depend on information and communication technologies for increasing strategic agility and adaptability.

Keywords Knowledge management, Real time scheduling, Business performance, Return on investment
Paper type Research paper

Introduction

Technologists never evangelize without a disclaimer: "Technology is just an enabler." True enough – and the disclaimer discloses part of the problem: enabling what? One flaw in knowledge management is that it often neglects to ask what knowledge to manage and toward what end. Knowledge management activities are all over the map: building databases, measuring intellectual capital, establishing corporate libraries, building intranets, sharing best practices, installing groupware, leading training programs, leading cultural change, fostering collaboration, creating virtual organizations – all of these are knowledge management, and every functional and staff leader can lay claim to it. But no one claims the big question: why? (Tom Stewart in "The Case Against Knowledge Management", Business 2.0, February 2002).

Constructive comments offered by the special issue Editor Eric Tsui and the two anonymous reviewers are gratefully acknowledged.

The recent summit on knowledge management (KM) at the pre-eminent ASIST conference opened on a rather upbeat note. The preface noted that KM has evolved into a mature reality from what was merely a blip on the "good idea" radar only a few years ago. Growing

DOI 10.1108/13673270510582938 VOL. 9 NO. 1 2005, pp. 7-28, Q Emerald Group Publishing Limited, ISSN 1367-3270 j JOURNAL OF KNOWLEDGE MANAGEMENT j PAGE 7

pervasiveness of KM in worldwide industries, organizations, and institutions marks a watershed event for what was called a fad just a few years ago. KM has become
embedded in the policy, strategy, and implementation processes of worldwide
corporations, governments, and institutions. Doubling in size from 2001, the global KM
market has been projected to reach US$8.8 billion during this year. Likewise, the market for
KM business application capabilities such as CRM (Malhotra, 2004a) is expected to grow
to $148 billion by the next year. KM is also expected to help save $31 billion in annual
re-invention costs at Fortune 500 companies. The broader application context of KM,
which includes the learning, education, and training industries, offers similarly sanguine forecasts. Annual public K-12 education spending is estimated at $373 billion in the US alone, with higher education accounting for $247 billion. In addition, annual corporate and government training expenditures in the US alone are projected at over $70 billion.
One can see the impact of knowledge management everywhere but in the KM
technology-performance statistics (Malhotra, 2003). This seems like a contradiction of
sorts given the pervasive role of information and communication technologies in most KM
applications. Some industry estimates have pegged the failure rate of technology
implementations for business process reengineering efforts at 70 percent. Recent
industry data suggest a similar failure rate of KM related technology implementations and
related applications (Darrell et al., 2002). Significant failure rates persist despite
tremendous improvements in sophistication of technologies and major gains in related
price-performance ratios. At the time of writing, technology executives are facing a
renewed credibility crisis resulting from cost overruns and performance problems for
major implementations (Anthes and Hoffman, 2003). In a recent survey by the Hackett Group, 45 percent of CIOs attributed these problems to technology implementations being
too slow and too expensive. Interestingly, just a few months ago, some research studies
had found negative correlation between tech investments and business performance
(Alinean, 2002; Hoffman, 2002). Financial performance analysis of 7,500 companies
relative to their IT spending and individual surveys of more than 200 companies had
revealed that:
- companies with the best-performing IT investments are often the most frugal IT spenders;
- the top 25 performers invested 0.8 percent of their revenues in IT, in contrast to an overall average of 3.7 percent; and
- the highest IT spenders typically under-performed by up to 50 percent compared with best-in-class peers.
Based upon multi-year macroeconomic analysis of hundreds of corporations, Strassmann
(1997) had emphasized that it is not computers but what people do with them that matters.
He had further emphasized the role of users’ motivation and commitment in IT
performance[1]. Relatively recent research on implementation of enterprise level KMS
(Malhotra, 1998a; Malhotra and Galletta, 1999; Malhotra and Galletta, 2003; Malhotra and
Galletta, n.d. a; Malhotra and Galletta, n.d. b) has found empirical support for such
socio-psychological factors in determining IT and KMS performance. An earlier study by
Forrester Research had similarly determined that the top-performing companies in terms of
revenue, return on assets, and cash-flow growth spend less on IT on average than other
companies. Surprisingly, some of these high performance ‘‘benchmark’’ companies have
the lowest tech investments and are recognized laggards in adoption of leading-edge

technologies. Research on best performing US companies over the last 30 years (Collins,
2001) has discovered similar ‘‘findings’’. The above findings may seem contrarian given
persistent and long-term depiction of technology as enabler of business productivity (cf.
Brynjolfsson, 1993; Brynjolfsson and Hitt, 1996; Brynjolfsson and Hitt, 1998; Kraemer, 2001).
Despite increasing sophistication of KM technologies, we are observing increasing failures
of KM technology implementations (Malhotra, 2004b). The following sections discuss how
such failures result from the knowledge gaps between technology inputs, knowledge
processes, and business performance. Drawing upon theory, prior research, and industry
case studies, we also explain why some companies that spend less on technology and are
not leaders in adoption of most hyped RTE technologies succeed where others fail. The
specific focus of our analyses is on the application of KM technologies in organizational
business processes for enabling real time enterprise business models. The RTE is
considered the epitome of the agile, adaptive, and responsive enterprise capable of
anticipating surprise; hence our attempt to reconcile its sense making and information
processing capabilities is all the more interesting. However, our theoretical generalizations
and their practical implications are relevant to IT and KM systems in most enterprises
traversing through changing business environments.

Disconnects between disruptive information technologies and relevant knowledge


Organizations have managed knowledge for centuries. However, the popular interest in
digitizing business enterprises and knowledge embedded in business processes dates
back to 1993[2]. Around this time, the Business Week cover story on virtual corporations
(Byrne, 1993) heralded the emergence of the new model of the business enterprise. The new
enterprise business model was expected to make it possible to deliver anything, anytime,
and, anywhere to potential customers. It would be realized by digitally connecting
distributed capabilities across organizational and geographical boundaries. Subsequently,
the vision of the virtual, distributed, and digitized business enterprise became a pragmatic
reality with the mainstream adoption of the internet and web. Incidentally, the distribution and
digitization of enterprise business processes were expedited by the evolution of technology
architectures from the mainframe to client-server to the internet and web, and more
recently to web services. Simultaneously, the software and hardware paradigms have
evolved to integrated hosted services and more recently to utility computing and on demand
computing (Greenemeier, 2003a, b; Hapgood, 2003; Sawhney, 2003; Thickins, 2003)
models. Organizations with legacy enterprise business applications trying to catch up with
the business technology shifts have ended up with disparate islands of diverse
technologies.

Decreasing utility of the technology-push model


Management and coordination of diverse technology architectures, data architectures, and
system architectures poses obvious knowledge management challenges (Malhotra, 1996;
Malhotra, 2001a; Malhotra, 2004b). Such challenges result from the need for integrating
diverse technologies, computer programs, and data sources across internal business
processes. These challenges are compounded manifold by the concurrent need for
simultaneously adapting enterprise architectures to keep up with changes in the external
business environment. Often such adaptation requires upgrades and changes in existing
technologies or their replacement with newer technologies. Ongoing business enterprises

often have too much (unprocessed) data and (processed) information and too many
technologies. However, for most high-risk and high-return strategic decisions, timely
information is often unavailable as more and more of such information is external in nature
(Drucker, 1994; Malhotra, 1993; Terreberry, 1968; Emery and Trist, 1965). Also, internal
information may often be hopelessly out of date with respect to evolving strategic needs.
Cycles of re-structuring and downsizing often leave little time or attention to ensure that the
dominant business logic is kept in tune with changing competitive and strategic needs.
As a result, most organizations of any size and scope are caught in a double whammy of
sorts. They do not know what they know. In simple terms, they have incomplete
knowledge of explicit and tacit data, information, and decision models available within
the enterprise. Also, their very survival may sometimes hinge on obsolescing what they
know (see for instance, Yuva, 2002; Malhotra, 2004b; Malhotra, 2002c). In other words,
often they may not know if the available data, information, and decision models are
indeed up to speed with the radical discontinuous changes in the business environment
(Arthur, 1996; Malhotra, 2000a; Nadler and Shaw, 1995). In this model, incomplete and
often outdated data, information, and decision models drive the realization of the
strategic execution, but with diminishing effectiveness. The model may include reactive
and corrective feedback loops. The logic for processing specific information and
respective responses are all pre-programmed, pre-configured, and pre-determined. The
mechanistic information-processing orientation of the model generally does not
encourage diverse interpretations of information or the possibility of multiple responses to
the same information. As depicted in Figure 1, this model of KM is often driven by
technological systems that are out-of-alignment with strategic execution and may be
characterized as the technology-push model. This model has served the needs of
business performance given more manageable volumes of information and a lesser variety
of systems within a relatively certain business environment. However, with the recent
unprecedented growth in volumes of data and information, the continuously evolving
variety of technology architectures, and the radically changing business environment,
this model has outlived its utility. The limitations of the technology-push model are
evident in the following depiction of IT architectures as described in Information Week by
LeClaire and Cooper (2000):
The infrastructure issue is affecting all businesses . . . E-business is forcing companies to
rearchitect all or part of their IT infrastructures – and to do it quickly. For better or worse, the
classic timeline of total business-process reengineering – where consultants are brought in,
models are drawn up, and plans are implemented gradually over months or years – just isn’t fast
enough to give companies the e-commerce-ready IT infrastructures they need . . . Many
companies can’t afford to go back to the drawing board and completely rearchitect critical

Figure 1 How ICT systems drive and constrain strategic execution

systems such as order fulfillment and product databases from the bottom up because they
greatly depend on existing infrastructure. More often, business-process reengineering is done
reactively. Beyond its disruptive effect on business operations, most IT managers and executives
don’t feel there’s enough time to take a holistic approach to the problem, so they attack tactical
issues one-by-one. Many companies tackle a specific problem with a definitive solution rather
than completely overhaul the workflow that spans from a customer query to online catalogs to
order processing.

Strategic execution: the real driver of business performance


The gap between IT and business performance has grown with the shifting focus of business
technology strategists and executives. Over the past two decades, their emphasis has
shifted from IT (Porter and Millar, 1985; Hammer 1990) to information (Evans and Wurster,
2002; Rayport and Sviokla, 1995; Hopper, 1990; Huber, 1993; Malhotra, 1995) to knowledge
(Holsapple and Singh, 2001; Holsapple, 2002; Koenig and Srikantaiah, 2000a; Malhotra,
2004b; Malhotra, 2000b; Malhotra, 1998c) as the lever of competitive advantage. At the time
of the writing, technology sales forecasts are gloomy because of the distrust of business
executives who were previously oversold on the capabilities of technologies to address real
business threats and opportunities. This follows on the heels of the on-and-off love-hate
relationship of the old economy enterprises and media analysts with the new economy
business models over the past decade. We first saw unwarranted wholesale adulation and
subsequently wholesale decimation of technology stocks. All the while, many industry
executives and most analysts have incorrectly presumed or pitched technology as the
primary enabler of business performance (Collins, 2001; Schrage, 2002)[3].
The findings from the research (Collins, 2001) on best performing companies over the last
three decades are summarized in Table I. These findings are presented in terms of the
inputs-processing-outcomes framework used for contrasting the technology-push model
with the strategy-pull model of KM implementation[4]. Subsequent discussion will further
explain the relative advantages of the latter in terms of strategic execution and business
performance. Given the latest advances in web services, the strategic framework of KM
discussed here presents a viable alternative for delivering business performance as well as
enterprise agility and adaptability (Strassmann, 2003).

Will the real knowledge management please stand up?


The technology evangelists, criticized by Stewart (2000), have endowed the KM
technologies with an intrinsic and infallible capability of getting the right information to the
right person at the right time. Similar critiques (cf. Malhotra, 2000a; Hildebrand, 1999) have
further unraveled and explained the ‘‘myths’’ associated with such proclamations made by the
technology evangelists. Specifically, it has been underscored that in wicked business
environments (Churchman, 1971; Malhotra, 1997) characterized by radical discontinuous
change (Malhotra, 2000a; Malhotra, 2002b), the deterministic and reductionist logic (Odom
and Starns, 2003) of the evangelists does not hold. Incidentally, most high potential business
opportunities and threats are often embedded within such environments (Arthur, 1996;
Malhotra, 2000c; Malhotra, 2000d). Such environments are characterized by fundamental
and ongoing changes in technologies as well as the strategic composition of market forces.
Increasing failure rates of KM technologies often result from their rapid obsolescence given
changing business needs and technology architectures. Popular re-labeling by vendors of
many information technologies as KM technologies has not helped the situation. Skeptics of

Table I Strategic execution as driver of technology deployment and utilization: lessons from
companies that achieved high business performance

Lessons learned from some of the most successful business enterprises that distinguished
themselves by making the leap from ‘‘good to great’’ (Collins, 2001)

Lessons about outcomes: strategic execution, the primary enabler


(1) How a company reacts to technological change is a good indicator of its inner drive for greatness
versus mediocrity. Great companies respond with thoughtfulness and creativity, driven by a
compulsion to turn unrealized potential into results; mediocre companies react and lurch about,
motivated by fear of being left behind
(2) Any decision about technology needs to fit directly with three key non-technological questions:
What are you deeply passionate about? What can you be the best in the world at? What drives your
economic engine? If a technology does not fit squarely within the execution of these three core
business issues, the good-to-great companies ignore all hype and fear and just go about their
business with a remarkable degree of equanimity
(3) The good-to-great companies understood that doing what you are good at will only make you
good; focusing solely on what you can potentially do better than any other organization is the only
path to greatness

Lessons about processing: how strategic execution drives technology utilization


(1) Thoughtless reliance on technology is a liability, not an asset. When used right – when linked to a
simple, clear, and coherent concept rooted in deep understanding – technology is an essential
driver in accelerating forward momentum. But when used wrongly – when grasped as an easy
solution, without deep understanding of how it links to a clear and coherent concept – technology
simply accelerates your own self-created demise
(2) No evidence was found that good-to-great companies had more or better information than the
comparison companies. In fact both sets of companies had identical access to good information.
The key, then, lies not in better information, but in turning information into information that cannot
be ignored
(3) 80 percent of the good-to-great executives did not even mention technology as one of the top five
factors in their transition from good-to-great. Certainly not because they ignored technology: they
were technologically sophisticated and vastly superior to their comparisons
(4) A number of the good-to-great companies received extensive media coverage and awards for
their pioneering use of technology. Yet the executives hardly talked about technology. It is as if the
media articles and the executives were discussing two totally different sets of companies!

Lessons about technology inputs: how strategic execution drives technology deployment
(1) Technology-induced change is nothing new. The real question is not What is the role of technology?
Rather, the real question is How do good-to-great organizations think differently about
technology?
(2) It was never technology per se, but the pioneering application of carefully selected technologies.
Every good-to-great company became a pioneer in the application of technology, but the
technologies themselves varied greatly
(3) When used right, technology becomes an accelerator of momentum, not a creator of it. The
good-to-great companies never began their transitions with pioneering technology, for the simple
reason that you cannot make good use of technology until you know which technologies are
relevant
(4) You could have taken the exact same leading-edge technologies pioneered at the good-to-great
companies and handed them to their direct comparisons for free, and the comparisons still would
have failed to produce anywhere near the same results

technology have observed that real knowledge is created and applied in the processes of
socialization, externalization, combination, and internalization (Nonaka and Takeuchi, 1995)
and outside the realm of KM technologies. Practitioners’ inability to harness relevant
knowledge despite KM technologies and offices of the CKOs caused the backlash and KM
was temporarily branded as a fad. Scholarly research on latest information systems and
technologies, or lack thereof, has further contributed to the confusion between data
management, information management, and knowledge management.

Recent reviews of theory and research on information systems and KM (Alavi and Leidner,
2001; Schultze and Leidner, 2002) seem to confirm Stewart’s (2000) observation about the
key flaw of knowledge management:
Knowledge management activities are all over the map . . . But no one asks the big question:
why?

Hence, it is critical that a robust distinction between technology management and
knowledge management should be based on theoretical arguments that have been tested
empirically in the ‘‘real world messes’’ (Ackoff, 1979) and the ‘‘world of re-everything’’
(Arthur, 1996). We are observing the diminishing credibility of information technologists (Anthes
and Hoffman, 2003; Hoffman, 2003; Carr, 2003). A key reason for this is an urgent need for
understanding how technologies, people, and processes together influence business
performance (Murphy, 2003). Explicit focus on strategic execution as the driver of
technology configurations in the strategy-pull KM framework reconciles many of the above
problems. The evolution of technology architectures toward on-demand plug-and-play
inter-enterprise business process networks (Levitt, 2001) is expected to facilitate future
realization of KM value networks. Growing popularity of the web services architecture
(based upon XML, UDDI, SOAP, WSDL) is expected to support the realization of real-time
deployment of business performance driven systems based upon the proposed model
(Kirkpatrick, 2003; Zetie, 2003; Murphy, 2003).
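The web services stack named above can be made concrete with a small sketch. The following Python fragment, purely illustrative, builds a SOAP 1.1 request envelope for a hypothetical `getInventoryLevel` operation; the service namespace, operation, and SKU are invented for illustration, and in a real deployment a WSDL description and UDDI discovery would sit alongside such a message:

```python
import xml.etree.ElementTree as ET

# SOAP 1.1 envelope namespace; the service namespace below is a
# hypothetical placeholder, not a real registered service.
SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
SVC_NS = "http://example.com/inventory"  # illustrative only

def build_soap_request(sku):
    """Build the XML envelope for a hypothetical getInventoryLevel call."""
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    operation = ET.SubElement(body, f"{{{SVC_NS}}}getInventoryLevel")
    ET.SubElement(operation, f"{{{SVC_NS}}}sku").text = sku
    return ET.tostring(envelope, encoding="unicode")

print(build_soap_request("SKU-42"))
```

The point of the sketch is the plug-and-play idea itself: because the message is self-describing XML, any endpoint that understands the agreed schema can process it, regardless of the platform behind it.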
The technology-push model accounts for the inputs- and processing-driven KM
implementations with emphasis on pushing data, information, and decisions. In contrast, the
strategy-pull model recognizes that getting pre-programmed information to pre-determined
persons at the pre-specified time may not by itself ensure business performance. Even if
pre-programmed information does not become out-dated, the recipient’s attention and
engagement with that information is at least equally important. Equally important is the
reflective capability of the recipient to determine if novel interpretation of the information is
necessary or if consideration of novel responses is in order given external changes in the
business environment. The technology-push model relies upon single-loop automated and
unquestioned automatic and pre-programmed response to received stimulus. In contrast,
the strategy-pull model has a built-in double-loop process that can enable a true
sense-and-respond paradigm of KM[5]. The focus of the technology-push model is on
mechanistic information processing while the strategy-pull model facilitates organic sense
making (Malhotra, 2001b). The distinctive models of knowledge management have been
embedded in KM implementations of most organizations since KM became fashionable. For
instance, the contrast between the models can be illustrated by comparing the fundamental
paradigm of KM guiding two organizations: a US global communications company and a
US global pharmaceutical firm. The telecommunications company adopted the mechanistic
information- and processing-driven paradigm of KM (Stewart and Kaufman, 1995):
What’s important is to find useful knowledge, bottle it, and pass it around.

In contrast, given their emphasis on insights, innovation, and creativity, the pharmaceutical
company adopted the organic sense-making model of KM (Dragoon, 1995, p. 52):
There’s a great big river of data out there. Rather than building dams to try and bottle it all up into
discrete little entities, we just give people canoes and compasses.

The former model enforces top-down compliance and control through delivery of
institutionalized information and decision models. In contrast, the latter model encourages
discovery and exploration for questioning given assumptions and surfacing new insights
(Nonaka and Takeuchi, 1995).
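The single-loop versus double-loop contrast can also be sketched in code. In this minimal, purely illustrative Python sketch (the function names, the inventory scenario, and the thresholds are all invented), the single-loop controller only applies a pre-programmed rule, while the double-loop controller additionally revises the rule itself when feedback suggests the environment has shifted:

```python
def single_loop(demand, reorder_point=100):
    # Pre-programmed response applied as-is; the rule is never questioned.
    return "reorder" if demand > reorder_point else "hold"

def double_loop(demand_history, reorder_point=100):
    # First loop: apply the current rule to the latest signal.
    action = "reorder" if demand_history[-1] > reorder_point else "hold"
    # Second loop: question the governing assumption itself when sustained
    # feedback suggests the environment has shifted (threshold illustrative).
    average = sum(demand_history) / len(demand_history)
    if average > 1.5 * reorder_point:
        reorder_point = int(average)  # revise the rule, not just the response
    return action, reorder_point

print(single_loop(120))                   # -> reorder
print(double_loop([150, 160, 170, 180])) # -> ('reorder', 165)
```

The point of the second loop is not the arithmetic but the capability it models: questioning the governing assumption rather than merely executing the pre-specified response.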

Real time strategic execution: the real enabler of the RTE


The issues of technology deployment, technology utilization, and business performance
need to be addressed together to ensure that technology can deliver upon the promise of
business performance. Interestingly, most implementations of KM systems motivated by the
technology-push model have inadvertently treated business performance as a residual:
what remains after issues of technology deployment and utilization are addressed[6]. This

perhaps explains the current malaise of IT executives and IT management in not being able
to connect with business performance needs (Hoffman, 2003). A sense-and-respond KM
system that can respond in real time would need to consider the holistic and collective effect
of:
B real-time deployment in terms of tech and human infrastructure (inputs);
B real-time utilization in terms of what is done about or with information (processing); and
B real-time performance in terms of how it delivers business performance (outcomes).
Deployment of intranets, extranets, or groupware cannot by itself deliver business
performance. These technologies would need to be adopted and appropriated by the
human users, integrated within their respective work-contexts, and effectively utilized while
being driven by the performance outcomes of the enterprise. To deliver real-time response,
business performance would need to drive the information needs and technology
deployment needs. This is in congruence with the knowledge management logic of the top
performing companies discussed earlier. These enterprises may not have created the buzz
about the latest technologies. However, it is unquestionable that these best performing
organizations harnessed organizational and inter-organizational knowledge embedded in
business processes most effectively to deliver top-of-the-line results. The old model of
technology deployment spanning months or even years often resulted in increasing
misalignment with changing business needs. Interestingly, the proposed model turns the
technology-push model on its head. The strategy-pull model illustrated in Figure 2 treats
business performance not as the residual but as the prime driver of information utilization as
well as IT-deployment.
The contrast between the inputs-processing-output paradigms of KM implementations is
further explained in the following section to bridge the existing gaps in KM research and
practice.

Gaps in KM implementation research and practice


The ‘‘knowledge application gap’’ that is characteristic of the inputs- and processing-driven
technology-push model has also been the subject of criticism in scholarly research on KM
(Alavi and Leidner, 2001; Zack, 2001). However, these gaps seem to persist across most of
theoretical research and industry practices related to information systems and knowledge
management as shown in Table II. As discussed in Malhotra (2000a), such gaps have
persisted over the past decade despite advances in understanding of KM and
sophistication of technology architectures.

Figure 2 Strategic execution – the primary enabler of the RTE business model


Table II Driving KM with business performance: from inputs- and processing-driven KM to
outcomes-driven KM

Additional theoretical and applied definitions of KM are discussed in Malhotra (2000a)

Technology-push models of KM
(Depicted in Figure 1)

Inputs-driven paradigm of KM
‘‘Knowledge management systems (KMS) refer to a class of information systems applied to managing
organizational knowledge. That is, they are IT-based systems developed to support and enhance the
organizational processes of knowledge creation, storage/retrieval, transfer, and application’’ (Alavi
and Leidner, 2001)
‘‘Knowledge management is the generation, representation, storage, transfer, transformation,
application, embedding, and protecting of organizational knowledge’’ (Schultze and Leidner, 2002)
‘‘For the most part, knowledge management efforts have focused on developing new applications of
information technology to support the capture, storage, retrieval, and distribution of explicit
knowledge’’ (Grover and Davenport, 2001)
‘‘Knowledge has the highest value, the most human contribution, the greatest relevance to decisions
and actions, and the greatest dependence on a specific situation or context. It is also the most difficult
of content types to manage, because it originates and is applied in the minds of human beings’’
(Grover and Davenport, 2001)
‘‘Knowledge management uses complex networks of information technology to leverage human
capital. The integration of user-friendly electronic formats facilitates inter-employee and customer
communication; a central requirement for successful KM programs’’ (eMarketer, 2001)
‘‘In companies that sell relatively standardized products that fill common needs, knowledge is
carefully codified and stored in databases, where it can be accessed and used – over and over again
– by anyone in the organization’’ (Hansen and Nohria, 1999)

Processing-driven paradigm of KM
‘‘KM entails helping people share and put knowledge into action by creating access, context,
infrastructure, and simultaneously reducing learning cycles’’ (Massey et al., 2001)
‘‘Knowledge management is a function of the generation and dissemination of information,
developing a shared understanding of the information, filtering shared understandings into degrees of
potential value, and storing valuable knowledge within the confines of an accessible organizational
mechanism’’ (CFP for Decision Sciences special issue on Knowledge Management, 2002)
‘‘In companies that provide highly customized solutions to unique problems, knowledge is shared
mainly through person-to-person contacts; the chief purpose of computers is to help people
communicate’’ (Hansen and Nohria, 1999)

Strategy-pull model of KM
(Depicted in Figure 2)

Outcomes-driven paradigm of KM
‘‘Knowledge Management refers to the critical issues of organizational adaptation, survival and
competence against discontinuous environmental change. Essentially it embodies organizational
processes that seek synergistic combination of data and information-processing capacity of
information technologies, and the creative and innovative capacity of human beings’’ (Malhotra,
1998b)

The sample of ‘‘definitions’’ of KM listed in Table II is not exhaustive but illustrative.


However, it gets the point across about the missing link between KM and business
performance in the research and practice literatures. Despite the lack of agreement on what
KM is, most such interpretations share a common emphasis on the inputs- and
processing-driven technology-push model. Review of most such ‘‘definitions’’ also
leaves one begging for a response to Stewart’s pointed question to technologists’
evangelism about KM: ‘‘why?’’ In contrast, the strategy-pull model with its outcomes-driven
paradigm seems to offer a more meaningful and pragmatic foundation for KM. At least as
far as real world outcomes are concerned, this paradigm measures up to the expectations
about KM policy and its implementation in worldwide organizations[7]. Better
understanding of the gaps that we are trying to reconcile is possible by appreciating


the contrast between the three paradigms of KM implementation that have characterized
the technology-push and strategy-pull models of KM depicted in Figures 1 and 2. This
contrast is explained in terms of their primary and differential focus on the inputs,
processing, and outcomes.
The inputs-driven paradigm considers information technology and KM as synonymous. The
inputs-driven paradigm, with its primary focus on technologies such as digital repositories,
databases, intranets, and groupware systems, has been the mainstay of many KM
implementation projects. Specific choices of technologies drive the KM equation with
primary emphasis on getting the right information technologies in place. However, the
availability of such technologies does not ensure that they positively influence business
performance. For instance, installing a collaborative community platform may neither result
in collaboration nor community (Barth, 2000; Charles, 2002; Verton, 2002). The practitioners
influenced by this paradigm need to review the ‘‘lessons about technology inputs’’ listed
earlier in Table I.
The processing-driven paradigm of KM has its focus on best practices, training and learning
programs, cultural change, collaboration, and virtual organizations. This paradigm
considers KM primarily as means of processing information for various business activities.
Most proponents of RTE belong to this paradigm given their credo of getting the right
information to the right person at the right time. Specific focus is on the activities associated
with information processing such as process redesign, workflow optimization, or automation
of manual processes. Emphasis on processes ensures that relevant technologies are
adopted and possibly utilized in service of the processes. However, technology is often
depicted as an easy solution to achieve some type of information processing with tenuous if
any link to strategic execution needed for business performance. Implementation failures
and cost-and-time overruns that characterize many large-scale technology projects are
directly attributable to this paradigm (Anthes and Hoffman, 2003; Strassmann, 2003). Often
the missing link between technologies and business performance is attributable to choice of
technologies intended to fix broken processes, business models, or organizational cultures.
The practitioners influenced by this paradigm need to review the ‘‘lessons about
processing’’ listed earlier in Table I.
The outcomes-driven paradigm of KM has its primary focus on business performance. Key
emphasis is on strategic execution for driving selection and adaptation of processes and
activities, and carefully selected technologies. For instance, if collaborative community
activities do not contribute to the key customer value propositions or business value
propositions of the enterprise, such activities are replaced with others that are more directly
relevant to business performance (Malhotra, 2002a). If these activities are indeed relevant to
business performance, then appropriate business models, processes, and culture are
grown (Brooks, 1987) as a precursor to acceleration of their performance with the aid of KM
technologies. Accordingly, emphasis on business performance outcomes as the key driver
ensures that relevant processes and activities, as well as related technologies, are adopted,
modified, rejected, replaced, or enhanced in service of business performance. The
practitioners interested in this paradigm need to review the ‘‘lessons about outcomes’’ listed
earlier in Table I.
The contrast between the outcomes-driven strategy-pull model and the inputs- and
processing-driven technology-push model is evident even in the latest incarnation of KM

under the moniker of RTE. Given the confusion between KM and KM technologies that
resulted in the backlash against technology vendors, it is germane to warn of a similar
fate for the proponents of RTE. There is an imperative need for a clear distinction
between the business performance capabilities afforded by the RTE business model and
the technologies that are labeled as RTE technologies. As discussed earlier, success in
strategic execution of a business process or business model may be accelerated with
carefully chosen technologies. However, in absence of good business processes and
business model, even the most sophisticated technologies cannot ensure corporate
survival.

Coming of the real time enterprise: the new knowledge management


The RTE is based upon the premise of getting the right information to the right
people at the right time (Gartner, Inc., 2002) in ‘‘real time’’, i.e. without latency or delay (cf.,
Lindorff, 2002; Lindquist, 2003; Margulius, 2002; Meyer, 2002; Siegele, 2002; Stewart,
2000). Enabling the RTE should lead to faster and better decisions, and enhanced agility
and adaptability. The RTE represents the future of knowledge-enabled business processes,
wherein digitized organizations interact with increasing and relentless speed and any
specific ‘‘event’’ results in a real-time ‘‘response’’. For instance, businesses such as Gillette
and Wal-Mart are trying to minimize the delay between a customer order, its shipment and
the restocking of inventory with the help of radio-frequency identification (RFID) tags, also
known as smart tags (Cuneo, 2003). The proponents of RTE technologies suggest that these
technologies would help companies to learn to adapt, evolve, and survive within increasingly
uncertain business environments. Their rationale still seems to be based on the
technology-push model of KM and may perhaps benefit from recognizing the
strategy-pull model as a complement. One such perspective of RTE (Khosla and Pal,
2002), which does not yet address Stewart’s (2000) big question of ‘‘why?’’ and may benefit
from the focus proposed above, is quoted below:
Real time enterprises are organizations that enable automation of processes spanning different
systems, media, and enterprise boundaries. Real time enterprises provide real time information to
employees, customers, suppliers, and partners and implement processes to ensure that all
information is current and consistent across all systems, minimizing batch and manual processes
related to information. To achieve this, systems for a real time enterprise must be ‘‘adaptable’’ to
change and accept ‘‘change as the process’’.

The RTE will be able to operate at speeds with split-second reaction times that may far
exceed human speeds of gathering and processing of information, analysis, and response
(Meyer, 2002). At least, that is what the proponents of ‘‘RTE technologies’’ such as Khosla
and Pal (2002) claim. Examples of increases in business process velocity that are often
attributed to information technology include the following (Gartner, Inc., 2002):
- trading analytics: from 30 minutes to five seconds;
- airline operations: from 20 minutes to 30 seconds;
- call center inquiries: from eight hours to ten seconds;
- tracking finances: from one day to five minutes;
- supply chain updates: from one day to 15 minutes;
- phone activation: from three days to one hour;
- document transfer: from three days to 45 seconds;
- trade settlement: from five days to one day; and
- build-to-order PCs: from six weeks to one day.
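The spread in these figures is easier to appreciate as speedup factors. A small illustrative calculation (the durations come from the Gartner list above; the code itself is ours):

```python
# Illustrative arithmetic only: converting the Gartner (2002) latency
# figures listed above into implied speedup factors. Durations in seconds.

LATENCIES = {
    "trading analytics": (30 * 60, 5),
    "airline operations": (20 * 60, 30),
    "call center inquiries": (8 * 3600, 10),
    "tracking finances": (24 * 3600, 5 * 60),
    "supply chain updates": (24 * 3600, 15 * 60),
    "phone activation": (3 * 86400, 3600),
    "document transfer": (3 * 86400, 45),
    "trade settlement": (5 * 86400, 86400),
    "build-to-order PCs": (6 * 7 * 86400, 86400),
}

def speedup(before: float, after: float) -> float:
    """How many times faster the accelerated process is."""
    return before / after

for process, (before, after) in LATENCIES.items():
    print(f"{process}: {speedup(before, after):,.0f}x faster")
```

The implied accelerations range from about 5x (trade settlement) to several thousand-fold (call center inquiries, document transfer); as the article argues, none of these numbers by itself says whether the right decisions are being accelerated.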
RTE enterprises would harness everything from radio frequency sensors and smart dust to
global positioning satellites and worker-monitoring software to monitor and control all
processes and activities. There are obvious benefits of the automated event-driven
architectures (Sliwa, 2003) for repetitive, structured, and routine decisions (Malhotra,
2004b). Well-tested business processes may be suitable candidates for acceleration with

VOL. 9 NO. 1 2005 JOURNAL OF KNOWLEDGE MANAGEMENT PAGE 17
automation of manual activities and workflows (Malhotra, 2000d). However, the more critical
problem can be understood in terms of the contrast between the technology-push model
and the strategy-pull model. The programmed logic of the RTE may yield diminishing returns
if environmental change outpaces the assumptions and logic embedded in its computerized
networks. Split-second decisions based upon pre-determined ‘‘rules’’ are efficient as they
follow the single-loop logic and are well suited to repetitive, structured, and routine
decisions. However, when such decisions are made regardless of an obsolescing business
process or business model, the price is paid in terms of effectiveness (Drucker, 1994; Yuva,
2002). High-risk or high-return situations require reflection and re-thinking, as the meaning of
information could change and previously non-existent responses become feasible. This is
particularly applicable in contexts within which creativity and innovation facilitate
emergence of new meaning, insights, and actions. Such complex meaning making and
sense making capabilities for anticipating the unforeseen are yet unavailable in existing
technologies (cf., Wolpert, 2001)[8].
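The single-loop character of such event-driven automation can be sketched as a minimal rule engine. This is our hypothetical illustration (the event kinds, rules, and thresholds are invented, not any vendor's actual RTE platform):

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Event:
    kind: str        # e.g. "inventory_low" or "order_received"
    payload: dict = field(default_factory=dict)

# Single-loop logic: each rule pairs a pre-determined condition with a
# pre-determined response. The engine can fire in "real time", but it
# never questions whether the rules themselves are still valid.
Rule = tuple[Callable[[Event], bool], Callable[[Event], str]]

RULES: list[Rule] = [
    (lambda e: e.kind == "inventory_low" and e.payload.get("units", 0) < 100,
     lambda e: f"reorder {e.payload['sku']}"),
    (lambda e: e.kind == "order_received",
     lambda e: f"ship {e.payload['sku']}"),
]

def respond(event: Event) -> list[str]:
    """Fire every matching rule; events outside the rules fall through silently."""
    return [action(event) for condition, action in RULES if condition(event)]

# A demand collapse that makes reordering pointless still triggers "reorder":
# the double-loop question (is this rule still right?) is never asked.
print(respond(Event("inventory_low", {"sku": "SKU-42", "units": 40})))
```

The efficiency and the fragility come from the same place: the responses are split-second precisely because the assumptions behind them are frozen into the rules.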

RTE business models: function should drive the choice of form


Successful RTE enterprises focus primarily on the function of the business model that guides
the choice of the infrastructure form for accelerating strategic execution. Unsuccessful RTE
enterprises, in contrast, often meet their fate because of the misplaced belief that form could
somehow compensate for the inadequacy of the function. Successful RTE business models
may be apparent in virtual companies such as eBay that owe most of their functioning to
social capital embedded in their users, buyers, and sellers. Successful RTE business
models may also be apparent in companies with brick-and-mortar stores such as Wal-Mart.
Regardless of the variations in form, most such companies share a similar but distinctive
focus on their higher purpose, which guides their strategy and its execution. This
observation perhaps explains how some companies achieved the most sustained business
performance with smaller investments in related technologies. Often their success was
attributable to a differentiated business model based on strong ties with customers and
suppliers rather than to the most recent investments in CRM and SCM systems. Strategic
execution of the business models was accelerated with the help of technologies. However,
successful companies had superior business models and a consistent track record of
strategic execution as a precursor. Smart and selective investments in technologies afforded
them the ability to do more with less by accelerating their business capabilities. Also, strong
ties with suppliers and customers enabled them to spread the risk of investing, deploying,
and utilizing the technologies with their partners and customers[9].

Enabling the RTE: ends should drive the choice of means


The misplaced emphasis of technology-push models arose from their primary focus on the
means rather than the ends, as explained in this section. Most such KM implementations
were caught in the convoluted complexities of technology deployment and
processing without making a real difference in business performance. Given the state of
technology and the long time spans necessary for getting business systems in place, an
obvious question arises about the superior business performers: how did the top
performing companies manage to produce stellar business results despite choosing the
same or similar technologies as their competitors? It may be argued that the top performers
always kept their key focus on business performance. They adopted new technologies and
adapted old technologies without compromising on that primary focus. Their technologies
were used for pushing data, information, and decision models just like their competitors.
However, unlike the competitors they vanquished, their choices of business processes and
technologies were still driven by their primary focus on strategic execution. They may not
have planned to be laggards in adopting new technologies or in spending less on such tech
investments. Rather their slow but steady progress in selecting, eliminating, modifying,
adapting, and integrating old and new technologies in service of their business models and
business processes seemed to pay off. As they accelerated their already superior business
models and business processes with new technologies, they realized greater returns in
business performance. It may also be argued that many of their competitors imitated their
choices of specific technologies often based upon ‘‘best practice’’ studies and

‘‘benchmarks’’ (Malhotra, 2002d). Mistakenly treated as easy and assured solutions for
fixing broken business processes and business models, new technologies further escalated
the ‘‘knowledge application gap’’. Some of these comparison companies saw a spate of
fickle and frequent technology and tech personnel changes, but their business problems
persisted, eventually leading to corporate failures or bankruptcies. In contrast, top
performing companies have grown their business models around carefully thought out
customer value propositions and business value propositions in spite of their adoption, or
lack thereof, of the latest technologies. Knowledge becomes the accelerator of business
performance when identified with execution of business strategy rather than with the choices
of tools and technologies that keep changing with time. In the eyes of the wise, knowledge
and action are one (Beer, 1994).

Why do some RTE businesses succeed (where others fail)?


The following cases were selected after reviewing the industry case studies of companies
that were often described as benchmarks in terms of their RTE business models. Specific
companies were chosen based on their visibility in the business technology press and
popular media. The reviews of industry case studies were guided by our interest in
understanding the link between investments in advanced technologies and resulting
business performance.

Wal-Mart: RTE business model where technology matters less


Some IT analysts have attributed Wal-Mart’s success to its investment in RTE technologies.
However, Wal-Mart has emerged as a company that has set the benchmark of doing more
with less. Wal-Mart did not build its competitive advantage by investing heavily or by
investing in the latest technologies (Schrage, 2002). A McKinsey Global Institute report notes:
The technology that went into what Wal-Mart did was not brand new and not especially at the
technological frontiers, but when it was combined with the firm’s managerial and organizational
innovations, the impact was huge.

More recently, Collins (2003) has predicted that Wal-Mart may become the first company to
achieve a trillion-dollar valuation within the next ten years, following the performance-driven model
delineated in Table I and discussed earlier. In contrast to its competitors, Wal-Mart
systematically and rigorously deployed its technologies with clear focus on its core value
proposition of lowest prices for mass consumers. With that singular focus, it went about
setting up its supply chains and inventory management systems to accelerate business
performance. Long before anyone had heard about the RTE technologies, Wal-Mart was
perfecting its logistic prowess based on the hub-and-spoke model of truck routes and
warehouses underlying its inventory management systems. It was much later in the process
that, on top of Wal-Mart’s $4 billion investment in its supply chain systems, its suppliers
invested ten times that amount to accelerate the RTE business model underlying its supply
chain network (Schrage, 2002). The business model created strong linkages with suppliers, which not
only heavily subsidized the costs of technology investments but also pre-committed the
partners to the success of the shared systems. Simultaneously, given its retail channels,
distribution network, and proximity to customers through market scanner data, it has
preempted its suppliers from directly competing against it.

Dell: RTE business model that does more with less


Dell has developed and perfected its business model by developing strong ties with its
customer base over the past 17 years. It perfected its business model over several years
before accelerating its business performance with the aid of carefully selected technologies.
It has cultivated outstanding relationships with its virtual supply chain partners including
outsourcing providers (such as Solectron) and technology vendors (such as HP, Sony, and
EMC). Dell also leverages its customer reach and range and market penetration for deriving
commercial benefits from technologies developed by its technology partners. It has been
developing and extending the real time logic over the past several years, first for selling and
servicing desktop computers, and later for the aggregation and distribution of value-added
products and services: servers, storage, networking, printers, switches, and handheld

computers. According to a survey of 7,500 companies conducted by Alinean (2002), Dell is
a thrifty IT spender. Dell is equally frugal in its R&D spending (1.5 percent of revenues),
according to a recent Business Week report, despite its continuing forays into new products
and services. Through its alliances with partners such as EMC, Dell is able to leverage their
research on product innovation while itself concentrating on perfecting the linkages with
customers as well as suppliers. Dell’s early innovations in its passionate pursuit of being the
low-cost ‘‘build on demand’’ leader for consumer computing products have yielded it the
advantage of real time business performance. More recently, it has been able to accelerate
the performance of its business model with the aid of carefully chosen technologies.

GE: RTE automation for operational efficiencies


GE views the real time movement as an extension of its renowned Six Sigma
quality drive. The business model defined for maintaining quality standards has been
extended to control costs by minimizing response time to problems affecting products
purchased by its customers. GE’s CIO Gary Reiner tracks once every 15 minutes what he
considers to be the few most critical variables including sales, daily order rates, inventory
levels, and savings from automation across the company’s 13 worldwide businesses. He
acknowledges that it is neither feasible nor desirable to track all kinds of information in real
time even with the aid of digital dashboards. Most operational information is tracked on daily
or weekly basis while other kinds of information is tracked on an exception-reporting basis.
The company claims operational savings of 35-60 percent in costs involved in customer
response, customer service, and sales. Most of these savings are attributable more to
management control rather than to technologies that are used to enforce pre-negotiated
contracts on its buyers who deal with its various suppliers. Operational automation that is
executed in terms of command and control logic seeking compliance has not been without
its adverse ramifications. GE has encountered labor management disputes resulting from
the workers who are not accustomed to minute-by-minute electronic surveillance.
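The tracking regime described here, a few critical variables polled frequently and everything else surfaced only on exception, can be sketched as follows; the metric names, baselines, and tolerance are our hypothetical illustrations, not GE's actual dashboard:

```python
def exceptions(readings: dict[str, float],
               baselines: dict[str, float],
               tolerance: float = 0.10) -> dict[str, float]:
    """Return only the metrics deviating from baseline by more than the
    tolerance, i.e. exception reporting rather than continuous tracking."""
    return {
        metric: value
        for metric, value in readings.items()
        if abs(value - baselines[metric]) / baselines[metric] > tolerance
    }

baselines = {"daily_orders": 1000.0, "inventory_days": 30.0, "savings": 5.0e6}
readings = {"daily_orders": 1040.0, "inventory_days": 42.0, "savings": 4.9e6}

# Only inventory_days (40 percent over baseline) is flagged for attention.
print(exceptions(readings, baselines))
```

The design choice mirrors Reiner's point: most variables do not repay real-time tracking, so attention is rationed to deviations that exceed a threshold.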

Cisco: real time enterprise technology troubles


Cisco was lauded for its RTE technologies three years ago, when its market capitalization
stood at 850 percent of its recent level. The company prided
itself about the RTE technologies that offered apparently seamless integration of real time
data within and across its supply chain and customer ordering systems. The company had
legendary faith in its technologies for predictive modeling and decision-making (Carter,
2001). In a Harvard Business Review article, the company’s then CFO claimed that:
We can literally close our books within hours . . . the decision makers who need to achieve sales
targets, manage expenses and make daily tactical operating decisions now have real-time
access to detailed operating data.

Unfortunately, real-time access to data could not be of much help when, buoyed by its
unparalleled growth over several quarters[10], Cisco made some fundamentally incorrect
assumptions about the future. Cisco ignored a key lesson of KM that is often ignored by
many others: the past may not be an accurate predictor of the future. While other networking
companies with less sophisticated technologies had cut back on production schedules
months earlier, seeing the impending downturn in demand, Cisco stuck to the forecasts of its
‘‘virtual close’’ system, which it considered invincible. As Cisco (or, rather, its
technology-driven forecasting systems) had never been proven wrong before, its
business partners saw little merit in trying to question its proven wisdom. As a result of
misplaced faith in the power of the forecasting systems, Cisco ended up writing off $2.2
billion in inventories and sacking 8,500 employees. Industry experts and analysts suggest
that Cisco’s write-off resulted from its blind over-reliance on its much-vaunted ‘‘virtual
close’’ systems. Cisco’s case demonstrates that even the best technology offers no
protection against bad management decisions, especially when the assumptions
embedded in the dominant logic are taken for granted. Some Cisco executives do
maintain that in the absence of the RTE ‘‘virtual close’’, the outcome could have been worse.
Cisco retains its optimism about perfecting its RTE systems, hoping they will eventually
provide high certitude in the face of an increasingly uncertain business environment.
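Cisco's experience illustrates a general pitfall of forecasting from an unbroken trend: the projection assumes the environment that produced the trend still holds. A toy sketch of naive trend extrapolation (the revenue figures are invented for illustration, not Cisco's actual data):

```python
def naive_trend_forecast(history: list[float]) -> float:
    """Project the next period by extending the average recent growth rate,
    i.e. assume the past is an accurate predictor of the future."""
    growth_rates = [later / earlier for earlier, later in zip(history, history[1:])]
    average_growth = sum(growth_rates) / len(growth_rates)
    return history[-1] * average_growth

# Invented series: steady 10 percent growth per quarter, then a collapse.
boom = [100.0, 110.0, 121.0, 133.1]
forecast = naive_trend_forecast(boom)   # projects the boom onward (~146.4)
actual = 95.0                           # demand collapses instead
print(f"forecast {forecast:.1f} vs actual {actual:.1f}")
# The gap between forecast and actual is the "inventory write-off".
```

A system that only extends its own history keeps forecasting growth right up to the write-off, which is the single-loop failure mode the article attributes to the ‘‘virtual close’’.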

Enron: destroyed in real time
Given the dominant and intensive role of real-time information, many of the technologies
associated with real-time response were initially adopted by financial services firms on
Wall Street. Given Enron Online’s primary business of exchanging and trading financial data,
the real-time response model seemed like a match made in heaven. Enron planned to
leverage its online exchange to facilitate instantaneous transactions in the
online trading of energy market commodities. In its communiqué submitted to the Federal
Trade Commission, Enron had emphasized that:
Efficiency gains made possible by dynamic pricing and trading are especially well suited to
Enron’s on-line business because electronic trading can match the speed with which commodity
pricing changes. Transactions that used to take up to three minutes to complete over the phone
now take just a second or two, including complex processes such as credit checks.

The company deployed Tibco’s vaunted ‘‘RTE platform’’, sought out new technology
wherever possible, and planned to spend hundreds of millions of dollars on technology
infrastructure. The vagaries of Enron’s management control and oversight, as well as its
insider- and self-dealings with fictitious entities, are well documented in the records of
the US Senate hearings as well as in the public records of print and broadcast media. Post-hoc
analysis of Enron’s RTE technologies confirms prior observations about the technology-push
model (Berinato, 2002):
If these [accounting irregularities] hadn’t come up, the IT inefficiency might well have come up to
bite Enron . . . Enron IT was as cutting edge as it was Byzantine. There were plenty of great tools,
but there was precious little planning . . . The core systems supporting the main
revenue-generating activities were very disjointed . . . There were major disconnects from deal
capture to risk management to logistics to accounting. They all worked from different data
sources . . . They had teams and teams of people who had to comb through the data and
massage it so that it made sense . . . There was a lot of magic, transforming apples into oranges
and oranges into apples. Preparing annual reports was a joke . . . The breakneck deployment of
state-of-the-art technology was done with little regard for a management plan.

When the cover on the collusion between Enron insiders and its auditing firm blew open,
the RTE system triggered Enron’s freefall, as it had also been concealing the risk exposure
related to its instantaneous transactions. Unfounded and overly optimistic belief in
technology as the means for generating profits despite an inadequate business model led to
Enron’s downfall resulting in one of the largest corporate bankruptcies in US history[11].

Conclusion
This article opened with the observation that although KM activities are ‘‘all over the map’’ in
terms of technology implementations, no one has asked the ‘‘big question’’: why?
Despite diverse propositions about ‘‘getting the right information to the right person at the
right time,’’ almost everyone neglects to ask what knowledge to manage and toward what
end. A review of the industry case studies of companies characterized in recent years as
RTE enterprises surfaced some interesting insights. Recent industry analyses that
have demonstrated inverse correlations between IT investments and business performance
provided some motivation for this analysis. Additional impetus was provided by the contrast
between the hype about ‘‘RTE technologies’’ propagated by some IT analysts and our
in-depth analysis of companies that achieved success as RTE benchmarks. To some extent
the search for the ‘‘next big thing’’ and the ‘‘killer app’’ is to blame for the narrow focus on IT
and innovation as ends rather than means for achieving sustainable business performance
(Business Week, 2003). The big question ‘‘Why?’’ should drive tactical and operational
aspects of technology and process related innovations in an organizational KM
implementation. As contrasted with the inputs- and processing-focused technology-push
model, the explicit and specific performance-outcomes-oriented focus of the strategy-pull
model further emphasized the ‘‘big question’’.
The contrast between the three archetypes of inputs-, processing-, and outputs-driven
paradigms of KM explained in Table I and Table II further aided deconstruction of the
existing conceptualizations and practices of KM. One such conceptualization of KM that has

been applied in diverse worldwide governmental and corporate practices was then
discussed to motivate subsequent discussion on the RTE business models. The contrast
between information-processing capabilities of latest technologies and needed
sense-making capabilities was then explained. Additionally, the mechanistic emphasis of
technology-based linkages was contrasted with appreciation for organic and
socio-psychological relationships needed for nurturing knowledge processes. Two
propositions were offered based on prior discussion – one pertaining to the form and
function of the RTE, and the second relevant to the contrast between ends and means of
achieving performance outcomes. Based upon original analyses, review of prior research,
and review of industry case studies we made specific managerial recommendations about
realizing the real time performance of enterprise business models. Specifically, we
recommended that:
- organizational function should drive the choice of organizational form; and
- ends should drive the choice of means.
The above propositions were then illustrated with the aid of RTE industry case studies that
have been used by IT analysts to hype the benefits of RTE technologies. Based upon our
analyses, we counter-argued that the benefits attributed to the RTE technologies should
indeed be attributable to the RTE business model. We further contended that in the absence of
an effective RTE business model, even the most expensive and sophisticated technology
could not ensure corporate survival in the short- or long-term. The RTE case studies lent
support to the primary role of strategic execution as the lever for sustained business
performance. As discussed, the successful RTE enterprises achieved their success by
staying a step ahead of competition and by offering value propositions that inspired
customers’ imagination instead of playing the ‘‘me too’’ game in an already crowded market.
These companies also selected and integrated ICT capabilities that fit directly with what they
were deeply passionate about, what they believed they could be the best at, and what
directly drove their steady economic growth. The successful RTE businesses did not adopt
new technologies motivated by fear of getting behind. Rather, they thought differently about
technology as an accelerator of business momentum and not its creator. Unlike the
successful models of RTE enterprises, the failures were characterized by thoughtless
reliance on technology often grasped as an easy solution, without coherent understanding of
how it links to strategic execution for business performance.

Background readings and research

KMNetwork: www.kmnetwork.com/
The above portal provides unrestricted access to several full-text articles and research
papers by the author that have preceded this milestone in fathoming the ongoing evolution
and progress in the field of knowledge management.
There are several excellent reviews of various types of information and communication
technologies (ICTs) that are used within the realms of KM applications. The focus of this
article is on the strategic and overarching framework of real time enterprises and business
performance within which specific ICTs are used. For more specifics on technologies that
are relevant to the input and processing aspects of both KM models discussed herein, the
reader is advised to peruse Tsui (2002); O’Leary (2002); Conway (2002); Gray and Tehrani
(2002); Susarla et al. (2002); Wei et al. (2002); and Jackson (2001).

Notes
1. Strassmann’s research has primarily focused on macro-economic analysis of industry IT
investments data and has not empirically studied the behavioral and strategic disconnects
related to IT and KM performance discussed in this paper. The author’s research in this context
– referenced in this article – specifically focuses on these disconnects between IT, information,
actions, and performance at the individual, organizational, and national levels. Therefore, the author’s
research on behavioral-strategic disconnects between IT- and KM-performance complements
research by others that has focused primarily on macro-economic aspects. An interesting focus for

future practice and research is in terms of reconciling existing gaps between economic,
sociological, and behavioral aspects of IT- and KM-performance as recommended in Malhotra
(2003).
2. Some may argue that the interest in digitizing knowledge of business enterprises pre-dates 1990s
as prior AI and expert systems have focused on such processes. Our focus in this article is on the
‘‘real-time enterprise’’ logic in which inter-connected value chains can respond to
supply and demand changes almost in real time. As the commercialization of the web occurred
much later than the invention of the first browser version of Mosaic, such real-time capabilities of
networking across enterprises were not as available or as affordable until the post-1995 era. However,
there are fundamental problems characterizing the AI and expert systems based focus on KM
systems that are discussed in greater depth in the contrast between ‘‘sense making’’ and
‘‘information processing’’ capabilities explained in the Expert Systems With Applications journal
special issue on knowledge management (Malhotra, 2001b).
3. This argument is supported by examples of technology pioneers of yesteryear that have faded into
oblivion. For instance, VisiCalc, the pioneering spreadsheet, lost out to Lotus 1-2-3,
which itself lost out to Microsoft Excel. The first portable computers came from Osborne, a company
that ceased to exist long before portables were adopted by the mainstream.
4. The technology-push model and the strategy-pull model of KM implementation are discussed as
contrasting ‘‘archetypes’’ for business environments ranging from highly routine and predictable
environments to radically changing and discontinuous environments. It is, however, recognized that
most real world business environments as well as most real world business contexts would fall
between the two polar contrasts. Hence most such RTE models would effectively combine the two
models for balancing new knowledge creation and commercial exploitation of existing knowledge.
Balancing the two processes is discussed in the author’s interview with the Institute for Supply
Management, featured in the knowledge management cover story of Inside Supply Management
(Yuva, 2002). Additional discussion on balancing the apparently paradoxical processes is available
in Malhotra (2000a, 2001a, 2002a).
5. For more details on single-loop and double-loop learning, the reader is advised to see seminal
writings of Chris Argyris such as Argyris (1990) and Argyris (1994).

6. In some cases of technology implementation such as ERP, the issues of technology deployment and
utilization could never get addressed, resulting in a snowballing downslide of business performance
(see for instance, Strassmann, 2003).
7. Such as the US Federal Government, United States Army, European Commission, US Agency for
International Development, Government of UK, Government of South Africa, Parliament of Victoria
(Australia), Government of New Zealand, Government of Argentina, SAP North America, Microsoft
Europe, Verisign, Telecom Italia, Organization of Islamic Capitals and Cities (Saudi Arabia), and
United Nations and its worldwide agencies. More details accessible at: www.brint.com/
casestudies.html
8. Additional discussion on how existing ‘‘information processing’’ focus of technology on semantics
(meaning) has yet to address the ‘‘sense making’’ capacities of human beings within the context of
the new paradigm of self-regulation and self-control is available in Malhotra (2001b, c, 2002b).
9. It is understandable that WS-I and related web service based experiments (such as RosettaNet)
provide hope for technological feasibility of real-time information exchange. However, despite the
exploitation of the most sophisticated technical standards, information exchange within and between
enterprises remains more of a sociological and cultural issue than a technical issue. Hence, despite
the availability of technical standards that may ensure perfect real time communication, sociological
and cultural artifacts impose a major burden. Conversely, companies that attend well to such
sociological and cultural concerns, as discussed in this article, accelerate their RTE business
models through the adoption of facilitating technologies. More in-depth discussion on this theme is
available in the author’s Intel Corporation’s e-strategy research paper (Malhotra, 2001a).
10. Growth consisted of 40 straight quarters of growth and three immediate quarters of extreme growth
to the tune of 66 percent.
11. One news story had the following remarks about Enron’s business model: ‘‘In the aftermath of the
collapse, there have been suggestions that a few directors had mishandled the partnerships to
siphon off funds to their own accounts. However, it is clear that the more than 3,000 partnerships,

more than 800 of which were in tax havens like the Cayman Islands, played a far more purposeful
role in Enron’s business model.’’ Despite real time availability of information, the corporate crisis in
this case pertains to sociological and cultural issues such as senior management’s corruption and
auditors’ dishonesty that led to ‘‘real time’’ cover-ups despite access to best technology.

References
Ackoff, R. (1979), ‘‘The future of operations research is past’’, Journal of the Operational Research
Society, Vol. 30, p. 93.
Argyris, C. (1990), Integrating the Individual and the Organization, Transaction, New Brunswick, NJ.

Argyris, C. (1994), ‘‘Good communication that blocks learning’’, Harvard Business Review, Vol. 72 No. 4,
pp. 77-85.
Alavi, M. and Leidner, D. (2001), ‘‘Review: knowledge management and knowledge management
systems: conceptual foundations and research issues’’, MIS Quarterly, Vol. 25 No. 1, pp. 107-36.
Alinean (2002), ‘‘Alinean identifies why certain companies achieve higher ROI from IT investments’’,
available at: www.alinean.com
Anthes, G.H. and Hoffman, T. (2003), ‘‘Tarnished image’’, Computerworld, May 12, pp. 37-40.
Arthur, B. (1996), ‘‘Increasing returns and the new world of business’’, Harvard Business Review, Vol. 74
No. 4, pp. 100-9.

Barth, S. (2000), ‘‘KM horror stories’’, Knowledge Management, Vol. 3 No. 10, pp. 36-40.
Beer, S. (1994), ‘‘May the whole earth be happy: Loka Samastat Sukhino Bhavantu’’, Interfaces, Vol. 24
No. 4, pp. 83-93.
Berinato, S. (2002), ‘‘Enron IT: a tale of excess and chaos’’, CIO Magazine, available at:
www.cio.com/executive/edit/030502_enron.html
Brooks, F.P. Jr (1987), ‘‘No silver bullet: essence and accidents of software engineering’’, Computer,
Vol. 20 No. 4, pp. 10-19.
Brynjolfsson, E. (1993), ‘‘The productivity paradox of information technology’’, Communications of the
ACM, Vol. 36 No. 12, pp. 66-77.

Brynjolfsson, E. and Hitt, L.M. (1996), ‘‘Paradox lost? Firm-level evidence on the returns to information
systems spending’’, Management Science, Vol. 42, pp. 541-58.
Brynjolfsson, E. and Hitt, L.M. (1998), ‘‘Beyond the productivity paradox: computers are the catalyst for
bigger changes’’, Communications of the ACM, Vol. 41 No. 8, pp. 49-55.

Business Week (2003), ‘‘What you don’t know about Dell’’, Business Week, November 3, pp. 76-84.
Byrne, J.A. (1993), ‘‘The virtual corporation’’, Business Week, February 8, pp. 98-103.
Carr, N. (2003), ‘‘IT doesn’t matter’’, Harvard Business Review, Vol. 81 No. 5, pp. 41-9.
Carter, L. (2001), ‘‘Cisco’s virtual close’’, Harvard Business Review, April.
Charles, S.K. (2002), ‘‘Knowledge management lessons from the document trenches’’, Online, Vol. 26
No. 1, pp. 22-9.
Churchman, C.W. (1971), The Design of Inquiring Systems, Basic Books, New York, NY.
Collins, J. (2001), Good to Great: Why Some Companies Make the Leap and Others Don’t,
Harper-Business, New York, NY.
Collins, J. (2003), ‘‘Bigger, better, faster’’, Fast Company, available at: www.fastcompany.com/
magazine/71/walmart.html
Conway, S. (2002), ‘‘Knowledge searching and services’’, in Holsapple, C.W. (Ed.), Handbook on
Knowledge Management 1: Knowledge Directions, Springer-Verlag, Heidelberg, pp. 69-84.
Cuneo, E.C. (2003), ‘‘Safe at sea’’, Information Week, April 7, available at: www.informationweek.
com/story/showArticle.jhtml?articleID ¼ 8700375
Darrell, R., Reichheld, F.F. and Schefter, P. (2002), ‘‘Avoid the four perils of CRM’’, Harvard Business
Review, February, pp. 101-9.

j j
PAGE 24 JOURNAL OF KNOWLEDGE MANAGEMENT VOL. 9 NO. 1 2005
Dragoon, A. (1995), ‘‘Knowledge management: Rx for success’’, CIO Magazine, Vol. 8 No. 18, pp. 48-56.

Drucker, P.F. (1994), ‘‘The theory of business’’, Harvard Business Review, September-October,
pp. 95-104.

eMarketer (2001), ‘‘Knowledge management: executive brief’’, available at: www.info-edge.com/


samples/EM-2001free.pdf

Emery, F.E. and Trist, E.L. (1965), ‘‘The causal texture of organizational environments’’, Human Relations,
Vol. 18, pp. 21-32.

Evans, P. and Wurster, T.S. (2002), Blown to Bits, Harvard Business School Press, Boston, MA.

Gartner, Inc. (2002), ‘‘The real time enterprise’’, available at: http://rte.gartner.com/

Gray, P. and Tehrani, S. (2002), ‘‘Technologies for disseminating knowledge’’, in Holsapple, C.W. (Ed.),
Handbook on Knowledge Management 1: Knowledge Directions, Springer-Verlag, Heidelberg,
pp. 109-28.

Greenemeier, L. (2003a), ‘‘HP looks to utility computing for growth’’, Information Week, May 12, available
at: www.informationweek.com/story/showArticle.jhtml?articleID ¼ 9800052

Greenemeier, L. (2003b), ‘‘Utility computing meets real life’’, Information Week, April 21, available at:
www.informationweek.com/story/showArticle.jhtml?articleID ¼ 8800357

Grover, V. and Davenport, T.H. (2001), ‘‘General perspectives on knowledge management: fostering a
research agenda’’, Journal of Management Information Systems, Vol. 18 No. 1, pp. 5-21.

Hammer, M. (1990), ‘‘Reengineering work: don’t automate’’, Harvard Business Review, July, pp. 104-12.

Hansen, M.T. and Nohria, N. (1999), ‘‘What’s your strategy for managing knowledge?’’, Harvard
Business Review, March-April, pp. 106-16.

Hapgood, F. (2003), ‘‘Plug and pay’’, CIO Magazine, April 15, available at: www.cio.com/ archive/
041503/plug.html

Hildebrand, C. (1999), ‘‘Intellectual capitalism: does KM ¼ IT?’’, CIO Magazine, September 15,
available at: www.cio.com/archive/enterprise/091599_ic_content.html

Hoffman, T. (2002), ‘‘‘Frugal’ IT investors top best-performer list’’, Computerworld, December 6,


available at: www.computerworld.com/managementtopics/roi/story/0,10801,76468,00.html

Hoffman, T. (2003), ‘‘Survey points to continuing friction between business, IT’’, Computerworld, May 12,
p. 10.

Holsapple, C.W. (2002), ‘‘Knowledge and its attributes’’, in Holsapple, C.W. (Ed.), Handbook on
Knowledge Management 1: Knowledge Matters, Springer-Verlag, Heidelberg, pp. 165-88.

Holsapple, C.W. and Singh, M. (2001), ‘‘The knowledge chain model: activities for competitiveness’’,
Expert Systems with Applications, Vol. 20 No. 1, pp. 77-98.

Hopper, M.D. (1990), ‘‘Rattling SABRE – new ways to compete on information’’, Harvard Business
Review, May/June, pp. 118-25.

Huber, R.L. (1993), ‘‘How Continental Bank outsourced its ‘crown jewels’’’, Harvard Business Review,
January/February, pp. 121-9.

Jackson, C. (2001), ‘‘Process to product: creating tools in knowledge management’’, in Malhotra, Y.


(Ed.), Knowledge Management for Business Model Innovation, Idea Group Publishing, Hershey, PA,
pp. 402-13.

Khosla, V. and Pal, M. (2002), ‘‘Real time enterprises: a continuous migration approach’’, March,
available at: www.asera.com/technology/pdf/RTE-WHITEPAPER-PDF-VERSION.pdf

Kirkpatrick, T.A. (2003), ‘‘Complexity: how to stave off chaos’’, CIO Insight, February 1, available at:
www.cioinsight.com/print_article/0,3668,a ¼ 37126,00.asp

Koenig, M.D. and Srikantaiah, T.K. (2000a), ‘‘The evolution of knowledge management’’, in Srikantaiah, K.
and Koenig, M.E.D. (Eds), Knowledge Management for the Information Professional, Information Today
Inc., Medford, NJ, pp. 37-61.

j j
VOL. 9 NO. 1 2005 JOURNAL OF KNOWLEDGE MANAGEMENT PAGE 25
Kraemer, K. (2001), ‘‘The productivity paradox: is it resolved? Is there a new one? What does it all mean
for managers?’’, working paper, Center for Research on Information Technology and Organizations,
UC Irvine, Irvine, CA.

LeClaire, J. and Cooper, L. (2000), ‘‘Rapid-Fire IT Infrastructure’’, Information Week, January 31,
available at: www.informationweek.com/771/infrastruct.htm

Levitt, J. (2001), ‘‘Plug-and-play redefined’’, Information Week, April 2, available at:


www.informationweek.com/831/web.htm

Lindorff, D. (2002), ‘‘GE’s drive to real-time measurement’’, CIO Insight, November 11, available at:
www.cioinsight.com/article2/0,3959,686147,00.asp

Lindquist, C. (2003), ‘‘What time is real time?’’, CIO Magazine, February 10, available at:
www.cio.com/online/techtact_021003.html

Malhotra, Y. (1993), ‘‘Role of information technology in managing organizational change and


organizational interdependence’’, BRINT Institute, LLC, New York, NY, available at: www.brint.com/
papers/change/

Malhotra, Y. (1995), ‘‘IS productivity and outsourcing policy: a conceptual framework and empirical
analysis’’, Proceedings of Inaugural Americas Conference on Information Systems (Managerial Papers),
Pittsburgh, PA, August 25-27, available at: www.brint.com/papers/outsourc/

Malhotra, Y. (1996), ‘‘Enterprise architecture: an overview’’, BRINT Institute, LLC, New York, NY,
available at: www.brint.com/papers/enterarch.htm

Malhotra, Y. (1997), ‘‘Knowledge management in inquiring organizations’’, Proceedings of 3rd Americas


Conference on Information Systems (Philosophy of Information Systems Mini-track), Indianapolis, IN,
August 15-17, pp. 293-5, available at: www.kmnetwork.com/km.htm

Malhotra, Y. (1998a), ‘‘Role of social influence, self-determination, and quality of use in information
technology acceptance and utilization: a theoretical framework and empirical field study’’, PhD thesis,
Katz Graduate School of Business, University of Pittsburgh, Pittsburgh, PA.

Malhotra, Y. (1998b), ‘‘Knowledge management for the new world of business’’, Journal for Quality &
Participation, Vol. 21 No. 4, pp. 58-60.

Malhotra, Y. (1998c), ‘‘Knowledge management, knowledge organizations and knowledge workers:


a view from the front lines’’, Maeil Business Newspaper, February 19, available at: www.brint.com/
interview/maeil.htm

Malhotra, Y. (2000a), ‘‘From information management to knowledge management: beyond the ‘hi-tech
hidebound’ systems’’, in Srikantaiah, K. and Koenig, M.E.D. (Eds), Knowledge Management for the
Information Professional, Information Today Inc., Medford, NJ, pp. 37-61, available at:
www.brint.org/IMtoKM.pdf

Malhotra, Y. (2000b), ‘‘Knowledge assets in the global economy: assessment of national intellectual
capital’’, Journal of Global Information Management, Vol. 8 No. 3, pp. 5-15, available at:
www.kmnetwork.com/intellectualcapital.htm

Malhotra, Y. (2000c), ‘‘Knowledge management and new organization forms: a framework for business
model innovation’’, Information Resources Management Journal, Vol. 13 No. 1, pp. 5-14, available at:
www.brint.org/KMNewOrg.pdf

Malhotra, Y. (2000d), ‘‘Knowledge management for e-business performance: advancing information


strategy to ‘internet time’’’, Information Strategy: The Executive’s Journal, Vol. 16 No. 4, pp. 5-16,
available at: www.brint.com/papers/kmebiz/kmebiz.html

Malhotra, Y. (2001a), ‘‘Enabling next generation e-business architectures: balancing integration and
flexibility for managing business transformation’’, Intel e-Strategy White Paper, June, available at:
www.brint.net/members/01060524/intelebusiness.pdf

Malhotra, Y. (2001b), ‘‘Expert systems for knowledge management: crossing the chasm between
information processing and sense making’’, Expert Systems with Applications, Vol. 20 No. 1, pp. 7-16,
available at: www.brint.org/expertsystems.pdf

Malhotra, Y. (2001c), ‘‘Organizational controls as enablers and constraints in successful knowledge


management systems implementation’’, in Malhotra, Y. (Ed.), Knowledge Management and Business

j j
PAGE 26 JOURNAL OF KNOWLEDGE MANAGEMENT VOL. 9 NO. 1 2005
Model Innovation, Idea Group Publishing, Hershey, PA, pp. 326-36, available at: www.brint.org/
KMOutOfControl.pdf

Malhotra, Y. (2002a), ‘‘Enabling knowledge exchanges for e-business communities’’, Information


Strategy: The Executive’s Journal, Vol. 18 No. 3, pp. 26-31, available at: www.brint.org/
KnowledgeExchanges.pdf

Malhotra, Y. (2002b), ‘‘Information ecology and knowledge management: toward knowledge ecology for
hyperturbulent organizational environments’’, Encyclopedia of Life Support Systems (EOLSS),
UNESCO/Eolss Publishers, Oxford, available at: www.brint.org/KMEcology.pdf

Malhotra, Y. (2002c), ‘‘Is knowledge management really an oxymoron? Unraveling the role of organizational
controls in knowledge management’’, in White, D. (Ed.), Knowledge Mapping and Management, Idea
Group Publishing, Hershey, PA, pp. 1-13, available at: www.brint.org/KMOxymoron.pdf

Malhotra, Y. (2002d), ‘‘When best becomes worst’’, Momentum: The Quality Magazine of Australasia
(Quality Society of Australasia), September, available at: www.brint.org/ bestpractices.pdf

Malhotra, Y. (2003), ‘‘Measuring national knowledge assets of a nation: knowledge systems for
development (expert background paper)’’, Expanding Public Space for the Development of the
Knowledge Society: Report of the Ad Hoc Expert Group Meeting on Knowledge Systems for
Development, 4-5 September, Department of Economic and Social Affairs Division for Public
Administration and Development Management, United Nations, New York, pp. 68-126, available at:
www.kmnetwork.com/KnowledgeManagementMeasurementResearch.pdf; http://unpan1.un.org/
intradoc/groups/public/documents/un/unpan011601.pdf; http://unpan1.un.org/intradoc/groups/public/
documents/un/unpan014138.pdf

Malhotra, Y. (2004a), ‘‘Desperately seeking self-determination: key to the new enterprise logic of
customer relationships’’, Proceedings of the Americas Conference on Information Systems (Process
Automation and Management Track: Customer Relationship Management Mini-track), New York, NY,
August 5-8.

Malhotra, Y. (2004b), ‘‘Why knowledge management systems fail. Enablers and constraints of
knowledge management in human enterprises’’, in Koenig, M.E.D. and Srikantaiah, T.K. (Eds),
Knowledge Management Lessons Learned: What Works and What Doesn’t, Information Today Inc.,
Medford, NJ, pp. 87-112, available at: www.brint.org/WhyKMSFail.htm

Malhotra, Y. and Galletta, D.F. (1999), ‘‘Extending the technology acceptance model to account for social
influence: theoretical bases and empirical validation’’, Proceedings of the Hawaii International Conference
on System Sciences (HICSS 32), pp. 6-19, available at: www.brint.org/technologyacceptance.pdf

Malhotra, Y. and Galletta, D.F. (2003), ‘‘Role of commitment and motivation in knowledge management
systems implementation: theory, conceptualization, and measurement of antecedents of success’’,
Proceedings of the Hawaii International Conference on Systems Sciences (HICSS 36), available at:
www.brint.org/KMSuccess.pdf

Malhotra, Y. and Galletta, D.F. (n.d.a), ‘‘A multidimensional commitment model of knowledge
management systems acceptance and use’’, Journal of Management Information Systems (in press).

Malhotra, Y. and Galletta, D.F. (n.d.b), ‘‘If you build IT, and they come: building systems that users want
to use’’, Communications of the ACM (in press).

Margulius, D.L. (2002), ‘‘Dawn of the real-time enterprise’’, InfoWorld, January 17, available at:
www.infoworld.com/article/02/01/17/020121fetca_1.html

Massey, A.P., Montoya-Weiss, M.M. and Holcom, K. (2001), ‘‘Re-engineering the customer relationship:
leveraging knowledge assets at IBM’’, Decision Support Systems, Vol. 32 No. 2, pp. 155-70.

Meyer, C. (2002), ‘‘Keeping pace with the accelerating enterprise’’, CIO Insight, November 2, available
at: www.cioinsight.com/article2/0,3959,675333,00.asp

Murphy, C. (2003), ‘‘Tying it all together’’, Information Week, March 17, available at:
www.informationweek.com/shared/printableArticle.jhtml?articleID ¼ 8700225

Nadler, D.A. and Shaw, R.B. (1995), ‘‘Change leadership: core competency for the twenty-first century’’,
in Nadler, D.A., Shaw, R.B. and Walton, A.E. (Eds), Discontinuous Change: Leading Organizational
Transformation, Jossey-Bass, San Francisco, CA.

j j
VOL. 9 NO. 1 2005 JOURNAL OF KNOWLEDGE MANAGEMENT PAGE 27
Nonaka, I. and Takeuchi, H. (1995), The Knowledge-Creating Company, Oxford University Press,
New York, NY.
Odom, C. and Starns, J. (2003), ‘‘KM technologies assessment’’, KM World, May, pp. 18-28.
O’Leary, D. (2002), ‘‘Technologies of knowledge storage and assimilation’’, in Holsapple, C.W. (Ed.),
Handbook on Knowledge Management 1: Knowledge Directions, Springer-Verlag, Heidelberg,
pp. 29-46.
Porter, M.E. and Millar, V.E. (1985), ‘‘How information technology gives you competitive advantage’’,
Harvard Business Review, Vol. 63 No. 4, pp. 149-60.
Rayport, J.F. and Sviokla, J.J. (1995), ‘‘Exploiting the virtual value chain’’, Harvard Business Review,
Vol. 73 No. 6, pp. 75-99.
Sawhney, M. (2003), ‘‘Reality check’’, CIO Magazine, March 1, available at: www.cio.com/
archive/030103/netgains.html
Schrage, M. (2002), ‘‘Wal-Mart trumps Moore’s law’’, Technology Review, Vol. 105 No. 2, p. 21.
Schultze, U. and Leidner, D. (2002), ‘‘Studying knowledge management in information systems resarch:
discourses and theoretical assumptions’’, MIS Quarterly, Vol. 26 No. 3, pp. 213-42.
Siegele, L. (2002), ‘‘The real-time economy: how about now?’’, CFO (The Economist), February 1,
available at: www.cfo.com/printarticle/0,5317,6651%7C,00.html
Sliwa, C. (2003), ‘‘Event-driven architecture poised for wide adoption’’, Computerworld, May 12, p. 8.
Stewart, T.A. (2000), ‘‘How Cisco and Alcoa make real time work’’, Fortune, May 29.
Stewart, T.A. and Kaufman, D.C. (1995), ‘‘Getting real about brainpower’’, Fortune, December 11.
Strassmann, P. (1997), The Squandered Computer: Evaluating the Business Alignment of Information
Technologies, Information Economics Press, New Canaan, CT.
Strassmann, P. (2003), ‘‘Enterprise software’s end’’, Computerworld, May 12, p. 35.
Susarla, A., Liu, D. and Whinston, A.B. (2002), ‘‘Peer-to-peer knowledge management’’, in Holsapple, C.W.
(Ed.), Handbook on Knowledge Management 1: Knowledge Directions, Springer-Verlag, Heidelberg,
pp. 129-40.
Terreberry, S. (1968), ‘‘The evolution of organizational environments’’, Administrative Science Quarterly,
Vol. 12, pp. 590-613.
Thickins, G. (2003), ‘‘Utility computing: the next new IT model’’, Darwin Magazine, April, available at:
www.darwinmag.com/read/040103/utility.html
Tsui, E. (2002), ‘‘Tracking the role and evolution of commercial knowledge management software’’,
in Holsapple, C.W. (Ed.), Handbook on Knowledge Management 1: Knowledge Directions,
Springer-Verlag, Heidelberg, pp. 5-27.
Verton, D. (2002), ‘‘Insiders slam navy intranet’’, Computerworld, May 27, pp. 1-16.

Wei, C., Piramuthu, S. and Shaw, M.J. (2002), ‘‘Knowledge discovery and data mining’’, in Holsapple, C.W.
(Ed.), Handbook on Knowledge Management 1: Knowledge Directions, Springer-Verlag, Heidelberg,
pp. 157-92.
Wolpert, D.H. (2001), ‘‘Computational capabilities of physical systems’’, Physical Review E, Vol. 65 No. 1,
pp. 1-27, available at: www.santafe.edu/sfi/publications/Working-Papers/96-03-008.pdf
Yuva, J. (2002), ‘‘Knowledge management – the supply chain nerve center’’, Inside Supply
Management, Institute for Supply Management, July, pp. 34-43, available at: www.brint.org/
KnowledgeManagementTheSupplyChainNerveCenter.pdf
Zack, M.H. (2001), ‘‘If managing knowledge is the solution, then what’s the problem?’’, in Malhotra, Y.
(Ed.), Knowledge Management and Business Model Innovation, Idea Group Publishing, Hershey, PA.
Zetie, C. (2003), ‘‘Machine-to-machine integration: the next big thing?’’, Information Week, April 14,
available at: www.informationweek.com/story/showArticle.jhtml?articleID ¼ 8900042

j j
PAGE 28 JOURNAL OF KNOWLEDGE MANAGEMENT VOL. 9 NO. 1 2005
Balancing business process with business
practice for organizational advantage
Laurence Lock Lee

Abstract
Purpose – To provide an argument and a practical approach for achieving a balance between business process optimization and the use of human-centred business practices.
Design/methodology/approach – The concepts of business process and business practice are positioned in the academic literature with related concepts like tacit and explicit knowledge, routine work, codification and bounded rationality. Process and practice are compared and contrasted prior to the development of a model for their co-existence and interaction.
Research limitations/implications – This research builds on the separate research streams supporting business process management and business practice development. The argument for their co-existence still requires further field research to support the organizational advantages claimed.
Practical implications – A framework and approach are presented which can be applied directly as part of new field research or practical application.
Originality/value – This paper makes two original contributions. First, it anchors the modern concepts of business process and business practice to foundation concepts from the academic literature. Second, it provides a practical framework and approach for balancing business process and business practice that can be practically applied by the reader.
Keywords Knowledge management, Process management
Paper type Research paper

Laurence Lock Lee is the Principal Knowledge Management Consultant with Computer Sciences Corporation, Australia. E-mail: llocklee@csc.com.au

Introduction
The purpose of this paper is to champion the cause of John Seely Brown and Paul Duguid in
their pleas to not lose sight of the inherent value of business practices formed from the tacit
understanding of knowledge workers. Seely Brown and Duguid’s (2000) short paper on
‘‘Balancing act: how to capture knowledge without killing it’’ introduces the challenge of
balancing business processes with business practice. This paper aims to provide added
weight to the argument by positioning it within the academic literature. A connection will be
briefly built to foundational theories of ‘‘bounded rationality’’ (Simon, 1979) and ‘‘evolutionary
theory of economic change’’ (Nelson and Winter, 1982) and the general tacit knowledge
versus explicit knowledge discussion. Having established a foundational argument for a
dual focus on both business process and business practice, the paper moves on to provide
a practical framework for identifying and managing the balance between the two. The use of
the framework is illustrated with case study examples.

Foundation concepts
The concepts of codification, explicit knowledge, tacit knowledge, routine work, processes,
and practices are not new, but still engender a degree of confusion through their different
interpretations. It is crucial to distinguish ‘‘process’’ from ‘‘practice’’ if one is to attempt to
operationalize these concepts. In this paper it is argued that ‘‘process’’ is strongly
associated with concepts like ‘‘explicit knowledge’’, ‘‘routine’’ and ‘‘codification’’, while
‘‘practice’’ has similarly strong associations with ‘‘tacit knowledge’’, ‘‘heuristics’’ and
‘‘non-codification’’. Figure 1 provides a positioning of the business practice/business
process argument in the literature by tracking three key themes through representative
publications.

DOI 10.1108/13673270510582947 VOL. 9 NO. 1 2005, pp. 29-41, © Emerald Group Publishing Limited, ISSN 1367-3270 | JOURNAL OF KNOWLEDGE MANAGEMENT | PAGE 29
Figure 1 Positioning within the literature

Figure 1 identifies three themes culminating in the practice/process dialogue. The three
themes can be traced back to Simon’s theory of ‘‘bounded rationality’’ (Simon, 1979). This
theory identifies the limitations within which managers can employ rational decision-making
techniques. Rational decision-making implies an ability to make explicit the ‘‘process’’ of
decision-making. Outside the bounds of rationality managers will rely on intuition and
emotion to guide their decision-making (Simon, 1987).
The first theme traces Simon’s work through to the field of artificial intelligence and
knowledge based systems. Within this field the CYC project, in development since 1984, is
aimed at developing a system for storing commonsense knowledge and stands out as the
most ambitious attempt to codify knowledge (Lenat, 1995). Efforts to identify standard means
of codifying knowledge evolved during the late 1980s, supported by the European ESPRIT
collaborative research program (Hickman et al., 1989), though these have since stagnated
in favor of addressing knowledge from a more holistic perspective, i.e. knowledge
management (KM). The KM pioneers viewed knowledge from an organizational perspective,
in many cases making the argument for sharing tacit knowledge through socialization
techniques, e.g. communities of practice, rather than blindly attempting to codify tacit
knowledge within large knowledge bases for sharing (Sveiby, 1997; Allee, 1997; Lesser and
Prusak, 1999).

The second theme could be called the business process theme. Nelson and Winter (1982),
in their work on the evolutionary theory of economic change, refer to Simon’s work in
speaking of routine as the distinctive package of economic capabilities and coordinating
functions that a firm possesses and can deploy in a repeatable fashion. For Nelson and
Winter this includes the heuristic problem solving patterns of say a firm’s R&D department.
The linkage between Nelson and Winter’s ‘‘routine’’ and the business process reengineering
(BPR) phenomenon is more implied than explicit, with BPR promoting a focus on processes
or routines that are core to businesses, removing all others that are deemed to be
non-value-adding (Hammer and Champy, 1993). This position evolved to a finer articulation
of classes of business processes, e.g. identity or core processes, priority, mandatory or
background processes (Keen, 1997). The terms ‘‘business process’’ and now ‘‘business
process management’’ (BPM) have been loosely used to identify just about every
activity that a firm participates in. Zairi’s (1997) examination of the literature found that

BPM is far from pervasive, often amounting to no more than structural changes, the use of
systems such as EN ISO 9000, and the management of individual projects. Key features identified with a
process were its predictable and definable inputs; a linear, logical sequence; a clearly
definable set of activities; and a predictable and desired outcome.
The concept of process implies something that is definable, describable and repeatable. In
the context of BPM, we must tighten the specification to the extent that the process must be
describable in a standardized business process language and computationally executable so as to
provide the expected outputs in a repeatable fashion. This tighter specification of process
will similarly require a tightening of the associated terms of explicit knowledge, codification
and routine. An important contribution of the recent BPM initiatives is the creation of standard
languages to describe a business process in computer executable form, e.g. BPML,
BPEL[1]. Languages like BPML provide a link between the typical process designer’s flow
charts, process maps, and executable computer code (Smith and Fingar, 2002).
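To make ‘‘computationally executable’’ concrete, the following sketch (in Python rather than BPML, with invented activity and field names) expresses a toy order-handling process as pure data: definable inputs, a linear sequence of clearly defined activities, and a predictable outcome. A generic engine that knows nothing about the process domain then executes it:

```python
# Illustrative only: a BPML-style process rendered as declarative data,
# executed by a generic engine. All names here are invented for the example.

def validate(order):
    return {**order, "valid": order["qty"] > 0}

def price(order):
    return {**order, "total": order["qty"] * order["unit_price"]}

def confirm(order):
    return {**order, "status": "confirmed" if order["valid"] else "rejected"}

# The "process" itself is definable data: a named, linear sequence of activities.
PROCESS = {"name": "order_fulfilment", "steps": [validate, price, confirm]}

def run(process, inputs):
    """Generic engine: executes any process definition, step by step."""
    state = dict(inputs)
    for step in process["steps"]:
        state = step(state)
    return state

result = run(PROCESS, {"qty": 3, "unit_price": 10.0})
print(result["status"], result["total"])  # confirmed 30.0
```

Because the process definition is data rather than code baked into an application, the same definition can be redesigned, redeployed or analyzed without touching the engine, which is precisely the separation the BPM languages aim for.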
The third theme encompasses the dialog around tacit, explicit and codified knowledge,
which could arguably be seen as a precursor to the KM theme, but has been identified for
individual treatment here. Cowan et al. (2000) put forward an economist’s skeptical
argument that very little knowledge is inherently tacit and that its codification is simply an
argument of a cost/benefit analysis. In proposing this argument, the authors engage in a
discussion around articulation and codification, which converges on a view that what
can be articulated can be codified for economic benefit. Johnson et al. counter Cowan et al.
specifically on the impracticality of the proposition on a number of fronts. The art of
bicycling is used as an example of how attempts at both articulation and codification of
the practice of bicycle riding would rarely be useful to the novice rider, even if they were
economically viable (Johnson et al., 2002). Of course, these economic arguments ignore the
very real sociological issues present. Polanyi considered human knowledge from the
premise that ‘‘we know more than we can tell’’ (Polanyi, 1967, p. 4) with the natural extension
that ‘‘we tell more than we can write down’’ (Snowden, 2002). Snowden adds the further
heuristics that ‘‘knowledge can only be volunteered; it cannot be conscripted’’ and ‘‘we only
know what we know when we need to know it’’ for managing knowledge, in contrast to the
pure economic argument. Looking back to the BPM context, we could extend the analogy
further to ‘‘we can write down more than we can write in BPML’’. Put succinctly, ‘‘we know far
more than we can effectively automate’’, the gap arguably being attributable to business
practice.
Clearly some license has been taken in defining a business practice as the gap between
what a human might know and use and what knowledge can be effectively converted for
execution within a BPM system. A conventional use of the term ‘‘business practice’’ might
refer to a medical or legal practice that would encompass both the tacit understandings and
experiences of the staff within the practice as well as the business processes that the firm
conducts. The more limited usage of the term here is justified by the emphasis the term
connotes around a distinctive expertise developed around extensive work experiences.
In summary, the current argument for balancing business process with business practice
can be traced back to Simon’s theory of bounded rationality. In this paper we are interested
in ‘‘business processes’’ in the context of BPM and therefore its definition is restricted to
processes that can effectively be translated into a pre-defined business process language.
Business practice is conveniently defined as the complement, i.e. those business activities
that fall outside the scope of a business process. The remainder of the paper is devoted to
the more practical aspects of balancing business process with business practice for
organizational advantage.

Business process management


Since the emergence of business process re-engineering we have seen two other influential,
IT-focused, ‘‘waves’’. The first of these was the emergence of ‘‘off-the-shelf’’ ERP systems
such as SAP, PeopleSoft and Oracle. The initial driver for ERP adoption was the need to rapidly
standardize and support those largely non-differentiating business processes such as
finance, human resources, maintenance and the like. The second ‘‘wave’’ was e-business;

with the essential driver being the desire to leverage low cost internet-based technologies to
streamline the way business organizations did business with each other. Despite the
romance and promise of radical new business models, and the aspirations of the ERP
vendors to fulfill that promise, the reality is that the business benefits have mostly materialized
as a consequence of less romantic business process improvement initiatives. Lower costs,
reduced cycle times, more satisfied customers have come from incremental improvements
to the way organizations are conducting their businesses.
However, what the e-business wave has achieved is a realization that an organization’s
business processes do not stop at the front door. They need to extend out to their suppliers,
customers and alliance partners. From the IT perspective this also means extending internal
systems to be externally facing, with the consequent difficulties in mixing and matching with
the plethora of different vendor products that trading partners might use. The IT industry’s
response to this challenge has been the formation of the business process management
initiative (BPMI) with a mission to ‘‘standardise the management of business processes that
span multiple applications, corporate departments and business partners, behind the
firewall and over the internet’’ (see www.BPMI.org). With over 125 industry members, BPMI
is looking at a means to separate the management of business processes from the software
that supports and implements them. The core undertaking has been the development of a
common business process modeling language (BPML), which can enable business
processes to be described and managed independently from the software used to
implement and support them. The analogy has been drawn to SQL, the database query
language which today allows common queries to be described in standard form but
executed against any relational database system.
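The analogy can be made concrete: a query written in standard SQL runs unchanged on any conforming engine. The sketch below uses Python's built-in sqlite3 module as one such engine (the orders table and its rows are invented for illustration); the same QUERY text could equally be sent to any other relational database:

```python
import sqlite3

# Standard SQL: the query text is described independently of the engine that
# executes it, just as BPML aims to make process definitions engine-independent.
QUERY = "SELECT partner, COUNT(*) FROM orders GROUP BY partner ORDER BY partner"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (partner TEXT, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("acme", 120.0), ("acme", 80.0), ("globex", 45.0)])

print(conn.execute(QUERY).fetchall())  # [('acme', 2), ('globex', 1)]
```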
BPM is still in its infancy, as are the products that support it. There are eight identified basic
functions that would comprise a business process management system (www.BPMI.org):
(1) Process discovery: finding out how things are actually done.
(2) Process design: modeling, simulating and redesigning a process.
(3) Process deployment: distributing the process to all participants.
(4) Process execution: ensuring the process is carried out by all.
(5) Process maintenance: resolving exceptions, adaptations.
(6) Process interaction: allowance for human interaction with the process.
(7) Process optimization: process improvement.
(8) Process analysis: measuring performance and devising improvement strategies.
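Read together, these eight functions suggest the rough shape of a BPM system's interface. The Python skeleton below is only an illustrative sketch of that shape: the method names mirror the BPMI list, while the bodies are deliberately minimal placeholders rather than a real engine.

```python
class BPMSystem:
    """Illustrative sketch of the eight BPMI functions; not a real engine."""

    def __init__(self):
        self.processes = {}  # deployed process models, keyed by name
        self.log = []        # execution records kept for later analysis

    def discover(self, observations):
        """(1) Discovery: infer how things are actually done from observed steps."""
        return {"steps": sorted(set(observations))}

    def design(self, model):
        """(2) Design: model/simulate/redesign before deployment (pass-through here)."""
        return dict(model)

    def deploy(self, name, model):
        """(3) Deployment: distribute the process to all participants."""
        self.processes[name] = model

    def execute(self, name, data):
        """(4) Execution: carry out a deployed process and record the run."""
        self.log.append(name)
        return {"process": name, "input": data, "steps": self.processes[name]["steps"]}

    def maintain(self, name, patch):
        """(5) Maintenance: resolve exceptions by adapting the deployed model."""
        self.processes[name].update(patch)

    def interact(self, name, human_input):
        """(6) Interaction: route human input through normal execution."""
        return self.execute(name, human_input)

    def optimize(self, name, improved_steps):
        """(7) Optimization: replace the step sequence with an improved one."""
        self.maintain(name, {"steps": improved_steps})

    def analyze(self):
        """(8) Analysis: measure performance across recorded executions."""
        return {"executions": len(self.log)}

bpm = BPMSystem()
bpm.deploy("orders", bpm.design(bpm.discover(["receive", "approve", "ship", "approve"])))
result = bpm.execute("orders", {"order_id": 1})
```

In a real BPM suite each placeholder would front substantial machinery (simulation, work-lists, monitoring dashboards), but the division of labor follows the eight functions above.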

Business practice
A business practice is seen as a frequently repeated act, habit or custom performed to a
recognized level of skill. It is often thought of as the uncodified ‘‘know-how’’ resulting from
human experience, improvisation and innovation.
One of the key benefits attributed to KM has been the ability to share best practices across
large organizations. While there are clearly some great success stories to tell around how
knowledge has been leveraged around the sharing of practices across large organizations,
there are a far greater number of stories around how elusive the benefits can be from
attempts to share best practices. Different contexts, the ‘‘not invented here’’ syndrome, ‘‘our
business is different’’ have all been offered as barriers to achieving success. The reality is
that unless you are operating under a franchised business model, then there will be real
differences across the businesses. Seely Brown (2001) has gone as far as to claim that the
proportion of business practices that can be formally codified in process form is really only
the tip of the iceberg, and that the vast majority of ‘‘knowledge’’ encompassed in a
successful practice is uncodified and held tacitly in the minds of the staff performing the task
(Figure 2).

Figure 2 The explicit/tacit process/practice divide

The challenge lies in judging whether a ‘‘practice’’ is truly transportable across the different
business environments; and this ability appears to be largely held in the collective judgment
of expert practitioners who have a view across the different business domains. Hence the
important role that communities of practice or expert networks play in facilitating the effective
sharing of best practices.

Business process versus business practice


Business processes and business process re-engineering gained much prominence and
some notoriety in the early 1990s as companies were challenged to break out of their
traditional indoctrinated ways of doing business. A typical re-engineering process would
start with a ‘‘mapping’’ of current business processes and then an intense assessment of
which non-value adding processes could be ‘‘obliterated’’. With the increased challenges of
globalization, and commoditization, re-engineering is now re-emerging in the form of BPM
(Smith and Fingar, 2002). The drive for BPM is coming from organizations wanting to engage
in inter-enterprise collaboration, instigating a demand for a common way to implement
inter-enterprise business processes that is independent of the technology used to support
them. Fundamental to the BPM concept is a standard business process modeling language
(BPML), which is designed to enable companies to jointly develop business processes with
their partners and collaborators, without the need to enforce a common technology platform,
e.g. SAP, PeopleSoft, Oracle (see www.BPMI.org).
Business practices are often not explicit, but couched tacitly in the minds of the employees
who conduct them. Etienne Wenger (1999), in his research on ‘‘communities of practice’’,
uses an example of health claim processing. One might believe that these processes could
be easily codified; yet the uncodified tacit understanding of the different claims processors
substantially separates good and bad performance. While the written procedures for health
claim processing were designed for individuals, the reality was that claims processors had
to form themselves into a tight social ‘‘community of practice’’ to effectively deal with the
contradictions, gaps and ambiguities that inevitably exist in written procedures. Wenger’s
work is strongly supported by John Seely Brown and Paul Duguid (2000) in their plea for not
ignoring business practice in the rush to automate processes. Johnson (2002) argues for
distinguishing tacit from explicit knowledge and devising strategies to manage them
independently. Johnson aligns explicit knowledge with intellectual property and knowledge
stocks and tacit knowledge with interactive or facilitated knowledge processes or
knowledge work. For every re-engineering success story there appeared to be many more
failed attempts. There are many opinions about why re-engineering efforts fail, from a lack of
appreciation of organizational culture, power and structures (Cao et al., 2001) to simply poor
integration or implementation (Al-Mashari et al., 2001). A common experience in the rush to
obliterate non-value adding processes, was the oversight of the subtle uncodified activities

that skilled and experienced employees perform, i.e. the people factor was often overlooked
(White, 1996).
Clearly practice and process need to co-exist. In most business processes, as we start to
analyze them more closely, we typically find more ‘‘art’’ or tacitly held practices than we had
anticipated. However, the economic push for lower costs, faster response and quicker
product launches will inevitably mean a growing need to codify and automate more
‘‘practices’’ through a greater focus on activities like BPM. At the same time, KM practices
will be required both to achieve a common understanding of the intent behind codified processes, and to
generate ideas and innovations required for continuous business process improvement.

Rules for co-existence


To summarize, the differences between process and practice are characterized in Table I.
The important point in trying to achieve an appropriate balance between process and
practice is to know which areas of tacit knowledge are the best candidates for making
explicit, and which should not even be attempted. Significant guidance can be gained from
the artificial intelligence/expert systems discipline in this regard. A key lesson is that
knowledge acquisition and representation can be particularly difficult and complex. Some attempts
have been made to develop a standard method for knowledge acquisition and
representation. Perhaps the best known of these is the knowledge acquisition
documentation and structuring (KADS) methodology for developing knowledge based
systems (Hickman et al., 1989), initially launched as a European Co-operative Research
project (ESPRIT). KADS could be viewed as the knowledge equivalent of BPM. Eventually,
the complexity of the different models required for KADS to be effective worked against its
larger-scale adoption, and little is seen of it now. More recently, Papavassiliou and Mentzas
(2003) have explored a modeling approach to the integration of knowledge management
into weakly structured business processes.
Expert systems are arguably the most sophisticated means for capturing tacit knowledge
and making it explicit. Yet the majority of successful expert systems deployed over the
past 20 years have been in well-defined and constrained areas like fault
diagnosis, credit assessments, schedule checking, and process control, and have largely
failed in areas requiring some creative thinking like business planning, schedule creation,
and new product development. In summary, there are definitely limits to the extent to which
one can practically make tacit knowledge explicit. These limits are both in terms of our ability
to accurately represent the knowledge in explicit form and practical limits on the ‘‘knowledge
engineering’’ time it would take to achieve such a representation, if indeed it were possible.
Most business processes found within organizations are simply documented in ‘‘rules and
procedures’’ manuals that are distributed with an expectation that they will be consistently
understood and applied. For anything other than simple routine tasks this is a dangerous
assumption. First, for complex processes the business process designer has the challenge
of accurately representing his or her tacit understanding of the business process intent in
explicit written form. Secondly, those expected to perform the process will internalize their

Table I Process vs practice

Process                            Practice

The way tasks are organized        The way tasks are done
Routine                            Spontaneous
Orchestrated                       Improvised
Assumes a predictable environment  Responds to a changing, unpredictable environment
Relies on explicit knowledge       Relies on tacit knowledge
Linear                             Network or web-like

Source: Seely Brown and Duguid (2000)

understanding of the written process, with significant scope for this understanding to be
quite different than the intent of the designer. This is where KM practices can assist in
developing a common understanding of the business process intent by connecting
designers and performers of the business process. This socialization process will eventually
evolve into a common business practice around the business process.
Figure 3 shows two cycles of process/practice interaction. The inner cycle is the ‘‘shared
understanding’’ cycle. It starts with the process designer documenting the business
process (tacit to explicit knowledge conversion). For a complex process one could argue
that the document might represent less than 30 percent of what the designer actually
understands about the process. The process performer is then expected to internalize this
knowledge from the document (explicit to tacit conversion) that likewise for a complex
process, might be a 30 percent efficient process. Therefore the degree of common
understanding between process designer and process performer could be less than 10
percent[2]. To improve the level of common understanding, socialization (tacit to tacit
knowledge transfer) processes are required. The research literature strongly supports the
value of networks in facilitating the sharing of tacit knowledge (Augier and Vendelo, 1999;
Hansen, 1999; Powell, 1998; Lesser and Prusak, 1999). Common vehicles for these
socialization processes within organizations are communities of practice, i.e.
cross-organizational groups who form naturally around a common interest or cause (see
Wenger, 1999).
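The compounding of conversion losses in the inner cycle can be sketched numerically. This is an illustrative calculation only: the 30 percent efficiencies are the author's rough field estimates (see note 2), and the function name is ours, not part of the framework.

```python
# Illustrative sketch of the inner "shared understanding" cycle.
# The 30 percent conversion efficiencies are the author's rough field
# estimates (note 2), not measured values; the point is simply that the
# tacit-to-explicit and explicit-to-tacit conversion losses compound.

def shared_understanding(codify_eff: float, internalize_eff: float) -> float:
    """Fraction of the designer's understanding that the performer ends up
    sharing, after the documenting (tacit-to-explicit) and reading
    (explicit-to-tacit) conversions."""
    return codify_eff * internalize_eff

overlap = shared_understanding(0.30, 0.30)
print(f"Common understanding: {overlap:.0%}")  # prints "Common understanding: 9%"
```

The multiplicative model makes the argument concrete: even generous 30 percent efficiencies at each conversion leave designer and performer sharing under 10 percent of the intended understanding, which is what motivates the socialization (tacit-to-tacit) step.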
The outer cycle is the innovation cycle. The improvement cycle for business processes is
triggered by a gap between current and desired performance. Ideas for improvements need
to be solicited, tested and agreed on for implementation. Again the community of practice is
an excellent vehicle for socializing improvement ideas and innovations.

Figure 3 Process/practice interaction

The following example is used to illustrate how process and practice inter-relate:
Mary works for a government welfare agency involved with aged care. Because of her long
experience with assessing candidates for government assistance for their nursing home care,
she is asked to write the new manual to assist field case-workers apply a consistent process in
assessing candidates for care. In preparing the manual, Mary struggles with trying to articulate all
the factors she would personally apply in her assessments. However, she perseveres and
distributes her draft manual for initial use. In reviewing how the case-workers interpret her
procedures in use, she is somewhat surprised at the variety of interpretations they have taken.
She quickly appreciates that it is not possible to capture in writing her full intent. To help develop a
common understanding of the procedures with her case-workers she decides to form an informal
community of practice around the aged care assessment processes. The group meets monthly to
discuss their experiences with the assessment processes and agree on a common best
‘‘practice’’ interpretation of the documented ‘‘rules’’.
Over time, the community grows and distributes across all the department offices state-wide.
Group meetings start to become a forum for discussing improvement ideas. Ideas discussed at a
local office forum are shared with the core community, who decides whether or not to include the
suggested improvements into the process manual.
As new case workers come into the department they are provided with the assessment
‘‘process’’ manual to help guide their assessment work. More importantly they are also
introduced to the aged care assessment community of practice, from which they will gain the
important ‘‘practice’’ knowledge required to effectively perform their duties.

A framework for balancing process with practice


The framework details how to manage the business practice/business process balance at
the application level. When we look closely at defined business processes, we will see many
instances where human intervention is required. At times this intervention may be replaced
by an automated response. In other instances it may not be cost effective or even possible to
do so. In any case, once we have human interaction taking place we need to respect the
different roles that humans can take in the overall business process as defined. We can draw
guidance here from the human factors discipline, which has developed ‘‘task analysis’’
techniques for designing effective interfaces between knowledge based human tasks and
programmed process tasks (Diaper, 1989). While these techniques are perhaps too involved
for the casual user, the principles for carefully managing the interface should be adhered to.
As an example, let’s take a look at a typical order entry process (Figure 4).

Figure 4 Framework example

The business process can be mapped as shown. Key decisions can also be identified from
each of the processes. Depending on the industry you are in, the level of human intervention
will change from virtually none, to intimately involved in the whole process. The order entry
officer will either be simply performing clerical tasks that might easily be automated, or
playing the role of a business manager, negotiating each order on an individual basis. For
example, Amazon.com has demonstrated that the order entry process can be almost totally
automated for commodities like books. However, for companies who create a unique
product with every job, like a construction company, the order entry process and staff are
viewed as critical to the business. Major cost savings are available for those businesses that
can reduce the level of human intervention. We therefore see firms trying to standardize their
offerings with defined pricing models and delivery mechanisms. On the other hand,
customers are now becoming more demanding and are looking for more personalized
attention, for which they are often willing to pay a premium. Ultimately we need to
understand the nature of the decisions that the order entry officer needs to make.
Schroeder and Benbasat (1975) provide a characterization based on their experiments with
decision-makers across environments of varying complexity (Figure 5). The experiments
demonstrate that there exists a point where further information is of diminishing value in
supporting a decision when the environment is complex. For example, using the order entry
example, if your company were in the business of making aircraft and a request came in to
build the first space shuttle to fly to Mars, the decision whether to accept the order would be
largely judgmental. We therefore need a framework that takes into account the complexity of
the decision at hand. In essence, this is determining where the ‘‘bounds of rationality’’ lie.
Decision complexity can then be used to determine what the appropriate mix should be of
business process response and business practice response (Figure 6).
The process proposed is summarized in Figure 7.
Using the above framework implies a requirement to explicitly identify and categorize
decisions that need to be made. It also implies a more systematic and disciplined approach
to decision-making, something that does not come naturally. The need for more systematic
decision support processes is strongly supported by the findings of Kahneman et al.
(1982) in their studies on bias in human decision making. Their studies clearly demonstrated
how decisions based on ‘‘gut feel’’ can be unintentionally impacted by human bias, leading
to clearly erroneous decisions.
Additional investment in time will be required to characterize decisions as routine,
informational or judgmental and to support them appropriately. This will be justified through
time saved in not debating decisions that should be routine, and through avoiding the poor
outcomes that more complex decisions can produce when appropriately expert staff are not
involved (Figure 8).
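The categorization step above can be sketched as a simple routing rule. The routine/informational/judgmental categories follow the framework; the specific responses returned are hypothetical illustrations of the process/practice mix, not prescriptions from the framework itself.

```python
# Hypothetical sketch: routing a decision to a process- or practice-based
# response according to its complexity category. The category names follow
# the framework in the text; the recommended responses are illustrative.

from enum import Enum

class Complexity(Enum):
    ROUTINE = "routine"              # predictable; candidate for codification
    INFORMATIONAL = "informational"  # resolvable with better information
    JUDGMENTAL = "judgmental"        # beyond the bounds of rationality

def recommended_response(complexity: Complexity) -> str:
    """Suggest the process/practice mix for a decision of given complexity."""
    if complexity is Complexity.ROUTINE:
        return "automate via a defined business process (e.g. a BPM workflow)"
    if complexity is Complexity.INFORMATIONAL:
        return "defined process supported by information and decision-support tools"
    return "refer to expert practitioners via a community of practice"

print(recommended_response(Complexity.JUDGMENTAL))
```

The design point is simply that the routing must be explicit: once a decision is classified, debate over routine decisions is avoided and judgmental decisions reliably reach expert staff.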

Figure 5 Decision complexity

Figure 6 Balancing process and practice

Figure 7 Balancing process and practice around decisions

Figure 8 Case study

Summary
We have seen major progress made in the formalization of work into manageable business
processes. BPR has had some undoubted success but also a fair share of failures. We have
learnt some hard lessons about the limits to which one can effectively codify, or make
explicit, the tacit knowledge of expert practitioners. The KM discipline has provided us with a
multitude of techniques for ‘‘managing’’ largely knowledge-based business practices. The
challenge has been to determine how to manage the delicate balance between business
process and business practice for optimal performance. For business processes to be
effectively deployed, they must be
surrounded by a healthy dose of business practice. A two-cycle model of interaction
between process and practice was described. An inner cycle showed how a common
understanding between process designer and process performer could only be achieved
through their socialization, typically through informal communities of practice. The outer
cycle showed the key role communities of practice also played in progressing ideas for
process innovations and improvements.

Moreover, an analytic framework has been provided to assist in achieving an appropriate
process/practice balance for maximum organizational advantage. This framework
articulates decisions at the application level. It provides a method for determining
decision complexity and then indicates the business process or business practice based
techniques recommended for supporting these decisions.

Notes
1. The two main contenders are BPML (see www.BPMI.org) and BPEL4WS (see www.oasis-open.org).
2. These percentages are based on the author’s experiences in the field.

References
Allee, V. (1997), The Knowledge Evolution: Expanding Organizational Intelligence,
Butterworth-Heinemann, Oxford.
Al-Mashari, M., Irani, Z. and Zairi, M. (2001), ‘‘Business process reengineering: a survey of international
experience’’, Business Process Management, Vol. 7 No. 5, pp. 437-55.
Augier, M. and Vendelo, M.T. (1999), ‘‘Networks, cognition and management of tacit knowledge’’,
Journal of Knowledge Management, Vol. 3 No. 4, pp. 252-61.
Cao, G., Clarke, S. and Lehaney, B. (2001), ‘‘A critique of BPR from a holistic perspective’’, Business
Process Management, Vol. 7 No. 4, pp. 332-9.
Cowan, R., David, P. and Foray, D. (2000), ‘‘The explicit economics of knowledge codification and
tacitness’’, Industrial and Corporate Change, Vol. 9 No. 2, pp. 211-53.

Diaper, D. (1989), ‘‘Task analysis for knowledge descriptions (TAKD): the method and an example’’, Task
Analysis for Human-Computer Interaction, Ellis Horwood, New York, NY, pp. 108-59.
Hammer, M. and Champy, J. (1993), Reengineering the Corporation: A Manifesto for Business
Revolution, HarperCollins, New York, NY.

Hansen, M.T. (1999), ‘‘The search-transfer problem: the role of weak ties in sharing knowledge across
organizational subunits’’, Administrative Science Quarterly, Vol. 44 No. 1, pp. 82-111.
Hickman, F., Killin, J., Land, L., Mulhall, T., Porter, D. and Taylor, R. (1989), Analysis for
Knowledge-Based Systems: A Practical Introduction to the KADS Methodology, Ellis Horwood,
Chichester.
Johnson, B., Lorenz, E. and Lundvall, B-Å (2002), ‘‘Why all the fuss about codified and tacit
knowledge?’’, Industrial and Corporate Change, Vol. 11 No. 2, pp. 245-62.
Johnson, W. (2002), ‘‘Leveraging intellectual capital through product and process management of
human capital’’, Journal of Intellectual Capital, Vol. 3 No. 4, pp. 415-29.
Kahneman, D., Slovic, P. and Tversky, A. (1982), Judgement under Uncertainty: Heuristics and Biases,
Cambridge University Press, Cambridge.
Keen, P. (1997), The Process Edge: Creating Value Where it Counts, Harvard Business School Press,
Boston, MA.
Lenat, D. (1995), ‘‘CYC: a large-scale investment in knowledge infrastructure’’, Communications of the
ACM, Vol. 38 No. 11, pp. 33-8.
Lesser, E. and Prusak, L. (1999), Communities of Practice, Social Capital and Organisational
Knowledge, IBM Institute of Knowledge Management, Boston, MA.
Nelson, R. and Winter, S. (1982), An Evolutionary Theory of Economic Change, Harvard University Press,
Boston, MA.
Papavassiliou, G. and Mentzas, G. (2003), ‘‘Knowledge modeling in weakly-structured business
processes’’, Journal of Knowledge Management, Vol. 7 No. 2, pp. 18-33.

Polanyi, M. (1967), The Tacit Dimension, Doubleday, New York, NY.


Powell, W.W. (1998), ‘‘Learning from collaboration: knowledge and networks in biotechnology and
pharmaceutical industries’’, California Management Review, Vol. 40 No. 3, pp. 228-40.

Schroeder, R. and Benbasat, I. (1975), ‘‘An experimental evaluation of the relationship of uncertainty in
the environment to information used by decision makers’’, Decision Sciences, Vol. 6 No. 3, pp. 556-67.
Seely Brown, J. (2001), ‘‘Sharing knowledge across the organisation: knowledge dynamics and
emerging corporate landscape for the age’’, CSC CIO Forum, August.
Seely Brown, J. and Duguid, P. (2000), ‘‘Balancing act: how to capture knowledge without killing it’’,
Harvard Business Review, May-June, pp. 3-7.
Simon, H.A. (1979), ‘‘Rational decision making in business organizations’’, The American Economic
Review, Vol. 69 No. 4, pp. 493-513.
Simon, H.A. (1987), ‘‘Making management decisions: the role of intuition and emotion’’, The Academy of
Management Executive, Vol. 1, pp. 57-64.
Smith, H. and Fingar, P. (2002), Business Process Management: The Third Wave, Meghan-Kiffer Press,
Tampa, FL.
Snowden, D. (2002), ‘‘Complex acts of knowing: paradox and descriptive self-awareness’’, Journal of
Knowledge Management, Vol. 6 No. 2, pp. 100-11.
Sveiby, K. (1997), The New Organizational Wealth: Managing and Measuring Knowledge-Based Assets,
Berrett-Koehler, San Francisco, CA.

Wenger, E. (1999), Communities of Practice, Cambridge University Press, Cambridge.


White, J. (1996), ‘‘Re-engineering gurus take steps to remodel their stalling vehicles’’, Wall Street
Journal, 26 November, p. 1.
Zairi, M. (1997), ‘‘Business process management: a boundaryless approach to modern
competitiveness’’, Business Process Management Journal, Vol. 3 No. 1, pp. 64-80.

The inseparability of modern knowledge
management and computer-based
technology
Clyde W. Holsapple

Abstract
Purpose – This paper makes the case that modern knowledge management (KM) is inseparable from a
consideration of technology. While recognizing that there are many non-technological facets to KM
research and practice, it takes issue with the perspective proposed by some that knowledge
management has little or nothing to do with technology. Similarly, the perspective that equates
knowledge management with information management is challenged.
Clyde W. Holsapple is a Professor at the School of Management, Gatton College of
Business and Economics, University of Kentucky, Lexington, KY, USA. E-mail:
cwhols@uky.edu

Design/methodology/approach – The research method involves an analysis of the contrasting
perspectives to show that each has blind spots that obscure a clear vision of the relationship between
computer-based technology and knowledge management. Building on the ideas of Newell, van
Lohuizen, and others, the research advances an alternative perspective to overcome limitations in the
other two.
Findings – The KM perspective introduced here neither dismisses technology nor identifies with it.
From this perspective, this paper develops the contention that modern KM has been tremendously
enriched by advances in computer-based technology (CBT), discussing several specific examples.
Moreover, this paper concludes that CBT needs to be grounded in a clear, deep consideration of
knowledge management.
Research limitations/implications – As this is a relatively new perspective, the full extent of its utility
will unfold over time as it is adopted, used, and extended. KM researchers can adopt this perspective to
guide the conception and design of their research projects. Moreover, several implications for business
computing systems researchers are outlined.
Practical implications – The new perspective offers students and practitioners a middle-ground
between two extremes for framing their understanding and observation of KM and CBT phenomena.
Originality/value – Both research and practice are shaped by the conceptions that underlie them. The
paper furnishes a fresh, inclusive conception of the relationship between KM and CBT.
Keywords Computers, Decision support systems, Electronic commerce, Knowledge management
Paper type Conceptual paper

As a field of study and practice, knowledge management is here to stay. Yet, it is still in
a formative stage, marked by differences in terminologies, emphases, and
boundaries. This paper focuses on one of those boundaries: the relationship
between knowledge management (KM) and computer-based technology. It advocates a
perspective of the boundary that neither excludes technology, nor identifies with it. This is an
inclusive perspective based on a conception of knowledge that recognizes multiple
knowledge types (descriptive, procedural, reasoning), multiple gradations of knowledge,
and diverse processors of diverse knowledge representations.
Views on the relationship between KM and computer-based technology are wide-ranging.
Some say that there is little or no relationship. Some contend that any such relationship is
largely incidental. In contrast, others tend to use the terms information and knowledge
interchangeably, seeing information technologies and systems as being at the core of
knowledge management. Still others take positions between these poles. This diversity of
perspectives spans the KM literature, being expressed with varying degrees of overtness.
Adopting some view on the KM-technology relationship is unavoidable for the KM
practitioner, researcher, or student (even though it may be done implicitly). The adopted

This research was supported in part by the Kentucky Initiative for Knowledge Management,
established in 1988 at the University of Kentucky.

PAGE 42 JOURNAL OF KNOWLEDGE MANAGEMENT VOL. 9 NO. 1 2005, pp. 42-52, Emerald Group Publishing Limited, ISSN 1367-3270 DOI 10.1108/13673270510582956
view is significant because it shapes one’s ability to appreciate KM issues, opportunities,
challenges, and possibilities.
This paper considers the role of technology in knowledge management. In so doing, it
stakes out a position that there is neither a barrier that differentiates information from
knowledge, nor can the terms knowledge and information be used interchangeably. Building
on this, it argues that computer-based technology (CBT) is essential to an understanding
and application of modern knowledge management. Furthermore, it concludes that
knowledge management forms the rationale and intellectual basis for studying
computer-based technology and systems. That is, KM is the ground on which
technological advances grow, giving such advances sustenance, relevance, and a raison
d’etre.
Exploration of how technology can complement and mesh with human knowledge handling
is where CBT researchers have added and can continue to add value to the knowledge
management movement. This paper considers several examples of CBT that have been
integral to improved knowledge handling: electronic commerce systems, the Deep Blue
system, decision support systems, and research support systems. It also identifies and
discusses several areas where CBT research has a potential to make further contributions to
the KM field.

Boundary perspectives
As background for exploring the KM-CBT boundary, consider two contrasting perspectives:
exclusion and identification. The exclusive perspective sees knowledge management as
being a strictly human and social phenomenon. It sees the representation and usage of
knowledge as being exclusively a human endeavor. In sharp contrast, the identification
perspective views knowledge management as mainly a re-naming of computer-based
technology’s various monikers and variants such as data processing (DP) systems,
information systems (IS), information technology (IT), enterprise resource planning (ERP)
systems, intranet systems, data warehousing, and so forth.

The exclusive perspective


In KM conference presentations, articles, and web sites, it is not uncommon to encounter the
perspective that knowledge management has little or nothing to do with technology. In this
perspective, knowledge management is about human relationships, interpretations,
processes, resources, and culture. Certainly, this is the case. However, the exclusive
perspective goes a step further to explicitly or even emphatically exclude CBT from the KM
domain. If it is to be considered at all, CBT is nothing more than an enabler to facilitate the
practice of KM. Curiously, by their very natures, ‘‘enabling’’ and ‘‘facilitating’’ are hardly
incidental; they are at least important, if not crucial.
So, from the exclusive perspective, there is a well-defined boundary between KM and
computer-based technology, in the sense that KM has little or nothing to do with technology
and CBT is only concerned with information or data, but never with knowledge. In its extreme
form, this perspective not only sees an impenetrable barrier between CBT and knowledge
management, but even denies the existence of KM as anything more than a fad, buzzword,
or label for managing work practices (e.g. Wilson, 2002). However, the exclusive
perspective’s mainstream is ably represented by Galliers and Newell (2003) who ‘‘eschew
IT-enabled knowledge management, both in theory and in practice.’’ They argue that the
information technology research community has little to contribute to the development of the
KM movement.

It is useful to examine the roots of the exclusive perspective on KM. It appears to stem from a
conception of knowledge that precludes the relevance of technology. It does so by defining
knowledge as uniquely in the domain of human or social processing; if a computer does
something, then knowledge cannot be involved. In defining away any role for technology in
KM, the exclusive perspective labels the storage, generation, application, and distribution
activities of computers as ‘‘data’’ management or ‘‘information’’ management, while
reserving the KM term for activities performed by humans (‘‘information’’ becomes
‘‘knowledge’’ when it is processed in a human mind). Such labeling ignores other important
and long-recognized abilities of CBT: storing, generating, applying, and distributing
procedures and logic, as well as state descriptions such as data/information (Bonczek et al.,
1981).
Suppose a person makes a forecast based on his/her interpretation and analysis of a current
situation. This forecast is the person’s ‘‘justified true belief’’ (i.e. it is held to be true with some
appreciable degree of confidence) – his/her knowledge of what may happen. In this
forecasting exercise, knowledge is being used and produced. The knowledge that is used
may be of various types (e.g. situation descriptions, experience-based procedures,
reasoning logic). Historically, middle managers and staff assistants spent considerable
effort in these sorts of tasks before the advent of computers and decision support systems.
Now, what if a computer system does what the person did in years past (perhaps even being
modeled on that person’s methods), or does some portion of what was heretofore called
knowledge work?
Should we say that knowledge is not being used or produced, preferring to call it information
or data instead? Should we say that this activity is now out of the KM domain because
technologies have been devised to perform it? Should we say that the system that produced
the same (or even better) forecast is unable to regard it as being true with the same (or even
more accurate) degree of confidence, and to even take action on that ‘‘justified true belief,’’
because knowledge use and products are exclusively in the human domain? Should we say
that in an organization that does not use such CBT, knowledge management is being done,
while simultaneously, in another organization that is using CBT for the same task, we say that
a KM phenomenon is not occurring? If we answer affirmatively to all of these questions, then
we are comfortable with the exclusive view; if we answer negatively to one or more, then an
alternative perspective on the boundary between KM and CBT is in order.

The identification perspective


The identification perspective considers the terms ‘‘knowledge’’ and ‘‘information’’ to be
more or less synonymous. Thus, knowledge management and information management are
interchangeable in this viewpoint. Moreover, computer-based technologies traditionally
referred to as information technologies or information management systems are called
knowledge management technologies. Critics such as Wilson (2002) contend that:
- IT vendors reposition their offerings to latch onto the current fashionability of KM in their
marketing efforts, without really changing those offerings in any fundamental way;
- consultants rename their IT services (and business process re-engineering services) as
KM services to capitalize on the buzz surrounding knowledge management; and
- academicians from IT departments appropriate the KM term as a way to enhance the
topicality of their research and to seize programmatic turf in the offering of KM education.
Indeed, the exclusive perspective may, in part, be a reaction to the identification view. In
opposition to the identification perspective, it seeks to clearly separate the concepts of
knowledge and information, and does so in a way that precludes CBT from consideration in
the study and practice of KM.
By focusing on computer-based technology as the core of KM, the identification view tends
to be unbalanced, giving inadequate attention to the many human elements of KM. These
elements include trust, ethics, incentives, human relations, leadership, culture,
organizational infrastructure, social networks, social capital, creativity and innovation,
strategy, best practices, human competencies, knowledge sharing proficiencies, learning,
and so forth. Clearly, such human aspects of KM deserve careful consideration, as they are
likely to determine how CBT is applied in an organization and whether that application will
have positive impacts.

‘‘ It is no coincidence that the 1990s dramatic rise in KM
development, adoption, and prominence coincided with
advances in CBT connectivity and enterprise support. ’’

PAGE 44 JOURNAL OF KNOWLEDGE MANAGEMENT VOL. 9 NO. 1 2005
By equating knowledge with information, the identification perspective tends to be overly
narrow, giving inadequate attention to the panorama of facets inherent in knowledge
management. These facets include diverse knowledge resources, myriad knowledge
processing possibilities involving activities and flows, and numerous situational influences on
the conduct of knowledge management (Holsapple and Joshi, 2004). Regarding knowledge
as nothing more than information also ignores the rich array of dozens of knowledge attribute
dimensions that characterize any specific instance of knowledge and which deserve careful
consideration by those who engage in knowledge work (Holsapple, 2003).

The role of computer-based technology in knowledge work


Neither the exclusive perspective nor the identification perspective gives a clear vision of the
KM-CBT boundary. They do not offer insight into what is needed along this boundary. They do
not offer hindsight about what has been accomplished at the KM-CBT frontier. They do not
offer foresight into possibilities about how CBT can assist KM. To better understand the role
of computer-based technology in knowledge work, consider a different perspective. In this
third perspective, knowledge is neither equated with information, nor is a barrier built
between them. It is an inclusive perspective that views the boundary between KM and CBT
as highly permeable, and that sees the value in CBT as ultimately coming from its
contribution to KM efforts.

The inclusive perspective


The inclusive perspective is based on a conception of knowledge advanced by Newell
(1982): knowledge is that which is conveyed in usable representations. These
representations include symbolic, visual, audio, mental, digital, behavioral, and other
patterns in time and space. When a specific representation is found to be usable by some
processor, then for that processor it is knowledge. There are, of course, degrees of usability
related to the clarity, meaningfulness, relevance, and significance of a given representation
in a particular situation faced by the processor. There are also many attribute dimensions for
characterizing an instance of knowledge and its degree of usability for a particular
processor may be a function of where it lies on these dimensions (Holsapple, 2003).
Observe that this definition of knowledge does not depend on the nature of its processor. It
does not exclude computer-based processors; nor does it define knowledge in terms of
information (as either being synonymous with information or in contrast with information). For
purposes of the discussion that follows, we adopt Newell’s neutral, but very rich and unifying,
conception of knowledge – admitting the possibility that usable representations exist for
both human processors and computer-based processors.
What, then, is the relationship between Newell’s conception of knowledge and the notion of
information? Machlup (1980) asserts that all information is knowledge, but all knowledge is
not information. According to van Lohuizen (1986), information is one of several states in a
progression of knowledge: data, information, structured information, evaluation, judgment,
and decision. An example of this progression is shown in Table I. A processor can work on
one state in order to achieve another knowledge state higher in the progression. As we
proceed from lower to higher states in the knowledge progression, usability (e.g. relevance,
importance) with respect to a particular situation increases, quality may improve, and
possibilities of overload diminish. Unlike the exclusive perspective that considers only states
above information (or structured information) as being knowledge, van Lohuizen regards all
states in the progression as being knowledge of varying degrees of refinement.

Table I Progression of knowledge states

  Knowledge state (van Lohuizen, 1986)   A sample progression
  Datum                                  240
  Information                            240 is the level of cholesterol
  Structured information                 240 is the current level of cholesterol for John Doe
  An evaluation                          John Doe’s level of cholesterol is now too high
  A judgement                            John Doe’s health is presently in severe jeopardy
  A decision                             John Doe gets a prescription for Lipitor
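Purely as an illustration, van Lohuizen’s progression can be rendered as an ordered series of states that a processor refines step by step. The following sketch is hypothetical; the `State` enumeration and `refine` helper are shorthand invented here, not part of van Lohuizen’s formulation:

```python
from dataclasses import dataclass
from enum import Enum

# van Lohuizen's six knowledge states, ordered from least to most refined
class State(Enum):
    DATUM = 1
    INFORMATION = 2
    STRUCTURED_INFORMATION = 3
    EVALUATION = 4
    JUDGEMENT = 5
    DECISION = 6

@dataclass
class Knowledge:
    state: State
    content: str

def refine(k: Knowledge, new_content: str) -> Knowledge:
    """A processor works on one state to achieve the next state up the progression."""
    return Knowledge(State(k.state.value + 1), new_content)

k = Knowledge(State.DATUM, "240")
k = refine(k, "240 is the level of cholesterol")
k = refine(k, "240 is the current level of cholesterol for John Doe")
print(k.state.name)  # STRUCTURED_INFORMATION
```

Each call to `refine` mirrors a processor working on one state in order to achieve the next knowledge state higher in the progression.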
Each of the knowledge states in Table I belongs to the class of descriptive knowledge. This is
the type of knowledge that describes the nature of some world. It could be a future world, the
past world, the current world, a hypothetical world, or an expected world. The knowledge
states that van Lohuizen discusses are gradations of descriptive knowledge. But, there are
other major types of knowledge not covered by his progression (and typically overlooked by
the exclusive and identification perspectives). These are procedural knowledge and
reasoning knowledge (Bonczek et al., 1981; Holsapple and Whinston, 1987, 1988).

In contrast to representations used to convey the characteristics of some world, there is
knowledge that is concerned with procedures, with how to do something. Philosophers tell
us that this procedural knowledge is very different in nature and function than descriptive
knowledge (Russell, 1948; Ryle, 1949; Scheffler, 1965). A representation that a given
processor can use to accomplish a series of steps is procedural knowledge for that
processor. This could be steps that do something to physical materials (e.g. a recipe,
assembly instructions) or to other knowledge representations. As an example of the latter,
procedural knowledge may be instrumental in progressing from one state of descriptive
knowledge to another (e.g. how to distill information from data, or how to derive evaluations
from information).
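A minimal sketch of that last point, with the `distill_information` procedure being a hypothetical invention for illustration only: procedural knowledge represented as executable steps that a processor uses to progress from one descriptive state to another.

```python
# Hypothetical sketch: procedural knowledge as an executable series of steps
# that distills information (a meaningful statement) from a bare datum.
def distill_information(datum, attribute):
    """Attach meaning to a datum, yielding the next descriptive state."""
    return f"{datum} is the {attribute}"

print(distill_information(240, "level of cholesterol"))
# → 240 is the level of cholesterol
```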

A third important knowledge type is concerned with reasoning (Holsapple and Whinston,
1988). Reasoning knowledge specifies what conclusion is valid when certain circumstances
exist. It is knowledge that is concerned with logic, correlation, synchronicity, analogy, and
perhaps even causality. When a representation is used by a processor to infer why
something is the way it is or what action is appropriate in a particular situation, that
representation is conveying reasoning knowledge to the processor. A processor may use
reasoning knowledge in progressing from one state of descriptive knowledge to another. For
instance, rules may be available that indicate which procedure to employ for distilling
information from data or what line of reasoning will yield a sound decision in a given
situation.
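As a hypothetical sketch (the rule list and `evaluate` function are illustrative inventions, not drawn from the sources cited), reasoning knowledge can be represented as condition-conclusion rules that a processor applies to structured information in order to derive an evaluation:

```python
# Hypothetical sketch: reasoning knowledge as rules stating what conclusion is
# valid when certain circumstances exist.
rules = [
    (lambda facts: facts.get("cholesterol", 0) > 200,
     "John Doe's level of cholesterol is now too high"),
    (lambda facts: facts.get("cholesterol", 0) <= 200,
     "John Doe's level of cholesterol is acceptable"),
]

def evaluate(facts):
    """Apply the first rule whose circumstances hold; its conclusion is the evaluation."""
    for condition, conclusion in rules:
        if condition(facts):
            return conclusion
    return None

structured_information = {"person": "John Doe", "cholesterol": 240}
print(evaluate(structured_information))
# → John Doe's level of cholesterol is now too high
```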
Just as van Lohuizen recognizes gradations of descriptive knowledge (including data and
information), analogous gradations of procedural knowledge and reasoning knowledge are
also recognizable. As Table II indicates, the gradations reflect different degrees of
sense-making that a processor, faced with some situation, can perform for any of the three
knowledge types: making sense of a representation’s syntax, its semantics, its
interrelationships, its validity, its applicability, and its actionability. For instance, at a
semantic level, a processor focuses on the meaning of a descriptive representation (i.e.
information), an algorithmic representation, or a rule representation. At an interrelationship
level, the focus is on dependencies and consistencies among related descriptions,
algorithms, or rules.

Table II Gradations of descriptive, procedural, and reasoning knowledge

  Sense-making focus on                  Progression of           Progression of                      Progression of
                                         descriptive knowledge    procedural knowledge                reasoning knowledge
  Syntax (clarity)                       Datum                    Algorithm syntax                    Rule syntax
  Semantics (meaning)                    Information              Algorithm semantics                 Rule semantics
  Interrelationships (dependencies,      Structured information   Connections and patterns            Relationships among rules
    consistency)                                                    among algorithms                    and sets of rule families
  Validity (correctness, confidence)     Evaluation               Algorithm validity                  Rule and rule set validity
  Applicability (importance, relevance)  Judgement                Algorithm applicability             Rule and rule set applicability
  Choice (actionability)                 Decision                 Algorithm choice                    Rule choice

Compared to either the exclusive or identification perspectives, the inclusive perspective
offers a very different foundation for understanding the relationship between
computer-based technology and knowledge management. It neither dismisses CBT by
defining knowledge to preclude technological representations and processing, nor
minimizes KM by equating it with information management. It admits the possibility that
knowledge from any of the cells in Table II may be represented in ways that are usable by
human and/or computer-based processors. The discussion that follows adopts the
inclusive perspective.

Technology for knowledge management


A basic KM assumption is that an organization’s performance and competitive standing
suffer if it fails to effectively capture/preserve/generate/apply knowledge and make it flow
appropriately within and beyond the organization (Singh, 2000). In the modern world this is
done by augmenting innate human knowledge handling capabilities with computer-based
technology. Limiting ourselves to paper, pencil, typewriters, filing cabinets, shelving
systems, face-to-face meetings, telephone conversations, postal services, and the like is
hardly a recipe for success in the twenty-first century. It is no coincidence that the 1990s
dramatic rise in KM development, adoption, and prominence coincided with (or perhaps
followed in the wake of) advances in CBT connectivity and enterprise support (Holsapple
and Singh, 2000).
The theme for the 2001 International Conference on Information Systems was to explore how
technology can change our lives and our organizations. One significant way that it has done
so has been to transform how knowledge work is actually done. If we look at what
computer-based technology is, it is fundamentally concerned with digital approaches to
representing and processing knowledge of various types and in various gradations. For
descriptive knowledge the gradations – raw data to structured information to problem
solutions for decisions – mirror the evolution of CBT from data processing systems to
management information systems to decision support systems. All of these systems have
dramatically changed the way knowledge work is done in organizations, releasing
tremendous human resources, enabling organizational growth (and necessitating
organizational restructuring), and facilitating improved performance. They have been
instrumental in the rise of the knowledge-based organization (Holsapple and Whinston,
1987; Paradice and Courtney, 1989; Bennet and Bennet, 2003).
Computer-based technology has transformed the way in which individuals and
organizations accomplish knowledge work by amplifying, complementing, leveraging,
and (in some cases) improving on innate human knowledge handling capabilities. Although
efforts at managing knowledge certainly preceded the computer, it has been
computer-based technology that has ushered in the modern era of knowledge
management. In the last few decades, and especially in the last decade, there has been
as much progress in understanding knowledge management and advancing its practice as
occurred in the many preceding centuries that dealt with traditional, conventional,
non-technologically-supported knowledge management. Much of this progress has been
either stimulated by or enabled by advances in computing technology.

Computer-based technology is concerned with the representation and processing of various
distinct types of knowledge. A variety of technologies have been devised to represent and
process reasoning knowledge, procedural knowledge, and the gradations of descriptive
knowledge (Holsapple and Whinston, 1988, 1996; Tiwana, 2000; Tsui, 2003). Four areas for
deploying these technologies are briefly highlighted: e-commerce, the Deep Blue
experience, decision support systems, and research support systems. From different
angles, each illustrates the inseparability of modern knowledge management from
computer-based technology.
Interestingly, the boom in knowledge management coincides with the 1990s boom in
organizational computing: networking, e-commerce (fueled by the web and the internet),
collaborative commerce, and enterprise and trans-enterprise systems. E-business, which
includes e-commerce and collaborative commerce, is concerned with approaches to
achieving business goals in which technological means for representing and processing
knowledge are used to implement, facilitate, and enable the execution of activities within
value chains and across value chains, as well as the decision making that underlies those
activities (Holsapple and Singh, 2000).
As such, knowledge and its management form the lifeblood and linchpin of e-business.
Take e-Bay, for example. This CBT is an electronic marketplace whose fundamental nature
involves the use of digital representations that convey knowledge to market participants
and to the system itself. It acquires knowledge about sellers’ offerings, buyers’
commitments, and participants’ market experiences. It assimilates knowledge by filtering,
screening, and organizing it. It selects assimilated knowledge as needed to satisfy
participants’ requests and to apply in its own internal processing for such activities as
coordinating participant interactions and enforcing rules of conduct. It distributes
knowledge about states of the world (e.g. issues alerts, shows auction status, indicates
participant reputation), about procedures (e.g. how to initiate an auction, how to ensure a
safe trading experience), and about logic (e.g. trading tips, policies). All of this KM is the
essence of what the e-Bay system does, with an actual trading transaction between
participants being practically incidental. This is what would be called knowledge work if
carried out by persons involved in running a physical auction, although without the virtual,
global scope of the CBT market.
As another case of CBT involvement in knowledge management, consider the chess series
between Garry Kasparov and IBM’s Deep Blue. In analyzing this series, Huang (1998)
observes that it was a contest between an individual human’s ability to process his own
considerable tacit knowledge and a computer system’s ability to process collectively
constructed explicit knowledge. Huang contends that the Deep Blue victory demonstrates
the value of applying technology ‘‘to assist in collaborative efforts and knowledge sharing to
achieve winning results.’’ The larger lesson learned is that organizations can use CBT to
‘‘capture and reproduce tacit knowledge of their workers, to be reused at different times, in
different locations, through different media, to create solutions more efficiently. This allows
more time for individuals to use their intuitive strengths, defining and solving problems more
creatively.’’
Decision support systems have a similar result. By aiming to relax cognitive, temporal,
economic, or competitive pressures on decision makers, these systems give decision
makers greater opportunity to exercise and exploit their own idiosyncratic KM capabilities
(Holsapple and Whinston, 1996). Decision-making has long been recognized as a
knowledge-intensive task. Knowledge comprises its raw materials, work-in-process,
byproducts, and finished goods. It involves processing representations of descriptive,
procedural, and/or reasoning knowledge in order to help produce knowledge about what to
do (Bonczek et al., 1981; Holsapple, 1995). In the case of artificially intelligent decision
support systems (e.g. expert systems), the emphasis is on representing and processing
reasoning knowledge. In the case of solver-oriented decision support systems (e.g. online
analytical processing systems), the emphasis is on the representation and processing of
procedural knowledge.

‘‘ Neither is there a barrier that differentiates information from
knowledge, nor can the terms knowledge and information be
used interchangeably. ’’

Computer-based technology can support decisional activity in many ways, taking over
knowledge work otherwise performed by human processors (e.g. staff assistants, middle
managers). Advantages of decision support technology lie in the directions of speed,
scale, reliability, and cost improvements for the knowledge work involved in
decision-making.
This knowledge work includes problem recognition, pattern discovery and interpretation,
knowledge selection (e.g. filtering, screening, navigating, scanning), knowledge derivation
(e.g. deriving forecasts, plans, designs, or recommendations relevant to a decision
situation), problem solving (e.g. quantitative and qualitative analysis, synthesis), knowledge
assimilation, knowledge acquisition, knowledge reuse, and knowledge distribution. In the
case of a multiparticipant decision maker, such as a group or organization, a decision
support system additionally may perform such functions as routing messages among
participants, managing the public versus private knowledge stores, or enforcing knowledge
workflows during the decision process.
Research support systems aid investigative rather than decisional tasks. Instead of aiming
to produce knowledge about what to do, their users seek to assemble and synthesize
knowledge about what is, what works, or what could be. For instance, genealogical research
is a widespread activity concerned with arriving at a knowledge of ancestry and
descendants across many generations. Without engaging in such research, many persons
would be unable to state the surnames of their ancestors for more than a couple of
generations, much less the details of their lives, accomplishments, and difficulties in the
historical contexts in which they lived. Genealogical researchers have an interest in knowing
about such heritage, perhaps to better appreciate personal (or regional) histories and
relations to current circumstances.
Research support systems for genealogy include those that are oriented toward assimilation
of pedigree and personal history knowledge as it is found or deduced, and corresponding
selection of any desired portion of that archive for subsequent review. This can involve
complex, directional network structures with nodes representing hundreds or thousands of
related persons, plus many historical dimensions for each node. The volume and intricacy of
such knowledge are typically well beyond what most researchers could reliably commit to
memory and makes paper documentation burdensome when it comes to knowledge
selection.
Other genealogical research support systems are oriented toward acquiring knowledge.
Some of these are web-based repositories with census images, historical directories,
biographical sketches, and records relating to births, marriages, deaths, courts, real estate,
and cemeteries. In the pre-web era, such access generally required physical searches
through bound volumes and microfilm in geographically dispersed courthouses and
libraries. Genealogical forums (organized by surname or geography, for instance) comprise
another CBT tool for acquiring and sharing knowledge, one that really had no effective prior
counterpart. Through this technology, researchers who have complementary knowledge are
able to identify each other and share what they know, resulting in new leads to investigate
that would never have arisen otherwise. This technology has been a tremendous boon to
these knowledge building efforts.
Similarly, many CBTs form the essential backbone of communities of interest and
communities of practice. Such communities are multiparticipant research efforts that seek to

jointly build knowledge (e.g. lessons learned) about some phenomenon. Without modern
technology, community efforts remain largely local and small in scale. It follows that KM
researchers and practitioners need to be attentive to and contribute to designing the
features and functions of these systems. After all, their sole purpose is to enable and
facilitate KM activity that would not otherwise exist on a substantial scale.

Implications
The modern knowledge worker is immersed in a social environment populated by other
knowledge workers having various specialized knowledge and/or special skills for
processing that knowledge, and in a technological environment populated by
interconnected computer-based processors having access to digital representations of
certain knowledge coupled with skills for making use of those representations. Of relatively
recent vintage, this technological component of the knowledge worker’s world requires
organizations to substantially alter their approaches to organizing work, fostering
interactions, representing knowledge, processing those representations (e.g. in the
course of solving problems), and taking action. Failure to do so jeopardizes the
organization’s survival in a rapidly changing, highly competitive world, regardless of how
an organization handles the social aspects of KM.
The proposition that modern knowledge management is inseparable from a consideration of
technology is compelling. We really cannot fully appreciate KM practices and possibilities
without paying attention to technology, to the users of that technology, and to the impacts of
that technology. If we were to eliminate technology from consideration, then modern
knowledge management is gutted. Traditional KM success stories such as Buckman Labs’
K’Netix and Ernst & Young’s Ernie would disappear. They are technologically based.
Moreover, technologies such as computer-mediated communication, computer-supported
cooperative work, databases, digital documents, search engines, web crawlers, solvers and
spreadsheets for deriving knowledge, text mining, data mining, pattern recognition in
general, and all of the rest are out of bounds (or, at most, perhaps on the fringes of the
knowledge management world).
Instead, we inhabit a world in which computer-based technology has tremendously enriched
knowledge management, in which technological advances will continue to do so, and in
which technology is becoming increasingly important in KM efforts that aim to keep an
organization competitive. A recent study asked CEOs and CKOs about the degree to which
computer-based technology for performing and supporting the nine knowledge chain
activities has yielded a competitive advantage for their organizations (Singh, 2000). For each
of the primary knowledge chain activities, between 40 percent and 65 percent of
respondents said yes, the practice of using technology in this knowledge management
activity makes a substantial contribution to our organization’s competitive advantage. For
each of the secondary knowledge chain activities, between 25 percent and 40 percent of the
respondents said yes, it makes a significant contribution. For the secondary activities, they
were asked about what they envisioned by 2005. That range jumped to 46-63 percent who
saw technology as a key for competitive advantage through the particular KM activities.
If we adopt the position that computer-based technology is inseparable from modern
knowledge management, what are the messages and opportunities for CBT researchers?
First of all, and perhaps most important, there is the message that CBT researchers are KM
researchers, whether they recognize it or not. This is particularly the case for researchers
concentrating on developing and studying the role of technology in business, management,
and organizations. We do not simply design, develop, and deploy technology for its own
sake, but rather because it helps us deal with knowledge in its various gradations, and of
various types, along the way to better organizational and individual performance. In this
effort to build such systems, it is useful to appreciate the broader KM context in which we
operate. It may well be that doing so will add greater value to the work of business
computing researchers. With such an appreciation, it also becomes clear that KM is far from
a mere renaming of ‘‘information systems’’ (IS); rather it is the enveloping domain in which IS
is indispensable and in which IS finds its non-technological footing and basis.

Second, there are opportunities for researchers to improve on existing technologies for
knowledge handling. These include enabling or facilitating the knowledge flows among
knowledge processors, be those processors human or computer-based. They include
technologies for supporting and performing knowledge manipulation tasks, for acquiring
knowledge from the outside, for selecting knowledge from the inside, for knowledge
generation (derivation and discovery), for distributing knowledge, for assimilating
knowledge, and so forth. Another area is technologies for assisting in the measurement,
the control, the coordination, and maybe even the leadership of knowledge and knowledge
processors, again be they human or computer-based. Yet another approach is to devise
improvements in technologies for making the right knowledge (be it descriptive, procedural,
reasoning, or some combination thereof) available to the right processors (be they human,
computer-based, some mix of these) in the right format, at the right time, for the right cost.
A third area where CBT research can add value is to address the question of better
understanding the users and the usage of technologies in knowledge management. What
works, and under what conditions? Why does a particular technology not work well for KM?
Why is it not helpful? What technology advances and breakthroughs can be achieved by
spotting opportunities suggested through an understanding of the broader KM realm? How
do we cultivate good fits between technological infrastructure and organizational
infrastructure in the context of knowledge-based organizations?
Finally, to advance the KM field, researchers need to study outcomes of using technology for
knowledge management. What are its competitive impacts? How exactly can a particular
technology be used in order to achieve a competitive advantage for one of the knowledge
chain activities, perhaps by contributing to productivity, or by helping in agility, or by
fostering greater innovation, or by enhancing reputation?

Conclusion
While modern knowledge management and computer-based technology are inseparable,
there most certainly is more to KM than technology. There are people, organizations,
knowledge-based tasks, and their fits with technology. However, the existence of some
non-technological factors does not mean that we should ignore or dismiss technology from
consideration in KM research or KM practice. To do so would not only overlook many
accomplishments to date, but, more importantly, would discard an important part of KM’s
potential for the future.
CBT and IS researchers must avoid the tendency to simply rename what they have been
doing to call it knowledge management. It is fine to recognize that their work can contribute
to the advance of KM and to intentionally aim at making such contributions. But, such efforts
should be grounded in solid comprehension of major concepts, issues, and experiences in
this very interdisciplinary field. KM must be recognized as a reference discipline for CBT in
general and IS research in particular.

References
Bennet, A. and Bennet, D. (2003), ‘‘The rise of the knowledge organization’’, in Holsapple, C. (Ed.),
Handbook on Knowledge Management, Vol. 1, Springer, Berlin.
Bonczek, R., Holsapple, C. and Whinston, A. (1981), Foundations of Decision Support Systems,
Academic Press, New York, NY.
Galliers, R.D. and Newell, S. (2003), ‘‘Back to the future: from knowledge management to the
management of information and data’’, Information Systems and E-Business Management, Vol. 1 No. 1,
pp. 5-13.
Holsapple, C. (1995), ‘‘Knowledge management in decision making and decision support’’, Knowledge
and Policy, Vol. 8 No. 1, pp. 5-22.
Holsapple, C. (2003), ‘‘Knowledge and its attributes’’, in Holsapple, C. (Ed.), Handbook on Knowledge
Management, Springer, Berlin.
Holsapple, C. and Joshi, K. (2004), ‘‘A formal knowledge management ontology’’, Journal of the
American Society for Information Science and Technology, Vol. 55 No. 7, pp. 593-612.

Holsapple, C. and Singh, M. (2000), ‘‘Toward a unified view of electronic commerce, electronic
business, and collaborative commerce: a knowledge management approach’’, Knowledge and Process
Management, Vol. 7 No. 3, pp. 151-64.

Holsapple, C. and Whinston, A. (1987), ‘‘Knowledge-based organizations’’, The Information Society,
Vol. 5 No. 2, pp. 77-90.
Holsapple, C. and Whinston, A. (1988), The Information Jungle: A Quasi-Novel Approach to Managing
Corporate Knowledge, Dow Jones, New York, NY.

Holsapple, C. and Whinston, A. (1996), Decision Support Systems – A Knowledge-based Approach,
West Publishing, St Paul, MN.
Huang, K. (1998), ‘‘Capitalizing on intellectual assets’’, IBM Systems Journal, Vol. 37 No. 4, pp. 570-83.
Machlup, F. (1980), Knowledge: Its Creation, Distribution, and Economic Significance, Vol. 1, Princeton
University Press, Princeton, NJ.
Newell, A. (1982), ‘‘The knowledge level’’, Artificial Intelligence, Vol. 18 No. 1, pp. 87-127.
Paradice, D. and Courtney, J. (1989), ‘‘Organizational knowledge management’’, Information Resources
Management Journal, Vol. 2 No. 3, pp. 1-13.
Russell, B. (1948), Human Knowledge, Simon & Schuster, New York, NY.

Ryle, G. (1949), The Concept of Mind, Hutchinson, London.


Scheffler, I. (1965), Conditions of Knowledge, Scott Foresman, Chicago, IL.
Singh, M. (2000), ‘‘Toward a knowledge management view of electronic business: introduction and
investigation of the knowledge chain model for competitive advantage’’, unpublished PhD dissertation,
University of Kentucky, Lexington, KY.

Tiwana, A. (2000), The Knowledge Management Toolkit: Practical Techniques for Building a Knowledge
Management System, Prentice-Hall, Upper Saddle River, NJ.
Tsui, E. (2003), ‘‘Tracking the role and evolution of commercial knowledge management software’’,
in Holsapple, C. (Ed.), Handbook on Knowledge Management, Vol. 2, Springer, Berlin.

Van Lohuizen, C. (1986), ‘‘Knowledge management and policymaking’’, Knowledge: Creation, Diffusion,
Utilization, Vol. 8 No. 1, pp. 12-38.
Wilson, T.D. (2002), ‘‘The nonsense of ‘knowledge management’’’, Information Research, Vol. 8 No. 1.

j j
PAGE 52 JOURNAL OF KNOWLEDGE MANAGEMENT VOL. 9 NO. 1 2005
Understanding computer-mediated
interorganizational collaboration:
a model and framework
Lei Chi and Clyde W. Holsapple

Lei Chi and Clyde W. Holsapple are in the Decision Science and Information Systems Area, School of Management, Gatton College of Business and Economics, Lexington, KY, USA (lchi0@uky.edu; cwhols@uky.edu).

Abstract
Purpose – To develop a process model of interorganizational systems (IOS) collaboration and a systematic framework for understanding and classifying IOS technologies for interorganizational collaboration.
Design/methodology/approach – This paper synthesizes relevant concepts and findings in the IOS, economics, and management literature. It also presents empirical examples to illustrate key issues, practices, and solutions involved in IOS collaboration.
Findings – An integrative model of IOS collaboration is introduced, and knowledge sharing, participative decision making, and conflict governance are identified as three behavioral process elements underlying effective interorganizational collaboration. Extending Kumar and van Dissel’s IOS framework to directly recognize these elements, a more complete collaboration-oriented framework for characterizing key elements of interorganizational collaboration and classifying IOS technologies is developed.
Research limitations/implications – This paper brings together diverse ideas into a systematic view of collaboration via interorganizational systems. It contributes to a deeper, fuller understanding of issues involved in achieving collaborative advantage with IOS technologies. The paper also identifies factors
and relationships that researchers should consider in designing empirical studies, posing hypotheses
about collaboration via IOS, and analyzing results.
Practical implications – The model and framework can serve as a check-list of considerations that
need to be dealt with by leaders of collaboration-oriented IOS initiatives. The IOS framework and
technology classification may also suggest ways in which IT vendors might provide better technological
solutions, services, and software for interorganizational collaboration.
Originality/value – This new IOS collaboration model and framework provide more complete and useful
guidance for researchers, educators, and practitioners.
Keywords Knowledge management, Organizations
Paper type Conceptual paper

1. Introduction
Interorganizational systems (IOS) have attracted increasing interest from researchers and
practitioners since Kaufman’s (1966) visionary arguments about extra-corporate systems and
computer time sharing. By providing the electronic infrastructure for sharing task performance
between firms, these systems have opened avenues to collaborative knowledge work in
several directions. They have fostered a new set of organizational design variables, such as
shared repositories of knowledge, real-time integration of interrelated business processes,
electronic communities that foster learning and allow multiple relationships to occur
simultaneously, and virtual organizations that enable dynamic assembly of complementary
resources and skills among the collaborating firms (Strader et al., 1998).

Early examples of successful IOS users provided strong evidence that aggressive pursuit of
new possibilities for joint performance improvement through IOS can be an important source
of sustainable competitiveness (Johnston and Vitale, 1988). IOS can reduce the cost of
communication while expanding its reach (time and distance), increase the number and

DOI 10.1108/13673270510582965 VOL. 9 NO. 1 2005, pp. 53-75, © Emerald Group Publishing Limited, ISSN 1367-3270, JOURNAL OF KNOWLEDGE MANAGEMENT, PAGE 53
quality of alternatives while decreasing the cost of transactions, enable tight integration
between firms while reducing the cost of coordination (Malone et al., 1987). They can also
facilitate knowledge sharing and trust building (Holland, 1995; Li and Williams, 1999; Gallivan
and Depledge, 2003), speed up expertise exploitation and knowledge application (Migliarese
and Paolucci, 1995; Christiaanse and Venkatraman, 2002), and enhance innovation and
knowledge generation (Thomke and von Hippel, 2002). Thus, by increasing competitive
bases in achieving efficiency, flexibility, innovation, quality, and speed, IOS comprise an
important class of knowledge management technology that offers significant opportunities for
improving economic performance and competitiveness of many companies.
To more fully realize the potential of integrating this interorganizational knowledge
management technology with business processes and competitive strategies, a systematic
study is needed to help identify innovative inter-firm applications based on IOS and identify
key factors in facilitating effective collaboration via IOS. Most existing studies on IOS are
based on anecdotes, personal opinions, and experiences rather than on systematic
research studies (Venkatraman and Zaheer, 1994). They are fragmented regarding the uses
and impacts of IOS, and largely focus on the roles of IOS as competitive weapons for
achieving power and efficiency.
Furthermore, underlying many studies is the assumption that humans produce errors while
automation produces reliability. These studies view IOS as technologies designed and
implemented to automate the relationships between firms. They largely fail to acknowledge
the part human ingenuity plays in the work practice and the importance of learning (Sachs,
1995).
Therefore, these studies provide limited understanding of the relationship between IOS and
the knowledge-intensive phenomenon of interorganizational collaboration. Many innovative
opportunities of exploiting IOS potential for learning and mining the funds of knowledge
across organizations for greater competitiveness are likely to be overlooked.
As such, this paper introduces a model and framework that more fully address the following
questions faced by leaders of knowledge management initiatives and by researchers of
knowledge management phenomena:
B What are the key elements underlying effective interorganizational collaboration among
IOS participants?
B How can IOS be classified to facilitate an understanding of collaboration? What are
characteristics and candidate implementation technologies for each type of IOS?
B What are key issues that a knowledge manager needs to address in IOS-based
collaboration? How can these issues be addressed to enhance the processes and
outcomes of this collaboration?
As a step toward answering these questions, this paper synthesizes relevant concepts and
findings in the IOS, economics, and management literature to develop a process model of
IOS collaboration and systematic framework for understanding and classifying IOS
technologies for interorganizational collaboration.
The rest of this paper is organized as follows: section 2 defines IOS as a class of knowledge
management technology for fostering interorganizational collaboration; section 3 introduces
a model of IOS collaboration and identifies key elements underlying effective
interorganizational collaboration processes; section 4 uses these elements to extend a
framework by Kumar and van Dissel (1996) for classifying IOS, resulting in a more fully
developed collaboration-oriented framework; and section 5 briefly discusses contributions
and implications of this research for researchers, practitioners, and educators.

2. Defining IOS
In 1966, Kaufman implored general managers to think beyond their own organizational
boundaries and to explore the possibilities of extra-corporate systems for linking buyers and
sellers or companies performing similar functions. Kaufman convincingly argued that these
extra-corporate systems could greatly increase the efficiency of business operations and


enhance cooperation between firms through time sharing. Barrett and Konsynski (1982) used
the term ‘‘interorganizational information sharing systems’’ to describe such
systems. Cash and Konsynski (1985) later defined the concept of
‘‘interorganizational systems’’ (IOS) as ‘‘automated information systems shared by two or
reservation system, American Hospital Supply’s ASAP system, the CFAR system between
Wal-Mart and Warner-Lambert, and Cisco’s eHub.
In the broadest sense, an IOS consists of computer and communications infrastructure for
managing interdependencies between firms. From a knowledge management perspective,
this infrastructure enables and facilitates knowledge flows among organizations (and their
participating representatives) such that the needed knowledge gets to the relevant
participants on a timely basis in a suitable presentation(s) in an affordable way for
accomplishing their collaborative work. An IOS may involve one or more technologies, ranging
from an electronic funds transfer system for data transmission to a collaborative CAD/CAM
tool to a groupware system for joint product design. In recent years, rapid advancements in
computer and communications technologies have made feasible many new applications of
IOS that are greatly increasing the potential of effective inter-firm collaboration.
For instance, groupware encompasses previously considered independent technologies
(e.g. messaging, conferencing, collaborative authoring, workflows and coordination, and
group decision support) and has arisen to support dynamic business processes involving
communication, coordination, and cooperative work (Freed, 1999).
The internet integrates technologies of the world wide web (hypertext transfer protocol
(HTTP)), telnet, file transfer protocol (FTP), network news (network news transfer protocol
(NNTP)), internet relay chat (IRC), and e-mail (simple mail transfer protocol (SMTP); internet
message access protocol (IMAP)).
external data and linkages to potential customers and partners around the world (Strader
et al., 1998).
An extranet combines the advantages of the internet (global access) with those of local area
networks (security, easy management of resources, and client/server functionality). Based
on internet technology and protocols, an extranet provides information in a way that is
immediate, cost-effective, easy to use, rich in format, versatile, and secure over a private
network (Strader et al., 1998).
Peer-to-peer (P2P) communication, by allowing users to bypass central exchanges and
exchange information directly with one another, provides a promising alternative to the
conventional client/server model. Compared to the client/server model, P2P may significantly
reduce the complexity and expense of networking (McAfee, 2000). In addition, P2P networks
have no bounds, while membership in the client/server model is limited. Thus, P2P may
provide solutions to the potential communication overflows that restrict the communication
capabilities of most current network communities (Yoshida et al., 2003).
Wireless communication uses wireless devices, sensors, positioning locators, and networks
to allow real-time communication with anyone at any time, anywhere. Radio frequency
identification (RFID), global positioning systems (GPS), voice e-mail, enhanced specialized
mobile radio (ESMR), and MicroBurst wireless are some of the available wireless
technologies that may have important implications for the collaborative area of supply chain
management (Shankar and O’Driscoll, 2002).

Extensible markup language (XML) has quickly arisen as a standard data representation
format. Being fully Unicode compliant, XML will greatly enhance EDI’s capabilities through its
extensibility, platform independence, and support for universal data access. Simple
object access protocol (SOAP) uses XML technologies to define an extensible message
framework that allows structured information to be exchanged over a variety of underlying
protocols and programming models in a decentralized, distributed environment. Web
services description language (WSDL) defines an XML-based grammar for describing
network services as a set of endpoints that accept messages containing either
document-oriented or procedure-oriented knowledge. WSDL is extensible to allow the
description of endpoints and their messages regardless of what message formats or
network protocols are being used to communicate. Universal description, discovery, and
integration (UDDI) defines a SOAP-based web service for locating WSDL-formatted protocol
descriptions of web services (MSDN Library – msdn.microsoft.com/library). XML, SOAP,
WSDL, and UDDI will provide a foundation for companies to have real-time access to
structured and semi-structured knowledge resources around the globe.
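As a rough illustration of XML as an extensible message format of the kind SOAP builds on, the following sketch constructs and parses a minimal SOAP-style envelope using only Python’s standard library. The OrderRequest element, its fields, and the urn:example namespace are invented for the example; a real service would define its own schema via WSDL.

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"  # SOAP 1.1 envelope namespace
APP_NS = "urn:example:procurement"                     # hypothetical application namespace

def build_envelope(part_id, quantity):
    # Wrap an application-defined payload inside a SOAP-style Envelope/Body.
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    order = ET.SubElement(body, f"{{{APP_NS}}}OrderRequest")
    ET.SubElement(order, f"{{{APP_NS}}}PartId").text = part_id
    ET.SubElement(order, f"{{{APP_NS}}}Quantity").text = str(quantity)
    return ET.tostring(env, encoding="unicode")

def read_envelope(xml_text):
    # The receiver navigates by namespaced tag names, not by position,
    # which is what makes the format extensible across partners.
    root = ET.fromstring(xml_text)
    order = root.find(f"{{{SOAP_NS}}}Body/{{{APP_NS}}}OrderRequest")
    return (order.findtext(f"{{{APP_NS}}}PartId"),
            int(order.findtext(f"{{{APP_NS}}}Quantity")))

msg = build_envelope("AX-204", 1200)
print(read_envelope(msg))
```

Because unknown elements in the Body are simply ignored by a receiver that navigates by name, either partner can extend the payload without breaking the other — the property the paragraph above attributes to XML-based messaging.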

3. A model of IOS collaboration


Through an examination of the IOS literature, we identify eight distinct and critical motives[1]
underlying an organization’s use of IOS: necessity, asymmetry, reciprocity, efficiency, agility,
innovation, stability, and legitimacy. We contend that the leader of a knowledge management
initiative contemplating or implementing IOS technology needs to carefully consider which of
these motives are applicable to his/her situation, how they relate to relational bonding and
behavioral processes, and what their impacts are on collaborative advantage:
B The necessity motive: an organization adopts the use of an IOS in order to meet necessary
legal, regulatory, or deregulatory requirements from higher authorities (e.g. government
agencies, legislation, industry, or professional regulatory bodies) that otherwise might not
have been used voluntarily (as in the case of US Department of Transportation regulation in
1987 exemplified by Christiaanse and Venkatraman, 2002, and the case of London Stock
Exchange’s Big Bang in 1986 studied by Clemons and Weber, 1990).
B The asymmetry motive: an organization is prompted to use an IOS for purposes of
exerting power or control over other organizations (Kling, 1980; Webster, 1995; Iacovou
et al., 1995; Reekers and Smithson, 1995; Hart and Saunders, 1997).
B The reciprocity motive: an organization uses an IOS in order to pursue common or
mutually beneficial goals or interests and to facilitate collaboration, trust building, and
coordination (Holland, 1995; Ferrat et al., 1996; Kumar et al., 1998; Pouloudi, 1999).
B The efficiency motive: an organization is motivated to use an IOS in an attempt to improve
both its internal efficiency and its interorganizational efficiency (Malone et al., 1987;
Johnston and Vitale, 1988; Konsynski and McFarlan, 1990; Clemons and Row, 1991).
B The agility motive: an organization is prompted to use an IOS to increase agility and
responsiveness to environmental changes (Rockart and Short, 1991; Zaheer and Zaheer,
1997).
B The innovation motive: an organization is induced to use an IOS for purposes of
innovation and value creation (Strader et al., 1998; Bowker and Star, 2001; May and
Carter, 2001; Thomke and von Hippel, 2002).
B The stability motive: an organization is prompted to use an IOS in order to reduce
environmental uncertainty and to achieve stability, predictability, and dependability in its
relations with others (Li and Williams, 1999).
B The legitimacy motive: an organization is motivated to use an IOS to increase its
legitimacy and reputation in order to appear in agreement with prevailing norms, beliefs,
expectations of external constituents, or prevalence of a practice in the industry (Teo et al.,
2003).
Although each of the eight motives may be a separate and sufficient cause for an
organization’s IOS adoption, the decision to use IOS is commonly based on multiple motives.


Furthermore, these eight motives are likely to interact with each other. Certain motives will
become dominant under favorable conditions and be suppressed under unfavorable
conditions. For example, the underlying process of IOS use prompted by the asymmetry
motive can be characterized by inequality, knowledge asymmetry, manipulation, coercion,
or conflict. Under transparent knowledge sharing, participative decision making and
effective governance for conflict resolution, the asymmetry motive is likely to be suppressed,
while the reciprocity motive tends to become dominant. Concurrently, the reciprocity motive
may interact with certain other motives. For example, when cooperative use of an IOS is also
expected to lead to the fulfillment of other organizational requirements and expectations
(e.g. higher levels of efficiency or productivity, greater agility, greater innovation, greater
stability, or greater legitimacy or reputation), cooperative behaviors and collaboration will be
more likely.
Based on the interaction among the eight motives, we introduce the model of IOS
collaboration depicted in Figure 1. An organization may be prompted to use an IOS under
certain motives (e.g. stability). When such behavioral processes as transparent knowledge
sharing, shared decision making, and effective governance for conflict resolution are
promoted among IOS participants, cooperative behaviors are likely to be induced and
prevail. These cooperative behaviors tend to interact with an organization’s effort to develop
stable and reliable relations. Power plays are likely to be suppressed in the hopes that
equity, reciprocity, and harmony will facilitate stability. As a result, trust and commitment will
increase among the partners. Increased trust and commitment in turn will facilitate the
processes of knowledge sharing, participative decision making and conflict resolution,
which further enhances trust and commitment of the participants and ultimately yields better
joint performance. Performance outcomes for knowledge-intensive work can be gauged in
several ways: productivity, agility, innovation, and reputation (Holsapple and Singh, 2001).
Collectively, improvements on these four dimensions are avenues for collaborative
advantage.
Therefore, the model asserts that knowledge sharing, shared decision making, and conflict
governance are three key elements underlying effective interorganizational collaboration. By
fostering trust and suppressing power plays, they not only can buttress the motivations of

Figure 1 A model of IOS collaboration


organizations to collaborate via IOS, but also make for sustainable collaborative advantage.
We next use these three elements in developing a collaboration-oriented IOS framework.

4. IOS frameworks
The foregoing model identifies several variables (and their relationships) that will need to be
managed or addressed to improve the chance of success for an IOS-based knowledge
management effort aimed at interorganizational collaboration. It is also important for the
leader of such an effort to have a framework for appreciating the nature and possibilities of
IOS options in accomplishing this knowledge work.

4.1 Limitations of some IOS frameworks


Several frameworks for understanding interorganizational systems have been proposed. For
example:
B Barrett and Konsynski (1982) classify IOS into five levels based on an increasing degree of
the participant’s responsibility, cost commitment and complexity of operating
environment. At level 1, a firm only serves as a remote input/output node for one or
more higher-level nodes. Level 2 participants design, develop, maintain, and share a
single system, such as an inventory query system. Level 3 participants develop and maintain
a network linking themselves and their direct business partners. Level 4 participants develop and
share a network with diverse application systems that may be used by many different
types of participants. At level 5, any number of lower-level participants may be integrated
in real time over complex operating environments.
B Johnston and Vitale (1988) propose a framework using three dimensions: business
purpose, relationships with participants, and information function. Business purpose
indicates why an IOS is needed; it could be either to leverage present business or to enter
a new information-driven business. Relationships refer to those participants linked by an
IOS; they could be customers, dealers, suppliers, or competitors. Information function is
concerned with the functionality that an IOS is intended to perform; it may handle
boundary transactions, retrieve and analyze shared information, or be designed to
manipulate information as part of ‘‘back office’’ operations in the participants’
organizations.
B Meier and Sprague (1991) classify IOS into three categories: ordering systems that
connect a manufacturer with its suppliers or a retailer with its customers; electronic
markets that substitute the traditional means of trading with the electronic means of
trading; and online information dissemination systems.
B Hong (2002) classifies IOS into four types based on the role linkage (vertical vs horizontal)
and the system support level (operational vs strategic) of the IOS participant: resource
pooling, operational cooperation, operational coordination, and complementary
cooperation. A resource pooling IOS links participants to perform common value
activities in order to permit risk/cost-sharing by pooling resources. A complementary
cooperation IOS represents a form of cooperation between firms playing different roles in
an industry value chain. An operational cooperation IOS brings together firms in a
common value chain primarily to improve the quality of customer service or to share
information of common interest. An operational coordination IOS is used to link different
roles of participants serving an industry value chain to increase operational efficiency.

Although these frameworks enhance an understanding of the uses and impacts of IOS, they
do not focus on IOS collaboration:
B According to the American Heritage Dictionary (1997), collaboration is defined as
working together, especially in a joint intellectual effort. Working together implies
managing interdependencies among participants toward some common end. Joint
intellectual effort recognizes that collaboration is a knowledge management episode
comprised of knowledge flows among participants who process knowledge resources in
various ways and under various influences in pursuit of the common end (Holsapple and
Joshi, 2000). Together with the three elements identified in the collaboration model in
Figure 1, we thus contend that a good understanding of IOS collaboration requires the
examination of four elements: managing interdependencies among knowledge
processors, knowledge sharing, participative decision making, and conflict
governance. However, these elements are not explained sufficiently by any of the
above IOS frameworks.
B The above frameworks tend to focus on the roles of IOS as competitive weapons for
achieving power and efficiency. For example, Johnston and Vitale’s framework (1988)
advances the concept of competitive advantage to explain the emergence and impact of
IOS. It regards IOS as instruments that, by locking in customers and dominating
suppliers, increase an organization’s bargaining power over them. The framework
suggests that, in the drive to optimize its self-interest, the objective of an organization is to
minimize its dependence on other organizations while maximizing the dependence of
other organizations on itself (Kumar and van Dissel, 1996). Thus, such frameworks
appear to be inconsistent with the spirit of interorganizational collaboration.
B Furthermore, underlying these studies is the assumption that humans produce errors
while automation produces reliability. These studies view IOS as technologies designed
and implemented to automate the relationships between firms. They fail to acknowledge
the part human ingenuity plays in the work practice and the importance of learning
(Sachs, 1995). The work is viewed as a process flow or the sequence of tasks in
operations that can be structured or coded, whereas the tacit, less structured learning
process whereby people discover and solve problems is omitted. In this regard, many
innovative opportunities of performance improvement by exploiting IOS potentials for
learning and utilizing knowledge resources distributed across organizations are likely to
be overlooked.
Kumar and van Dissel (1996) propose a framework that classifies IOS based on Thompson’s
(1967) typology of interorganizational interdependencies. As described in section 4.2.2
below, by highlighting IOS’s role in managing inter-firm dependencies and stressing trust
building through reducing potential conflicts for sustained collaboration, Kumar and van
Dissel’s framework addresses some of the limitations of the above frameworks and provides
a good basis for our collaboration-oriented IOS framework introduced here.

4.2 A Collaboration-oriented IOS framework


The collaboration-oriented IOS framework is summarized in Table I. It adopts Kumar and van
Dissel’s (1996) notions of using IOS for managing pooled, sequential, and reciprocal
interdependencies. It also incorporates the IOS collaboration model’s three behavioral
processes: knowledge sharing, participative decision making, and conflict governance, and
expands the classification of IOS technologies based on practices for enhancing
collaboration. We now explain the framework in detail.

4.2.1 Assumptions
Three assumptions underlie the framework. First, organizations are assumed to make
conscious, intentional decisions as to whether to use and how to use IOS for specific reasons
within the constraints of a variety of conditions that limit or influence their choices. Second,
IOS collaboration is viewed from an organizational (top-management) perspective, even
though an IOS may be used between subunits or individuals in the collaborating
organizations. An organizational perspective is assumed throughout the paper. Third,

Table I A collaboration-oriented IOS framework
(Configuration diagrams from the original table are omitted.)

Pooled interdependency
Coordination mechanisms: standards and rules
Structurability: high
Amount of direct human interaction: minimum
Type of IOS: pooled knowledge resources IOS (see Note)
Nature of knowledge exchanged: structured
Key issues in knowledge sharing: design of interorganizational interfaces; compatibility; knowledge quality; privacy and confidentiality
Key issues in decision making: reduce uncertainty; inability to assimilate quality knowledge; loss of resource control
Governance mechanisms for conflict resolution: technological governance (open standards; industry-specific standards; proprietary or company-specific standards); business governance (classical contracts; institutional norms; reputation)
Focus of implementation technologies: ‘‘codification’’
Examples of implementation technologies and systems: e-mail; fax; instant messaging; voice mail; electronic bulletin board; FAQs; call center; EFTPoS; web site; wireless device; peer-to-peer communication; broadband communications; intranet/extranet; internet; wireless networks; EDI; XML; SOAP/WSDL/UDDI; databases and data warehouses; documents and archives; web browser; expert finder tool; meta/web-crawler; taxonomy/ontological tools; OLAP/simulation/modeling; data/text mining; intelligent agents; case-based reasoning; neural networks/genetic algorithms; rule engines

Sequential interdependency
Coordination mechanisms: standards, rules, schedules and plans
Structurability: medium
Amount of direct human interaction: intermediate
Type of IOS: value/supply-chain IOS
Nature of knowledge exchanged: structured; semi-structured
Key issues in knowledge sharing: design of interorganizational interfaces; compatibility; knowledge quality; privacy and confidentiality; knowledge asymmetry
Key issues in decision making: reduce uncertainty; inability to assimilate quality knowledge; loss of resource control
Governance mechanisms for conflict resolution: technological governance (open standards; industry-specific standards; proprietary or company-specific standards); business governance (neoclassical contracts; institutional norms; reputation; interpersonal trust)
Focus of implementation technologies: ‘‘codification’’
Examples of implementation technologies and systems: scheduling techniques; customer relationship management; supply chain management; EDI systems; collaborative planning, forecasting and replenishment; workflow systems

Reciprocal interdependency
Coordination mechanisms: standards, rules, schedules, plans and mutual adjustment
Structurability: low
Amount of direct human interaction: highest
Type of IOS: networked IOS
Nature of knowledge exchanged: structured; semi-structured; unstructured
Key issues in knowledge sharing: design of interorganizational interfaces; compatibility; knowledge quality; privacy and confidentiality; knowledge asymmetry; knowledge-sharing routines
Key issues in decision making: reduce equivocality; inability to assimilate quality knowledge; share understanding
Governance mechanisms for conflict resolution: technological governance (open standards; industry-specific standards; proprietary or company-specific standards); business governance (relational contracts; institutional norms; reputation; interpersonal trust)
Focus of implementation technologies: ‘‘personalization’’
Examples of implementation technologies and systems: CAD/CAM; collaborative authoring; calendaring systems; computer conferencing; threaded discussion; group decision support; organizational decision support systems

Note: Kumar and van Dissel (1996) refer to the pooled knowledge resources IOS as ‘‘pooled information resources IOS’’
Source: Adapted from Kumar and van Dissel (1996, p. 287). Italicized areas in the original indicate extensions introduced here
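To make the framework's classification easy to consult programmatically, Table I's core rows can be transcribed into a small lookup structure. The entries below are taken from the table itself; the field names and the classify helper are our own shorthand for illustration, not part of the framework.

```python
# Core rows of Table I as a dictionary keyed by interdependency type.
IOS_FRAMEWORK = {
    "pooled": {
        "type_of_ios": "Pooled knowledge resources IOS",
        "coordination": ["standards", "rules"],
        "structurability": "high",
        "human_interaction": "minimum",
        "implementation_focus": "codification",
    },
    "sequential": {
        "type_of_ios": "Value/supply-chain IOS",
        "coordination": ["standards", "rules", "schedules", "plans"],
        "structurability": "medium",
        "human_interaction": "intermediate",
        "implementation_focus": "codification",
    },
    "reciprocal": {
        "type_of_ios": "Networked IOS",
        "coordination": ["standards", "rules", "schedules", "plans",
                         "mutual adjustment"],
        "structurability": "low",
        "human_interaction": "highest",
        "implementation_focus": "personalization",
    },
}

def classify(interdependency):
    """Return the IOS class the framework suggests for an interdependency type."""
    return IOS_FRAMEWORK[interdependency]["type_of_ios"]

print(classify("reciprocal"))
```

A structure like this makes the framework's central claim explicit: as interdependency moves from pooled to reciprocal, structurability falls, direct human interaction rises, and the implementation focus shifts from codification to personalization.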

knowledge sharing in this paper is considered in its broadest sense, including flows involved
in knowledge transfer, knowledge generation, and/or related knowledge application.

4.2.2 Types of interdependencies and IOS classes


Thompson (1967, pp. 54-55) distinguishes three different ways in which companies can be
interrelated:
(1) Pooled interdependency.
(2) Sequential interdependency.
(3) Reciprocal interdependency.
In pooled interdependency, companies share and use common resources; ‘‘each renders a
discrete contribution to the whole and each is supported by the whole’’ (e.g. the use of a
common data processing center by a number of firms). Sequential interdependency refers
to the situation where companies are linked in a chain with direct, directional, and well-defined
relations, where the outputs from one task processor become inputs to another (e.g. the
customer-supplier relationship along a supply chain). Reciprocal interdependency
describes a relationship where each company’s outputs become inputs for the others
(e.g. a concurrent engineering team consisting of customers, suppliers, distribution centers,
dealers, shippers, and forwarders) (Thompson, 1967; Kumar and van Dissel, 1996).
Pooled interdependency involves minimal direct interaction among the units, and
coordination by standardization is appropriate. Sequential interdependency involves an
increasing degree of contingency because each position in the chain must be readjusted if
an upstream position fails to fulfill its expectation, and coordination by plan is appropriate.
Reciprocal interdependency involves the highest degree of interaction because actions of
each position in the set must be adjusted to the actions of many interacting positions, and
coordination by mutual adjustment is needed (Thompson, 1967; Kumar and van Dissel,
1996).
In correspondence with pooled interdependency, sequential interdependency, and
reciprocal interdependency, Kumar and van Dissel (1996) suggest a three-part typology
for IOS:
(1) Pooled information resources IOS.
(2) Value/supply-chain IOS.
(3) Networked IOS.
They regard interorganizational systems as technologies designed and implemented to
operationalize the interorganizational relationships. They assume that the structure of the
relationship influences the degree to which the relationship can be programmed and
embedded in the IOS.
(1) Pooled information resources IOS involve interorganizational sharing of a
technological system, such as common repositories (e.g. databases, digital
archives), common communication networks (e.g. internet, extranet, broadband
networks), common communication protocols and standards (e.g. EDI, XML),
common application systems (e.g. data/text mining systems), and electronic
markets which may include some combinations of common databases, common
application procedures and software, and/or common communications infrastructure.
Here, we suggest that extending the notion of pooled information resources IOS to
pooled knowledge resources IOS allows for a better understanding of IOS
collaboration. We thus use the term pooled knowledge resources IOS in the
extended collaboration-oriented IOS framework.
For instance, the Amico Library (www.amico.org) is an internet-based archive with
digital copies of more than 100,000 paintings, sculptures and photographs initiated
and shared by 39 museums from goliaths like the Metropolitan Museum of Art to
smaller institutions like the Newark Museum (Mirapaul, 2003). The National Virtual

Observatory is another initiative to create a common internet-based cosmic database
for nation-wide collaboration in astronomy. The project is building sophisticated
data/text mining systems and intelligent searching tools, and is creating an
Internet-based registry of astronomical resources (Schechter, 2003).
Another example of pooled knowledge resources IOS is Cisco’s eHub. eHub is a
private electronic marketplace where participants share an extranet infrastructure that
uses XML standards, and a central repository that pools together supply chain
information for planning and executing tasks (Grosvenor and Austin, 2001).
(2) Value/supply-chain IOS support structured and semi-structured customer-supplier
relationships, which are likely to be coded and implemented through automation, and
institutionalize sequential interdependency between organizations along the
value/supply chain. The Collaborative Forecasting and Replenishment (CFAR) project
initiated in 1995 presents an example of value/supply-chain IOS between Wal-Mart store
and Warner-Lambert (now part of Pfizer) for forecasting and replenishing
pharmaceuticals and healthcare products. CFAR is an internet-based EDI system that
allows both companies to jointly create sales forecasts that include information, such as
expected alterations to store layouts and meteorological information (King, 1996).
Wal-Mart is also testing a wireless supply chain system with suppliers including Pepsi,
Procter & Gamble (the maker of Bounty), and Gillette. Wal-Mart uses radio frequency identification (RFID) to track shipments
of Pepsi soft drinks, Bounty paper towels, and Gillette razors at a Sam’s Club store in Tulsa,
OK, from manufacturer to warehouse to store to checkout counter. The process is
illustrated in Figure 2. Information from RFID tags on each item in a Wal-Mart store goes into
Wal-Mart’s 101-terabyte sales transaction database. Thus, suppliers can get a real-time
view of what is happening at the store shelf level (Shankar and O’Driscoll, 2002).
(3) Networked IOS operationalize and implement reciprocal interdependencies between
organizations. Networked IOS provide a shared virtual space where people collaborate
for emerging relationships and learning (Nonaka and Konno, 1998). They focus on
supporting informal exchange of semi-structured or unstructured knowledge, which
sometimes cannot be described as a business process, such as posting a question on
an electronic bulletin board, asking an expert for a solution, or directly contacting a
customer to elicit needs or problems.

Figure 2 Wal-Mart’s wireless supply chain system for tracking and replenishment

ComputerLink gives an example of the networked IOS. ComputerLink is a community
health information network built in Cleveland for Alzheimer’s caregivers. ComputerLink
involves using the internet, an electronic bulletin board, a decision support system, as
well as e-mail and electronic encyclopedia facilities to provide clinical and financial
services, and deliver just-in-time knowledge among patients, physicians, hospitals,
clinics, and home health agencies. The e-mail facility allows individual users to
communicate anonymously with a nurse-moderator and other Alzheimer’s caregivers.
The nurse-moderator serves as technical liaison by providing systems and health
support to ComputerLink users while maintaining all encyclopedia functions related to
Alzheimer’s disease and caregiving. The decision support system guides users through a myriad
of scenarios allowing self-determined choices based on personal values. The bulletin
board enables users to communicate through an electronic support-group public forum
(Payton and Brennan, 1999).
The three types of IOS form a Guttman-type scale (Thompson, 1967). That is,
value/supply-chain IOS may possess the characteristics of pooled knowledge resources
IOS; and networked IOS are likely to possess characteristics of both value/supply-chain IOS
and pooled knowledge resources IOS (Kumar and van Dissel, 1996).

4.2.3 Key issues in knowledge sharing


Knowledge sharing is a key aspect of IOS collaboration, as discussed for the collaboration
model shown in Figure 1. Effective knowledge sharing can promote understanding,
suppress opportunistic behaviors, and induce commitment and trust among partners, thus
leading to greater collaboration. Knowledge sharing is primarily determined by two factors:
transparency and receptivity (Hamel, 1991). For each of these factors, we discuss
implications for the three IOS technology classes that deserve consideration by leaders of
interorganizational knowledge management initiatives.
4.2.3.1 Knowledge sharing transparency. Transparency refers to the ‘‘openness’’ of an
organization to its partners (Hamel, 1991). It can be influenced by the design of
interorganizational interfaces[2] (Malone, 1985; Hamel, 1991). In addition, knowledge
quality, privacy and confidentiality can also influence transparency.
Pooled knowledge resources IOS.
B Design of interorganizational interfaces: in pooled knowledge resources IOS, the
coordination structure in terms of the level of roles, obligations, rights, procedures,
knowledge flows, as well as analysis and computational methods used, can be clearly
specified and standardized (Kumar and van Dissel, 1996). The knowledge exchanged
tends to be highly structured, such as product descriptions, customer characterizations,
and transaction status. As such, interorganizational interfaces mostly can be designed as
protocols, rules, and standards built in shared software, tools, and systems. The
transparency of an organization regarding what knowledge to share with whom and how
to share can be determined by the degree of ‘‘openness’’ inherent in the embedded
protocols, rules, and standards.
B Knowledge quality: in pooled knowledge resources IOS, one or more users of the
‘‘commons’’ may treat the commons as a free dumping ground and contaminate the
shared knowledge archives by depositing corrupt knowledge representations,
allowing non-standard or unedited transactions onto the network, or, even worse,
unintentionally or intentionally infecting the system with viruses (Kumar and van Dissel,
1996). These contaminations will degrade the knowledge quality of the ‘‘commons,’’
whose attributes such as validity and utility, are important for quality decision making
(Holsapple, 1995) and transparent knowledge sharing.
Contaminations can be controlled by designing and enforcing representations and
access standards through technological governance mechanisms for security,
virus-scan, and access control (Kumar and van Dissel, 1996). Additionally, defining
and measuring key knowledge quality attributes, such as validity and utility, and aspects

of each (Holsapple, 1995), and properly maintaining these quality-related measures as
knowledge moves across systems and organizations is important (Madnick, 1995).
B Privacy and confidentiality: in pooled knowledge resources IOS, because the
‘‘commons’’ are a public resource among IOS users, some users may misuse the
system by ‘‘poaching,’’ ‘‘stealing,’’ or ‘‘unauthorized snooping’’ (Kumar and van Dissel,
1996; Levitin and Redman, 1998). A user may collect and summarize contents from the
entire archive; or monitor and analyze transactions over the common network to develop
strategies for private use; or collect and infer confidential and private information about
another firm’s customers through lookups in a shared archive or through monitoring online
transactions and then luring selected customers away from the current suppliers. Such
misuses of the ‘‘commons’’ can pose serious issues of privacy and confidentiality, thus
impeding transparent knowledge sharing while increasing the potential for opportunistic
behaviors and free riding among IOS users.
Misuses of the ‘‘commons’’ can be controlled by imposing security mechanisms, such
as software safeguards, access control, and transaction logs (Kumar and van Dissel,
1996). Additionally, fostering norms or spreading values among IOS users that encourage
transparent knowledge sharing, may also provide effective governance against misusing
the ‘‘commons’’.
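The two governance ideas above for a pooled ‘‘commons’’ (enforcing representation and quality standards on deposits, and controlling access to guard against snooping) can be sketched together. This is a minimal illustrative model only; the class, the 0.5 validity threshold, and the firm names are assumptions, not an actual IOS design.

```python
class SharedCommons:
    """Toy model of a pooled knowledge 'commons': each entry carries
    quality metadata (validity, utility), and both deposits and lookups
    pass an access-control check. All rules here are illustrative."""

    def __init__(self, authorized):
        self.authorized = set(authorized)  # firms allowed to use the commons
        self.entries = {}                  # key -> (content, validity, utility)

    def deposit(self, firm, key, content, validity, utility):
        if firm not in self.authorized:
            raise PermissionError(f"{firm} is not an authorized participant")
        if validity < 0.5:  # reject 'contaminated' deposits below standard
            raise ValueError("entry fails the minimum validity standard")
        self.entries[key] = (content, validity, utility)

    def lookup(self, firm, key):
        if firm not in self.authorized:
            raise PermissionError(f"{firm} is not an authorized participant")
        return self.entries[key]

commons = SharedCommons(authorized=["firmA", "firmB"])
commons.deposit("firmA", "productspec-17", "spec text", validity=0.9, utility=0.8)
print(commons.lookup("firmB", "productspec-17")[0])  # spec text
```

Real systems would add transaction logs and virus scanning, as the text notes, but the pattern of gatekeeping both directions of flow through the commons is the same.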
Value/supply-chain IOS. In value/supply-chain IOS, roles and mutual expectations between
adjacent parties in a value/supply chain can be structured. Structured interactions could
range from tracking EDI-based orders, to looking up databases of adjacent partners in the
chain for sales forecasting, to transferring CAD-based specifications from customers to
suppliers (Kumar and van Dissel, 1996). The knowledge shared can range from structured
data, such as ordering and customer data, sales data, and production and inventory data, to
semi-structured representations, such as market research, category management, and
cost-related descriptions (Simatupang and Sridharan, 2001). As such, interorganizational
interfaces in value/supply-chain IOS also can be largely designed as protocols, rules, and
standards embedded in the software, tools, and systems (e.g. automated workflow
systems), determining the transparency of an organization in terms of what knowledge to
share, how much to share, with whom to share, and how to share.
Similar to a pooled knowledge resources IOS, knowledge quality and privacy and
confidentiality in a value/supply-chain IOS also influence the transparency of an
organization. In addition, knowledge asymmetry presents another issue influencing the
‘‘openness’’ of an organization in a value/supply-chain IOS.
Knowledge asymmetry[3] refers to the situation where different players in a
value/supply-chain IOS are likely to have different states of private knowledge about
resources (e.g. capacity, inventory status, and funds), various data-related costs, chain
operations (e.g. sales, production and delivery schedules), performance status, and market
conditions. This knowledge asymmetry can lead to misunderstanding among chain
members about their mutual efforts at collaboration. Because of their different roles,
positions, and objectives in the chain, conflict and suboptimal decisions may result, such as
unproductive allocation of resources (Lee and Whang, 2000; Simatupang and Sridharan,
2001).
Asymmetric knowledge may also cause difficulties among chain members in dealing with
market uncertainty. For example, when the downstream players poorly estimate or distort
demand conditions, the upstream players may experience larger variance of customer
demand, producing difficulties in managing genuine levels of production and inventory. This
can also produce difficulties in designing products that might be desirable, especially for
innovative products (Simatupang and Sridharan, 2001).
Furthermore, asymmetric knowledge can inhibit transparent knowledge sharing and
increase the potential for opportunism, either prior to the contract or after the contract
(Molho, 1997). Adverse selection can occur before a contract is signed; it involves
misrepresentation or concealment of true capability, resource, and demand conditions that
need to be shared. Moral hazards can occur after a contract is signed; they involve

providing misleading characterizations of performance status, lowering of service level
efforts, and committing only a minimum level of resources.
To reduce knowledge asymmetry, it is necessary to develop clear performance measures
and also promote mutual interests and trust among IOS participants. Additionally, financial
incentives, such as productive-behavior-based incentives, pay-for-performance, and
equitable arrangements (Simatupang and Sridharan, 2001), may be employed to promote
transparent knowledge sharing and discourage dysfunctional behaviors. The use of
technology may also help increase control by facilitating performance measures and
monitoring (Gallivan and Depledge, 2003).
Networked IOS. With networked IOS, the form, direction, and content of the relationships
among participants are much less structured than with the other two types of IOS (Kumar
and van Dissel, 1996). Reciprocal relationships can be viewed as consisting of exchange
processes and adaptation processes; exchange processes represent ‘‘the operational,
day-to-day exchanges of an economic, technical, social, or informational nature occurring
between firms;’’ adaptation involves the processes whereby firms adjust and maintain their
relationships by modifying routines and mutual expectations (Kumar et al., 1998, p. 215). A
networked IOS thus involves an increasing degree of human interaction and requires
mechanisms such as trust to identify, assess, and manage the dynamically occurring
equivocality and risks in the situation. The nature of the knowledge exchanged can range
from structured (such as product data), to semi-structured (such as reports about industry
trends), to highly unstructured (such as expertise and know-how, problem-solving skills, and
ideas about a new product design). As such, many parts of interorganizational interfaces in
networked IOS, unlike those in the other two types of IOS, cannot be designed as built-in
protocols, rules, and standards. Instead, human processors positioned at organizational
boundaries tend to interface with each other, with the aid of IOS. Thus, the ‘‘openness’’ of
those individuals can greatly influence the transparency of knowledge sharing, in addition to
the embedded rules and protocols, knowledge quality, and privacy and confidentiality. This
‘‘openness’’ can be enhanced through nurturing knowledge sharing routines.
Knowledge sharing routines can be viewed as regular patterns of interorganizational
interactions that permit the transfer, application, or generation of specialized knowledge
(Grant, 1996; Dyer and Singh, 1998). These routines are largely dependent on an alignment
of incentives to encourage transparent knowledge sharing and discourage opportunistic
behaviors and free riding (Dyer and Singh, 1998). Financial incentives or informal norms of
reciprocity may be employed to promote mutual interests and highlight common goals
(Lewis, 1990; Badaracco, 1991), thus motivating transparent knowledge sharing (Mowery
et al., 1996; Dyer and Singh, 1998). Technological governance mechanisms for knowledge
security and system security may also be employed (Venkatraman, 1991; Kumar and van
Dissel, 1996) to discourage dysfunctional behaviors.
4.2.3.2 Knowledge sharing receptivity. Receptivity is also termed assimilative ability
(O’Leary, 2003), or ‘‘partner-specific absorptive capacity’’ (Dyer and Singh, 1998). It refers
to an organization’s ability to assimilate knowledge and skills from its partners. Receptivity
involves ‘‘implementing a set of interorganizational processes that allow collaborating firms
to systematically identify valuable know-how and then transfer it across organizational
boundaries’’ (Dyer and Singh, 1998, p. 665). It is a function of the compatibility between
partners and the absorptiveness of receptors/processors (Hamel, 1991).
Compatibility. Across all three types of IOS, incompatibility and inconsistency can result from
geographically and functionally dispersed business operations, as well as differences in
work processes and underlying cultures of organizations. There may be different semantics
for the same term, or different identifiers for key business entities, such as customer or
product, or different schemes for aggregating key indicators, such as sales or expenses, or
different ways of calculating key concepts, such as profit or return on investments (Goodhue
et al., 1992; Madnick, 1995).
In addition to knowledge incompatibility, there may also exist incompatibilities of
technological infrastructure across organizational boundaries. These incompatibilities not

only can thwart an organization’s ability to identify and transfer valuable knowledge to/from
its IOS partners, but may also increase the potential for mistrust and conflict.
To enhance compatibility, common technological standards and knowledge representations
with standard definitions and codes need to be shared, and a common language for
communicating about business procedures and events must be established across IOS
users.
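The compatibility problem just described (different identifiers and semantics for the same business entity) is commonly addressed by mapping each partner's records onto a shared canonical schema. A minimal sketch follows, in which all field names, codes, and mappings are hypothetical:

```python
# Two partners describe the same customer entity with different field
# names and country codes; a shared canonical schema reconciles them.
CANONICAL_FIELDS = ("customer_id", "name", "country")

FIELD_MAPS = {
    "partnerA": {"cust_no": "customer_id", "cust_name": "name", "ctry": "country"},
    "partnerB": {"client_ref": "customer_id", "full_name": "name", "nation": "country"},
}

COUNTRY_CODES = {"USA": "US", "United States": "US"}  # normalize semantics

def to_canonical(partner, record):
    """Rename partner-specific fields, then normalize coded values."""
    mapped = {FIELD_MAPS[partner][k]: v for k, v in record.items()}
    mapped["country"] = COUNTRY_CODES.get(mapped["country"], mapped["country"])
    return {f: mapped[f] for f in CANONICAL_FIELDS}

a = to_canonical("partnerA", {"cust_no": "17", "cust_name": "Acme", "ctry": "USA"})
b = to_canonical("partnerB", {"client_ref": "17", "full_name": "Acme", "nation": "United States"})
print(a == b)  # True
```

The same idea underlies the standards-based interfaces discussed throughout: once both partners agree on the canonical definitions, each maintains only its own mapping rather than one per counterpart.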
Absorptiveness of receptors/processors. In pooled knowledge resources IOS, given a low
level of direct human interaction, patterned interorganizational interactions are mostly
between human processors and computer processors, or computer processors and
computer processors. Thus, the absorptiveness of receptors/processors can be largely
determined and enhanced by the use of knowledge technologies that focus on
‘‘codification’’ (Milton et al., 1999), or the ‘‘storing,’’ ‘‘massaging,’’ ‘‘structuring,’’
‘‘integrating,’’ ‘‘filtering,’’ ‘‘navigating,’’ and retrieving of reusable knowledge assets
from/to shared repositories (O’Leary, 2003). Examples of such technologies may include
artificial intelligence tools, meta/web crawlers, and taxonomy/ontological tools (Tsui, 2003).
Additionally, technologies that facilitate ‘‘discovery,’’ and ‘‘capture/monitor’’ (Milton et al.,
1999) across organizational boundaries, such as data/text mining, could also provide
effective means for increasing an organization’s receptivity (Upton and McAfee, 1996;
Majchrzak et al., 2000; Tsui, 2003).
In value/supply-chain IOS, contacts between chain members can be largely patterned into
human-computer and computer-computer interactions that employ interface standards.
Besides standards, plans and schedules are also used for interorganizational coordination,
increasing the degree of direct human interaction. Thus, in addition to the use of
technologies for enhancing ‘‘codification,’’ shared knowledge backgrounds and common
skills of human processors are also important in increasing the absorptive ability of an
organization.
In contrast with the other two types of IOS, networked IOS involve a high degree of
human-human interaction. A large proportion of the critical knowledge handled by a
networked IOS can be tacit and unstructured. As such, the receptivity of an organization can
be greatly influenced by the absorptive skills of individual human processors. Technologies
are used to provide process support for enhancing human absorptive skills:
B by connecting and locating people (Tsui, 2003) and optimizing the frequency and
intensity of socio-technical interactions (Dyer and Singh, 1998);
B by facilitating the sharing of context (Nomura, 2002) and development of common
knowledge bases;
B by increasing the capability of capturing and locating tacit and unstructured knowledge
(Majchrzak et al., 2000; May and Carter, 2001); and
B by improving analytical and decision-making capabilities.
Thus, technologies that focus on ‘‘personalization’’ (Milton et al., 1999; Tsui, 2003) and
support learning, such as collaborative construction tools (e.g. CAD/CAM, collaborative
authoring), computer conferencing, and group decision support systems are important in
enhancing an organization’s receptivity.

4.2.4 Key issues in decision making


Decision making is another key behavioral process that influences the outcome of
interorganizational collaboration (recall the collaboration model in Figure 1). Imbalance in
decision-making authority may lead to perceived injustice and mistrust, and create an
environment prone to opportunism and conflict, while shared decision making can facilitate
perceived equality and a trusting relationship, thus enhancing interorganizational
collaboration (Sarkar et al., 1998).
In pooled knowledge resources IOS, the parties sharing the resources do not need to
directly interact with each other, and the decision-making process of each party is relatively

independent from that of others. However, perceived inequality and mistrust in decision
making could arise from one party’s possession and control of the shared resources. For
example, in the airline industry, American and United attempted to bias screen displays of
their computer-based reservation systems to discourage price comparisons (Bakos, 1991),
or to restrict travel agents from booking tickets from the other airlines (Malone et al., 1987).
In those situations where shared knowledge resources are controlled by one of the partners
and the controlling party is also a competitor of the other parties, this dominant party may
use its controlling position to intentionally damage other parties (Copeland and McKenney,
1988). As a result, distrust in the system controls and perceived loss of power in decision
making are likely to arise (Kumar and van Dissel, 1996), increasing conflict potential and
inhibiting cooperative behaviors. One way to address this issue is to place control of
common resources in the hands of a neutral third-party (such as a trade association,
government agency, or joint venture company) (Konsynski and McFarlan, 1990; Kumar and
van Dissel, 1996) in order to increase the participants’ perceived justice and control in
decision making, thus increasing trust and collaborative effectiveness.
In value/supply-chain IOS, particularly in a proprietary network, the loss of resource control
or an inability to access quality knowledge may induce perceived inequality and loss of
power in decision making, thereby impeding trust building and retarding collaboration
success. For example, in the mid-1980s, the Ford Motor Company established a proprietary EDI system,
Fordnet. In pursuing its agendas for reducing market uncertainty, or simply for locking
trading partners into its trading relationship, Ford imposed its information handling practices
on all of its European trading partners, extending its own hardware systems into its suppliers’
premises, dictating product and inventory coding according to its own proprietary system,
and dictating the type and frequency of data to be exchanged (Webster, 1995).
Consequently, many Fordnet users experienced decreased trust arising from a perceived
loss of decision power in the trading relationship. Additionally, the transaction-specific
investment in the Fordnet EDI also increased the risk of suboptimization perceived by the
Fordnet users, further impeding the collaboration between Ford and its partners.

As suggested by the case of Fordnet, the use of more open standards and migration from a
proprietary network to a more open network may provide a viable solution for promoting
participative decision making and increasing perceived justice and reciprocity.
Furthermore, promoting shared understanding and mutual interests among participants
would also enhance perceived equality and decision power, facilitating the growth of trust
and collaboration.
In networked IOS, decision making is quite different from that in the other two types of
IOS. It involves highly interrelated processes and intense interactions among
participants. Many studies have indicated that use of interactive technologies can
greatly enhance the shared processes in decision making (Bowker and Star, 2001).
Some have found that technologically mediated communication creates less role
differentiation among the participants than does face-to-face communication (Kiesler
and Sproull, 1996). Others have found that for groups communicating via e-mail, there
tends to be uninhibited communication, more communication among participants of
different status, and more equal participation (Kiesler et al., 1984; Rice and Rogers,
1984; Siegel et al., 1986).
However, in networked IOS, the increased degree of human interaction and mutual
adaptation may also increase performance equivocality and human misunderstanding.
Thus, reducing such misunderstanding becomes important in facilitating the
decision-making process in networked IOS. One approach to reduce misunderstanding is
to foster trust. Another approach involves central repositories that provide a common
knowledge base for sharing visions and contexts among the participants, such as
discussion forums, frequently asked questions (FAQs) facilities, and electronic libraries with
problem definitions, successful experiences and best practices (Majchrzak et al., 2000; May
and Carter, 2001).

4.2.5 Governance mechanisms for conflict resolution
Besides transparent knowledge sharing and participative decision-making, the model
portrayed in Figure 1 identifies conflict governance as a third key element underlying
effective interorganizational collaboration. In an IOS network, conflict could arise from
opportunistic behaviors, impeding the success of interorganizational collaboration.
Opportunistic behaviors may occur when managing shared technology-based resources,
such as shared archives. Or, they may take place in the transactional activities that are
handled by the IOS. Thus, both technological governance and business governance are
needed for preventing opportunism and resolving conflict so as to foster trust and enhance
collaborative advantage.
Technological governance: this includes various technical protocols, standards, system
security controls, and knowledge security controls. Technological governance can be
decomposed into three subtypes:
(1) Open standards, such as XML.
(2) Industry-specific standards, such as the SWIFT standard used in the international
banking industry (Keen, 1991) and universal product code (UPC) in the grocery industry
(Cash and Konsynski, 1985).
(3) Proprietary or company-specific standards, such as the manufacturing automation
protocol (MAP) used by General Motors (Keen, 1991).
Open standards and industry-specific standards are likely to be used in pooled knowledge
resources IOS because of the large number of participants involved. Proprietary or
company-specific standards may be used in value/supply-chain IOS and networked IOS (Li
and Williams, 1999). Recent trends indicate that value/supply-chain IOS and networked IOS
are moving toward the use of more open standards.
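As a small illustration of what an open standard such as XML buys, a business document produced by one partner can be parsed by any other without proprietary software; here using Python's standard library. The element names below are illustrative, not an actual industry schema.

```python
import xml.etree.ElementTree as ET

# A minimal XML purchase-order message of the kind partners might
# exchange over an open-standard interface (element names hypothetical).
doc = """<purchaseOrder>
  <buyer>firmA</buyer>
  <supplier>firmB</supplier>
  <line sku="pepsi-12oz" quantity="240"/>
</purchaseOrder>"""

root = ET.fromstring(doc)
line = root.find("line")
print(root.findtext("buyer"), line.get("sku"), int(line.get("quantity")))
# firmA pepsi-12oz 240
```

Because the format is self-describing text rather than a proprietary binary layout, any participant can validate, transform, or archive such messages with generic tooling, which is the governance appeal of open standards.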
B Business governance: business governance involves formal governance, such as legal
contracts (Macneil, 1974, 1978; Ring and van de Ven, 1992), and informal governance,
such as institutional norms (Zucker, 1986), reputation (Zucker, 1986; Resnick et al., 2000;
Adler, 2001), and trust (Arrow, 1974; Ouchi, 1979; Bradach and Eccles, 1989; Williamson,
1993).
B Formal governance: based on characteristics of the transactions to be conducted for the
three IOS classes, three types of legal contracts can be applied. With pooled knowledge
resources IOS, arm’s-length market transactions are likely to be involved, and thus a
classical contract[4] would be appropriate. With value/supply-chain IOS, recurrent
transactions are likely to occur between the chain members, and thus a neoclassical
contract[5] would be appropriate. With networked IOS, relational activities take place,
and thus a relational contract[6] is suitable.
B Informal governance: with pooled knowledge resources IOS, given the minimum amount of
interaction, institutional norms that define each other’s behaviors (Zucker, 1986) and
reputation that is established through a network of trusted third parties (Zucker, 1986;
Resnick et al., 2000; Adler, 2001) may serve as effective governance mechanisms
supplementing the classical contract. In value/supply-chain IOS and networked IOS, with an
increasing degree of human interaction, trust assumes a greater role as an effective
mechanism for governing opportunistic behaviors and resolving conflict (Arrow, 1974;
Ouchi, 1979; Bradach and Eccles, 1989; Williamson, 1993). Reputation and norms of
reciprocity can also provide useful governance, as well as further facilitate the growth of trust.

4.2.6 IOS technologies


Based on the characteristics and roles of each type of IOS as well as the practices involved
in knowledge sharing, decision making, and conflict governance, we next classify a variety
of candidate IOS technologies and application systems:
B Pooled knowledge resources IOS usually involve a large number of participants, highly
structured interactions among participants and a relatively low degree of human contact.

They are used to provide shared knowledge resources to reduce uncertainty and achieve
economies of scale and scope by sharing costs and risks among participants (Konsynski
and McFarlan, 1990). Implementation technologies require a focus on ‘‘codification’’ (i.e.
‘‘capturing existing knowledge and placing this in repositories in a structured manner’’)
(Milton et al., 1999, p. 619; Tsui, 2003). Thus, technologies for communications (e.g.
communications networks, standards and protocols) and for content management (e.g.
shared repositories, knowledge acquisition and retrieval) can serve as good application
candidates. Table II shows some examples.
B Value/supply-chain IOS involve relatively structured and linear relations between adjacent
chain members, whose interaction interfaces can be largely standardized. They are used
primarily for purposes of reducing uncertainty, streamlining flows of knowledge, services,
and products, and increasing efficiency. Implementation technologies also focus on
‘‘codification.’’ It is worth noting that interdependencies between firms are different from
the ways in which tasks/activities are interrelated. For example, sequential dependency
between firms along a supply chain may involve many different task/activity
relationships, such as ‘‘sharing,’’ ‘‘flow,’’ ‘‘fit,’’ concurrent tasks, and task-subtask (Malone
and Crowston, 1999, p. 429; Holsapple and Whinston, 2001, p. 585). ‘‘Sharing’’
relationships occur when multiple activities use the same resource. ‘‘Flow’’ relationships
arise when one activity produces a resource that is used by another activity, involving
sequencing, transfer, and usability. ‘‘Fit’’ relationships occur when multiple activities
collectively produce one resource. Concurrent tasks arise when multiple activities occur
simultaneously. Task-subtask relationship arises when one activity involves multiple
subactivities.

Table II Pooled knowledge resources IOS: implementation technologies and applications

Interfirm communication
  Messaging services: e-mail; fax; instant messaging; voice mail
  Publishing services: open posting (e.g. electronic bulletin board); controlled posting (e.g. FAQs)
  Channel management: call center; electronic funds transfer at point-of-sales (EFTPoS); web site; wireless device
  Communications network: peer-to-peer communication; broadband communications; intranet; extranet; internet; wireless networks
  Communication standards and protocols: electronic data interchange (EDI); extensible mark-up language (XML); simple object access protocol (SOAP); web services description language (WSDL); universal description, discovery, and integration (UDDI)
Content management
  Shared repositories: databases, data warehouses; digital documents, archives
  Knowledge acquisition and retrieval: knowledge navigation (e.g. web browser); knowledge search (e.g. expert finder tool, meta/web-crawler, taxonomy/ontological tools)
  Knowledge discovery and generation: analytics (e.g. OLAP, simulation, modeling); mining (e.g. data mining, text mining); artificial intelligence (e.g. intelligent agents, case-based reasoning, neural networks, genetic algorithm, rule engines)
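As a concrete illustration of the "knowledge search" entry in Table II, a naive keyword-overlap retriever over a shared repository might look like the following sketch (all document names and contents here are invented for illustration, not drawn from any particular product):

```python
# Minimal sketch of keyword retrieval over a shared repository:
# documents are ranked by how many query terms they contain.
# Document names and contents are illustrative assumptions.

def search(repository, query):
    """Rank documents in a shared repository by query-term overlap."""
    terms = set(query.lower().split())
    scored = []
    for name, text in repository.items():
        words = set(text.lower().split())
        score = len(terms & words)          # number of matching query terms
        if score:
            scored.append((score, name))
    return [name for score, name in sorted(scored, reverse=True)]

repo = {
    "supplier_faq": "EDI message formats and supplier onboarding",
    "ops_manual": "warehouse scheduling and replenishment procedures",
    "hr_policy": "leave policy and benefits",
}
print(search(repo, "EDI supplier formats"))  # supplier_faq ranks first
```

A production system would of course use stemming, term weighting, and an inverted index, but the repository-plus-retrieval shape is the same.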

Therefore, the coordination technologies that focus on supporting structured and
semi-structured tasks/activities along the value/supply chain may serve as good
candidate technologies for value/supply-chain IOS. These technologies may include
scheduling resources and tasks across companies (Malone and Crowston, 1999;
Holsapple and Whinston, 2001, p. 585), managing customer-supplier relationships
(Holsapple and Whinston, 2001, p. 585), and interorganizational workflow automation
(van der Aalst, 2000). Scheduling techniques manage "sharing" relationships through
mechanisms such as "first come/first serve," priority order, budget, managerial decision,
and competitive bidding, and manage "flow" relationships through techniques such as
CPM and PERT for project management. Managing customer-supplier
relationships focuses on the ‘‘flow’’ relationships between activities along a value
chain. Technologies may involve customer relationship management, supply chain
management, EDI systems, collaborative planning, forecasting and replenishment
systems. Workflow automation is used for structured business processes across firms
with a predefined set of tasks and routing constructs. Workflow automation involves
managing concurrent tasks, task-subtask relationships, and multi-participant tasks.
Table III lists some examples of candidate technologies and applications.
• Networked IOS have a focus on people and their work styles, especially how they create
ideas and what knowledge resources they use. Networked IOS are particularly
instrumental in three aspects: agile problem solving by delivering just-in-time knowledge
among individuals across organizations, expertise co-development by supporting
deeper and more tacit knowledge sharing among professionals, and innovation by
optimizing interactions with customers and utilizing their knowledge (Nomura, 2002).
Each of these aspects highlights human ingenuity and involves a tacit and less structured
learning process. Thus, implementation technologies focus on ‘‘personalization’’ (i.e.
‘‘locating and connecting people’’) (Milton et al., 1999; Tsui, 2003, p. 6). Groupware,
threaded discussions, computer conferencing, and collaborative construction tools (e.g.
design, authoring) may serve as good candidates. Table IV lists some examples.

5. Conclusion
IOS are assuming an increasing role in facilitating and enabling interorganizational
collaboration. Yet, the existing literature on IOS is fragmented and provides limited
understanding of the relationship between IOS technologies and the knowledge-intensive
phenomenon of interorganizational collaboration. In this paper, we introduce an integrative
model of IOS collaboration and identify knowledge sharing, participative decision making,
and conflict governance as three behavioral process elements underlying effective

Table III Value/supply-chain IOS: implementation technologies and applications

Interfirm coordination
  Scheduling resources and tasks: e.g. "first come/first serve," priority order, budget, managerial decision, competitive bidding, CPM and PERT for project management
  Managing customer-supplier relationships: e.g. customer relationship management, supply chain management, EDI systems, collaborative planning, forecasting and replenishment systems
  Workflow automation: e.g. concurrent tasks, task-subtask relationships, managing multi-participant tasks
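The CPM-style scheduling of "flow" relationships mentioned above can be sketched with a short routine that computes earliest finish times over an acyclic task graph; the task names and durations below are hypothetical, not drawn from the paper:

```python
# Critical path method (CPM) sketch: earliest finish times over "flow"
# relationships between tasks. Task names/durations are hypothetical.

def earliest_finish(durations, predecessors):
    """Return earliest finish time per task (assumes an acyclic graph)."""
    finish = {}
    def ef(task):
        if task not in finish:
            # a task can start only after all its predecessors finish
            start = max((ef(p) for p in predecessors.get(task, [])), default=0)
            finish[task] = start + durations[task]
        return finish[task]
    for task in durations:
        ef(task)
    return finish

durations = {"order": 1, "produce": 4, "ship": 2, "invoice": 1}
predecessors = {"produce": ["order"], "ship": ["produce"], "invoice": ["order"]}
print(earliest_finish(durations, predecessors))
# e.g. "ship" finishes at 1 + 4 + 2 = 7
```

A backward pass from the project finish time would give latest start times and hence the critical path; PERT adds probabilistic durations on top of the same graph.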

Table IV Networked IOS: implementation technologies and applications

Interfirm cooperative work
  Collaborative construction: e.g. CAD/CAM, authoring
  Collaborative timing/meeting management: e.g. calendaring, computer conferencing
  Threaded discussion: e.g. community of practice
  Multi-participant decision support: e.g. group decision support, organizational decision support

interorganizational collaboration. Extending Kumar and van Dissel’s framework to directly
recognize these elements, we develop a more complete collaboration-oriented IOS
framework for characterizing key elements of interorganizational collaboration and
classifying IOS technologies.
For researchers, this paper contributes to a deeper, fuller understanding of issues involved
in achieving collaborative advantage with IOS technologies. Both the IOS collaboration
model and the collaboration-oriented IOS framework provide a basis for further exploration
of the uses and impacts of IOS technologies in interorganizational collaboration. They
identify factors and relationships that researchers should consider in designing empirical
studies, posing hypotheses about collaboration via IOS, and analyzing results.
For educators, this paper brings together diverse ideas into a systematic view of
collaboration via interorganizational systems. It outlines IOS characteristics, classifies them,
and highlights issues related to their deployment. As such, the model and framework can be
used to identify and structure course content concerned with collaboration and IOS.
For practitioners, this paper provides useful guidance for IOS users by highlighting key
elements of collaboration, presenting empirical examples and addressing key issues,
practices, and solutions involved in the IOS collaboration. The model and framework serve
as a checklist of considerations that need to be dealt with by leaders of
collaboration-oriented IOS initiatives. The IOS framework and technology classification
may also suggest ways in which IT vendors might provide better technological solutions,
services, and software for interorganizational collaboration.
As more and more interorganizational system links are established between firms, the
question of how to develop collaborative IT relationships and optimize the use of IOS grows
in importance. The answer involves methods to promote such process behaviors as
knowledge sharing and participative decision making among IOS users, while
simultaneously aligning with effective governance mechanisms to facilitate these
behaviors, inhibit opportunistic behaviors and power plays, and ultimately yield
collaborative advantages in the directions of greater productivity, agility, innovation,
and/or reputation.

Notes
1. This is an application of Oliver’s (1990) contingency theory of interorganizational relationship
formation. Oliver proposes six critical causes – necessity, asymmetry, reciprocity, efficiency,
stability, and legitimacy – as generalizable predictors of interorganizational relationship formation
across organizations, settings, and linkages. These are used to structure our examination of the IOS
literature. Our examination yielded two additional motives: innovation and agility.
2. The notion of interorganizational interface comes from Malone’s (1985) concept of ‘‘organizational
interface.’’ Malone (1985, p. 66) suggests that ‘‘the term ‘interface’ was originally used in computer
science to mean a connection between programs or program modules;’’ later, the phrase ‘‘user
interface’’ becomes common and is used to include the connection between a human user and a
computer system; in the same spirit, this usage can be extended to include ‘‘organizational
interface,’’ which can be defined as ‘‘the parts of a computer system that connect human users to
each other and to the capabilities provided by computers.’’ Here, we further extend the concept of
‘‘organizational interface’’ to interorganizational interface to emphasize the parts of the computer
systems that connect human users to each other and to the capabilities provided by computers
shared by two or more organizations.
3. In the literature, a widely used term is ‘‘information asymmetry.’’ In this paper, we extend this
concept to knowledge asymmetry to emphasize that a less structured or more tacit knowledge
dimension (e.g. vision and understanding about certain markets and demands) is involved in the
interplay between different value/supply chain members.
4. Classical contract involves one-time, short-term, arms-length market transactions between
autonomous and independent parties. ‘‘The conditions associated with these transactions are
‘sharp in;’ that is, they are accompanied by a clear-cut, complete, and monetized agreement. They
are also ‘sharp out,’ i.e. the seller’s debt of performance and the buyer’s debt of payment are

unambiguous.’’ ‘‘The property, products, or services exchanged tend towards the non-specific, and
can be transacted among many traders’’ (Ring and Van De Ven, 1992, p. 485).
5. Neoclassical contract involves relatively short-term, ‘‘repeated exchanges of assets that have
moderate degrees of transaction specificity. The terms of these exchanges tend to be certain, but
some contingencies may be left to future resolution.’’ ‘‘The parties see themselves as autonomous,
legally equal, but contemplating a more embedded relationship’’ (Ring and Van De Ven, 1992, p.
487).
6. Relational contract involves ‘‘long-term investments that stem from groundwork laid by recurrent
bargaining on the production and transfer of property rights among these legally equal and
autonomous parties. The property, products, or services jointly developed and exchanged in these
transactions entail highly specific investments, in ventures that cannot be fully specified or
controlled by the parties in advance of their execution’’ (Ring and Van De Ven, 1992, p. 487).

References
Adler, P.S. (2001), ‘‘Market, hierarchy, and trust: the knowledge economy and the future of capitalism’’,
Organization Science, Vol. 12 No. 2, pp. 215-34.
Arrow, K.J. (1974), The Limits of Organization, Norton, New York, NY.
Badaracco, J.L. Jr (1991), Knowledge Link: How Firms Compete through Strategic Alliances, Harvard
Business School Press, Boston, MA.

Bakos, J.Y. (1991), ‘‘A strategic analysis of electronic marketplaces’’, MIS Quarterly, Vol. 15 No. 3,
pp. 295-310.
Barrett, S. and Konsynski, B.R. (1982), ‘‘Interorganizational information sharing systems’’, MIS Quarterly,
Special Issue, pp. 93-105.

Bowker, G.C. and Star, S.L. (2001), ‘‘Social theoretical issues in the design of collaboratories:
customized software for community support versus large-scale infrastructure’’, in Olson, G.M., Malone,
T.W. and Smith, J.B. (Eds), Coordination Theory and Collaboration Technology, Erlbaum Associates,
Inc., Mahwah, NJ, pp. 713-38.

Bradach, J.L. and Eccles, R.G. (1989), ‘‘Price, authority, and trust: from ideal types to plural forms’’,
Annual Review of Sociology, Vol. 15, pp. 97-118.
Cash, J.I. and Konsynski, B.R. (1985), ‘‘IS redraws competitive boundaries’’, Harvard Business Review,
March-April, pp. 134-42.

Christiaanse, E. and Venkatraman, N. (2002), ‘‘Beyond SABRE: an empirical test of expertise
exploitation in electronic channels’’, MIS Quarterly, Vol. 26 No. 1, pp. 15-38.
Clemons, E.K. and Row, M.C. (1991), ‘‘Sustaining IT advantage: the role of structural differences’’, MIS
Quarterly, Vol. 15 No. 3, pp. 275-92.
Clemons, E.K. and Weber, B.W. (1990), ‘‘London’s big bang: a case study of information technology,
competitive impact, and organizational change’’, Journal of Management Information Systems, Vol. 6
No. 4, pp. 41-60.
Copeland, D.G. and McKenney, J.L. (1988), ‘‘Airline reservations systems: lessons from history’’, MIS
Quarterly, Vol. 12 No. 3, pp. 353-70.
Dyer, J.H. and Singh, H. (1998), ‘‘The relational view: cooperative strategy and sources of
interorganizational competitive advantage’’, Academy of Management Review, Vol. 23 No. 4, pp. 660-79.
Ferratt, T.W., Lederer, A.L., Hall, S.R. and Krella, J.M. (1996), ‘‘Swords and plowshares: information
technology for collaborative advantage’’, Information & Management, Vol. 30, pp. 131-42.
Freed, L. (1999), ‘‘E-mail servers and groupware’’, PC Magazine, November 2, pp. 151-71.
Gallivan, M.J. and Depledge, G. (2003), ‘‘Trust, control and the role of interorganizational systems in
electronic partnerships’’, Information Systems Journal, Vol. 13 No. 2, pp. 159-90.
Goodhue, D.L., Wybo, M.D. and Kirsch, L.J. (1992), ‘‘The impact of data integration on the costs and
benefits of information systems’’, MIS Quarterly, Vol. 16 No. 3, pp. 293-311.
Grant, R.M. (1996), ‘‘Prospering in dynamically-competitive environments: organizational capability as
knowledge integration’’, Organization Science, Vol. 7 No. 4, pp. 375-87.

Grosvenor, F. and Austin, T.A. (2001), ‘‘Cisco’s eHub initiative’’, Supply Chain Management Review,
July/August, pp. 28-35.

Hamel, G. (1991), ‘‘Competition for competence and inter-partner learning within international strategic
alliances’’, Strategic Management Journal, Vol. 12, pp. 83-103.

Hart, P. and Saunders, C. (1997), ‘‘Power and trust: critical factors in the adoption and use of electronic
data interchange’’, Organization Science, Vol. 8 No. 1, pp. 23-42.

Holland, C.P. (1995), ‘‘Cooperative supply chain management: the impact of interorganizational
information systems’’, Journal of Strategic Information Systems, Vol. 4 No. 2, pp. 117-33.

Holsapple, C.W. (1995), ‘‘Knowledge management in decision making and decision support’’,
Knowledge and Policy, Vol. 8 No. 1, pp. 5-22.

Holsapple, C.W. and Joshi, K. (2000), ‘‘An investigation of factors that influence the management of
knowledge in organizations’’, Journal of Strategic Information Systems, Vol. 9 No. 2/3, pp. 237-63.

Holsapple, C.W. and Singh, M. (2001), ‘‘The knowledge chain model: activities for competitiveness’’,
Expert Systems with Applications, Vol. 20 No. 1, pp. 77-98.

Holsapple, C.W. and Whinston, A.B. (2001), Decision Support Systems: A Knowledge-Based Approach,
Thomson Learning Custom Publishing, Cincinnati, OH.

Hong, I.B. (2002), ‘‘A new framework for interorganizational systems based on the linkage of
participants’ roles’’, Information and Management, Vol. 39, pp. 261-70.

Iacovou, C.L., Benbasat, I. and Dexter, A.S. (1995), ‘‘Electronic data interchange and small
organizations: adoption and impact of technology’’, MIS Quarterly, Vol. 19 No. 4, pp. 465-85.

Johnston, H.R. and Vitale, M.R. (1988), ‘‘Creating competitive advantage with interorganizational
information systems’’, MIS Quarterly, Vol. 12 No. 2, pp. 153-65.

Kaufman, F. (1966), ‘‘Data systems that cross company boundaries’’, Harvard Business Review,
January-February, pp. 141-55.

Keen, P.G.W. (1991), Shaping the Future: Business Design through Information Technology, Harvard
Business School Press, Boston, MA.

Kiesler, S. and Sproull, L. (1996), Connections: New Ways of Working in the Networked Organization,
MIT Press, Cambridge, MA.

Kiesler, S., Siegel, J. and McGuire, T. (1984), ‘‘Social psychological aspects of computer-mediated
communication’’, American Psychologist, Vol. 39, pp. 1123-34.

King, J. (1996), ‘‘Sharing IS secrets’’, Computerworld, Vol. 30 No. 39, September 23.

Kling, R. (1980), ‘‘Social analysis of computing: theoretical perspectives in recent empirical research’’,
ACM Computing Surveys, Vol. 12 No. 1, pp. 61-110.

Konsynski, B.R. and McFarlan, F.W. (1990), ‘‘Information partnerships – shared data, shared scale’’,
Harvard Business Review, September-October, pp. 114-20.

Kumar, K. and Van Dissel, H.G. (1996), ‘‘Sustainable collaboration: managing conflict and cooperation
in interorganizational systems’’, MIS Quarterly, Vol. 20 No. 3, pp. 279-300.

Kumar, K., Van Dissel, H.G. and Bielli, P. (1998), ‘‘The merchant of Prato – revisited: toward a third
rationality of information systems’’, MIS Quarterly, Vol. 22 No. 2, pp. 199-226.

Lee, H.L. and Whang, S. (2000), ‘‘Information sharing in supply chain’’, International Journal of
Technology Management, Vol. 20 No. 3/4, pp. 373-87.

Levitin, A.V. and Redman, T.C. (1998), ‘‘Data as a resource: properties, implications, and prescriptions’’,
Sloan Management Review, Vol. 40 No. 1, pp. 89-101.

Lewis, J.D. (1990), Partnerships for Profit: Structuring and Managing Strategic Alliances, Free Press,
New York, NY.

Li, F. and Williams, H. (1999), ‘‘New collaboration between firms: the role of interorganizational systems’’,
Proceedings of the 32nd Hawaii International Conference on System Sciences, Maui, HI.

McAfee, A. (2000), ‘‘The Napsterization of B2B’’, Harvard Business Review, November-December,
pp. 18-19.

Macneil, I.R. (1974), ‘‘The many features of contract’’, Southern California Law Review, Vol. 47,
pp. 691-815.

Macneil, I.R. (1978), ‘‘Contracts: adjustments of long-term economic relationship under classical,
neoclassical, and relational contract law’’, Northwestern University Law Review, Vol. 72, pp. 854-906.

Madnick, S.E. (1995), ‘‘Integration technology: the reinvention of the linkage between information
systems and computer science’’, Decision Support Systems, Vol. 13, pp. 373-80.

Majchrzak, A., Rice, R.E., Malhotra, A. and King, N. (2000), ‘‘Technology adaptation: the case of a
computer-supported interorganizational virtual team’’, MIS Quarterly, Vol. 24 No. 4, pp. 569-600.

Malone, T.W. (1985), ‘‘Designing organizational interfaces’’, CHI ’85 Proceedings, April, pp. 66-71.

Malone, T.W. and Crowston, K. (1999), ‘‘Tools for inventing organizations: toward a handbook of
organizational processes’’, Management Science, Vol. 45 No. 3, pp. 425-43.

Malone, T.W., Yates, J. and Benjamin, R.I. (1987), ‘‘Electronic markets and electronic hierarchies’’,
Communications of the ACM, Vol. 30 No. 6, pp. 484-97.

May, A. and Carter, C. (2001), ‘‘A case study of virtual team working in the European automotive
industry’’, International Journal of Industrial Ergonomics, Vol. 27, pp. 171-86.

Meier, J. and Sprague, R.H. Jr (1991), ‘‘The evolution of interorganizational systems’’, Journal of
Information Technology, Vol. 6, pp. 184-91.

Migliarese, P. and Paolucci, E. (1995), ‘‘Improved communications and collaborations among tasks
induced by Groupware’’, Decision Support Systems, Vol. 14, pp. 237-50.

Milton, N., Shadbolt, N., Cottam, H. and Hammersley, M. (1999), ‘‘Towards a knowledge technology for
knowledge management’’, International Journal of Human-Computer Studies, Vol. 51, pp. 615-41.

Mirapaul, M. (2003), ‘‘Far-flung artworks, side by side online’’, The New York Times, May 22.

Molho, I. (1997), The Economics of Information: Lying and Cheating in Markets and Organizations,
Blackwell, Oxford.

Mowery, D.C., Oxley, J.E. and Silverman, B.S. (1996), ‘‘Strategic alliances and interfirm knowledge
transfer’’, Strategic Management Journal, Vol. 17, pp. 77-91.

Nomura, T. (2002), ‘‘Design of ‘Ba’ for successful knowledge management – how enterprises should
design the places of interaction to gain competitive advantage’’, Journal of Network and Computer
Applications, Vol. 25, pp. 263-78.

Nonaka, I. and Konno, N. (1998), ‘‘The concept of ‘ba’: building a foundation for knowledge creation’’,
California Management Review, Vol. 40 No. 3, pp. 40-54.

O’Leary, D.E. (2003), ‘‘Technologies for knowledge assimilation’’, in Holsapple, C.W. (Ed.), Handbook on
Knowledge Management, Vol. 2, Springer-Verlag, New York, NY, pp. 29-46.

Oliver, C. (1990), ‘‘Determinants of interorganizational relationships: integration and future directions’’,
Academy of Management Review, Vol. 15 No. 2, pp. 241-65.

Ouchi, W.G. (1979), ‘‘A conceptual framework for the design of organizational control mechanisms’’,
Management Science, Vol. 25 No. 9, pp. 833-48.

Payton, F.C. and Brennan, P.F. (1999), ‘‘How a community health information network is really used’’,
Communications of the ACM, Vol. 42 No. 12, pp. 85-9.

Pouloudi, A. (1999), ‘‘Information technology for collaborative advantage in health care revisited’’,
Information and Management, Vol. 35, pp. 345-56.

Reekers, N. and Smithson, S. (1995), ‘‘The impact of electronic data interchange on interorganizational
relationships: integrating theoretical perspectives’’, in Nunamaker, J.F. and Sprague, R.H. (Eds),
Proceedings of the 28th Annual Hawaii International Conference on System Sciences, Maui, HI, Vol. IV,
pp. 757-66.

Resnick, P., Zeckhauser, R., Friedman, E. and Kuwabara, K. (2000), ‘‘Reputation systems’’,
Communications of the ACM, Vol. 43 No. 12, pp. 45-8.

Rice, R.E. and Rogers, E.M. (1984), ‘‘New methods and data for the study of new media’’, in Rice, R.E.
et al. (Eds), The New Media: Communication, Research, and Technology, Sage, Beverly Hills, CA.

Ring, S.P. and Van de Ven, A.H. (1992), ‘‘Structuring cooperative relationships between organizations’’,
Strategic Management Journal, Vol. 13, pp. 483-98.
Rockart, J.F. and Short, J.E. (1991), ‘‘The networked organization and the management of
interdependence’’, in Scott Morton, M.S. (Ed.), The Corporation of the 1990s: Information Technology
and Organizational Transformation, Oxford University Press, New York, NY, pp. 189-219.
Sachs, P. (1995), ‘‘Transforming work: collaboration, learning, and design’’, Communications of the
ACM, Vol. 38 No. 9, pp. 36-44.

Sarkar, M.B., Aulakh, P.S. and Cavusgil, S.T. (1998), ‘‘The strategic role of relational bonding in
interorganizational collaborations: an empirical study of the global construction industry’’, Journal of
International Management, Vol. 4 No. 2, pp. 85-107.
Schechter, B. (2003), ‘‘Telescopes of the world, unite! A cosmic database emerges’’, The New York
Times, May 20.
Shankar, V. and O’Driscoll, T. (2002), ‘‘How wireless networks are reshaping the supply chain’’, Supply
Chain Management Review, July-August, pp. 44-51.
Siegel, J., Dubrovsky, V., Kiesler, S. and McGuire, T.W. (1986), ‘‘Group processes in computer-mediated
communication’’, Organizational Behavior and Human Decision Processes, Vol. 37, pp. 157-87.
Simatupang, T.M. and Sridharan, R. (2001), ‘‘A characterization of information sharing in supply chains’’,
available at: www.mang.canterbury.ac.nz/orsnz/conf2001/papers/Simatupang.pdf
Strader, T.J., Lin, F.R. and Shaw, M.J. (1998), ‘‘Information structure for electronic virtual organization
management’’, Decision Support Systems, Vol. 23, pp. 75-94.
Teo, H.H., Wei, K.K. and Benbasat, I. (2003), ‘‘Predicting intention to adopt interorganizational linkages:
an institutional perspective’’, MIS Quarterly, Vol. 27 No. 1, pp. 19-49.
Thomke, S. and von Hippel, E. (2002), ‘‘Customers as innovators: a new way to create value’’, Harvard
Business Review, April, pp. 74-81.
Thompson, J. (1967), Organizations in Action, McGraw-Hill, New York, NY.
Tsui, E. (2003), ‘‘Tracking the role and evolution of commercial knowledge management software’’,
in Holsapple, C.W. (Ed.), Handbook on Knowledge Management, Vol. 2, Springer-Verlag, New York, NY,
pp. 5-27.

Upton, D.M. and McAfee, A. (1996), ‘‘The real virtual factory’’, Harvard Business Review, July-August,
pp. 123-33.
Van der Aalst, W. (2000), ‘‘Loosely coupled interorganizational workflows: modeling and analyzing
workflows crossing organizational boundaries’’, Information & Management, Vol. 37, pp. 67-75.

Venkatraman, N. (1991), ‘‘IT-induced business reconfiguration’’, in Scott Morton, M.S. (Ed.),
The Corporation of the 1990s: Information Technology and Organizational Transformation, Oxford
University Press, New York, NY, pp. 122-58.
Venkatraman, N. and Zaheer, A. (1994), ‘‘Electronic integration and strategic advantage:
a quasi-experiment study in the insurance industry’’, in Allen, T.J. and Scott Morton, M.S. (Eds),
Information Technology and the Corporation of the 1990s: Research Studies, Oxford University Press,
New York, NY, pp. 184-201.
Webster, J. (1995), ‘‘Networks of collaboration or conflict? Electronic data interchange and power in the
supply chain’’, Journal of Strategic Information Systems, Vol. 4 No. 1, pp. 31-42.
Williamson, O.E. (1993), ‘‘Calculativeness, trust, and economic organization’’, Journal of Law and
Economics, Vol. XXXVI, pp. 453-86.
Yoshida, S., Kamei, K., Ohguro, T. and Kuwabara, K. (2003), ‘‘Shine: a peer-to-peer based framework of
network community support systems’’, Computer Communications, Vol. 26, pp. 1199-209.
Zaheer, A. and Zaheer, S. (1997), ‘‘Catching the wave: alertness, responsiveness, and market influence
in global electronic networks’’, Management Science, Vol. 43 No. 11, pp. 1493-509.
Zucker, L.G. (1986), ‘‘Production of trust: institutional sources of economic structure, 1840-1920’’,
Research in Organizational Behavior, Vol. 8, pp. 53-111.

Linking social network analysis with the
analytic hierarchy process for knowledge
mapping in organizations
Jay Liebowitz

Jay Liebowitz is in the Department of Information Technology, Graduate Division of Business and
Management, Johns Hopkins University, Rockville, Maryland, USA (Jliebow1@jhu.edu).

Abstract
Purpose – To provide an interesting approach for determining interval measures, through the analytic
hierarchy process, for integration with social network analysis for knowledge mapping in organizations.
Design/methodology/approach – In order to develop improved organizational and business
processes through knowledge management, a knowledge audit should be conducted to better
understand the knowledge flows in the organization. An important technique to visualize these
knowledge flows is the use of a knowledge map. Social network analysis can be applied to develop this
knowledge map. Interval measures should be used in the social network analysis in order to determine
the strength of the connections between individuals or departments in the organization. This paper
applies the analytic hierarchy process to develop these interval measures, and integrates the values
within the social network analysis to produce a meaningful knowledge map.
Findings – The analytic hierarchy process, when coupled with social network analysis, can be a useful
technique for developing interval measures for knowledge-mapping purposes.
Research limitations/implications – The analytic hierarchy process may become tedious and
arduous for use in large social network maps. More research needs to be conducted in this area for
scalability.
Practical implications – As social network analysis is gaining more prominence in the knowledge
management community, the analytic hierarchy process may be able to provide more valuable
measures to determine the strengths of relationships between actors than simply using ordinal numbers.
Originality/value – Coupling the analytic hierarchy process with social network analysis provides a
novel approach for future knowledge-mapping activities.
Keywords Social networks, Analytical hierarchy process, Auditing
Paper type Research paper

1. Introduction
Why are organizations so enamored with knowledge management? One important reason is
that they hope knowledge management processes will enable knowledge creation that
increases innovation in the organization (Chauvel and Despres, 2002; Earl, 2001).
Innovation may take the form of improved organizational business
processes, new products or services, or better ways for customer relationship management.
For example, JD Edwards applied knowledge management to its customer relationship
management by developing knowledge management for internal sales support first, taking
the lessons learned and successes into a second stage, and extending ‘‘the knowledge
garden’’ to its business partners and integrators (Harris et al., 2003).
Knowledge management can be in the form of idea management systems that allow
employee ideas and suggestions to be captured and shared online. These idea
management systems, such as Imaginatik’s Idea Central (Pluskowski, 2002), allow the
capture and sharing of ideas across the organization, and provide an efficient review
process to evaluate the ideas. Increased socialization encouraged by this approach can
lead to the impromptu formation of communities of practitioners who discover people with
similar interests from the ideas and the ensuing interactions (Pluskowski, 2002). These
interactions can hopefully lead to innovations in the organization.

PAGE 76 | JOURNAL OF KNOWLEDGE MANAGEMENT | VOL. 9 NO. 1 2005, pp. 76-86, © Emerald Group Publishing Limited, ISSN 1367-3270 DOI 10.1108/13673270510582974

To enhance the knowledge flows between people to stimulate innovative thinking,
organizations should first conduct a knowledge audit and develop a knowledge map of the
sources, sinks, and flows of knowledge in the organization. In other words, whom do
people go to in the organization for answers to questions or how are departments in the
organization interacting within and between each other? To help develop the knowledge
map, social network analysis (SNA) and associated visualization tools can be used to aid in
the analysis of this information. One way to improve the current state-of-the-art in SNA is to
develop new ways to produce interval/ratio measures of relations between the various
individuals in the organization to determine the strength of their ties. This paper discusses
a novel approach in applying the analytic hierarchy process (AHP) to generate the ratio
scores for the valued graphs to be used in SNA in order to develop a knowledge map of the
organization.
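As a rough sketch of this idea, not the author's actual procedure: AHP pairwise judgments on Saaty's 1-9 scale can be reduced to ratio-scale priorities (here via the common row geometric-mean approximation) and then used as tie strengths in a valued graph. The actors and judgment values below are hypothetical:

```python
# Sketch: derive ratio-scale tie strengths from an AHP pairwise
# comparison matrix via the row geometric-mean approximation.
# Actors and judgment values are hypothetical.
import math

def ahp_priorities(matrix):
    """Normalize row geometric means of a reciprocal comparison matrix."""
    n = len(matrix)
    gmeans = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# How strongly does Ann rely on each contact, judged pairwise on
# Saaty's 1-9 scale (matrix[i][j] = importance of contact i vs j)?
contacts = ["Bob", "Carol", "Dave"]
judgments = [
    [1,   3,   5],    # Bob vs (Bob, Carol, Dave)
    [1/3, 1,   3],    # Carol
    [1/5, 1/3, 1],    # Dave
]
weights = ahp_priorities(judgments)
edges = {("Ann", c): round(w, 3) for c, w in zip(contacts, weights)}
print(edges)  # Bob gets the largest tie strength
```

In practice a consistency check (Saaty's consistency ratio) would accompany each judgment matrix; the geometric-mean step above is only an approximation of the principal eigenvector method.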
This paper will first discuss knowledge mapping and then describe SNA. Then, the AHP
will be explained and linked with SNA to produce valued graphs used for knowledge
mapping.

2. Knowledge mapping
A key part of knowledge management (KM) is performing a knowledge audit to determine
knowledge flows within an organization. Specifically, the knowledge audit process involves
(Liebowitz et al., 2000):
(1) Identify what knowledge currently exists in the targeted area (typically select a core
competency that is cross-departmental/functional):
  • determine existing and potential sinks, sources, flows, and constraints in the targeted
    area, including environmental factors that could influence the targeted area;
  • identify and locate explicit and tacit knowledge in the targeted area; and
  • build a knowledge map of the taxonomy and flow of knowledge in the organization in
    the targeted area. The knowledge map relates topics, people, documents, ideas, and
    links to external resources, in respective densities, in ways that allow individuals to find
    the knowledge they need quickly.
(2) Identify what knowledge is missing in the targeted area:
  • perform a gap analysis to determine what knowledge is missing to achieve the
    business goals; and
  • determine who needs the missing knowledge.
(3) Provide recommendations from the knowledge audit to management regarding the
status quo and possible improvements to the knowledge management activities in the
targeted area.
An essential output of the knowledge audit process is the knowledge map, which provides insight for improving business and organizational processes. A knowledge map portrays the
sources, flows, constraints, and sinks (losses or stopping points) of knowledge within an
organization. Well-developed knowledge maps help identify intellectual capital (Liebowitz,
2003), socialize new members, and enhance organizational learning (Wexler, 2001).
Knowledge maps have been used for a variety of applications, even for developing a
knowledge map of knowledge management software tools (Noll et al., 2002). An
organization should map its knowledge to (Grey, 1999):

• encourage re-use and prevent re-invention, saving search time and acquisition costs;
• highlight islands of expertise and suggest ways to build bridges to increase knowledge sharing;
• discover effective and emergent communities of practice where learning is happening;
• provide a baseline for measuring progress with KM projects; and
• reduce the burden on experts by helping staff find critical information/knowledge quickly.
Some of the key principles of knowledge mapping are to establish boundaries and respect personal disclosures; to recognize and locate knowledge in a wide variety of forms; and to locate knowledge in processes, relationships, policies, people, documents, conversations, suppliers, competitors, and customers. The types of questions asked to develop a knowledge map include (Grey, 1999):
• What type of knowledge is needed to do your work?
• Who provides it, where do you get it, and how does it arrive?
• What do you do, how do you add value, and what are the critical issues?
• What happens when you are finished?
• How can the knowledge flow be improved? What is preventing you from doing more, better, faster?
• What would make your work easier?
• Who do you go to when there is a problem?
Typically, information is collected for the knowledge map by using the following methods (Grey, 1999):
• conduct surveys, interviews, and focus groups;
• observe the work in progress;
• obtain network traffic logs, policy documents, org charts, and process documentation;
• explore the common and individual file structures;
• concentrate on formal and informal gatherings, communications, and activities;
• gather from internal/external sources; and
• move across multiple levels (individual, team, department, organization).
To help develop a knowledge map, SNA can be used. The next section will address SNA
techniques and tools.

3. Social network analysis techniques and tools


According to Hanneman (2002), SNA is the mapping and measuring of relationships and
flows between people, groups, organizations, computers or other information/knowledge
processing entities. SNA involves actors (seeing how actors are located or ‘‘embedded’’ in
the overall network) and relations (seeing how the whole pattern of individual choices gives
rise to more holistic patterns). SNA has been used in sociology, anthropology, information
systems, organizational behavior, and many other disciplines. For example, Cross et al.
(2001) used SNA to partly determine if employees of the acquired firm are socializing and
seeking answers to questions from those of the acquiring firm. Related link analysis has also
been used in such applications as developing socio-spatial knowledge networks for chronic
disease prevention (Cravey et al., 2001) and developing academic networks and expertise
in British higher education (Eldridge and Wilson, 2003).
The basic idea of SNA is that individual people are nested within networks of face-to-face
relations with other persons. Families, neighborhoods, school districts, communities, and
even societies are, to varying degrees, social entities in and of themselves (Hanneman,
2002). The social network analyst is interested in how the individual is embedded within a
structure and how the structure emerges from the micro-relations between individual parts
(Hanneman, 2002). This could then be applied at an organizational level to see how the
‘‘actors’’ (e.g. employees, departments, etc.) relate to each other via their interactions.
Through SNA, a knowledge map could be generated to aid in the knowledge audit process.
Within SNA, there are a variety of strategies used to collect measurements on the relations
among the set of actors. One method is called ‘‘full network methods’’ whereby information
about each actor’s ties with all other actors is collected. One major limitation of this technique is
that it may be costly and difficult to collect full network data. A second group of methods is
called the ‘‘snowball methods’’ whereby the analyst begins with a focal actor or set of actors.
Then, each of these actors is asked to name some or all of their ties to other actors. Then, these
‘‘other actors’’ are asked for some or all of their ties, and the snowball effect continues until no
new actors are identified (or until some stopping rule is determined). The limitations with this
method are that those who are ‘‘isolated’’ may not be identified and it may be unclear as to
where to start the snowball rolling. The third major strategy used in SNA is the use of
egocentric networks. With this approach, one would begin with a selection of focal nodes
(egos) and identify the nodes to which they are connected. Then, one would determine which
of the nodes identified in the first stage are connected to one another. Egocentric methods focus on the individual rather than on the network as a whole (Hanneman, 2002).
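The snowball procedure described above can be sketched in a few lines of Python (a minimal illustration with an invented advice network; the `ask_ties` callback stands in for actually interviewing each actor):

```python
from collections import deque

def snowball_sample(seed_actors, ask_ties, max_waves=None):
    """Snowball sampling: start from focal actors, ask each newly named
    actor for their ties, and stop when no new actors appear (or after
    max_waves rounds, an optional stopping rule)."""
    known = set(seed_actors)
    frontier = deque(seed_actors)
    edges = set()
    waves = 0
    while frontier and (max_waves is None or waves < max_waves):
        next_frontier = deque()
        for actor in frontier:
            for other in ask_ties(actor):
                edges.add((actor, other))
                if other not in known:   # a newly identified actor
                    known.add(other)
                    next_frontier.append(other)
        frontier = next_frontier
        waves += 1
    return known, edges

# Hypothetical advice network: who each person says they go to for help.
ties = {"Jay": ["Bill", "Bob"], "Bill": ["Mary"], "Bob": ["Bill"],
        "Mary": ["Jay"], "June": ["Mary"]}
actors, edges = snowball_sample(["Jay"], lambda a: ties.get(a, []))
# June names a tie but is never named by anyone reachable from Jay,
# illustrating the "isolate" limitation of snowball sampling.
print(sorted(actors))
```

Note that June never enters the sample even though she participates in the network, which is exactly the isolate limitation mentioned above.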
In order to measure the information collected about ties between actors, there are nominal,
ordinal, and interval/ratio levels of measurement. The zero-one binary scale is an example of
a nominal scale, showing whether ties are absent (zero) or present (one). Multiple-category nominal measures of relations are akin to multiple choice, as opposed to the true-false binary representation. Ordinal measures of relations are similar to a Likert scale and can determine the ‘‘strength’’ of ties. However, the third class of measures of relations, the interval/ratio method, is the most advanced level of measurement because it allows actors to discriminate among relations (e.g. this tie is three times as strong as that tie) (Hanneman, 2002).
Examples of social networks are shown in Figures 1-3.

3.1 Social network analysis tools


There are a variety of SNA tools available for developing and visualizing social networks. The International Network for Social Network Analysis (www.sfu.ca/~insna/) is an excellent site that provides access to these various tools. Some of the leading SNA software packages are:
• Agna (www.geocities.com/imbenta/agna/index.htm);
• Analytic Technologies (Ucinet, Krackplot, Netdraw, Anthropac, etc.) (www.analytictech.com/);
• Classroom Sociometrics software (www.classroomsociometrics.com/);
• Fatcat (www.sfu.ca/~richards/Pages/fatcat.htm);
• InFlow (www.orgnet.com/index.html);
• Java for Social Networks (www.public.asu.edu/~corman/jsocnet/);
• MultiNet (www.sfu.ca/~richards/Multinet/Pages/multinet.htm);
• Negopy (www.sfu.ca/~richards/Pages/negopy4.html);
• NetMiner (www.netminer.com/NetMiner/home_01.jsp);
• Pajek (http://vlado.fmf.uni-lj.si/pub/networks/pajek/default.htm);
• Siena (http://stat.gamma.rug.nl/snijders/siena.html);
• SocioMetrica LinkAlyzer (www.md-logic.com/id142.htm);
• STOCNET (http://stat.gamma.rug.nl/stocnet/); and
• Visone (www.visone.de/).

Figure 1 Social network of 9/11 terrorist network (www.orgnet.com/hijackers.html)

Figure 2 Social network analysis of individuals within an IT department (using InFlow software) (www.orgnet.com)

InFlow, Krackplot, and NetMiner are three of the leading SNA tools. InFlow was developed by Valdis Krebs and has been used for a myriad of applications, ranging from analyzing patterns of e-mail usage to discovering book-buying behaviors at Amazon.com (Johnson, 2003). InFlow performs network analysis and network visualization in one integrated package (www.orgnet.com). Krackplot, originally developed by David Krackhardt and now distributed by Analytic Technologies, is a social network visualization program. NetMiner is also a network analysis and visualization tool; it includes 15 analysis tools commonly used in SNA research.

Figure 3 Social network analysis (www.orgnet.com) – org. mapping depicting departments

4. The analytic hierarchy process


An important part of SNA is being able to provide the measures of relations (i.e. strengths of
the ties) between the actors. We previously indicated that the interval/ratio method is
preferred for determining the measures of relations. One technique that has not been
applied to SNA to calculate the interval/ratio measures is the AHP.
AHP was developed by Thomas Saaty (1980) at the University of Pittsburgh to aid decision
makers in the evaluation process. It quantifies subjective judgments used in decision-making,
and has been applied in numerous applications throughout the world (Saaty, 1982). AHP uses
pairwise comparisons in order to determine relative levels of importance or preference. Expert
Choice is a software package that automates the use of AHP on a PC.
Let us take an example of using AHP in SNA. The first step is to construct a tree hierarchy showing the goal at the top, the criteria in the next level, and the alternatives at the lowest level. If we are trying to develop a knowledge map using SNA, then we would like to determine knowledge flows between individuals in the organization. Thus, the goal at the top of the hierarchy might be the question we ask Jay (an employee in the organization): ‘‘Who do you ask when you have a question involving . . . ?’’, as shown in Figure 4.
Figure 4 AHP/expert choice application

The criteria might be: office politics, institutional knowledge, general advice, and content knowledge. The alternatives might be Bill, Bob, Mary, and June (of course, you should include all the individuals pertinent to the application). Then, Jay can enter his verbal judgments to weight the criteria, and then weight the alternatives against each criterion, using AHP/Expert Choice, as follows: with respect to the goal, is ‘‘office politics’’ more important than ‘‘institutional knowledge’’, and if so, how much more important? A scale of relative importance developed by Saaty is applied to translate the verbal judgments into numeric values:
(1) Equal importance.
(3) Moderate importance.
(5) Strong importance.
(7) Very strong importance.
(9) Extreme importance.

The values of 2, 4, 6, and 8 are intermediate values. Jay would then enter all the pairwise
comparisons for the criteria, which would result in weighted criteria that add up to one. In this
example, Jay’s preferences on his criteria, with respect to the goal, are shown in Table I.
Now, the overall weight that Jay is assigning to each criterion must be determined. It is calculated by taking each entry and dividing by the sum of the column in which it appears (http://mat.gsia.cmu.edu/mstc/multiple/node4.html). For example, the (office politics, office politics) entry would end up as 1/(1 + 1/3 + 1 + 3) = 0.188. The other entries are shown in Table II.
The average weights on the criteria suggest that the most important criteria, in order,
according to Jay are content knowledge (about 39 percent), general advice (about 29
percent), office politics (about 22 percent), and institutional knowledge (about 10 percent).
Note that the percentages should add up to 100 percent.
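The column-normalization arithmetic above is easy to reproduce in code. The following is a minimal NumPy sketch; note that Expert Choice derives weights from the principal eigenvector of the comparison matrix, so its published figures (and Table II's) may differ slightly from this column-average approximation:

```python
import numpy as np

# Jay's pairwise comparisons of the four criteria (Table I), on Saaty's
# 1-9 scale; entry (i, j) says how much more important criterion i is
# than criterion j, and (j, i) holds the reciprocal.
A = np.array([[1.0, 3.0, 1.0, 1/3],   # office politics
              [1/3, 1.0, 1/3, 1/3],   # institutional knowledge
              [1.0, 3.0, 1.0, 1.0],   # general advice
              [3.0, 3.0, 1.0, 1.0]])  # content knowledge

# Divide each entry by its column sum, then average across each row
normalized = A / A.sum(axis=0)
weights = normalized.mean(axis=1)

criteria = ["office politics", "institutional knowledge",
            "general advice", "content knowledge"]
for name, w in zip(criteria, weights):
    print(f"{name}: {w:.3f}")
```

The resulting weights sum to one and reproduce the ordering reported in the text: content knowledge first, then general advice, office politics, and institutional knowledge.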
The next step is to weight the alternatives versus each criterion using pairwise comparisons.
The same procedure is conducted. The relative scores for each of the alternatives versus the
criteria are shown in Table III.

Table I Jay's preferences on his criteria

                         Office politics  Institutional knowledge  General advice  Content knowledge
Office politics                1                    3                    1               1/3
Institutional knowledge       1/3                   1                   1/3              1/3
General advice                 1                    3                    1                1
Content knowledge              3                    3                    1                1

Table II The overall weight of Jay's entries

                         Office politics  Institutional knowledge  General advice  Content knowledge  Average
Office politics               0.188              0.333                 0.300             0.124         0.223
Institutional knowledge       0.062              0.100                 0.100             0.124         0.096
General advice                0.188              0.333                 0.300             0.375         0.287
Content knowledge             0.563              0.333                 0.300             0.375         0.394

Table III The relative scores for each of the alternatives versus the criteria

                          Bill    Bob    Mary   June
Office politics          0.433   0.238  0.169  0.161
Institutional knowledge  0.250   0.250  0.250  0.250
General advice           0.357   0.172  0.235  0.235
Content knowledge        0.147   0.548  0.158  0.147

In terms of whom Jay should ask to get answers to his questions about office politics, Bill would be the best person to provide advice (since his weight (0.433) is the highest versus those for Bob, Mary, and June). Referring back to our overall weights, we can now get an overall value for each alternative. This is called the synthesis step. For example, the overall value for Bill is calculated as:

(0.223)(0.433) + (0.096)(0.250) + (0.287)(0.357) + (0.394)(0.147) = 0.281
Again, these weights should total one. Based on the synthesis, Bob has the highest weight (0.342), then Bill (0.281), Mary (0.192) and lastly June (0.185), as shown in Figure 5. This suggests, for example, that Jay seeks out Bill's help the most when it comes to getting answers to Jay's questions on office politics and general advice, and Bob's help the most when Jay has content knowledge-related questions. We could then use these values as the ‘‘strengths of the ties’’ between Jay and the other individuals. Thus, a valued graph as shown in Figure 6 can be developed using these ratio values. These could then be integrated within the SNA to help develop the knowledge map.

Figure 5 Synthesis using AHP/expert choice

Figure 6 Valued graph based upon AHP results
Integrating AHP with SNA can be a novel approach. However, there is a possible drawback.
If there are many individuals in the organization, it may become laborious and tedious to
apply the pairwise comparisons as there will be a tremendous number of comparisons to be
made. But, if the focus of study is on interactions within a specific department or between
departments in the organization, then this approach may be quite feasible.
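The synthesis step can likewise be sketched as a weighted matrix-vector product over the published tables (values are taken from Tables II and III; small rounding differences from the paper's third decimal place are expected):

```python
import numpy as np

# Criterion weights (Table II averages) and alternative scores per
# criterion (Table III; rows are criteria, columns are Bill, Bob, Mary, June)
criterion_weights = np.array([0.223, 0.096, 0.287, 0.394])
scores = np.array([
    [0.433, 0.238, 0.169, 0.161],  # office politics
    [0.250, 0.250, 0.250, 0.250],  # institutional knowledge
    [0.357, 0.172, 0.235, 0.235],  # general advice
    [0.147, 0.548, 0.158, 0.147],  # content knowledge
])

# Synthesis: each alternative's overall value is the criterion-weighted
# sum of its scores; these become the tie strengths in the valued graph.
overall = criterion_weights @ scores
ties = dict(zip(["Bill", "Bob", "Mary", "June"], overall))
for name, value in sorted(ties.items(), key=lambda kv: -kv[1]):
    print(f"Jay -> {name}: {value:.3f}")
```

The computed values reproduce the ordering in Figure 5 (Bob, then Bill, Mary, June) and could be emitted directly as edge weights for the valued graph of Figure 6.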

5. Summary
This paper introduces the integrated use of the AHP with SNA to produce interval/ratio
measures for use in an organization’s knowledge map. This approach has wider implications
in the knowledge management field than just knowledge mapping. For example, Weber and
Aha (2002) indicate that about 70 percent of the lessons learned systems are ineffective. A
major reason given is the lack of an active or ‘‘push’’ feature for analysis and dissemination of
the lessons. In order to determine what lessons would be appropriate to ‘‘push’’ to users, a
user profile could be developed. Here, AHP could be applied in building the profile to
determine the relative preferences on which topics of interest are relatively more important
than others to the user in order to receive lessons learned in these areas. SNA could then be
used to develop the valued graphs used in a knowledge map for depicting, for example, the
different types of lessons received.
Developing AHP weightings on nodes should enhance typical SNA algorithms for measures such as centrality and betweenness. We could also use the weightings to do more fine-grained filtering of the maps (e.g. removing weak links). Additionally, SNA maps may be used to annotate an AHP decision hierarchy. For example, when traversing the decision hierarchy, we might be able to understand how a particular weighting was arrived at by referencing across to the people making the judgments, to see how they might be related in a social network sense (e.g. tightly clustered or quite disparate).
This paper hopefully encourages others to apply AHP or other methods to determine the
ratio measures of relations used in SNA for knowledge mapping. In this manner, knowledge
maps could become more meaningful in depicting the strength of relationships for
organizational business process improvement.

References
Chauvel, D. and Despres, C. (2002), ‘‘A review of survey research in knowledge management:
1997-2001’’, Journal of Knowledge Management, Vol. 6 No. 3, pp. 207-23.
Cravey, A., Washburn, S., Gesler, W., Arcury, T. and Skelly, A. (2001), ‘‘Developing socio-spatial
knowledge networks: a qualitative methodology for chronic disease prevention’’, Social Science and
Medicine Journal, Vol. 52 No. 12, pp. 1763-75.
Cross, R., Borgatti, S. and Parker, A. (2001), ‘‘Beyond answers: dimensions of the advice network’’,
Social Networks Journal, Vol. 23 No. 3, pp. 215-35.
Earl, M. (2001), ‘‘Knowledge management strategies: toward a taxonomy’’, Journal of Management
Information Systems, Vol. 18 No. 1, pp. 215-33.
Eldridge, D. and Wilson, E. (2003), ‘‘Nurturing knowledge: the UK higher education links scheme’’,
Public Administration and Development Journal, Vol. 23 No. 2, pp. 125-209.
Grey, D. (1999), ‘‘Knowledge mapping: a practical overview’’, available at: www.it-consultancy.com/extern/sws/knowmap.html (accessed March).
Hanneman, R. (2002), ‘‘Introduction to social network methods’’, available at: www.faculty.ucr.edu/~hanneman/
Harris, K., Kolsky, E. and Lundy, J. (2003), ‘‘The case for knowledge management in CRM’’, Gartner
Research Note, Stamford, CT, April 21.
Johnson, S. (2003), ‘‘Who loves ya, baby?’’, Discover, Vol. 24 No. 4.
Liebowitz, J. (2003), Addressing the Human Capital Crisis in the Federal Government: A Knowledge
Management Perspective, Butterworth-Heinemann/Elsevier, Oxford.
Liebowitz, J., Montano, B., McCaw, D., Buchwalter, J., Browning, C., Newman, B. and Rebeck, K.
(2000), ‘‘The knowledge audit’’, Journal of Knowledge and Process Management, Vol. 7 No. 1.
Noll, M., Frohlich, D. and Schiebel, E. (2002), ‘‘Knowledge maps of knowledge management tools:
information visualization with BibTechMon’’, in Karagiannis, D. and Reimer, U. (Eds), Practical Applications
of Knowledge Management 2002 Conference Proceedings, Springer-Verlag, New York, NY.

Pluskowski, B. (2002), ‘‘Dynamic knowledge systems’’, White Paper, Imaginatik Research, Boston, MA,
June, available at: www.imaginatik.com
Saaty, T. (1980), The Analytic Hierarchy Process, McGraw Hill, New York, NY.
Saaty, T. (1982), Decision Makers for Leaders, Wadsworth Publishing, Belmont, CA.
Weber, R. and Aha, D. (2002), ‘‘Intelligent delivery of military lessons learned’’, Decision Support
Systems Journal, Vol. 34, pp. 287-304.
Wexler, M. (2001), ‘‘The who, what, and why of knowledge mapping’’, Journal of Knowledge Management,
Vol. 5 No. 3, pp. 249-64.

A knowledge-based system to support
procurement decision
H.C.W. Lau, A. Ning, K.F. Pun, K.S. Chin and W.H. Ip

Abstract
Purpose – To propose an infrastructure for a knowledge-based system that captures and maintains procurement information and purchasers' knowledge regarding how to choose partners in the supply chain network, adopting neural networks that mimic the operation of the human brain to generate solutions systematically.
Design/methodology/approach – The proposed system encompasses hybrid artificial intelligence (AI) technologies, online analytical processing (OLAP) applications and neural networks.
Findings – The system is able to capture the procurement data and vendors' information generated in the workflows, ensuring that the knowledge and structured information are captured without additional time and effort. It recognizes the void of research in the infrastructure of hybrid AI technologies for knowledge discovery.
Research limitations/implications – The neural network does not have the sensibility of the purchasing staff: it is not able to identify environmental changes that require the output to be re-adjusted to fit the environment.
Practical implications – The proposed system obtains useful information related to the trend of sales demand, in terms of customer preference and expected requirements, using the OLAP module; based on this information, the neural network provides recommendations on the suppliers capable of fulfilling the requirements.
Originality/value – This paper proposes a knowledge-based system that offers expandability and flexibility, allowing users to add more related factors for analysis to enhance the quality of decision making.
Keywords Deductive databases, Decision support systems
Paper type Research paper

DOI 10.1108/13673270510582983 | VOL. 9 NO. 1 2005, pp. 87-100, © Emerald Group Publishing Limited, ISSN 1367-3270 | JOURNAL OF KNOWLEDGE MANAGEMENT | PAGE 87

H.C.W. Lau is an Associate Professor, A. Ning is a Project Associate and W.H. Ip is an Associate Professor, all in the Department of Industrial and Systems Engineering, The Hong Kong Polytechnic University, Hong Kong. K.F. Pun is a Senior Lecturer in the Department of Mechanical Engineering, The University of the West Indies, Trinidad and Tobago, West Indies. K.S. Chin is an Associate Professor in the Department of Manufacturing Engineering and Engineering Management, City University of Hong Kong, Hong Kong.

Introduction

To compete in the ever-changing global market, it is crucial for manufacturing firms to exploit and develop their competitive advantages and achieve low-cost production by seeking reliable vendors who provide the best-value supplies. Most manufacturing firms rely heavily on purchasers' expertise and personal networks to make decisions on the selection of vendors. Indeed, it is critical for manufacturers to maintain and exploit the purchasers' knowledge and expertise for the long-term benefit of the corporation (Davenport, 1997). Knowledge management is a set of business processes that capture and deliver the collective experience. Corporations are investing seriously in the development of knowledge management systems, especially professional services firms, e.g. consulting firms, accounting firms, etc. Ofek and Sarvary (2001) identified the reasons why knowledge management is emerging in professional services firms. Growth and globalization are part of the objective, as both create the opportunity to utilize the dispersed experience and expertise in these firms. Another key reason for the deployment of knowledge management systems is the recent advances in information technology that enable firms to build systems integrating the experts' experiences, ensuring that the companies provide better services to their customers. Therefore, knowledge management has become a crucial tool for corporations to survive in the volatile marketplace and to achieve a competitive edge (Tiwana, 2000). There is no doubt that knowledge is one of an organization's most valuable resources; indeed, there are companies that treat knowledge
as an asset, just as real as any other asset that appears on the company's balance sheet. For example, Skandia performs an internal audit of the company's intellectual property and issues annual reports to stockholders to show investors the value of Skandia's knowledge capital (Davenport and Prusak, 2000).
In order to manage knowledge efficiently in an organization, companies have to define where the knowledge is kept (for instance, in memos, reports, presentations, press articles, etc.) and then put these unstructured forms of data into knowledge repositories. In fact, these knowledge repositories can be categorized into three types:
(1) External knowledge, such as the journal articles and market research information on the
competitors.
(2) Structured internal knowledge, such as the bills of materials, product specifications, and
production procedures.
(3) Informal internal knowledge, such as know-how, in the minds of people in the
organization (Davenport and Prusak, 2000).
In order to build a system with a knowledge repository that captures and embeds value-added knowledge and is able to enhance decision making in the corporation, it is necessary to build a knowledge-based system with the capability of categorizing and sorting out tremendous amounts of data while generating information to support decision making with intelligent features. For example, for its internal knowledge repository, Hewlett-Packard's electronic sales partner has added value through careful categorization and filtering. Therefore, the proposed system encompasses hybrid artificial intelligence (AI) technologies, online analytical processing (OLAP) applications and neural networks. The system repository captures the procurement data and vendors' information generated in the workflows, to ensure that the knowledge and structured information are captured without additional time and effort. The proposed system obtains useful information related to the trend of sales demand, in terms of customer preference and expected requirements, using the OLAP module; based on this information, the neural network provides a recommendation on the suppliers that are capable of fulfilling the requirements. The OLAP module is responsible for categorizing, accessing, viewing, and analyzing the data in the repository and for generating inputs to feed the neural network. For the neural network module in this research, a feed-forward back-propagation network is selected to build a system that enhances decision making in choosing the most appropriate vendors. However, it should be noted that other neural network configurations, such as feed-backward back-propagation, could also be used.
When the trained network is available, recommended actions can be obtained in order to rectify hidden problems, should they occur at a later stage. Therefore, in the training process of the neural network, the nodes of the input layer represent the data from the OLAP module and those of the output layer represent the predictions and extrapolations.
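As a concrete (toy) illustration of the feed-forward back-propagation idea, the sketch below trains a one-hidden-layer network on invented OLAP-style vendor features; the feature names, data, and network size are illustrative assumptions, not the authors' actual system:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented OLAP-style inputs: one row per past order, with hypothetical
# (price, quality, delivery) scores; target is 1 if the vendor chosen for
# that order fulfilled the requirements, 0 otherwise.
X = np.array([[0.9, 0.8, 0.7],
              [0.2, 0.4, 0.3],
              [0.8, 0.9, 0.9],
              [0.3, 0.2, 0.4]])
y = np.array([[1.0], [0.0], [1.0], [0.0]])

# One hidden layer with sigmoid activations
W1 = rng.normal(0.0, 0.5, (3, 4)); b1 = np.zeros(4)
W2 = rng.normal(0.0, 0.5, (4, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # backward pass: gradients of the squared error through the sigmoids
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

# Trained network's fulfilment predictions for the training orders
pred = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2).ravel()
print(pred.round(2))
```

In the full system, the input rows would come from the OLAP module's demand-trend analysis rather than hand-written arrays, and the trained outputs would rank candidate suppliers.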

Related studies
Today's industrial environment is changing rapidly due to global competition as well as fast advances in information technology. The major activity of manufacturing firms is no longer confined to production but lies in the systematic management of knowledge in order to create products that meet customers' needs.


It has been shown that the best way to capture and share knowledge is to embed knowledge management into the jobs of the workers, so that knowledge management is no longer a separate task requiring additional time and effort to input what they have learned and to learn from others. Partners Healthcare, a Boston-based umbrella organization that includes Brigham and Women's, Massachusetts General, and several other hospitals and physician groups, adopted an information technology system built on this concept. The system was built by targeting an essential work process (the physician order entry) and formally reported problems (drug allergies and lab reports). The system was linked to a massive repository containing constantly updated clinical knowledge and patient records, so that physicians can draw on the medical facts when providing a consultation for a patient. A controlled study showed that the system reduced medication errors by 55 percent and improved the prescribing of proper drugs, including cheaper but more effective drugs for patients. These improvements not only saved lives but saved money as well (Davenport and Prusak, 2000).
Turban (1988), Mockler (1989) and Ba et al. (2001) examined the development of knowledge-based and decision-support systems from a management point of view. They studied the incentive issues in the information system areas of knowledge management and supply chain coordination, and outlined the requirements for designing incentive-aligned information systems, such as embodying the right incentives so that users would have no incentive to cheat the system and would not be better off by distorting information. On the other hand, Ofek and Sarvary (2001) studied how knowledge management affects competition among professional service firms, in terms of reducing operating costs and creating added value for customers by significantly increasing product quality. They analyzed the competitive dynamics and market structure that emerge as a result of firms competing with knowledge management systems, and their study showed that, although knowledge management leads to quality improvement, the greater ability to leverage the customer base can actually lower profits and lead to an industry shakeout. These studies examined knowledge management from a management perspective and how a knowledge-based system affects business operations. Other researchers, in contrast, have proposed models and system architectures for knowledge-based systems.
A number of researchers have proposed and developed knowledge-based systems with agent technology. Koch and Westphall (2001) presented work on an application of distributed artificial intelligence for computer network management, implementing the software platform using intelligent autonomous agents. Alternatively, Montaldo et al. (2003) proposed an agent-based architecture applied to workflow management systems to manage new functionalities, such as customer relationship management in electronic commerce. Their architecture uses agent technology to distribute the intricacies of managing manufacturing workflow information according to the complexity of the enterprise.
Besides agent technology, several studies have used OLAP systems for decision support and knowledge discovery. Tan et al. (2003) assessed the opportunities and challenges of combining data warehousing and web technology in terms of efficiency improvements, security, key success factors, and benefits. They found that a combination of data warehouses, analytical applications, and internet technologies could streamline the reporting and analysis processes, truly exposing the power of the web. Moreover, Thomas and Datta (2001) proposed a conceptual model of the data cube and an algebra to support OLAP operations in a decision support system. The algebra

VOL. 9 NO. 1 2005 JOURNAL OF KNOWLEDGE MANAGEMENT PAGE 89
provided a means to concisely express complex OLAP queries for knowledge discovery
and enhanced the overall ability of the knowledge-based system to implement OLAP
applications with the standard relational database technology.
Furthermore, Weigend et al. (1999) addressed the trend of increasing electronic access to documents, which has made text categorization central to knowledge management. In their study they proposed a divide-and-conquer strategy that mimics the hierarchical structure of knowledge, a structure ignored by flat inference models, and exploited the internal structure of the categories to improve text categorization performance.
Moreover, Davenport and Prusak (2000) positioned knowledge technologies, including knowledge components, case-based reasoning, constraint-based systems, expert systems, the web, Lotus Notes, and neural nets, along dimensions that distinguish different types of knowledge technologies in terms of the time needed to find a solution and the level of user knowledge required. Among these technologies, the web, Notes, and neural networks require a high level of user knowledge and considerable time to find a solution. Furthermore, Kumar and Olmeda (1999) and Fowler (2000) studied knowledge discovery and management using artificial intelligence (AI) technologies such as case-based reasoning, knowledge-based systems, and neural networks. Both studies found that embedding hybrid AI technologies makes it possible to outperform each of the individual methods in knowledge discovery. However, there has been little research on the infrastructure of hybrid AI technologies for knowledge discovery. To fill this gap, we propose the infrastructure of a knowledge-based system that captures and maintains information and expert knowledge, and proposes solutions to users.

System infrastructure
It is indeed difficult to capture informal knowledge residing in people's minds and embed it in a system. McDonnell Douglas, now part of Boeing, tried to develop an expert system containing the expert knowledge needed to determine whether aircraft are positioned properly for landing. The developers gathered the human knowledge of the ground crews through interviews and over-the-shoulder observation. The system took two years and a tremendous amount of resources to capture the human expertise, demonstrating how difficult it is to capture and embed tacit knowledge in a system (Davenport and Prusak, 2000). By comparison, developing a system that captures and embeds structured knowledge requires far less time and fewer resources. For the proposed system, transaction data and vendors' data are selected as the input because they are structured information, which is easier to capture from transaction documents and embed into the system.
The primary objective of the system is to generate outputs that are as good as, or better than, the decisions an expert would have made given the same input data. The learning process of a human proceeds through repetitive learning cycles, which is similar to the learning process of a neural network. A neural network is therefore selected as part of the hybrid system, to take advantage of its ability to operate with incomplete data and to generalize, abstract, and reveal insight (Wasserman, 1989; Sharda, 1994; Kasabov, 1999). A neural network is a statistically oriented tool that excels at using data to classify cases into categories (Davenport and Prusak, 2000). Neural networks ‘‘learn’’ patterns directly from data by examining the data repeatedly, searching for relationships, automatically building models, and correcting the models' mistakes over and over again (Dhar and Stein, 1997). In other words, neural networks build models from data by trial and error.
Data analysis capabilities become crucial as a tremendous amount of information pours into the company through different mediums, while the company itself creates large volumes of information as well. Data mining tools have become popular because they identify and characterize interrelationships among multiple dimensions without requiring human effort to formulate specific questions. Data mining is concerned with discovering new and vital information, enabling executives to uncover previously unnoticed facts in valuable data. Nowadays there are many off-the-shelf data mining applications. For example, Microsoft SQL Server, which incorporates OLAP technology, provides services for accessing, viewing, and

‘‘ Neural networks have been found to be very good at modeling
complex and poorly understood problems. ’’

analyzing large volumes of data with high flexibility and performance (Thomsen, 1999;
Peterson, 2000). According to the definition of OLAP Council, OLAP is a:
Category of software technology that enables analysts, managers and executives to gain insight
into data through fast, consistent, interactive access to a wide variety of possible views of
information that has been transformed from raw data to reflect the real dimensionality of the
enterprise as understood by the user (Inmon, 1992).

The OLAP tool assists decision makers in creating appropriate knowledge and models by browsing the related data groups and defining model-based relations between them (Lau et al., 2001). It creates multidimensional views of data in relational databases and can manipulate and analyze large volumes of data from multiple perspectives (Laudon and Laudon, 2000). The key benefit of adopting OLAP is that it enables executives to gain insight into a wide variety of possible views of information through quick, interactive access (Forman, 1997). Most importantly, OLAP can provide managers with the information they need to make effective decisions about an organization's strategic directions. It generates a multidimensional view of aggregate data, giving quick access to strategic information for further analysis through its drill-and-slice functionality, and it can deliver just-in-time information for effective decision making. Using OLAP, executives are able to access information buried in the database and to analyze data across any dimension and at any level of aggregation (Lau et al., 2001).
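The roll-up and slicing that OLAP performs can be illustrated with a small aggregation over hypothetical transaction records (a sketch only; vendor names, dimensions, and figures are invented for illustration):

```python
from collections import defaultdict

# Hypothetical transaction records: (vendor, quarter, cost, defect_rate).
transactions = [
    ("PP Ltd",  "Q1", 3800, 0.04),
    ("PP Ltd",  "Q2", 3950, 0.06),
    ("SWS Ltd", "Q1", 4100, 0.02),
    ("SWS Ltd", "Q2", 4200, 0.01),
]

# Roll-up: aggregate average cost per (vendor, quarter) cell of the cube.
cells = defaultdict(list)
for vendor, quarter, cost, _defect in transactions:
    cells[(vendor, quarter)].append(cost)
avg_cost = {cell: sum(v) / len(v) for cell, v in cells.items()}

# Slice: view one dimension at a fixed coordinate, e.g. all Q1 cells.
q1_slice = {v: c for (v, q), c in avg_cost.items() if q == "Q1"}
print(q1_slice)
```

A real OLAP engine adds indexing, pre-aggregation, and interactive drill-down, but the cube/slice idea is the same.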
Nevertheless, OLAP technology lacks the ability to predict and forecast forthcoming events or to alert users to possible unnoticed mishaps (Lau et al., 2002). An intelligent element is therefore needed to enhance the performance of the decision support system. A neural network is suggested to complement the OLAP technology in the proposed system infrastructure, taking advantage of the neural network's learning and intelligence capabilities. A neural network, also known as a connectionist network, is a set of simple but highly interconnected processing elements capable of learning from the information presented to the network; it is a system arranged in patterns similar to biological neural nets and is modeled on the human brain (Dhar and Stein, 1997). However, the neural network's limitation is that its input and output variables are linked through a ‘‘black-box’’ mechanism and therefore cannot be easily explained. This drawback is offset by the complementary OLAP approach, which builds on traditional data management technology. Like other simulation models, the neural network substitutes for the real system in predicting and controlling system responses for the purpose of dynamic control. It is a technology that has been used for prediction, clustering, classification, and alerting on abnormal patterns (Haykin, 1994).
The proposed system contains a knowledge-based system that encompasses an OLAP module and a neural network module. The knowledge-based system extracts data from a system repository, which stores the transaction data and vendors' information, and loads the data into OLAP to take advantage of its multi-dimensional views of the scattered data and its generation of aggregated data, gathering and providing the vendors' information for further assessment (see Figure 1). As mentioned above, OLAP lacks an intelligence function, so a neural network is built to complement this deficiency. The neural network is responsible for assessing the candidate vendors based on their performance history and for benchmarking them against the product's relevant criteria using best-of-class performance measures. A neural network can be effectively applied to a vast array of problems and data, many of which have been thought too complex or to lack sophisticated theoretical models. Furthermore, neural networks do not learn by adding representations to a knowledge base; instead, they learn by modifying their overall structure in order to adapt to the contingencies of

Figure 1 System infrastructure
[Figure: diagram labels include Purchasers, Network, and Final Decision]
the world they inhabit (Sauter, 1997; Luger and Stubblefield, 1998). Indeed, neural networks have been found to be very good at modeling complex and poorly understood problems for which sufficient data have been collected (Dhar and Stein, 1997). Most importantly, neural networks are especially suitable as forecasting and analysis tools because they can find a solution by categorizing a multi-dimensional input vector and thereby selecting an appropriate output vector (Fowler, 2000). A benchmarking process that is part of the neural network module is used to select the most appropriate vendors, providing the relevant facts and data needed as input for the projected performance. Specific measures should be aligned with strategic objectives to ensure that the benchmarking factors are consistent with corporate goals. The benchmarking process predicts, based on past industry data, how well or how poorly the vendors will perform over the forthcoming period. Since the system predicts vendor performance with a machine-learning feature based on historical data, a neural network, with its power to generate forecasts, lends itself as the most appropriate method.

System implementation
The OLAP module provides analytical capabilities and the neural network module benchmarks the vendors to suggest the most suitable one. It is assumed that companies using the system already have records of best-of-class vendor performance in their industry as a benchmarking reference, together with detailed information on the vendors. The procedures and operations of the system are shown in Figure 2.
Before the implementation of OLAP, the calculated member is constructed from the measures. The first step of the process is to identify the criteria, in terms of quality, cost, and delivery schedule, based on the new product requirements. The purpose of the OLAP module is to collect data related to the procurements and vendors, and to generate aggregated data to feed the neural network. The OLAP module provides insight into the data by putting it into multi-dimensional views, and evaluates each vendor on six assessment factors:

Figure 2 Proposed system processes

(1) Product type.
(2) Production methodology.
(3) Materials.
(4) Production cost.
(5) Delivery time.
(6) Defect rate.
Each factor has a weight (W) that signifies its importance. The OLAP module aggregates the data to generate the six assessment factors and assesses each by applying the weight to each factor's score (S):

Total weighted score = Σ (i = 1 to n) S_i W_i

The OLAP module maintains the information about each vendor with vectors containing the
following factors: product code, product type, production methodology, materials,
production cost, defect rate, delivery time, vendor name, vendor phone, customer
service, production methodology score, quality score, production cost score, delivery time
score, and customer service score.

To identify the competency of the potential vendors, the total weighted score of each vendor is compared with an ideal score that depends on the product type. For example, the ideal score for product type code C95-G81M is 0.85. Any vendor with a total weighted score higher than 0.85 is considered a suitable candidate, and further analysis is carried out to identify the best vendor among the selected group that meets the threshold. If no vendor meets the threshold, the system advises the user to seek new vendors or to lower the threshold qualification level.
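The weighted-score calculation and threshold test above can be sketched in a few lines (the per-factor scores and weights below are invented for illustration and do not reproduce the case-study figures):

```python
# Hypothetical per-factor scores S_i (scaled 0-1) and weights W_i.
weights = {"methodology": 0.3, "quality": 0.2, "cost": 0.3,
           "delivery": 0.1, "service": 0.1}
vendors = {
    "PP Ltd": {"methodology": 0.9, "quality": 0.9, "cost": 0.9,
               "delivery": 0.8, "service": 0.8},
    "WY Ltd": {"methodology": 0.6, "quality": 0.5, "cost": 0.7,
               "delivery": 0.6, "service": 0.6},
}

def total_weighted_score(scores, weights):
    """Total weighted score = sum over i of S_i * W_i."""
    return sum(scores[f] * weights[f] for f in weights)

# Ideal score threshold for the product type, e.g. 0.85 for C95-G81M.
threshold = 0.85
candidates = [name for name, s in vendors.items()
              if total_weighted_score(s, weights) > threshold]
print(candidates)
```

In practice the scores come from the OLAP aggregation rather than being entered by hand.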
With the OLAP module serving as the front end, the neural network module is employed as the back end of the proposed system. The neural network module evaluates the selected vendors identified by the OLAP module to determine which is best placed to undertake the specific purchase order for the parts of a product. Using the data output by the OLAP module, this module analyzes the vendors' characteristics that are relevant to meeting the product's standard (see Figure 3). The input layer of the neural network covers three categories:
(1) Quality: the quality standard of the vendor, measured by the defect rate and scrap rate of the materials supplied.
(2) Cost: includes the material cost and delivery cost.
(3) Delivery schedule: describes the vendor's record of on-time delivery.
All of these are abstracted from past company performance records, then studied and utilized.
Each category is assigned a score ranging from 1 (poor or low) to 7 (excellent or high). The neural network consists of 15 input nodes (the last five records of the three categories: quality, cost, and delivery schedule) and five output nodes (the various suggested actions to be taken), as shown in Figure 4. The historical records of each candidate vendor are submitted to the input layer of the neural network. After processing, the neural network produces output scores covering the performance values for reliability of quality, consistency of delivery time, and competitiveness of cost. To identify the most qualified vendor, the output score is then compared with the best-in-class performance standard to benchmark the vendor's performance, and judged against the expected

Figure 3 Neural network mapping

Figure 4 Mapping of input and output nodes of neural network module of the proposed system
[Figure: labels include Latest Track Assessment Report]
standard score of the specific product to determine the fit between the vendor's performance and the product's desired quality and standards. Figure 5 shows an example of a vendor's performance record, covering quality, cost, and delivery schedule. From the example, one can see that the delivery trend is improving, with an upturn in quality as well. However, the cost fluctuates dramatically, which may suggest that the vendor is putting effort into high quality and on-time delivery but still has difficulty controlling cost. After the vendor is selected and the purchase order issued, performance reviews of the selected vendor are carried out continuously, with the latest data captured by the system repository used to update the records. This ensures that the system acquires more knowledge and experience through the learning process over time.

Figure 5 Performance of vendor historical records

It is important to note that the performance of the neural network depends heavily on the training data sets. In other words, sufficient training of the neural network model is essential, ideally with actual purchase orders for which decisions have been made rather than with unrealistic data. Furthermore, it is desirable for the training sets to be well distributed, covering as many situations as possible.
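As a sketch of how such a network is trained, a one-hidden-layer feedforward network with 15 inputs and 5 outputs can be fitted by backpropagation. The architecture, learning rate, and data below are illustrative assumptions; the Qnet tool used later in the paper is not described in this detail:

```python
import math, random

random.seed(0)
N_IN, N_HID, N_OUT = 15, 8, 5  # 15 inputs (5 records x 3 factors), 5 suggestions

# Hypothetical weights for a one-hidden-layer network (a sketch, not Qnet).
w1 = [[random.uniform(-0.5, 0.5) for _ in range(N_IN)] for _ in range(N_HID)]
w2 = [[random.uniform(-0.5, 0.5) for _ in range(N_HID)] for _ in range(N_OUT)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x):
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in w1]
    o = [sigmoid(sum(w * hj for w, hj in zip(row, h))) for row in w2]
    return h, o

def train_step(x, target, lr=0.5):
    """One backpropagation step (squared error) on a single training pair."""
    h, o = forward(x)
    d_out = [(ok - tk) * ok * (1 - ok) for ok, tk in zip(o, target)]
    d_hid = [hj * (1 - hj) * sum(d_out[k] * w2[k][j] for k in range(N_OUT))
             for j, hj in enumerate(h)]
    for k in range(N_OUT):
        for j in range(N_HID):
            w2[k][j] -= lr * d_out[k] * h[j]
    for j in range(N_HID):
        for i in range(N_IN):
            w1[j][i] -= lr * d_hid[j] * x[i]

# Hypothetical training pair: five (quality, cost, delivery) records scored
# 1-7 and scaled to 0-1, with a target suggestion vector of 0 / 0.5 / 1 codes.
x = [s / 7.0 for s in [6, 4, 5, 6, 3, 5, 7, 4, 6, 5, 5, 6, 4, 5, 6]]
target = [1, 0, 0, 1, 0.5]
for _ in range(200):
    train_step(x, target)
```

A production system would of course train on many such pairs (the case study uses 120) and hold some aside for validation.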

Case example
To validate the proposed system, a prototype has been developed and tested in GPP
Limited, which produces plastic toys and premium products in Hong Kong and exports their
products directly to customers in the US, Europe, and Japan. The company manufactures
toys for a number of worldwide cartoon companies; it produces products based on the
customer needs.
The company has a procurement department responsible for purchasing raw materials and parts from all over the world. However, as the company has expanded rapidly and works with a large number of vendors, the purchasing staff now face problems in identifying the appropriate vendors for specific product parts and raw materials. The procedures GPP Ltd used to select vendors were based on the experience of the purchasing staff. Problems have occurred due to purchasing staff turnover, and the selection process often led to unexpected outcomes such as late delivery and high defect rates. In addition, the vendor selection process was inconsistent, as decisions were made according to the subjective views of the purchasing staff.
The hybrid system encompassing OLAP and a neural network was built using Visual Basic as the main development environment, with Qnet (qnetv2k.com) used to implement the neural network module. GPP Ltd provided sets of historical records of the performance of vendors who have supplied materials and parts to it, as well as best-in-class performance measures for the industry.
The new product that is used to test the system is a cartoon character plastic watch – Super
FaFa Watch. GPP Ltd decided to assemble the watch in-house and order all the parts from
vendors to reduce production cost. The product specifications are shown in Table I.
After identifying the assessment factors, the next step in identifying the potential vendors is to assign weights to the relevant factors. The factors are the production methodology score, quality score, production cost score, delivery time score, and customer service score, with weights of 0.3, 0.2, 0.4, 0.1, and 0.1 respectively. A total weighted score is calculated for every vendor, and vendors with a total weighted score of more than 0.75 are considered potential vendors suitable to supply the parts and materials for the given product specifications. Four vendors scored more than 0.75: PP Ltd, SWS Ltd, WY Ltd, and KCC Ltd.
Training is necessary to ensure the neural network produces reliable outputs; GPP Ltd trained the network with 120 sets of historical data based on previous actual results. The vendors selected by the OLAP module are then compared against the best-in-class performance and the desired quality and standard of the product (the Super FaFa Watch). The factors fed into the neural network are quality, cost, and delivery schedule.

Table I Product specifications

Assessment factors        Specifications
Product name              Super FaFa Watch
Product type              Plastic watch
Production methodology    Assembly
Material                  Plastic bracelet, watch dial, plastic stamp
Production cost           <$4,000
Defect rate               <5 percent
Delivery time             30 days

Figure 6 shows the latest five records of the selected vendors. The quality of PP Ltd's supply is dropping, and its deliveries have not always been on time; however, PP Ltd has been able to offer GPP Ltd better discounts over time. Indeed, it is difficult to project PP Ltd's performance, as the trends fluctuate without a consistent path. All of SWS Ltd's characteristics show an upward trend: SWS Ltd supplies quality parts and delivers on time, but it also charges increasingly higher prices.
In contrast, WY Ltd offers more discounts to GPP Ltd and delivers on time, but the quality of its supplied products is in a downturn. It would be difficult to predict WY Ltd's future performance, since the historical data show ups and downs in different areas.
Lastly, KCC Ltd provides quality supplies and has offered more discounts in recent transactions; however, it is generally unable to deliver on time.
After this information has been input, the neural network module returns an assessment report to the user, supporting the user in taking action where necessary. In Table II, an output of ‘‘0’’ from a neural node indicates a negative suggestion for the associated statement, ‘‘1’’ indicates a positive suggestion, and ‘‘0.5’’ indicates that there is not enough data to justify a firm suggestion.
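One simple way to obtain the 0 / 0.5 / 1 codes of Table II from continuous network activations is to discretize them with two cut-offs (the cut-off values here are an assumption; the paper does not state how the codes are derived):

```python
def interpret(activation, lo=0.33, hi=0.67):
    """Discretize a raw output activation into the Table II report codes:
    0 = negative suggestion, 1 = positive suggestion,
    0.5 = not enough data for a firm suggestion."""
    if activation <= lo:
        return 0
    if activation >= hi:
        return 1
    return 0.5

raw_outputs = [0.91, 0.08, 0.55, 0.72, 0.40]  # hypothetical activations
codes = [interpret(a) for a in raw_outputs]
print(codes)  # [1, 0, 0.5, 1, 0.5]
```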

Figure 6 Selected vendors performance

Table II Output from the neural node

Output from NN module                                      PP Ltd  SWS Ltd  WY Ltd  KCC Ltd
Potentially competent                                       0.5     0.5      1       1
Suggested to be replaced                                    0       0        0       0
Service quality is compromised to meet the quoted price     1       0        0       1
Further assessment of company performance is required       1       0        1       0
Delivery time seems to be inconsistent due to certain
  company problems                                          1       0        1       0.5

Based on the neural network's analysis and the results of benchmarking against the best-in-class performance and the product's desired quality and standard, KCC Ltd is the vendor suggested by the system.
To evaluate the performance of the proposed system, GPP Ltd used it to select vendors for a number of products and compared the results with the decisions made by the purchasing staff. The results, shown in Table III, indicate that the proposed system is able to choose the right vendors; however, there is room for further improvement, and the system is expected to produce more accurate output as the neural network is trained with more data.

Evaluation
From the case example, one can see that the system has embedded the knowledge of the purchasing staff and is able to make consistent vendor selection decisions based on the vendors' historical performance and the requirements of the product. Although the proposed system provides impressive results, it still has limitations. Maintaining and amending the system requires a thorough understanding of the system and of the relationships among its parameters. To increase the system's accuracy, management of the feedback system plays a vital part in system maintenance. The information engineer is responsible for learning the needs of the enterprise and for defining requirements in order to design an enterprise-wide information system.
Since the neural network is built from sets of historical data, it is difficult to guarantee that the network will provide satisfactory results, especially when it is used in situations where the input fed into the network is not from the same domain. Moreover, the neural network lacks the sensibility of the purchasing staff: it cannot identify environmental changes that require the output to be re-adjusted, which leaves room for further improvement of the system.
To ensure that the predicted variable can be adjusted in response to changes in vendor performance, updated data should be fed into the repository and passed to the neural network for regular retraining.

Table III Vendor selection results

Assessment                        By human (%)  By proposed system (%)  Management expectation (%)
Delay in final product delivery        23                15                         5
Defect rate                            15                 8                         5
Customer complaint                     12                 8                         5

Conclusion
Economic organizations devote their full efforts to obtaining the best available information in order to make informed decisions. This paper has described the proposed system, which embeds human knowledge using an OLAP system and a neural network. It demonstrates the benefits of combining technologies into an integrated system that capitalizes on the merits, and at the same time offsets the pitfalls, of the constituent technologies. The knowledge-based system has been tested, with impressive results compared with the decisions made by the purchasing staff. Further research and development can expand the system's domain and add the flexibility for users to include more related factors in the analysis, enhancing the quality of decision making in choosing the most appropriate vendors.

References
Ba, S., Stallaert, J. and Whinston, A.B. (2001), ‘‘Research commentary: introducing a third dimension in
information systems design – the case for incentive alignment’’, Information Systems Research, Vol. 12
No. 3, pp. 226-39.
Davenport, T. (1997), ‘‘Secrets of successful knowledge management’’, available at: http://webcom.
com/quantera/secrets.html
Davenport, T.H. and Prusak, L. (2000), Working Knowledge, Harvard Business School Press,
Boston, MA.
Dhar, V. and Stein, R. (1997), Seven Methods for Transforming Corporate Data into Business
Intelligence, Prentice-Hall, Upper Saddle River, NJ.
Forman, S. (1997), ‘‘OLAP Council’’, White paper, available at: www.olapcouncil.org
Fowler, A. (2000), ‘‘The role of AI-based technology in support of the knowledge management value
activity cycle’’, Journal of Strategic Information Systems, Vol. 9 No. 2/3, pp. 107-28.

Haykin, S. (1994), Neural Networks. A Comprehensive Foundation, Macmillan, New York, NY.
Inmon, W.H. (1992), ‘‘Data warehouse – a perspective of data over time’’, Database Management,
February, pp. 370-90.
Kasabov, N.K. (1999), Foundations of Neural Networks, Fuzzy Systems, and Knowledge Engineering,
NetLibrary Inc., Boulder, CO.

Koch, F.L. and Westphall, C.B. (2001), ‘‘Decentralized network management using distributed artificial
intelligence’’, Journal of Networks and Systems Management, Vol. 9 No. 4, pp. 375-88.
Kumar, A. and Olmeda, I. (1999), ‘‘A study of composite or hybrid classifiers for knowledge discovery’’,
Journal of Computing, Vol. 11 No. 3, pp. 267-77.

Lau, H.C.W., Ip, R.W.L. and Chan, F.T.S. (2002), ‘‘An intelligent information infrastructure to support
knowledge discovery’’, Expert Systems with Applications, Vol. 22, pp. 1-10.
Lau, H.C.W., Bing, J., Lee, W.B. and Lau, K.H. (2001), ‘‘Development of an intelligent data-mining
system for dispersed manufacturing network’’, Expert Systems, Vol. 18 No. 4.

Laudon, C.K. and Laudon, J.P. (2000), Management Information Systems: Organization and Technology
in the Networked Enterprise, Prentice-Hall, Englewood Cliffs, NJ.
Luger, G.F. and Stubblefield, W.A. (1998), Artificial Intelligence Structures and Strategies for Complex
Problem Solving, 3rd ed., Addison-Wesley Longman, Glen View, IL.
Mockler, R.J. (1989), Knowledge-based Systems for Management Decisions, Prentice-Hall, Englewood
Cliffs, NJ.
Montaldo, E., Sacile, R. and Boccalatte, A. (2003), ‘‘Enhancing workflow management in the
manufacturing information system of a small-medium enterprise: an agent-based approach’’,
Information Systems Frontiers, Vol. 5 No. 2, pp. 195-205.
Ofek, E. and Sarvary, M. (2001), ‘‘Leveraging the customer base: creating competitive advantage
through knowledge management’’, Management Science, Vol. 47 No. 11, pp. 1441-56.

Peterson, T. (2000), Microsoft OLAP Unleashed, 2nd ed., Sams Publishing, Indianapolis, IN.
Sauter, V.L. (1997), Decision Support Systems: An Applied Managerial Approach, John Wiley,
New York, NY.
Sharda, R. (1994), ‘‘Neural networks for the MS/OR analyst: an application biography’’, Interfaces,
Vol. 24 No. 2, pp. 116-30.
Tan, X., Yen, D.C. and Fang, X. (2003), ‘‘Web warehousing: web technology meets data warehousing’’,
Technology in Society, Vol. 25, pp. 131-48.
Thomas, H. and Datta, A. (2001), ‘‘A conceptual model and algebra for online analytical processing in
decision support databases’’, Information Systems Research, Vol. 12 No. 1, pp. 83-102.
Thomsen, E. (1999), Microsoft OLAP Solutions, Wiley, New York, NY.
Tiwana, A. (2000), The Knowledge Management Toolkit, Prentice-Hall, Englewood Cliffs, NJ.
Turban, E. (1988), Decision Support and Expert Systems: Managerial Perspective, Macmillan Publishing
Company, New York, NY.
Wasserman, P.D. (1989), Neural Computing Theory and Practice, Van Nostrand Reinhold, New York, NY.
Weigend, A.S., Wiener, E.D. and Pedersen, J.O. (1999), ‘‘Exploiting hierarchy in text categorization’’,
Information Retrieval, Vol. 1 No. 3, pp. 193-216.

The ‘‘global’’ and the ‘‘local’’ in knowledge
management
Joseph G. Davis, Eswaran Subrahmanian and Arthur W. Westerberg

Joseph G. Davis is at the School of Information Technologies, and the Language and Knowledge Management Research Laboratory, The University of Sydney, Sydney, NSW, Australia (jdavis@it.usyd.edu.au). Eswaran Subrahmanian is Principal Research Scientist at the Institute for Complex Engineered Systems (ICES), Carnegie Mellon University, Pittsburgh, PA, USA (sub@edrc.cmu.edu). Arthur W. Westerberg is University Professor and Swearingan Professor of Chemical Engineering at Carnegie Mellon University, Pittsburgh, PA, USA (aw0a@edrc.cmu.edu).

Abstract
Purpose – This paper aims to unravel the complexities associated with knowledge sharing in large global organizations through a field study carried out in a large, multinational company (Du Pont), focusing on the critical issues, concrete practices, bottle-necks, and constraints in knowledge sharing. The tension between ‘‘local’’ production of much of the knowledge and its globalizing is specifically addressed.
Design/methodology/approach – Qualitative analysis based on a detailed case study of the knowledge-sharing practices in two business units and two functional areas (R&D and engineering project management) in four countries.
Findings – Focus on certain types of organizational knowledge to the exclusion of others can be counter-productive. Knowledge management (KM) systems need to be integrative and flexible enough to facilitate the dynamic interplay between different forms of knowledge across space and time.
Research limitations/implications – The results of a case study are somewhat limited in terms of their generalizability.
Practical implications – The insights from the study offer useful guidelines for designing systems and processes for sharing and managing knowledge in large, diversified organizations.
Originality/value – Most field-based investigations into knowledge management tend to focus on specific KM projects. This is one of the few comprehensive studies to analyze knowledge-sharing practices and constraints at both the local and the global level in large organizations. It elucidates the key facilitators and inhibitors of knowledge sharing in such organizations.
Keywords Knowledge management, Knowledge organizations, Multinational companies
Paper type Case study

Introduction
The notion of contemporary organizations as knowledge producing, sharing, and
disseminating entities is gaining rapid currency among researchers in a variety of
fields. While the critical role played by the stock and application of knowledge in
economic development at the macro-economic level is relatively well understood
(Machlup, 1980; Nelson and Winter, 1982; Eliasson et al., 1990), its centrality in the
management of individual firms is more of a recent concern. This interest is perhaps a
response to the challenges posed by an increasingly complex business environment
characterized by intensified competition, greater globalization, and compressed
product life cycles and the consequent information overload for senior management.
Concurrently, advances in information and communications technologies (ICT) in the form of computer-supported cooperative work (CSCW) systems, groupware, the internet, intranets, and the world-wide web (WWW) offer capabilities for developing effective solutions to the knowledge sharing and management (KSM) problem.

This research was funded by a grant from the Carnegie Bosch Institute, Pittsburgh, PA under the ‘‘impact of global information revolution on international management’’ program.

The issues referred to above have been faced in a more acute form by large, multinational corporations (MNCs) for which the forces of global integration, local differentiation, and
DOI 10.1108/13673270510582992 | VOL. 9 NO. 1 2005, pp. 101-112, Emerald Group Publishing Limited, ISSN 1367-3270 | JOURNAL OF KNOWLEDGE MANAGEMENT | PAGE 101
worldwide innovation have become more compelling. Several scholars have argued that
such firms have had to devise means to enhance their global flexibility and learning levels
in order to stay competitive (Bartlett and Ghoshal, 1989; Doz and Prahalad, 1991). This is increasingly achieved through the adoption of new organizational capabilities for pooling world-wide knowledge and for transferring and adapting innovative product and process technologies and project management know-how to international markets. We investigate
empirically the organizational and technological mechanisms employed by MNCs to
promote knowledge sharing and to develop and manage their intellectual resources. The
implicit assumptions that underpin the category of ‘‘knowledge’’ in this context as
reflected in the relevant literature and contemporary organizational practices will be
explored and the local-global dialectic in the creation and sharing of this knowledge will
be investigated.
The KSM problematic in large, global organizations is exacerbated by the local-global
dialectic arising from the tension between ‘‘local’’ production of much of the knowledge and
its globalization and recreation in new contexts. The local production occurs as a result of
distributed R&D operations, joint ventures with strategic partners, and through collaborative
work with advanced and demanding customers. From an anthropological perspective, most
local knowledge is constituted as complex and coherent wholes in the form of intricate webs
of meaning (Geertz, 1983). Von Krogh et al. have addressed some of the challenges in globalizing and recontextualizing such local knowledge through a process of triggering, packaging/dispatching, and re-creation of the knowledge in new contexts (Von Krogh et al., 2000, pp. 207-239).
This study was carried out as a field-based investigation into the sources and the
conceptualization of organizational knowledge and KSM practices in R&D and project
management in a large MNC – Du Pont, a diversified manufacturing company headquartered
in the US. The company has extensive R&D and project management operations in a number
of countries. The data gathering was carried out through in-depth, semi-structured interviews
with key R&D and engineering project management (EPM) executives in four different
countries in two separate business units and the corporate headquarters.
Overview of the relevant literature

Organizational knowledge
There is emerging consensus that perhaps the most important source of sustainable
competitive advantage in an increasingly turbulent global business environment is knowledge.
The organizational capability to create, recognize, disseminate widely, and embody knowledge
in new products and technologies is critical when faced with shifting markets, rapid product
obsolescence, hyper-competition, and financial upheavals (Nonaka, 1991).
From an epistemological standpoint, most of what is characterized as organizational
knowledge falls within the purview of ‘‘weak knowledge’’ (Goldman, 1999). This is sharply
contrasted with ‘‘strong knowledge’’ which is the traditional focus of philosophical inquiry.
Strong knowledge imposes very stringent criteria for the use of the term ‘‘knowledge’’ which
includes true belief backed up by valid justification or warrant for the belief, and the
systematic exclusion of alternative possibilities. Such knowledge is rarely attainable in
organizational domains. Besides, this form of skeptical rationalism bears little resemblance
to the action-centered processes of generating, sharing, and utilizing knowledge in
organizations. It is unlikely that the ‘‘strong knowledge’’ epistemology is particularly
helpful[1] in view of the inter-subjective and social nature of much of organizational
knowledge. The latter tends to privilege consensus, immediacy, and perceived use value
over strong verifiability, super-objectivity, and transcendence. As well, weak knowledge tends to be ‘‘sticky’’ and local, and rendering it more global is one of the significant challenges for knowledge management.
Traditional microeconomic theory depicts (technical) knowledge as a quasi-public good. It is
characterized by high levels of indivisibility and non-excludability. Its generation is the result
of scientific research and general methodological procedures. Its transfer is largely
unproblematic and is viewed as a spontaneous aspect of the economic system. The ability
to appropriate the knowledge by the innovator is low even though patenting and intellectual
property rights can reduce the scope for societal benefits from the knowledge (Arrow, 1969,
1994; Antonelli, 1999).
This perspective has been challenged by a number of researchers. The distinction between
technological information and technological knowledge is sharply drawn with the latter
conceptualized as incorporating a set of capabilities and competences needed to utilize the
knowledge which in turn can be leveraged to generate new knowledge. Such knowledge is
generated by a process characterized by cumulativeness and path dependence (Jorde and
Teece, 1990; David, 1993). Knowledge, according to this view, is highly localized and
embedded in the previous background and experience of individual firms. It is the result of a
learning process and involves highly specific and relatively ‘‘tacit’’ knowledge processing
(Antonelli, 1999).
The central role of knowledge in the firm and the organizational processes and mechanisms
for its integration and sharing across national borders form the primary basis of Kogut and
Zander’s theory of the multinational corporation. They have also highlighted the need for the
mechanisms to be sensitive to the degree of tacitness or codifiability of the knowledge
(Kogut and Zander, 1992a, b).
A range of definitions and perspectives on knowledge has been presented in the
organizational literature. Kerssens-Van Drongelen et al. (1996) define knowledge primarily in
the context of R&D as ‘‘. . . information internalized by means of research, study, or experience,
that has value for the organization’’ (Kerssens-Van Drongelen et al., 1996, p. 214). Similar
conceptualizations of knowledge as the result of processing and refining of information have
been implicitly or explicitly employed by a number of authors. This view is an extension of the
information-processing paradigm popularized by March and Simon (1959), Simon (1977),
and Galbraith (1974). More recently Simon (1996) has argued that the challenge for today’s
managers is to filter and extract relevant knowledge from the vast amounts of potential and
actual information available from internal and external sources. Effective systems need to be
designed to intelligently filter information. In a similar vein, Davenport et al. (1998) describe
knowledge as information combined with experience, context, interpretation, and reflection
and knowledge production as comprising value addition to information (Davenport et al.,
1998). Information is the flow of messages or meanings that might add to, restructure, or
change knowledge according to Machlup (1980). Coyne (1997) refers to these and related
representations of knowledge as the system theoretical view according to which the essential
knowledge is contained in the information content and the subjective, inter-subjective, and
spatial aspects are largely ignored.
An alternative view has been championed by Dretske (1981) and Nonaka (1991, 1994), among
others. This perspective acknowledges the importance of subjective factors such as beliefs
and their links to actions as well as the relatively tacit dimension of knowledge. Knowledge is
associated with beliefs produced and sustained by information (Dretske, 1981). Information
represents a flow of messages but knowledge is created and organized from it, anchored by
the commitments and beliefs of the concerned individuals. There is also a connection
between such knowledge and the subject’s ability to plan and act.
The more implicit and tacit dimension of knowledge has also been highlighted. For Polanyi
(1967), explicit or codified knowledge is what is transmittable through formal and systematic
languages. Tacit knowledge is more personal and subjective, making it difficult to be
formalized and communicated. It tends to be deeply rooted in action, commitment, and
involvement in a specific context. According to Nonaka, individuals are able to recreate their
own systems of knowledge to accommodate ambiguity, noise, and randomness generated
in the organization in its interaction with the external environment (Nonaka, 1994). Such
knowledge resides in situated communities of interpreters rather than in texts or messages
and these make sense only in particular interpretive contexts (Reddy, 1979). These
communities emerge not through absorption of abstract knowledge but when members
become insiders and acquire the community’s shared vision and speak the same language
(Brown and Duguid, 1991). The notions of contextualization of knowledge and evolving
communities of practice have particular resonance for MNCs given the geographic
distances and cultural differences across units around the world. Following Coyne (1997),
we refer to this perspective as the pragmatic view.
Table I presents a somewhat stylized set of distinctions between the systems-theoretic and
pragmatic perspectives on organizational knowledge. It is worth noting that the dichotomy between the system theoretical and pragmatic perspectives, with their respective epistemological and ontological assumptions, pervades most of the writings on organizational knowledge.
As we would expect, the fundamental differences between the system theoretical and
pragmatic views in characterizing organizational knowledge are reflected in the divergent
approaches and perspectives on knowledge creation, sharing and management in
organizations. In general, the former tends to focus on structural and systemic approaches while the latter emphasizes human-centered processes such as socialization, self-organizing teams, extended social interactions, and personnel rotation. Besides the diverse modes of knowledge creation and transformation, globalized firms are faced
with the challenge of mobilizing and integrating fragmented forms of knowledge spread all
over the world (Cohendet et al., 1999). As well, developments in information and
communications technologies (ICT) are increasingly making it easier to separate, transport,
and trade knowledge (Antonelli, 1996).
Research methods and techniques

The fundamental research questions we addressed were the following:
- What are the different mechanisms by which large, multi-national corporations (MNCs) pool, exchange, and share knowledge?
- What procedures are employed to contextualize, validate, verify, and authenticate the knowledge generated? What is the nature of the local-global problematic?
- What is the role of information technology in supporting KSM?
Table I Perspectives on organizational knowledge

                          System theoretic view                         Pragmatic view
Source                    Documents, databases, systems, prototypes,    People, communities
                          processes and procedures, manuals, etc.
Form                      Codified or codifiable, explicit              Tacit, implicit, intuitive
Transfer                  Exchange of documents, electronic means,      Socialization, apprenticeship, osmotic,
                          formal training                               situated learning
Organization              Relatively mechanistic                        Organic
Philosophical perspective Cartesian, separation of mind and body        More holistic, unity of mind and body
The field study was carried out at Du Pont, a large, diversified, global manufacturing
company with its headquarters in the US. Du Pont has world-wide operations and a
strong commitment to understanding and sharing knowledge across the various
business functions and units. The industries and markets that Du Pont serves range from
traditional and specialized nylons, polymers, polyesters, and fibers to agriculture,
nutrition, and pharmaceuticals, organized under eight business segments and 20
business units. Through most of its history, Du Pont has seen itself as a science-based
company with the mission of bringing the benefits of science to the marketplace in ways
that benefit the shareholders and society. Scientific and technological knowledge in
chemical and material sciences and biological sciences is the basis for the company’s
business portfolio.
In order to keep the investigation focused and within manageable proportions, we restricted our data collection activities to knowledge in two functional areas (R&D and engineering project management (EPM)) and two business units (Engineering Polymers and Microcircuit Materials, a part of the iTECH business unit previously named Photopolymers and Electronic Materials) in four countries (USA, UK, Switzerland, and Japan). R&D and EPM are both
highly knowledge-intensive domains that are at the core of Du Pont’s global operations.
While R&D activities at Du Pont are performed at the corporate level as well as distributed at
multiple sites in each of the business units, EPM is largely centralized at the corporate level.
Both Microcircuit Materials (MCM) of iTECH and Engineering Polymers (EP) have
manufacturing, marketing, and R&D operations spread out across the Americas, Europe,
and Asia.
The data gathering and analysis methods most appropriate to the questions we have raised are primarily qualitative and interpretive. Direct observation is clearly the best approach to investigating these problems, where the process dynamics need to be captured and the
potential for multiple, conflicting interpretations can be expected. However, direct
observation of the focal phenomenon can be extremely difficult and can impose an
inordinate burden on the participating organizations. Accordingly, we settled for in-depth
interviews with a cross section of professionals. By asking a common and structured set of
questions based on semi-structured questionnaires, we attempted to unravel the complex
and sometimes implicit processes of knowledge sharing, their relative efficacy in different
contexts, and the bottlenecks to effective sharing. It also enabled us to understand and
interpret knowledge sharing in context, which is very important from our perspective. We
also collected copies of documents that provided us with a more comprehensive portrait of
KSM activities. This makes up, to some extent, for the limitations arising from not being able
to investigate knowledge management issues longitudinally. The nature of this kind of
interpretive data gathering required that we analyze, interpret, and reanalyze the data as it
was collected.
All except eight of the 44 respondents who participated in the study were interviewed at their
normal places of work. This was important because it gave them an opportunity to access
and refer to documents, to demonstrate the system(s) they use, and to point to additional
sources. Each interview lasted between 75 and 90 minutes and the responses were
recorded by two of the researchers. A profile of the participants in the study is presented in
the Appendix.
Case study: findings and discussion

Explicit forms of knowledge
Du Pont has a long tradition of relatively successful deployment of procedures and systems
for codification and document management. It pioneered the development of systems for
the storage and retrieval of scientific and technical information. The migration of many of the
document and knowledge bases to the web is progressing steadily. At the corporate level,
the C3PO initiative involving the company-wide rollout of Lotus Notes and related intranet
software is meant to provide the architecture for global collaboration. As well, the corporate
information services (CIS) unit is responsible for managing Du Pont’s proprietary knowledge,
competitive intelligence function, and the libraries. The general emphasis on codification of
knowledge is intended to ensure greater standardization and company-wide dispersion of
such knowledge. We present below some of the significant issues related to their use and
perceived effectiveness.
Information overload is a constant refrain, especially among the R&D personnel we interviewed. It appears that the amount of information that is ‘‘pushed’’ at people through
e-mail, document attachments and databases in addition to the physical means is much
higher than what can be meaningfully processed by most in the time available. The
filtration and search capabilities of Lotus Notes document databases and the intranet
are far too primitive at this stage in relation to the need. One of the respondents compared these to dumpsters in which one had to forage hard to find anything useful. The time spent scanning large volumes of marginally relevant information also got in the way of the critical reflection needed to summarize and present their own work in succinct form for the benefit of others. This issue points to the limits of codification
in the absence of means for filtration and contextualization of the vast amount of local
information.
Diversity of media and the lack of integration present challenges for effective knowledge
processing. Much of the information needed by the respondents is already in electronic form
but it is fragmented across a variety of incompatible computer systems and databases. In
addition, each knowledge professional typically inherits one or two filing cabinets containing
hard copies of correspondence, memos, reports etc. when they begin a new assignment.
The differences in the indexing schemes and search mechanisms for the different document
bases impede routine activities such as finding useful fragments of information and
synthesizing them in the context of particular tasks.
There are critical issues with respect to archival practices that bear on knowledge sharing
and management. The traditional, paper-based systems appear to be withering while a
robust and company-wide electronic regime is yet to emerge. The ongoing changes in
computer hardware, operating systems platforms, data formats, and software versions
tend to limit the archival value of electronic documents. The old adage of ‘‘if only a
company knew everything it knows’’ could be paraphrased as ‘‘if only a company ‘knew’
everything that is buried in inaccessible computer systems distributed in time and
space’’.
Repositories are available at different levels in the Du Pont units we studied. There are
individual repositories and databases that hold much of the local knowledge that is
generated. There are also repositories at the level of small groups of people who work
together on the same project or with one or a group of customers. The groups need not
always be co-located or working synchronously. Finally, there are business-unit or corporate-level information bases that are maintained. It is inevitable that there will be
some duplication and inconsistency across the three levels of repositories. More
importantly, the limited general access and the terminological and linguistic differences
across these levels hinder the potential for knowledge sharing. As we shall see later,
critical human interlocutors emerge at the interfaces between the levels to ameliorate the
problem.
Several commentators have emphasized the importance of naturally occurring local learning
and innovation that takes place at all levels of an organization (e.g. Brown, 1991). A related
issue is one of facilitating the wider sharing of relevant aspects of such local knowledge
through appropriate organizational processes and systems. We found in the course our
study that in many situations, such knowledge is embedded in organically evolving and
bootlegged systems quite independent of the formal mechanisms available for
systematizing and sharing. The reasons for the emergence of such local mechanisms are
varied and include:
- the need for specialized structures and capabilities to represent the knowledge;
- lack of timely support and assistance for the use of standardized tools and mechanisms; and
- idiosyncratic attachment to particular computer tools and legacy systems by some of the employees.
In any case, the need to track, evaluate, support, and selectively integrate such local and
bottom-up systems cannot be overstated. These are useful initiatives arising from the
need to formalize and disseminate localized knowledge with potentially wider implications
that arise as a by-product of routine operations. We were able to document a number of
these at both MCM and EP though the preponderance of such initiatives at MCM is worth
noting.
The central engineering group at Du Pont has traditionally been a pioneer in the use of
electronic collaboration technology. This is not particularly surprising given the global
distribution of project work and the need for rapid, multi-directional knowledge flow among
Du Pont engineers, vendors, external contractors, and joint venture partners. As one of the
respondents put it, it is a trade-off between working remotely with the collaboration tools and being away from family for long durations. Also, a great deal of effort has been
expended in the past aimed at codifying and standardizing a large part of the process
knowledge generated over the past 50 years of project work. We observed a preponderance
of explicit knowledge-based mechanisms being successfully deployed and used in EPM.
More importantly, systematized knowledge in the areas of systems for plant safety (design
and monitoring) and environmental pollution prevention has become a saleable commodity
in the form of consulting packages. There is growing demand for such packaged know-how
especially in the Asia-Pacific region. The ability of the engineering design group to
transcend the local-global divide may be attributable to the maturity of the knowledge base
and the relative homogeneity in engineering education.
There are differences between the Du Pont units in the modes of structuring and organizing
knowledge. For instance, MCM operates in a highly dynamic market place in which rapid
learning from the customers and adapting to their changing needs are normative. This has
led to the organization of MCM repositories to be primarily based on products, customer
segments, and application areas and only secondarily on materials. This is despite the fact
that the training and specialization of the scientists and technical support personnel is by
materials. In contrast, the business strategy of EP is driven by material push and the
knowledge organization at EP is primarily along the material dimension (such as Hytrel,
Minlon, Delrin, etc.) and within each by major customer groups and original equipment
manufacturers (OEMs). The need for multiple and divergent classification schemes and
indexing systems is a routine aspect of knowledge processing. This has important
implications for the design, implementation, use, and maintenance of company-wide KSM
solutions with their implicit assumptions regarding the relative homogeneity of knowledge
representation schemes.
The resources needed to carry out the difficult and unrewarding task of combining tacit
and often fragmented forms of explicit knowledge into systematized and reusable
knowledge and to make it publicly available are considerable. While central engineering has
been successful at this for some time, the relationship between the availability of
organizational slack and the group’s ability to standardize and deliver such knowledge is
worth exploring. One of the issues arising from the massive outsourcing and downsizing at
central engineering relates to the reduction in slack and its effect on codification work.
There are also concerns about continual erosion of the knowledge base with the
department having to rehire ex-employees and to bring in employees from contracting
firms to stem the erosion.
Relatively tacit knowledge
It is certainly the case that knowledge workers consult a variety of sources of explicit
knowledge and fall back on their learning from formal study and training. However, in the
course of actually doing their jobs, most of the learning arises from engaging with real
problems, gathering a range of relevant information from diverse sources, and discussing
key issues with colleagues and other professionals. Most of the R&D respondents in
particular have a strong network of people spread out all over the world that they can turn
to for consultation and guidance. A surprisingly large number of such contacts are from
outside their own units, in other business units or departments in Du Pont, or from external
entities such as vendors, customers, contractors, universities, and joint venture partners.
Some of the respondents can count upwards of 300 people in their network and the
intensity of interaction varies over time depending on the tasks on hand. These are
typically bottom-up formations and successful R&D scientists spend considerable time
and energy developing and nurturing such networks. The diffusion of tacit knowledge in
particular through these social networks is quick. The track record of the informants in
terms of both the reliability of the acquired knowledge and trust largely determines the
length of such associations. Informal protocols of reciprocity also feature in the
assessment of track record.
Du Pont has long recognized and promoted mechanisms for such networking. In-company
technology conferences and periodic seminars involving people from diverse businesses
and technology areas provide facilitative forums for making initial contacts and extending
one’s networks. Other processes include apprenticeship training for new scientists under
well-established R&D personnel as well as the rotation of the latter through newer research
sites. The Utsunomiya (Japan) research facility in MCM was established and developed
primarily through people rotation and active apprentice-type training programs. In
addition, short visits by scientists either to the experimental station or to other research
sites within the business units are seen as useful mechanisms for the sharing of less
tangible aspects of knowledge. The reliance on these mechanisms based on ‘‘high
bandwidth’’ channels was more pronounced in R&D as compared to EPM and within R&D,
greater in MCM in comparison with EP. This is consistent with the general observation that
the higher the role of tacit knowledge, the greater the reliance on human- and organization-centric
mechanisms.
There is some concern in the R&D groups that the increasing dependence on electronic
communication and coordination and the reduced opportunities for face-to-face contact are
causing a slow erosion in this mode of knowledge sharing. One of the requirements for any
social network to remain active and to be effective in knowledge sharing is the periodic
opportunity to ‘‘catch-up’’ through face-to-face meetings, albeit infrequently. Such meetings
enable the establishment and/or reinforcement of a common ground needed to cultivate and
develop the trust and reciprocity implicit in such relationships. When the operations are
globally dispersed as they are in MCM or EP, some of the faster and less expensive
mechanisms such as telephone, e-mail, NetMeeting, and even videoconferencing are
constrained by linguistic, cultural, and time-zone related barriers. Sustaining such networks
involves expensive travel and movement. The curtailment of resources for this purpose in
recent years and the increasingly rigorous quarterly financial reporting systems are putting
pressure on the quality and vitality of such networks. This is particularly hard on the younger
scientists who are in the process of developing their personal networks and have to rely on
remote, electronic mechanisms to a very large extent.
Human agents were sought for a variety of reasons. Probably the strongest one that came
up repeatedly in our interviews was the perception that it resulted in significant efficiencies
and time savings. An example from an engineering polymers R&D center at Meyrin,
Switzerland is illustrative. One of the laboratories there provides analytical support and
carries out a wide range of routine and specialized tests on properties of new materials,
identification of additives etc. When a scientist approaches the supervisor who has
managed this laboratory for the past 15 years, he can give good and immediate answers to
questions such as whether the same test has been carried out previously and if not
whether the results of a closely matched previous test can be extrapolated for a particular
problem that a scientist is trying to solve. The information needed to extract the knowledge
is available in the form of a database of previous tests carried out and the results but this
may not have the format and flexibility requirements that the scientist needs. Searching
through this database is also time-consuming. Besides, the effort and difficulty involved in
externalizing the tacit dimension of subtle matches with previous tests is non-trivial. In
performing even routine research tasks such as interpreting the results of certain complex
analytical tests, sitting down with experts who can help with the interpretation saves much
time and reduces some of the guesswork.
Arising from a combination of historical factors, Du Pont R&D is faced with a bimodal
age distribution of its personnel. The age cohort in the 50-plus range is very
large, in part due to the expansionary spiral of the 1950s through early 1970s. The 40-50 age
cohort is small but there is a small increase in numbers in the 25-40 age group. This
introduces vulnerabilities in the form of potential erosion of the tacit knowledge base in the
absence of tacit-to-tacit transfer and tacit-to-explicit conversion of knowledge. Such
programs are resource-intensive and can only succeed in a climate of trust and security.
Also, they need to be guided by sound judgment on the future value of different bodies of
tacit knowledge in relation to the company’s strategic direction and the emerging trends in
the relevant product-market environments.

The phased implementation of the integrated electronic coordination architecture and the
C3PO project has generated incentives and pressures to make a transition to new ways of
collaborating and knowledge sharing. For some aspects of knowledge work, this is simply a
matter of adapting to a new set of computer tools, which complement other modes.
However, the extent to which this regime can or needs to supplant the existing, organic and
informal mechanisms of tacit knowledge sharing is unclear at this stage. Many of the
respondents do make a clear distinction between electronic collaboration and the sharing of
knowledge.

There is a clear understanding in Du Pont of the limits of the electronic communication,
collaboration, and coordination systems from a knowledge processing perspective. These
systems have laid the foundation for communication, information sharing, and some
amount of collaborative work across distances. They have also provided the means for more
efficient and wider, yet selective, dissemination of documents and codified knowledge.
However, the recognition that people and their networks have to be effectively and
seamlessly interwoven with the computer networks and systems has triggered the search
for new approaches.
Even the most ardent proponents of electronic communication and coordination concede
that knowledge management can only be roughly 30 percent based on the systems being
implemented and the rest has to be based on people. Accordingly, a campaign to facilitate
the evolution of communities of practice through the use of communication and collaboration
systems has been launched. These communities are largely organic formations comprising
members with specialized skills or know-how. They may rely on technology to identify
potential members and establish communities by sharing useful information, offering
guidance and critique, recounting related experiences etc. There are currently over a dozen
such communities (such as winding experts, people with extrusion skills) in existence at Du
Pont and they function like virtual guilds. A project to inventory and assess the effectiveness
of such on-line communities is currently in progress.
Innovation-oriented knowledge processing involving new product and process
development necessitates the integration of vast amounts of explicit knowledge with
certain unique and deeply personal insights acquired through direct practical experiences.

The validity and veracity of aspects of the new knowledge being synthesized need to be
tested and established through ongoing experimentation and consultations with domain
experts. Much of the explicit knowledge available needs to be interpreted in the new context.
We have documented a number of cases in which both routine and radical innovations were
produced by combining different types of knowledge and making leaps that cut across
traditional boundaries drawn by existing technologies, business units, and even disciplines.
Once the feasibility of the initial idea is established, detailed experiments, trials, and
tests can follow.
One of the main aspects of the integrated knowledge processing described in the
foregoing is the pivotal contribution of key knowledge practitioners, who typically
combine an extensive stock of tacit knowledge with other knowledge in order to bridge the
local and the global. We were able to identify four such knowledge practitioner roles that
typically operate at the interfaces between disciplines, technologies, business units,
functions, and businesses and customers. Each of these plays an important yet differing
role in ameliorating the local-global divides. A brief description of each of the roles is
presented below:
B High level synthesizer: these are typically senior technology or R&D managers with a wide
range of experiences across several business units and in several operational and
functional areas. They are adept at environmental scanning and closely monitor the
trajectories of various technologies. They operate at the interfaces between different
technology areas and business units. They develop and maintain a vast network of formal
and informal knowledge sources. They look for opportunities arising from various
combinations of different types of knowledge.
B Librarian: librarians operate at the interfaces between materials, processes and product
markets. By virtue of their ability to painstakingly gather, assimilate, index, and store
copious amounts of information pertaining to material properties, costs, customer
requirements, and changes to the requirements over time, they are able to provide timely
assistance and service to other knowledge workers. They deal primarily with explicit
knowledge. It takes nearly 10-15 years’ experience to grow into this knowledge
practitioner role.
B Knowledge engineer: knowledge engineers operate at the classic interface between
R&D, marketing, and customers. They work closely with the customers to adapt existing
products and to develop new products and applications. Some of the work might
appear rather routine but the tacit dimension of knowledge is in their assessment of
customers’ current and future needs. This knowledge has to be integrated with detailed
and explicit knowledge concerning the company’s products, processes, and
technologies.
B Knowledge operators: knowledge operators are typically technical or customer support
personnel who are usually front-line employees located close to the operational realities of
the business and the market place. They accumulate and transmit tacit knowledge in the
form of embodied skills (Nonaka and Takeuchi, 1995). They tend to work at the interfaces
between R&D and manufacturing or R&D and customer operations. They work very
closely with knowledge engineers.

Conclusion
This paper contributes to the growing literature on the organizational processes and
mechanisms for knowledge sharing and management especially in large, global
companies. By drawing on and synthesizing related bodies of writing, we have
attempted to further the debate on what constitutes organizational knowledge. The
diverse forms in which such knowledge manifests itself led us to review some of the
mechanisms and systems for knowledge sharing and management in contemporary
organizations, and the issues that need to be addressed to effectively ‘‘globalize’’ at least
part of the locally generated knowledge. A detailed field study exploring the major
knowledge sharing issues, practices, constraints, and mechanisms was carried out in
selected departments and business units of a large, multinational company with

knowledge-intensive operations. The field study lends qualified support to the proposition
that exclusive focus on certain types of knowledge can be counter-productive. Knowledge
sharing and management systems and processes in large global companies need to be
integrative and flexible enough to facilitate the dynamic interplay between different forms
of knowledge across space and time.

Note
1. The role and influence of strong knowledge in the activities of the R&D department cannot be ruled
out.

References
Antonelli, C. (1996), ‘‘Localized knowledge production processes and information networks’’, Journal of
Evolutionary Economics, Vol. 6, pp. 281-96.
Antonelli, C. (1999), ‘‘The evolution of the industrial organization of the production of knowledge’’,
Cambridge Journal of Economics, Vol. 23, pp. 243-60.
Arrow, K. (1969), ‘‘Classificatory notes on the production and transmission of technical knowledge’’,
American Economic Review, Vol. 59, pp. 29-35.
Arrow, K. (1994), ‘‘Methodological individualism and social knowledge’’, American Economic Review,
Vol. 84, pp. 1-9.
Bartlett, C.A. and Ghoshal, S. (1989), Managing across Borders: The Transnational Solution, Harvard
Business School Press, Boston, MA.
Brown, J.S. (1991), ‘‘Research that reinvents the corporation’’, Harvard Business Review, Vol. 69 No. 1,
pp. 102-11.
Brown, J.S. and Duguid, P. (1991), ‘‘Organizational learning and communities of practice: toward a
unified view of working, learning, and organization’’, Organization Science, Vol. 2 No. 1, pp. 40-57.
Cohendet, P., Kern, F., Mehmanpazir, B. and Munier, F. (1999), ‘‘Knowledge coordination, competence
creation, and integrated networks in globalized firms’’, Cambridge Journal of Economics, Vol. 23,
pp. 223-41.
Coyne, R. (1997), ‘‘Language, space, and information’’, in Droege, P. (Ed.), Intelligent Environments,
Elsevier, Amsterdam, pp. 495-516.
Davenport, T.H., de Long, D.H. and Beers, M.C. (1998), ‘‘Successful knowledge management projects’’,
Sloan Management Review, Vol. 39 No. 2, pp. 43-57.
David, P.A. (1993), ‘‘Knowledge property and the system dynamics of technological change’’,
Proceedings of the World Bank Annual Conference on Development Economics, The World Bank,
Washington, DC.
Doz, Y. and Prahalad, C.K. (1991), ‘‘Managing MNCs: a search for a new paradigm’’, Strategic
Management Journal, Vol. 12 No. 5, pp. 145-64.
Dretske, F. (1981), Knowledge and the Flow of Information, MIT Press, Cambridge, MA.
Eliasson, G., Foster, S., Lindberg, T., Pousette, T. and Taymaz, E. (1990), The Knowledge-Based
Information Economy, The Industrial Institute for Economic and Social Research, Stockholm.
Galbraith, J.R. (1974), ‘‘Organization design: an information-processing view’’, Interfaces, Vol. 4 No. 3,
pp. 30-6.
Geertz, C. (1983), Local Knowledge: Further Essays in Interpretive Anthropology, Basic Books Inc.,
New York, NY.
Goldman, A.I. (1999), Knowledge in a Social World, Oxford University Press, Oxford.
Jorde, T.M. and Teece, D.J. (1990), ‘‘Innovation and cooperation: implications for anti-trust’’, Journal of
Economic Perspectives, Vol. 4, pp. 75-96.
Kerssens-Van Drongelen, I.C., de Weerd-Nederhof, P.C. and Fisscher, O.A.M. (1996), ‘‘Describing the
issues of knowledge management in R&D: toward a communications and analysis tool’’, R&D
Management, Vol. 26 No. 3, pp. 213-30.

Kogut, B. and Zander, U. (1992a), ‘‘Knowledge of the firm, combinative capabilities, and the replication
of technology’’, Organizational Science, Vol. 3 No. 3, pp. 383-97.
Kogut, B. and Zander, U. (1992b), ‘‘Knowledge of the firm, technology transfer, and the theory of the
multinational corporation’’, working paper, Institute of International Business, Stockholm School of
Economics, Stockholm, December.
Machlup, F. (1980), Knowledge: Its Creation, Distribution, and Economic Significance, Princeton
University Press, Princeton, NJ.

March, J.G. and Simon, H.A. (1959), Organizations, John Wiley, New York, NY.
Nelson, R.R. and Winter, S.G. (1982), An Evolutionary Theory of Economic Change, Harvard University
Press, Cambridge, MA.
Nonaka, I. (1991), ‘‘The knowledge-creating company’’, Harvard Business Review, Vol. 69 No. 6,
pp. 96-104.
Nonaka, I. (1994), ‘‘Dynamic theory of organizational knowledge creation’’, Organization Science, Vol. 5
No. 1, pp. 14-35.
Nonaka, I. and Takeuchi, H. (1995), The Knowledge-Creating Company, Oxford University Press,
New York, NY.

Polanyi, M. (1967), The Tacit Dimension, Doubleday Anchor, Garden City, NY.
Reddy, M. (1979), ‘‘The conduit metaphor: a case of frame conflict in our language about language’’,
in Ortony, A. (Ed.), Metaphor and Thought, Cambridge University Press, Cambridge, pp. 284-324.
Simon, H.A. (1977), The New Science of Management Decision, Rev. ed., Prentice-Hall, Englewood
Cliffs, NJ.
Simon, H.A. (1996), ‘‘Knowledge and the time to attend to it’’, paper presented at the Carnegie Bosch
Institute International Conference on High Performance Global Companies, Boca Raton, FL, April 21,
available at: http://cbi.gsia.cmu.edu/work/96-2.htm

Von Krogh, G., Ichijo, K. and Nonaka, I. (2000), Enabling Knowledge Creation, Oxford University Press,
Oxford.

Appendix

Table AI Participant profile

                            Micro circuit materials                  Engineering polymers
                            Central  Central      Exp.           Bristol  Utsonomiya  Meyrin   Parkersburg  Exp.
                            R&D      engineering  station  RTP   (UK)     (Japan)     (Swiss)  (WVA)        station  Total
Total no. interviewed       5        5            3        9     7        5           8        1            1        44
No. of PhDs                 5        0            3        5     4        0           6        1            1        25
Percent of PhDs             100      0            100      56    57       0           75       100          100      57
Average no. of years
  at Du Pont                19.2     23.6         29.0     21.0  11.3     11.6        13       32           21       17.6
Range of years at Du Pont   11-26    14-34        11-33    9-34  1-18     9-14        2-26     32           21       1-34

Knowledge management systems:
finding a way with technology
John S. Edwards, Duncan Shaw and Paul M. Collier

Abstract
Purpose – To consider the role of technology in knowledge management in organizations, both actual
and desired.
Design/methodology/approach – Facilitated, computer-supported group workshops were conducted
with 78 people from ten different organizations. The objective of each workshop was to review the
current state of knowledge management in that organization and develop an action plan for the future.
Findings – Only three organizations had adopted a strongly technology-based ‘‘solution’’ to knowledge
management problems, and these followed three substantially different routes. There was a clear
emphasis on the use of general information technology tools to support knowledge management
activities, rather than the use of tools specific to knowledge management.
Research limitations/implications – Further research is needed to help organizations make best use
of generally available software such as intranets and e-mail for knowledge management. Many issues,
especially human, relate to the implementation of any technology. Participation was restricted to
organizations that wished to produce an action plan for knowledge management. The findings may
therefore represent only ‘‘average’’ organizations, not the very best practice.
Practical implications – Each organization must resolve four tensions: between the quantity and
quality of information/knowledge, between centralized and decentralized organization, between head
office and organizational knowledge, and between ‘‘push’’ and ‘‘pull’’ processes.
Originality/value – Although it is the group rather than an individual that determines what counts as
knowledge, hardly any previous studies of knowledge management have collected data in a group
context.

John S. Edwards, Duncan Shaw and Paul M. Collier are all at the Aston Business School, Aston University, Birmingham, UK.
Keywords Knowledge management, Communication technologies, Organizations
Paper type Research paper

Introduction
One of the fundamental questions in knowledge management is that of the appropriate role
of information technology in knowledge management in organizations. There are various
possible positions on this. Is an organization’s knowledge management system just an
information technology one? Is information technology a part, but only a part, of a knowledge
management system? Or is information technology really not a key issue in managing an
organization’s knowledge, compared with others such as people or process issues?
In this paper we try to shed light on these questions, using some of the results of a study into
what a variety of organizations in the UK currently do by way of knowledge management,
and what they believe they should be doing. We begin by reviewing some of the literature on
information technology, knowledge management and knowledge management systems. We
then explain the background to our study and briefly describe the methodology we used. We
then concentrate on the three organizations in the study that have pursued what we
identified as ‘‘technology-based’’ solutions. Discussion of the general issues raised by these
three cases (and others) leads to our conclusions and thoughts about the future of
knowledge management systems.

This research was funded by CIMA, the Chartered Institute of Management Accountants in the UK.
The authors are also grateful for the comments of the anonymous referees.

DOI 10.1108/13673270510583009 VOL. 9 NO. 1 2005, pp. 113-125, Emerald Group Publishing Limited, ISSN 1367-3270, JOURNAL OF KNOWLEDGE MANAGEMENT, PAGE 113
Information technology for knowledge management
This paper concentrates on technological aspects of knowledge management (KM),
although this is not to imply that this is the most important area. Davenport and Prusak (1998)
describe KM as involving organizational, human and technical issues, with the advice that
the technical should be treated as least important of the three. Dieng et al. (1999) add
financial, economic and legal issues to this list. Our brief literature review here will similarly
center on technology, and on knowledge management systems, again without wishing to
imply that this is therefore the most important aspect of KM.
Many authors have written about the use of various types of software in knowledge
management, including Junnarkar and Brown (1997), Offsey (1997), Liebowitz (1998),
Borghoff and Pareschi (1998), Dieng et al. (1999), Alavi and Leidner (1999), Hendriks and
Vriens (1999), Earl (2001) and Alavi and Leidner (2001). Since the early days of knowledge
management there has been a particular stream of thinking that stresses the use of
knowledge-based systems software in knowledge management. Strapko (1990) was
discussing this point even before the term knowledge management came into common use,
while Liebowitz has been one of its main proponents, arguing that expert systems have a
crucial role in institutional memory, because of their ability to capture business rules.
Becerra-Fernandez (2000) gives a different kind of example, a people-finder system. It is
clear that expert or knowledge-based systems software, and artificial intelligence (AI)
software more generally, does have a role to play in supporting knowledge management, but
in addition, so does more conventional software.
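The business-rules argument can be made concrete with a toy example. The sketch below is hypothetical (the rule names are invented and no real expert-system shell is used); it simply shows the mechanism Liebowitz's argument relies on: once decision rules are captured explicitly, the organization can retain and re-run logic that would otherwise live only in employees' heads.

```python
# Minimal sketch, assuming invented rules: forward-chaining inference over
# explicitly captured business rules, the core idea behind using expert
# systems as institutional memory.

# Each rule: (set of required facts, fact to conclude)
rules = [
    ({"order_over_10k"}, "needs_manager_approval"),
    ({"customer_is_new", "needs_manager_approval"}, "needs_credit_check"),
]

def infer(facts, rules):
    """Forward-chain: keep applying rules until no new facts appear."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for required, conclusion in rules:
            if required <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

result = infer({"order_over_10k", "customer_is_new"}, rules)
print(result)  # includes the two derived facts
```

The point is not the triviality of the rules but that they are inspectable and survive staff turnover, which is exactly the institutional-memory role claimed for expert systems.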
Table I shows the most common forms of both AI-based and conventional software that
have been suggested by various authors as offering support for knowledge management.
It is noticeable that different authors address this discussion in terms varying from the
very general (such as knowledge based systems and databases) to the very specific
(such as genetic algorithms and workflow). Table I shows the terms as authors have used
them.
Surveys of the use of knowledge management systems include those by Alavi and Leidner
(1999) and Zyngier (2001), and a less formal one by Edwards et al. (2003b). Our intention is
not to go into detail about the various types of supporting software here, discussing their
advantages and disadvantages, since our focus in this paper is on which of these systems
organizations currently use, and would like to use.

Table I Different types of support for knowledge management

AI-based                     Conventional

Case-based reasoning         Bulletin boards
Data mining                  Computer-supported co-operative work
Expert systems               Databases
Genetic algorithms           Data warehousing
Intelligent agents           Decision support systems
Knowledge-based systems      Discussion forums
Multi-agent systems          Document management
Neural networks              Electronic publishing
‘‘Push’’ technology          E-mail
                             Executive information systems
                             Groupware
                             Information retrieval
                             Intranets
                             Multimedia/hypermedia
                             Natural language processing
                             People finder/‘‘Yellow Pages’’
                             Search engines
                             Workflow management

Study background and methodology

The organizations
We conducted ten workshops, one in each of ten different organizations. Two of the
organizations agreed to participate as a result of direct contact made by the researchers.
Eight organizations agreed to participate following a mailing to MBA alumni of the university.
These contacts became the sponsors of the research and arranged for the participants from
their organizations. We sought organizations with a genuine interest in, and concern for
knowledge management, and we also wished to ensure that a variety of different sizes and
types of organization was included.

Between five and ten participants – all from the same organization – attended each
workshop. In total there were 78 participants who came from a variety of functional areas.
Each workshop included an accountant, a requirement of our funding from the Chartered
Institute of Management Accountants. With that exception the participants in each workshop
were those selected by each organization. The criteria suggested by the researchers were
that the participants should include ‘‘a sufficient spread of people with awareness of, and
responsibility for, knowledge management’’ and also ‘‘one person responsible for securing
the commitment of resources towards achieving whatever outcomes and actions are
decided upon’’. In the event, most participants were middle or senior managers, with a
sprinkling of junior managers and operational-level staff. In all but two of the workshops, one
participant was at director-level or equivalent. By having a director present the groups
seemed more confident in the strategy they were generating as they were getting immediate
informal feedback on how the board might react, and so were able to appreciate whether or
not they would realistically be allowed to implement any proposed actions.
Of the ten organizations, six were for-profit, three were not-for-profit or non-profit-distributing
and one was public sector. One of the not-for-profit organizations also received significant
government funding. Three of the six for-profit organizations were listed PLCs, two of which
were divisions of FTSE 100 companies. Two organizations were privately owned and one
was a subsidiary of an overseas PLC.
In terms of ‘‘business’’ sector, one was in retailing, two in manufacturing, one in
design/distribution, three in services, one in consumer protection, one in social housing and
the public sector organization was a police force.

The participating organizations are summarized in Table II. The identity of the organizations
has been disguised for reasons of confidentiality.

Table II Participating organizations


Organization Brief description of organization

Restaurants Retail/service business with about 12 major brand names, division of FTSE100
PLC
Police Public sector/police force with 3,700 staff and £140 million budget
DesignInst Design/installation of high technology equipment, subsidiary of overseas-listed
company
HighTechManuf Manufacturing high technology, £100 million turnover and 800 employees,
privately owned
Consult International technical/engineering consultancy, division of FTSE100 PLC
B2BService Business-to-business services, 12,000 employees, turnover £200 million, listed
PLC
R&D Non-profit distributing membership-owned research and development, 550
employees
Housing Non-profit registered social landlord, 500 employees managing 5,500 homes
ManufIndProd Manufacturing industrial products, privately owned
ConsumProt Not-for-profit membership owned non-statutory consumer protection body

As part of our analysis (Edwards et al., 2003a), we classified the organizations’ preferred
knowledge management ‘‘solutions’’ into three types:
(1) Technology-based.
(2) People-based.
(3) Process-based.
Five organizations’ approaches were classified as people-based (HighTechManuf, Consult,
B2BService, R&D, and Housing). Two were process-based (ManufIndProd and
ConsumProt). We add further detail here on the three organizations preferring an
approach to knowledge management that placed a particular emphasis on technology:
(1) ‘‘Restaurants’’ was the restaurants division of a listed PLC operating under about a
dozen major brand names throughout the UK. Restaurant turnover was £1 billion in the
last financial year. Most participants were from the planning and insight department.
Because of the selection of participants, the workshop emphasized ‘‘head office’’
knowledge rather than the knowledge in the operating units.
(2) ‘‘Police’’ was an English police force with 2,400 police officers, 1,300 support staff and a
budget of £144 million. Prior to the workshop, ‘‘Police’’ had increased the police levy (the
portion of the council tax that pays for police services) by 33 percent and wanted to
develop a communications strategy, ‘‘a shared commitment to a shared plan’’.
(3) ‘‘DesignInst’’ was the design and installation division of a high technology equipment
supplier, a subsidiary of an overseas listed company. They were implementing a new
enterprise accounting system and wanted to ‘‘make sense of the information we have’’.

The workshop approach


The methodology used to run the workshops is one that has evolved during more than 15
years of research, initially called strategic options development and analysis (SODA) (Eden
and Ackermann, 1989) and more recently being renamed journey making to take account of
advances in the method (Eden and Ackermann, 1998). Journey making, a mnemonic for JOint
Understanding, Reflection, NEgotiation of strategY, supports groups in surfacing, exploring,
synthesizing and critically reflecting for personal and collective learning (Shaw et al., 2003).
During a journey making workshop computer technology is used extensively to help the
participants to surface, explore and synthesize their views. Each participant has access to a
laptop computer that is networked. Instead of shouting out views to the facilitator, or writing
them onto ‘‘post-it’’ notes, participants type their views into the computer, which is running
brainstorming-type software, Group Explorer. The views are normally four to ten words in
length to make them descriptive, rather than cryptic, to the other participants.
Once participants have finished typing their views into the computers, all the views are
shown on a large projection screen using Decision Explorer software. They will have been
clustered by content by the facilitator, to help the group members cognitively manage
the mass of information on the screen (up to 100 different views) (Grise and Gallupe, 1999).
Then participants have the opportunity to read other participants’ views, expand on them, or
critique them (Shaw, 2003).
Following this activity group discussion ensues on the views, clusters and causal
relationships. Normally a large number of views are considered and a tremendous amount of
complexity arises. The different perspectives of (up to) 15 people are each considered
systematically using a transparent, structured and logical process. This ensures that:
B the group will make real progress rather than going round in circles;
B there is equalization of air-time between participants, reducing the dominance of any
individuals; and
B each option can be fully considered before being dismissed or integrated into the action plan.
Computer brainstorming has the advantage that people:

B can share ideas/views simultaneously, rather than all fighting for ‘‘air-time’’ (Pinsonneault
et al., 1999);
B have anonymity when they share their views, to encourage controversial views to be
shared (Cooper et al., 1998);
B can record their views accurately, rather than risk the facilitator misunderstanding them
(Eden and Ackermann, 1998); and
B can, as a group, edit and move views en masse, rapidly.
The output from this process takes the form of a group causal map, or strategic map, an
example of which is shown in Figure 1 (this example, from B2BService, has been chosen to
show that there was more to the workshop discussions than IT). This map can be analyzed to
identify a range of actions that might be implemented to improve the situation. Group
consideration and negotiation support the identification of the right actions to implement.
Through this entire process the participants are building a map, negotiating agreement, and
giving commitment to the group to support action being taken to address the situation
(Eden and Ackermann, 1998).
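The structural analysis of such a causal map can be sketched as follows. This is an illustration only, with invented data; the workshops themselves used Group Explorer and Decision Explorer, not custom code. Views become nodes in a directed graph, causal links become edges, and candidate actions are assessed by which outcomes they can reach.

```python
# Hypothetical causal map: "view A leads to view B". Node names are invented
# for illustration; a real workshop map holds up to ~100 views.
links = {
    "run intranet training": ["staff use intranet more"],
    "tag documents consistently": ["staff find knowledge faster"],
    "staff use intranet more": ["staff find knowledge faster"],
    "staff find knowledge faster": ["better customer response"],
}

def reachable(node, links):
    """All views downstream of a node (depth-first traversal)."""
    seen, stack = set(), [node]
    while stack:
        current = stack.pop()
        for nxt in links.get(current, []):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

all_targets = {v for vs in links.values() for v in vs}
options = [v for v in links if v not in all_targets]  # tail nodes: candidate actions
goals = all_targets - set(links)                      # head nodes: desired outcomes

for option in options:
    hit = reachable(option, links) & goals
    print(f"{option!r} supports goals: {sorted(hit)}")
```

Options that reach many goals are natural candidates for the action plan, which is the kind of judgment the group negotiates in the workshop.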
A consistent four-stage process. In this research we adopted a consistent approach to all the
workshops. This involved having the same facilitator, and either one or two observers. We also
used identical technology and software and a standard four-stage agenda, which consisted of:
(1) Stage 1: what knowledge informs your business?
(2) Stage 2: what processes are currently used to harness this knowledge?
(3) Stage 3: what processes should be used to harness this knowledge?
(4) Stage 4: how do we (or should we) evaluate how good we are at harnessing this
knowledge? (‘‘We’’ refers to the participants’ organization.)

Figure 1 An example of a map produced in one of the workshop sessions

Although this agenda was consistent, it was used flexibly rather than restrictively. We
recognized that there was no one best solution to knowledge management, and so allowed
each organization to determine the specific content of the workshops within the broad field
of knowledge management and the research questions, to suit their own interest.
After the first session (which was necessary to get the group thinking together about
knowledge management) the participants decided whether our agenda was appropriate for
them and ‘‘whether (they) would regret not having discussed something else’’. Sometimes
the group followed our agenda throughout, but more often they added sessions and
refocused others to be more relevant to the expertise in the group and the urgency of
particular issues. For example, one group added a session which asked ‘‘how can we get
reluctant people to pull information off the intranet?’’ The debate which surrounded the
validation/amendment of our agenda provided insight into the pressing knowledge
management issues which faced the organizations.

Extent of technology use in KM


The workshop discussion covered many aspects of knowledge management in the
participating organizations. In this paper we concentrate on knowledge management
systems and the role of information technology, but discussion of other issues may be found
in Edwards et al. (2003a). Of the ten organizations in the study, information technology was a
significant element of the discussion in all but one of them. The one exception was
‘‘ManufIndProd’’, where although two types of information technology (e-mail and
knowledge-based systems) were mentioned during the identification of processes that
were relevant to current KM, neither was pursued in the subsequent detailed discussions.
We now summarize the discussion relating to information technology firstly in the three
organizations in which it was the major focus of discussion, and then in the six organizations
where it was a significant but minor element. This includes the various types of IT that
participants mentioned as being relevant to supporting KM, and a little indication of the
context of the discussion. Direct quotations from workshop participants are shown in italics.

Restaurants
A feature of the workshop held for ‘‘Restaurants’’ was its focus on ‘‘head office’’ knowledge
(such as sales, marketing and financial aspects) rather than on ‘‘operational’’ knowledge
(such as how to cook and serve meals in the restaurants, which dishes were most popular,
and so on).

Restaurants claimed to make extensive use of technology in supporting knowledge
management, including internet searching, an intranet, MIS, accounting and payroll
systems (the latter ‘‘for details about staff’’), shared databases, an ‘‘electronic library’’ and
an externally held data warehouse. They also talked about ‘‘cubes’’ of information, by which
they meant OLAP-style analyses, although these were not yet available because of the
unreliability and inconsistency of the data.
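What the participants meant by ‘‘cubes’’ can be illustrated with a small sketch. The data and dimension names below are invented; the point is only that an OLAP-style analysis aggregates one measure over any chosen combination of dimensions, sliced or drilled down at will.

```python
# Toy OLAP-style roll-up over invented restaurant sales data.
from collections import defaultdict

sales = [  # (brand, region, month, revenue)
    ("BrandA", "North", "Jan", 120.0),
    ("BrandA", "South", "Jan", 80.0),
    ("BrandB", "North", "Feb", 150.0),
    ("BrandA", "North", "Feb", 90.0),
]

def rollup(rows, *dims):
    """Sum revenue grouped by the selected dimension columns."""
    index = {"brand": 0, "region": 1, "month": 2}
    totals = defaultdict(float)
    for row in rows:
        key = tuple(row[index[d]] for d in dims)
        totals[key] += row[3]
    return dict(totals)

print(rollup(sales, "brand"))            # slice by brand
print(rollup(sales, "brand", "month"))   # drill down to brand x month
```

As the workshop discussion noted, analyses like this are only as good as the underlying data: unreliable or inconsistent source records make the aggregates untrustworthy, which is why the cubes were not yet available.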
Their desire for the future was thus for a single source of knowledge that required standard
site technology . . . ‘‘the long term fix is dependent on a technological solution’’. Interestingly,
plans were already in hand for such a technological solution to be implemented, but until the
workshop took place the participants did not seem to have appreciated the significance of
this system with regard to KM. There was definitely an ‘‘aha’’ moment during the workshop as
this became apparent, and the new system became more and more central to the action
plan for KM that they were trying to devise.

The best way to describe Restaurants is that they were led to a technological approach to
KM – perhaps due to a focus on understanding the customer. Despite the extensive use of
technology in Restaurants, participants had a broad view of KM but realized during the
workshop how important technology was to it.

PAGE 118 JOURNAL OF KNOWLEDGE MANAGEMENT VOL. 9 NO. 1 2005
Police
‘‘Police’’ as an organization was new to formal KM, and the managerial initiative that led to
their participation took a broad view of what KM meant. However, most of the suggestions
and discussion in the workshop were strongly technology-based. Police forces in general
make extensive use of information technology and indeed other types of technology
(everything from helicopters to DNA profiling), and this force is no exception.
Current uses of IT for KM included e-mail, the intranet, and the Police National Computer,
although more than any other organization the descriptions often concentrated on the
hardware (e.g. notebooks or personal organizers) rather than on what was done with it.
Police also make extensive use of video, although discussion of this technology is beyond
the scope of this paper.
A great deal of the discussion in the workshop focused on how much and how well the official
IT systems were used, as illustrated by the following quotes:
How many people are logged on [is not a good measure] . . . of how many people used it. A whole
room will use data if one person is logged on.
A PC [police constable] has to access information at the beginning of their shift before going on
patrol to be able to do their job.
We need to make some things only available via the intranet, for example [forms for] expense
payments, annual leave, overtime.

An issue not raised in any other workshop was that of the use of unofficial IT systems.
‘‘Privately owned organizers and laptops that people shouldn’t have are a barrier to
communication.’’

The overall focus of the Police workshop was on communications. For the future, the intranet
was ranked as the most effective tool for corporate communications and second (after
intelligence led policing) for operational communications:

If you only put information in one place, that is where people will go to use it.

A corporate web site was also identified as a good way to communicate with external
stakeholders.
The best description for this case is that Police were opportunistic with information technology – indeed, that they cannot keep away from technology – using it to improve processes and provision to ‘‘customers’’/stakeholders.

DesignInst
‘‘DesignInst’s’’ attitude to KM was definitely technology-driven, although there is scope for a
little debate as to whether they were driven to technology, because of the support needed for
their business processes, or driven by technology, with their interest in KM arising from the
introduction of a new information system. It was this system, and the uses that could be
made of it, that formed the focus for the workshop.
Discussion in this workshop concentrated almost entirely on the future, rather than the
present. They identified their new ‘‘enterprise accounting system’’ as a key resource for all
aspects of the business, including KM, and also the need to make fragmented knowledge
more coherent.
Examples of this included a discussion of online access to stock availability. One participant
said:
The management system will not do anything for us in relation to ‘‘how do we find new customers’’.

The same person also said that:


Not enough time spent on outward looking areas, we are too inward looking.

Another commented that:

The nature of what is produced is that the system doesn’t provide product development
information.

A frequent theme was the need to define a list of MIS requirements and reduce duplication of
information held. They called this a ‘‘flight deck’’ for the business.
Interestingly, what was never explicitly discussed was whether the new system addresses
the clusters that participants identified during the workshop as the key elements of
knowledge their organization needed.
There was a real conflict in the workshop between the ‘‘systems’’ side of the business and
the ‘‘product’’ side, hence the difference between the two themes of reducing duplication
and satisfying customers.

HighTechManuf
Current uses of information technology for KM were identified as including the internet (for
searching), an intranet, e-mail, bulletin boards and shared files. They also identified that the
organization had ‘‘islands’’ of databases (meaning they were unconnected). In considering
the processes that ‘‘HighTechManuf’’ should use for KM, the participants discussed a cluster
of ideas related to IT, but the focus was on internal communications (the label given to the
cluster) rather than the technology itself. By internal communications they meant what
needed to be communicated, and to whom. This was typical of their overall ‘‘people’’ focus.
Much of the discussion centered on sharing and storing operational manufacturing
knowledge, for which solutions such as a printed sheet of instructions (laminated to resist
grease etc.) were preferred to an information technology solution.

Consult
Current uses of IT for KM in ‘‘Consult’’ included e-mail, internet, intranet and shared
databases. In discussion they characterized databases as a ‘‘compendium of knowledge’’.
A specific concern was the over-use of e-mail, which resulted in a suggestion for ‘‘message
boards’’ rather than one to one e-mails. They also wished to see more and better integrated
databases. Again, their overall focus was on people, especially in relation to business
aspects, rather than technology.

B2BService
Current IT relevant to KM was identified as including e-mail, sundry databases, and
accounting software. Their discussions on KM were much more concerned with the market than with processes: ‘‘if we are driven by internal process issues we will fail in the marketplace’’. Thus they considered ‘‘how to achieve KM’’ – ‘‘KM through . . . ’’ under a
variety of headings, including a large cluster of ideas labeled ‘‘KM through IT’’, as shown in
the map in Figure 1. Items here included an intranet (and access to it from offsite), an
extranet, a data warehouse, better MIS, video conferencing, better use of the web and
protocols for the use of e-mail. Although technology was clearly an important element,
people were seen as an even more important aspect, as witnessed by the clusters on human
resources and (social) networking.

R&D
This organization was a very advanced information technology user. Systems identified as
relevant to KM included databases (internal and external), e-mail, bulletin boards, the
internet, and an intranet. However, they felt that the introduction of this technology had not
always been effective. ‘‘Informal mechanisms have been replaced by formal e-mail and the
information doesn’t get shared as well’’. They also had a history of confidentiality (because of
the nature of their work), which meant that many databases were not accessible; one participant called this ‘‘anal’’! Looking to the future, they stressed the need for scanning of
documents to help form a centralized document repository, enabling intranet access and
search, and with abstracting features. However, they felt that technology was not the main
issue in KM in their organization. Rather, it was crucial to concentrate on the people aspects,
especially the fact that the nature of the business had undergone a significant change.

Housing
Current uses of information technology for KM discussed in this workshop included e-mail,
an intranet, and shared files. ‘‘Housing’’ was also the only organization to mention the current
use of its own web site for communicating knowledge to external partners. For the future,
participants talked about an ‘‘electronic library’’ and encyclopedia, plus an extranet, and a
portal for news reports etc. They identified a critical need for ‘‘summarizing, abstracting,
disseminating’’ rather than just placing files somewhere where people might (or might not)
choose to access them. However, this was a relatively minor part of the discussion
compared to their interest in partnerships and (social/organizational) networking, and in the
issues raised by staff being split between their two main offices.

ManufIndProd
‘‘ManufIndProd’s’’ discussion concentrated on processes because of the recent
management buyout and a possible future change of location of their only site.

ConsumProt
‘‘ConsumProt’’ also had a process focus to their discussions on KM. In discussing current
processes, they identified e-mail and a cluster of more than a dozen items relating to different
databases. Interestingly, participants chose to include MS Word and MS Project files in this
cluster. Despite this extensive discussion about retaining knowledge, or perhaps information,
in databases, participants felt that it probably was not important to develop technology further.
This was mainly because of the difficulty of achieving any payback on such an investment over
‘‘ConsumProt’s’’ limited future life (its functions as a voluntary regulatory body are due to be
taken over by a statutory body in approximately two years). In the circumstances, they thought
it more important to transfer knowledge to people’s heads to enable them to get replacement
jobs, and also to be able to transfer knowledge to the replacement organization. The latter
need in particular accounted for much of the process focus.
Table III summarizes the various IT-based knowledge management systems mentioned by
participants in the ten workshops.

Discussion
Although based only on a small and relatively informal survey, the paper by Edwards et al.
(2003b) gives a flavor of the expectations of academics and practitioners about different types
of knowledge management system. Those most often cited were intranets, groupware, search
and retrieval tools, and data mining software. The type of information technology support most
favored for specific uses in their survey was groupware, but interestingly no use of groupware
for KM was mentioned in this sample of cases at all. This may have been because our
sample did not include any management consultancies; they were amongst the pioneers of
groupware, and are probably the most advanced users of it. In fact, the only interest
expressed in groupware was that participants in at least two of the workshops enquired about
buying their own copies of the software that was used to run the workshops.
In the cases we studied, there was a clear emphasis on the use of general information
technology tools (such as e-mail, shared databases and intranets) to support KM activities,
rather than the use of tools specific to KM. This is consistent with the findings of Zhou and
Fink (2003) for Australian organizations. The best example of a specific KM tool that we
found was that Restaurants strongly advocated the use of a data analysis system based on
OLAP or business intelligence principles (their reference to ‘‘cubes of data’’). Restaurants
already used an external data warehouse. B2BService also wanted to see a data
warehouse, while R&D and Housing were interested in repositories, although the discussion
on this topic in the Housing workshop seemed to have a ‘‘pie in the sky’’ element to it.
With the more general tools, the issue seems to be how to use them effectively in supporting
KM. None of the organizations with intranets seemed to be confident that they were using
them well, or even that they knew how to use them well. E-mail was used in all of the ten
organizations (although one did not see a significant connection with KM), but in almost

Table III Use of IT in KM – current systems (shown as X) and suggested future (shown as O)

Columns, in order: E-mail; Internet/web; Intranet/portal; Extranet; Shared databases/files; Repository; Data warehouse; Bulletin/message boards; KBS; Accounting system; Their own web site; MIS; OLAP; Payroll system

Restaurants    X X X X X X X X O X
Police         X X X X O
DesignInst     X O O O O O
HighTechManuf  X X X X X
Consult        X X X X O O
B2BService     X O O O X O X
R&D            X X X O O X
Housing        X O X O X O X
ManufIndProd   X X X
ConsumProt     X X O
every case there was dissatisfaction with its use, especially the tendency to copy everyone
in on everything. All of the organizations saw shared databases of some kind as important for
KM, but there was often uncertainty as to how best to achieve this.
It is generally accepted that there is no ‘‘one size fits all’’ solution to the use of technology to
support KM in organizations. Three of our ten organizations emphasized technology; at the
other end of the scale, two scarcely mentioned it. What each organization has to do in terms
of supporting its KM activities is to strike an appropriate balance between various tensions
apparent in the organization. This balance will differ, not only between different
organizations, but also perhaps for the same organization at different times.

We have identified four related tensions influencing decisions about IT and KM. The first is
the tension between the quantity and the quality of the information and knowledge being
managed (not helped by the confusion between information and knowledge displayed by
many participants). Examples of this were Restaurants’ inclusion of the payroll system as a
source of support for KM, and a strong emphasis on shared databases in several workshops
without much specific discussion of their content. A relevant question is: ‘‘has technology
simply increased the volume of unfocused data without helping to convert it into usable
knowledge?’’ This reinforces findings elsewhere in the literature, as Alavi and Leidner (2001)
put it: ‘‘Hoards of information are of little value’’.

Related to this is the second tension, between centralized and decentralized organizations.
Restaurants, as we have seen, wanted a centralized ‘‘solution’’: a single source of
knowledge based on standard site technology. However, this may have been influenced by
the fact that all of the workshop participants were from head office. Police were very aware of
this tension, especially the use of what one might call ‘‘independent’’ KM technology
(ranging from the unofficial to the dubiously legal). DesignInst expressed this tension as
being between an inward and an outward focus. A question arising here is: ‘‘does the
decentralized organization conflict with centralized knowledge ‘systems’ – does the KM
strategy imply a more centralized organization?’’ This raises significant issues about the
roles of the formal and informal organization in knowledge management. The importance of
the informal organization, especially social networks, has long been recognized in
management literature generally; see for example Krackhardt and Hanson (1993). These
ideas have been noted in the knowledge management literature, although, as
Holtham and Courtney (1998) point out, informal mechanisms may preclude wide
dissemination of knowledge. However, we believe that the relevance of informal information
systems to knowledge management, such as those in Police, has not been previously
recognized.
A third related tension is between ‘‘head office’’ and operational knowledge. Restaurants
scarcely considered operational knowledge at all in their workshop. For Police this tension is
a well-known problem, but unfortunately without a well-known solution. Anecdotal evidence
in Police is that operational knowledge is shared reasonably effectively, but greater efforts to
systematize this may have the opposite of the desired effect. This had already happened in a
pilot project in another organization, R&D. A question arising here is ‘‘who decides which
knowledge needs to be managed?’’ This does not appear to have been addressed so far, at
least in the literature relating to knowledge management systems.

The fourth tension is between ‘‘pushing’’ information and knowledge out to people and
leaving them to ‘‘pull’’ it when needed. There was general agreement that universal ‘‘push’’
systems did not work. This is consistent with the literature (e.g. Damodaran and Olphert,
2000). Holtshouse (1998) explains the need to balance push and pull approaches. However,
Police in particular recognized that some people were much more likely to choose to pull
knowledge for themselves than others were. A question here is thus: ‘‘how does one involve
what one police participant called ‘recalcitrant non-communicators’?’’ The need to involve
those who might least wish to be involved has been raised in other information systems
contexts, for example expert systems (see Edwards et al., 2000).

Finally, we see the general problems of alignment – making sure that the solution fits the
organization’s business processes. This is shown by the very different technology-based
solutions favored by DesignInst, Restaurants and Police.

Conclusions
Different solutions are appropriate and organizations need to find the solution that is right for
their context. There is a range of approaches that can be taken in considering technology to
assist KM – even just three technology-focused cases, as reported here, give three very
different approaches.
Within this we have identified four tensions that each organization must resolve:
(1) Between the quantity and quality of information/knowledge.
(2) Between centralized and decentralized organization.
(3) Between head office and operational knowledge.
(4) Between ‘‘push’’ and ‘‘pull’’ processes.
Finding the way to make best use of generally available software such as intranets and
e-mail for KM is perhaps the biggest single challenge.
Whatever technological route is adopted, there will also be many issues, especially human
ones, relating to the implementation of that solution. There is insufficient room to address
these here.

Limitations

The most apparent limitation of the study is that participation was restricted to organizations
that expressed an interest in knowledge management, and presumably wished to receive
some assistance from the researchers. This would therefore exclude both those who had no
interest in KM, and, more importantly, those who felt that they did not need any assistance
with KM. Our findings may therefore represent only ‘‘average’’ organizations, not the very
best practice.

Although our study was limited to UK organizations, we believe that the findings will still be
representative of organizations in other industrialized countries, because of the variety of
organizations covered. This will be true unless there are countries significantly ahead of, or behind, the UK in KM adoption.

References
Alavi, M. and Leidner, D.E. (1999), ‘‘Knowledge management systems: issues, challenges and
benefits’’, Communications of the Association for Information Systems, Vol. 1 No. 7, pp. 1-37.
Alavi, M. and Leidner, D.E. (2001), ‘‘Review: knowledge management and knowledge management
systems: conceptual foundations and research issues’’, MIS Quarterly, Vol. 25 No. 1, pp. 107-36.
Becerra-Fernandez, I. (2000), ‘‘The role of artificial intelligence technologies in the implementation of
people-finder knowledge management systems’’, Knowledge Based Systems, Vol. 13 No. 5, pp. 315-20.
Borghoff, U. and Pareschi, R. (1998), Information Technology for Knowledge Management, Springer,
New York, NY.
Cooper, W.H., Gallupe, R.B., Pollard, S. and Cadsby, J. (1998), ‘‘Some liberating effects of anonymous
electronic brainstorming’’, Small Group Research, Vol. 29 No. 2, pp. 147-78.
Damodaran, L. and Olphert, W. (2000), ‘‘Barriers and facilitators to the use of knowledge management
systems’’, Behaviour and Information Technology, Vol. 19 No. 6, pp. 405-13.
Davenport, T.H. and Prusak, L. (1998), Working Knowledge: How Organizations Manage what they
Know, Harvard Business School Press, Boston, MA.
Dieng, R., Corby, O., Giboin, A. and Ribiere, M. (1999), ‘‘Methods and tools for corporate knowledge
management’’, International Journal of Human-Computer Studies, Vol. 51, pp. 567-98.

Earl, M. (2001), ‘‘Knowledge management strategies: toward a taxonomy’’, Journal of Management
Information Systems, Vol. 18 No. 1, pp. 215-33.
Eden, C. and Ackermann, F. (1989), ‘‘Strategic options development and analysis (SODA) – using a
computer to help with the management of strategic vision’’, in Doukidis, G.I., Land, F. and Miller, G.
(Eds), Knowledge-based Management Support Systems, Ellis Horwood, Chichester, pp. 198-207.
Eden, C. and Ackermann, F. (1998), Making Strategy: The Journey of Strategic Management, Sage,
London.

Edwards, J.S., Collier, P.M. and Shaw, D. (2003a), Management Accounting and Knowledge
Management, CIMA, London.
Edwards, J.S., Duan, Y. and Robins, P.C. (2000), ‘‘An analysis of expert systems for business decision
making at different levels and in different roles’’, European Journal of Information Systems, Vol. 9 No. 1,
pp. 36-46.
Edwards, J.S., Handzic, M., Carlsson, S. and Nissen, M. (2003b), ‘‘Knowledge management research
and practice: visions and directions’’, Knowledge Management Research & Practice, Vol. 1 No. 1,
pp. 49-60.

Grise, M.L. and Gallupe, R.B. (1999), ‘‘Information overload in face-to-face electronic meetings:
an integrative complexity approach’’, Journal of Management Information Systems, Vol. 16, pp. 157-85.
Hendriks, P.H.J. and Vriens, D.J. (1999), ‘‘Knowledge-based systems and knowledge management:
friends or foes?’’, Information and Management, Vol. 35 No. 2, pp. 113-25.

Holtham, C. and Courtney, N. (1998), ‘‘The executive learning ladder: a knowledge creation process
grounded in the strategic information systems domain’’, Proceedings of the 4th Americas Conference on
Information Systems, Association for Information Systems, Baltimore, MD, pp. 594-7.
Holtshouse, D.K. (1998), ‘‘Knowledge research issues’’, California Management Review, Vol. 40 No. 3,
pp. 277-80.
Junnarkar, B. and Brown, C.V. (1997), ‘‘Re-assessing the enabling role of information technology in KM’’,
Journal of Knowledge Management, Vol. 1 No. 2, pp. 142-8.
Krackhardt, D. and Hanson, J.R. (1993), ‘‘Informal networks – the company behind the chart’’, Harvard
Business Review, Vol. 71 No. 4, pp. 104-11.
Liebowitz, J. (1998), ‘‘Expert systems: an integral part of knowledge management’’, Kybernetes, Vol. 27
No. 2, pp. 170-5.
Offsey, S. (1997), ‘‘Knowledge management: linking people to knowledge for bottom line results’’,
Journal of Knowledge Management, Vol. 1 No. 2, pp. 113-22.
Pinsonneault, A., Barki, H., Gallupe, R.B. and Hoppen, N. (1999), ‘‘Electronic brainstorming: the illusion
of productivity’’, Information Systems Research, Vol. 10, pp. 110-33.
Shaw, D. (2003), ‘‘Evaluating electronic workshops through analysing the ‘brainstormed’ ideas’’, Journal
of the Operational Research Society, Vol. 54 No. 7, pp. 692-705.

Shaw, D., Ackermann, F. and Eden, C. (2003), ‘‘Sharing knowledge in group problem structuring’’,
Journal of the Operational Research Society, Vol. 54 No. 9, pp. 936-48.
Strapko, W. (1990), ‘‘‘Knowledge management’ – a fit with expert tools’’, Software Magazine, November,
pp. 63-6.

Zhou, A.Z. and Fink, D. (2003), ‘‘Knowledge management and intellectual capital: an empirical
examination of current practice in Australia’’, Knowledge Management Research & Practice, Vol. 1 No. 2,
pp. 86-94.
Zyngier, S. (2001), ‘‘The role of technology in knowledge management: trends in the Australian
corporate environment’’, in Burstein, F. and Linger, H. (Eds), Knowledge Management in Context,
Australian Scholarly Publishing, Melbourne, pp. 78-92.

Connected brains
Question and answer systems for knowledge sharing: concepts, implementation
and return on investment

Paul Iske and Willem Boersma

Abstract
Purpose – In this paper the aim is to describe the role that question-driven knowledge exchange
systems can play in the transfer of knowledge between people and to describe the conditions to be
fulfilled for successful implementation.
Design/methodology/approach – The conclusions in this paper are based on interpretation of results
of case studies. These are combined with literature research.
Findings – The major conclusion of the work is that question and answer (Q&A) systems are more
promising than traditional Yellow Pages systems. However, some challenges remain the same,
especially those related to motivating people to ask (the right) questions.
Research limitations/implications – The authors believe that further study would be helpful to better
understand the causal relationships between the success of a Q&A-driven knowledge system and the
context where they are applied. More case studies and a fundamental study of the types of knowledge
and organizations that could benefit from this approach would help people to make better decisions
when considering the implementation of a Q&A system.
Practical implications – The aim of this work is to help people make better decisions when they
consider the implementation of a system that connects people with a knowledge question to people with
the relevant knowledge. It helps them to understand whether such a system can add value at all and, if
so, how to increase the probability of success.
Originality/value – As far as is known, there has not been a study so far explicitly focusing on this type
of system and the comparison of the application of Q&A systems to ‘‘traditional’’ Yellow Pages. The
application of scenario-thinking to this field is also new.
Keywords Knowledge management, Culture (sociology), Return on investment
Paper type General review

Paul Iske is Chief Knowledge Officer at ABN AMRO Bank, Oostzanerdijk, The Netherlands. Willem Boersma is CEO at Integral Knowledge Utilization BV, Badhuisweg, The Netherlands.

1. Introduction and background


Today’s professionals are confronted with the ‘‘information-based, knowledge-driven, service-intensive economy’’ (Bartlett and Ghoshal, 2002):

A learning organization is an organization skilled at creating, acquiring, interpreting, transferring, and retaining knowledge, and at purposefully modifying its behaviour to reflect new knowledge and insights (Garvin, 2000).

If one asks someone about the most valuable asset of the organization, the answer is very
often: the people, the employees, the staff, etc. However, it seems that this crucial asset is
not always being used in the most effective way. We have done some research (Iske, 2004)
and asked a very simple question: ‘‘what percentage of your talent, ideas and experiences
do you use in your job (this is not exact science, just select the percentage that first came to
your mind)?’’ The average of the answers (number of respondents approximately 1,000) is
just below 60 percent, with typically a local maximum around 20 percent and the absolute

PAGE 126 JOURNAL OF KNOWLEDGE MANAGEMENT VOL. 9 NO. 1 2005, pp. 126-145, © Emerald Group Publishing Limited, ISSN 1367-3270, DOI 10.1108/13673270510583018
maximum around 70 percent (see Figure 1). Though it is clearly not possible to use the full 100 percent of someone’s intellectual capabilities (though some people in the age category 50+ indicated they were using 100 percent of their brain capacity!), it is equally clear that a higher return on investment in human capital is feasible! Another conclusion from this research is that there is little difference between people with different education levels, between men and women, or between various ages.
Without doubt, the evolution of information technology has sparked new interest and new developments in the area of knowledge management. Internet and intranet technology have enabled one-to-one, many-to-many, many-to-one and one-to-many communication. Business communities in particular nowadays have a wealth of tools at their disposal to communicate effectively and share information. A web portal, for instance, can play a crucial role in developing the intranet into what it is supposed to facilitate or make possible: personalized information, knowledge retrieval, and virtual collaboration.
We all know that knowledge is much more than information: it includes the experience, skills,
ideas and attitudes of people in a context where value can be created. As a consequence, knowledge management is about connecting people to people and people to content. The relationships between people in particular, based on the knowledge they want to share and develop, can benefit from a combined technical and more human-focused, anthropological approach. The other important factor is ‘‘context’’. Value
can only be created in a certain context and therefore it is of utmost importance to
understand the various contexts in the business environment and business processes.
Relevant knowledge can be identified only in this way and information overload can be
avoided. In a previous article we have explained that without context, there can be no value
of knowledge (Iske and Boekhoff, 2001).
To structure our discussion, we will use the framework of the so-called knowledge value
chain, which is given in Figure 2. The knowledge value chain covers the following
fundamental knowledge processes:
B knowledge analysis (starts with the mission, vision and strategy (MVS) of the organization:
what do we have to know, what do we know?);
B knowledge development;
B knowledge capture;
B knowledge transfer;
B knowledge application; and
B knowledge evaluation.

Figure 1 What percentage of your intellectual capacity do you use? (n = 930)

Figure 2 The knowledge value chain

The first part of the knowledge value chain is more strategic in nature whereas the second
part has more operational aspects. Various instruments support the processes in the value
chain, which is illustrated in Figure 2 (Weggeman, 1998).
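As a loose illustration (not from the paper), the chain can be sketched as an ordered pipeline. The stage names follow the text; the strategic/operational tagging is our reading of the statement that the first part of the chain is strategic and the second part operational:

```python
# Sketch: the knowledge value chain as an ordered pipeline of named stages.
# The phase tags are an assumption about where the strategic/operational
# split described in the text falls.

KNOWLEDGE_VALUE_CHAIN = [
    ("knowledge analysis",    "strategic"),    # starts from mission, vision and strategy
    ("knowledge development", "strategic"),
    ("knowledge capture",     "operational"),
    ("knowledge transfer",    "operational"),
    ("knowledge application", "operational"),
    ("knowledge evaluation",  "operational"),  # feeds back into analysis
]

def stages(phase):
    """Names of the stages belonging to one phase of the chain."""
    return [name for name, tag in KNOWLEDGE_VALUE_CHAIN if tag == phase]

print(stages("strategic"))  # -> ['knowledge analysis', 'knowledge development']
```

The point of the ordered structure is simply that each process feeds the next, with evaluation closing the loop back to analysis.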
Knowledge is someone’s ability to make decisions necessary to execute a specific task. This
ability can therefore be seen as the interaction between insights (from the past), information
(the present) and imagination (the future):
(1) Information (synonyms: explicit, encyclopedic or codified knowledge). This is knowledge that can be written down by its owner, or knowledge acquired from knowledge that was already put into symbols (language, drawings, schemes, etc.).
(2) Insights (synonyms: implicit or tacit knowledge) consists of:
B the collection of personal experiences as a basis for feelings, associations, fantasies
and intuition (definition of experience: knowledge acquired by observation and
practice);
B the repertoire of skills: manual skills, analytical skills, communicative skills etc.
(description of skill: dexterity, competence); and
B attitude: the position and behavior that comes from basic assumptions and values,
which is characteristic for someone in a specific situation.
(3) Imagination:
B the ability to visualise possible futures; and
B someone’s ability to generate ideas.
Therefore, we could use the following expression for knowledge: K = I × I × I, or K = I³ (Iske,
2004). It should be clear that knowledge emerges in the interaction between the three
factors: information, insights and imagination. So, it is also an interaction between external
(information) and internal (insights, imagination) factors.
To emphasize that managing knowledge is essential for achieving business and personal
goals, rather than a goal in itself, we propose to speak about knowledge-conscious
management: it is about managing the business, processes, customer relations, ambitions,

etc. in a way that makes good use of new and existing knowledge! Thus, it is clear that we
can talk about knowledge(-conscious) management on various levels (see Figure 3).
At each level (individual, team, organization, and network), different rules apply and different
processes and instruments need to be identified and implemented. As we will see in this
article, however, Q&A systems can add value at each level. This requires that such a system
facilitates the sharing, development and stimulation of the factors that make up knowledge,
i.e. insights, information, imagination.

2. What knowledge is relevant? Knowledge mapping


In almost all KM projects the creation of a so-called knowledge map is one of the key
activities. This map has to be developed by analyzing the knowledge that supports the
people in the organization so that the business processes and projects run efficiently and
effectively. These processes include the strategic business decision processes, ensuring
alignment with business strategy. The knowledge map is a set of knowledge domains for
each of which the following questions are asked:
B Is knowledge in the specific area of strategic importance for the business?
B If yes, who has/wants to know what?
B Where is the knowledge and how do we make it available?
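A knowledge map built from these three questions can be sketched as a simple data structure; this is a hypothetical illustration, with domain names and fields invented for the example:

```python
from dataclasses import dataclass, field

@dataclass
class KnowledgeDomain:
    name: str
    strategic: bool                               # of strategic importance?
    owners: list = field(default_factory=list)    # who has the knowledge
    seekers: list = field(default_factory=list)   # who wants to know what
    location: str = ""                            # where it is / how to reach it

def strategic_domains(knowledge_map):
    """Prioritization step: keep only domains of strategic importance."""
    return [d for d in knowledge_map if d.strategic]

# Invented example entries:
kmap = [
    KnowledgeDomain("bio-energy technology", True,
                    ["R&D team"], ["sales"], "project dossiers"),
    KnowledgeDomain("office relocation", False),
]
```

Filtering on the strategic flag corresponds to the prioritizing-by-strategy step described above.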
A knowledge map illustrates or ‘‘maps’’ how knowledge flows throughout an organization.
After the knowledge map has been constructed, prioritizing based on the strategy will be the
first step towards actual development and implementation of a knowledge management
process. For instance, one could implement a governance model in which subject matter
experts (knowledge owners) will be made responsible for development, capturing and
maintenance of the knowledge/information. The governance model should prevent the
knowledge database from becoming a ‘‘databasement’’: an environment where people dump
information without checking the relevance and without maintenance.
In Figure 4 a generic form of a knowledge map is presented. The crucial question in all cases
is: are we talking about stock or flow? This is due to the ‘‘dualistic’’ behavior of knowledge:
we tend to capture as much as we can, but most often, knowledge is being created by
people, transferred from one person to another, and directly connected to the context.
Without context, no knowledge! Here lies the principal justification for the
development of knowledge-information systems: systems that point to people and
environments where knowledge may be found and/or created!

Figure 3 Various levels of knowledge management

Figure 4 A generic knowledge map

3. Knowledge management systems


In the past 20 years, information technology has provided companies and individuals with
new possibilities to manage and share knowledge. The choice of a specific
IT implementation is situation-dependent, and several types of systems can be identified.
The development of a knowledge repository is an example of a tool that supports the ‘‘stock’’
approach. In this approach knowledge/information is being captured and made accessible
for re-use. However, it is almost impossible, time-consuming and/or unattractive to capture
and codify all knowledge.
A different type of tool is represented by yellow page applications. These are curriculum
vitae-oriented and highlight which person has what knowledge within the organization.
These tools are also not easy to maintain. In fact, people have to constantly review their data
in order to keep the system up to date.
A critical success factor for the implementation of such tools is strong management support.
Often these applications fail due to the fact that management does not motivate their
employees to use the tool. The answer to the question ‘‘what is in it for me?’’ has to be
provided by management. The tool needs to be marketed, a reward system is necessary
and participation of the employees has to be won.
A third group of applications consists of e-mail-based tools. These seem to work very well in
practice, at least in dedicated communities where people share the same interests. One major
feature of these applications is that they are reactive. Whereas the first two types fail to
show people who exactly they are helping and with what, e-mail-based tools, by asking and
answering a question, allow experts to know that they helped someone. They also provide
feedback on what information was needed to be helpful. Moreover, they are more natural to
use, reactive, and relatively easy to embed in the organization. These applications come
close to the normal way of working: asking your colleagues for help if a problem arises.

4. From ‘‘traditional’’ yellow pages to question and answer-driven applications

Combined people-based and content-based KM environment


In an on-line community environment, people create valuable content by aggregating
knowledge. Furthermore, capturing, recording, analyzing and interpreting both the behavior


and the knowledge exchanged create value. Figure 5 shows these processes (Hagel and
Armstrong, 1997), which are also used to describe the dynamics of on-line marketplaces.
In recent years a large number of yellow page systems have been implemented. The authors
started doing so in the early days of the web, as is illustrated in a screenshot of the starting
page of a system they implemented for Shell Research in 1995/1996 (Figure 6).
The goal was to present the organization, the various teams and departments and the
people working in these teams. Users could browse or search through the system and
everyone was responsible for updating their own pages. Based on this model several other
implementations followed, among which a system aimed at creating a network of companies
in The Netherlands in the area of bio-energy. This system started in 1997 and was active for
more than five years. However, many of these yellow pages implementations suffer from
decreasing use and poor maintenance within a period of one to two years. Reasons for this
are:
B the systems are not integrated in the (primary) business processes. It is not enough to
state that it is important that people know how to find experts;
B there is no connection between the context of entering information and the context of
using information. The question ‘‘who am I helping with my information’’ is not answered;
B the information providers need to be pro-active in keeping the system up-to-date.
However, there is no direct reward for this effort;

Figure 5 Communities and value creation in the knowledge value chain

Figure 6 Screen shot of the starting page of Shell research yellow pages in 1995(!)

B a lack of communication about the system, poor understanding of the potential value of
the system by end-users as well as management, and no management support for the
system; and
B an unattractive user-interface or, in the worst case, a poor technological implementation.
From these observations, it becomes clear that developing a sustainable yellow-pages
solution requires a well-thought-through strategy and a lot of stamina. An example of a
corporate yellow pages system that has taken a number of years to develop into a tool that is
more or less institutionalized within the organization is briefly described in the Philips case
study below (Case 1).
Recently, as an alternative for yellow-page systems, question and answer systems have
become increasingly popular. Question and answer systems are systems where users
interact with each other by asking questions and providing answers. The questions and
answers are stored in a categorized way and are easily retrievable for further use. This
means that after a period of intensive use a very valuable knowledge database will have
been built and that when people ask questions a properly designed Q&A system will first
search through the already available answers before sending the question to subject
experts. A similarity with yellow pages is that persons do need to register themselves as
expert on a subject in order to receive questions. An important difference, however, is that
they only need to fill in a very limited profile and that their profile builds itself over time by
means of the questions they have answered. Moreover, when they answer a question they
can be sure that there is a real need for this answer by a real person.
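The answer-first behaviour and the self-building expert profiles described above can be sketched as follows; this is a simplified, hypothetical illustration, since a real Q&A system would use full-text search and a proper taxonomy rather than exact matching:

```python
class QASystem:
    def __init__(self):
        self.answers = {}   # question text -> stored answer
        self.experts = {}   # expert name -> set of topics (the profile)

    def register(self, expert, topics=()):
        # Only a very limited profile is needed up front.
        self.experts[expert] = set(topics)

    def ask(self, question, topic):
        # 1. First search the already available answers.
        if question in self.answers:
            return ("stored", self.answers[question])
        # 2. Otherwise route the question to experts on the topic.
        recipients = [e for e, t in self.experts.items() if topic in t]
        return ("routed", recipients)

    def answer(self, expert, question, topic, text):
        # Store the Q&A pair for re-use and let the profile grow
        # from the questions the expert has answered.
        self.answers[question] = text
        self.experts.setdefault(expert, set()).add(topic)
```

After a period of use, repeated questions are served from the stored answers without bothering the experts again.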

While the Q&A game is the basis of a Q&A system, it might be extended at will by adding e.g.
best practices, frequently asked questions, and valuable resources. Extensive personal and

Case 1: yellow pages in Philips (Iske, 2002)
Philips has developed a Yellow pages application that enables people to locate colleagues with the
right experience. It is built on the principle that you can find any information and any experience with
the help of your colleagues. It allows for sharing pockets of knowledge (contributions) in the form of
best practices and links. For those who are not able to find information directly, there is a possibility
to pose open questions. Other members can answer open questions. Members are able to
subscribe to all open questions from various categories. On average each question receives three
answers. Surveys show that people experience good added value.
The application is highly personalized (personal homepage) and information is built according to
people’s needs. All contributions are linked to a two-level subject taxonomy, which resembles the
main working areas and their subsequent categories within Philips. Members can
forward, rate and comment on any contribution in the system. The advanced search option enables
members to select any combination of the subject taxonomy as well as geographical and business
structure options (‘‘I am looking for a purchaser in Taiwan with SAP experience’’). The system has
been in use since 1999, has over 26,000 members and receives 5,000 hits per week. Membership is
voluntary. It is part of the intranet infrastructure within Philips.
Some observations and lessons learned:
† people use the system more for finding people than for asking questions;
† language might be a barrier: some people are reluctant to publish a question in poor English;

† because of wide exposure, there is no abuse of the system;


† presence of executive management legitimates participation;
† seamless integration in intranet infrastructure stimulates usage;
† people should always get answers – for this Philips started with gatekeepers who followed up
on unanswered questions; and
† maintenance is key – there is one full time employee working on the administration/maintenance.

system administration modules (e.g. for limiting the number of questions per expert or adding
business rules for questions on a specific topic), reporting and e-mail integration are
additional features that are available in these types of systems.
So how and why can these Q&A systems do a better job than the yellow-page systems? The
answer is fourfold:
(1) They are easier to integrate into the (business) processes. An answer needs to be
obtained only when a specific problem arises in normal working practice; answers
need not be provided beforehand and out of context.
(2) There is a direct relation between effort and result: by simply answering questions, users
provide their knowledge, which is not only used immediately, but also stored for possible
future use. Moreover, since the questions are originating directly from the context where
the knowledge will be used and since the answer receives direct feedback from the
knowledge seeker, the context is specified for future use as well.
(3) They can learn from implementation mistakes made in yellow page projects regarding
communication and management support.
(4) They can benefit from an improved, interactive user-interface and more mature
technology. An easier direct integration with e-mail and instant messaging applications
also helps. People will receive, for instance, notifications of questions and answers via
e-mails with hyperlinks to the web-application.

5. Implementation challenges for knowledge management systems


As mentioned before, there are several general issues related to knowledge management.
Most notably, each knowledge management program involves a change management
effort. A successful change management project can be divided into seven steps: making

contact, creating awareness, building understanding, testing, gaining acceptance,
institutionalization and, finally, internalization. The change curve is illustrated
in Figure 7.
From many case studies it has become clear that without appropriate effort spent in each
step, the following level is difficult to achieve in a sustainable way. Many (large) knowledge
management projects focus on institutionalizing, which usually means organization-wide
implementation of new tools and/or processes. However, for a sustainable, value-adding
change, this approach is usually insufficient. Careful building of awareness, strategic testing
and ensuring acceptance are key success factors in any change project.
From the study of successful and failed knowledge management and change projects a
list of success factors can be extracted. Each of these enablers/barriers can be related to
at least one of the three key areas: culture, process/organization, and technical
infrastructure:
B Culture. There must be a culture that does not discourage knowledge sharing.
B Leadership. The leaders have to be leading and give commitment to knowledge
management to make it a success.
B Reward system. To make a success of knowledge management, the incentives have to be
in line with the behavior wanted.
B Information and communication technology. Information and communication technology
enables the quick finding and using of information and also enables communication with
and searching for people.
B Shared language. To make it possible to share knowledge, speaking the same language
and using the same meaning to a word is very important.
B Clear information need. The need for information must be made explicit, so that it is
known which information must be stored and made available; this makes knowledge
management workable.
B Performance measurement. To make knowledge management a success, performance
must be measured, both to get a clear perspective on whether it is effective and to
motivate people.
B Resource availability. There must be resources available to execute all the work needed to
make knowledge management a success.
B Clear processes. It must be clear which processes are part of knowledge
management and how these processes are defined; this helps to make explicit what
must be done to use knowledge better.

Figure 7 The change curve

B Direction. It must be clear in what direction the company is heading, so the employees
can act accordingly.

6. Specific issues for Q&A-driven knowledge applications


It turns out that motivating people to ask questions in a Q&A system presents a bigger
challenge than making people answer them. In most cases people are willing to answer
questions as long as they are relevant and the answering process is simple and not
time-consuming. However, it takes a lot of communication, integration into the (business)
processes, targeted incentives, demonstrated results and patience before asking
questions via a Q&A system becomes second nature.
It has been demonstrated that rewarding both the knowledge seeker and provider (e.g. the
‘‘Q&A of the month’’) stimulates others to use the system because it acknowledges both the
active behavior of the seeker and the willingness and expertise of the expert.
In general, Q&A systems tend to suffer from either a very limited functionality or a poor user
experience. When the functionality is limited, the system provides hardly any added value
over using e-mail; when the user experience is too complex, less computer-savvy users
in particular tend to stop using the system after the first try. Therefore, a balance has to
be found between functionality and user interface, with a lot of attention paid to small
details. Only when these are well taken care of does the system have a chance to
succeed.
Nothing is more daunting than asking a question in an empty, not yet very active system.
Therefore an initial fill of the system needs to be created by a small group of active,
committed professionals, who provide an initial load of frequently asked questions, create a
sense of activity and make sure questions from new users are promptly answered. An active
moderator/administrator is also needed here.
A brief survey was performed with several companies and experts in The Netherlands on
what they considered implementation issues for Q&A systems. The main concern of the
interviewed individuals is that answering the questions might cost time for the experts.
They are worried that the experts do not want to answer lots of questions, because they
have other work to do. What these companies and experts do not see is that the questions
are being asked anyway, and if the right expert is found in less time it will save time
somewhere else. Of course, it has to be explained to the experts within a company what is in it
for them, and they need to be made aware of the benefits for the company as a whole when
using this system.
Another concern is the need for face-to-face contact. For particular questions personal
contact is required and the interviewed companies think that such a system will decrease the
personal contact. However, in the case of a Q&A system the one who asks the question will
know who answered it and if he thinks that he needs to have a meeting with that expert to
further elaborate on this subject, it can be arranged. The fact is, however, that face-to-face
contact is not needed for every question or problem, so a decrease in personal contact will
indeed be one of the consequences of introducing a Q&A system (an end to the
‘‘meeting culture’’).
In the case below we describe an implementation of a Q&A system within a network of
organizations in the Dutch healthcare sector. Here it is extremely important that the user
interface is simple, because of the strong differences in IT-literacy in the network.
Furthermore, since the system has been implemented on the internet, measures had to be
taken so that not everyone could ask medical questions to the experts, who often are
medical specialists. There are a number of examples where people met each other after they
made contact through the system. Here too, the ongoing challenge is to stimulate people to
ask questions, more than experts to answer them (Case 2).

7. Building the business case


For various reasons one might want to make an assessment of the impact the Q&A system
has achieved. One reason could be that a business case has been produced that needs to

Case 2: good healthcare innovation practice (Iske et al., 2003)
A unique initiative started in The Netherlands in the area of local healthcare innovation projects
during the second half of 2001.
All over the country healthcare innovation projects are carried out, in various locations including
hospitals, GP practices, pharmacies etc. In most cases the Ministry of Health, other
semi-governmental institutions or healthcare insurance companies, finance these projects. Most
often these projects are very successful, but the knowledge and expertise associated with such a
project can hardly ever be made available to others in similar circumstances.
By the end of 2001 this problem had been recognized by many parties who are committed to the
area of healthcare innovation. These parties collaborated for over a year and this has resulted in a
guideline for healthcare innovation initiatives: good healthcare innovation practice (GHIP).
This guideline will ensure that all stakeholders and all process steps (generation, selection,
realization, evaluation and dissemination) of a healthcare innovation project will be addressed,
resulting in a separate GHIP dossier which will make knowledge easily exchangeable and
interpretable.
Meanwhile the Dutch Healthcare Insurance Council has formalized this guideline. The GHIP
Knowledge and Coordination Centre (GHIP KCC) will play a key role in the implementation of the
guideline as a tool to benefit others in similar circumstances. The GHIP KCC offers a ‘‘click and
brick’’ infrastructure for knowledge sharing (see www.ghip.nl).

How does it work in daily life? Initially, a database with project dossiers is being created. However,
from the very beginning it has become clear that much more knowledge about the projects and
about the innovation process itself will remain in the heads of the people involved. Therefore, the
GHIP expert panel has been created (see Figure 8): via this Q&A-driven knowledge system people
can find knowledge and ask questions about previous innovation projects and about the GHIP
process itself. The system has been embedded in the process in the following ways:
(1) Generation step – people must demonstrate that they have used the system to support their
project proposal. In particular, it must be clear that there will not be any reinvention of the
wheel.
(2) Evaluation step – evaluators of a project will be selected from the GHIP expert panel only.

(3) Dissemination step – in order to fulfill the requirements for knowledge dissemination, for each
project at least one person involved has to sign up as expert in the GHIP expert panel, so that
information and knowledge that is not in the GHIP dossier still will be accessible.
This integration of a Q&A system in the (primary) process is a good example of
knowledge-conscious management and this holistic approach clearly demonstrates the added
value that can be delivered.

be validated at a certain moment in time (the so-called ‘‘ROI’’ – return on investment


calculation). Furthermore, insight into the results could also be used to motivate users of the
system, both people with questions and the experts.
There are various sources of value creation by a Q&A system (see Figure 9), as explained
in various frameworks (Sveiby, 1997; Edvinsson and Malone, 1997; Kaplan and
Norton, 1996; Lev, 2002):
B Financial: the use of a Q&A system results in direct cost savings or increase in revenues.
B Innovation: the system is an innovative way to support communication and knowledge
exchange and can play a major role in stimulating innovation within the company by
enabling discussion on solutions to all kind of problems.
B Processes: the system can be used to improve current business processes, in particular
those for communication and information exchange.
B Customer: the system can be used to give customers improved and direct access to the
knowledge in the organization. Furthermore, by being able to respond efficiently and
effectively to customer needs, questions and problems, a contribution will be made to
improving the customer experience. Hence customer capital is being created.

Figure 8 Screen shot of the home page of the GHIP expert panel (2003)

Figure 9 Five areas of value creation

B Human: by participating as an expert, the contributors to the system will gain more
exposure and recognition, which will lead to improved employee satisfaction.
Furthermore, when employees have a convenient, ‘‘human’’-type of communication tool
at their disposal they will enjoy the working environment more. If this contributes directly or
indirectly to the retention rate of employees the contribution to the ‘‘human capital’’ could
already justify the investments.

Many measuring methods are based on traditional (financial) performance
measurements, such as increased profits and reduced costs. However, for a
knowledge management project these are hard to quantify. Advantages like employee
satisfaction, customer satisfaction and increased speed to market are hard to measure as
well. Therefore, management needs concrete examples in which they can see direct
benefits of this approach in their own situation. Here one has to realize that knowledge is
context-dependent. In convincing management and employees why knowledge and
managing knowledge is important and can help in building a successful business, it is
necessary to create a certain context in which it is clear what the value of the knowledge and
of knowledge management can be to that organization.

Direct quantitative measures


Measuring direct business impact is the most powerful way to demonstrate the added value
of knowledge (management). To directly assess the added value requires an intimate
relation between the knowledge processes and the primary business processes. As an
example, one could consider marketing and sales: the development of a best-practice
proposal (including commercial texts, product descriptions, a pricing model, etc.) will
lead to a reduced time to produce proposals (which can be measured quantitatively) and
an increased hit-rate (likewise measurable). Direct financial results could be
achieved by exchanging knowledge in the area of suppliers: increase of buying power and
reuse of knowledge from consultants are examples of results that lead to directly measurable
financial benefits.
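As a toy calculation of such a direct quantitative measure; all numbers below are invented assumptions, not figures from the article:

```python
def proposal_benefit(proposals_per_year, hours_saved_per_proposal,
                     hourly_rate, old_hit_rate, new_hit_rate,
                     avg_deal_value):
    """Annual benefit of a best-practice proposal template:
    time saved on writing plus revenue from an improved hit-rate."""
    time_savings = proposals_per_year * hours_saved_per_proposal * hourly_rate
    extra_revenue = (proposals_per_year
                     * (new_hit_rate - old_hit_rate) * avg_deal_value)
    return time_savings + extra_revenue

# Hypothetical numbers: 100 proposals/year, 8 hours saved each at 100/hour,
# hit-rate up from 20% to 25%, average deal worth 50,000.
benefit = proposal_benefit(100, 8, 100, 0.20, 0.25, 50_000)
```

Both terms are directly measurable in the primary sales process, which is exactly what makes this a direct quantitative measure.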

Direct qualitative measures


With these measures one tries to describe the (potential) benefits of a (KM) intervention,
without being able to quantify the total added value. The benefits follow from stories from the
business, which might be quantifiable. However, it is usually difficult to predict up-front what
the situations will be in which the benefits are realized. This is the area of scenario-thinking
that will be discussed in the next section.

Indirect quantitative measures


These measures are used to obtain insight into the maturity, quality and effectiveness of the
knowledge management tools, processes and culture. Indirect quantitative measures
include user statistics of databases, intranets, number of questions being asked in expert
systems, number of documents in the knowledge repository, number of people who have
attended a course, number of workshops on a certain subject, etc. For an example of
reporting from a Q&A system, see Figure 10.
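Such indirect quantitative reporting can be derived directly from the system's event log; a minimal sketch, where the event format is a hypothetical assumption:

```python
from collections import Counter

def usage_report(events):
    """Aggregate raw Q&A log events into the kind of counts
    shown in a periodic usage report."""
    counts = Counter(e["type"] for e in events)
    asked = counts["question"]
    answered = counts["answer"]
    return {
        "questions_asked": asked,
        "answers_given": answered,
        "answer_ratio": answered / asked if asked else 0.0,
    }

# Invented log fragment: two questions, three answers.
log = [
    {"type": "question"}, {"type": "question"},
    {"type": "answer"}, {"type": "answer"}, {"type": "answer"},
]
report = usage_report(log)
```

The answer ratio is one of the indirect indicators mentioned above: it says something about activity and responsiveness, not directly about business value.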

Indirect qualitative measures


These measures give insight into the way the knowledge management efforts are being
perceived. By collecting feedback one can measure user satisfaction; this gives insight into the
reputation of the KM infrastructure. The feedback can be gathered through questionnaires,
testimonials in corporate magazines, during department meetings, in appraisal discussions
or in conversations with clients or via client-feedback.

8. An analytical approach for assessing the value of knowledge (management)


For discussion purposes, we propose a formula (Iske and Boekhoff, 2001) (see Equation (1))
that captures the essential features of the knowledge value chain. At the end, it enables us to
categorize and prioritize properties of organizations, knowledge management activities and
even the value of it all:
V_P(K(V)) = Σ_G p(K(V), G) · r(K(V), G) · a(K(V), G) · V(K, G).   (1)

Equation (1) reads: the total potential value V P of the knowledge K that is stored in
environment V equals the sum over all contexts G of the probability p that this knowledge is
related to the context G multiplied by the connectivity r that indicates how easy it is to

Figure 10 Indirect, quantitative reporting in a Q&A system

transport the knowledge from the environment V to the context G, multiplied by the activation
coefficient a that indicates how easy it is to activate the knowledge (to use it) in the context G,
multiplied by the added value V that is achieved within the context G. We can describe the
elements in the first formula that need to be discussed in more detail as:
B Knowledge relevance indicator p – the level to which knowledge is considered relevant
for the business (processes) and the level to which business issues lead to new
knowledge. The relevance indicator is influenced by a number of factors, e.g. creativity
(can you think of other applications for this knowledge?) and process/system thinking (do
we understand where and how this knowledge can be used in a certain process?). This
factor plays a key role in the valuation of intellectual property: a patent is
only worthwhile if one knows in what product development process the knowledge will be
usable.
B Knowledge connectivity factor r – the level to which it is possible to transport knowledge
from an environment (source) to the context (work situation, business process). Factors
that influence this parameter include: geographic distance, language barriers, cultural
barriers, ICT tools.
B Knowledge activation factor a – the level to which it is possible to activate knowledge (to
use it) in a specific context. Sometimes knowledge is not completely ‘‘finished’’ yet and an


extra effort has to be undertaken to make it useable. This is one of the key challenges in
knowledge transfer between academia and the industry.
B Added value V – the level to which knowledge has added value (is useful) in a specific
context. Here we could think of things like time saving because of the availability of
templates, best practices, but also increased revenues because of specific client
knowledge.
The total potential value of all knowledge in the organization could now be expressed as:
V_P(K) = Σ_V V_P(K(V)).   (2)

Here we take the sum over all possible environments V in the company. Note
that it is clear from the first equation that if one of the factors is zero, there is no value added,
irrespective of the values of the other parameters. Quantitative insights into the
environmental parameters that determine the value of the factors in the equation will help
to optimize the return on investment of knowledge-related projects. In general, one should
focus on the smallest parameter (the weakest link) to achieve optimal improvement.
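Equation (1) and the weakest-link observation can be made concrete in a few lines; this is a sketch with invented factor values, where p, r, a and V follow the definitions above:

```python
def potential_value(factors_per_context):
    """V_P(K(V)) = sum over contexts G of p * r * a * V (equation (1)).
    If any factor is zero for a context, that context contributes nothing."""
    return sum(p * r * a * v for (p, r, a, v) in factors_per_context)

# Two hypothetical contexts: in the second, connectivity r is zero,
# so its (otherwise high) value V does not materialize at all.
contexts = [
    (0.8, 0.5, 0.9, 1000),   # (p, r, a, V)
    (0.9, 0.0, 1.0, 5000),
]
vp = potential_value(contexts)
```

The second context illustrates the weakest-link point: improving the smallest factor (here, connectivity) yields the largest gain in potential value.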
This equation helps us directly to understand the value of a Q&A system:
B Knowledge relevance indicator p – when a question is being asked, the context becomes
immediately clear.
B Knowledge connectivity factor r – the system facilitates the transfer of knowledge from
the source (expert) to the context (question-asker).
B Knowledge activation factor a – when somebody asks a question and receives an answer
he/she is quite likely to use it.
B Added value V – the added value should be clear in the context in which the knowledge is
being used. In fact, it can only be defined in the application environment. In some Q&A
systems people can rate the quality of the answer and can also indicate what the value has
been to them, e.g. expressed in money/time saved or (extra) revenues generated.
So, the Q&A system is helping to create value because all factors in the knowledge-value
equation (1) are being addressed!

9. Scenario approach
As was concluded before, the value of knowledge only has a meaning within the context in
which it is being used. We will now apply this insight using the scenario (or case-study)
approach (Denning, 2001). Convincing people with the help of recognizable, plausible cases
(scenarios) is an approach that comes naturally to human reasoning and thinking (Bergmann, 1998).
This approach is based on the similarity of problems and missed opportunities. Problems
are being solved by using experiences from others (Bergmann, 1998). The idea is to have
people recognize themselves in a story or case that is being presented and then show them
how to deal with the problem. Case studies and storytelling will allow them to see the
organization from another perspective and then take decisions and change behavior in
accordance with these new perceptions, insights and identities (Denning, 2001).
The scenarios that are being used are mainly about general problems that may occur within
a certain context. This context is important since it gives meaning and depth to the
information (Reamy, 2002). The main idea with scenarios is to create a context that everyone
will recognize, and in this way show the possible value of (management of) knowledge, in
this case the implementation of a Q&A system.

The subjects of the scenarios can be found through desk research, but also through
interviews with process and topic experts. A scenario is a description of a diagnostic
situation and is divided into at least three parts, namely the description of the symptoms, the
description of the failures and their cause and finally the description of the repair strategy
(Bergmann, 1998).
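Bergmann's three-part scenario structure can be captured as a simple record type. This is a sketch with field names of our own choosing, derived from the description above, not a schema from an actual system.

```python
from dataclasses import dataclass

@dataclass
class Scenario:
    """A diagnostic scenario in Bergmann's three-part structure."""
    symptoms: str             # description of the observable symptoms
    failures_and_causes: str  # description of the failures and their cause
    repair_strategy: str      # description of the repair strategy
    context: str = ""         # the setting that gives the scenario meaning

# Hypothetical instance for the knowledge leakage scenario discussed below
leakage = Scenario(
    symptoms="Experts leave and projects slow down",
    failures_and_causes="Undocumented knowledge is lost with departing staff",
    repair_strategy="Capture expert knowledge as documented Q&A exchanges",
    context="Knowledge leakage due to staff turnover",
)
print(leakage.repair_strategy)
```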
The scenario approach can be used for management, but also for employees. If the
management recognizes itself in the problems and the missed opportunities mentioned
in the cases, so should the employees. Thus, before implementing the system, the
management has to discuss the necessity for the system with the employees, using
exactly the same scenarios.
As mentioned above, a scenario approach can be used to highlight and quantify the benefits
of a Q&A system. In a recent study (Iske et al., 2002) some scenarios were tested and an
Excel model was created to estimate the quantitative benefits. The scenarios include:
B knowledge networking;
B missed business opportunities;
B customer satisfaction;
B inefficient research;
B redundant work;
B knowledge leakage; and
B direct customer support.
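The study's Excel model is not reproduced here, but the kind of back-of-the-envelope estimate such a model makes can be sketched for one of these scenarios, redundant work. All figures and the formula below are hypothetical illustrations, not results from Iske et al. (2002).

```python
# Hypothetical estimate of annual savings from avoided redundant
# questions -- illustrative only, not the actual Excel model from the
# study cited in the text.

def redundant_work_savings(questions_per_year: int,
                           redundancy_rate: float,
                           hours_per_answer: float,
                           hourly_cost: float) -> float:
    """Savings when repeated questions are answered once and then reused."""
    avoided_answers = questions_per_year * redundancy_rate
    return avoided_answers * hours_per_answer * hourly_cost

# Hypothetical inputs: 10,000 questions/year, 30 percent repeats,
# 2 hours per expert answer, 80 per hour fully loaded cost.
print(redundant_work_savings(10_000, 0.3, 2.0, 80.0))  # 480000.0
```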
Tables I and II describe two scenarios in more detail. These are knowledge leakage (due to
turn-over of staff) (Table I) and direct customer support (Table II).

10. Cultural impact


Just as different societies have their own culture, so do organizations. Naturally the
individuals within the organization will have different personalities, but they do have
something in common, which is called the corporate culture (Morgan, 1997). This corporate
culture will have a strong influence on knowledge management and Q&A systems.
Corporate culture can make or break any system, including a Q&A system. According to
Hofstede (1997), culture has five dimensions, namely power distance, uncertainty
avoidance, individualism versus collectivism, masculinity versus femininity and
time-orientation (Child and Faulkner, 1998). The most important dimension in this respect
is the collective versus the individual one. If the individual perspective rules, the main idea
will be that knowledge is power. There will be a lot of competition between the employees;
they will constantly want to distinguish themselves from others and will act only out of
self-interest and satisfaction. They do not see each other as members of a collective
entity, the organization. In such an organization it will take much more time to
implement a system based on knowledge sharing, because the employees simply will not
see the benefits.
On the other hand, if the collective perspective rules, the employees will already be used to
sharing knowledge and the whole atmosphere will be more friendly and cooperative. In such
organizations it will not be a problem to convince the employees of the benefits of a
knowledge-sharing system. However, there is one change that they might be worried about:
the decrease in personal, face-to-face contacts, as described in the previous section.
In most of these organizations there will be an informal working environment and people will
value personal contacts. Therefore management has to pay attention to the impact of such a
system on these contacts.
Then there are also organizational cultures that are very resistant to change. This happens
mostly in older organizations with employees who have worked for that organization for
most of their lives. They are used to set ways of working and will not change easily.

Table I The knowledge leakage scenario

Description: This case occurs when a firm loses an expert who leaves and takes with him his accumulated knowledge, acquired through his work and previous education and experience. All his undocumented knowledge is therefore lost and the firm must live with the consequences.

Impact on organization: The organization that suffers this situation may face grave consequences. The loss of the knowledge may slow down some projects or activities, but could also have much more severe consequences. The lost knowledge may cause the firm to have to forfeit some business opportunities or even lose part of its competitive advantage.

Q&A system impact: With a Q&A system and some time, the knowledge of every expert will be documented through the questions asked of this expert. The most important subjects will be covered intensively, while the more obscure topics may be covered more unevenly.

Q&A system benefits:
B The Q&A system allows the firm to document and therefore retain the knowledge of all its experts
B The Q&A system allows the firm's employees to be more productive, as they do not have to answer questions more than once
B The Q&A system allows the firm to build a knowledge base that can only grow and will not suffer setbacks with every staff departure

Key quantitative measures: Cost associated with hiring a new expert to replace a lost one. Cost associated with training employees to recapture lost knowledge. Value of lost business opportunities due to loss of expertise. Quantity of questions documented.

Sample quantitative impacts: In an organization, people older than 55 are offered a good package when they voluntarily leave the organization. One person had developed a computer program for flow modeling; he was hired back as a consultant because nobody knew how to use and change the program. When a specific problem occurs in a high-profile project, the remark ‘‘he/she would have known the answer’’ is avoided, since the problem/answer can be found in the system.

Typical questions asked: Any question related to an expertise or knowledge possessed by a single person or a limited number of persons, for example:
B Who knows about the dos and don’ts in implementing just-in-time manufacturing?
B What are the pitfalls with international branding?
B What approach works best for implementing SAP in an entertainment business such as a circus?

Key assumptions: The questions are documented and the system becomes part of the way of doing things.

Typical firms which benefit from this scenario: Consulting firms, services firms, manufacturing firms, organizations with temporary staff (including consultants). Any firm where individuals are singled out with specific valuable knowledge.

Management has to invest a lot of time in training these employees and in convincing them of
the benefits of the change.
A positive aspect of a Q&A system is that the less social employees will get the
opportunity to answer questions. Previously nobody would ask them a question,
because they did not know the person or did not know that the person had the expertise.
When questions are asked via the computer, new, unknown experts can be discovered.
Finally, it is necessary to give a warning about rewarding in relation to corporate culture.
When an expert answers a question, in some systems the answer will be graded, and this
will lead to a ranking of experts, which can be of help when the company wants to reward
the experts. Whether this ranking and rewarding system will turn out to have positive
effects, in the sense that the employees will answer the questions as well as possible, or
negative effects, in terms of competitive and jealous behavior, also depends on the
culture. If the culture is a more informal and friendly one, the competition will also be
friendly and they will be happy for the one that is the best expert in that year, but if the

Table II The customer support scenario

Description: This case is about providing a Q&A system to customers and allowing them to use it like internal users of the organization. Many firms today have web sites with a frequently asked questions section and also make it possible for clients to e-mail the firm for more information. Providing them with a Q&A system would push the support one step further and would definitely strengthen the relationship with the client. There are three possible levels of integration with the customers identified at this time. The first one would have customers linked only to internal users, and only they would provide information to customers. The second level would provide customers with all questions and answers in the system, including those questions submitted by other clients and possibly competitors. Finally, the last level would have no restrictions and would allow two customers to interact with each other.

Impact on organization: The first impact will be the increased satisfaction of the employees, as redundant questions should be limited. Also, the customers should have access to more information faster. The possibility of learning from other clients may spark ideas for new business opportunities, with the organization linking them. The cost associated with supporting clients should also be reduced. Finally, all the benefits of improving customer satisfaction should kick in.

Q&A system impact: The Q&A system will force the organization to be more transparent if clients have access to its knowledge management system. Some questions may have to be excluded to avoid problems: for example, pricing strategies or sales strategies should probably not be discussed in a Q&A system if clients have access to the information.

Q&A system benefits:
B The Q&A system will reduce the customer support costs
B The Q&A system will increase customer satisfaction (and generate all the benefits associated with it)
B The Q&A system will provide valuable knowledge about the customers, their desires and needs
B The Q&A system will increase the transparency of the firm

Key quantitative measures: Cost of managing the client relationship (customer support staff per customer). All the measures of a customer satisfaction model.

Sample quantitative impacts: The support center reduced its annual cost by $250,000. The customer retention rate increased by 10 percent, generating additional revenues of $100,000.

Typical questions asked:
B Which Asian country is the best to set up a sales office in, from a banking perspective?
B Can I use this device when I switch to Windows XP?

Key assumptions: Customers will use the application and not just pick up the phone.

Typical firms which benefit from this scenario: Customer-oriented firms, service firms, firms with few large clients, software firms, firms selling knowledge-intensive products or services.

culture is a more individual-based one, it might happen that the experts thwart each
other.
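The rating-and-ranking mechanism just described can be sketched as follows. This is a generic illustration of aggregating answer grades into an expert ranking; the data and the simple mean-grade scoring rule are our own assumptions, not taken from any particular Q&A system.

```python
from collections import defaultdict

# Sketch: aggregate answer ratings per expert and rank experts by
# average grade. Hypothetical data; real systems may also weight by
# question difficulty, recency, or number of answers given.

def rank_experts(ratings: list[tuple[str, float]]) -> list[tuple[str, float]]:
    """ratings: (expert, grade) pairs. Returns experts sorted by mean grade."""
    totals = defaultdict(lambda: [0.0, 0])  # expert -> [sum of grades, count]
    for expert, grade in ratings:
        totals[expert][0] += grade
        totals[expert][1] += 1
    averages = {e: s / n for e, (s, n) in totals.items()}
    return sorted(averages.items(), key=lambda kv: kv[1], reverse=True)

ratings = [("Ann", 4.5), ("Ben", 3.0), ("Ann", 5.0), ("Ben", 4.0)]
print(rank_experts(ratings))  # [('Ann', 4.75), ('Ben', 3.5)]
```

Whether publishing such a ranking motivates or demotivates depends, as the text argues, on whether the culture is collective or individual.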

11. Conclusions
The development and implementation of a Q&A system is a complex project, where attention
should be paid to all change management aspects. Of course, there is the IT part of the
project: tools have to be selected and/or developed that work seamlessly together and that
have to be integrated into the overall IT architecture and infrastructure of the organization.
Quite often, this also requires enhancement of the current infrastructure and the
development of a support organization dealing with infrastructure issues, including
connectivity and authorization.
Then, as argued above, the right processes and organizational structure need to be in
place. These range from knowledge mapping to a consistent approach for facilitating and
supporting communities: all of the knowledge management processes need to be clear and
accepted in order to be able to have them reflected in the Q&A system architecture.


Finally, and probably most importantly, attention needs to be paid to the behavioral aspects
and the culture in the organization. When a new business tool such as a Q&A system is
introduced, changes in the work processes are often anticipated or required. The more one
can integrate the on-line environment into the natural way of working, the higher the
probability that the project will be successful. It is therefore mandatory that a thorough
study is made of the implementation environment and the anticipated users. From
experience we know that ‘‘they come by foot and leave by horse!’’
Based on these arguments it is clear that the development and implementation of a (knowledge)
Q&A system is a business project rather than an IT project. The project management needs to
find the right balance between the hard (IT) issues and the soft ones. Communication and
change management skills are indispensable to be able to develop and implement a
knowledge Q&A system that will be a value-adding tool for the users in the organization.
Q&A systems can offer huge opportunities for creating value by developing, sharing and
applying knowledge. Q&A systems can offer an environment that supports the various steps in
the knowledge value chain, integrated in the organizational processes and environments. In
these respects they offer a major step forward from the more traditional yellow pages systems.
Although the identification, selection and implementation of the right tool and the
development of the appropriate architecture and infrastructure are a technical exercise, the
real challenge is found in the development of the right processes, organization and culture.
We strongly recommend that each knowledge Q&A system project focus on the
personal and collective objectives of the users. A real understanding of the business, the
processes and, especially, the members of the organization will guide the developers in
understanding where and how knowledge can add value and what contribution the
knowledge Q&A system environment can make.
The paradigm should no longer be: ‘‘if we build it, they will use it’’, but instead: ‘‘if they use it,
it will build itself!’’

References
Bartlett, C.A. and Ghoshal, S. (2002), ‘‘Building competitive advantage through people’’, MIT Sloan
Management Review, Vol. 43 No. 2, pp. 34-41.

Bergmann, R. (1998), ‘‘Introduction to case-based reasoning’’, available at: www.cbr-web.org

Child, J. and Faulkner, D. (1998), Strategies of Co-operation, Oxford University Press, New York, NY,
p. 233.

Denning, S. (2001), The Springboard: How Storytelling Ignites Action in Knowledge-Era Organizations,
Butterworth-Heinemann, Woburn, MA, p. 223.

Edvinsson, L. and Malone, M.S. (1997), Intellectual Capital: Realizing your Company’s True Value by
Finding its Hidden Brainpower, HarperBusiness, New York, NY.

Garvin, D.A. (2000), Learning in Action – A Guide to Putting the Learning Organization to Work, Harvard
Business School Press, Boston, MA.

Hagel, J. III and Armstrong, A.G. (1997), Net Gain, Expanding Markets through Virtual Community,
Harvard Business School Press, Boston, MA, p. 49.

Hofstede, G. (1997), Allemaal Andersdenkenden, Contact, Amsterdam.

Iske, P. (2002), ‘‘Building a corporate KM community’’, KM Magazine, Vol. 6 No. 4.

Iske, P. (2004), ‘‘Are you challenging your brains?’’, survey of under 1,000 people in The Netherlands,
available at: www.knocom.com
Iske, P. and Boekhoff, T. (2001), ‘‘The value of knowledge doesn’t exist’’, KM Magazine, Vol. 5 No. 2.
Iske, P., Kalter, E. and Naber, L. (2003), ‘‘A healthy outlook for KM’’, KM Magazine, Vol. 7 No. 3.

Iske, P. et al. (2002), Measuring the Impact of Q&A Systems, Exchange Project, Erasmus University,
Rotterdam.
Kaplan, R.S. and Norton, D.P. (1996), ‘‘Using the balanced scorecard as a strategic management
system’’, Harvard Business Review, Vol. 74 No. 1, pp. 75-85.

Lev, B. (2002), Intangibles: Management, Measurement and Reporting, Brookings Institution, Washington, DC.
Morgan, G. (1997), Images of Organization, Sage Publications, Newbury Park, CA, p. 129.
Reamy, T. (2002), ‘‘Imparting knowledge through storytelling’’, KM World, Vol. 11 No. 6, pp. 8-11.
Sveiby, K.E. (1997), The New Organizational Wealth: Managing and Measuring Knowledge-Based
Assets, Berrett-Koehler, San Francisco, CA.
Weggeman, M.C.D.P. (1998), Kennismanagement, Inrichting en Besturing Van Kennisintensieve
Organisatie, Scriptum, Schiedam.

Note from the publisher
Emerald structured abstracts have arrived!
After months of preparation by journal editors, authors and Emerald publishing staff,
structured abstracts are ready for publication in all Emerald journals. The abstracts appear
in journals from the first issues of all 2005 volumes and a glance at any article title page in this
issue of the Journal of Knowledge Management will illustrate the format and style of the
new-style abstracts. The format differs slightly in the electronic version of articles on
Emerald’s website but this is only a cosmetic variation and takes account of the different
medium and way in which people use abstract information.
The idea for the structured abstracts came about at the start of 2004 and a small team has
worked on the design and introduction of structured abstracts throughout the year. Thanks to
the hard work of everyone involved in producing this journal, Emerald is now able to
showcase the abstracts for the first time. We believe they provide real benefits to our readers
and researchers and that they answer some of the key questions journal users have about a
paper without them having to scan or read the entire article. Some of these questions might
include:
B ‘‘What research has been conducted on this topic?’’
B ‘‘How was the research approached – what methods were used?’’
B ‘‘What were the main findings?’’
B ‘‘Are there any literature reviews on this topic and are they selective or inclusive?’’
B ‘‘So what? The authors have shown this but what does this mean for my
work/organization?’’
B ‘‘I want to conduct research in this area but what questions still need to be answered?’’
B ‘‘Has this work got any relevance and value for me?’’
B ‘‘What did the writer set out to show?’’
Structured abstracts provide the answers to these kinds of questions without the researcher
having to go any further into the article itself. Authors can be more confident that their paper
will be noticed and read by others with a real interest in the topic or research.
As far as possible, we have alerted our authors and editorial team members to this change
via Literati Club Newslines and communications with journal editors. Authors who have been
asked to rewrite their abstracts in the new format have readily obliged. The response from all
parties has been very encouraging:

Structured abstracts are increasing in popularity among the social and behavioral sciences.
There’s overwhelming evidence that readers (and indexers) glean more from structured abstracts
(Jonathan Eldredge, MLS, PhD, AHIP, Associate Professor, School of Medicine, Academic &
Clinical Services Coordinator and Author, Health Sciences Library and Informatics Center, Health
Sciences Center, The University of New Mexico, USA).

For more on structured abstracts and their value for researchers and writers, read the short
paper by Liz Bayley and Jonathan Eldredge at: http://research.mlanet.org/
structured_abstract.html
In the digital environment, everyone has difficulty weighing up the value of any piece of
information, and structured abstracts go some way towards remedying the problem of
information overload. Emerald is the very first publisher in the management field to introduce
structured abstracts and, whilst we are mindful that this means change for authors and
researchers, we feel our pioneering work in this area gives our journals a strong competitive

PAGE 146 JOURNAL OF KNOWLEDGE MANAGEMENT VOL. 9 NO. 1 2005, pp. 146-147, © Emerald Group Publishing Limited, ISSN 1367-3270
advantage. We are pleased and proud to be the first in the field to implement this extremely
good idea.
Unfortunately, we are unable to go back through more than 40,000 papers already in
Emerald’s database to change already-published abstracts into structured ones. On a more
positive note, however, nearly 5,000 new papers will be deposited into the database this
coming year and all will be accompanied by a structured abstract.
Emerald would be pleased to hear what you think about this initiative. E-mail your views to
Sue de Verteuil, Head, Editorial Developments, at: sdeverteuil@emeraldinsight.com

