Critical Perspectives on Accounting (1992) 3, 205-224

A CRITIQUE OF KNOWLEDGE-BASED SYSTEMS IN AUDITING: THE SYSTEMIC ENCROACHMENT OF TECHNICAL CONSCIOUSNESS
JESSE F. DILLARD* AND ROBERT BRICKER†
*Anderson School of Management, University of New Mexico and
†Department of Accounting, Weatherhead School of Management, Case
Western Reserve University

It is important to reflect critically on the effect of knowledge-based
computer systems on human expertise and potential in order to formulate
better an appropriate role for such systems in the audit environment. In this
paper we initiate a critique of knowledge-based systems in auditing by
considering their effect on field auditors’ work and their work place. Three
perspectives are taken: technical-empirical, historical-hermeneutic and
critical. We review current and pending knowledge-based systems which
have audit applications and consider their potential effects in, and on, the
professional audit workplace. Following the work of Winograd and Flores
[Understanding Computers and Cognition (Reading, Massachusetts:
Addison-Wesley, 1986)], limitations inherent in the prevailing rationalistic
orientation to systems design are discussed. While these systems may be
able to reduce costs and increase audit quality in the short term, there is a
danger that necessary human development and expertise will be sacrificed
if high-level decisions and judgements routinely come within the domain of
such systems. Habermas’ [Legitimation Crisis, T. McCarthy (trans) (Boston:
Beacon Press, 1975) and The Theory of Communicative Action, Vols 1 and
2, T. McCarthy (trans) (Boston: Beacon Press, 1984)] framework for
communicative action is employed to explore both the liberating and
oppressive effects of knowledge-based systems on field auditors. We
contend that, analogous to other technological applications in the work
place, as issues of consequence are pre-empted by knowledge-based
systems the potential for enlightened, and enlightening, debate and
discussion is reduced. In this environment, possibilities for development
and transformation are diminished.

Address for correspondence: Professor J. F. Dillard, Anderson School of Management,
University of New Mexico, Albuquerque, New Mexico 87131-1221, USA.
Received 8 June 1990; revised April 1991 and October 1991; accepted December 1991.

1045-2354/92/030205 + 20 $08.00/0 © 1992 Academic Press Limited

Introduction
Recently, there has been keen interest in the application of artificial
intelligence technology through knowledge-based systems within the audit
work environment1. A variety of effects have been envisioned, including
reduced costs, high quality and, generally, a revolution in audit practice. Many
believe that expert-mimicking systems will improve audit consistency, insure
proper audit procedures are carried out in each audit and reduce the likelihood
of human error. Similarly, it is believed that these systems have the potential
to improve audit efficiency and effectiveness greatly. However, such
considerations in areas related to professional auditing tend to focus on
short-term effects, omitting longer-term implications. In this paper we broaden the
extant discussion of knowledge-based systems applications within the audit
work environment.
One of the most important functions of an audit is to make professional
audit judgments. Gibbins (1988, p. 51) states that “exercising judgment is an
essential everyday task for auditors”. Judgements are the result of an
auditor’s evaluation process within the context of accounting and auditing
rules and procedures. Interpersonal communications among audit profes-
sionals is an essential activity in arriving at a judgement and the correspond-
ing action (See Gibbins & Emery, 1985; Solomon, 1987). Because audit
judgement and interprofessional communication are so closely related, it is
important to consider the potential effects of knowledge-based systems on
them. Yet, these effects have not been systematically considered.
The following discussion investigates the effect of knowledge-based sys-
tems within the audit workplace from three different perspectives (Pusey,
1987, p. 25):
1. the technical-empirical;
2. the historical-hermeneutical; and
3. the critical.
The technical-empirical perspective is aligned with an objectivist tradition.
Within this perspective there are two prevailing views. Both see knowledge-
based systems as a capital investment having the objective of reducing
“production” costs. One view is the traditional positivist view, focusing on
efficient and effective resource utilization. This perspective takes an in-
strumental view of knowledge-based systems and their impact on costs
relative to outputs. The other view takes a traditional labour process view,
focusing on system design and implementation as they impact the field
auditor. As such, it points to the possibility of knowledge-based systems
accelerating the development of elite groups of narrowly focused experts
accompanied by a general deskilling (or not-skilling) of the field auditor.
These two perspectives represent the different sides of the technical-
empirical coin and thus are inherently narrow. The traditional positivist perspec-
tive largely ignores the potential dangers arising from technological applica-
tions. The traditional labour process view generally takes a rather limited
perspective of the productivity gains accruing from knowledge-based
systems.
To broaden the discussion of knowledge-based systems in the auditing
workplace, we provide interpretations from both a historical-hermeneutical
and a critical perspective. The historical-hermeneutical perspective evaluates
knowledge-based systems applications in auditing from a perspective which
recognizes the central position of the social as manifest in language,
conversation and context in audit judgement and decision making (Winograd
& Flores, 1986). This focuses attention on the design limitations that occur in
knowledge-based systems because of the “dialogical understanding in the
traditions and norms of individuals and their value structures”.
The critical perspective is directed toward emancipatory understanding and
follows from the work of Habermas (1968, 1975, 1984). It applies to our
analysis of knowledge-based expert systems in auditing because it focuses on
the systemic forces motivating, and being motivated by, technological
innovation. Through critical theory it is possible to gain a perspective on the
relationship between technological innovations and social tensions.
Using the three perspectives identified above, the following discussion
explores the potential effects of introducing technology in the audit work-
place. Our purpose is not to advocate one perspective over another. The
pluralism represented in these three perspectives helps to anticipate the
impact of knowledge-based systems as they are implemented in an auditing
environment. The exploration is organized as follows. First, we survey
knowledge-based system development in auditing. The survey indicates that
while current applications are not extensive, the anticipated applications are
immense. The next section explores the motivation and anticipated effect of
knowledge-based systems in the professional workplace from a traditional,
technical-historical perspective. This traditional view is expanded by illumi-
nating system design limitations which become evident with a historical-
hermeneutical perspective. Finally, a critical perspective provides insight by
bringing into focus systemic forces motivating technological innovation and
the corresponding effects on participants’ subjective reality.

A Review of Extant Knowledge-based Systems in Auditing

System Classification
The distinguishing characteristic of knowledge-based systems, when con-
trasted with more traditional computer-based systems2, is the centrality of
human expertise as a template for system design and content. System
expertise in knowledge-based systems is derived from a human expert’s
representation of the task domain (Hayes-Roth et al., 1983, p. 32 cf.). This
expertise is embodied in fact identification and selection and in heuristic rules
which provide for the systematic search of a problem space. These systems
are designed to perform at a level, in terms of quality, quantity and time,
approaching that of a human expert.
Decision support systems and expert systems may be viewed as two
extremes on a continuum representing the extent to which knowledge-based
systems contain reasoning capabilities. With a decision support system, the
system structures the audit judgement process by selecting and sequencing
the decision processes. The structure reflects the sequence of information
processing behaviours evidenced in expert behaviour; however, the system
itself contains no reasoning or logic capability. In addition to sequencing the
information processing, an expert system contains reasoning capabilities
which would be carried out by a domain expert in making the audit
judgement. The judgements are made by the system. The user needs little, if
any, domain-specific knowledge. The system directs actions. The user is
reduced to deciding whether to accept the directives of the system3.
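The distinction can be made concrete with a small sketch. The rules, facts and thresholds below are entirely our invention for illustration, not drawn from any of the systems surveyed in this paper; the point is only that, once heuristics are encoded, the chaining from facts to a directed action is carried out by the program rather than the auditor:

```python
# A minimal sketch of a rule-based expert system: hypothetical audit
# heuristics encoded as if-then rules, applied by a forward-chaining
# inference routine. The system, not the user, reaches the judgement.

RULES = [
    # (rule name, condition over known facts, fact asserted when it fires)
    ("weak_controls", lambda f: f["control_exceptions"] > 5,
     ("control_risk", "high")),
    ("strong_controls", lambda f: f["control_exceptions"] <= 5,
     ("control_risk", "low")),
    ("expand_testing", lambda f: f.get("control_risk") == "high",
     ("substantive_testing", "extended")),
    ("limit_testing", lambda f: f.get("control_risk") == "low",
     ("substantive_testing", "minimal")),
]

def infer(facts):
    """Forward-chain until no rule asserts a new fact; return all facts."""
    facts = dict(facts)
    changed = True
    while changed:
        changed = False
        for name, condition, (key, value) in RULES:
            try:
                fires = condition(facts)
            except KeyError:
                continue  # this rule's inputs are not yet known
            if fires and facts.get(key) != value:
                facts[key] = value
                changed = True
    return facts

result = infer({"control_exceptions": 9})
print(result["substantive_testing"])  # prints: extended
```

In the decision-support case the user would supply the intermediate judgement ("control_risk") and the system would only sequence the steps; in the expert-system case sketched here the user is left, as noted above, to accept or reject the final directive.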
While this paper generally considers the effects of knowledge-based
systems on auditors, the discussion focuses on expert systems. We are
particularly concerned about the effect of decision making, action directing
systems on the development of auditor expertise and judgement. Obviously,
the degree of the auditor’s exclusion from the judgement process depends on
the specific application; however, the desirability of this exclusion has not
been adequately considered. As will become evident later in the discussion,
an unquestioned acceptance of the progressive exclusion of the field auditor
from the audit judgement loop could conceivably have deleterious effects on
auditors and the audit profession.
To date, the application of knowledge-based systems has varied. The
advent of microcomputers and related software has led to a plethora of
software support packages. Many of these are customized applications of
generic software packages. For example, Lotus 1-2-3 and other spreadsheet
packages have been used to develop customized programs in a multitude of
applications in hundreds of audit firms. Many of these have only a single user.
Others are commonly used within a firm or more widely across firms for a
particular application.
Many of these applications have been developed using existing computer
software having a more technical, book-keeping emphasis. For example,
beginning in the 1960s, software became widely available for book-keeping
applications. In the 1980s the use of such systems burgeoned with the
appearance and widespread application of microcomputers. The capabilities
of this book-keeping software evolved throughout this period and began to
add analysis features useful to auditors. Concurrently, software was de-
veloped which facilitated auditor write-up work for clients and which autom-
ated auditor work papers. Using these systems, information can be entered or
transferred from a client book-keeping system. As these systems evolved,
features were added to assist the auditor. Basic systems offer full
financial statement, trial balance and supporting schedule capabilities. In the
more sophisticated of these systems, ratio analysis, analysis of relationships
between accounts and analysis of account balances among years can be
performed. These systems emphasize the automation of routines4.
While expert systems have appeared as field tools in some accounting
areas, notably taxation, they have not been as widely applied in auditing.
Given the recent technological advancements and reductions in costs, sophis-
ticated knowledge-based systems are becoming feasible in a number of audit
application areas. While many of the current systems are prototypes, some
have been used in the field in a variety of auditing applications.
Many of the knowledge-based system applications have been in the area of
audit planning, audit risk assessment and internal control. Other applications
which have been developed or envisioned include staff scheduling, com-
pliance testing and substantive testing. Arthur Young (now Ernst and Young)
developed a microcomputer-based decision support tool entitled the “Arthur
Young Decision Support Tool” (DS). Its goal is to assist auditors in selecting
an efficient combination of audit procedures (Broderick, 1988). Coopers and
Lybrand has developed RISK ADVISOR, a microcomputer-based system for
assessing audit risk and evaluating client performance (Graham et al., 1990).
Peat Marwick's "Loan Probe" (Willingham & Ribar, 1988) is used to evaluate
the reasonableness of a bank’s loan loss reserves. Deloitte’s CONTROL PLAN,
which evaluates internal control, is commercially available and is included in
the comprehensive audit package AUDITPLUS. The system is designed to
arrive at an expert conclusion with a minimum amount of data. A 1986 report
presents a proposal for development of a system to establish auditing
procedures for inventory systems (O’Leary & Watkins, 1989).
There have also been a number of auditing-related systems developed by
academicians. EDP-EXPERT (Hansen & Messier, 1986) tests the adequacy of
internal controls in EDP systems. AUDITOR’S ASSISTANT is an audit evidence
network which has financial statement values as variables for audit planning
and evaluation (Srivastava et al., 1990). Notably, it requires use by an
experienced auditor who accesses the system to develop judgements about
the different degrees of support obtained by various sources of evidence.
AUDITOR (Dungan, 1983) analyses the adequacy of bad debt allowances. A
more sophisticated extension of this model is under development which
"considers the combination of analytical and judgmental variables" (O'Leary
& Watkins, 1989). Kelley (1984) developed a system called "Internal Control
Evaluation" (ICE) for use in audit planning. To form a going concern
judgement, the expert system "Going Concern Expert" (GCX) was designed in
a number of versions (O’Leary & Watkins, 1989). Dillard and Mutchler (1988)
used the shell XINFO to model the formation of a going concern judgement.
TICOM-IV helps the auditor model and evaluate an internal control system
(Bailey et al., 1988). Steinbart's (1984) AUDITPLANNER makes materiality
judgements.
Some areas in which expert systems have been proposed for future
development include (Graham, 1988):

• Issues of audit risk disclosure associated with new financial instruments, technical accounting standards and so forth.
• Choice of appropriate research methodologies for selected research issues.
• Application of research methodologies appropriate to data.
• Focus of literature search and standard-related research techniques.
• Use as a teaching and learning aid.

The review of the current activity in the area of knowledge-based systems
indicates that this technology is being applied to relatively sophisticated areas
of audit judgement. It is important, especially at this early stage, to consider
the potential effects on the field auditors’ judgement, expertise and work
environment. Morgan (1986) has observed:

"All kinds of functions once performed by skilled and semiskilled people
are performed electronically, making complete sections or levels of or-
ganization quite redundant, and others more valuable. Networks of rela-
tions between humans give way to “interface” between electronic devices
supported by new kinds of operators, programmers, and other information
specialists” (p. 83).

We propose that this is also an apt description of the auditors’ emerging work
environment. The next section addresses these implications from a tradi-
tional, technical-historical perspective.

A Technical-Empirical Perspective of Expert Systems in the Professional Audit Workplace
A technical-empirical perspective postulates that the motivation and effect of
knowledge-based systems5 stems from a primary focus on accomplishing
effective and efficient resource utilization through understanding and control-
ling instrumental means-end relationships. The following benefits from
expert system applications in auditing cited by Peat Marwick (see Willingham
& Ribar, 1988) follow from a technical-empirical perspective.
1. Support of field work by freeing auditors from mundane tasks, making
the auditors’ work more interesting, improving the quality of field work
and reducing field work time.
2. Development of specialized experts whose knowledge can be used
widely without their presence.
3. Uniformity of audit documentation and working papers.
4. Increased effectiveness and decreased cost of staff training.
Within the technical-empirical context, one way of understanding the
motivation behind, and effect of, knowledge-based expert systems can be
found in a traditional labour process view. Braverman (1974) argues that one
impetus for implementing technological innovations derives from capital’s
attempt to achieve efficiency through increased control over the production
process and reduced labour costs. Within the context of a professional service
organization, Orlikowski (1991) found that, contrary to conventional wisdom,
the implementation of information technology “augments and extends exist-
ing mechanisms of control, while reinforcing established forms of organiza-
tion” (p. 10)6. (See also Howard, 1985.)
Until the advent of knowledge-based systems, the impact of computer-
based technological innovations was felt primarily in the traditional trades
such as machining, carpentry and tinsmithing. However, as computers are
being programmed to carry out more complex tasks, the impact on managers
and professionals is increasing. The professional fields of medicine, account-
ing, engineering and law are examples of areas where expert systems
technology is being implemented.
Concerning audit practice, O’Leary and Watkins (1989) state that, as a result
of expert systems “. . . we can expect to see a decrease in the number of
auditors needed to accomplish the same amount of work” (p. 15). More
explicitly, Buckley (1988) argues that expert systems will reduce the labour
component of audit production.
The potential deskilling of the labour force is a particularly acute concern,
because expert systems are meant to replace human experts. While some
small elite pool of human experts is maintained, the development of field
auditors’ professional judgement may be adversely affected by reliance on an
expert system. Buckley (1988, p. 191) prophesies that “. . . never in the history
of auditing will so many rely on so few for so much”. O’Leary and Watkins
(1989) state that:
". . . if the [expert] system knows something then the auditor may not need
to know that something. As a result, auditors may forget important
information that they have learned or not learn things that are important”
(p. 15).

As previously cited, audit firms often perceive the benefits of expert
systems in terms of increased efficiency7 and, to a lesser extent, increased
effectiveness8. The computer has been shown to be useful in carrying out
computational tasks which have prespecified rules and identifiable para-
meters. Spreadsheet programs provide an example of where an auditor's
effectiveness and efficiency have been greatly enhanced. The auditor is freed
from the mundane computational activities and can thus concentrate his or
her efforts in areas where professional knowledge is needed. The problem of
under-utilization of professional skills is ameliorated and opportunities for
professional development are enhanced. Both academics and practitioners
recognize that the resulting increase in effectiveness and efficiency may affect
the audit practice in such areas as firm organization structure, employee
career paths and educational programs. (For example, see Summers, 1988.)
Educational programs will focus more on how to use the systems and less on
the technical auditing and accounting skills traditionally needed by the field
auditor. The number of entry-level professionals needed is lessened, reducing
the pool from which top management has traditionally been drawn. There is
also a need for a relatively higher percentage of intermediate-level skills
(seniors and managers). Grade-level distributions are changing from triangu-
lar shaped to diamond shaped. Opportunities for gaining the managerial and
professional experience needed for supervisory responsibilities are de-
creased. As Braverman (1974) argues, the typical response to such a dilemma
is more technology; thus, the pressure and enthusiasm for the development
and implementation of expert systems and decision support systems within
the audit practice.
Braverman (1974) chronicles worker frustrations emanating from the de-
skilling effects of technology. The reaction has been at times subtle, man-
ifested as minor industrial sabotage or work slow downs, and at times violent,
manifested as riotous strikes and wholesale property destruction. Such
frustration is not currently evident within the audit workplace. There is little, if
any, overt resistance to the application of knowledge-based systems other
than concern for their technical correctness9. Computer technology has
become so commonplace that knowledge-based systems are perceived as a
logical evolutionary step. Furthermore, there are such high demands on the
field auditor’s time and energies as a result of the changing competitive
environment that any prospects of short-term relief are received with open
arms. More fundamentally, the traditionally high turnover rates in public
accounting facilitate a reduction in the work force with minimal disruptions. In
the next section we investigate implications of expert systems applications
in the audit workplace from a historical-hermeneutical and a critical
perspective.

Alternative Perspectives of Expert Systems


Technically, a computer system cannot feel and sense, cannot understand and
communicate. It can only manipulate symbols and transmit information. The
expert system cannot provide the context for legitimate discourse leading to
conflict resolution and self worth. By taking a viewpoint different from the
traditional empirical position, a more complete understanding of expert
systems' limitations is possible. Specifically, we address the difficulty of
designing expert systems for making subjective human judgements. First, we
focus on the design limitations identified by Winograd and Flores (1986) as a
result of their historical-hermeneutical analysis. Next, we examine expert
systems within the context of Habermas’ (1968, 1975, 1984, 1987) theory of
communicative action. This critical perspective incorporates both the
technical-empirical and historical-hermeneutical perspectives in dealing with
the social dynamics surrounding technical innovation.

A Historical-Hermeneutical Perspective

A historical-hermeneutical perspective of system design and evaluation has
been advocated and employed to varying degrees by Boland (1985), Orlikow-
ski (1991), Preston (1991) and Winograd and Flores (1986)10. Winograd and
Flores (p. 16) have recognized the rationalistic orientation of current expert
systems design as emphasizing the logical application of general rules to
draw systematic conclusions. Based on the work of Gadamer (1975, 1976) and
Heidegger (1962), it is argued that the rationalistic orientation, while domin-
ant, is inadequate. Thus, the development of expert systems based on the
assumption of some objective reality, independent of the human observer, is
invalid. Orlikowski (1991) and Preston (1991) go further in suggesting that
such a perspective obscures the control imposed by the technology.
Winograd and Flores ground their stance on the predominance of language
in human thought and action. Because of this, the “objective” can be no more
than an interpretation, an interpretation which is based on the assumptions
implicit in language and which is constrained, and enabled, by the traditions
and norms making up the individual’s internal character and value structures
(lifeworld). Representations, including computer inscriptions and language,
are not the objects. Correspondences between representations and “objects”
are historical, not structural, as is presumed by the rationalistic orientation
currently underlying expert system design (p. 43). Knowledge is consensual
representation, or intersubjective agreement, which exists for a social com-
munity. Meaning arises from “an active listening, in which the linguistic form
triggers interpretation rather than conveying information” (p. 57) and is
“relative to what is understood through tradition” (p. 63). Winograd and
Flores point out the dynamics of interpretation through language in that the
individual learns and acts in certain ways as a result of language, and
language is in turn changed by the actions of individuals. Thus, not even the
medium for interpretation is stable or “objective”.
Recall that knowledge acquisition for expert systems is the reformulation of
narrative descriptions of expertise into formal, machine-readable rules. Such a
description assumes that objective knowledge can be abstracted context-free
from an expert and then reapplied within some other contextual setting.
“Background”, the medium of interpretation, gives context and meaning to
current reality. Winograd and Flores argue that background cannot be
reformulated because it is embodied in the linguistic dynamics of a “knowl-
edgeable community” (p. 76). Thus, the idea of an objectively knowledgeable
expert is an impossibility and must be replaced with a recognition that
expertise's "situatedness" within a knowledgeable community is the milieu of
professional judgement and motivated action.
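The "reformulation" at issue can be illustrated with a deliberately crude sketch. Suppose an expert's narrative heuristic is "recurring losses together with negative working capital cast doubt on the going-concern assumption"; the encoding below is our own invention, not drawn from any system surveyed above, and shows how a machine-readable rule retains the surface logic while discarding the background that gives it meaning:

```python
# A narrative heuristic reduced to a context-free rule. Everything the
# expert knows about this client, this industry and this moment is
# collapsed into two booleans; the "background" Winograd and Flores
# describe has no representation at all.

def going_concern_doubt(recurring_losses: bool,
                        negative_working_capital: bool) -> bool:
    """Hypothetical encoded rule: doubt arises only when both signs co-occur."""
    return recurring_losses and negative_working_capital

print(going_concern_doubt(True, True))   # prints: True
print(going_concern_doubt(True, False))  # prints: False
```

The rule is trivially reapplicable in any "other contextual setting", which is precisely the problem the authors identify: nothing in it records the knowledgeable community, the client history or the conversations within which the original judgement was situated.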
A knowledge community necessitates a “dialogue of understanding”.
According to Winograd and Flores, legitimate dialogue involves shared
commitments on the part of the communicating community members.
Speech acts theory (Austin, 1962; Searle, 1969) develops the idea that speech
acts create commitment11. Commitment, and the accompanying validity
claims, are the basis for undistorted communication. Computers are neither
capable of making commitments nor can they enter into legitimate language
discourse. They cannot understand in a human sense and are “restricted to
representing knowledge as the acquisition and manipulation of facts and
communication as the transferring of information” (p. 78).
The prevailing, traditional view of system design is that system designers
create symbolic representations corresponding to facts and expert processes.
These facts are then manipulated by a computer using encoded processes to
provide the user with information. Two critical factors are normally over-
looked. First, “facts”, as they are entered into a system, are represented
symbolically and are not objective truth gleaned from the physical world.
They are assertions and interpretations by an individual within a context
emanating from unavoidable, and sometimes unrecognized, predispositions.
Second, the knowledge acquisition for process specification is generally a
linguistic interchange between the system designer and an expert. "Sen-
tences in a human language cannot be treated as statements of fact about an
objective world, but are actions in a space of commitment” (p. 105). This
recognition of the critical role of context within which observations are made
and within which conversations take place exposes a major, inherent impedi-
ment to expert system design.
Some (e.g. Newell & Simon, 1972) carry the objectification further by
claiming that computational manipulation of symbolic representations is
representative of human thought, understanding and choice. Winograd and
Flores contend that such claims are at best spurious because without context
and background the “objectively” selected relevant rules for choosing are
highly restrictive and do not allow one to see the “irrationality of a situation
as manifested in wrong alternatives and wrong preferences” (p. 145).
Extrapolating from Winograd and Flores’ discussion of expert system
applications to management decisions (see pp. 145-150), audit decision-
making and professional judgement can be seen as a heuristic search within a
problem, or possibility, space. It is generally assumed that the auditor’s task
can be depicted as choosing the appropriate alternative from an alternative
set identified within the problem space. For the most part, this is not an
adequate description of auditor behaviour within a real-life setting. Keen and
Scott-Morton (1978) claim that a lack of understanding about how managers
make decisions is a serious weakness in current computer applications
research. This observation is equally valid for auditing. First, the auditor’s
limited time and resources constrain the generation and selection of audit
alternatives in a given situation. Second, the perspective of the auditor as a
rational decision maker and problem solver does not recognize the back-
ground against, and within, which action is undertaken. Suggesting that a
manager or auditor is "optimizing some value by choosing among alterna-
tives for action is like regarding language understanding as a process of
choosing among formal definitions” (p. 146).
More relevant is the question of how a problem comes to be formulated
and how contextually relevant alternatives are identified. Winograd and Flores
address this matter in the following way:
". . . some situation is previous to the formulation, but its existence as a
particular problem (which constrains the space of the possible solutions) is
generated by the commitment in language of those who talk about it. This
conversation in turn exists within their shared background in a tradition. ...
The relevant question is not whether it is ‘true’ or ‘false’ that there is a
problem, but what commitments are generated (for speakers and hearers)
by the speech acts that created it, and how those commitments generate
the space of possible actions” (p. 147).

Yet, expert systems may have the effect of eliminating such conversation,
leaving problem identification and alternative consideration to the expert
system. Buckley (1988, p. 192) recognized the potential danger of such utter
dependence:

“I fear that auditors, faced with the need to justify everything they do, will
not act in the future without consulting the oracle of the expert system.
What little vestiges of professional judgment remain may be homaged to
the computer”.

Wallace (1988) indicated similar concerns and suggested that auditors may be
tempted to use systems without critically evaluating what they are doing or
what the system is telling them.
Winograd and Flores further contend that even if the decision maker does
not evaluate all available alternatives, as with quasi-rational decision theories
(e.g. Simon, 1976), well-defined processes and a well-defined problem space
within which the alternatives are located are at least implicitly assumed.
However, this is not always the environment of the auditor. The key decisions
made by an auditor during the course of an audit are generally poorly defined,
or understood, and cannot be fully represented within a prespecified problem
space. If Winograd and Flores are correct, the well-defined processes, and the
associated rational outcomes, which are embodied within a system, are more
the result of the knowledge engineer’s a priori analysis, based on his or her
preconceptions, than a reflection of the auditor’s action context.
An auditor is faced with situations which generate a mood of
"irresolution"12 arising out of actions and the potential for future action.
Irresolution is dealt with through "deliberating conversation". Deliberating
conversation can be characterized as follows.

1. At some moment in the process of articulating the claims, some incipient
partial proposals can be discerned, as different people give opinions,
suggestions, disparagements, counter-offers, etc. In this conversation,
distinctions between means and goals, parts and wholes are discarded in
favour of interpretations about possible causal links, potential results and
inconveniences.

2. At some moment, a sedimented opinion about possible courses of action
to be evaluated and considered may begin to appear; this is when the
process called "choosing" can be considered. However, the name
"choosing" is inadequate, because it suggests algorithmic procedures
for selecting the course of action (pp. 149-150).
We believe this to be a legitimate description of the context within which a
professional auditor functions. Professional judgement successfully applied is
not the result of a solitary auditor rendering a reflectively deduced decision
but a collective activity resulting from conversations among the affected
constituencies and the related commitments to action, or inaction.
Audit judgements are made necessary because of circumstances of “ir-
resolution”. More specifically, we view the execution of an audit as the
articulation and activation of communication networks which foster commit-
ments based on promises and requests among the affected constituencies.
The commitments achieved through the communicative activities are the
bases for audit judgements (resolving “irresolution”). Transformative action
possibilities are created and recognized within a newly constructed action
space. For the possibilities to be realized, actors must be capable of
communicative discourse and be held accountable for the validity of their
claims. These communicative activities constitute the background and context
for professional audit judgement. The application of expert systems retards
the facilitation and sustenance of these communicative activities.
In summary, we see a number of systematic limitations inherent in expert
systems designed for audit applications which follow from a historical-
hermeneutical perspective (pp. 153-155).
1. Choice orientation. This institutionalizes prevailing decision processes
reinforcing the status quo and obscuring possibilities for more social,
emotional, intuitive and personalized approaches.
2. Implied relevance. The choices designed into the system are seen as
those most relevant to the auditor regardless of the circumstances.
3. Unintended transfer of power. The power to define the field of action
shifts from the auditor to the system designer. As a result, problem
definition and the alternative action set are specified by the system
professionals, not by the audit professionals.
4. Unanticipated effects. The paradox of expert system applications is
present in the auditing domain. The system designer’s preference for
system predictability and efficiency may inhibit the auditor’s professional
discretion and innovation. For example, as a judgement situation
becomes more efficient through structuring, professional flexibility is
reduced. Furthermore, as the process becomes more automated, oppor-
tunities for professional discourse are reduced, the result being the
possibility for lower quality professional judgements.
5. Obscured responsibility. Once a system is in place, it tends to be treated
as an independent entity, a surrogate expert, shielding the developer and
the human expert from responsibility.
6. False sense of objectivity. The symbolic representations stored in the
system are accepted as objective facts with no ability to consider their
origins.

Current expert system applications do not, and in fact cannot, recognize this
broader conversational context. If our arguments are correct, expert systems
in auditing are not capable of this most crucial element of professional
judgement, and thus their applications are, at best, inherently limited.

A Critical Perspective

In the preceding discussion we have argued that by considering language,
conversation and context, design limitations inherent in expert audit systems
become evident. A critical perspective extends our understanding by recog-
nizing the forces motivating technological innovation and the corresponding
effects on participants’ subjective reality.
The purpose of a critical perspective is not to develop a causal model of
observed phenomena but to provide clarification of societal tensions so as to
facilitate emancipatory understanding. In this section we investigate
previously specified limitations arising from technology (in our case,
computer-based expert systems) on the individual and social settings within
the audit
workplace. The basis of this discussion is the work of Jurgen Habermas13
which addresses the effect of “technical consciousness” on individuals and
society. The discussion is predicated on the belief that the better one
understands the systemic forces driving technological advancements, the
greater the likelihood of minimizing the negative effects described above as
well as increasing opportunities for developing human potential.
For the most part, technological advancements in auditing are defined in
terms of efficiency and quality and are evaluated in terms of improved
performance. Within such a context, technical rationality, or technical con-
sciousness, can overwhelm value or ethical considerations. Technical con-
sciousness tends to simplify problems naively, obscure alternatives and
justify instrumental ways of organizing “conversations”. In contrast, a major
component in the historical-hermeneutical evaluation is the lifeworld which
represents the background, or context, for communication made up of norms,
traditions and values that provide the basis for understanding and evaluating
social discourse. Habermas contrasts the lifeworld with the "system"14, the
externally imposed structure constituted of interrelated components of tech-
nology and bureaucratic structures, and attempts to incorporate both within a
critical social theory.
Lyytinen and Klein (1985) submit that by using Habermas’ critical social
theory in information system development and implementation one achieves
a much broader understanding of the system's potential impact. Not only can
the information system's capacity for increasing organizational effectiveness
and efficiency be recognized, but also its potential for emancipation by
overcoming social, physical and communication constraints.
Implementing a critical perspective, Broadbent et al. (1991) show how
information and management systems in the United Kingdom’s National
Health Service have become increasingly technical and coercive.
We suggest that expert systems are one manifestation of the technological
colonization of the auditor’s lifeworld. Habermas claims that such colonization
obscures the repressive forces which restrict human social intercourse,
impede social development and lead to crisis. Recognizing the critical balance
between technical advancement and social integration, this discussion initi-
ates an informed discourse addressing the tradeoffs, both explicit and implicit,
of expert systems being applied, and anticipated, within the audit workplace.
The application of Habermas’ work to the field auditor’s work environment
requires the following assumptions. The first is that an analogy can be drawn
between Habermas’ “global society” and the “society of auditors working
within an audit context”. While the audit environment does not have the
cultural or political richness of the “global society”, it can be viewed as a
microcosm of the larger society. Auditors are members of a community
whose members must cooperate in carrying out designated tasks within a
social context. The social context includes both an externally imposed
structure and internally acquired background.
Second, an externally imposed structure containing the interrelated com-
ponents of technology and organization design is imposed on the auditors’
work environment. The implementation of expert systems in the audit work
environment is an excellent example of such an externally imposed structure.
Third, norms, traditions and values are the basis for internal representations
which provide for understanding social discourse. As we have argued earlier,
the auditor’s judgements are a function of the auditor‘s background. Fourth,
following from the first three, the theory of communicative action provides a
legitimate medium for critique of the audit environment.
In Legitimation Crisis, Habermas defines crisis as an “objective force that
deprives a subject of some part of his [her] normal sovereignty” (p. 1)15. In a
crisis, the person is not subjectively involved and has little power to actively
influence the current state of affairs. Crisis resolution occurs with the
liberation of the individual caught in the situation.
As Pusey (1987) points out, Habermas (1975) specifies four different crises
related to technical consciousness: two systemic and two individual. One
systemic crisis is an economic crisis which occurs as technological innovation
becomes unable to provide productivity gains capable of offsetting the
increased market demands. A second systemic crisis is a rationality crisis. As
the system becomes relied upon to a greater and greater extent, its ability to
provide “rational” decisions in an ever-growing domain begins to break
down. An administrative overload occurs.
The systemic crises give rise to individual or identity crises. A legitimation
crisis arises as the system fails to provide adequate rational decisions in that
necessary validity claims are not considered. As knowledge-based systems
become more prevalent and as they are viewed as extensions of the
bureaucratic hierarchy, the entire administrative structure is subverted. Com-
mitment is undermined.
The loss of commitment, a legitimation crisis, leads to a motivation crisis
where human beings can no longer garner sufficient life meaning, thus
reducing their motivation to continue as active participants. The loss of
meaning and relevance in professional life carries over into views of self
worth. The auditor becomes alienated from his or her work, no longer able to
make the requisite professional commitments because of the absence of
ethical or value considerations. This may currently be a rather extreme
outcome, but it does, we believe, illustrate a legitimate scenario.

Habermas argues that these crises are related to impediments in system
and social integration. Systems are associated with the production process16
and seen as self regulating, setting boundaries and limits and maintaining
identity by coping with the complexity of a changing environment through
adaptation and goal attainment. The social, or lifeworld, is seen as a
symbolically structured social system made up of values and institutions
which “shape members of the system into subjects capable of speaking and
acting” (p. 9). Both need to be considered if the critical connection between
social and system integration is to be maintained. Viewing only the lifeworld,
the control mechanism is obscured. Viewing only the system, the validity
claims of social reality are obscured.
Interaction between system and lifeworld takes place within a set of social
organizing principles which limit, “in the abstract, the possibilities of alterna-
tive social states” (p. 7). According to Habermas these principles determine
learning mechanisms and interpretative scope, and fix institutional control
boundaries. Control problems manifest themselves in crisis if they cannot be
resolved within the range of possibility circumscribed by the social organizing
principle. Problems are resolved through discursive activities among mem-
bers of society. The limits to this communicative activity imposed by the
social organizing principles constrain the resolution set.
Within an audit context, insights may be gained by understanding how
auditors communicate, interact and develop symbolic meaning using three
validity claims necessary for legitimate communication17:
1. Propositional validity concerning external or objective characteristics
(truth-obliged to provide grounds).
2. Normative validity concerning “rightness” relative to social norms
(rightness-obliged to provide justification).
3. Subjective authenticity, relative to the perception and actual intention of
action (truthfulness-obliged to prove trustworthiness).
We recast these at a professional level relative to the audit work setting. The
first validity claim is that an action is the most efficient and effective means for
attaining an end. The focus is toward instrumental, goal-directed actions,
“getting the job done”. A field auditor makes these claims in an audit
engagement to both the client and his/her superiors. The second validity
claim is that action is “correct, and proper in accordance with relevant
norms”. One might argue that the auditor makes such a claim in implement-
ing accounting and auditing standards as they represent “socially acceptable”
modes of behaviour18. The third claim is authenticity and sincerity of
subjective action. This might be related to the professional judgements made
by the auditor in the sense that the auditor holds him or herself out to be
"genuine"19.
Intersubjective understanding is the focus of Habermas’ critical analysis.
This is the communicative relationship serviced by the auditor. Turner (1986)
explains that the basis for this integration is embodied in the processes by
which mutual understandings and stores of knowledge are developed and
communicated. In the previous section we have essentially argued that this
integration is the basis of professional judgement. That is, as the external
circumstances (system) interface with the internal stores of knowledge
(lifeworld), understanding evolves. Consistent with Winograd and Flores,
bases for understanding are the stores of knowledge and mutual experience.
These stores of knowledge are the result of personal experience within a
community or society. They are developed through interactions. Going
beyond Winograd and Flores, Habermas argues that normal development
comes through intersubjective understanding emanating from “communica-
tive action”. However, as such an activity (situation) is distorted through
system imposition and/or coercive structures, legitimate intersubjective un-
derstanding is blocked. As this process evolves, opportunities for shared
meanings and intersubjective understanding are lost. This leads to poor social
integration. Legitimate communication media are replaced by artificial (“de-
linguistified”) media.
As technology intrudes on the lifeworld, opportunities for discourse are lost.
As these opportunities are foregone, the development of mutual understand-
ing diminishes. As this understanding diminishes, the common stock of
shared knowledge dwindles. As the common lifeworld diminishes,
technological solutions are more strongly demanded. The link between coworkers
withers. The common ground for discussion, empowerment and transforma-
tion is diminished. This leads to a very strong instrumental (technology-
driven) orientation towards goal setting and decision making. As the process
progresses, as decisions become more and more instrumentalized
(automated), the possibility of ethical, value and political considerations becomes
more and more remote because of the withering away of mutual discourse
and understanding. Appeal to technical authority becomes the means for
resolving crises. This technical authority is devoid of any consideration other
than the criteria promulgated by that authority, for example economics of
efficiency.
Within advanced industrial societies, there is an imbalance whereby the
system (the technical) is dominant over the lifeworld (the social). As this takes
place, the criterion of improved production overrides all other considerations.
The same phenomenon is occurring within the audit practice. While
knowledge-based systems are only one manifestation, they represent most
dramatically those forces which have been acting within the audit practice for
the past 50 or so years20. In Habermas' terminology, the auditor's lifeworld is
being increasingly colonized by the encroaching technology. As colonization
takes place, the auditor becomes more apt to accept unquestioningly the
theoretical and practical validity claims of knowledge-based systems. This
limits the ability to move beyond the prevailing organizing principle and
appropriately address changing circumstances. The problem is especially
poignant with respect to expert systems because of the ability to incorporate
the prevailing organization structure and management ideology behind
user-friendly interfaces.
Reframing Winograd and Flores’ position in Habermasian terms, an expert
system is making a set of validity claims which cannot be discursively
questioned by the field auditor, who is not privy to the expert system’s logic
or its designer. The decision structures are imposed through the computer
system. The user is, first, not presented explicitly with this structure and,
second, is not in a position to question or change the structure. Such a
process leads to single-loop, or non-reflective, learning where the practical
and theoretical validity claims are accepted without question as opposed to
double-loop, or reflexive, learning where claims are questioned and verified
through discourse (see Argyris & Schon, 1978). By reinforcing single-loop
learning, expert systems foster an objectivist, value-free orientation. As
technical advancements are driven by economic demands, all other con-
siderations are rendered secondary or are lost.
Expert systems are high-level decision models, but they are decision models
nonetheless. As discussed previously, an implicit assumption is that expert human
judgement can be modelled and thus taken out of the expert’s head. As this
technology is implemented in reaction to economic pressures, the capacity
of auditors to question the validity of the resulting decision recedes along at
least two dimensions. First, the underlying assumptions, or framework, are
not transparent and thus their validity must be assumed. Second, as single-
loop learning is ingrained, it becomes the behavioural norm. Thus, the
technology becomes embedded in both the context of the task and in the
psyche of the individual.
Expert systems may provide a context wherein “absolute conformity” is
demanded and cultivated if the justification of norms is precluded and
individual identity is disregarded. Habermas argues that this technological
calcification inhibits necessary social and technical integration. The technical
(economic) sphere is separated from the social-discussion sphere. The
integrating social mechanisms of communicative rationality and action no
longer function. Changing contexts cannot be adequately addressed from
either an economic or social perspective.

Closing Remarks
Serious criticisms have been levelled at Habermas specifically, and critical
theory generally. (See Held, 1980; Thompson & Held, 1982; Bernstein, 1985;
Pusey, 1987; Fay, 1987.) For example, a major tenet of critical theory is that by
overcoming false consciousness through recognition and understanding,
emancipation from oppressive circumstances may be achieved. Fay (1987),
among others, has taken critical theorists to task for what he considers a
naive, idealistic position. For example, the notion that ideas are the sole
determinant of behaviour is incomplete, and the claim that freedom is
synonymous with happiness may be fallacious. There may also be structural
and physical restraints, such as those imposed by authoritarian regimes or
monopolistic labour markets, on the extent to which reflection and under-
standing can lead to resistance and emancipation. Contrary to Habermas’
theory of communicative action, a traditional economics argument proposes
that participation and acceptance are motivated by enlightened self interest,
not through cooperative discourse. Further, the quest for metatheories of
societal phenomena is outdated Enlightenment philosophy. While these
criticisms of Habermas’ ideas represent both legitimate shortcomings as well
as differences in ontological and epistemological opinion, we hold that useful
insights can be gained from taking a critical perspective of knowledge-based
system applications in auditing.

Expert systems are, in part, the consequence of "technical consciousness".
While they hold potential for enhancing audit efficiency, quality and, perhaps,
training, the application of expert systems may at one level have a negative
effect on the development of the expertise and judgement of field auditors.
The reduction of field auditing’s public sphere may also reduce the quality of
the work environment. Another potential result is Weber’s “iron cage of
bureaucracy”, which may evolve if objective actions are the only ones
considered as the encroachment on the lifeworld becomes complete. If
auditors, engaging solely in instrumental action, lose touch with the
normative and subjective dimensions, distorted understandings and
ultimately distorted actions follow.
Habermas has argued that of the three levels of reality, the technical-
empirical is the level at which advanced industrial societies, and the audit
activity carried out in these societies, function. Action is predicated on
anticipated performance improvements. Improvement is defined in terms of
efficiency and effectiveness. Value and ethical considerations are not included
as primary decision criteria. Technical rationality dominates.
As we stated at the outset, our objective is not to champion “one best way”
but to initiate a dialogue encompassing alternative perspectives. Only as we
begin to consider alternative viewpoints of the human condition will we begin
to comprehend and overcome impediments to a more humane existence.

Notes

1. See Vasarhelyi (1989) and Bailey (1988) for work supporting the claims made concerning
knowledge-based systems applications in the audit work environment.
2. Traditional computer-based systems are those that provide analog technical calculations,
computations and logics that may have no relation to actual human behaviour. Users employ this
information as input to their judgement processes.
3. As will be illustrated in the following discussion, the majority of the systems currently
implemented, and to a lesser extent those under development, fall somewhere between these
two extremes.
4. Examples of such programs include: Prentice-Hall's FAST!, Sequel/McGladrey's ACE, Creative
Solution's Accounting Series, CPAiD and CertiFLEX's PRO 4.0.
5. As previously explained, expert systems and decision support systems are both knowledge-
based artificial intelligence species. The issues raised in the following discussion are
applicable to both; however, they are most obvious for expert systems, since their ultimate
objective is to replace the human expert (Henderson, 1987; Benbasat & Nault, 1988), as
compared to decision support systems whose objective is to reduce the level of expertise
needed by a decision maker.
6. We recognize the possibility for resistance to technological change through various means as
well as the arguments for a certain “technological determinism” which will render organiza-
tional environments more democratic. However, Orlikowski’s (1991) work suggests that as the
technology becomes more abstract, more deeply imbedded in the production process, it
becomes more difficult to recognize and therefore counter.
7. Within an audit context, this refers to the level or amount of inputs, predominately
professional time, expended in gaining an output, for example evaluation of internal control.
8. Within an audit context, this refers to the ability of the procedures employed to produce the
desired result, for example the estimation procedures used in estimating adequate loan loss
reserves.
9. While there is no formal, empirical verification of this opinion, discussions with current field
auditors and audit partners attest to its credibility.
10. Winograd and Flores (1986) is the most comprehensive and is used as the primary source for
the following discussion. All page citations in this section refer to Winograd and Flores (1986)
unless otherwise specified.
11. As a theme central to Habermas' theory of communicative action, commitment commensurate
with speech acts will be discussed in the next section.
12. Winograd and Flores define irresolution as the state that arises in a situation where the
manager experiences conflict in discerning an appropriate action following a break-down
(p. 147).
13. Habermas (1975) is the basic source for this discussion; however, these ideas are evident in
much of his writings.
14. In our discussion we use the term “system” in the global, Habermasian sense. When
referring to expert systems, we so state. Parenthetically, our text can be read first using
Habermas’ global meaning of system and then read using the more specific “expert system”.
Such an exercise is illustrative in that by doing so one begins to realize how Habermas’
abstract ideas concerning general societal rationalization are being concretized via specific
technological applications; in this case, expert audit systems.
15. All page citations in this section refer to Habermas (1975) unless otherwise specified.
16. Habermas defines the production process as extracting natural resources and “transforming
the energies set free into use value”.
17. These claims are central to Habermas’ thinking and are discussed in many of his writings. For
example, see Habermas (1984, pp. 94-102).
18. There is obviously some question as to the extent to which accounting and auditing
standards, as they are now constituted, are the result of free and open discourse. Addressing
this issue, Arrington and Puxty (1991) propose the use of Habermas’ communicative action
framework. The framework would provide a medium for rational, democratic discourse from
which would emerge socially permissible accounting and auditing standards; thus shifting
the focus of setting such standards away from the current technical-rational orientation.
19. For an example of Habermas’ validity claims applied in an audit context see Wright (1990).
20. The Securities and Exchange Acts of 1933 and 1934 require independent attestation in order
to ensure capitalist property rights in a market system dominated by utilitarian
economic paradigms (Merino & Neimark, 1982).

References

Arrington, A. & Puxty, A., "Accounting, Interests, and Rationality: A Communicative Relation",
Critical Perspectives on Accounting, Vol. 2, 1991, pp. 31-58.
Argyris, C. & Schon, D., Organizational Learning: A Theory of Action Perspective (Reading,
Massachusetts: Addison-Wesley, 1978).
Austin, J., How To Do Things With Words (Cambridge, Massachusetts: Harvard University Press,
1962).
Bailey, A., Han, K. & Winston, A., “Technology, Competition and the Future of Auditing,” in A.
Bailey (ed.), Auditor Productivity in the Year 2000, pp. 23-50 (Reston, Virginia: Council of
Arthur Young Professors, 1988).
Benbasat, I. & Nault, B., "Empirical Research in Decision Support and Expert Systems: An
Examination of Research to Date", in A. Bailey (ed.), Auditor Productivity in the Year 2000, pp.
255-304 (Reston, Virginia: Council of Arthur Young Professors, 1988).
Boland, R., “Phenomenology: A Preferred Approach to Research on Information Systems”, in E.
Mumford, R. Hirschheim, G. Fitzgerald and T. Wood-Harper (eds.), Research Methods in
Information Systems, pp. 193-201 (New York: Elsevier Science Publisher, 1985).
Braverman, H., Labor and Monopoly Capital: The Degradation of Work in the Twentieth Century
(New York: Monthly Review Press, 1974).
Broadbent, J., Laughlin, R. & Read, S., “Recent Financial and Administrative Changes in the NHS:
A Critical Theory Approach", Critical Perspectives on Accounting, Vol. 2, 1991, pp. 1-30.
Broderick, J., "A Practical Decision Support System", in A. Bailey (ed.), Auditor Productivity in the
Year 2000, pp. 131-148 (Reston, Virginia: Council of Arthur Young Professors, 1988).
Buckley, J., "Expert Systems in Auditing: Implications of Technological Change on the Auditing
Profession", in A. Bailey (ed.), Auditor Productivity in the Year 2000, pp. 187-194 (Reston,
Virginia: Council of Arthur Young Professors, 1988).
Bernstein, R., Habermas and Modernity (Cambridge, Massachusetts: Polity Press, 1985).
Dillard, J. & Mutchler, J., "Knowledge-based Expert Systems in Auditing", in C. Ernst (ed.),
Management Expert Systems (Reading, Massachusetts: Addison-Wesley, 1988).
Dungan, C., A Model of an Audit Judgment in the Form of an Expert System, unpublished
dissertation, University of Illinois, 1983.
Fay, B., Critical Social Science (Ithaca, New York: Cornell University Press, 1987).
Gadamer, H., Truth and Method, G. Barden and J. Cummings (trans) (New York: Seabury Press,
1975).
Gadamer, H., Philosophical Hermeneutics, D. Linge (trans) (Berkeley, California: University of
California Press, 1976).
Gibbins, M., "Knowledge Structures and Experienced Auditor Judgement", in A. Bailey (ed.),
Auditor Productivity in the Year 2000, pp. 149-170 (Reston, Virginia: Council of Arthur Young
Professors, 1988).
Gibbins, M. & Emery, C., "Good Judgment in Public Accounting: Quality and Justification", in A.
Abdel-khalik and I. Solomon (eds), Auditing Research Symposium 1984, pp. 181-212 (Champaign,
Illinois: University of Illinois at Urbana/Champaign, 1985).
Graham, L., "Overcoming Obstacles to Expert Systems Development", in A. Bailey (ed.), Auditor
Productivity in the Year 2000, pp. 149-170 (Reston, Virginia: Council of Arthur Young
Professors, 1988).
Graham, L., Damens, J. & Van Ness, G., "Developing RISK ADVISOR: An Expert System for Risk
Identification", in Proceedings of the 1990 U.S.C./Deloitte and Touche Audit Symposium, 1990.
Habermas, J., The Theory of Communicative Action, Vols 1 and 2, T. McCarthy (trans) (Boston:
Beacon Press, 1984, 1987).
Habermas, J., Legitimation Crisis, T. McCarthy (trans) (Boston: Beacon Press, 1975).
Habermas, J., Knowledge and Human Interests (Boston: Beacon Press, 1968).
Hansen, J. & Messier, W., “A Preliminary Investigation of EDP-EXPERT”, Auditing: A Journal of
Practice and Theory, Fall, 1986, pp. 109-123.
Hayes-Roth, F., Waterman, D. & Lenat, D., Building Expert Systems (Reading, Massachusetts:
Addison-Wesley, 1983).
Heidegger, M., Being and Time, J. Macquarrie and E. Robinson (trans) (New York: Harper and
Row, 1962).
Held, D., Introduction to Critical Theory: Horkheimer to Habermas (Berkeley, California: University
of California Press, 1980).
Henderson, J., “Finding Synergy Between Decision Support Systems and Expert Systems
Research”, Decision Sciences, Summer, 1987, pp. 333-349.
Howard, R., Brave New Workplace (New York: Penguin, 1985).
Keen, P. & Scott-Morton, M., Decision Support Systems: An Organizational Perspective (Reading,
Massachusetts: Addison-Wesley, 1978).
Kelley, K., Expert Problem Solving for the Audit Planning Process, unpublished dissertation,
University of Pittsburgh, 1984.
Lyytinen, K. & Klein, H., "The Critical Theory of Jurgen Habermas as a Basis for a Theory of
Information Systems", in E. Mumford, R. Hirschheim, G. Fitzgerald and T. Wood-Harper (eds),
Research Methods in Information Systems, pp. 219-236 (New York: Elsevier, 1985).
Merino, B. & Neimark, M., "Disclosure Regulation and Public Policy: A Sociohistorical Reappraisal", Journal of Accounting and Public Policy, 1982, pp. 33-54.
Morgan, G., Images of Organization (Beverly Hills: Sage, 1986).
Newell, A. & Simon, H., Human Problem Solving (New York: Prentice-Hall, 1972).
Orlikowski, W., “Integrated Information Environments or Matrix of Control? The Contradictory
Implications of Information Technology”, Accounting, Management and Information
Technologies, 1991, pp. 9-42.
O’Leary, D. & Watkins, P., “Review of Expert Systems in Auditing”, Expert Systems Review,
Spring-Summer, 1989, pp. 3-22.
Preston, A., “The ‘Problem’ In and Of Management Information Systems”, Accounting,
Management and Information Technologies, 1991, pp. 43-72.
Pusey, M., Jurgen Habermas (New York: Tavistock, 1987).
Searle, J., Speech Acts (Cambridge: University Press, 1969).
Simon, H., Administrative Behavior, 3rd edn (New York: The Free Press, 1976).
Solomon, I., “Multi-auditor Judgment/Decision Making Research”, Journal of Accounting
Literature, 1987, pp. 1-25.
Srivastava, R., Shenoy, P. & Shafer, G., "Belief Propagation in Networks for Auditing", in Proceedings of the 1990 U.S.C./Deloitte and Touche Audit Symposium, 1990.
Steinbart, P., The Construction of an Expert System to Make Materiality Judgments, unpublished dissertation, Michigan State University, 1984.
Summers, E., "Implications of Technological Change on the Accounting Profession," in A. Bailey (ed.), Auditor Productivity in the Year 2000, pp. 199-203 (Reston, Virginia: Council of Arthur Young Professors, 1988).
Thompson, J. & Held, D. (eds), Habermas: Critical Debates (London: Macmillan, 1982).
Turner, J., The Structure of Sociological Theory, 4th edn (Chicago: Dorsey Press, 1986).
Vasarhelyi, M., Artificial Intelligence in Accounting and Auditing (New York: Markus Wiener, 1989).
Wallace, W., "Educating the Partner in the Year 2000", in A. Bailey (ed.), Auditor Productivity in the Year 2000, pp. 245-254 (Reston, Virginia: Council of Arthur Young Professors, 1988).

Willingham, J. & Ribar, G., “Development of an Expert System for Loan Loss Evaluation”, in A.
Bailey (ed.), Auditor Productivity in the Year 2000, pp. 171-186 (Reston, Virginia: Council of
Arthur Young Professors, 1988).
Winograd, T. & Flores, F., Understanding Computers and Cognition (Reading, Massachusetts:
Addison-Wesley, 1986).
Wright, M., “A Habermasian Framework For Analysis of Financial Reporting and Auditing: The
Case of Canadian Banks”, paper presented at the Critical Perspectives Audit Symposium, April,
1990.
