Jesse F. Dillard and Robert Bricker, A Critique of Knowledge-Based Systems in Auditing
Introduction
Recently, there has been keen interest in the application of artificial intelligence technology through knowledge-based systems within the audit work environment1.
A variety of effects have been envisioned, including
reduced costs, improved quality and, generally, a revolution in audit practice. Many believe that expert-mimicking systems will improve audit consistency, ensure that proper audit procedures are carried out in each audit and reduce the likelihood
of human error. Similarly, it is believed that these systems have the potential
to improve audit efficiency and effectiveness greatly. However, such con-
Address for correspondence: Professor J. F. Dillard, Anderson School of Management,
University of New Mexico, Albuquerque, New Mexico 87131-1221, USA.
Received 8 June 1990; revised April 1991 and October 1991; accepted December 1991.
follows from the work of Habermas (1968, 1975, 1984). It applies to our
analysis of knowledge-based expert systems in auditing because it focuses on
the systemic forces motivating, and being motivated by, technological
innovation. Through critical theory it is possible to gain a perspective on the
relationship between technological innovations and social tensions.
Using the three perspectives identified above, the following discussion
explores the potential effects of introducing technology in the audit work-
place. Our purpose is not to advocate one perspective over another. The
pluralism represented in these three perspectives helps to anticipate the
impact of knowledge-based systems as they are implemented in an auditing
environment. The exploration is organized as follows. First, we survey
knowledge-based system development in auditing. The survey indicates that
while current applications are not extensive, the anticipated applications are
immense. The next section explores the motivation and anticipated effect of
knowledge-based systems in the professional workplace from a traditional,
technical-historical perspective. This traditional view is expanded by illumi-
nating system design limitations which become evident with a historical-
hermeneutical perspective. Finally, a critical perspective provides insight by
bringing into focus systemic forces motivating technological innovation and
the corresponding effects on participants’ subjective reality.
System Classification
The distinguishing characteristic of knowledge-based systems, when contrasted with more traditional computer-based systems2, is the centrality of
human expertise as a template for system design and content. System
expertise in knowledge-based systems is derived from a human expert’s
representation of the task domain (cf. Hayes-Roth et al., 1983, p. 32). This
expertise is embodied in fact identification and selection and in heuristic rules
which provide for the systematic search of a problem space. These systems
are designed to perform at a level, in terms of quality, quantity and time,
approaching that of a human expert.
Decision support systems and expert systems may be viewed as two
extremes on a continuum representing the extent to which knowledge-based
systems contain reasoning capabilities. With a decision support system, the
system structures the audit judgement process by selecting and sequencing
the decision processes. The structure reflects the sequence of information
processing behaviours evidenced in expert behaviour; however, the system
itself contains no reasoning or logic capability. In addition to sequencing the
information processing, an expert system contains reasoning capabilities
which would be carried out by a domain expert in making the audit
judgement. The judgements are made by the system. The user needs little, if
any, domain-specific knowledge. The system directs actions. The user is
reduced to deciding whether to accept the directives of the system3.
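The contrast drawn above, in which the expert system itself carries the reasoning and issues directives while the user merely accepts or rejects them, can be sketched with a toy forward-chaining rule engine. Everything in this sketch (the facts, the heuristic rules and the resulting directive) is a hypothetical illustration, not an account of any actual audit system.

```python
# Toy forward-chaining rule engine: a sketch of how an expert system can
# embody reasoning that would otherwise be carried out by a domain expert.
# The audit facts and rules below are invented for illustration only; they
# are not drawn from any professional standard or commercial system.

def forward_chain(initial_facts, rules):
    """Fire every rule whose conditions hold, repeating until no new
    conclusion can be derived (a systematic search of the problem space)."""
    facts = set(initial_facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            # A rule fires when all of its condition facts are established.
            if conclusion not in facts and conditions <= facts:
                facts.add(conclusion)
                changed = True
    return facts

# Hypothetical heuristic rules: (set of required facts, conclusion).
RULES = [
    ({"weak segregation of duties"}, "control risk high"),
    ({"control risk high", "material account balance"},
     "extend substantive testing"),
    ({"extend substantive testing"}, "directive: increase sample size"),
]

# The user supplies observations; the system chains to the directive.
derived = forward_chain(
    {"weak segregation of duties", "material account balance"}, RULES)
print(sorted(f for f in derived if f.startswith("directive")))
# -> ['directive: increase sample size']
```

On this sketch, the user's role collapses to accepting or rejecting the derived directive, which is precisely the reduction of discretion the passage describes.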
While this paper generally considers the effects of knowledge-based
systems on auditors, the discussion focuses on expert systems. We are
248 J. F. Dillard and R. Bricker
We propose that this is also an apt description of the auditors’ emerging work
environment. The next section addresses these implications from a tradi-
tional, technical-historical perspective.
A Historical-Hermeneutical Perspective
Yet, expert systems may have the effect of eliminating such conversation,
leaving problem identification and alternative consideration to the expert
system. Buckley (1988, p. 192) recognized the potential danger of such utter
dependence:
“I fear that auditors, faced with the need to justify everything they do, will
not act in the future without consulting the oracle of the expert system.
What little vestiges of professional judgment remain may be homaged to
the computer”.
Wallace (1988) indicated similar concerns and suggested that auditors may be
tempted to use systems without critically evaluating what they are doing or
what the system is telling them.
Winograd and Flores further contend that even if the decision maker does
not evaluate all available alternatives, as with quasi-rational decision theories
(e.g. Simon, 1976), well-defined processes and a well-defined problem space
within which the alternatives are located are at least implicitly assumed.
However, this is not always the environment of the auditor. The key decisions
made by an auditor during the course of an audit are generally poorly defined,
or understood, and cannot be fully represented within a prespecified problem
space. If Winograd and Flores are correct, the well-defined processes, and the
associated rational outcomes, which are embodied within a system, are more
the result of the knowledge engineer’s a priori analysis, based on his or her
preconceptions, than a reflection of the auditor’s action context.
An auditor is faced with situations which generate a mood of
“irresolution” arising out of actions and the potential for future action.
Irresolution is dealt with through “deliberating conversation”. Deliberating
conversation can be characterized as follows.
Current expert system applications do not, and in fact cannot, recognize this
broader conversational context. If our arguments are correct, expert systems
in auditing are not capable of this most crucial element of professional
judgement, and thus their applications are, at best, inherently limited.
A Critical Perspective
impede social development and lead to crisis. Recognizing the critical balance
between technical advancement and social integration, this discussion initi-
ates an informed discourse addressing the tradeoffs, both explicit and implicit,
of expert systems being applied, and anticipated, within the audit workplace.
The application of Habermas’ work to the field auditor’s work environment
requires the following assumptions. The first is that an analogy can be drawn
between Habermas’ “global society” and the “society of auditors working
within an audit context”. While the audit environment does not have the
cultural or political richness of the “global society”, it can be viewed as a
microcosm of the larger society. Auditors are members of a community
whose members must cooperate in carrying out designated tasks within a
social context. The social context includes both an externally imposed
structure and internally acquired background.
Second, an externally imposed structure containing the interrelated com-
ponents of technology and organization design is imposed on the auditors’
work environment. The implementation of expert systems in the audit work
environment is an excellent example of such an externally imposed structure.
Third, norms, traditions and values are the basis for internal representations
which provide for understanding social discourse. As we have argued earlier,
the auditor’s judgements are a function of the auditor’s background. Fourth,
following from the first three, the theory of communicative action provides a
legitimate medium for critique of the audit environment.
In Legitimation Crisis, Habermas defines crisis as an “objective force that
deprives a subject of some part of his [her] normal sovereignty” (p. 1). In a
crisis, the person is not subjectively involved and has little power to actively
influence the current state of affairs. Crisis resolution occurs with the
liberation of the individual caught in the situation.
As Pusey (1987) points out, Habermas (1975) specifies four different crises
related to technical consciousness: two systemic and two individual. One
systemic crisis is an economic crisis which occurs as technological innovation
becomes unable to provide productivity gains capable of offsetting the
increased market demands. A second systemic crisis is a rationality crisis. As
the system is relied upon to a greater and greater extent, its ability to
provide “rational” decisions in an ever-growing domain begins to break
down. An administrative overload occurs.
The systemic crises give rise to individual or identity crises. A legitimation
crisis arises as the system fails to provide adequate rational decisions in that
necessary validity claims are not considered. As knowledge-based systems
become more prevalent and as they are viewed as extensions of the
bureaucratic hierarchy, the entire administrative structure is subverted. Com-
mitment is undermined.
The loss of commitment, a legitimation crisis, leads to a motivation crisis
where human beings can no longer garner sufficient life meaning, thus
reducing their motivation to continue as active participants. The loss of
meaning and relevance in professional life carries over into views of self
worth. The auditor becomes alienated from his or her work, no longer able to
make the requisite professional commitments because of the absence of
ethical or value considerations. This may currently be a rather extreme
outcome, but it does, we believe, illustrate a legitimate scenario.
Closing Remarks
Serious criticisms have been levelled at Habermas specifically, and critical
theory generally. (See Held, 1980; Thompson & Held, 1982; Bernstein, 1985;
Pusey, 1987; Fay, 1987.) For example, a major tenet of critical theory is that by
overcoming false consciousness through recognition and understanding,
emancipation from oppressive circumstances may be achieved. Fay (1987),
among others, has taken critical theorists to task for what he considers a
naive, idealistic position. For example, the notion that ideas are the sole
determinant of behaviour is incomplete, and the claim that freedom is
synonymous with happiness may be fallacious. There may also be structural
and physical restraints, such as those imposed by authoritarian regimes or
monopolistic labour markets, on the extent to which reflection and under-
standing can lead to resistance and emancipation. Contrary to Habermas’
theory of communicative action, a traditional economics argument proposes
that participation and acceptance are motivated by enlightened self interest,
not through cooperative discourse. Further, it is argued, the quest for metatheories of societal phenomena is outdated Enlightenment philosophy. While these
criticisms of Habermas’ ideas represent both legitimate shortcomings as well
as differences in ontological and epistemological opinion, we hold that useful
insights can be gained from taking a critical perspective of knowledge-based
system applications in auditing.
Notes
1. See Vasarhelyi (1989) and Bailey (1988) for work supporting the claims made concerning
knowledge-based systems applications in the audit work environment.
2. Traditional computer-based systems are those that provide analog technical calculations,
computations and logics that may have no relation to actual human behaviour. Users employ this information as input to their judgement processes.
3. As will be illustrated in the following discussion, the majority of the systems currently
implemented, and to a lesser extent those under development, fall somewhere between these
two extremes.
4. Examples of such programs include Prentice-Hall’s FAST!, Sequel/McGladrey’s ACE, Creative Solutions’ Accounting Series, CPAiD and CertiFLEX’s PRO 4.0.
5. As previously explained, expert systems and decision support systems are both knowledge-
based artificial intelligence species. The issues raised in the following discussion are
applicable to both; however, they are obvious for expert systems since their ultimate
objective is to replace the human expert (Henderson, 1987; Benbasat & Nault, 1988), as
compared to decision support systems whose objective is to reduce the level of expertise
needed by a decision maker.
6. We recognize the possibility for resistance to technological change through various means as
well as the arguments for a certain “technological determinism” which will render organizational environments more democratic. However, Orlikowski’s (1991) work suggests that as the technology becomes more abstract, more deeply embedded in the production process, it becomes more difficult to recognize and therefore counter.
7. Within an audit context, this refers to the level or amount of inputs, predominantly
professional time, expended in gaining an output, for example evaluation of internal control.
8. Within an audit context, this refers to the ability of the procedures employed to produce the
desired result, for example the estimation procedures used in estimating adequate loan loss
reserves.
9. While there is no formal, empirical verification of this opinion, discussions with current field
auditors and audit partners attest to its credibility.
10. Winograd and Flores (1986) is the most comprehensive and is used as the primary source for
the following discussion. All page citations in this section refer to Winograd and Flores (1986)
unless otherwise specified.
References
Gadamer, H., Truth and Method, G. Barden and J. Cummings (trans) (New York: Seabury Press,
1975).
Gadamer, H., Philosophical Hermeneutics, D. Linge (trans) (Berkeley, California: University of
California Press, 1976).
Gibbins, M., “Knowledge Structures and Experienced Auditor Judgement”, in A. Bailey (ed.), Auditor Productivity in the Year 2000, pp. 149-170 (Reston, Virginia: Council of Arthur Young Professors, 1988).
Gibbins, M. & Emery, C., “Good Judgment in Public Accounting: Quality and Justification”, in A.
Abdel-khalik and I. Solomon (eds), Auditing Research Symposium 1984, pp. 181-212 (Champaign, Illinois: University of Illinois at Urbana-Champaign, 1985).
Graham, L., “Overcoming Obstacles to Expert Systems Development,” in A. Bailey (ed.) Auditor
Productivity in the Year 2000, pp. 149-170 (Reston, Virginia: Council of Arthur Young
Professors, 1988).
Graham, L., Damens, J. & Van Ness, G., “Developing RISK ADVISOR: An Expert System for Risk Identification”, in Proceedings of the 1990 USC/Deloitte and Touche Audit Symposium, 1990.
Habermas, J., The Theory of Communicative Action, Vols 1 and 2, T. McCarthy (trans) (Boston:
Beacon Press, 1984, 1987).
Habermas, J., Legitimation Crisis, T. McCarthy (trans) (Boston: Beacon Press, 1975).
Habermas, J., Knowledge and Human Interests (Boston: Beacon Press, 1968).
Hansen, J. & Messier, W., “A Preliminary Investigation of EDP-EXPERT”, Auditing: A Journal of
Practice and Theory, Fall, 1986, pp. 109-123.
Hayes-Roth, F., Waterman, D. & Lenat, D., Building Expert Systems (Reading, Massachusetts:
Addison-Wesley, 1983).
Heidegger, M., Being and Time, J. Macquarrie and E. Robinson (trans) (New York: Harper and
Row, 1962).
Held, D., Introduction to Critical Theory: Horkheimer to Habermas (Berkeley, California: University
of California Press, 1980).
Henderson, J., “Finding Synergy Between Decision Support Systems and Expert Systems
Research”, Decision Sciences, Summer, 1987, pp. 333-349.
Howard, R., Brave New Workplace (New York: Penguin, 1985).
Keen, P. & Scott-Morton, M., Decision Support Systems: An Organizational Perspective (Reading,
Massachusetts: Addison-Wesley, 1978).
Kelley, K., Expert Problem Solving for the Audit Planning Process, unpublished dissertation,
University of Pittsburgh, 1984.
Lyytinen, K. & Klein, H., “The Critical Theory of Jurgen Habermas as a Basis for a Theory of
Information Systems”, in E. Mumford, R. Hirschheim, G. Fitzgerald, and T. Wood-Harper (eds),
Research Methods in Information Systems, pp. 219-236 (New York: Elsevier, 1985).
Merino, B. & Neimark, M., “Disclosure Regulation and Public Policy: A Sociohistorical Reapp-
raisal”, Journal of Accounting and Public Policy, 1982, pp. 33-54.
Morgan, G., Images of Organization (Beverly Hills: Sage, 1986).
Newell, A. & Simon, H., Human Problem Solving (New York: Prentice-Hall, 1972).
Orlikowski, W., “Integrated Information Environments or Matrix of Control? The Contradictory
Implications of Information Technology”, Accounting, Management and Information
Technologies, 1991, pp. 9-42.
O’Leary, D. & Watkins, P., “Review of Expert Systems in Auditing”, Expert Systems Review,
Spring-Summer, 1989, pp. 3-22.
Preston, A., “The ‘Problem’ In and Of Management Information Systems”, Accounting,
Management and Information Technologies, 1991, pp. 43-72.
Pusey, M., Jurgen Habermas (New York: Tavistock, 1987).
Searle, J., Speech Acts (Cambridge: Cambridge University Press, 1969).
Simon, H., Administrative Behavior, 3rd edn (New York: The Free Press, 1976).
Solomon, I., “Multi-auditor Judgment/Decision Making Research”, Journal of Accounting
Literature, 1987, pp. 1-25.
Srivastava, R., Shenoy, P. & Shafer, G., “Belief Propagation in Networks for Auditing”, in Proceedings of the 1990 USC/Deloitte and Touche Audit Symposium, 1990.
Steinbart, P., The Construction of an Expert System to Make Materiality Judgments, unpublished
dissertation, Michigan State University, 1984.
Summers, E., “Implications of Technological Change on the Accounting Profession,” in A. Bailey
(ed), Auditor Productivity in the Year 2000, pp. 199-203 (Reston, Virginia: Council of Arthur
Young Professors, 1988).
Thompson, J. & Held, D. (eds), Habermas: Critical Debates (London: Macmillan, 1982).
Turner, J., The Structure of Sociological Theory, 4th edn (Chicago: Dorsey Press, 1986).
Vasarhelyi, M., Artificial Intelligence in Accounting and Auditing (New York: Markus Wiener, 1989).
Wallace, W., “Educating the Partner in the Year 2000”, in A. Bailey (ed.), Auditor Productivity in
the Year 2000, pp. 245-254 (Reston, Virginia: Council of Arthur Young Professors, 1988).
Willingham, J. & Ribar, G., “Development of an Expert System for Loan Loss Evaluation”, in A.
Bailey (ed.), Auditor Productivity in the Year 2000, pp. 171-186 (Reston, Virginia: Council of
Arthur Young Professors, 1988).
Winograd, T. & Flores, F., Understanding Computers and Cognition (Reading, Massachusetts:
Addison-Wesley, 1986).
Wright, M., “A Habermasian Framework For Analysis of Financial Reporting and Auditing: The
Case of Canadian Banks”, paper presented at the Critical Perspectives Audit Symposium, April,
1990.