
COMPUTING AND ACCOUNTABILITY

Helen Nissenbaum
A teacher stands before her sixth-grade class demanding to know who shot the spitball in her ear. She threatens punishment for the whole class if someone does not step forward. Eyes are cast downward and nervous giggles are suppressed as a boy in the back row slowly raises his hand.
The boy in the back row has answered for his actions. We do not know whether he shot at the teacher intentionally or merely missed his true target, whether he acted alone or under goading from classmates, or even whether the spitball was in protest of an unreasonable action taken by the teacher. While all of these factors are relevant to determining a just response to the boy's action, the boy, in accepting responsibility for his action, has fulfilled the valuable social obligation of accountability.
In an increasingly computerized society, where computing, and its broad application, brings dramatic changes to our way of life and exposes us to harms and risks, accountability is more important than ever. A society (or professional community) that insists on accountability, in which agents are expected to answer for their work, signals esteem for high-quality work and encourages diligent, responsible practices. Furthermore, where lines of accountability are maintained, they provide the foundations for just punishment as well as compensation for victims. By contrast, the absence of accountability means that no one answers for harms and risks. Insofar as they are regretted, they are seen as unfortunate accidents, consequences of a brave new technology. As with accidents due to natural disasters such as hurricanes and earthquakes, we sympathize with the victims' losses, but do not demand accountability.
This article maintains that accountability is systematically undermined in our computerized society, which, given the value of accountability to society, is a disturbing loss. While this systematic erosion of accountability is not an inevitable consequence of computerization, it is the inevitable consequence of several factors working in unison: an overly narrow conceptual understanding of accountability, a set of assumptions about the capabilities and shortcomings of computer systems, and a willingness to accept that the producers of computer systems are not, in general, fully answerable for the impacts of their products. If not addressed, this erosion of accountability will mean that computers are "out of control" in an important and disturbing way. This article attempts to explain why there is a tendency toward diminished accountability for the impacts, harms, and risks of computing, and it offers recommendations for reversing it.
My concern over accountability has grown alongside the active discussion within and about the computer profession regarding the harms and risks to society posed by computers and computerized systems. These discussions appeal to computer professionals, to the corporate producers of computer systems, and to government regulators, to pay more heed to system safety and reliability in order to reduce harms and risks [1, 12-14, 16, 18-22, 28]. Lives and well-being
are increasingly dependent on computerized systems. Greater numbers of life-critical systems such as aircraft, spacecraft, other transportation vehicles, medical treatment machines, military equipment, and communications systems are controlled by computers. Increasing numbers of "quality-of-life-critical" systems, from the vast information systems (IS) supporting infrastructures of govern-
