
The Moral Responsibilities of Scientists (Tensions between Autonomy and Responsibility)
Author(s): Heather E. Douglas
Source: American Philosophical Quarterly, Vol. 40, No. 1 (Jan., 2003), pp. 59-68
Published by: University of Illinois Press on behalf of the North American Philosophical Publications
Stable URL: https://www.jstor.org/stable/20010097


This content downloaded from


85.96.130.68 on Mon, 20 Sep 2021 18:13:37 UTC
All use subject to https://about.jstor.org/terms
American Philosophical Quarterly
Volume 40, Number 1, January 2003

THE MORAL RESPONSIBILITIES


OF SCIENTISTS
(TENSIONS BETWEEN AUTONOMY
AND RESPONSIBILITY)
Heather E. Douglas

I. Introduction

In the general philosophical literature, the question of moral responsibility evokes issues of general competence (when is a person morally capable of making decisions and thus being responsible for them), coercion (what kind of forces on a person make their choices not their own, but due to someone else, thus shifting moral responsibility), and causation (what conception of causality allows for both enough free will and enough foresight so that we can be considered responsible for the outcomes of our actions). (See Arnold 2001; Fischer and Ravizza 1993; and Paul, Miller, and Paul 1999.) The question of the moral responsibility of scientists, however, does not hinge on these general issues. Scientists are assumed to be generally capable moral agents; we do not believe that scientific training somehow impairs moral reasoning or moral sentiments. Scientists are not usually under threat of force to not consider certain moral issues; coercion would be considered as pathological in science as it would anywhere else. And the issue of causation applies just as well for scientists as for anyone else; either you believe in a causal structure that allows for moral responsibility in general, or you do not.

Instead of being about these general issues, the moral responsibility of scientists hinges on issues particular to professional boundaries and knowledge production. The question is what we should expect of scientists qua scientists in their behavior, in their decisions as scientists engaged in their professional life. As the importance of science in our society has grown over the past half-century, so has the urgency of this question. The standard answer to this question, arising from the Freedom of Science movement in the early 1940s, has been that scientists are not burdened with the same moral responsibilities as the rest of us, i.e., that scientists enjoy "a morally unencumbered freedom from permanent pressure to moral self-reflection" (Lübbe 1986, 82).

This essay will challenge the traditional view.1 First, a basic framework in which to assess the moral responsibilities of scientists will be developed. This framework suggests that the basic tensions concerning the moral responsibilities of scientists arise because of a tension between role responsibilities and general responsibilities


(defined below). Contrary to the traditional view, role responsibilities do not, and cannot, trump the general responsibilities scientists have as human moral agents. Scientists are left with a choice: either accept the burden of general responsibilities themselves, or lose much of their much prized autonomy in allowing others to take on the burden for them.

There are two general bases for moral responsibilities in modern life: there are the general moral responsibilities that each of us holds as humans/full moral agents, and there are the role responsibilities that arise from our taking on particular positions in society. Usually, in practice, role responsibilities expand our set of responsibilities. For example, a person who becomes a parent takes on special responsibilities towards their children, but receives no lessening in other responsibilities as a result. Sometimes, responsibilities (arising from both role and general bases) must be weighed against each other. In some cases, however, role responsibilities call for a contraction of general responsibilities. Thus, a defense lawyer is not obligated to report past criminal activity that they learn about from their client (it being protected under lawyer-client confidentiality), whereas the rest of us would have a moral obligation to report such activity. This works only because of the general structure under which our justice system works. Because of the rigid adversarial structure, lawyers have reduced general responsibilities in response to increased role responsibilities. Without the rigid structure, that both defines the defense lawyer's role as client advocate and assigns to other parties the job of properly uncovering crime, such a reduction in general responsibility would not be possible.

It has been argued or assumed that scientists have a similar type of role. Such an argument has generally centered on an absolute need for complete freedom of inquiry by scientists. As Percy Bridgman, Harvard physicist and Nobel Prize recipient, argued in the early years of the atomic age:

    The challenge to the understanding of nature is a challenge to the utmost capacity in us. In accepting the challenge, man can dare to accept no handicaps. That is the reason that scientific freedom is essential and that the artificial limitations of tools or subject matter are unthinkable. (Bridgman 1947, 153)

The knowledge that scientists produce is so valuable to society, Bridgman suggests in his essay, that we must relinquish other claims of social or moral responsibility on scientists so that they can produce this valued end. Scientists, under this view, not only have a need for autonomy (i.e., the ability to be the primary decision-makers in determining the direction of their work), but also have a need to be free from considering the potential consequences of that work beyond the realm of science.

Does the social role that scientists play, the development and presentation of new knowledge, call for a release of scientists from general moral responsibilities to think about the potential consequences of one's actions and choices? As will be argued below, it does not. The social structures that would allow for such a reduction in general moral responsibilities are not in place, and if they did exist, would almost eliminate the autonomy of scientists. Because of the nature of scientists' work, scientists must choose between shouldering the responsibilities themselves and giving up decision-making autonomy to allow others to do so.

II. Negligence, Recklessness, and Responsibility

Before examining the role responsibilities of scientists, an examination of our


general moral responsibilities is needed. What do we mean when we say a person is morally responsible for some action or outcome? The first distinction central to this examination is between causal responsibility and moral responsibility. Moral responsibility involves the attribution of blame or praise. This essay will focus for simplicity on blame, although in general there should be broad symmetries between what warrants praise and what warrants blame. The attribution of blame is not equivalent to the attribution of cause. While some kind of causal connection is necessary, we are not held morally responsible for all the things in which we play a causal role.

Minimally, we are morally responsible for those things we intend to bring about. However, we are not morally responsible merely for those things we intended to bring about. We are also morally responsible to some extent for side effects of our actions. While this is widely accepted, it is a difficult question under what circumstances and to what extent we should be responsible for unintended consequences. Two general categories cover unintended consequences: recklessness and negligence. The definitions used here follow Feinberg 1970 and general legal usage. As Feinberg wrote: "When one knowingly creates an unreasonable risk to self or others, one is reckless; when one unknowingly but faultily creates such a risk, one is negligent" (1970, 193). When one is reckless, one is fully aware of the risks one is taking or imposing on others, and those risks are unjustified. What justifies certain risks, particularly when the risks involve other people not given decision-making authority, can be contentious. Nevertheless, there are clear examples of justified risk (speeding on city streets to get a seriously injured person to the hospital) and unjustified risk (speeding on city streets for the fun of it). The key point is that we expect moral agents to carefully weigh such risks and to determine whether they are, in fact, justified.

If, on the other hand, one is not aware that one is risking harm, but one should be aware, then one is being negligent. When being negligent, one does not bother to evaluate obvious risks of harm, or one does not think about potential consequences of one's actions. As Feinberg noted, there are many ways in which to be negligent:

    One can consciously weigh the risk but misassess it, either because of hasty or otherwise insufficient scrutiny (rashness), or through willful blindness to the magnitude of the risk, or through the conscientious exercise of inherently bad judgment. Or one can unintentionally create an unreasonable risk by failing altogether to attend either to what one is doing (the manner of execution) or to the very possibility that harmful consequences might ensue. (1970, 193-194)

The difficulty with negligence, in addition to determining whether a risk is justified, is to determine what should be expected of the agent. How much foresight and careful deliberation should we expect the individual to have? Often, this question is answered through an examination of community standards, couched in terms of what a reasonable person would have done in like circumstances.

Through recklessness and negligence, one can be held responsible for unintended consequences both when things go the way one expects them to and when things go awry. Through negligence, things may go exactly as planned (as far as you planned them), and still harmful and clearly foreseeable consequences would be your fault. You would be responsible because you should have foreseen the problems and planned further. For example, suppose you set fire to a field one dry summer to clear it of brush. You didn't bother to think about how to control the fire, not recognizing the


obvious risk. Because of your negligence, damage caused by the fire raging beyond your property is your moral responsibility. If, on the other hand, you are aware of the risks in setting the fire, decide not to care, and proceed anyway, then you are responsible for the damage when the fire escapes, because of recklessness. The distinction between recklessness and negligence thus rests on the thought processes of the agent, on whether they reflect on potential consequences (assuming both that events go as planned and that unexpected "errors" occur) and on whether there was any acknowledgement of possible harms that could arise from the agent's action. Recklessness is the acceptance of unreasonable risk; negligence is the failure to foresee and prevent such risk.

How might recklessness and negligence apply to scientists? Let us assume that scientists bear the full general moral responsibilities to not be reckless or negligent. (Reasons for such a view are presented below.) This would mean that scientists should think about the potential consequences of their knowledge-producing activities. If some new piece of knowledge could lead to both good and harm, then scientists should think about those harms and goods, and act in ways to ensure that the good outweighs the harm.2 If the magnitude or severity of the harm is too great and the probability too high to be compensated for by the potential goods, then the act is reckless, imposing unjustified risks. Scientists should also think about the potential consequences of error, and be sure that they are properly acting to prevent those consequences or to mitigate the possible impacts of error. To fail to think about how things might go wrong, or to fail to act reasonably to prevent possible harms due to error, would be negligent.

In order to apply negligence to scientists, however, we must decide what level of foresight we might expect from scientists. Happily, the reasonable person standard might actually be easier to apply to scientists than to the average citizen, particularly with respect to the issue of foresight. Scientists have fairly tight-knit epistemic communities and routinely discuss the potential implications of their work. It is rare that a scientist has not thought about some of the potential consequences of their actions as scientists. Reflecting on the proper methodologies and crafting one's papers to elicit certain responses (particularly in arguing for the relevance of one's research) is part of the trade. Discussions concerning risks and implications fill journal pages. Because of this aspect of scientific communities, it is probably easier to ascertain what a scientist should have thought about before performing a risky experiment (or even whether it was perceived as risky by others) than in the broader social context. However, scientists may not be accustomed to full reflection on potential implications of their work. Demanding that scientists shoulder their general responsibilities may require deeper reflection from them. Nevertheless, because of scientific communities, it could be fairly straightforward to ascertain whether a scientist should have perceived certain potential consequences.

Whether scientists should even consider broader social implications and other impacts of their work is contested. Under the standard views of negligence, to not consider foreseeable implications opens one to attributions of negligence and blame. If scientists do foresee implications for a particular case where the apparent risks outweigh the apparent benefits, and they then proceed anyway, they can be held responsible for consequences on the basis of recklessness. Thus, to what extent scientists should be held accountable to the general moral requirements imposed by recklessness


and negligence is crucial. And this issue hinges on whether role responsibilities trump these general moral responsibilities to consider unintended consequences.

III. The Role Responsibilities of Scientists

If the primary goal of science is to develop knowledge about the world, then the role responsibilities of scientists should be structured around this goal. Not surprisingly, this is largely the case. Also not surprisingly, these responsibilities are not hotly contested among scientists, but instead are largely accepted as part of, or essential to, being a scientist and the proper functioning of science. For example, scientists expect honest reporting of data, and cases of dishonesty in such reporting are universally condemned as fraud and dealt with harshly in the scientific community (once proven). There is no debate on whether such dishonesty is acceptable or whether it is harmful to science.3

Other responsibilities also come with being a scientist. It is expected that scientists will share important results with each other to further the field. (However, this expectation is being directly challenged as more research is being kept secret because of business interests. See, e.g., Maker 1994.) Scientists are also expected to respond in some way to valid criticism. It is unacceptable in science to simply ignore (over a long period of time) evidence that runs contrary to one's views. One must account for it in some way, by finding fault with it or accommodating it, or calling for further investigation with greater accuracy. Simply ignoring other people's works or arguments is not acceptable in science. In addition, when asked, it is expected that scientists will serve as peer reviewers for their fellow scientists' work (either for publication or for funding proposals). This is a time-consuming and thankless job, but one seen as essential to maintaining internal scientific standards, and thus an accepted responsibility. Finally, scientists accept responsibility for training the next generation of scientists properly.

These specific role responsibilities all serve the general goal of the search for truth (or reliable knowledge about the world).4 It seems clear that this goal defines the role responsibilities of scientists. What is at issue here is whether this responsibility, this goal, obliterates other responsibilities scientists have as human beings and capable moral agents.

IV. General Responsibility vs. Role Responsibility for Scientists

It has been argued above that we have a general responsibility to consider both intended and unintended consequences of our actions. Whether we are negligent in considering potential consequences or reckless in our disregard for potential harms, we are held responsible for consequences if our actions are unjustifiable or unreasonable. Whether scientists are subject to such general moral reflection is not clear. As mentioned in the introductory section, the standard view of the past half-century has been that they are not. In this section, it will be argued that this view is mistaken.

One way to protect scientists from having to think about the potential impact of their work is to argue that scientists pursue new knowledge (or truth), and that this pursuit is so important, nothing should get between scientists and their pursuit. This argument has seen many forms, from arguments concerning free inquiry to the freedom in science movement of the 1940s. Under these views, what scientists do is so important, they should not be hampered by social, moral, or political considerations. The pursuit of knowledge/truth trumps all


other values, and allows scientists to shed their general responsibilities in favor of their role responsibilities. Someone else, politicians or ethicists or society outside of science, should worry about the implications, but without interfering with the activity of scientists.

This picture fails for two reasons. First, knowledge (or the pursuit of truth) does not trump all other values. If it did, we would happily submit our children to scientists who wished to use them for biochemical testing, and no moral limits on methodologies would be in play. But truth is not so valuable to us that we are willing to do this, despite the fact that controlled human testing would be the best and perhaps only way to fully understand the full biological impacts of chemical substances, for example. That there are prices we are not willing to pay for knowledge, or the search for truth, means it is not an ultimate value existing on a plane above all others. The categorical pursuit of truth is unacceptable. This does not mean that the pursuit of truth is not valuable, or that it is not one of the preeminent values of our society. It simply means that, in general, other values deserve to be considered as well.

Second, the potential implications of science go well beyond the applications of scientific knowledge. While it may be possible that those outside of science can control the uses of applied science in the form of technology, the impact of knowledge per se is far less controllable. As many have pointed out, knowledge is not passive. It can, by its very existence, change our self-conception, our options and responsibilities, and our view of the world (Sinsheimer 1979; Wachbroit 1994; Johnson 1996). Outsiders to science cannot control a piece of research's epistemic and ethical impact after the research is complete. Only scientists (or other insiders) can exert such control, by choosing to pursue particular research programs, or by framing the presentation of the results, emphasizing some implications over others.

In addition, some of the choices scientists make in pursuing knowledge occur deep in scientific practice, in the fine-tuning of methodologies, the characterization of data, or the interpretation of data. That the scientist makes these choices, with potentially important implications and potential consequences of error, may not even be apparent in the published final report (Douglas 2000). The places of choice and control by scientists with important social and ethical implications go far beyond the ostensibly publicly controllable application of science in new technologies. Someone must be responsible for thinking about the potential consequences at these decision points, or the general responsibilities go completely neglected.

An abandonment of these general responsibilities in science is neither warranted nor desirable. Indeed, the consequences could well be catastrophic. One striking example of scientists seriously and successfully paying attention to their general responsibilities arose during the development of atomic weapons. Whatever one may think of the morality of building such weapons, the test of the first plutonium bomb was not just a test of a new technology. It was also a decisive test of some of the physical principles that went into the development of the bomb, from the fissionability of plutonium to the calculations behind the implosion device developed by Kistiakowsky. It was also an experiment about what happens when you produce an explosive chain reaction in the atmosphere. No one had done this before, and there were some worries. One worry that was thought up well before the test, and worked on by Hans Bethe, was that the energy in the explosion might produce an explosive chain reaction in the earth's atmosphere, thus


obliterating human life on earth. Happily, the scientists had not only thought of this potential outcome; Bethe pursued the possibility and determined it was scientifically impossible (Rhodes 1986, 419). Who else but the scientists immersed in the project could have foreseen this and determined it was nothing to worry about?5

Another standard example of scientists taking on their general responsibilities is the concern scientists raised over recombinant DNA techniques and the resulting Asilomar conference. (See Culliton 1979.) In both these cases, scientists, while doing science, reflected on the potential unintended consequences and found the risks unacceptable. Before proceeding with the development of science, they paused, and made sure that the harmful consequences either were nearly impossible, or figured out ways to make them so.

It is doubtful that anyone could fully take over this function for scientists. Because science's primary goal is to develop knowledge, this goal invariably takes scientists into uncharted territory. While the science is being done, presumably only the scientist can fully appreciate the potential implications of the work, and, equally important, the potential errors and uncertainties in the work. And it is precisely these potential sources of error, and the consequences that could result from them, that someone must think about. The scientists are usually the most qualified to do so. Partial sharing of general responsibilities is possible, as is now done with ethical review boards and clear standards for dealing with human subjects. But the more that scientists relinquish their general responsibilities, the more they must relinquish their autonomy.

In order to see why, consider what would be required if we implemented a system that provided ethical oversight of all scientific decisions, in order to remove the burden of general responsibilities from scientists. The consideration of non-epistemic consequences could be neither an afterthought to the research project nor a process merely at the start of the project if the general responsibilities are to be properly fulfilled. Instead it would have to be an integral part of it, being involved throughout the research project. Those shouldering the general responsibilities to consider social and ethical consequences of research (or of errors in research) would have to have decision-making authority with the scientists, in the same way that research review boards now have the authority to shape methodological approaches of the scientists when they are dealing with human subjects. However, unlike these review boards, whose review takes place at one stage in the research project, those considering all of the non-epistemic consequences of scientific choices would have to be kept abreast of the research program at every stage (where choices are being made), and would have to have the authority to alter those choices when necessary. Otherwise the responsibility would not be properly fulfilled, and would not be able to keep pace with the developments accompanying discovery. Such intensive oversight, however, would devastate any remaining autonomy in science.

In sum, the structure of science cannot shield scientists from their general responsibilities without sacrificing their autonomy. Currently, there are no adequate social structures that could fully take over this function from scientists, if we thought it desirable to do so. And to fully develop such a structure would involve a very high level of oversight and possible intervention in science, at every decision point scientists have. Such oversight would obliterate scientific autonomy. This does not mean scientists should be fully on their own when grappling with these difficult issues


raised by general responsibilities. When research programs are well mapped out ahead of time, general public discussion of the potential impacts can help scientists shoulder the burden. But precisely because they are in the forefront of an ever-changing field, they cannot fully relinquish their general responsibilities without fully losing autonomy.

Because of the awesome power of science to change our world, our lives, and our conception of ourselves, the actual implementation of scientists' general responsibilities will fall heavily on them. With full awareness of science's efficacy and power, scientists must think carefully about the possible impacts and potential implications of their work. Although there is no qualitative difference between this responsibility and the responsibility of automobile drivers to proceed with due care and caution, the quantitative burden is much greater. The ability to do harm (and good) is much greater for a scientist, and the terrain almost always unfamiliar. The level of reflection such responsibility requires may slow down science, but such is the price we all pay for responsible behavior. The driver may need to take more time getting to his destination; similarly, the scientist may need to take more time in developing her research and determining how to present the results.

However, it must be emphasized that the responsibility to reflect on potential consequences of one's work does not arise because science has such power. The basis for this responsibility is one all humans share as moral agents. Scientists are not excused from this responsibility because they are scientists searching for truth. Truth is not such a transcendent good that only truth matters. And the social structure that could relieve scientists of their general responsibilities is neither in place (as it would need to be before scientists could give up such responsibilities) nor could it ever fully be in place, without a complete loss of scientific autonomy. Scientists must shoulder at least some of this general responsibility. The power of science simply makes the fulfillment of the responsibility more urgent. Thus, scientists, with the power they wield through their endeavors, must show due care in the execution of the general responsibility not to be reckless or negligent. Contrary to the tendencies of the past century, scientists should put more thought into the consequences of their actions as scientists because they are scientists, particularly social consequences, not less.

V. Conclusion: Difficult Weighings Ahead

While the search for truth is not a transcendent good, it still is a good. When considering both intended and unintended consequences of one's work, the development of knowledge is an important and worthy goal for scientists, but it must be weighed against other goods, including basic human rights, quality of life, and environmental health.

From the responsibility to weigh and find the right balance of goods, it does not follow automatically that some research should be forbidden. Scientists (and nonscientists) need to weigh carefully the goods of developing knowledge with the risks that might be involved. Sometimes scientists will decide that the good of the knowledge to be gained will outweigh the necessary risks; other times they will not. And sometimes scientists will figure out ways to mitigate the potential risks so that unacceptable risks become acceptable (e.g., recombinant DNA). Because developing knowledge is a good, the vast majority of research programs should be continued, but the behavior of scientists in


performing their work may need to be modified.

In addition, the picture of the moral responsibilities of scientists painted here makes no direct claim on whether pursuing a particular piece of knowledge should be forbidden. What should be clear from the analysis above is that it is theoretically possible for some areas of research to be forbidden because the risks of harm are judged to outweigh the potential benefits. Whether or not it makes sense to forbid scientists (and others) from pursuing some piece of knowledge has been addressed by others recently (Johnson 1996; Kitcher 2001, 93-108). In the examples put forth by Johnson and Kitcher, it becomes apparent how difficult it is to make a case that the pursuit of some knowledge should be forbidden. Only very particular social circumstances in which the basis of free inquiry itself is threatened by a research program (as in Kitcher's example) or some other similarly threatening possibility (such as knowledge that can only be used to produce egregious harms) seems to rise to the occasion of justifying a prohibition. This is because we cannot be sure what might be discovered in a research program, knowledge is a value, and there are so many other ways of mitigating the harmful effects of knowledge. Despite this complexity, there is too much at stake for the general responsibilities of scientists to be neglected. Careful reflection and justification of scientific decisions is needed.

Nor does this essay imply that scientists must give up their autonomy. It is doubtful an oversight system that would adequately address the general responsibilities could be effectively implemented. It does mean, however, that scientists cannot run their profession on the basis of isolation from society; they must accept their general responsibilities that are an inescapable part of their work and meet them fully. Only by meeting their responsibilities will scientists be able to maintain the autonomous decision-making so hard fought for after the Second World War.

University of Puget Sound
NOTES
The author would like to acknowledge Lori Alward, Denis Arnold, Ted Richards, and Jim Wilson for their invaluable assistance with the ideas and presentation in this paper. Remaining muddles and missteps are the author's alone. The author also gratefully acknowledges the National Science Foundation (SDEST grant #0115258) and the University of Puget Sound's Martin Nelson Junior Sabbatical Fellowship for their generous support, which made this work possible.
1. This view is also expressed in Merton 1938, 261, and Wueste 1994, 2, to be challenged by both.

2. However one weighs those harms. While utilitarian frameworks for negligence have been
standard, Simons 1999 argues for deontological frameworks in evaluating unjustified risk.
3. The issue has been raised whether fraud in science is encouraged by institutional arrangements that place too much emphasis on the pursuit of credit. There is also debate on the extent of fraud in science. See Callahan 1994.

4. Some generally accepted responsibilities do not serve the goal of knowledge, but instead act to protect the boundaries and autonomy of science. For example, scientists are expected to maintain the ethical standards of proper research, particularly when dealing with living subjects, and scientists have agreed to come under systematic regulatory review in order to protect those ethical standards. Such review keeps scientists from having to shoulder this heavy burden themselves, and also keeps unwanted scrutiny and interference out of science. While this arrangement forces scientists to give up some autonomy, in the long run, it protects scientific autonomy.

5. If one is desperate to shield scientists from their general moral responsibilities, one could argue that scientists should only think about the potential consequences of their work when developing methodologies. Such methodological considerations would prevent scientists from using unethical methodologies, and would perhaps keep scientists from inadvertently bringing catastrophe upon the world (as in the bomb case). Under this view, once the methodology is set up, scientists should cease to think about the potential consequences of their work and of its possible errors. Such a move requires that one can make a clean distinction between a methodology selected at the beginning of a study (when one would be required to think of consequences) and choices made later in the study (when one would not). Such a clean distinction seems to me untenable. Studies rarely go perfectly as planned, and judgments must be made along the way on whether to keep intact or modify original methodological choices. In addition, events may occur that are unexpected and for which no methodological protocols are in place. Why should only expected events receive full consideration and unexpected ones not? Science in practice is too fluid for such clean distinctions.

REFERENCES
Arnold, Denis. 2001. "Coercion and Moral Responsibility." American Philosophical Quarterly,
vol. 38, no. 1, pp. 53-67.
Bridgman, P. W. 1947. "Scientists and Social Responsibility." Scientific Monthly, vol. 65, no. 2,
pp. 148-154.
Callahan, Joan. 1994. "Professions, Institutions, and Moral Risk." In Wueste, pp. 243-270.
Culliton, Barbara. 1979. "Science's Restive Public." In Holton and Morison, pp. 147-156.
Douglas, Heather. 2000. "Inductive Risk and Values in Science." Philosophy of Science, vol. 67,
pp. 559-579.
Feinberg, Joel. 1970. Doing and Deserving (Princeton, N.J.: Princeton University Press).
Fischer, John Martin, and Mark Ravizza (eds.). 1993. Perspectives on Moral Responsibility (Ithaca, N.Y.: Cornell University Press).
Holton, Gerald, and Robert Morison (eds.). 1979. Limits of Scientific Inquiry (New York: W. W.
Norton).
Johnson, Deborah G. 1996. "Forbidden Knowledge and Science as Professional Activity." Monist, vol. 79, no. 2, pp. 197-218.
Kitcher, Philip. 2001. Science, Truth, and Democracy (New York: Oxford University Press).
Lübbe, Hermann. 1986. "Scientific Practice and Responsibility." In Facts and Values, edited by Doeser and Kray (Dordrecht: Martinus Nijhoff).
Maker, William. 1994. "Scientific Autonomy, Scientific Responsibility." In Wueste, pp. 219-241.
Merton, Robert. 1938. "Science and the Social Order." Philosophy of Science, vol. 5, pp. 321-337; reprinted in The Sociology of Science (Chicago: University of Chicago Press, 1973), pp. 254-266.
Paul, Ellen Frankel, Fred Miller, and Jeffrey Paul (eds.). 1999. Responsibility (Cambridge: Cambridge University Press).
Rhodes, Richard. 1986. The Making of the Atomic Bomb (New York: Simon and Schuster).
Simons, Kenneth. 1999. "Negligence." In Paul, Miller, and Paul, pp. 52-93.
Sinsheimer, Robert L. 1979. "The Presumptions of Science." In Holton and Morison, pp. 23-35.
Wachbroit, Robert. 1994. "Human Genetic Engineering and the Ethics of Knowledge." In Wueste,
pp. 205-218.
Wueste, Daniel (ed.). 1994. Professional Ethics and Social Responsibility (London: Rowman and Littlefield).
