AJOB Neuroscience
Publication details, including instructions for authors and subscription information:
http://www.tandfonline.com/loi/uabn20
To cite this article: Janet Levin (2011): Levy on Neuroscience, Psychology, and Moral Intuitions, AJOB Neuroscience, 2(2), 10–11
To link to this article: http://dx.doi.org/10.1080/21507740.2011.559914
what philosophers typically regard as the proper methodology for moral (or, more generally, philosophical) inquiry that begins with the canvassing of intuitions. Judgments based on one's spontaneous or unreflective responses to imagined or actual situations may indeed be prima facie evidence for a philosophical principle. But they must survive at least a certain amount of further reflective scrutiny before they are typically regarded as serious evidence for that principle, even from what Levy calls the first-person point of view. Only if one's initial judgment about the morality of some action remains intuitive after further reflection will that judgment have any evidential clout.1
Moreover, if the judgment remains intuitive after sufficient further scrutiny, it will have evidential clout no matter how it initially arose. Sometimes an emotion-generated judgment will continue to be endorsed after further scrutiny, sometimes not; but the same is true of judgments generated by reasoning about the alternatives one holds in working memory. Sometimes, for example, emotion-generated responses can introduce one to important features of a situation that one hadn't yet considered, or even noticed, and are thus not in one's working memory (which explains why, among other things, empathy may be a desirable characteristic in a Supreme Court justice). In short, to dismiss an intuition-based moral judgment as irrational or irrelevant because of its source instead of its staying power is to conflate (what is often called) the logic of discovery with the logic of justification.
This conflation also affects Levy's discussion of his second target, namely, intuitions that support the so-called doctrine of double effect (DDE). The DDE holds that it is morally permissible to perform an act that has bad side effects (such as killing in self-defense, or bombing a munitions factory that is next door to a hospital) if those side effects, though foreseen, are not intended. But Levy cites some well-known studies (Knobe and colleagues) that suggest that the intuition-based judgments that would support this doctrine are, as he puts it, "sensitive to [pre-existing moral views] in a way that makes appeal to them question-begging" (3). In particular, Knobe's studies suggest that subjects are more
Address correspondence to Janet Levin, School of Philosophy, University of Southern California, 3709 Truesdale Parkway, Los Angeles,
CA 90089-0451, USA. E-mail: levin@usc.edu
1. For this reason, the between-subjects design of the Hauser experiments that Levy cites may not tell us as much about the pitfalls of
traditional philosophical methods as he suggests.
likely to judge that an agent (in an imagined situation) produces the relevant side effect intentionally if the side effect
strikes them as morally bad than if it strikes them as morally
good. And thus, Knobe and Levy contend, one begs the
question by taking the judgment that a bad side effect was
produced intentionally to be evidence that it is impermissible.
Here, too, Levy concedes that there are some methodological problems with these studies.2 But here, too, the more fundamental problem is that the criticisms based on these findings misconstrue traditional philosophical methodology, which, once again, does not require that subjects' spontaneous or unreflective responses to imagined or actual situations be taken as the last word on the subject, but only as prima facie evidence that must withstand the scrutiny of further reflection. And such further scrutiny typically includes attempts to make one's initial intuitions consistent both with one another and with the responses of others.
It is therefore interesting that at least some recent work in experimental philosophy acknowledges this difference. For example, in another article by Knobe and Nichols (2007) on freedom of action and moral responsibility,3 the authors found various discrepancies in intuition-based judgments, both within individual subjects and between different subjects, that depended on how the intuition-prompting scenarios were described. But they report (Knobe and Nichols 2007, 121) that when the subjects were made aware of these discrepancies, they attempted to make their judgments consistent by giving up one or the other. Another recent discussion of intentional action (Cushman and Mele 2008, 117) cites further studies in which the evidence suggests that people sometimes override or reject their intuitive responses when they fail to align with their consciously held views, especially when the contradiction is made apparent. These findings suggest that when subjects are explicitly made aware of discrepancies in their judgments, they behave much like moral philosophers do when evaluating a putative counterexample to an established thesis; that is, they subject their initial judgments to further scrutiny to see whether they retain their intuitive force.
In short, even if these experimental findings show something interesting about subjects' initial unreflective moral judgments, they have marginal relevance to the (thoughtful) evaluation of moral principles, either by professional philosophers or the folk.
2. Among them: (i) the intuition that an action is done intentionally is not equivalent to the intuition that the agent intended to do it; and (ii) the questions seem to be framed in a way that leads the subjects to the judgments in question. See Cullen (2010) for a general critique of the methodology of many of the so-called experimental philosophers.
3. Also reprinted in Knobe and Nichols (2008). I discuss this article,
and others in the (2008) volume, in my work (Levin 2009).
REFERENCES
Cullen, S. 2010. Survey-driven romanticism. Review of Philosophy and Psychology 1(2): 275–296.
Cushman, F., and A. Mele. 2008. Intentional action: Two-and-a-half folk concepts? In Experimental philosophy, ed. J. Knobe and S. Nichols, 171–188. Oxford: Oxford University Press.
Knobe, J., and S. Nichols. 2007. Moral responsibility and determinism: The cognitive science of folk intuitions. Noûs 41: 663–685.
Knobe, J., and S. Nichols, eds. 2008. Experimental philosophy. Oxford: Oxford University Press.
Levin, J. 2004. The evidential status of philosophical intuition. Philosophical Studies 121: 193–224.
Levin, J. 2009. Experimental philosophy (Critical notice of J. Knobe and S. Nichols, 2008). Analysis Reviews 69(4): 761–769.
Levy, N. 2011. Neuroethics: A new way of doing ethics. AJOB Neuroscience 2(2): 3–9.