WILLIAM G. LYCAN
It is commonly held that a mental state is the mental state that it is in virtue
of the stimuli that characteristically activate it, the responses which it in turn
activates, and other mental entities. It is also considered fruitful to compare
human beings and other sentient organisms to computing machines, and
human mental processes to the computational processes occurring in those
machines. Recent 'functionalist' philosophers of mind have made use of the
latter idea in giving theoretical content to the former, holding that a correct
and complete description of a person's mental life (qua mental) would ap-
proximate an abstract machine program, which program would detail the
functional relations between possible 'inputs', possible 'outputs', and the
various inner states of the person which mediate between the two. Accordingly,
functionalists envision a theory of mind whose explications of individual
mental states will take the form: 'To be in mental state M is to realize or
instantiate machine program P and be in functional state S relative to P'.1
Let us call the view that some such theory is correct 'Machine Functionalism'.
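By way of a purely illustrative sketch (the states, stimuli, and responses below are invented for the occasion and correspond to nothing in any functionalist's actual proposal), such a machine program can be thought of as a finite table pairing each combination of inner state and input with an output and a successor state; 'being in functional state S relative to P' is then a matter of which rows of the table currently govern the system's dispositions:

    # A toy machine table: each entry maps (current state, input) to
    # (output, next state). All names here are illustrative placeholders.
    machine_table = {
        ("S1", "ping"): ("wince", "S2"),
        ("S1", "rest"): ("none",  "S1"),
        ("S2", "ping"): ("groan", "S2"),
        ("S2", "rest"): ("none",  "S1"),
    }

    def step(state, stimulus):
        """Return the output and successor state the table dictates."""
        return machine_table[(state, stimulus)]

    # To 'realize' this program is to have inner goings-on correlated
    # one-one with its states and transitions; to 'be in S2 relative to it'
    # is for the S2-rows to be the ones now in control.
    output, next_state = step("S1", "ping")    # ('wince', 'S2')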
Standard functionalist theories thus imply that having a set of purely rela-
tional properties suffices to make some inner state of a person a mental state.
Critics of functionalism have deplored this consequence. In Section I of this
paper I shall review a particularly vivid and compelling attack on it, put
forward by Jerry Fodor and N. J. Block in recent writings.2 I shall argue that
despite the intuitive appeal of their attempt, it is inconclusive. But I shall go
on in Section II to offer a somewhat similar argument that I believe does
inflict serious damage on Machine Functionalism (hereafter 'MF').
phenomenal feels. Briefly, they charge the functionalist with ignoring qualia.
More specifically, Fodor and Block may be interpreted as arguing that no
specification of an organism's functional organization, however complex,
could suffice to determine the respective qualitative characters of that
organism's pains, visual sensations, and what have you. Two organisms might
well share the same functional organization and all the same machine programs
and still have their visual color spectra inverted relative to each other; in fact,
For all we know, it may be nomologically possible for two psychological states to be
functionally identical (that is, to be identically connected with inputs, outputs, and suc-
cessor states), even if only one of the states has a qualitative content. In this case,
functionalist theories would require us to say that an organism might be in pain even
though it is feeling nothing at all, and this consequence seems totally unacceptable.
(p. 173)
Block goes on to develop this attack on functionalism at length, maintaining
that it is quite possible, or seems quite possible, for something to share the
functional organization of a sentient being and yet not be (phenomenally)
conscious, in the sense of having qualia, at all - there might not be 'anything
it is like to be' that thing, as Nagel puts it,3 while there is certainly something
it is like to be that thing's sentient functional twin.4
It is fairly clear that as it stands this 'absent qualia' argument begs the
question. First, let us not forget that the machine programs we humans
instantiate are incredibly complex, and that any other object or organism
which instantiated them would have a comparably complex physical
structure and behavioral repertoire. Is it really possible to imagine some-
thing's sharing my entire many-levelled functional organization and still not
being conscious in the way that I am? Well, perhaps it is; questions of this
kind are notoriously difficult to answer definitively. But, second, even if we
can imagine Fodor and Block's 'absent qualia' situation, our ability to do this
does not refute MF, for to imagine that situation is simply to imagine MF's
being false. Since MF is a scientific or quasi-scientific theory of mind, our
ability to imagine its falsity is unsurprising, and has no bearing on the reason-
ableness of believing that MF is in fact true.
Quite aware that he has so far succeeded at best in promulgating an
impasse between attractive theory and imaginative intuition, Block gives two
more detailed examples to back up his claim. The first of these is a hypo-
thetical case involving
a body externally like a human body, say yours, but internally quite different. The
neurons from sensory organs are connected to a bank of lights in a hollow cavity in the
head. A set of buttons connect to the motor output neurons. Inside the cavity reside a
group of little men. Each has a very simple task: to implement a 'square' of a reasonably
adequate machine table [program] which describes you. On one wall is a bulletin board
on which is posted a card from a deck in which each card bears a symbol which designates
one of the states specified in the machine table .... Each little man has the task cor-
responding to a single quadruple [= square of a Turing Machine table]. Through the
efforts of these little men, the system described realizes the same (reasonably adequate)
machine table as you do and is thus functionally equivalent to you. (pp. 18-19).
To make the case slightly more personal, suppose that the brain matter were
actually to be scooped out of your own head and that the job of realizing
your present program were instantaneously to be taken over by a waiting
corps of homunculi.
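To see concretely what each little man's 'square' amounts to (again a sketch of my own; the particular quadruples are invented, not taken from Block), each homunculus can be pictured as the keeper of a single quadruple of the machine table, idle except when the posted state card and the lit input happen to be his:

    # Each homunculus owns one quadruple: (state, input, output, next state).
    # The two rules below are invented examples, not part of Block's text.
    QUADRUPLES = [
        ("S1", "light_3", "press_button_7", "S4"),
        ("S4", "light_9", "press_button_2", "S1"),
    ]

    def homunculus(own_quadruple, posted_state, lit_input):
        """One little man's whole job: if his quadruple matches the card on
        the bulletin board and the lit input, press the button and post the
        successor state card; otherwise do nothing."""
        state, inp, out, nxt = own_quadruple
        if (state, inp) == (posted_state, lit_input):
            return out, nxt
        return None

    # homunculus(QUADRUPLES[0], "S1", "light_3") -> ("press_button_7", "S4")

The whole corps, acting on such one-rule tasks, realizes the same table that a single person does.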
In Block's second example,
moral is that a version of MF based even on the most elaborate and sophist-
icated kind of machine program is still too 'liberal' or tolerant a view, in that
it would award consciousness and qualia to organisms which very plainly do
not have them.
Graphic though they are, Block's two examples cannot be considered
conclusive,5 since philosophers who have offered what they believe are good
reasons in favor of accepting MF may well be inclined to swallow the con-
sequence that a homunculi-head or the population of China would be
conscious and have qualia under such bizarre circumstances. Again, the MFist
might point out, it is easy to underestimate the complexity of a human being's
program. For example, one may be misled by thinking that all the Mad
Scientist who hires the homunculi has to do is to get them to realize one
rather complex but manageable Turing Machine; in fact, the homunculi
would have to realize fabulously many distinct machine programs simul-
taneously, right down to the very complicated subroutines that underlie
even so (phenomenologically) simple a conscious state as sharp pain.6
Perhaps it does seem that if someone were to scoop out my head and replace
my brain matter with a corps of homunculi who busied themselves with index
cards realizing my program, I would cease to have qualia - for how could an
actual quale be produced by or in a mere aggregate of dull little men shuffling
index cards about? - and so on. But, as Block himself hints on p. 33, a
parallel point could be urged against brain matter itself: "Each neuron is a
dull, simple little device that does nothing but convey electrical charge from
one spot to another; just stuffing an empty head with neurons wouldn't
produce qualia - immediate phenomenal feels!" But of course it does
produce them, even if we cannot imagine how that happens.
Block defends his cases against this second insinuation of question-begging
as well. I believe the core of his attempt to back up his conservative intuitions
about homunculi-heads is to be found in his remarks on pp. 51-52:
There might well be good reason for supposing that [a homunculi-head system] has
some mental states. Propositional attitudes are an example. Perhaps psychological theory
will identify remembering that P with having 'stored' a sentence-like object which
expresses the proposition that P... Then if one of the little men has put a certain
sentence-like object in 'storage', we may have reason for regarding the system as remem-
bering that P. But what reason of this sort could there conceivably be for regarding the
system as having qualia? I can think of none.
I am not sure what reasons Block intends to pick out as being reasons 'of this
sort' (my italics). Possibly he means reasons based on the knowledge (a) of
II
Let us use the Turing Machine model for simplicity; it seems clear that the
argument will generalize to any version of MF. So Sp is one among the set
of functional or logical states Si mentioned by the relevant Turing Machine
program realized by h. That program also tabulates inputs si and outputs
oi. Therefore,
(2) h realizes Sp. (1, our hypothesis)
(Def) For any i, O receives input ai iff O's 'h' component (viz., h)
receives whatever physical stimulus we would normally identify
Now that our machine program M has been constructed in this way, we
may note that the relation defined by (Def) constitutes a correlation of the
sort described above; so O realizes our defined machine program M. Of
course, O's realizing M is trivially parasitic on the functional doings of O's
component h. But the results are surprising:
A parallel argument could be constructed for any other mental state occurring
in any of O's component homunculi. Therefore, MF entails that a homunculi-
head or other group person such as O would himself be having any mental
state occurring in any of his constitutive horde of homunculi. Were I to turn
out to be a homunculi-head, I would (or could) have thousands of explicitly
contradictory beliefs (and there would be indexical problems too): further,
despite my overwhelming inclination to deny it, I would have conscious
awareness of each of my homunculi's conscious mental lives. But this seems
outrageous and possibly impossible. If so, then it seems MF is far too liberal,
and false.
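The shape of the difficulty can also be put schematically (a sketch of my own devising; the correlation below is only a crude stand-in for (Def), and the state names other than Sp are invented): O counts as realizing a program only because O's 'inputs', 'outputs', and 'states' are simply read off from those of its component h.

    # Sketch of the parasitic 'realization': O is credited with running a
    # program solely because its component homunculus h runs it.
    class Homunculus:
        def __init__(self):
            self.state = "Sp"                # h's current functional state
        def step(self, stimulus):
            self.state = "Sq" if stimulus == "poke" else "Sp"
            return "ouch" if stimulus == "poke" else "quiet"

    class GroupPerson:
        """O 'receives input a_i' iff its h component receives the matching
        stimulus; O's outputs and states are simply h's, relabelled."""
        def __init__(self):
            self.h = Homunculus()
        def step(self, stimulus):
            return self.h.step(stimulus)     # all the functional work is h's
        @property
        def state(self):
            return self.h.state              # so O 'is in' Sp whenever h is

    o = GroupPerson()
    o.step("poke")    # 'ouch'; o.state is now 'Sq', because h's is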
This argument has a faint air of hokum. But it has no obvious flaw. One
might protest the arbitrariness and silliness of my choice of the interpretive
correlation recorded in (Def); but one would then bear the onus of proposing
and motivating a suitable restriction on the kinds of correlation in virtue of
which some physical structure may be said to 'realize' a particular abstract
machine program. This could be tricky. Note that the notion of 'realization'
that I defined above is the going notion, appealed to (so far as I can see) by
every MFist in the literature. The present suggested response to my argument
would prevent the MFist's being able to make do with this nice, crisp formal
NOTES
course, and it further seems to me that the stipulated condition might well be empirically
unreasonable.
6 For a fine and instructive sketch of the functional complexities of pain, see D. C.
Dennett's 'Why You Can't Make a Computer that Feels Pain', Synthese, forthcoming.
7 Block anticipates my appeal to 'a developed psychology' on pp. 51-52: "... nothing
we know about the psychological processes underlying our conscious mental life has
anything to do with qualia. What passes for the 'psychology' of sensation or pain, for
example, is either (a) physiology, (b) psychophysics, ... or (c) a grab-bag of descriptive
studies .... Of these, only psychophysics could be psychology, but psychophysics touches only the
functional aspect of sensation, and not its qualitative character. Psychophysical experi-
ments done on you would have the same results if done on anyone Psychofunctionally
equivalent to you, even if he had inverted or absent qualia. If experimental results
would be unchanged whether or not the experimental subjects had inverted or absent
qualia, they can hardly be expected to cast light on the nature of qualia.
Indeed, on the basis of the kind of conceptual apparatus now available in psychology,
I don't see how psychology in anything like its present incarnation could explain qualia."
But even this defense of Block's antiliberal intuitions begs the question, it seems. For if
someone (viz., the functionalist) is saying precisely that mental states, along with what-
ever qualitative characters they may have, are functional in nature, then that person will
justifiably balk at Block's phrase, "only the functional aspect of sensation, and not its
qualitative character". Of course, here as elsewhere, Block has begged the question
in a clinical sense only if the functionalist has antecedently defended his version of
functionalism; the quality of various functionalists' defenses of their theories is another
matter.
8 A number of philosophers have claimed that this epistemic truism is a conceptually
necessary truth. See my 'Noninductive Evidence: Recent Work on Wittgenstein's "Criteria"',
American Philosophical Quarterly 8 (1971).
9 For example, a referee (for whose acute comments I am grateful) has suggested that
O realizes Sp parasitically and that the MFist may simply require that to be in M an
organism must realize the relevant functional state nonparasitically, in the sense that the
organism must not have any proper part that is also realizing that functional state. This
suggestion is natural, but admits of fairly straightforward counterexamples. To avoid
these, we might refine the notion of 'parasiticness' in one way or another; this would
lead to a further search for counterexamples, and so it goes.
10 I am grateful to Block, to Robert Kraut, to Louis Loeb, and to the members of the
Philosophy Departments of the University of Virginia and the University of North
Carolina at Greensboro for helpful discussions on this topic.