Received: 20 September 2017 | Revised: 23 January 2018 | Accepted: 9 February 2018

DOI: 10.1002/wcs.1463

OVERVIEW

The mind–body problem


Bryan Chambliss

Department of Philosophy, University of Arizona, Tucson, AZ

Correspondence
Department of Philosophy, University of Arizona, PO Box 210027, Tucson, AZ 85721.
Email: bccham@email.arizona.edu

Abstract
The mind–body problem is the problem of explaining how the happenings of our mental lives are related to physical states, events and processes. Proposed solutions to the problem vary by whether and how they endorse physicalism, the claim that mental states are ultimately “nothing over and above” physical states, and by how they understand the interactions between mental and physical states. Physicalist solutions to the mind–body problem have been dominant in the last century, with the variety of physicalism endorsed (reductive or nonreductive) depending upon both the outcome of philosophical arguments and methodological developments in the cognitive and neural sciences. After outlining the dominant contemporary approach to the mind–body problem, I examine the prospects for a solution in light of developments in the cognitive sciences, especially the scientific study of consciousness.

This article is categorized under:
Philosophy > Consciousness
Philosophy > Metaphysics
Philosophy > Foundations of Cognitive Science

KEYWORDS

consciousness, functionalism, philosophy of mind, physicalism

1 | INTRODUCTION

That humans have experiences is beyond question. For most of human history, the best available means of studying the mind
involved attentively introspecting these experiences, noting their causes and effects. But relating the emotions, thoughts, and
perceptions of our mental lives to the operations of our physical bodies is a tricky business. The mind–body problem seeks
an explanation of how mental states and physical states are related to one another. An adequate explanation is assumed to
take the form of an account of the relation that holds between mental and physical states. Giving this account has proven dif-
ficult, as mental states and physical states appear to have heterogeneous features. So understood, the mind–body problem has
a distinguished history as a philosophical problem, and developing responses to the problem sets the agenda for much of
contemporary philosophy of mind (Armstrong, 1999; Braddon-Mitchell & Jackson, 2006; Campbell, 1984; Kim, 2010).
Although its origins are in philosophy, the mind–body problem has not developed in isolation. Mental states and processes
are now the subject of flourishing research programs carried out in the cognitive and neural sciences. Despite the humbling
complexity of both cognitive processes and neural systems, much has been learned about each, and about how neural mecha-
nisms explain mental processes. To some, this suggests that we need not solve the mind–body problem by developing a meta-
physical account of the relation between mental states and brain states. Instead, explaining mental processes in terms of neural
mechanisms, without developing an allied metaphysical account, appears to dissolve the problem as a relic of an outdated
means of studying the relation between mental states and brain states. The problem can now be addressed scientifically.
However, reflecting on the prospects of systematically developing such explanations makes clear that the mind–body
problem brings with it a methodological quandary just as real as the metaphysical one. There is something deeply puzzling,
and maybe even intractable, about the mind–body problem. Nonetheless, an adequate understanding of the problem must
take account of developments in the cognitive and neural sciences. Thus, any solution to the mind–body problem must
appreciate both why the problem is so deeply puzzling, and how the relevant sciences might be brought to bear upon it. This
article will examine the contemporary mind–body problem, our mind–body problem, in light of certain scientific develop-
ments. What we find is neither a cloistered philosophical issue, nor a relic to be swept aside, but a living problem addressed
by philosophers and scientists alike.

2 | THE MIND–BODY PROBLEM

How are mental states related to physical states? This mind–body question construes “mental states” broadly to include vari-
ous aspects of our mental life, including experiential states (like smelling freshly brewed coffee), folk psychological states
(like beliefs or desires), emotions (like anger), and the perceptual and cognitive processes studied by cognitive scientists (like
semantic memory). “Physical states” are states of physical systems, where “physical” is also construed broadly to include not
merely the kinds of systems studied by physics, but also those studied by the “natural sciences” more generally (which, in
addition to physics, include chemistry, biology, and the neurosciences).
Answering mind–body questions presents a problem insofar as our mental states have features not readily explainable in
terms of physical states, and vice versa.1 Consequently, generating a mind–body problem requires appreciating features of
both mental and physical states, and realizing that each is unsuited to explaining the other. Delineating the features of physi-
cal states presents complications, and I will simplify matters by dispensing with talk of physical states more generally, and
instead discuss how mental states are related to what happens in our brains.2 Brain states have spatial locations, weights,
shapes, and electrical and chemical properties. More carefully characterizing the features of physical states is itself a substan-
tive philosophical undertaking, and it should not be taken for granted that we have an articulate understanding of what “the
physical” amounts to (Chomsky, 2009; Montero, 1999). But even this simplified take on “the physical” illustrates that para-
digmatic features of mental states cannot readily be explained in terms of brain states.
What are the problematic features of mental states? Two such features are intentionality and consciousness. The inten-
tionality of mental states consists in their being directed at objects or states of affairs, or representing things as being some
way (Dennett, 1987; Fodor, 1987; Searle, 1983). For example, consider the belief that Austin is in Texas. This belief is about
two objects, the city of Austin and the state of Texas, and it represents one object (Austin) as being contained inside the other
(Texas). Furthermore, this belief is accurate, even true, if the objects are related as the belief represents them as being, viz.,
if Austin is in Texas. It is false otherwise. The intentionality of a mental state like belief is to be understood in terms of the
objects of the mental state, and the properties and relations it represents these objects as instantiating.
Conscious mental states both embody one's subjective perspective and have an experiential feel (Kriegel, 2009; Levine,
2001). My own conscious states are mine in a way that other people's mental states cannot be. For example, if I have a head-
ache, you might be able to think about my headache, but you cannot experience my headache directly as I do when having
it. It is, in this sense, my headache. The subjectivity of a conscious state captures this sense of “mineness” or ownership. In
addition to their subjectivity, conscious mental states have a qualitative feel. To illustrate, contrast the experience of smelling
coffee with a different olfactory experience, like smelling pickles. Call the way it “feels” to have each experience its phe-
nomenal character. Two experiences, like smelling coffee and smelling pickles, are different types of experiences because
they differ in phenomenal character. Following Thomas Nagel, philosophers often combine the subjectivity and phenomenal-
ity of conscious experience into a slogan: there is something-it-is-like-for-me to undergo a conscious mental state
(Nagel, 1974).
The mind–body problem is the problem of explaining how our mental life is related to the physical states, events and
processes that occur in our brains and bodies. Giving this explanation constitutes a problem as it is unclear how the state of a
brain, or any physical object, could be about anything at all, let alone accurate or true. It is even less clear that brain states
could be, or give rise to, conscious states. Moreover, the lack of clarity does not reside in the details—at first glance, brain
states do not even appear to be the right kinds of things to explain mental states (or vice versa).
But this appearance could be little more than a failure of imagination, manifesting our incomplete understanding of our-
selves and the physical world. The mind–body problem solidified as it became clear that plausible claims about mental states
and brain states could not be jointly held. We can represent the problem as a set of independently plausible, but collectively
inconsistent claims3:

(1) Brain states are physical states.
(2) Mental states are nonphysical states.
(3) Mental states and brain states causally interact.
(4) Physical states and nonphysical states cannot causally interact.

There are compelling reasons to believe each of these claims, but endorsing any three implies the falsity of the fourth.
Claims (1)–(4) are logically inconsistent, and thus cannot collectively be true. Assuming that reality is not mysterious, or
contradictory, something must go. But what?
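To see the inconsistency explicitly, here is one minimal first-order regimentation (the abbreviations are my own, introduced only for this sketch): let $Bx$, $Mx$, and $Px$ abbreviate "$x$ is a brain state," "$x$ is a mental state," and "$x$ is physical," and let $Ixy$ abbreviate "$x$ and $y$ causally interact."

$$(1)\ \forall x\,(Bx \to Px) \qquad (2)\ \forall x\,(Mx \to \neg Px) \qquad (3)\ \exists x\,\exists y\,(Mx \wedge By \wedge Ixy) \qquad (4)\ \forall x\,\forall y\,\big((\neg Px \wedge Py) \to \neg Ixy\big)$$

Taking witnesses $m$ and $b$ from (3), claim (2) gives $\neg Pm$, claim (1) gives $Pb$, and claim (4) then gives $\neg Imb$, contradicting (3). Dropping any one claim blocks the derivation, which is why endorsing any three forces rejection of the fourth.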

3 | TRADITIONAL APPROACHES TO THE MIND–BODY PROBLEM

Solutions to the mind–body problem seek a principled resolution to this inconsistency. Adequate solutions do not seek an
account on which (1)–(4) are true, as such an account cannot be given. Instead, an adequate solution yields a principled
rejection of one (or more) of the claims, and a defense of the remainder. There is some reason to believe each of (1)–(4), so
solving the problem hinges upon the costs of rejecting a proposition. To help weigh these costs, consider the problems facing
some traditional approaches.
Claim (1) might appear to be trivial, but it is not universally accepted. Philosophers who endorse idealism reject the claim
that brain states are physical (in the operative sense). But few contemporary philosophers, and almost no cognitive scientists,
are idealists. Thus, while arguments concerning the truth of (1) are important for assessing scientific realism or objectivity,
they lie beyond the scope of this paper. In what follows, I assume that (1) is true.
More relevant is whether, in addition to brain states, everything else is physical. Physicalism claims that mental states are
ultimately “nothing over and above” the neural states described by a mature science of the brain. Much as chemical com-
pounds are ultimately made up of the entities described by physical theory, the physicalist claims that neural states are “made
up” out of the states described by physical theory, albeit via intervening steps. Exactly how to understand physicalism is a
controversial matter (Dowell, 2006; Stoljar, 2010; Wilson, 2006). However, all solutions to the mind–body problem that
endorse physicalism see mental states as ultimately based in physical states.
If physicalism is true, then (2) is false. A long philosophical tradition has endorsed physicalism (called materialism before
the development of contemporary physics) as a solution to the problem. But physicalist accounts have traditionally faced a
problem: minds and bodies seem to be distinct things, without shared parts or properties. Minds have features that physical
systems seem to lack (like intentionality and phenomenal character), and minds seem to lack features that physical states
have (like size, location, and chemical properties). This makes determining which physical thing(s) minds might be seem
akin to determining how many of the natural numbers can fit into your office. Absent a compelling account of how minds
and brains are related, merely rejecting (2) is insufficient to solve the problem.
But what are minds, if not physical objects? One possibility is that minds are things altogether different from physical
objects. Substance dualism contends that minds are mental substances, and endorses (1) and (2) on the grounds that mental
and physical substances are fundamentally different kinds of substance (Foster, 1991; Lowe, 1996; Swinburne, 1997). Tradi-
tionally, substances are understood to be things that possess properties and can persist through changes of these properties.
Accordingly, substance dualism claims that minds are nonphysical things that exist independently of any physical objects,
including brains.
In his Meditations on First Philosophy, René Descartes defended a form of substance dualism by arguing that there are
two fundamentally different kinds of substances: mental substances, which have thoughts and experiences but are not located
in space–time, and physical substances, which are spatiotemporally located (Descartes, 1985). Because they have heteroge-
neous properties, no mental substances are also physical substances, and no physical substances are also mental substances.
Despite these differences, Descartes endorsed (3), maintaining that some mental phenomena cause physical phenomena and
vice versa. According to this Cartesian or interactionist form of substance dualism, human persons are “unions” of a mind
and a body, and human perception and action are accomplished via the causal interaction of mental and physical substances.
Accordingly, Cartesian dualism rejects (4). But explaining these causal interactions between mental and physical sub-
stances poses two problems. First, one must explain how a mental substance, which is nonphysical, could causally interact
with a physical substance like a brain. Due to the nature of their differences (e.g., physical substances are spatially located
while mental substances are not), it is unclear how these substances could causally interact (Richardson, 1982; Tollef-
sen, 1999).
Second, many take the physical domain to be causally closed: if we were to map the causal history of any physical event,
we would never find among its causes a nonphysical event. If true, every physical effect has a set of purely physical causes
sufficient for bringing it about. What role in explaining physical events does causal closure leave for mental causes? If they
change something in the physical world, mental causation risks running afoul of physical law. For example, the conservation
law of energy states that while energy might change form as a system evolves, the total amount of energy in an isolated sys-
tem does not change. Thus, to take a simplified case, if a brain is treated as a physical system, and mind to body causation is
understood in terms of a change in the total amount of energy in this system, mental causation of physical events appears to
violate physical law. But if mental causes do not change anything in the physical world, then it remains unclear that they do
any genuine explaining. The status of this sort of anti-dualism argument remains a matter of debate.4 But many philosophers
have drawn the following lesson: even if we understood how mental and physical substances might interact, it is not clear
that such interactions could explain physical events like neural events or bodily actions.
The inability to explain the interactions between mental and physical substances is widely taken as a devastating objec-
tion to Cartesian dualism. But it also introduces a recurring problem for putative solutions to the mind–body problem: the
problem of mental causation (Heil & Mele, 1993; Kim, 1998; Yablo, 1992). The idea that our mental states are causally
efficacious—both caused by our environment, and causes of our behavior—is deeply ingrained in our understanding of the
world. Most solutions to the mind–body problem seek to capture this idea by explaining how mental states could cause
behavior. But giving a coherent account of mental causation is difficult. Thus, while substance dualism is widely rejected
because of its problems accounting for mental causation, it cannot be taken for granted that other positions fare better. The
various problems of mental causation recur throughout discussions of the mind–body problem.
One response is to deny that mind–body causal interaction takes place. Epiphenomenalism argues that mental states have
no causal influence on nonmental states (Jackson, 1982; Robinson, 2004). Epiphenomenalism shows up most clearly in the
discussion of the causal efficacy of conscious thought. For example, the 19th century biologist T.H. Huxley compared the
working of the conscious mind to a steam-whistle caused by the operation of an engine—mental operations are caused by
the workings of neurophysiological mechanisms, but like the steam they do not have any causal influence upon the operation
of the neural “engine” (Huxley, 1874). An epiphenomenalist solution to the mind–body problem generalizes this conclusion
about conscious thought to all mental states, and rejects (3).
Rejecting (3) resolves the inconsistency of (1)–(4), but at substantial cost. It is hard to believe that my experience of eat-
ing hot salsa does not cause me to drink water (or better, milk). Furthermore, forms of epiphenomenalism that endorse
(1) and (2) claim that minds really exist, that they are nonphysical, and that they do not cause anything physical. But, given
what they have turned out to be, shouldn't the epiphenomenalist just scrap her commitment to minds? The failure to find any
acceptable alternative might force us to epiphenomenalism, but, until then, it is preferable to accommodate mind–body causal
interaction.
Rather than rejecting mind–body causation, one might instead avoid it. If there are no mental states, then there is no
mind–body causation to explain, and no positive account needed of how mental states and brain states relate. Eliminativism
denies that there are mental states, thus rejecting (2) and (3). One eliminativist proposal that exerted considerable influence
upon 20th century philosophy and psychology is behaviorism.
Behaviorism is a family of views, each of which attempts to understand the behavior of organisms by downplaying
appeals to inner processes or experiential episodes, and emphasizing patterns in publicly observable behavior. Behaviorist
views arose in response to the claim that one's mental states could only be accessed via introspection, and thus any science
of psychology must make use of introspection. Early behaviorists agreed that mental states could only be accessed introspec-
tively, but argued that using introspective methods kept psychology from becoming a scientific discipline. Radical
(or methodological) behaviorism concludes that one ought to do psychological science in a way that makes no mention of
mental states or processes, thus transforming psychology into the science of behavior (Skinner, 1951; Watson, 1913).
Radical behaviorism registers an early point of contact between the sciences of the mind and the mind–body problem. If
the science of psychology does not find explanatory work for mentality, then there is no genuine problem of relating mental
states to the physical world. Or so the thought goes. By avoiding appeal to mental states, radical behaviorism motivates a
solution to the mind–body problem. However, this form of behaviorism has been widely rejected. The development of psy-
chology in the latter half of the 20th century is largely the story of scientists finding it useful (even necessary) to posit a com-
plex network of mental states in order to explain behavior (Baars, 1986). As such, the cognitive sciences themselves suggest
that we cannot sidestep the mind–body problem merely by replacing the investigation of mental states with the study of
behavior.
Alternative behaviorist strategies might seem more promising. Instead of relying upon claims about scientific methodol-
ogy, logical (or philosophical) behaviorism offers a behaviorist theory of the meaning of mental state terms (Carnap, 1932;
Ryle, 1949). According to logical behaviorism, attributions of mentality are properly understood not as making claims about
the internal states of the organism, but as making claims about the organism's behavior. Thus, to attribute a mental state to an
organism is just to claim that the organism is exhibiting, or is disposed to exhibit, appropriate patterns of behavior. For exam-
ple, to say that Andre desires breakfast tacos is to say that Andre has a certain behavioral disposition, say, the disposition to
exhibit taco-acquiring behavior in appropriate circumstances.
Logical behaviorism has some appeal, but has largely been abandoned in light of its failure to wholly rid itself of refer-
ence to internal mental states. By treating talk of mental states as referring to dispositions, the logical behaviorist evokes a
physicalist strategy for explaining behavior. Patently physical objects have dispositions. For example, glass is fragile, which
means that glass is disposed to break if struck with sufficient force, etc. If mental states explain behavior in much the same
way that fragility explains the breaking of glass, then the logical behaviorist can allow that “mental states” explain behavior
without committing to the existence of mental states (conceived as internal states of the organism).
The problem is that giving these explanations requires specifying the conditions under which dispositions are manifested
and doing so involves appealing to further mental states (Putnam, 1975a). For example, suppose that Andre desires breakfast
tacos, but does not exhibit taco-acquiring behavior because he believes that all the tacos in the restaurant have been poi-
soned. Spelling out the appropriate circumstances in which Andre's behavioral disposition will be manifest requires appealing
to further mental states, which must themselves be given a behaviorist treatment. A resilient logical behaviorist might claim
that each mental state ascription involves a complex web of other mental state ascriptions, and thus that mental state terms
must be translated en masse. But this solution seems forced, and compelling “behavioral translations” of mental state attribu-
tions have not been produced. As behaviorist approaches lost their grip upon psychological science, logical behaviorism lost
steam as well.
Traditional approaches to the mind–body problem yield a metaphysical quandary. Simply eliminating mental states does
not furnish a promising solution, but neither does granting that minds exist as a distinct kind of entity. But if either minds or
mental states exist, and are causally efficacious, what might they be? The contemporary response to this quandary has been
to put aside the traditional questions about what kind of entity minds might be, and to focus on how mental states relate to
brain states. Furthermore, contemporary approaches reject accounts that treat mental states as distinct from the physical
world. Thus, minds are no longer understood as distinct entities generated by or housed in the brain, and mental states are
taken as a part of the physical world itself. The ambition of contemporary approaches to the mind–body problem is to lever-
age the developing sciences of mind and brain to articulate this physicalist account of mental states.

4 | CONTEMPORARY APPROACHES TO THE MIND–BODY PROBLEM

Most contemporary approaches to the mind–body problem deny that mental states are distinct from the physical world. They
reject (2) and give an account of what, in a physical world, mental states are. As Jaegwon Kim puts it, “Through the 1970s
and 1980s and down to this day, the mind–body problem—our mind–body problem—has been that of finding a place for the
mind in a world that is fundamentally physical” (Kim, 1998). Since mid-century, solutions to the mind–body problem have
reoriented themselves and are no longer primarily construed as explanations of how physicalism could be true, given the
reality of mental states, but of what mental states could be, given that physicalism is true.
A natural suggestion is that mental states are brain states. The mind–brain identity theory (also called central-state materi-
alism, the brain state theory and type physicalism) claims that each type of mental state is identical to a type of brain state
(Place, 1956; Smart, 1959). For example, an identity theorist might claim that having a migraine headache is identical to hav-
ing some specified pattern of neural activity N. The claim is not merely that each token instance of having a migraine
(e.g., the migraine I had at 6:45 last night) is identical to some token brain state (e.g., the pattern of neural activity that gave
rise to that migraine), but the stronger claim that each type of mental state (e.g., having a migraine headache) is identical to a
specified type of neural state (e.g., the neural process N).
The identity theory claims that, all appearances to the contrary, mental states are brain states. This is neither a semantic
claim, for example, the claim that mental state terms are synonymous with brain state terms, nor the loose claim that mental
states are made up from, caused by, or realized by brain states. The identity theorist contends that empirical discoveries will
reveal that types of mental states are identical to types of brain states. Thus, much as we have discovered that lightning is a
kind of electrostatic discharge, the developing neurosciences will discover mind–brain type-identities.
The discovery of these identities would not only dissolve the tension between (1) and (4), but demystify mental causation
as well. Although the brain is incredibly complicated, there is no principled problem in explaining how brains causally inter-
act with the physical world, or how brain states cause other brain states. Accordingly, if mental states are identical to brain
states, there is no principled problem in explaining mental causation either.
Despite its considerable appeal, the mind–brain identity theory has been widely rejected. By identifying mental states
with brain states, the identity theory reduces mental states to brain states. The sense of reduction at work here requires speci-
fication, and there are (at least) two ways of reducing mental states to brain states. An ontological reduction of mental states
to brain states claims that mental states are, or are made of nothing more than, brain states. A semantic reduction of mental
states to brain states claims that the explanatory purchase of theories about mental states can (ultimately) be accomplished by
neuroscientific theories.5 We can put this distinction aside for the moment because identity is a blunt instrument of
reduction—it drives both ontological and semantic reductions. The identity theory has been widely rejected because identify-
ing mental states with brain states runs afoul of two powerful trends in philosophy and the cognitive sciences: nonreductive
physicalism and functionalism.

4.1 | Nonreductive physicalism and the functionalist orthodoxy


There is no philosophical consensus concerning the nature of mental states, but current orthodoxy endorses some form of
nonreductive physicalism (Baker, 2009; Pereboom, 2002). All physicalists contend that mental states are “nothing over and
above” physical states, and address the mind–body problem by rejecting (2). Nonreductive physicalists add that mental states
cannot be type-identified with physical states. Thus, nonreductive physicalists reject (2) because while mental states are not
identical to brain states, the connection between the two is tight enough to count as a form of physicalism. Whether any such
relation can be specified is a major point of contention among contemporary approaches.
The basic idea of nonreductive physicalism is that while mental states are not distinct from brain states, they are nonethe-
less different from them (not identical to them). A macroscopic object like a table might not be distinct from the microscopic
particles that make it up; nonetheless, tables are different from clouds of such particles. In similar fashion, mental states are
not distinct from the brain states that instantiate them; nonetheless, mental states and brain states are different kinds of states.
The difference between mental states and brain states can be articulated in terms of the multiple realizability of mental states.
A mental state is multiply realizable when it can be realized by different neural mechanisms or materials. This might arise
from cross-species comparisons: although certain aspects of human long-term memory are intimately tied to the hippocam-
pus, some birds seem to have long-term memory despite lacking that structure (Putnam, 1975b). Alternatively, it might arise
from individual differences within a species: if normal human color vision is realized in different individuals by distinct con-
figurations of red and green photopigments in the retina, then normal human color vision is not merely multiply realizable,
but multiply realized (Aizawa & Gillett, 2009).
Multiple realizability yields an argument for nonreductive physicalism (Fodor, 1974; Putnam, 1975b). If mental states are
multiply realizable, then mental states cannot be type-identified with brain states. (This is true because if a mental state type
were identical to each of the several types of brain states that realize it, then by the transitivity of identity, those distinct brain
state types would be identical to one another—a contradiction.) Mental states are multiply realizable, or so the argument contends, thus mental states cannot be
type-identified with brain states. Because the type-identity of mental states and brain states is a necessary condition of reduc-
ing mental states to brain states,6 reductive physicalism is rejected. Accordingly, the argument contends, those defending
physicalism should defend a nonreductive form.
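Schematically, in my notation rather than any author's official formulation: if a mental type $M$ were type-identical to two distinct realizing brain types,

$$M = B_1 \ \wedge\ M = B_2 \ \Rightarrow\ B_1 = B_2 \quad (\text{by symmetry and transitivity of identity}),$$

contradicting the assumption that $B_1 \neq B_2$. A multiply realized mental type is therefore identical to no single brain state type.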
Early advocacy for multiple realizability was not typically based upon empirical evidence suggesting that mental states
are multiply realized, but on the intuition that mental states could be realized in different physical substrates. This intuition
was bolstered by the early development of the cognitive sciences, which understood mental processes in terms of their com-
putational or information-processing structure. By construing mental processes in terms of the functions being performed,
and in turn construing these functions in terms of computation or information processing, the neural systems that implemen-
ted these functions could largely be ignored when studying the mental processes themselves. Thus, explanation of mental
processes proceeded “autonomously” from the explanation of the neural systems implementing these processes in the sense
that explanations given in terms of mental states can be genuinely explanatory regardless of whether or not they reduce to
explanations in terms of the neural systems that implement these mental states.7
Multiple realizability and the “autonomy of psychology” fit hand in glove with a philosophical theory about the nature of
mental states called functionalism. Functionalism is a family of theories about what kind of thing mental states are (Fodor,
1968; Lewis, 1972; Putnam, 1975c). Mental states are parts of larger cognitive systems, and functionalists contend that what
makes something the mental state it is (and not another) is the way it functions within the cognitive system of which it is a
part. For example, to specify the function of a mental state M, a functionalist might list the inputs that cause M, the interac-
tions of M with other mental states, and the outputs that are caused by M. These inputs, interactions, and outputs comprise
the functional role of M, and anything that performs this functional role is an instance of M. To give an example, if one spec-
ified the function of pain, then any state that performs that function is an instance of pain, regardless of what material it is
made from.
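To make the proposal concrete, here is a toy sketch in Python (my illustration; the class and method names are invented for the example, not drawn from any scientific model). A functional role is specified as an interface relating inputs to outputs, and anything implementing that interface counts as an instance of the same mental state type, whatever its substrate:

```python
from abc import ABC, abstractmethod

class PainRole(ABC):
    """A cartoon functional role: map tissue damage (input) to avoidance
    behavior (output). Anything performing this role occupies it."""

    @abstractmethod
    def respond(self, tissue_damage: bool) -> str:
        ...

class NeuralRealizer(PainRole):
    # The role as realized by (stylized) carbon-based neural machinery.
    def respond(self, tissue_damage: bool) -> str:
        return "withdraw and wince" if tissue_damage else "carry on"

class SiliconRealizer(PainRole):
    # The very same role realized by entirely different hardware.
    def respond(self, tissue_damage: bool) -> str:
        return "withdraw and wince" if tissue_damage else "carry on"

def is_in_pain(state: PainRole) -> bool:
    # Functionalist individuation: only role performance matters,
    # not what the realizer is made of.
    return state.respond(tissue_damage=True) == "withdraw and wince"

print(is_in_pain(NeuralRealizer()), is_in_pain(SiliconRealizer()))  # True True
```

The structural point is that is_in_pain is blind to which class realizes the role, just as, on the functionalist picture, being in pain is blind to whether the role is performed by neurons or by something else.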
Functionalism has considerable appeal. Aside from accommodating multiple realization and the autonomy of psychology,
functionalism suggests a means of understanding mental causation. Functionalism is compatible with physicalism; thus, if
mental states are real and not distinct from physical things, then mental states can have causes and effects in the physical
world. This remains true even though mental states are not identical to physical states.
The appeal of functionalism gave rise to the functionalist orthodoxy, which endorses both nonreductive physicalism and
functionalism. While widely endorsed, the functionalist orthodoxy is not a complete solution to the mind–body problem.
Whatever its initial aspirations, functionalism is primarily an account of what mental states are, not an account of how mental
states are related to brain states. While functionalism is compatible with physicalism, it does not entail physicalism. And
while nonreductive physicalism denies that the mind–brain relation is identity, it does not offer a positive account of the rela-
tion that holds between mental states and brain states. To yield a complete physicalist solution to the mind–body problem,
something must be added to the functionalist orthodoxy.

To determine what must be added, one might look to changes in the explanations generated by the cognitive sciences
themselves. Since their inception, the cognitive sciences have undergone a “turn to the brain” or a “cognitive neuroscience
revolution” (Boone & Piccinini, 2016). While accounts of cognitive processes are still given (largely) in terms of computa-
tion or information processing, these accounts are now accompanied by a pervasive emphasis on explaining cognitive pro-
cesses in terms of the operation of neural mechanisms. These “multilevel mechanistic explanations” articulate detailed
relations between cognitive processes and the neural mechanisms that underpin them. Many such explanations have been
given, and examining the relations they articulate might clarify the relation(s) between mental states and brain states.
For example, maybe such explanations proceed by articulating ways in which our brain states “fix” or determine our
mental states. If so, then what needs to be added to the functionalist orthodoxy is an account of this form of determination.
Two such determination relations are supervenience and realization.

Supervenience: The mental supervenes upon the physical when there can be no mental differences without cor-
responding physical differences. More formally, a property (or set of properties) M supervenes upon another
property (or set of properties) P just in case two things cannot differ in their M-properties without differing in
their P-properties. A variety of supervenience relations have been explored (Kim, 1992a; McLaughlin, 1995),
and many philosophers now believe that mind–brain supervenience is not a strong enough claim to secure the
physicalist bona fides that the nonreductive physicalist desires (Horgan, 1993; Wilson, 2005).
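As a regimentation (my notation; the literature distinguishes many non-equivalent precisifications), one simple individual-level version runs: for all individuals $x$ and $y$,

$$\forall P \in \mathcal{P}\,(Px \leftrightarrow Py) \ \rightarrow\ \forall M \in \mathcal{M}\,(Mx \leftrightarrow My),$$

that is, any two things exactly alike in their physical properties $\mathcal{P}$ are exactly alike in their mental properties $\mathcal{M}$.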

Realization: The mental is realized by the physical just in case the instantiation of the physical properties
“makes real” the instantiation of the mental properties. For example, a state of a computer program is realized
by the hardware running it. A variety of more detailed articulations of this relation exist in the literature
(Gillett, 2002; Polger & Shapiro, 2016; Shoemaker, 2007). Current research aims to clarify whether realization
can capture the sense in which mental states are not distinct from brain states, while allowing for multiple reali-
zation and without lapsing into a pernicious reduction.

Alternatively, closer examination of cognitive science explanations might show that mental states are more loosely
related to brain states. Maybe the most we can say is that the operations of neural mechanisms give rise to mental states, and
that mental states emerge from neural activity. Scientists and philosophers alike have developed accounts of emergence.

Emergence: The mental emerges from the physical just in case mental states arise from neural interactions, but
are “novel” in the sense that mental states are features of a neural system as a whole, and not any sub-
component of it. Thus, emergent properties do not occur arbitrarily, they “emerge” out of the workings of some
lower-level base. But neither are they explicable purely in terms of the workings of that base. Various forms of
emergence have been specified in the literature (Bedau & Humphreys, 2008; Clayton & Davies, 2006; Hum-
phreys, 2016), and some are not suitable for developing non-reductive physicalism. Current research aims to
clarify whether any form of emergent property can be causally efficacious without lapsing into an objectionable
form of dualism (Gillett, 2016).

Realization and emergence are each terms of art, defined and applied differently by those using them. To illustrate some
differences between these classes of relations, consider an example of each. Jaegwon Kim articulates the Subset Account of
Realization as follows: “If a second-order property F is realized on a given occasion by a first-order property H (that is, if
F is instantiated on a given occasion in virtue of the fact that one of its realizers, H, is instantiated on that occasion), then the
causal powers of this particular instance of F are identical with (or are a subset of ) the causal powers of H (or of this
instance of H).” (Kim, 1998) Kim differentiates properties in terms of the potential to do things (causal powers) that they
contribute to the individuals (including systems) instantiating them. According to the Subset Account, if an occasion of a
lower-level property realizes an occasion of a higher-level property, it is because of the (partial) overlap in the causal powers
contributed by these properties. Due to this partial overlap, an occurrence of the higher-level property is to be expected given
an occurrence of the lower-level property (alongside the appropriate supplemental conditions).
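In slogan form (my shorthand, writing $\mathrm{CP}[\,\cdot\,]$ for the causal powers contributed by a property instance), the Subset Account says:

$$H \text{ realizes } F \ \Rightarrow\ \mathrm{CP}[F\text{-instance}] \subseteq \mathrm{CP}[H\text{-instance}].$$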
To illustrate the Subset Account, consider long-term memory consolidation, the processes by which a new memory is sta-
bilized over time and transformed into an item in long-term memory. The neural systems and biochemistry involved are
fairly well understood, allowing scientists to explain long-term memory consolidation progressively in terms of the neural
systems, activity of individual neurons, and biochemical processes involved. Such explanations illustrate the intimate ties
between the psychological states involved in long-term memory and the properties that realize these states, as one explains
how the psychological states have the causal powers they do by articulating the biochemical processes, individual neural
activity, and neural systems involved in consolidation. Carefully examining cases like memory consolidation can be used to
both assess the plausibility of the different accounts of realization, and to answer questions about multiple realization and
reduction (Aizawa, 2007; Bickle, 2003).
Not all occurrences of mental properties are so tightly connected to the operations of the neural systems that underpin
them. To illustrate, consider Weak Emergence. Weakly emergent phenomena arise from the activities of a lower-level
domain, but the existence and character of these phenomena cannot be derived or computed (in any way other than simula-
tion) from the regularities of the lower-level domain (Bedau, 1997; Chalmers, 2006). Instead, weakly emergent phenomena
are properties of a whole system which are both unexpected given the principles that govern interactions between the parts of
the system, and qualitatively novel when compared to the features of these parts. To echo P.W. Anderson's slogan, when
higher-level phenomena emerge, the whole is not merely more than, but different from, the sum of its parts (Anderson, 1972).
As an illustration of Weak Emergence, consider the sophisticated (“cognitive”) behavior that emerges in some connec-
tionist networks. James McClelland and colleagues tell us that “human thoughts and utterances have a rich and complex
structure that, in our view, is also the emergent consequence of the interplay of much simpler processes.” (McClelland et al.,
2010) In this case, the simpler processes are nodes and their interconnections, which together comprise a network. The nodes
activate when given the appropriate stimuli, and simple principles govern how activation transmits throughout the network.
Over time, as these principles are changed in accordance with a learning algorithm, such systems can learn how to perform
tasks. In doing so, they exhibit behavior that is more complex than the behavior of their simple parts, and that could not be
predicted from the principles that govern these simple parts (except by simulation). A suitably trained network, then,
offers an explanation of how the cognitive process takes place in terms of the simple operations of the components, and the
learning embodied by the network. But this explanation need not posit tidy connections between the high-level phenomena
exhibited by the network, and the simple parts out of which it emerges. Instead, it is often the case that a network can per-
form a task, even though it remains unclear how it does so.
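To make the pattern concrete, here is a deliberately tiny sketch in Python (my toy, not a model from McClelland and colleagues): a 2-2-1 network of logistic nodes, trained by blind weight perturbation, typically learns XOR, a mapping that no single node or connection computes on its own.

```python
import math
import random

random.seed(0)  # make the run repeatable

def act(x):
    # Logistic activation; clamped to avoid math.exp overflow.
    return 1.0 / (1.0 + math.exp(-max(-60.0, min(60.0, x))))

def forward(w, x1, x2):
    # A 2-2-1 feedforward network: two hidden nodes feed one output node.
    h1 = act(w[0] * x1 + w[1] * x2 + w[2])
    h2 = act(w[3] * x1 + w[4] * x2 + w[5])
    return act(w[6] * h1 + w[7] * h2 + w[8])

DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]  # XOR

def error(w):
    return sum((forward(w, a, b) - t) ** 2 for (a, b), t in DATA)

# "Learning": blindly perturb all nine weights; keep a change only if it
# reduces error. No component of this rule mentions XOR.
weights = [random.uniform(-1, 1) for _ in range(9)]
best = error(weights)
for _ in range(50000):
    candidate = [wi + random.gauss(0, 0.5) for wi in weights]
    e = error(candidate)
    if e < best:
        weights, best = candidate, e

for (a, b), t in DATA:
    print((a, b), "->", round(forward(weights, a, b), 2), "target:", t)
```

If the run succeeds (blind perturbation usually, though not always, escapes local minima here), the capacity to compute XOR is a feature of the trained network as a whole; inspecting any single weight reveals nothing that looks like exclusive-or, which is the sense of novelty at issue.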
Each explanation appears to relate cognitive and neural processes differently. While the intimate connection between
mental property and neurobiological realizer in memory consolidation calls to mind a part-whole organization, the higher
and lower-level phenomena in connectionist networks exhibit a looser connection. These differences are mirrored in the
Realization and Emergence relations examined. While I don't mean to suggest that only one mind-brain relation fits any
given explanation, a relation that fits naturally with an explanation tends to encompass some features of the explanation
itself. The challenge then is to craft a mind–brain relation that both complements good explanatory practice in the sciences,
and is capable of doing the needed philosophical work. How to understand realization and emergence, their inter-relations,
and their implications for reduction, are vibrant topics of ongoing research. We currently lack a widely accepted account of
the mind-brain relation that completes the physicalist solution to the mind–body problem required by the functionalist
orthodoxy.

4.2 | Problems for the functionalist orthodoxy


The functionalist orthodoxy faces problems beyond those of giving a positive account of the relation obtaining between men-
tal states and brain states. During the early development of the cognitive sciences, the explanation of mental processes
enjoyed a distinctive kind of autonomy. But the study of mentality is now deeply intertwined with the study of the brain,
with the neurosciences placing constraints upon the explanation of mental processes. If the earlier cognitive sciences under-
stood mental states in terms of computation or information processing, the current cognitive sciences understand mental
states in terms of brain states. Nonreductive physicalism must accommodate these changes in scientific practice, while
addressing other standing philosophical challenges.

a. Troubles with multiple realization. While the multiple realization argument once commanded near universal assent
amongst philosophers, challenges to the argument have become more prominent. Early challenges focused on blocking
the nonreductivist conclusions based upon the multiple realization of mental states. For example, philosophers argued
that scientific reductions are context-specific (Churchland, 1986; Lewis, 1969), and thus the existence of some multiple
realization is consistent with “local reductions” in which mental states reduce to brain states within a given structure
(Kim, 1992b). More recent challenges focus on the multiple realization premise itself, by arguing that neuroscientific
practices support type-identities (Bechtel & Mundale, 1999), or that philosophers have been too permissive in what
counts as multiple realization (Shapiro, 2000). These skeptical challenges have prompted further developments of the
multiple realization argument, and more careful defense of the claim that mental states are multiply realized. Ongoing
work on multiple realization aims to clarify the standards and evidence for multiple realization, and whether multiple
realization militates against reduction in the relevant sense (Aizawa & Gillett, 2009; Polger & Shapiro, 2016).
b. Problems with mental causation. If mental states are identical to brain states, then mental causation is simply a species
of causation between physical states. The functionalist orthodoxy denies this identity claim, but seeks to explain mental
causation in much the same way. Jaegwon Kim's exclusion argument (or supervenience argument) concludes that this
cannot be done (Kim, 1998, 2005). Any physical event (like the intentional action of reaching for a glass) has a set of
physical causes sufficient for bringing it about. Proponents of mental causation claim that there is also a set of mental
causes sufficient for bringing about the action. If mental and physical causes are distinct, and physical events are not
overdetermined, then there is no room for mental causation as all the relevant causal work has already been done by
physical processes. But if mental and physical causes are not distinct, then we need some reason to think that mental
causes aren't redundant. After all, Kim asks, aren't the physical causes really doing the work?

Responses to the problems of mental causation are numerous, and ongoing work aims to clarify the nature and viability
of mental causation (Gibb, Lowe, & Ingthorsson, 2013; Woodward, 2015). It remains clear, however, that the problems
posed by mental causation are not limited to forms of dualism. What is less clear is whether the problems of mental cau-
sation are unique to mentality. For the substance dualist, the problem of mental causation is the problem of explaining
how mental and physical substances might causally interact. This problem of mental causation faces only mental sub-
stances, and is thus a problem unique to minds. But for other nonreductive accounts of mind, such as nonreductive phys-
icalism or property dualism, the problems facing an account of mental causation seem to be the same kinds of problems
facing an account of the causal efficacy of the events, properties and states of the special sciences more generally. If so,
the exclusion argument faces the social sciences no less than the mental sciences. To the extent that mental causation is sim-
ply one variety of causal efficacy in higher-level sciences, the problem of mental causation looks to be a problem of a
more general sort.

c. Nonrelational features of mental states. Functionalism claims that all mental states are individuated by their functional
role, which is understood in terms of a mental state's relations to inputs, other mental states, and outputs. In rough out-
line, it seems clear enough how to implement the functionalist proposal for mental states like beliefs and desires. For
example, beliefs have characteristic perceptual causes, interactions with other mental states, and behavioral outputs. But
the intentionality of mental states seemingly resists this relational treatment. To illustrate the problem, consider the inten-
tionality of belief. The intentionality of a belief consists in its being about things, thus it seems natural to understand
intentionality in terms of the belief state's relation to the object(s) of belief. For example, my belief that my car is in my
driveway consists in my relation to a specific car and a specific driveway. This proposal runs into problems: people have
beliefs about non-existent objects (e.g., the belief that Santa Claus lives at the North Pole), and about classes of objects,
but not any specific individual (the belief that dogs are larger than cats). Consequently, it isn't clear that the intentionality
of belief consists in the believer's relation to physical objects. What then does intentionality consist in? Various physical-
ist proposals have been put forth (Dennett, 1987; Fodor, 1987; Millikan, 1989), but none are generally accepted. To the
extent that any adequate account of mental states must capture their intentionality, if the functionalist orthodoxy cannot
do so in terms consistent with physicalism, then it captures something less than genuine mentality.

Phenomenal character resists a relational treatment as well. Conscious experiences are naturally understood as being indi-
viduated by their experiential feel. For example, smelling coffee is a different conscious experience than smelling garlic
because it “feels” different to undergo these experiences. Furthermore, it seems like the functional role of a mental state
and its experiential feel could come apart. One can imagine someone with inverted red-green color experiences—where
someone with normal color vision sees red, they see green—but without any differences in the functional structure of
their visual system (Shoemaker, 1982). If the experiential feel of a mental state is a nonrelational feature of the mental
state, viz., not a matter of how that mental state relates to other things, then experiences are not individuated by their
functional role (Nida-Rümelin, 1996). Making sense of conscious experience is an important component of a solution to
the mind–body problem, and the functionalist orthodoxy loses considerable appeal if it cannot.

While some form of functionalism remains the orthodox account of mind, it is not uncontroversial, and each of (a)–
(c) represents a major topic of contemporary research in the philosophy of mind. None of these problems have uncontentious
solutions, but it was intentionality that was traditionally taken to pose the major obstacle to an adequate physicalist account
of mental states. This changed in the mid-1990s when philosophical and scientific work on consciousness became increas-
ingly visible. While there remains no generally accepted physicalist account of intentionality, many philosophers and scien-
tists now take conscious experience to pose the most pressing challenge to any physicalist solution to the mind–body
problem.8 It is to this issue that we now turn.

5 | THE HARD PROBLEM AND THE SCIENCE OF CONSCIOUSNESS

In response to the metaphysical quandary faced by traditional approaches, contemporary approaches to the mind–body prob-
lem deny that mental states are distinct from brain states. With support from the cognitive and neural sciences, contemporary
physicalists aim to develop a positive account of this mind–brain relation. However, doing so yields a methodological quan-
dary just as pressing as the metaphysical one. Introspection might deliver the phenomenal character of a conscious experi-
ence, but it yields little insight into how experience is related to brain states. The physicalist appeals to the cognitive and
neural sciences precisely to clarify this relation. But, whether the introspectively revealed features of experience can be cap-
tured by these sciences is a contentious matter. To illustrate the resulting tension, consider the scientific study of
consciousness.
“Consciousness” and “experience” are capacious terms, and scientific accounts of consciousness seek to explain a range
of psychological processes including the integration of information, the focus of attention, the deliberate control of behavior,
the ability of a system to access its own internal states, and the ability to report on one's mental state. Each of these phenomena
poses scientific challenges, but there is no reason to doubt that we might eventually explain each in terms of its under-
lying computational and neural bases. The real problem of explaining consciousness arises in explaining phenomenal
consciousness, sometimes called qualia, the introspectively accessible what-it-is-like-for-me aspects of experience.
Attempts to explain phenomenal consciousness in physical terms face an explanatory gap: nothing we know, or can even
hypothesize, about the neural basis of a phenomenally conscious experience (like smelling coffee) allows us to understand
why that neural activation has any phenomenal character at all, much less the particular phenomenal character that it does
(the smell of coffee that we know and love) (Levine, 1983). But the nature of an explanatory gap depends upon what must
be explained, and what counts as an adequate explanation of it. To illustrate the nature of the explanatory gap facing phe-
nomenal consciousness, consider two attempts to construct scientific explanations of consciousness.
Scientific accounts of consciousness begin by attempting to find the brain basis of consciousness. A neural correlate of
consciousness for a kind of conscious state C* is a minimally sufficient neural state for instances of C* (Chalmers, 2000). A
complete correlate is sufficient in that it alone suffices for C*, and minimal in that it includes no neural states that are merely
causally relevant in producing C*. Though disagreement and complications abound (Lau, 2011), the attempt to discover the
neural correlates of consciousness is a major research project in neuroscience. But even assuming the discovery of the com-
plete neural correlate of some experience, a further question lingers: why does that pattern of neural activation result in any
experience at all, and in an experience with the particular phenomenal character of C*? Neural correlates of consciousness
alone do not answer this question, and thus do not offer the explanation of phenomenal consciousness that many desire.
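One way to regiment the definition (my notation, not Chalmers' own formalism): a neural state $N$ is a neural correlate of conscious state type $C^*$ just in case

$$N \text{ suffices for } C^* \ \wedge\ \neg\,\exists N' \subsetneq N \text{ such that } N' \text{ suffices for } C^*,$$

that is, $N$ is sufficient for $C^*$ and no proper part of $N$ is.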
One might supplement neural accounts with evolutionary explanations of consciousness. Biologists often explain the
presence of a trait in terms of its evolutionary history. Supposing that consciousness is a natural feature of certain biological
systems, one might seek to explain why certain patterns of neural activation are phenomenally conscious in evolutionary
terms. This strategy yields interesting work, and understanding the evolutionary history of consciousness might enrich our
understanding of consciousness (Feinberg & Mallatt, 2016; Godfrey-Smith, 2016). For many, however, such accounts leave
the lingering explanatory itch unscratched. Delineating the function of consciousness is challenging, and it would be difficult
to explain how the performance of this function contributed to the evolutionary success of an organism. But even setting
these issues aside, the question remains—why does the performance of some function result in any experience at all, and in
particular, an experience with a specific phenomenal character?
Attempts to bridge the explanatory gap by developing a scientific theory of phenomenal consciousness give rise to what
David Chalmers calls the Hard Problem of Phenomenal Consciousness (Chalmers, 1995). The hard problem is the problem
of explaining why any physical state has the phenomenal character it does, as opposed to some other phenomenal character,
or none at all. Chalmers clarifies why explaining phenomenal consciousness is a hard problem by characterizing the explana-
tory practices of the cognitive sciences generally, and then showing why these practices are poorly suited to explaining con-
sciousness. Typically, the cognitive sciences explain some cognitive ability by specifying the functions performed when
exercising the ability, and in turn explain the performance of these functions by specifying the neural mechanisms that per-
form them. This process generates an account of a cognitive ability in terms of physical states (the neural mechanisms
involved). But phenomenal consciousness cannot be explained in this manner. It is unclear which functions are involved, but
even assuming that we determined them and discovered the neural mechanism(s) that perform them, it would not answer
questions about why a physical state has the phenomenal character that it does. So, even a successful functional and/or mech-
anistic explanation leaves the question untouched: why is it that when a neural mechanism performs this function it is accom-
panied by a phenomenal character?
Some draw a stark lesson here: we cannot explain phenomenal consciousness scientifically because physicalism (includ-
ing the nonreductive sort) is false. They advocate a form of dualism which accepts that all substances are physical, but
contends that there are two kinds of properties: mental and physical. These property dualists argue that mental properties are
not merely different, but genuinely distinct from physical properties. For example, David Chalmers argued for a form of
“naturalistic dualism” which adds mental properties to the fundamental constituents of reality (Chalmers, 1996). For Chal-
mers, the task of a theory of consciousness is not to explain what mental properties are, but to relate experience to other fea-
tures of the world. Property dualists face many of the standard problems of dualism (e.g., accounting for mental causation).
But, to them, phenomenal consciousness is the lump under the physicalist rug. The lump can be pushed around but always
pops back up in a dualism of some sort.
The hard problem of phenomenal consciousness is the most recognizable contemporary guise of the mind–body problem.
By posing a mind–body question, and by making explicit certain assumptions about the nature of mental states, physical states, and how the relation between the two must be explained, the hard problem invites pessimism about the prospects for a scientific explanation of consciousness. But it is worth questioning whether we should make these assumptions when
studying consciousness. For example, the hard problem assumes that conscious states have their phenomenal character intrin-
sically.9 Daniel Dennett argues that this notion of phenomenal consciousness is overinflated, generating a felt need to explain
something that does not exist (Dennett, 1988, 1991). The resulting hard problem is, according to Dennett, an “illusion” and
not a “real problem to be solved with revolutionary new sciences” (Dennett, 2001). Instead, a science of consciousness
“solves” the hard problem simply by avoiding the illusion. Stanislas Dehaene makes a similar point: “once our intuitions are
educated by cognitive sciences and computer simulations, Chalmers' hard problem will evaporate”. By progressively solving
the “easy” problems of consciousness, “the science of consciousness will keep eating away at the hard problem of conscious-
ness until it vanishes” (Dehaene, 2014).
Alternatively, the hard problem assumes that a single, deductive explanation is needed to close the explanatory gap. It is
standardly assumed that there is a single gap, which consists in our inability to explain consciousness by deducing facts
about phenomenal consciousness from facts about the physical world (Levine, 1983). A deductive explanation of phenome-
nal consciousness might well close such a gap. But there might also be multiple gaps to be traversed, as well as nondeductive forms of explanation that might narrow (or close) these gaps (Majeed, 2015; Taylor, 2016). Proposed accounts of consciousness are quite heterogeneous, varying both in spatiotemporal scale and in how abstractly they characterize the mechanisms of consciousness. While any given account might fail to explain phenomenal consciousness in its entirety,
consciousness simply might not be the sort of thing that is adequately explained by a single account. If “consciousness is a
complicated biological function underlain by diverse processes at multiple scales of complexity” (Dale, Tollefsen, & Kello,
2012), then it might also be the case that explaining consciousness involves appreciating how different explanations fit
together to yield a more complete account. Thus, in addition to what might fully close the explanatory gap(s), we must consider what might narrow them.
How, then, should one respond to the hard problem? At first glance, it seems that no progress has been made in directly addressing the problem since Chalmers first posed it. But this lack of progress does not show that the problem is defective: while its assumptions about the nature of experience and the explanatory gap are not sacrosanct, they are based upon widespread intuitions about what is required to explain phenomenal consciousness. Nor does it mean that the problem is unsolvable. Although it is hard to imagine what could close the explanatory gap, new theories and experimental techniques that might narrow the gap are rapidly developing.
Alternatively, instead of attempting to solve the problem as Chalmers posed it, progress in addressing the hard problem
might consist in modifying the version of the problem that needs to be addressed. With the outcome of the hard problem
uncertain, it remains important to evaluate the plausibility of the assumptions driving it. If what gets explained after rejecting some assumptions seems too different from phenomenal consciousness, then one has neglected the phenomenon one meant to explain. But if the differences are small enough, one might conclude that nothing worth worrying about has been neglected. By
evaluating the assumptions that drive the hard problem, one “negotiates” between what needs to be explained and what can
be explained. More generally, negotiating accounts of mental states, brain states, and the relation between the two, is charac-
teristic of solutions to the mind–body problem. Such negotiations are carried out by philosophers and scientists alike, and the
details of these negotiations explain why the mind–body problem can seem so deeply puzzling and yet how solutions might
be sensitive to scientific developments.

6 | CONCLUSION

The contemporary mind–body problem has set the agenda for the last 50 years of Anglophone philosophy of mind. By
focusing on the Anglophone tradition, this article has been selective in its coverage. For example, I haven't discussed contri-
butions to the mind–body problem that draw upon the phenomenological tradition, or those that seek to bring Buddhist phi-
losophy into contact with the neurosciences. However, within the Anglophone tradition, the dominant research project is to
accommodate mental states into a physicalist framework by first figuring out how brain states explain mental states, and then
by further articulating the relations appealed to in such explanations. A main unresolved issue is whether these explanations
should be reductive, viz. whether mental states are identical to brain states, and what it indicates if they are not. After a
period in which orthodoxy took the multiple realization argument to show that mental states are multiply realized by brain
states, recent scrutiny suggests that such arguments are less secure than they once seemed. Moreover, attempts to state the
relation that holds between mental states and brain states, be it supervenience, realization, etc., have proven problematic.
Work in the philosophy of mind attempts to address these issues while remaining focused upon the heterogeneity of mental
states and brain states that makes the mind–body problem so puzzling in the first place.
Despite some appearances to the contrary, philosophers are not alone in this endeavor. It has always been the case that
developing an adequate understanding of the mind–body problem involves appreciating work in the relevant sciences, be they
physics or the neurosciences. But attempts to construct a scientific study of consciousness, and to grapple with the hard prob-
lem, show the mind–body problem to be alive (and well?) in the cognitive and neural sciences themselves. Coordination
between philosophers and scientists on what the mind–body problem is, and what kinds of explanations are needed to solve
it, paves the way to actually giving any such explanations. Only then might an adequate solution to the problem emerge.

CONFLICT OF INTEREST
The author has declared no conflicts of interest for this article.

NOTES
1. Throughout, I discuss states, for example, mental states and brain states. This is a simplifying device, not a metaphysical thesis. Please understand “states” as shorthand for “states, events and/or processes.”
2. My focus here is the mind–brain relationship. I don't mean to imply that the relationship between brain states and physical states is more straightforward than the relationship between mental states and brain states. It might not be.
3. This way of structuring the mind–body problem is based upon work by Campbell (1984) and Westphal (2016). There are different ways of structuring the problem, leading some philosophers to talk about mind–body problems (Jaworski, 2011; Ludwig, 2003).
4. The Causal Argument against interactive dualism goes back (at least) to Leibniz (Woolhouse, 1985), with a prominent contemporary version developed by David Papineau (2002). Contemporary discussions of this argument seek to clarify what it succeeds in establishing (Vicente, 2006), or to pose outright challenges to it (Gibb, 2010; Montero, 2006).
5. Clarifying the different senses of reduction, and how they bear on the mind–body problem, is a topic for a separate article. Various accounts of reduction have been developed (Bickle, 1998; Hooker, 1981; Nagel, 1961), and current work on the mind–body problem seeks to clarify the relation between ontological reduction and other mind–brain relations like realization and emergence (Gillett, 2016; Van Gulick, 2001).
6. Type-identity of mental states and brain states is a necessary condition for semantic reduction, and for some forms of ontological reduction. Other forms of ontological reduction understand reduction in terms of other relations, for example, composition, and are compatible with the rejection of mind–brain type-identities.
7. Jerry Fodor provides a classic defense of the claim that the special sciences are autonomous (Fodor, 1997). While it is widely accepted that the explanations given by lower-level sciences (like the neurosciences) place some constraints upon those given by higher-level sciences (like psychology), contemporary discussions of autonomy seek to clarify how stringent such constraints are, and what implications they might have for the autonomy of the special sciences (Boone & Piccinini, 2016; Knoll, 2018).
8. Though some philosophers of mind focus on consciousness as the central phenomenon in need of explanation, others focus instead on multiple realization and mental causation. These areas of emphasis are not exclusive, and there is nothing approaching a disciplinary consensus on the matter. That said, for many philosophers, phenomenally conscious states yield the clearest example of a mental state that resists adequate functionalist treatment. Most philosophers whose work focuses on conscious experience take phenomenally conscious states to be individuated by how they feel, and not by their relational features. In the remainder of the paper, I follow their lead.
9. Phenomenal externalists, who claim that the phenomenal character of an experience is not an intrinsic property of the experience itself, take issue with this claim. Such disagreements are to be expected, as the particular variant of the hard problem one faces depends upon one's specific commitments. Perusing the literature, one finds various hard problems depending upon the details of the theories under discussion.
RELATED WIREs ARTICLES


Philosophy of Mind
Materialism
Functionalism as a philosophical theory of the cognitive sciences

FURTHER READING
Crane, T., & Patterson, S. (Eds.). (2000). History of the mind-body problem. London, England: Routledge.
Jaworski, W. (2011). Philosophy of mind: A comprehensive introduction. Malden, MA: Wiley-Blackwell.
Kim, J. (2005). Physicalism, or something near enough. Princeton, NJ: Princeton University Press.
Kim, J. (2010). Philosophy of mind (3rd ed.). Boulder, CO: Westview Press.
Westphal, J. (2016). The mind-body problem. Cambridge, MA: MIT Press.

REFERENCES
Aizawa, K. (2007). The biochemistry of memory consolidation: A model system for the philosophy of mind. Synthese, 155, 65–98.
Aizawa, K., & Gillett, C. (2009). Levels, individual variation, and massive multiple realization in neurobiology. In J. Bickle (Ed.), The Oxford handbook of philosophy
and neuroscience. Oxford, England: Oxford University Press.
Anderson, P. W. (1972). More is different. Science, 177, 393–396.
Armstrong, D. M. (1999). The mind-body problem: An opinionated introduction. Boulder, CO: Westview Press.
Baars, B. J. (1986). The cognitive revolution in psychology. New York, NY: The Guilford Press.
Baker, L. R. (2009). Non-reductive materialism. In A. Beckermann, B. McLaughlin, & S. Walter (Eds.), The Oxford handbook of philosophy of mind. Oxford,
England: Oxford University Press.
Bechtel, W., & Mundale, J. (1999). Multiple realizability revisited: Linking cognitive and neural states. Philosophy of Science, 66, 175–207.
Bedau, M. (1997). Weak emergence. Philosophical Perspectives, 11, 375–399.
Bedau, M. A., & Humphreys, P. (Eds.). (2008). Emergence: Contemporary readings in philosophy and science. Cambridge, MA: MIT Press.
Bickle, J. (1998). Psychoneural reduction: The new wave. Cambridge, MA: MIT Press.
Bickle, J. (2003). Philosophy and neuroscience: A ruthlessly reductive account. Boston, MA: Kluwer Academic.
Boone, W., & Piccinini, G. (2016). The cognitive neuroscience revolution. Synthese, 193, 1509–1534.
Braddon-Mitchell, D., & Jackson, F. (2006). Philosophy of mind and cognition: An introduction. Malden, MA: Wiley-Blackwell.
Campbell, K. (1984). Body and mind (2nd ed.). Notre Dame, IN: University of Notre Dame Press.
Carnap, R. (1932). Psychology in physical language. Erkenntnis, 3, 107–142.
Chalmers, D. (1995). Facing up to the problem of consciousness. Journal of Consciousness Studies, 2, 200–219.
Chalmers, D. (1996). The conscious mind: In search of a fundamental theory. Oxford, England: Oxford University Press.
Chalmers, D. (2000). What is a neural correlate of consciousness? In T. Metzinger (Ed.), Neural correlates of consciousness: Empirical and conceptual issues.
Cambridge, MA: MIT Press.
Chalmers, D. (2006). Strong and weak emergence. In P. Clayton & P. Davies (Eds.), The re-emergence of emergence. Oxford, England: Oxford University Press.
Chomsky, N. (2009). The mysteries of nature: How deeply hidden? Journal of Philosophy, 106, 167–200.
Churchland, P. (1986). Neurophilosophy. Cambridge, MA: MIT Press.
Clayton, P., & Davies, P. (Eds.). (2006). The re-emergence of emergence. Oxford, England: Oxford University Press.
Dale, R., Tollefsen, D., & Kello, C. (2012). An integrative pluralistic approach to phenomenal consciousness. In S. Edelman, T. Fekete, & N. Zach (Eds.), Being in
time: Dynamical models of phenomenal experience. Amsterdam, The Netherlands: John Benjamins.
Dehaene, S. (2014). Consciousness and the brain: Deciphering how the brain codes our thoughts. New York, NY: Viking.
Dennett, D. (1988). Quining qualia. In A. Marcel & E. Bisiach (Eds.), Consciousness in modern science. Oxford, England: Oxford University Press.
Dennett, D. (1991). Consciousness explained. Boston, MA: Little, Brown and Company.
Dennett, D. (2001). Are we explaining consciousness yet? Cognition, 79, 221–237.
Dennett, D. C. (1987). The intentional stance. Cambridge, MA: MIT Press.
Descartes, R. (1985). Meditations on first philosophy. In J. Cottingham, R. Stoothoff, & D. Murdoch (Eds.), The philosophical writings of Rene Descartes. Cambridge,
England: Cambridge University Press.
Dowell, J. (2006). The physical: Empirical, not metaphysical. Philosophical Studies, 131, 25–60.
Feinberg, T. E., & Mallatt, J. M. (2016). The ancient origins of consciousness: How the brain created experience. Cambridge, MA: MIT Press.
Fodor, J. (1968). Psychological explanation. New York, NY: Random House.
Fodor, J. (1974). Special sciences. Synthese, 28, 97–115.
Fodor, J. (1987). Psychosemantics: The problem of meaning in the philosophy of mind. Cambridge, MA: MIT Press.
Fodor, J. (1997). Special sciences: Still autonomous after all these years. Philosophical Perspectives, 11, 149–163.
Foster, J. (1991). The immaterial self. London, England: Routledge.
Gibb, S. (2010). Closure principles and the laws of conservation of energy and momentum. Dialectica, 64(3), 363–384.
Gibb, S. C., Lowe, E. J., & Ingthorsson, R. D. (Eds.). (2013). Mental causation and ontology. Oxford, England: Oxford University Press.
Gillett, C. (2002). The dimensions of realization: A critique of the standard view. Analysis, 62, 316–323.
Gillett, C. (2016). Reduction and emergence in science and philosophy. Cambridge, England: Cambridge University Press.
Godfrey-Smith, P. (2016). Other minds: The octopus, the sea, and the deep origins of consciousness. New York, NY: Farrar, Straus and Giroux.
Heil, J., & Mele, A. (Eds.). (1993). Mental causation. Oxford, England: Clarendon Press.
Hooker, C. (1981). Towards a general theory of reduction. Part I: Historical and scientific setting. Part II: Identity in reduction. Part III: Cross-categorical reduction.
Dialogue, 20, 38–59, 201–236, 496–529.
Horgan, T. (1993). From supervenience to superdupervenience: Meeting the demands of a material world. Mind, 102, 555–586.
Humphreys, P. (2016). Emergence. New York, NY: Oxford University Press.
Huxley, T. H. (1874). On the hypothesis that animals are automata, and its history. Fortnightly Review, 22, 555–580.
Jackson, F. (1982). Epiphenomenal qualia. Philosophical Quarterly, 32, 127–136.
Jaworski, W. (2011). Philosophy of mind: A comprehensive introduction. Malden, MA: Wiley-Blackwell.
Kim, J. (1992a). Supervenience and mind: Selected philosophical essays. Cambridge, England: Cambridge University Press.
Kim, J. (1992b). Multiple realization and the metaphysics of reduction. Philosophy and Phenomenological Research, 52, 1–26.
Kim, J. (1998). Mind in a physical world: An essay on the mind-body problem and mental causation. Cambridge, MA: MIT Press.
Kim, J. (2005). Physicalism, or something near enough. Princeton, NJ: Princeton University Press.
Kim, J. (2010). Philosophy of mind (3rd ed.). Boulder, CO: Westview Press.
Knoll, A. (2018). Still autonomous after all. Minds and Machines, 28, 7–27.
Kriegel, U. (2009). Subjective consciousness: A self-representational theory. New York, NY: Oxford University Press.
Lau, H. (2011). Theoretical motivations for investigating the neural correlates of consciousness. WIREs Cognitive Science, 2, 1–7.
Levine, J. (1983). Materialism and qualia: The explanatory gap. Pacific Philosophical Quarterly, 64, 354–361.
Levine, J. (2001). Purple haze: The puzzle of consciousness. New York, NY: Oxford University Press.
Lewis, D. (1969). Review of art, mind, and religion. Journal of Philosophy, 68, 203–211.
Lewis, D. (1972). Psychophysical and theoretical identifications. Australasian Journal of Philosophy, 50, 249–258.
Lowe, E. J. (1996). Subjects of experience. Cambridge, England: Cambridge University Press.
Ludwig, K. (2003). The mind-body problem: An overview. In S. Stich & T. Warfield (Eds.), The Blackwell guide to philosophy of mind. Malden, MA: Blackwell.
Majeed, R. (2015). The hard problem and its explanatory targets. Ratio, 19, 298–311.
McClelland, J. L., Botvinick, M. M., Noelle, D. C., Plaut, D. C., Rogers, T. T., Seidenberg, M. S., & Smith, L. B. (2010). Letting structure emerge: Connectionist and dynamical systems approaches to cognition. Trends in Cognitive Sciences, 14, 348–356.
McLaughlin, B. P. (1995). Varieties of supervenience. In E. Savellos & U. Yalcin (Eds.), Supervenience: New essays. Cambridge, England: Cambridge University Press.
Millikan, R. G. (1989). Biosemantics. Journal of Philosophy, 86, 281–297.
Montero, B. (1999). The body problem. Noûs, 33, 183–200.
Montero, B. (2006). What does the conservation of energy have to do with physicalism? Dialectica, 60(4), 383–396.
Nagel, E. (1961). The structure of science: Problems in the logic of scientific explanation. New York, NY: Harcourt, Brace & World.
Nagel, T. (1974). What is it like to be a bat? Philosophical Review, 83, 435–450.
Nida-Rümelin, M. (1996). Pseudonormal vision: An actual case of qualia inversion? Philosophical Studies, 82, 145–157.
Papineau, D. (2002). Thinking about consciousness. Oxford, England: Oxford University Press.
Pereboom, D. (2002). Robust nonreductive physicalism. Journal of Philosophy, 99, 499–531.
Place, U. T. (1956). Is consciousness a brain process? British Journal of Psychology, 47, 44–50.
Polger, T., & Shapiro, L. (2016). The multiple realization book. Oxford, England: Oxford University Press.
Putnam, H. (1975a). Brains and behavior. In H. Putnam (Ed.), Mind, language, and reality: Philosophical papers (Vol. 2). Cambridge, England: Cambridge Univer-
sity Press.
Putnam, H. (1975b). The nature of mental states. In Mind, language, and reality: Philosophical papers (Vol. 2). Cambridge, England: Cambridge University Press.
Putnam, H. (1975c). Minds and machines. In Mind, language, and reality: Philosophical papers (Vol. 2). Cambridge, England: Cambridge University Press.
Richardson, R. (1982). The scandal of Cartesian interactionism. Mind, 91, 20–37.
Robinson, W. S. (2004). Understanding phenomenal consciousness. Cambridge, England: Cambridge University Press.
Ryle, G. (1949). The concept of mind. London, England: Hutchinson.
Searle, J. (1983). Intentionality. Cambridge, England: Cambridge University Press.
Shapiro, L. (2000). Multiple realizations. Journal of Philosophy, 97, 635–654.
Shoemaker, S. (1982). The inverted spectrum. Journal of Philosophy, 79, 357–381.
Shoemaker, S. (2007). Physical realization. Oxford, England: Oxford University Press.
Skinner, B. F. (1951). Science and human behavior. New York, NY: Macmillan.
Smart, J. J. C. (1959). Sensations and brain processes. Philosophical Review, 68, 141–156.
Stoljar, D. (2010). Physicalism. New York, NY: Routledge.
Swinburne, R. (1997). The evolution of the soul. Oxford, England: Clarendon.
Taylor, E. (2016). Explanation and the explanatory gap. Acta Analytica, 31, 77–88.
Tollefsen, D. (1999). Princess Elisabeth and the problem of mind-body interaction. Hypatia, 14, 59–77.
Van Gulick, R. (2001). Reduction, emergence and other recent options on the mind/body problem: A philosophic overview. Journal of Consciousness Studies, 8, 1–34.
Vicente, A. (2006). On the causal completeness of physics. International Studies in the Philosophy of Science, 20, 149–171.
Watson, J. B. (1913). Psychology as the behaviorist views it. Psychological Review, 20, 158–177.
Westphal, J. (2016). The mind-body problem. Cambridge, MA: MIT Press.
Wilson, J. (2005). Supervenience-based formulations of physicalism. Noûs, 39, 426–459.
Wilson, J. (2006). On characterizing the physical. Philosophical Studies, 131, 61–99.
Woodward, J. (2015). Interventionism and causal exclusion. Philosophy and Phenomenological Research, 91, 303–347.
Woolhouse, R. (1985). Leibniz's reaction to Cartesian interaction. Proceedings of the Aristotelian Society, 66, 69–82.
Yablo, S. (1992). Mental Causation. Philosophical Review, 101, 245–280.

How to cite this article: Chambliss B. The mind–body problem. WIREs Cogn Sci. 2018;e1463. https://doi.org/10.1002/wcs.1463
