
An Epistemological Nightmare

Raymond M. Smullyan, 1982


Scene 1
Frank is in the office of an eye doctor. The doctor holds up a book and asks, "What color is it?" Frank answers, "Red." The doctor says, "Aha, just as I thought! Your whole color mechanism has gone out of kilter. But fortunately your condition is curable, and I will have you in perfect shape in a couple of weeks."
Scene 2
(A few weeks later.) Frank is in a laboratory in the home of an experimental epistemologist. (You will soon find out what that means!) The epistemologist holds up a book and also asks, "What color is this book?" Now, Frank has been earlier dismissed by the eye doctor as cured. However, he is now of a very analytical and cautious temperament, and will not make any statement that can possibly be refuted. So Frank answers, "It seems red to me."
Epistemologist:
Wrong!
Frank:
I don't think you heard what I said. I merely said that it seems red to me.
Epistemologist:
I heard you, and you were wrong.
Frank:
Let me get this clear; did you mean that I was wrong that this book is red, or that I was wrong that it seems red to me?
Epistemologist:
I obviously couldn't have meant that you were wrong in that it is red, since you did not say that it is red. All you said was that it seems red to you, and it is this statement which is wrong.
Frank:
But you can't say that the statement "It seems red to me" is wrong.
Epistemologist:
If I can't say it, how come I did?
Frank:
I mean you can't mean it.
Epistemologist:
Why not?
Frank:
But surely I know what color the book seems to me!
Epistemologist:
Again you are wrong.
Frank:
But nobody knows better than I how things seem to me.
Epistemologist:
I am sorry, but again you are wrong.
Frank:
But who knows better than I?
Epistemologist:
I do.
Frank:
But how could you have access to my private mental states?
Epistemologist:
Private mental states! Metaphysical hogwash! Look, I am a practical epistemologist. Metaphysical problems about mind versus matter arise only from epistemological confusions. Epistemology is the true foundation of philosophy. But the trouble with all past epistemologists is that they have been using wholly theoretical methods, and much of their discussion degenerates into mere word games. While other epistemologists have been solemnly arguing such questions as whether a man can be wrong when he asserts that he believes such and such, I have discovered how to settle such questions experimentally.
Frank:
How could you possibly decide such things empirically?
Epistemologist:
By reading a person's thoughts directly.
Frank:
You mean you are telepathic?
Epistemologist:
Of course not. I simply did the one obvious thing which should be done, viz. I have constructed a brain-reading machine (known technically as a cerebroscope) that is operative right now in this room and is scanning every nerve cell in your brain. I thus can read your every sensation and thought, and it is a simple objective truth that this book does not seem red to you.
Frank (thoroughly subdued):
Goodness gracious, I really could have sworn that the book seemed red to me; it sure seems that it seems red to me!
Epistemologist:
I'm sorry, but you are wrong again.
Frank:
Really? It doesn't even seem that it seems red to me? It sure seems like it seems like it seems red to me!
Epistemologist:
Wrong again! And no matter how many times you reiterate the phrase "it seems like" and follow it by "the book is red" you will be wrong.
Frank:
This is fantastic! Suppose instead of the phrase "it seems like" I would say "I believe that." So let us start again at ground level. I retract the statement "It seems red to me" and instead I assert "I believe that this book is red." Is this statement true or false?
Epistemologist:
Just a moment while I scan the dials of the brain-reading machine. No, the statement is false.
Frank:
And what about "I believe that I believe that the book is red"?
Epistemologist (consulting his dials):
Also false. And again, no matter how many times you iterate "I believe," all these belief sentences are false.
Frank:
Well, this has been a most enlightening experience. However, you must admit that it is a little hard on me to realize that I am entertaining infinitely many erroneous beliefs!
Epistemologist:
Why do you say that your beliefs are erroneous?
Frank:
But you have been telling me this all the while!
Epistemologist:
I most certainly have not!
Frank:
Good God, I was prepared to admit all my errors, and now you tell me that my beliefs are not errors; what are you trying to do, drive me crazy?
Epistemologist:
Hey, take it easy! Please try to recall: When did I say or imply that any of your beliefs are erroneous?
Frank:
Just simply recall the infinite sequence of sentences: (1) I believe this book is red; (2) I believe that I believe this book is red; and so forth. You told me that every one of those statements is false.
Epistemologist:
True.
Frank:
Then how can you consistently maintain that my beliefs in all these false statements are not erroneous?
Epistemologist:
Because, as I told you, you don't believe any of them.
Frank:
I think I see, yet I am not absolutely sure.
Epistemologist:
Look, let me put it another way. Don't you see that the very falsity of each of the statements that you assert saves you from an erroneous belief in the preceding one? The first statement is, as I told you, false. Very well! Now the second statement is simply to the effect that you believe the first statement. If the second statement were true, then you would believe the first statement, and hence your belief about the first statement would indeed be in error. But fortunately the second statement is false, hence you don't really believe the first statement, so your belief in the first statement is not in error. Thus the falsity of the second statement implies you do not have an erroneous belief about the first; the falsity of the third likewise saves you from an erroneous belief about the second, etc.
Frank:
Now I see perfectly! So none of my beliefs were erroneous, only the statements were erroneous.
Epistemologist:
Exactly.
Frank:
Most remarkable! Incidentally, what color is the book really?
Epistemologist:
It is red.
Frank:
What!
Epistemologist:
Exactly! Of course the book is red. What's the matter with you, don't you have eyes?
Frank:
But didn't I in effect keep saying that the book is red all along?
Epistemologist:
Of course not! You kept saying it seems red to you, it seems like it seems red to you, you believe it is red, you believe that you believe it is red, and so forth. Not once did you say that it is red. When I originally asked you "What color is the book?" if you had simply answered "red," this whole painful discussion would have been avoided.
Scene 3
Frank comes back several months later to the home of the epistemologist.
Epistemologist:
How delightful to see you! Please sit down.
Frank (seated):
I have been thinking of our last discussion, and there is much I wish to clear up. To begin with, I discovered an inconsistency in some of the things you said.
Epistemologist:
Delightful! I love inconsistencies. Pray tell!
Frank:
Well, you claimed that although my belief sentences were false, I did not have any actual beliefs that are false. If you had not admitted that the book actually is red, you would have been consistent. But your very admission that the book is red leads to an inconsistency.
Epistemologist:
How so?
Frank:
Look, as you correctly pointed out, in each of my belief sentences "I believe it is red," "I believe that I believe it is red," the falsity of each one other than the first saves me from an erroneous belief in the preceding one. However, you neglected to take into consideration the first sentence itself. The falsity of the first sentence "I believe it is red," in conjunction with the fact that it is red, does imply that I do have a false belief.
Epistemologist:
I don't see why.
Frank:
It is obvious! Since the sentence "I believe it is red" is false, then I in fact believe it is not red, and since it really is red, then I do have a false belief. So there!
Epistemologist (disappointed):
I am sorry, but your proof obviously fails. Of course the falsity of the fact that you believe it is red implies that you don't believe it is red. But this does not mean that you believe it is not red!
Frank:
But obviously I know that it either is red or it isn't, so if I don't believe it is, then I must believe that it isn't.
Epistemologist:
Not at all. I believe that either Jupiter has life or it doesn't. But I neither believe that it does, nor do I believe that it doesn't. I have no evidence one way or the other.
Frank:
Oh well, I guess you are right. But let us come to more important matters. I honestly find it impossible that I can be in error concerning my own beliefs.
Epistemologist:
Must we go through this again? I have already patiently explained to you that you (in the sense of your beliefs, not your statements) are not in error.
Frank:
Oh, all right then, I simply do not believe that even the statements are in error. Yes, according to the machine they are in error, but why should I trust the machine?
Epistemologist:
Whoever said you should trust the machine?
Frank:
Well, should I trust the machine?
Epistemologist:
That question involving the word "should" is out of my domain. However, if you like, I can refer you to a colleague who is an excellent moralist; he may be able to answer this for you.
Frank:
Oh come on now, I obviously didn't mean "should" in a moralistic sense. I simply meant, "Do I have any evidence that this machine is reliable?"
Epistemologist:
Well, do you?
Frank:
Don't ask me! What I mean is, should you trust the machine?
Epistemologist:
Should I trust it? I have no idea, and I couldn't care less what I should do.
Frank:
Oh, your moralistic hangup again. I mean, do you have evidence that the machine is reliable?
Epistemologist:
Well of course!
Frank:
Then let's get down to brass tacks. What is your evidence?
Epistemologist:
You hardly can expect that I can answer this for you in an hour, a day, or a week. If you wish to study this machine with me, we can do so, but I assure you this is a matter of several years. At the end of that time, however, you would certainly not have the slightest doubts about the reliability of the machine.
Frank:
Well, possibly I could believe that it is reliable in the sense that its measurements are accurate, but then I would doubt that what it actually measures is very significant. It seems that all it measures is one's physiological states and activities.
Epistemologist:
But of course, what else would you expect it to measure?
Frank:
I doubt that it measures my psychological states, my actual beliefs.
Epistemologist:
Are we back to that again? The machine does measure those physiological states and processes that you call psychological states, beliefs, sensations, and so forth.
Frank:
At this point I am becoming convinced that our entire difference is purely semantical. All right, I will grant that your machine does correctly measure beliefs in your sense of the word "belief," but I don't believe that it has any possibility of measuring beliefs in my sense of the word "believe." In other words I claim that our entire deadlock is simply due to the fact that you and I mean different things by the word "belief."
Epistemologist:
Fortunately, the correctness of your claim can be decided experimentally. It so happens that I now have two brain-reading machines in my office, so I now direct one to your brain to find out what you mean by "believe" and now I direct the other to my own brain to find out what I mean by "believe," and now I shall compare the two readings. Nope, I'm sorry, but it turns out that we mean exactly the same thing by the word "believe."
Frank:
Oh, hang your machine! Do you believe we mean the same thing by the word "believe"?
Epistemologist:
Do I believe it? Just a moment while I check with the machine. Yes, it turns out I do believe it.
Frank:
My goodness, do you mean to say that you can't even tell me what you believe without consulting the machine?
Epistemologist:
Of course not.
Frank:
But most people when asked what they believe simply tell you. Why do you, in order to find out your beliefs, go through the fantastically roundabout process of directing a thought-reading machine to your own brain and then finding out what you believe on the basis of the machine readings?
Epistemologist:
What other scientific, objective way is there of finding out what I believe?
Frank:
Oh, come now, why don't you just ask yourself?
Epistemologist (sadly):
It doesn't work. Whenever I ask myself what I believe, I never get any answer!
Frank:
Well, why don't you just state what you believe?
Epistemologist:
How can I state what I believe before I know what I believe?
Frank:
Oh, to hell with your knowledge of what you believe; surely you have some idea or belief as to what you believe, don't you?
Epistemologist:
Of course I have such a belief. But how do I find out what this belief is?
Frank:
I am afraid we are getting into another infinite regress. Look, at this point I am honestly beginning to wonder whether you may be going crazy.
Epistemologist:
Let me consult the machine. Yes, it turns out that I may be going crazy.
Frank:
Good God, man, doesn't this frighten you?
Epistemologist:
Let me check! Yes, it turns out that it does frighten me.
Frank:
Oh please, can't you forget this damned machine and just tell me whether you are frightened or not?
Epistemologist:
I just told you that I am. However, I only learned of this from the machine.
Frank:
I can see that it is utterly hopeless to wean you away from the machine. Very well, then, let us play along with the machine some more. Why don't you ask the machine whether your sanity can be saved?
Epistemologist:
Good idea! Yes, it turns out that it can be saved.
Frank:
And how can it be saved?
Epistemologist:
I don't know, I haven't asked the machine.
Frank:
Well, for God's sake, ask it!
Epistemologist:
Good idea. It turns out that...
Frank:
It turns out what?
Epistemologist:
It turns out that...
Frank:
Come on now, it turns out what?
Epistemologist:
This is the most fantastic thing I have ever come across! According to the machine the best thing I can do is to cease to trust the machine!
Frank:
Good! What will you do about it?
Epistemologist:
How do I know what I will do about it? I can't read the future.
Frank:
I mean, what do you presently intend to do about it?
Epistemologist:
Good question, let me consult the machine. According to the machine, my current intentions are in complete conflict. And I can see why! I am caught in a terrible paradox! If the machine is trustworthy, then I had better accept its suggestion to distrust it. But if I distrust it, then I also distrust its suggestion to distrust it, so I am really in a total quandary.
Frank:
Look, I know of someone who I think might be really of help in this problem. I'll leave you for a while to consult him. Au revoir!
Scene 4
(Later in the day at a psychiatrist's office.)
Frank:
Doctor, I am terribly worried about a friend of mine. He calls himself an "experimental epistemologist."
Doctor:
Oh, the experimental epistemologist. There is only one in the world. I know him well!
Frank:
That is a relief. But do you realize that he has constructed a mind-reading device that he now directs to his own brain, and whenever one asks him what he thinks, believes, feels, is afraid of, and so on, he has to consult the machine first before answering? Don't you think this is pretty serious?
Doctor:
Not as serious as it might seem. My prognosis for him is actually quite good.
Frank:
Well, if you are a friend of his, couldn't you sort of keep an eye on him?
Doctor:
I do see him quite frequently, and I do observe him much. However, I don't think he can be helped by so-called psychiatric treatment. His problem is an unusual one, the sort that has to work itself out. And I believe it will.
Frank:
Well, I hope your optimism is justified. At any rate I sure think I need some help at this point!
Doctor:
How so?
Frank:
My experiences with the epistemologist have been thoroughly unnerving! At this point I wonder if I may be going crazy; I can't even have confidence in how things appear to me. I think maybe you could be helpful here.
Doctor:
I would be happy to but cannot for a while. For the next three months I am unbelievably overloaded with work. After that, unfortunately, I must go on a three-month vacation. So in six months come back and we can talk this over.
Scene 5
(Same office, six months later.)
Doctor:
Before we go into your problems, you will be happy to hear that your friend the epistemologist is now completely recovered.
Frank:
Marvelous, how did it happen?
Doctor:
Almost, as it were, by a stroke of fate, and yet his very mental activities were, so to speak, part of the fate. What happened was this: For months after you last saw him, he went around worrying, "Should I trust the machine, shouldn't I trust the machine, should I, shouldn't I, should I, shouldn't I?" (He decided to use the word "should" in your empirical sense.) He got nowhere! So he then decided to formalize the whole argument. He reviewed his study of symbolic logic, took the axioms of first-order logic, and added as nonlogical axioms certain relevant facts about the machine. Of course the resulting system was inconsistent; he formally proved that he should trust the machine if and only if he shouldn't, and hence that he both should and should not trust the machine. Now, as you may know, in a system based on classical logic (which is the logic he used), if one can prove so much as a single contradictory proposition, then one can prove any proposition, hence the whole system breaks down. So he decided to use a logic weaker than classical logic (a logic close to what is known as minimal logic) in which the proof of one contradiction does not necessarily entail the proof of every proposition. However, this system turned out too weak to decide the question of whether or not he should trust the machine. Then he had the following bright idea. Why not use classical logic in his system even though the resulting system is inconsistent? Is an inconsistent system necessarily useless? Not at all! Even though given any proposition, there exists a proof that it is true and another proof that it is false, it may be the case that for any such pair of proofs, one of them is simply more psychologically convincing than the other, so simply pick the proof you actually believe! Theoretically the idea turned out very well; the actual system he obtained really did have the property that given any such pair of proofs, one of them was always psychologically far more convincing than the other. Better yet, given any pair of contradictory propositions, all proofs of one were more convincing than any proof of the other. Indeed, anyone except the epistemologist could have used the system to decide whether the machine could be trusted. But with the epistemologist, what happened was this: He obtained one proof that he should trust the machine and another proof that he should not. Which proof was more convincing to him, which proof did he really believe? The only way he could find out was to consult the machine! But he realized that this would be begging the question, since his consulting the machine would be a tacit admission that he did in fact trust the machine. So he still remained in a quandary.
Frank:
So how did he get out of it?
Doctor:
Well, here is where fate kindly interceded. Due to his absolute absorption in the theory of this problem, which consumed about his every waking hour, he became for the first time in his life experimentally negligent. As a result, quite unknown to him, a few minor units of his machine blew out! Then, for the first time, the machine started giving contradictory information: not merely subtle paradoxes, but blatant contradictions. In particular, the machine one day claimed that the epistemologist believed a certain proposition and a few days later claimed he did not believe that proposition. And to add insult to injury, the machine claimed that he had not changed his belief in the last few days. This was enough to simply make him totally distrust the machine. Now he is fit as a fiddle.
Frank:
This is certainly the most amazing thing I have ever heard! I guess the machine was really dangerous and unreliable all along.
Doctor:
Oh, not at all; the machine used to be excellent before the epistemologist's experimental carelessness put it out of whack.
Frank:
Well, surely when I knew it, it couldn't have been very reliable.
Doctor:
Not so, Frank, and this brings us to your problem. I know about your entire conversation with the epistemologist; it was all tape-recorded.
Frank:
Then surely you realize the machine could not have been right when it denied that I believed the book was red.
Doctor:
Why not?
Frank:
Good God, do I have to go through all this nightmare again? I can understand that a person can be wrong if he claims that a certain physical object has a certain property, but have you ever known a single case when a person can be mistaken when he claims to have or not have a certain sensation?
Doctor:
Why, certainly! I once knew a Christian Scientist who had a raging toothache; he was frantically groaning and moaning all over the place. When asked whether a dentist might not cure him, he replied that there was nothing to be cured. Then he was asked, "But do you not feel pain?" He replied, "No, I do not feel pain; nobody feels pain, there is no such thing as pain, pain is only an illusion." So here is a case of a man who claimed not to feel pain, yet everyone present knew perfectly well that he did feel pain. I certainly don't believe he was lying, he was just simply mistaken.
Frank:
Well, all right, in a case like that. But how can one be mistaken if one asserts his belief about the color of a book?
Doctor:
I can assure you that without access to any machine, if I asked someone, "What color is this book?" and he answered, "I believe it is red," I would be very doubtful that he really believed it. It seems to me that if he really believed it, he would answer, "It is red," and not "I believe it is red" or "It seems red to me." The very timidity of his response would be indicative of his doubts.
Frank:
But why on earth should I have doubted that it was red?
Doctor:
You should know that better than I. Let us see now, have you ever in the past had reason to doubt the accuracy of your sense perception?
Frank:
Why, yes. A few weeks before visiting the epistemologist, I suffered from an eye disease, which did make me see colors falsely. But I was cured before my visit.
Doctor:
Oh, so no wonder you doubted it was red! True enough, your eyes perceived the correct color of the book, but your earlier experience lingered in your mind and made it impossible for you to really believe it was red. So the machine was right!
Frank:
Well, all right, but then why did I doubt that I believed it was true?
Doctor:
Because you didn't believe it was true, and unconsciously you were smart enough to realize the fact. Besides, when one starts doubting one's own sense perceptions, the doubt spreads like an infection to higher and higher levels of abstraction until finally the whole belief system becomes one doubting mass of insecurity. I bet that if you went to the epistemologist's office now, and if the machine were repaired, and you now claimed that you believe the book is red, the machine would concur.
No, Frank, the machine is (or, rather, was) a good one. The epistemologist learned much from it, but misused it when he applied it to his own brain. He really should have known better than to create such an unstable situation. The combination of his brain and the machine each scrutinizing and influencing the behavior of the other led to serious problems in feedback. Finally the whole system went into a cybernetic wobble. Something was bound to give sooner or later. Fortunately, it was the machine.
Frank:
I see. One last question, though. How could the machine be trustworthy when it claimed to be untrustworthy?
Doctor:
The machine never claimed to be untrustworthy, it only claimed that the epistemologist would be better off not trusting it. And the machine was right.
