MINDS AND MACHINES
Edited by
ALAN ROSS ANDERSON
Yale University
Drawing by Richter; © 1958 The New Yorker Magazine, Inc.
CONTEMPORARY PERSPECTIVES IN PHILOSOPHY SERIES
PRENTICE-HALL, INC., Englewood Cliffs, New Jersey
A. M. TURING
1. THE IMITATION GAME
I propose to consider the question "Can machines think?" This should begin with definitions of the meaning of the terms "machine" and "think." The definitions might be framed so as to reflect so far as possible the normal use of the words, but this attitude is dangerous. If the meaning of the words "machine" and "think" are to be found by examining how they are commonly used it is difficult to escape the conclusion that the meaning and the answer to the question, "Can machines think?" is to be sought in a statistical survey such as a Gallup poll. But this is absurd. Instead of attempting such a definition I shall replace the question by another, which is closely related to it and is expressed in relatively unambiguous words.
"Computing Machinery and Intelligence," Mind, Vol. LIX, No. 236 (1950). Reprinted by permission of Mrs. Turing and Mind.
The new form of the problem can be described in terms of a game which we call the "imitation game." It is played with three people, a man (A), a woman (B), and an interrogator (C) who may be of either sex. The interrogator stays in a room apart from the other two. The object of the game for the interrogator is to determine which of the other two is the man and which is the woman. He knows them by labels X and Y, and at the end of the game he says either "X is A and Y is B" or "X is B and Y is A." The interrogator is allowed to put questions to A and B thus:

C: Will X please tell me the length of his or her hair?

Now suppose X is actually A, then A must answer. It is A's object in the game to try to cause C to make the wrong identification. His answer might therefore be "My hair is shingled, and the longest strands are about nine inches long." In order that tones of voice may not help the interrogator the answers should be written, or better still, typewritten. The ideal arrangement is to have a teleprinter communicating between the two rooms. Alternatively the question and answers can be repeated by an intermediary. The object of the game for the third player (B) is to help the interrogator. The best strategy for her is probably to give truthful answers. She can add such things as "I am the woman, don't listen to him!" to her answers, but it will avail nothing as the man can make similar remarks. We now ask the question, "What will happen when a machine takes the part of A in this game?" Will the interrogator decide wrongly as often when the game is played like this as he does when the game is played between a man and a woman? These questions replace our original, "Can machines think?"
2. CRITIQUE OF THE NEW PROBLEM
As well as asking, "What is the answer to this new form of the question," one may ask, "Is this new question a worthy one to investigate?" This latter question we investigate without further ado, thereby cutting short an infinite regress. The new problem has the advantage of drawing a fairly sharp line between the physical and the intellectual capacities of a man. No engineer or chemist claims to be able to produce a material which is indistinguishable from the human skin. It is possible that at some time this
might be done, but even supposing this invention available we should feel there was little point in trying to make a "thinking machine" more human by dressing it up in such artificial flesh. The form in which we have set the problem reflects this fact in the condition which prevents the interrogator from seeing or touching the other competitors, or hearing their voices. Some other advantages of the proposed criterion may be shown up by specimen questions and answers. Thus:

Q: Please write me a sonnet on the subject of the Forth Bridge.
A: Count me out on this one. I never could write poetry.
Q: Add 34957 to 70764.
A: (Pause about 30 seconds and then give as answer) 105621.
Q: Do you play chess?
A: Yes.
Q: I have K at my K1, and no other pieces. You have only K at K6 and R at R1. It is your move. What do you play?
A: (After a pause of 15 seconds) R-R8 mate.

The question and answer method seems to be suitable for introducing almost any one of the fields of human endeavor that we wish to include. We do not wish to penalize the machine for its inability to shine in beauty competitions, nor to penalize a man for losing in a race against an airplane. The conditions of our game make these disabilities irrelevant. The "witnesses" can brag, if they consider it advisable, as much as they please about their charms, strength or heroism, but the interrogator cannot demand practical demonstrations. The game may perhaps be criticized on the ground that the odds are weighted too heavily against the machine. If the man were to try and pretend to be the machine he would clearly make a very poor showing. He would be given away at once by slowness and inaccuracy in arithmetic. May not machines carry out something which ought to be described as thinking but which is very different from what a man does?
This objection is a very strong one, but at least we can say that if, nevertheless, a machine can be constructed to play the imitation game satisfactorily, we need not be troubled by this objection. It might be urged that when playing the "imitation game" the best strategy for the machine may possibly be something other than imitation of the behavior of a man. This may be, but I think it is unlikely that there is any great effect of this kind. In any case there is no intention to investigate here the theory of the game, and it will be assumed that the best strategy is to try to provide answers that would naturally be given by a man.
3. THE MACHINES CONCERNED IN THE GAME

The question which we put in §1 will not be quite definite until we have specified what we mean by the word "machine." It is natural that we should wish to permit every kind of engineering technique to be used in our machines. We also wish to allow the possibility that an engineer or team of engineers may construct a machine which works, but whose manner of operation cannot be satisfactorily described by its constructors because they have applied a method which is largely experimental. Finally, we wish to exclude from the machines men born in the usual manner. It is difficult to frame the definitions so as to satisfy these three conditions. One might for instance insist that the team of engineers should be all of one sex, but this would not really be satisfactory, for it is probably possible to rear a complete individual from a single cell of the skin (say) of a man. To do so would be a feat of biological technique deserving of the very highest praise, but we would not be inclined to regard it as a case of "constructing a thinking machine." This prompts us to abandon the requirement that every kind of technique should be permitted. We are the more ready to do so in view of the fact that the present interest in "thinking machines" has been aroused by a particular kind of machine, usually called an "electronic computer" or "digital computer." Following this suggestion we only permit digital computers to take part in our game. This restriction appears at first sight to be a very drastic one. I shall attempt to show that it is not so in reality. To do this necessitates a short account of the nature and properties of these computers. It may also be said that this identification of machines with digital computers, like our criterion for "thinking," will only be unsatisfactory if (contrary to my belief), it turns out that digital computers are unable to give a good showing in the game.
There are already a number of digital computers in working order, and it may be asked, "Why not try the experiment straight away? It would be easy to satisfy the conditions of the game. A number of interrogators could be used, and statistics compiled to show how often the right identification was given." The short answer is that we are not asking whether all digital computers would do well in the game nor whether the computers at present available would do well, but whether there are imaginable computers which would do well. But this is only the short answer. We shall see this question in a different light later.
4. DIGITAL COMPUTERS
The idea behind digital computers may be explained by saying that these machines are intended to carry out any operations which could be done by a human computer. The human computer is supposed to be following fixed rules; he has no authority to deviate from them in any detail. We may suppose that these rules are supplied in a book, which is altered whenever he is put on to a new job. He has also an unlimited supply of paper on which he does his calculations. He may also do his multiplications and additions on a "desk machine," but this is not important. If we use the above explanation as a definition we shall be in danger of circularity of argument. We avoid this by giving an outline of the means by which the desired effect is achieved. A digital computer can usually be regarded as consisting of three parts: (i) Store. (ii) Executive unit. (iii) Control. The store is a store of information, and corresponds to the human computer's paper, whether this is the paper on which he does his calculations or that on which his book of rules is printed. Insofar as the human computer does calculations in his head a part of the store will correspond to his memory. The executive unit is the part which carries out the various individual operations involved in a calculation. What these individual operations are will vary from machine to machine. Usually fairly lengthy operations can be done such as "Multiply 3540675445 by 7076345687" but in some machines only very simple ones such as "Write down 0" are possible. We have mentioned that the "book of rules" supplied to the computer is replaced in the machine by a part of the store. It is then called the "table of instructions." It is the duty of the control to see that these instructions are obeyed correctly and in the right order. The control is so constructed that this necessarily happens. The information in the store is usually broken up into packets of moderately small size.
In one machine, for instance, a packet might consist of ten decimal digits. Numbers are assigned to the parts of the store in which the various packets of information are stored, in some systematic manner. A typical instruction might say:
"Add the number stored in position 6809 to that in 4302 and put the result back into the latter storage position." Needless to say it would not occur in the machine expressed in English. It would more likely be coded in a form such as 6809430217. Here 17 says which of various possible operations is to be performed on the two numbers. In this case the operation is that described above, viz. "Add the number. . . ." It will be noticed that the instruction takes up 10 digits and so forms one packet of information, very conveniently. The control will normally take the instructions to be obeyed in the order of the positions in which they are stored, but occasionally an instruction such as "Now obey the instruction stored in position 5606, and continue from there" may be encountered, or again "If position 4505 contains 0 obey next the instruction stored in 6707, otherwise continue straight on." Instructions of these latter types are very important because they make it possible for a sequence of operations to be repeated over and over again until some condition is fulfilled, but in doing so to obey, not fresh instructions on each repetition, but the same ones over and over again. To take a domestic analogy. Suppose Mother wants Tommy to call at the cobbler's every morning on his way to school to see if her shoes are done; she can ask him afresh every morning. Alternatively she can stick up a notice once and for all in the hall which he will see when he leaves for school and which tells him to call for the shoes, and also to destroy the notice when he comes back if he has the shoes with him. The reader must accept it as a fact that digital computers can be constructed, and indeed have been constructed, according to the principles we have described, and that they can in fact mimic the actions of a human computer very closely. The book of rules which we have described our human computer as using is of course a convenient fiction.
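The packet scheme just described can be sketched in a few lines of modern code. The 10-digit layout (two four-digit store positions followed by a two-digit operation code, with 17 meaning "add") follows the text; the codes 20 and 21 chosen here for the two jump instructions, and the names `run` and `store`, are illustrative assumptions, not features of any historical machine.

```python
# A minimal sketch of store, executive unit, and control. The store maps
# positions to 10-digit packets; the control obeys packets in order of
# position unless a jump instruction says otherwise.

def run(store, pc=0, max_steps=1000):
    """Obey instructions starting at position `pc`, as the control would."""
    for _ in range(max_steps):
        packet = store.get(pc)
        if packet is None:          # no instruction here: halt
            break
        a = packet // 10**6         # first four digits: a store position
        b = (packet // 100) % 10**4  # next four digits: a store position
        op = packet % 100           # last two digits: the operation code
        if op == 17:                # add contents of a to contents of b
            store[b] = store[a] + store[b]
            pc += 1
        elif op == 20:              # "now obey the instruction stored in a"
            pc = a
        elif op == 21:              # "if position a contains 0, jump to b"
            pc = b if store[a] == 0 else pc + 1
        else:
            break                   # unknown operation: halt
    return store
```

Obeying the single packet 6809430217 with 5 in position 6809 and 7 in position 4302 leaves 12 in position 4302, exactly as the worked instruction above describes.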
Actual human computers really remember what they have got to do. If one wants to make a machine mimic the behavior of the human computer in some complex operation one has to ask him how it is done, and then translate the answer into the form of an instruction table. Constructing instruction tables is usually described as "programming." To "program a machine to carry out the operation A" means to put the appropriate instruction table into the machine so that it will do A. An interesting variant on the idea of a digital computer is a "digital computer with a random element." These have instructions involving the throwing of a die or some equivalent electronic process; one such instruction might for instance be, "Throw the die and put the resulting number into store 1000." Sometimes such a machine is described as having free will (though I would not use this phrase myself). It is not normally possible to determine from observing a machine whether it has a random element, for a similar effect can be produced by such devices as making the choices depend on the digits of the decimal for π. Most actual digital computers have only a finite store. There is no theoretical difficulty in the idea of a computer with an unlimited store. Of course only a finite part can have been used at any one time. Likewise only a finite amount can have been constructed, but we can imagine more and more being added as required. Such computers have special theoretical interest and will be called infinite capacity computers. The idea of a digital computer is an old one. Charles Babbage, Lucasian Professor of Mathematics at Cambridge from 1828 to 1839, planned such a machine, called the Analytical Engine, but it was never completed. Although Babbage had all the essential ideas, his machine was not at that time such a very attractive prospect. The speed which would have been available would be definitely faster than a human computer but something like 100 times slower than the Manchester machine, itself one of the slower of the modern machines. The storage was to be purely mechanical, using wheels and cards. The fact that Babbage's Analytical Engine was to be entirely mechanical will help us to rid ourselves of a superstition. Importance is often attached to the fact that modern digital computers are electrical, and that the nervous system also is electrical.
Since Babbage's machine was not electrical, and since all digital computers are in a sense equivalent, we see that this use of electricity cannot be of theoretical importance. Of course electricity usually comes in where fast signaling is concerned, so that it is not surprising that we find it in both these connections. In the nervous system chemical phenomena are at least as important as electrical. In certain computers the storage system is mainly acoustic. The feature of using electricity is thus seen to be only a very superficial similarity. If we wish to find such similarities we should look rather for mathematical analogies of function.
5. UNIVERSALITY OF DIGITAL COMPUTERS

The digital computers considered in the last section may be classified among the "discrete state machines." These are the machines which move by sudden jumps or clicks from one quite definite state to another. These states are sufficiently different for the possibility of confusion between them to be ignored. Strictly speaking there are no such machines. Everything really moves continuously. But there are many kinds of machines which can profitably be thought of as being discrete state machines. For instance in considering the switches for a lighting system it is a convenient fiction that each switch must be definitely on or definitely off. There must be intermediate positions, but for most purposes we can forget about them. As an example of a discrete state machine we might consider a wheel which clicks round through 120° once a second, but may be stopped by a lever which can be operated from outside; in addition a lamp is to light in one of the positions of the wheel. This machine could be described abstractly as follows: The internal state of the machine (which is described by the position of the wheel) may be q1, q2 or q3. There is an input signal i0 or i1 (position of lever). The internal state at any moment is determined by the last state and input signal according to the table

                Last State
                q1    q2    q3
    Input  i0   q2    q3    q1
           i1   q1    q2    q3

The output signals, the only externally visible indication of the internal state (the light), are described by the table

    State    q1    q2    q3
    Output   o0    o0    o1
This example is typical of discrete state machines. They can be described by such tables provided they have only a finite number of possible states. It will seem that given the initial state of the machine and the input signals it is always possible to predict all future states. This is reminiscent of Laplace's view that from the complete state of the universe at one moment of time, as described by the positions and velocities of all particles, it should be possible to predict all future states. The prediction which we are considering is, however, rather nearer to practicability than that considered by Laplace. The system of the "universe as a whole" is such that quite small errors in the initial conditions can have an overwhelming effect at a later time. The displacement of a single
electron by a billionth of a centimetre at one moment might make the difference between a man being killed by an avalanche a year later, or escaping. It is an essential property of the mechanical systems which we have called "discrete state machines" that this phenomenon does not occur. Even when we consider the actual physical machines instead of the idealized machines, reasonably accurate knowledge of the state at one moment yields reasonably accurate knowledge any number of steps later. As we have mentioned, digital computers fall within the class of discrete state machines. But the number of states of which such a machine is capable is usually enormously large. For instance, the number for the machine now working at Manchester is about 2^165,000, i.e., about 10^50,000. Compare this with our example of the clicking wheel described above, which had three states. It is not difficult to see why the number of states should be so immense. The computer includes a store corresponding to the paper used by a human computer. It must be possible to write into the store any one of the combinations of symbols which might have been written on the paper. For simplicity suppose that only digits from 0 to 9 are used as symbols. Variations in handwriting are ignored. Suppose the computer is allowed 100 sheets of paper each containing 50 lines each with room for 30 digits. Then the number of states is 10^(100×50×30), i.e., 10^150,000. This is about the number of states of three Manchester machines put together. The logarithm to the base two of the number of states is usually called the "storage capacity" of the machine. Thus the Manchester machine has a storage capacity of about 165,000 and the wheel machine of our example about 1.6. If two machines are put together their capacities must be added to obtain the capacity of the resultant machine.
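The wheel machine and the capacity figures above can be checked with a short script. The encoding is my own, not anything from the text: it assumes that with the lever up (i0) the wheel clicks q1 → q2 → q3 → q1, that with the lever in (i1) it holds its state, and that the lamp (output o1) is lit only in state q3.

```python
from math import log2

# A sketch of the three-state clicking-wheel machine, plus the "storage
# capacity" arithmetic (capacity = log base two of the number of states).

TRANSITION = {  # (last state, lever position) -> next state
    ("q1", "i0"): "q2", ("q2", "i0"): "q3", ("q3", "i0"): "q1",
    ("q1", "i1"): "q1", ("q2", "i1"): "q2", ("q3", "i1"): "q3",
}
OUTPUT = {"q1": "o0", "q2": "o0", "q3": "o1"}  # o1: the lamp is lit

def outputs(state, levers):
    """Run the wheel from `state`; collect the visible signal each second."""
    seen = []
    for lever in levers:
        state = TRANSITION[(state, lever)]
        seen.append(OUTPUT[state])
    return seen

# The three-state wheel: log2(3), about 1.6 as stated.
wheel_capacity = log2(3)

# 100 sheets x 50 lines x 30 digits, ten symbols per digit: 10^150,000
# states. Take the logarithm directly rather than form the huge number:
# 150,000 * log2(10) is about 498,000, roughly three Manchester machines
# (3 x 165,000 = 495,000).
paper_capacity = 100 * 50 * 30 * log2(10)
```

Note that the capacity of the paper example is computed as a logarithm from the start; the number of states itself, 10^150,000, is far too large to represent directly.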
This leads to the possibility of statements such as "The Manchester machine contains 64 magnetic tracks each with a capacity of 2560, eight electronic tubes with a capacity of 1280. Miscellaneous storage amounts to about 300 making a total of 174,380." Given the table corresponding to a discrete state machine it is possible to predict what it will do. There is no reason why this calculation should not be carried out by means of a digital computer. Provided it could be carried out sufficiently quickly the digital computer could mimic the behavior of any discrete state machine. The imitation game could then be played with the machine in question (as B) and the mimicking digital computer (as A) and the interrogator would be unable to distinguish them. Of course the digital computer must have
adequate storage capacity as well as working sufficiently fast. Moreover, it must be programmed afresh for each new machine which it is desired to mimic. This special property of digital computers, that they can mimic any discrete state machine, is described by saying that they are universal machines. The existence of machines with this property has the important consequence that, considerations of speed apart, it is unnecessary to design various new machines to do various computing processes. They can all be done with one digital computer, suitably programmed for each case. It will be seen that as a consequence of this all digital computers are in a sense equivalent. We may now consider again the point raised at the end of §3. It was suggested tentatively that the question, "Can machines think?" should be replaced by "Are there imaginable digital computers which would do well in the imitation game?" If we wish we can make this superficially more general and ask "Are there discrete state machines which would do well?" But in view of the universality property we see that either of these questions is equivalent to this, "Let us fix our attention on one particular digital computer C. Is it true that by modifying this computer to have an adequate storage, suitably increasing its speed of action, and providing it with an appropriate program, C can be made to play satisfactorily the part of A in the imitation game, the part of B being taken by a man?"
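The universality claim can be put in miniature: one fixed routine that mimics any discrete state machine whose description is supplied as data, so that changing the machine means changing only the description, never the routine. The two toy machine descriptions below are invented illustrations of my own.

```python
# A single routine that mimics an arbitrary discrete state machine given
# its transition and output tables as data.

def mimic(transition, output, state, signals):
    """Run the described machine on a sequence of input signals."""
    visible = []
    for s in signals:
        state = transition[(state, s)]
        visible.append(output[state])
    return visible

# Two unrelated machines, one program. The descriptions are made up:
# a two-state toggle and a three-state cycle.
toggle = (
    {("off", "p"): "on", ("on", "p"): "off"},
    {"off": 0, "on": 1},
)
cycle = (
    {("a", "t"): "b", ("b", "t"): "c", ("c", "t"): "a"},
    {"a": "A", "b": "B", "c": "C"},
)
```

Running `mimic(*toggle, "off", ["p", "p", "p"])` and `mimic(*cycle, "a", ["t", "t", "t"])` exercises both machines with the same unmodified routine, which is the sense in which one digital computer, suitably programmed, replaces many special-purpose machines.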
6. CONTRARY VIEWS ON THE MAIN QUESTION
We may now consider the ground to have been cleared and we are ready to proceed to the debate on our question, "Can machines think?" and the variant of it quoted at the end of the last section. We cannot altogether abandon the original form of the problem, for opinions will differ as to the appropriateness of the substitution and we must at least listen to what has to be said in this connection. It will simplify matters for the reader if I explain first my own beliefs in the matter. Consider first the more accurate form of the question. I believe that in about fifty years' time it will be possible to program computers, with a storage capacity of about 10^9, to make them play the imitation game so well that an average interrogator will not have more than 70 per cent chance of making the right identification after five minutes of questioning. The original question, "Can machines think?" I believe to be too meaningless to deserve discussion. Nevertheless I believe that at the end of the century the use of words and educated opinion will have altered so much that one will be able to speak of machines thinking without expecting to be contradicted. I believe further that no useful purpose is served by concealing these beliefs. The popular view that scientists proceed inexorably from well-established fact to well-established fact, never being influenced by any unproved conjecture, is quite mistaken. Provided it is made clear which are proved facts and which are conjectures, no harm can result. Conjectures are of great importance since they suggest useful lines of research. I now proceed to consider opinions opposed to my own.

(1) The Theological Objection. Thinking is a function of man's immortal soul. God has given an immortal soul to every man and woman, but not to any other animal or to machines. Hence no animal or machine can think.¹

I am unable to accept any part of this, but will attempt to reply in theological terms. I should find the argument more convincing if animals were classed with men, for there is a greater difference, to my mind, between the typical animate and the inanimate than there is between man and the other animals. The arbitrary character of the orthodox view becomes clearer if we consider how it might appear to a member of some other religious community. How do Christians regard the Moslem view that women have no souls? But let us leave this point aside and return to the main argument. It appears to me that the argument quoted above implies a serious restriction of the omnipotence of the Almighty. It is admitted that there are certain things that He cannot do such as making one equal to two, but should we not believe that He has freedom to confer a soul on an elephant if He sees fit?
We might expect that He would only exercise this power in conjunction with a mutation which provided the elephant with an appropriately improved brain to minister to the needs of this soul. An argument of exactly similar form may be made for the case of machines. It may seem different because it is more difficult to "swallow." But this really only means that we think it would be less likely that He would consider the circumstances suitable for conferring a soul. The circumstances in question are discussed in the rest of this paper. In attempting to construct such machines we should not be irreverently usurping His power of creating souls, any more than
¹ Possibly this view is heretical. St. Thomas Aquinas [Summa Theologica, quoted by Bertrand Russell, A History of Western Philosophy (New York: Simon and Schuster, 1945), p. 418] states that God cannot make a man to have no soul. But this may not be a real restriction on His powers, but only a result of the fact that men's souls are immortal, and therefore indestructible.
we are in the procreation of children: rather we are, in either case, instruments of His will providing mansions for the souls that He creates. However, this is mere speculation. I am not very impressed with theological arguments whatever they may be used to support. Such arguments have often been found unsatisfactory in the past. In the time of Galileo it was argued that the texts, "And the sun stood still . . . and hasted not to go down about a whole day" (Joshua x. 13) and "He laid the foundations of the earth, that it should not move at any time" (Psalm cv. 5) were an adequate refutation of the Copernican theory. With our present knowledge such an argument appears futile. When that knowledge was not available it made a quite different impression.

(2) The "Heads in the Sand" Objection. "The consequences of machines thinking would be too dreadful. Let us hope and believe that they cannot do so."

This argument is seldom expressed quite so openly as in the form above. But it affects most of us who think about it at all. We like to believe that Man is in some subtle way superior to the rest of creation. It is best if he can be shown to be necessarily superior, for then there is no danger of him losing his commanding position. The popularity of the theological argument is clearly connected with this feeling. It is likely to be quite strong in intellectual people, since they value the power of thinking more highly than others, and are more inclined to base their belief in the superiority of Man on this power. I do not think that this argument is sufficiently substantial to require refutation. Consolation would be more appropriate: perhaps this should be sought in the transmigration of souls.

(3) The Mathematical Objection. There are a number of results of mathematical logic which can be used to show that there are limitations to the powers of discrete state machines.
The best known of these results is known as Gödel's theorem, and shows that in any sufficiently powerful logical system statements can be formulated which can neither be proved nor disproved within the system, unless possibly the system itself is inconsistent. There are other, in some respects similar, results due to Church,² Kleene, Rosser, and Turing. The latter result is the most convenient to consider since it refers directly to machines, whereas the others can only be used in a comparatively indirect argument: for instance if Gödel's theorem is to be used we need in addition to have some means of describing logical systems in terms of machines, and
² Authors' names in italics refer to works cited in the bibliography.
machines in terms of logical systems. The result in question refers to a type of machine which is essentially a digital computer with an infinite capacity. It states that there are certain things that such a machine cannot do. If it is rigged up to give answers to questions as in the imitation game, there will be some questions to which it will either give a wrong answer, or fail to give an answer at all however much time is allowed for a reply. There may, of course, be many such questions, and questions which cannot be answered by one machine may be satisfactorily answered by another. We are of course supposing for the present that the questions are of the kind to which an answer "Yes" or "No" is appropriate, rather than questions such as "What do you think of Picasso?" The questions that we know the machines must fail on are of this type, "Consider the machine specified as follows. . . . Will this machine ever answer 'Yes' to any question?" The dots are to be replaced by a description of some machine in a standard form, which could be something like that used in Sec. 5. When the machine described bears a certain comparatively simple relation to the machine which is under interrogation, it can be shown that the answer is either wrong or not forthcoming. This is the mathematical result: it is argued that it proves a disability of machines to which the human intellect is not subject. The short answer to this argument is that although it is established that there are limitations to the powers of any particular machine, it has only been stated, without any sort of proof, that no such limitations apply to the human intellect. But I do not think this view can be dismissed quite so lightly. Whenever one of these machines is asked the appropriate critical question, and gives a definite answer, we know that this answer must be wrong, and this gives us a certain feeling of superiority.
Is this feeling illusory? It is no doubt quite genuine, but I do not think too much importance should be attached to it. We too often give wrong answers to questions ourselves to be justified in being very pleased at such evidence of fallibility on the part of the machines. Further, our superiority can only be felt on such an occasion in relation to the one machine over which we have scored our petty triumph. There would be no question of triumphing simultaneously over all machines. In short, then, there might be men cleverer than any given machine, but then again there might be other machines cleverer again, and so on. Those who hold to the mathematical argument would, I think, mostly be willing to accept the imitation game as a basis for discussion. Those who believe in the two previous objections would probably not be interested in any criteria.
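The contrariety at the heart of the mathematical result can be sketched in a few lines. The modeling choices are entirely my own: a machine is represented as a Python function, the proposed "decider" of the critical question is assumed to be a pure function of its argument, and a function is passed as its own standard-form description.

```python
# Sketch: for any proposed decider of the critical question "will this
# machine ever answer 'Yes'?", build a contrary machine that consults the
# decider about itself and does the opposite. The decider is then wrong
# about that machine, whatever rule it uses.

def refuted_on(decider):
    def contrary():
        # Ask the decider about this very machine, then contradict it.
        return "No" if decider(contrary) else "Yes"
    prediction = decider(contrary)      # the decider's claim about contrary
    actually = contrary() == "Yes"      # what contrary in fact answers
    return prediction, actually        # these always disagree

# However the decider is chosen, it fails on its contrary machine:
for decider in (lambda m: True, lambda m: False):
    prediction, actually = refuted_on(decider)
    assert prediction != actually
```

This is only the shape of the argument, of course: the theorem itself concerns machines in a standard form and a precise relation between the described machine and the one under interrogation, not Python functions.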
(4) The Argument from Consciousness. This argument is very well expressed in Professor Jefferson's Lister Oration for 1949, from which I quote. "Not until a machine can write a sonnet or compose a concerto because of thoughts and emotions felt, and not by the chance fall of symbols, could we agree that machine equals brain, that is, not only write it but know that it had written it. No mechanism could feel (and not merely artificially signal, an easy contrivance) pleasure at its successes, grief when its valves fuse, be warmed by flattery, be made miserable by its mistakes, be charmed by sex, be angry or depressed when it cannot get what it wants."

This argument appears to be a denial of the validity of our test. According to the most extreme form of this view the only way by which one could be sure that a machine thinks is to be the machine and to feel oneself thinking. One could then describe these feelings to the world, but of course no one would be justified in taking any notice. Likewise according to this view the only way to know that a man thinks is to be that particular man. It is in fact the solipsist point of view. It may be the most logical view to hold but it makes communication of ideas difficult. A is liable to believe "A thinks but B does not" while B believes "B thinks but A does not." Instead of arguing continually over this point it is usual to have the polite convention that everyone thinks.

I am sure that Professor Jefferson does not wish to adopt the extreme and solipsist point of view. Probably he would be quite willing to accept the imitation game as a test. The game (with the player B omitted) is frequently used in practice under the name of viva voce to discover whether someone really understands something or has "learned it parrot fashion."
Let us listen in to a part of such a viva voce:

Interrogator: In the first line of your sonnet which reads "Shall I compare thee to a summer's day," would not "a spring day" do as well or better?

Witness: It wouldn't scan.

Interrogator: How about "a winter's day." That would scan all right.

Witness: Yes, but nobody wants to be compared to a winter's day.

Interrogator: Would you say Mr. Pickwick reminded you of Christmas?

Witness: In a way.

Interrogator: Yet Christmas is a winter's day, and I do not think Mr. Pickwick would mind the comparison.

Witness: I don't think you're serious. By a winter's day one means a typical winter's day, rather than a special one like Christmas.
And so on. What would Professor Jefferson say if the sonnet-writing machine was able to answer like this in the viva voce? I do not know whether he would regard the machine as "merely artificially signaling" these answers, but if the answers were as satisfactory and sustained as in the above passage I do not think he would describe it as "an easy contrivance." This phrase is, I think, intended to cover such devices as the inclusion in the machine of a record of someone reading a sonnet, with appropriate switching to turn it on from time to time.

In short then, I think that most of those who support the argument from consciousness could be persuaded to abandon it rather than be forced into the solipsist position. They will then probably be willing to accept our test.

I do not wish to give the impression that I think there is no mystery about consciousness. There is, for instance, something of a paradox connected with any attempt to localize it. But I do not think these mysteries necessarily need to be solved before we can answer the question with which we are concerned in this paper.

(5) Arguments from Various Disabilities. These arguments take the form, "I grant you that you can make machines do all the things you have mentioned but you will never be able to make one to do X." Numerous features X are suggested in this connection. I offer a selection: Be kind, resourceful, beautiful, friendly (p. 19), have initiative, have a sense of humor, tell right from wrong, make mistakes (p. 19), fall in love, enjoy strawberries and cream (p. 19), make someone fall in love with it, learn from experience (pp. 25 f.), use words properly, be the subject of its own thought (p. 20), have as much diversity of behavior as a man, do something really new (p. 20). (Some of these disabilities are given special consideration as indicated by the page numbers.) No support is usually offered for these statements.
I believe they are mostly founded on the principle of scientific induction. A man has seen thousands of machines in his lifetime. From what he sees of them he draws a number of general conclusions. They are ugly, each is designed for a very limited purpose, when required for a minutely different purpose they are useless, the variety of behavior of any one of them is very small, etc., etc. Naturally he concludes that these are necessary properties of machines in general. Many of these limitations are associated with the very small storage capacity of most machines. (I am assuming that the idea of storage capacity is extended in some way to cover machines other than discrete state machines. The exact definition does not matter as no mathematical accuracy is claimed in the present discussion.) A few years ago, when very little had been heard of digital computers, it was possible to elicit much incredulity concerning them, if one mentioned their properties without describing their construction. That was presumably due to a similar application of the principle of scientific induction. These applications of the principle are of course largely unconscious. When a burned child fears the fire and shows that he fears it by avoiding it, I should say that he was applying scientific induction. (I could of course also describe his behavior in many other ways.) The works and customs of mankind do not seem to be very suitable material to which to apply scientific induction. A very large part of space-time must be investigated if reliable results are to be obtained. Otherwise we may (as most English children do) decide that everybody speaks English, and that it is silly to learn French.

There are, however, special remarks to be made about many of the disabilities that have been mentioned. The inability to enjoy strawberries and cream may have struck the reader as frivolous. Possibly a machine might be made to enjoy this delicious dish, but any attempt to make one do so would be idiotic. What is important about this disability is that it contributes to some of the other disabilities, e.g., to the difficulty of the same kind of friendliness occurring between man and machine as between white man and white man, or between black man and black man.

The claim that "machines cannot make mistakes" seems a curious one. One is tempted to retort, "Are they any the worse for that?" But let us adopt a more sympathetic attitude, and try to see what is really meant. I think this criticism can be explained in terms of the imitation game.
It is claimed that the interrogator could distinguish the machine from the man simply by setting them a number of problems in arithmetic. The machine would be unmasked because of its deadly accuracy. The reply to this is simple. The machine (programed for playing the game) would not attempt to give the right answers to the arithmetic problems. It would deliberately introduce mistakes in a manner calculated to confuse the interrogator. A mechanical fault would probably show itself through an unsuitable decision as to what sort of a mistake to make in the arithmetic. Even this interpretation of the criticism is not sufficiently sympathetic. But we cannot afford the space to go into it much further. It seems to me that this criticism depends on a confusion between two kinds of mistakes. We may call them "errors of functioning" and "errors of conclusion." Errors of functioning are due to some mechanical or electrical fault which causes the machine to behave otherwise than it was designed to do. In philosophical discussions one likes to ignore the possibility of such errors; one is therefore discussing "abstract machines." These abstract machines are mathematical fictions rather than physical objects. By definition they are incapable of errors of functioning. In this sense we can truly say that "machines can never make mistakes." Errors of conclusion can only arise when some meaning is attached to the output signals from the machine. The machine might, for instance, type out mathematical equations, or sentences in English. When a false proposition is typed we say that the machine has committed an error of conclusion. There is clearly no reason at all for saying that a machine cannot make this kind of mistake. It might do nothing but type out repeatedly "0 = 1." To take a less perverse example, it might have some method for drawing conclusions by scientific induction. We must expect such a method to lead occasionally to erroneous results.

The claim that a machine cannot be the subject of its own thought can of course only be answered if it can be shown that the machine has some thought with some subject matter. Nevertheless, "the subject matter of a machine's operations" does seem to mean something, at least to the people who deal with it. If, for instance, the machine was trying to find a solution of the equation x² - 40x - 11 = 0 one would be tempted to describe this equation as part of the machine's subject matter at that moment. In this sort of sense a machine undoubtedly can be its own subject matter. It may be used to help in making up its own programs, or to predict the effect of alterations in its own structure. By observing the results of its own behavior it can modify its own programs so as to achieve some purpose more effectively. These are possibilities of the near future, rather than Utopian dreams.
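The equation mentioned above is an ordinary quadratic, and a machine's work on such "subject matter" is routine. A minimal sketch of the computation (the function name and tolerance are my own, not from the text):

```python
import math

def solve_quadratic(a, b, c):
    """Return the real roots of a*x^2 + b*x + c = 0 by the quadratic formula."""
    disc = b * b - 4 * a * c
    if disc < 0:
        return []  # no real roots
    root = math.sqrt(disc)
    return [(-b - root) / (2 * a), (-b + root) / (2 * a)]

# The equation from the text: x^2 - 40x - 11 = 0
roots = solve_quadratic(1, -40, -11)  # roots near -0.27 and 40.27
```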
The criticism that a machine cannot have much diversity of behavior is just a way of saying that it cannot have much storage capacity. Until fairly recently a storage capacity of even a thousand digits was very rare.

The criticisms that we are considering here are often disguised forms of the argument from consciousness. Usually if one maintains that a machine can do one of these things, and describes the kind of method that the machine could use, one will not make much of an impression. It is thought that the method (whatever it may be, for it must be mechanical) is rather base. Compare the parenthesis in Jefferson's statement quoted above.

(6) Lady Lovelace's Objection. Our most detailed information of Babbage's Analytical Engine comes from a memoir by Lady Lovelace. In it she states, "The Analytical Engine has no pretensions to originate anything. It can do whatever we know how to order it to perform" (her italics). This statement is quoted by Hartree who adds: "This does not imply that it may not be possible to construct electronic equipment which will 'think for itself,' or in which, in biological terms, one could set up a conditioned reflex, which would serve as a basis for 'learning.' Whether this is possible in principle or not is a stimulating and exciting question, suggested by some of these recent developments. But it did not seem that the machines constructed or projected at the time had this property."

I am in thorough agreement with Hartree over this. It will be noticed that he does not assert that the machines in question had not got the property, but rather that the evidence available to Lady Lovelace did not encourage her to believe that they had it. It is quite possible that the machines in question had in a sense got this property. For suppose that some discrete state machine has the property. The Analytical Engine was a universal digital computer, so that, if its storage capacity and speed were adequate, it could by suitable programing be made to mimic the machine in question. Probably this argument did not occur to the Countess or to Babbage. In any case there was no obligation on them to claim all that could be claimed.

This whole question will be considered again under the heading of learning machines. A variant of Lady Lovelace's objection states that a machine can "never do anything really new." This may be parried for a moment with the saw, "There is nothing new under the sun." Who can be certain that "original work" that he has done was not simply the growth of the seed planted in him by teaching, or the effect of following well-known general principles. A better variant of the objection says that a machine can never "take us by surprise."
This statement is a more direct challenge and can be met directly. Machines take me by surprise with great frequency. This is largely because I do not do sufficient calculation to decide what to expect them to do, or rather because, although I do a calculation, I do it in a hurried, slipshod fashion, taking risks. Perhaps I say to myself, "I suppose the voltage here ought to be the same as there: anyway let's assume it is." Naturally I am often wrong, and the result is a surprise for me, for by the time the experiment is done these assumptions have been forgotten. These admissions lay me open to lectures on the subject of my vicious ways, but do not throw any doubt on my credibility when I testify to the surprises I experience.

I do not expect this reply to silence my critic. He will probably say that such surprises are due to some creative mental act on my part, and reflect no credit on the machine. This leads us back to the argument from consciousness, and far from the idea of surprise. It is a line of argument we must consider closed, but it is perhaps worth remarking that the appreciation of something as surprising requires as much of a "creative mental act" whether the surprising event originates from a man, a book, a machine or anything else.

The view that machines cannot give rise to surprises is due, I believe, to a fallacy to which philosophers and mathematicians are particularly subject. This is the assumption that as soon as a fact is presented to a mind all consequences of that fact spring into the mind simultaneously with it. It is a very useful assumption under many circumstances, but one too easily forgets that it is false. A natural consequence of doing so is that one then assumes that there is no virtue in the mere working out of consequences from data and general principles.

(7) Argument from Continuity in the Nervous System. The nervous system is certainly not a discrete state machine. A small error in the information about the size of a nervous impulse impinging on a neuron may make a large difference to the size of the outgoing impulse. It may be argued that, this being so, one cannot expect to be able to mimic the behavior of the nervous system with a discrete state system.

It is true that a discrete state machine must be different from a continuous machine. But if we adhere to the conditions of the imitation game, the interrogator will not be able to take any advantage of this difference. The situation can be made clearer if we consider some other simpler continuous machine. A differential analyzer will do very well. (A differential analyzer is a certain kind of machine not of the discrete state type used for some kinds of calculation.)
Some of these provide their answers in a typed form, and so are suitable for taking part in the game. It would not be possible for a digital computer to predict exactly what answers the differential analyzer would give to a problem, but it would be quite capable of giving the right sort of answer. For instance, if asked to give the value of π (actually about 3.1416) it would be reasonable to choose at random between the values 3.12, 3.13, 3.14, 3.15, 3.16 with the probabilities of 0.05, 0.15, 0.55, 0.19, 0.06 (say). Under these circumstances it would be very difficult for the interrogator to distinguish the differential analyzer from the digital computer.

(8) The Argument from Informality of Behavior. It is not possible to produce a set of rules purporting to describe what a man should do in every conceivable set of circumstances. One might for instance have a rule that one is to stop when one sees a red traffic light, and to go if one sees a green one, but what if by some fault both appear together? One may perhaps decide that it is safest to stop. But some further difficulty may well arise from this decision later. To attempt to provide rules of conduct to cover every eventuality, even those arising from traffic lights, appears to be impossible. With all this I agree.

From this it is argued that we cannot be machines. I shall try to reproduce the argument, but I fear I shall hardly do it justice. It seems to run something like this. "If each man had a definite set of rules of conduct by which he regulated his life he would be no better than a machine. But there are no such rules, so men cannot be machines." The undistributed middle is glaring. I do not think the argument is ever put quite like this, but I believe this is the argument used nevertheless. There may however be a certain confusion between "rules of conduct" and "laws of behavior" to cloud the issue. By "rules of conduct" I mean precepts such as "Stop if you see red lights," on which one can act, and of which one can be conscious. By "laws of behavior" I mean laws of nature as applied to a man's body such as "if you pinch him he will squeak." If we substitute "laws of behavior which regulate his life" for "laws of conduct by which he regulates his life" in the argument quoted the undistributed middle is no longer insuperable. For we believe that it is not only true that being regulated by laws of behavior implies being some sort of machine (though not necessarily a discrete state machine), but that conversely being such a machine implies being regulated by such laws. However, we cannot so easily convince ourselves of the absence of complete laws of behavior as of complete rules of conduct.
The only way we know of for finding such laws is scientific observation, and we certainly know of no circumstances under which we could say, "We have searched enough. There are no such laws."

We can demonstrate more forcibly that any such statement would be unjustified. For suppose we could be sure of finding such laws if they existed. Then given a discrete state machine it should certainly be possible to discover by observation sufficient about it to predict its future behavior, and this within a reasonable time, say a thousand years. But this does not seem to be the case. I have set up on the Manchester computer a small program using only 1000 units of storage, whereby the machine supplied with one sixteen figure number replies with another within two seconds. I would defy anyone to learn from these replies sufficient about the program to be able to predict any replies to untried values.

(9) The Argument from Extra-Sensory Perception. I assume that the reader is familiar with the idea of extra-sensory perception, and the meaning of the four items of it, viz., telepathy, clairvoyance, precognition and psychokinesis. These disturbing phenomena seem to deny all our usual scientific ideas. How we should like to discredit them! Unfortunately the statistical evidence, at least for telepathy, is overwhelming. It is very difficult to rearrange one's ideas so as to fit these new facts in. Once one has accepted them it does not seem a very big step to believe in ghosts and bogies. The idea that our bodies move simply according to the known laws of physics, together with some others not yet discovered but somewhat similar, would be one of the first to go.

This argument is to my mind quite a strong one. One can say in reply that many scientific theories seem to remain workable in practice, in spite of clashing with E.S.P.; that in fact one can get along very nicely if one forgets about it. This is rather cold comfort, and one fears that thinking is just the kind of phenomenon where E.S.P. may be especially relevant.

A more specific argument based on E.S.P. might run as follows: "Let us play the imitation game, using as witnesses a man who is good as a telepathic receiver, and a digital computer. The interrogator can ask such questions as 'What suit does the card in my right hand belong to?' The man by telepathy or clairvoyance gives the right answer 130 times out of 400 cards. The machine can only guess at random, and perhaps get 104 right, so the interrogator makes the right identification." There is an interesting possibility which opens here. Suppose the digital computer contains a random number generator. Then it will be natural to use this to decide what answer to give.
But then the random number generator will be subject to the psychokinetic powers of the interrogator. Perhaps this psychokinesis might cause the machine to guess right more often than would be expected on a probability calculation, so that the interrogator might still be unable to make the right identification. On the other hand, he might be able to guess right without any questioning, by clairvoyance. With E.S.P. anything may happen.

If telepathy is admitted it will be necessary to tighten our test. The situation could be regarded as analogous to that which would occur if the interrogator were talking to himself and one of the competitors was listening with his ear to the wall. To put the competitors into a "telepathy-proof room" would satisfy all requirements.
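The card-guessing figures in the quoted argument agree with a simple binomial model. The short calculation below (my own arithmetic, not in the text) shows that 104 right is ordinary chance performance for the machine, while the telepath's 130 lies far outside chance fluctuation:

```python
import math

n, p = 400, 1 / 4                 # 400 cards, four suits, random guessing
mean = n * p                      # expected number right by chance: 100
sd = math.sqrt(n * p * (1 - p))   # binomial standard deviation, about 8.66
z_machine = (104 - mean) / sd     # about 0.46 standard deviations: plain chance
z_telepath = (130 - mean) / sd    # about 3.46 standard deviations above chance
```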
7. LEARNING MACHINES
The reader will have anticipated that I have no very convincing arguments of a positive nature to support my views. If I had I should not have taken such pains to point out the fallacies in contrary views. Such evidence as I have I shall now give.

Let us return for a moment to Lady Lovelace's objection, which stated that the machine can only do what we tell it to do. One could say that a man can "inject" an idea into the machine, and that it will respond to a certain extent and then drop into quiescence, like a piano string struck by a hammer. Another simile would be an atomic pile of less than critical size: an injected idea is to correspond to a neutron entering the pile from without. Each such neutron will cause a certain disturbance which eventually dies away. If, however, the size of the pile is sufficiently increased, the disturbance caused by such an incoming neutron will very likely go on and on increasing until the whole pile is destroyed. Is there a corresponding phenomenon for minds, and is there one for machines? There does seem to be one for the human mind. The majority of them seem to be "subcritical," i.e., to correspond in this analogy to piles of subcritical size. An idea presented to such a mind will on an average give rise to less than one idea in reply. A smallish proportion are supercritical. An idea presented to such a mind may give rise to a whole "theory" consisting of secondary, tertiary and more remote ideas. Animals' minds seem to be very definitely subcritical. Adhering to this analogy we ask, "Can a machine be made to be supercritical?"

The "skin of an onion" analogy is also helpful. In considering the functions of the mind or the brain we find certain operations which we can explain in purely mechanical terms. This we say does not correspond to the real mind: it is a sort of skin which we must strip off if we are to find the real mind.
But then in what remains we find a further skin to be stripped off, and so on. Proceeding in this way do we ever come to the "real" mind, or do we eventually come to the skin which has nothing in it? In the latter case the whole mind is mechanical. (It would not be a discrete state machine however. We have discussed this.)

These last two paragraphs do not claim to be convincing arguments. They should rather be described as "recitations tending to produce belief."

The only really satisfactory support that can be given for the view expressed at the beginning of Sec. 6, p. 13, will be that provided by waiting for the end of the century and then doing the experiment described. But what can we say in the meantime? What steps should be taken now if the experiment is to be successful?

As I have explained, the problem is mainly one of programing. Advances in engineering will have to be made too, but it seems unlikely that these will not be adequate for the requirements. Estimates of the storage capacity of the brain vary from 10¹⁰ to 10¹⁵ binary digits. I incline to the lower values and believe that only a very small fraction is used for the higher types of thinking. Most of it is probably used for the retention of visual impressions. I should be surprised if more than 10⁹ was required for satisfactory playing of the imitation game, at any rate against a blind man. (Note: The capacity of the Encyclopaedia Britannica, eleventh edition, is 2 × 10⁹.) A storage capacity of 10⁷ would be a very practicable possibility even by present techniques. It is probably not necessary to increase the speed of operations of the machines at all. Parts of modern machines which can be regarded as analogues of nerve cells work about a thousand times faster than the latter. This should provide a "margin of safety" which could cover losses of speed arising in many ways. Our problem then is to find out how to program these machines to play the game. At my present rate of working I produce about a thousand digits of program a day, so that about sixty workers, working steadily through the fifty years might accomplish the job, if nothing went into the wastepaper basket. Some more expeditious method seems desirable.

In the process of trying to imitate an adult human mind we are bound to think a good deal about the process which has brought it to the state that it is in.
We may notice three components, (a) The initial state of the mind, say at birth, (b) The education to which it has been subjected, (c) Other experience, not to be described as education, to which it has been subjected.

Instead of trying to produce a program to simulate the adult mind, why not rather try to produce one which simulates the child's? If this were then subjected to an appropriate course of education one would obtain the adult brain. Presumably the child-brain is something like a notebook as one buys it from the stationers. Rather little mechanism, and lots of blank sheets. (Mechanism and writing are from our point of view almost synonymous.) Our hope is that there is so little mechanism in the child-brain that something like it can be easily programed. The amount of work in the education we can assume, as a first approximation, to be much the same as for the human child.

We have thus divided our problem into two parts, the child-program and the education process. These two remain very closely connected. We cannot expect to find a good child-machine at the first attempt. One must experiment with teaching one such machine and see how well it learns. One can then try another and see if it is better or worse. There is an obvious connection between this process and evolution, by the identifications:

Structure of the child-machine = Hereditary material
Changes of the child-machine = Mutations
Natural selection = Judgment of the experimenter

One may hope, however, that this process will be more expeditious than evolution. The survival of the fittest is a slow method for measuring advantages. The experimenter, by the exercise of intelligence, should be able to speed it up. Equally important is the fact that he is not restricted to random mutations. If he can trace a cause for some weakness he can probably think of the kind of mutation which will improve it.

It will not be possible to apply exactly the same teaching process to the machine as to a normal child. It will not, for instance, be provided with legs, so that it could not be asked to go out and fill the coal scuttle. Possibly it might not have eyes. But however well these deficiencies might be overcome by clever engineering, one could not send the creature to school without the other children making excessive fun of it. It must be given some tuition. We need not be too concerned about the legs, eyes, etc. The example of Miss Helen Keller shows that education can take place provided that communication in both directions between teacher and pupil can take place by some means or other.

We normally associate punishments and rewards with the teaching process.
Some simple child-machines can be constructed or programed on this sort of principle. The machine has to be so constructed that events which shortly preceded the occurrence of a punishment-signal are unlikely to be repeated, whereas a reward-signal increases the probability of repetition of the events which led up to it. These definitions do not presuppose any feelings on the part of the machine. I have done some experiments with one such child-machine, and succeeded in teaching it a few things, but the teaching method was too unorthodox for the experiment to be considered really successful.

The use of punishments and rewards can at best be a part of the teaching process. Roughly speaking, if the teacher has no other means of communicating to the pupil, the amount of information which can reach him does not exceed the total number of rewards and punishments applied. By the time a child has learned to repeat "Casabianca" he would probably feel very sore indeed, if the text could only be discovered by a "Twenty Questions" technique, every "NO" taking the form of a blow. It is necessary therefore to have some other "unemotional" channels of communication. If these are available it is possible to teach a machine by punishments and rewards to obey orders given in some language, e.g., a symbolic language. These orders are to be transmitted through the "unemotional" channels. The use of this language will diminish greatly the number of punishments and rewards required.

Opinions may vary as to the complexity which is suitable in the child-machine. One might try to make it as simple as possible consistently with the general principles. Alternatively one might have a complete system of logical inference "built in."¹ In the latter case the store would be largely occupied with definitions and propositions. The propositions would have various kinds of status, e.g., well-established facts, conjectures, mathematically proved theorems, statements given by an authority, expressions having the logical form of proposition but not belief-value. Certain propositions may be described as "imperatives." The machine should be so constructed that as soon as an imperative is classed as "well-established" the appropriate action automatically takes place. To illustrate this, suppose the teacher says to the machine, "Do your homework now." This may cause "Teacher says 'Do your homework now'" to be included among the well-established facts. Another such fact might be, "Everything that teacher says is true."
Combining these may eventually lead to the imperative, "Do your homework now," being included among the well-established facts, and this, by the construction of the machine, will mean that the homework actually gets started, but the effect is very unsatisfactory. The processes of inference used by the machine need not be such as would satisfy the most exacting logicians. There might for instance be no hierarchy of types. But this need not mean that type fallacies will occur, any more than we are bound to fall over unfenced cliffs. Suitable imperatives (expressed within the system, not forming part of the rules of the system) such as "Do not use a class unless it is a subclass of one which has been mentioned by teacher" can have a similar effect to "Do not go too near the edge."

The imperatives that can be obeyed by a machine that has no limbs are bound to be of a rather intellectual character, as in the example (doing homework) given above. Important among such imperatives will be ones which regulate the order in which the rules of the logical system concerned are to be applied. For at each stage when one is using a logical system, there is a very large number of alternative steps, any of which one is permitted to apply, so far as obedience to the rules of the logical system is concerned. These choices make the difference between a brilliant and a footling reasoner, not the difference between a sound and a fallacious one. Propositions leading to imperatives of this kind might be "When Socrates is mentioned, use the syllogism in Barbara" or "If one method has been proved to be quicker than another, do not use the slower method." Some of these may be "given by authority," but others may be produced by the machine itself, e.g., by scientific induction.

The idea of a learning machine may appear paradoxical to some readers. How can the rules of operation of the machine change? They should describe completely how the machine will react whatever its history might be, whatever changes it might undergo. The rules are thus quite time-invariant. This is quite true. The explanation of the paradox is that the rules which get changed in the learning process are of a rather less pretentious kind, claiming only an ephemeral validity. The reader may draw a parallel with the Constitution of the United States.

An important feature of a learning machine is that its teacher will often be very largely ignorant of quite what is going on inside, although he may still be able to some extent to predict his pupil's behavior. This should apply most strongly to the later education of a machine arising from a child-machine of well-tried design (or program).

¹ Or rather "programmed in," for our child-machine will be programmed in a digital computer. But the logical system will not have to be learned.
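A rule-ordering imperative of the kind described above can be sketched as a preference over legally applicable steps. The rule names and the selection scheme here are hypothetical; the point is only that the preference affects which legal step is tried, never whether a step is legal:

```python
# Every rule in `applicable` may legally be applied; preferences only
# decide the order in which they are tried, distinguishing a brilliant
# reasoner from a footling one, not a sound one from a fallacious one.
def choose_step(applicable, preferences):
    """Return the first preferred rule among those legally applicable."""
    for rule in preferences:
        if rule in applicable:
            return rule
    return applicable[0]  # no preference applies: any legal step will do

# "When Socrates is mentioned, use the syllogism in Barbara."
step = choose_step(["weakening", "barbara", "modus_ponens"], ["barbara"])
```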
This is in clear contrast with normal procedure when using a machine to do computations: one's object is then to have a clear mental picture of the state of the machine at each moment in the computation. This object can only be achieved with a struggle. The view that "the machine can only do what we know how to order it to do"² appears strange in face of this. Most of the programs which we can put into the machine will result in its doing something that we cannot make sense of at all, or which we regard as completely random behavior. Intelligent behavior presumably consists in a departure from the completely disciplined behavior involved in computation, but a rather slight one, which does not give rise to random behavior, or to pointless repetitive loops. Another important result of preparing our machine for its part in the imitation game by a process
² Compare Lady Lovelace's statement (pp. 20f.), which does not contain the word "only."
A . hi. ~ ' U R I N C :
of teaching and learning is that "human fallibility" is likely to be omitted in a rather natural way, i.e., without special "coaching." (The reader should reconcile this with the point of view on pp. 15-19.) Processes that are learned do not produce a hundred per cent certainty of result; if they did they could not be unlearned.

It is probably wise to include a random element in a learning machine (see p. 10). A random element is rather useful when we are searching for a solution of some problem. Suppose for instance we wanted to find a number between 50 and 200 which was equal to the square of the sum of its digits; we might start at 51, then try 52, and go on until we got a number that worked. Alternatively we might choose numbers at random until we got a good one. This method has the advantage that it is unnecessary to keep track of the values that have been tried, but the disadvantage that one may try the same one twice; but this is not very important if there are several solutions. The systematic method has the disadvantage that there may be an enormous block without any solutions in the region which has to be investigated first. Now the learning process may be regarded as a search for a form of behavior which will satisfy the teacher (or some other criterion). Since there is probably a very large number of satisfactory solutions the random method seems to be better than the systematic. It should be noticed that it is used in the analogous process of evolution. But there the systematic method is not possible. How could one keep track of the different genetical combinations that had been tried, so as to avoid trying them again?

We may hope that machines will eventually compete with men in all purely intellectual fields. But which are the best ones to start with? Even this is a difficult decision. Many people think that a very abstract activity, like the playing of chess, would be best.
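Turing's number-search example above is easily checked; here is a sketch of both strategies in Python (function names are ours, not Turing's). As it happens, 81 = (8 + 1)² is the only such number between 50 and 200, so both searches arrive at it:

```python
import random

def digit_square_matches(lo, hi):
    # Systematic search: try lo, lo+1, ... in turn; may have to cross a
    # long solution-free block before finding anything.
    return [n for n in range(lo, hi + 1)
            if n == sum(int(d) for d in str(n)) ** 2]

def random_search(lo, hi, rng):
    # Random search: no bookkeeping of values already tried; repeats
    # are possible but harmless when solutions are plentiful.
    while True:
        n = rng.randint(lo, hi)
        if n == sum(int(d) for d in str(n)) ** 2:
            return n

systematic = digit_square_matches(51, 200)   # [81]
randomly_found = random_search(51, 200, random.Random(0))
```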
It can also be maintained that it is best to provide the machine with the best sense organs that money can buy, and then teach it to understand and speak English. This process could follow the normal teaching of a child. Things would be pointed out and named, etc. Again I do not know what the right answer is, but I think both approaches should be tried.

We can only see a short distance ahead, but we can see plenty there that needs to be done.
THE MECHANICAL CONCEPT OF MIND

MICHAEL SCRIVEN
Is there an essential difference between a man and a machine? To this question many answers have been suggested. One type of answer claims for the man some psychological quality such as intelligence, consciousness, or originality, which is said to be necessarily lacking in the machine. Other examples are introspection, thought, free will, humor, love, correlation of speech and senses. Throughout this paper the sole example of consciousness will be used. The argument follows very similar lines for the other terms. A machine is normally understood to be an artifact, a manufactured mechanical (and possibly electrical) contrivance. It will so be taken here. The purpose of this discussion will be to consider in detail the statement that machines are never conscious. When it is said that it is impossible for a machine to be conscious, it is not always clear to what extent this is intended to be a logical
"The Mechanical Concept of Mind," Mind, Vol. LXII, No. 246 (1953). Reprinted by permission of the author and the editor of Mind.