
Nathan W. Lindstrom
Professor Strozier
Minds and Machines
23 May 2010

Consciousness as a Chinese Room

In his book How the Mind Works, author Steven Pinker states that consciousness possesses four features. First, says Pinker, we are aware, to varying degrees, of a rich field of sensation. Second, portions of this information can fall under the spotlight of attention, get rotated into and out of short-term memory, and feed our deliberative cognition. Third, sensations and thoughts come with an emotional flavoring. Finally, an executive, the "I," appears to make choices and pull the levers of behavior (Pinker). Combining this description of consciousness with the so-called hard problem of how the physical processes of the brain can permit subjective experience leads us directly to the question: what is consciousness (Chalmers)? And, assuming as many authors have argued that consciousness cannot be truly defined, how is it possible to identify its presence, or lack thereof, in a person or animal?

Professor Chalmers puts forth the idea that consciousness, which so far cannot be explained in terms of lower-level building blocks drawn from physical theory, should be promoted to the lofty designation of a fundamental feature, irreducible to anything more basic (Chalmers). This approach, however attractive inasmuch as it allows us to readily sweep the heretofore unexplainable under the rug, suffers from a fatal flaw: in lumping consciousness in with other basic forces like gravity and energy, it will shift focus away from

prying at the underlying mechanisms of consciousness and instead replace them with simply observing and measuring its effects. In other words, calling consciousness a fundamental feature is no different than saying that lightning is a fundamental feature of rainstorms, or that light is a fundamental feature of flipping a switch.

I would posit that consciousness is nothing more than a slightly different take on the famous Chinese Room puzzle. Let us say the two of us have just met. My head is the Chinese Room, and inside is my consciousness: a man armed only with a playbook. You scribble "hello, how are you doing?" on a scrap of paper and push it under the door. My consciousness picks up the paper and, knowing nothing of the squiggles and scratches upon its surface, hurriedly consults the playbook. Finding a match, the man (my consciousness) follows the instructions to scratch new and equally unintelligible marks on a piece of foolscap, which he then pushes back under the door, resulting in you hearing my response of "I'm well, thank you!"

Let us say you now strike me over the head with a brick. The Chinese Room is shaken; the man inside flies off his feet, and his playbook is tossed about the room, pages flying every which way. Bending over my unconscious form, you shove a piece of paper under the door, again inscribed with the words "hello, how are you?" Unsurprisingly, you do not receive a response.

Or consider an alternative scenario: after greeting me, and receiving a greeting in return, a nuclear bomb goes off in the distance and a huge mushroom cloud blots out the horizon. What is the most likely response you might observe in me? Stunned silence. Why? Because the man in the Chinese Room has just received a piece of paper describing a nuclear blast from my eyes, and again, while he knows nothing of the information it conveys, it is of such an

incredibly rare pattern that the matching entry in the playbook is so far in the back that it takes him some time to locate it and to shove a piece of paper back under the door which reads "wow!" or perhaps "run!"

But this analogy does not explain how each individual's conscious experience is unique and subjective: the hard problem as raised by Professor Chalmers. Consider this, though: not every person is blonde, and neither is every person brunette. Chance mutations have introduced a plethora of possible hair colors to the human species, so it would stand to reason that a similar mechanism has introduced a plethora of variations to the playbook consulted by each individual's man in the Chinese Room. This is further enhanced by the fact that a great many of the pages in the playbook are blank, and that with the correct sequence of marks on the papers pushed under the door it is possible to have the man add entirely new procedures to his playbook.

I suspect this explanation of consciousness will not satisfy many people, chiefly because what I am really proposing is that consciousness is nothing more than an incredibly detailed and constantly updated if/then/else lookup table. Pinker goes to great pains to point out that the human brain does not appear to have the capacity to deal with the sheer number of entries such a table would require; but if the brain actually does work on a quantum level, as some scientists now believe, the storage-size problem goes away, as does the problem of decision-tree latency. If what I've proposed here is really the case for how consciousness works, then replicating consciousness in a machine is not a software problem but a hardware problem. Until we can build a device as fast as the human mind, with similar storage limits, we will be limited to constructing only unconscious or semiconscious machines.
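The lookup-table view of the Chinese Room can be sketched in a few lines of code. This is my own illustration, not drawn from Pinker or Chalmers; the class name, its entries, and the learning method are all invented for the example:

```python
class ChineseRoom:
    """The man in the room: he matches incoming marks against a playbook
    without understanding what any of the marks mean."""

    def __init__(self):
        # Initial playbook entries; most "pages" start blank and
        # can be filled in later.
        self.playbook = {
            "hello, how are you doing?": "I'm well, thank you!",
        }

    def respond(self, marks):
        # Pure pattern-matching: no meaning is consulted, only the table.
        # A missing entry (None) models the stunned silence of an
        # unfamiliar pattern.
        return self.playbook.get(marks)

    def learn(self, marks, response):
        # The right sequence of marks pushed under the door adds an
        # entirely new procedure to the playbook.
        self.playbook[marks] = response


room = ChineseRoom()
print(room.respond("hello, how are you doing?"))   # I'm well, thank you!
print(room.respond("a mushroom cloud on the horizon"))  # None: no entry yet
room.learn("a mushroom cloud on the horizon", "run!")
print(room.respond("a mushroom cloud on the horizon"))  # run!
```

The sketch captures both halves of the argument: responses are pure table lookups, and the table itself can be extended by input alone, which is why no understanding ever needs to enter the room.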

Works Cited

Chalmers, David J. "The Puzzle of Conscious Experience." Scientific American (1995): 62-68.

Pinker, Steven. How the Mind Works. New York: Norton, 2009.
