Richard Feynman and computation

TONY HEY
The enormous contribution of Richard Feynman to modern physics is well known, both to teaching through his famous Feynman Lectures on Physics, and to research with his Feynman diagram approach to quantum field theory and his path integral formulation of quantum mechanics. Less well known, perhaps, is his long-standing interest in the physics of computation, which is the subject of this paper. Feynman lectured on computation at Caltech for most of the last decade of his life, first with John Hopfield and Carver Mead, and then with Gerry Sussman. The story of how these lectures came to be written up as the Feynman Lectures on Computation is briefly recounted. Feynman also discussed the fundamentals of computation with other legendary figures of the computer science and physics community, such as Ed Fredkin, Rolf Landauer, Carver Mead, Marvin Minsky and John Wheeler. He was also instrumental in stimulating developments in both nanotechnology and quantum computing. During the 1980s Feynman revisited long-standing interests both in parallel computing with Geoffrey Fox and Danny Hillis, and in reversible computation and quantum computing with Charles Bennett, Norman Margolus, Tom Toffoli and Wojciech Zurek. This paper records Feynman's links with the computational community and includes some reminiscences about his involvement with the fundamentals of computing.
1. Introduction
The Feynman Lectures on Computation [1] were finally published in September 1996, some eight years after his death. How did an English Professor of Computer Science come to be editing Feynman's lectures, given at Caltech, which he did not even attend? In November 1987, I received a phone call in Southampton from Helen Tuck, Feynman's secretary for many years, saying that Feynman wanted me to help write up his lecture notes on computation. Sixteen years earlier, as a post-doc at Caltech, I had declined the opportunity to work with Finn Ravndal on editing Feynman's 'Parton' lectures [2] on the grounds that it would be a distraction from my research. I had often regretted my decision, so I did not take much persuading this time around. At Caltech in the early 1970s, I had been a theoretical particle physicist, but ten years later, on a sabbatical visit to Caltech in 1981, I became interested in computational physics, playing with Monte Carlo and variational methods that I later found out were similar to techniques Feynman had used years before at Los Alamos. While I was there in 1981, Carver Mead gave a memorable lecture about the future of VLSI (Very Large Scale Integration) and the semiconductor industry. I returned to Southampton inspired by Mead's vision of the future and set about exploring the potential of parallel computing for computational science. Four years later, I completed my move from physics to computer science, when I moved to the Department of Electronics and Computer Science at Southampton. Two years after that, I received the call from Helen Tuck.

The official record at Caltech [3] lists Feynman as joining with John Hopfield and Carver Mead in the fall of 1981 to give an interdisciplinary course entitled 'The Physics of Computation'. The course was given for two years, although Feynman was ill with cancer during the first year and Mead was on sabbatical for much of the second. A handout from the course of 1982-83 reveals the flavor of the course: a basic primer on computation, computability and information theory, followed by a section titled 'Limits on computation arising in the physical world and "fundamental" limits on computation'. The lectures that year were mainly given by Feynman and Hopfield, with guest lectures from experts such as Marvin Minsky, Charles Bennett and John Cocke.
Author’s address:
Department of Electronics and Computer Science, University of Southampton, Southampton SO17 1BJ, UK.
Contemporary Physics, 1999, volume 40, number 4, pages 257-265
ISSN 0010-7514 print / ISSN 1366-5812 online. 1999 Taylor & Francis Ltd
In the fall of 1983, Feynman first gave a course on computing by himself, listed in the Caltech record as being called 'Potentialities and Limitations of Computing Machines'. In the following years, 1984-85 and 1985-86, the lectures were taped, and it was from those tapes and Feynman's notebooks that the lectures were reconstructed. In reply to Helen Tuck, I told her I was visiting Caltech in January 1988 to talk at the Hypercube Conference. This was a parallel computing conference that originated from the pioneering work at Caltech by Geoffrey Fox and Chuck Seitz on the 'Cosmic Cube' parallel computer. I talked with Feynman in January and he was very keen that his lectures on computation should see the light of day. I agreed to take on the project and we agreed to keep in touch. Alas, Feynman died a month later and there was no chance for a more detailed dialogue about the proposed content of the published lectures.

As advertised, Feynman's lecture course set out to explore the limitations and potentialities of computers. Although the lectures were given over ten years ago, much of the material is relatively 'timeless' and represents a Feynmanesque overview of some fairly standard topics in computer science. Taken as a whole, however, the course was unusual and definitely interdisciplinary in content and analysis. Besides giving the 'Feynman treatment' to subjects such as computability, Turing machines (or, as Feynman said, 'Mr Turing's machines'), Shannon's theorem and information theory, Feynman also discussed reversible computation, the thermodynamics of computation and quantum computation. Such a wide-ranging discussion of the fundamental basis of computers was undoubtedly unique for its time: a 'sideways' Feynman-type view of the whole of computing. Not all aspects of computing are discussed in the lectures and there are many omissions, programming languages and operating systems to name but two.
Nevertheless, the lectures did represent a survey of the fundamental limitations of digital computers. Feynman was not a professional computer scientist and he covered a large amount of material very rapidly, emphasizing essentials rather than exploring all the details. Having said this, his approach to the subject was resolutely practical, and this is underlined in his treatment of computability theory by his decision to approach the subject via a detailed discussion of Turing machines. Feynman takes obvious pleasure in explaining how something apparently as simple as a Turing machine can arrive at such momentous conclusions. His philosophy of learning and discovery also comes through very strongly in the lectures. Feynman constantly emphasized the importance of working things out for yourself, trying things out and playing around before looking in the book to see how the 'experts' have done things. The lectures constitute a fascinating insight into Feynman's way of working.

In at least one respect the published lectures do not do justice to Feynman's course. Included along with the topics discussed above were lectures by invited speakers on a variety of what Feynman called 'advanced applications' of computers. The choice of speaker not only reflected topics that Feynman thought important but also the figures in the computational community with whom he had interacted over the years. The purpose of this article is to put on record these relationships and shed light on Feynman's contribution to the field of computation.
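The 'apparent simplicity' of Mr Turing's machines is easy to convey with a toy simulator (a sketch of my own; the rule format and names here are not Feynman's notation):

```python
def run_turing_machine(rules, tape, state="start", blank="_", max_steps=10_000):
    """Tiny Turing-machine simulator.  `rules` maps (state, symbol) to
    (new_state, symbol_to_write, head_move) with head_move in {-1, +1}.
    The machine halts when no rule matches the current (state, symbol)."""
    cells = dict(enumerate(tape))   # sparse tape: index -> symbol
    head = 0
    for _ in range(max_steps):
        symbol = cells.get(head, blank)
        if (state, symbol) not in rules:
            break                   # no applicable rule: halt
        state, cells[head], move = rules[(state, symbol)]
        head += move
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# A two-rule machine that flips every bit on the tape, then halts on a blank.
flip_rules = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
}
```

Running `run_turing_machine(flip_rules, "0110")` returns `"1001"`: even a machine this small already exhibits the state/symbol/head structure from which the momentous conclusions of computability theory follow.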
2. Feynman’s course on computation
We begin with a look at the evolution of the Feynman computation lectures from the viewpoint of the three colleagues who participated in their construction. As we have seen, in 1981-82 and 1982-83, Feynman, John Hopfield and Carver Mead gave an interdisciplinary course at Caltech entitled 'The Physics of Computation'. The different memories that John Hopfield and Carver Mead have of the course are an interesting contrast. Feynman was hospitalized with cancer during the first year, and Hopfield remembers this year of the course as 'a disaster', with him and Mead wandering 'over an immense continent of intellectual terrain without a map' [4]. Mead is more charitable in his remembrances [5], but both agreed that the course left many students mystified. After a second year of the course, in which Feynman was able to play an active role, the three concluded that there was enough material for three courses and that each would go his own way.

The next year, 1983-84, Gerry Sussman was visiting Caltech on sabbatical leave from MIT, intending to work on astrophysics. Back at MIT, Sussman had supervised Feynman's son, Carl Feynman, as a student in Computer Science, and at Caltech, Feynman had enjoyed Abelson and Sussman's famous 'Yellow Wizard Book', 'The Structure and Interpretation of Computer Programs'. Feynman therefore invited Sussman to lunch at the Athenaeum, the Caltech Faculty Club, and agreed a characteristic 'deal' with Sussman: Sussman would help Feynman develop his course on the 'Potentialities and Limitations of Computing Machines' in return for Feynman having lunch with him after the lectures. As Sussman says, 'that was one of the best deals I ever made in my life' [6].

The topics on which Feynman interacted with these three are an indication of the breadth of his interests. With Hopfield, Feynman discussed the problem of how to implement Hopfield's neural networks [7], in parallel, on the Connection Machine. Hopfield found it curious that Feynman was not himself interested in building models of the brain, although there are many stories testifying to Feynman's interest in the way the brain worked.

From Mead, Feynman learnt about the physics of VLSI and the reasons for the silicon scaling behaviour underlying
Moore's Law. In 1968, Gordon Moore had asked Mead 'whether [quantum] tunnelling would be a major limitation on how small we could make transistors in an integrated circuit'. This question and its answer took Mead 'on a detour that lasted thirty years' [5]. Mead and Feynman also had many arguments about the right way to present electrodynamics, and in particular about the role of the vector potential. Mead always thought Feynman evaded the issue in his famous red Feynman Lectures on Physics [8].

While Sussman was at Caltech, he initiated the building of a 'Digital Orrery', a special-purpose computer designed to do high-precision numerical integrations of planetary motions. Much to Sussman's surprise, relatively little was known about the numerical analysis of this classical problem. A serious problem with such very long integrations (Sussman set a new record of 845 million years with Jack Wisdom [9]) is the build-up of numerical errors. Feynman spent a considerable amount of time during that year helping Sussman understand this problem.
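The error build-up can be seen even in a toy two-body problem. The sketch below (my own illustration, not the Orrery's actual integrator) contrasts explicit Euler, whose energy error grows steadily with the length of the integration, against a symplectic leapfrog scheme, whose energy error stays bounded; controlling exactly this kind of secular drift is what makes multi-million-year integrations credible.

```python
import math

def step_euler(pos, vel, dt):
    # Explicit Euler for a unit-mass body orbiting a unit point mass (G = 1).
    # Energy errors accumulate secularly, so the orbit slowly spirals outward.
    r3 = (pos[0] ** 2 + pos[1] ** 2) ** 1.5
    acc = (-pos[0] / r3, -pos[1] / r3)
    new_pos = (pos[0] + dt * vel[0], pos[1] + dt * vel[1])
    new_vel = (vel[0] + dt * acc[0], vel[1] + dt * acc[1])
    return new_pos, new_vel

def step_leapfrog(pos, vel, dt):
    # Kick-drift-kick leapfrog: a symplectic method, so the energy error
    # oscillates within a bounded band instead of growing without limit.
    def accel(p):
        r3 = (p[0] ** 2 + p[1] ** 2) ** 1.5
        return (-p[0] / r3, -p[1] / r3)
    ax, ay = accel(pos)
    vx, vy = vel[0] + 0.5 * dt * ax, vel[1] + 0.5 * dt * ay   # half kick
    px, py = pos[0] + dt * vx, pos[1] + dt * vy               # full drift
    ax, ay = accel((px, py))
    return (px, py), (vx + 0.5 * dt * ax, vy + 0.5 * dt * ay) # half kick

def energy(pos, vel):
    # Total energy E = v^2/2 - 1/r, exactly conserved by the true dynamics.
    r = math.hypot(pos[0], pos[1])
    return 0.5 * (vel[0] ** 2 + vel[1] ** 2) - 1.0 / r

def energy_drift(stepper, n_steps=20_000, dt=0.01):
    pos, vel = (1.0, 0.0), (0.0, 1.0)   # circular orbit, E = -1/2
    e0 = energy(pos, vel)
    for _ in range(n_steps):
        pos, vel = stepper(pos, vel, dt)
    return abs(energy(pos, vel) - e0)
```

Over roughly thirty orbits, `energy_drift(step_euler)` is orders of magnitude larger than `energy_drift(step_leapfrog)`, which is why long planetary integrations use symplectic schemes.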
3. Reducing the size
Feynman had a long-standing interest in the limitations due to size, beginning with his famous 1959 lecture 'There's Plenty of Room at the Bottom', subtitled 'an invitation to enter a new field of physics' [10]. In this astonishing lecture, given as an after-dinner speech at a meeting of the American Physical Society, Feynman talked about 'the problem of manipulating and controlling things on a small scale', by which he meant the 'staggeringly small world that is below'. He went on to speculate that 'in the year 2000, when they look back at this age, they will wonder why it was not until the year 1960 that anybody began seriously to move in this direction'. In his talk Feynman also offered two prizes of $1000: one 'to the first guy who makes an operating electric motor . . . [which] is only 1/64 inch cube', and a second 'to the first guy who can take the information on the page of a book and put it on an area 1/25 000 smaller in linear scale in such a manner that it can be read by an electron microscope'. He paid out on both: the first, less than a year later, to Bill McLellan, a Caltech alumnus, for a miniature motor which satisfied the specifications but which was a disappointment to Feynman in that it required no new technical advances (figures 1 and 2). Feynman gave an updated version of his talk in 1983 to the Jet Propulsion Laboratory. He then predicted 'that with today's technology we can easily . . . construct motors a fortieth of that size in each dimension, 64 000 times smaller than . . . McLellan's motor, and we can make thousands of them at a time' [11].

It was not for another 26 years that he had to pay out on the second prize, this time to a Stanford graduate student named Tom Newman. The scale of Feynman's challenge was equivalent to writing all twenty-four volumes of the Encyclopaedia Britannica on the head of a pin (figure 3). Newman calculated that each individual letter would be only about fifty atoms wide and, using electron-beam lithography, he was eventually able to write the first page of Charles Dickens's A Tale of Two Cities at 1/25 000 reduction in scale (figure 4). Feynman's paper is often credited with
Figure 1. Letter from Feynman to Bill McLellan about his invention of a miniature motor.
Figure 2. Feynman examines Bill McLellan's motor, 1960.