
Conclusion: The Digitization of the World Picture

Between 1945 and 2002 the computer transformed itself over and over
again, each time redefining its essence. "The computer" started out as
a fast scientific calculator; Eckert and Mauchly transformed it into
UNIVAC, a machine for general data processing. Ken Olsen made it
into a real-time information processor that worked symbiotically with its
users. Ed Roberts transformed it into a device that anyone could own
and use. Steve Jobs and Steve Wozniak turned it into an appliance that
was both useful and fun. Gary Kildall and William Gates transformed it
into a standardized platform that could run a cornucopia of commercial
software sold in retail stores. Bob Metcalfe, Tim Berners-Lee, and others
turned it into a window to a global network.
Each transformation was accompanied by assertions that further
transformations were unlikely, yet each time someone managed to
break through. The latest transformation, to the World Wide Web, was
also preceded by statements that the computer industry was stagnating,
that there was, to paraphrase a software salesman, "no more low-hanging fruit." He was wrong, and those who predict that the World Wide Web is
the ultimate resting place for computing will no doubt be wrong as well.
Copyright © 2003. MIT Press. All rights reserved.

By the mid-1990s personal computers had become a commodity, allowing commercial software to come to the fore as the central place
where innovation was conveyed to users. The layering of software, a
process that began with the first "automatic coding" schemes developed
for the UNIVAC, continued. That was the only way to broaden the
market to include users who had no inclination or talent to write
programs. Again, with each software advance, one heard that the ‘‘end
of programming’’ had come, that ‘‘anyone’’ could now get a computer to
do what he or she wished. As new markets opened up, the end proved
elusive. The difficulty many people have in programming a VCR is a
minor but real example of the problem: getting a computer to do what users want it to do is as difficult as ever and requires talent, hard work, and a commitment by developers to the user's needs.

Ceruzzi, Paul E. A History of Modern Computing, Second Edition. MIT Press, 2003. ProQuest Ebook Central, http://ebookcentral.proquest.com/lib/uts/detail.action?docID=3338878.
The ease of use that the Macintosh interface brought to personal
computing, which Microsoft copied with Windows, has led to a new set of
frustrations. Users now find interfaces laid over these interfaces, which
are supposed to make computing even easier. In fact, they have made
things more difficult. This process will doubtless continue. The year
2001 has come and gone, and it did not bring with it a realization of the
intelligent computer HAL, the star of Stanley Kubrick's movie 2001: A Space Odyssey. Many people came away from the movie thinking that the
problem with HAL was that it was somehow out of control; but a closer
viewing shows that HAL’s real problem was that it worked perfectly. It
broke down because it was trying to obey two conflicting instructions
that were part of its programming: to obey the humans on board but to
conceal from them the true nature of their mission.1 If a real version of a
HAL-like intelligent interface ever appears, it will probably not be as
robust and reliable as the fictional one.

The Digitization of the World Picture

In 1948 a book appeared with the bold title The Mechanization of the World
Picture. The author, a Dutch physicist named E. J. Dijksterhuis, argued
that much of history was best understood as an unfolding of the
"mechanistic" way of looking at the world that actually began with the
Greeks and culminated in the work of Isaac Newton.2 Dijksterhuis’s work
found a willing audience of readers who had experienced the power and
the horrors of a mechanized world view after six years of world war.
It took a millennium and a half for a mechanistic view to take hold,
but it has taken less time—about fifty years—for an equally revolutionary view to take hold. The "digitization of the world picture" began in the mid-1930s, with the work of a few mathematicians and
engineers. By 1985 this world view had triumphed. It began in an
obscure corner of mathematics. Alan Turing's "machine," introduced
in a paper in 1936, was a theoretical construction.3 The invention of the
stored-program electronic computer breathed life into his idea and
made it more real than he probably thought possible. The ensuing
decades saw one field after another taken over, absorbed, or transformed by the computer as if it were a universal solvent.4 A special issue
of the trade journal Electronics in October 1973 described as "The Great Takeover" the way traditional analog electronic circuits were replaced by miniature digital computers programmed to emulate them; most ordinary radios, for example, had lost their tuning dial by 1973 and were
"tuned" by digital keypads. Ten years later, Time proclaimed the computer "Machine of the Year" for 1982, with the opening headline "The Computer Moves In."5
The latest manifestation of this takeover is the Internet, embraced
across the political and cultural spectrum, by Newt Gingrich, Al Gore,
Stewart Brand, the late Timothy Leary, "Generation X," and numerous
people in between. Most accounts describe it as a marriage of communications and computing.6 The evidence presented here suggests otherwise: the Internet simply represents yet another takeover, by digital computing, of an activity (telecommunications) that had a long history based on analog techniques.
Those who so glowingly describe the World Wide Web as the culmination of fifty years of prologue either do not know or have forgotten
history. The very same statements were made when the first UNIVACs
were installed, when minicomputers and time-sharing appeared, and
when the personal computer was introduced (figure C.1). This will not
be the last time these words are spoken. But promises of a technological
Utopia have been common in American history, and at least a few
champions of the Internet are aware of how naive these earlier visions
were.7 Silicon Valley has some of the most congested real highways in the
country, as people commute to work with a technology that Henry Ford mass-produced to reduce urban congestion. Most people have some sense of the
fact that the automobile did not fulfill many of Ford’s promises simply
because it was too successful. The word ‘‘smog’’ crept into the English
language around the time of Ford’s death in the late 1940s; ‘‘gridlock,’’
‘‘strip malls,’’ and ‘‘suburban sprawl’’ came later. What equivalent will
describe the dark side of networked digital computing? And will those "side effects" become evident only fifty years from now, as was the case
with automobiles? Can we anticipate them before it is too late or too
difficult to manage them?
Each transformation of digital computing was propelled by individuals
with an idealistic notion that computing, in its new form, would be a
liberating force that could redress many of the imbalances brought on
by the smokestack industries of the "second wave," in Alvin Toffler's phrase.
UNIVAC installations were accompanied by glowing predictions that
the "automation" they produced would lead to a reduced workweek. In
the mid-1960s enthusiasts and hackers saw the PDP-10 and PDP-8 as
machines that would liberate computing from the tentacles of the IBM octopus. The Apple II reflected the Utopian visions of the San Francisco Bay area in the early 1970s. And so it will be with universal access to the Internet.

Figure C.1
Digital Utopia, as depicted on the cover of Byte magazine (January 1977). Byte's cover illustrations stood out among all the computer publications. (Source: Robert Tinney.)
In each case the future has turned out to be more complex, and less
revolutionary, than its proponents imagined. The UNIVAC did not solve
the problem of unemployment. Personal computers did not put ordinary individuals on an equal footing with those in positions of power. The personal computer did find a market that exceeded all expectations—but in the office and not the home, as a tool that assisted the functions of the corporate
workplace.8 Looking out over the polluted and decayed landscape of the
1970s-era industrial Rustbelt, young people programmed their personal
computers to model a middle landscape, one that gave its inhabitants all
the benefits of industrialization with none of the drawbacks. But the
social problems of the outside world remained. Utopia stayed inside the
computer screen and stubbornly refused to come out. Computer
modeling evolved into "virtual reality"—a new variant of the mind-altering drugs in vogue in the 1960s. Timothy Leary argued that virtual
reality was more effective than LSD as a way to bring humans back to the
Garden of Eden. So far that is not happening, and perhaps this is a good
thing, given the level of thought that characterizes most visions of what
Digital Utopia ought to look like.
We have seen that political and social forces have always shaped the
direction of digital computing. Now, with computing among the defining technologies of American society, those forces are increasingly out in
the open and part of public discussion. Politicians and judges as much as
engineers decide where highways and bridges get built, who may serve a
region with telephone service, and how much competition an electric
utility may have. These legislators and jurists rely upon industry lobbyists
or specialists on their staff to guide them through the technical dimension of their policies. All the while, new technologies (such as direct
broadcast satellite television) disrupt their plans. But that does not stop
the process or shift decision-making away from these centers.
Computing is no different. The idea of politicians directing technology is still distasteful to computer pioneers, many of whom are still alive
and retain a vivid memory of how they surmounted technical, not
political, challenges. But when a technology becomes integrated into
the affairs of ordinary daily life, it must acknowledge politics. Some
groups, such as the Electronic Frontier Foundation (founded by Mitch
Kapor), are doing this by stepping back to try to identify the digital
equivalents of "smog" and "gridlock." But historically the United States has promoted as rapid a deployment of technology as possible, and has left it to future generations to deal with the consequences. It is not
surprising, therefore, that attempts to regulate or control the content of
the Internet have so far been clumsy and have failed. How that plays out
remains to be seen.
A century and a half ago, Henry David Thoreau observed with
suspicion the technophilic aspect of American character. Railroads
were the high technology of his day, but he did not share the public’s
enthusiasm for the Fitchburg line, whose tracks ran behind Walden
Pond. "We do not ride on the railroad; it rides on us," he said. What the nation needs is "a stern and more than Spartan simplicity of life." A few
miles west of Thoreau’s cabin, the Fitchburg railroad built a branch to
serve the Assabet Mills, which by the time of the Civil War was one of the
country’s largest producers of woolen goods. A century later these same
mills were blanketing the Earth with PDP-8s. One wonders what Thoreau
would have made of this connection.9 Would he have seized the
opportunity to set up his own Walden Pond home page, to let others
know what he was up to? Or would he have continued to rely on the
pencils he made for himself?
We created the computer to serve us. The notion that it might become
our master has been the stuff of science fiction for decades, but it was
always hard to take those stories seriously when it took heroic efforts just
to get a computer to do basic chores. As we start to accept the World
Wide Web as a natural part of our daily existence, perhaps it is time to
revisit the question of control. My hope is that, with an understanding of
history and a dash of Thoreauvian skepticism, we can learn to use the
computer rather than allowing it to use us.