
Companies

1947
Computer pioneers Presper Eckert and
John Mauchly founded the Eckert-
Mauchly Computer Corp. to construct
machines based on their experience
with ENIAC and EDVAC. The only
machine the company built was
BINAC. Before completing the
UNIVAC, the company became a
division of Remington Rand.

Eckert and Mauchly with the ENIAC
1952
Heinz Nixdorf founded Nixdorf
Computer Corp. in Germany. It
remained an independent corporation
until merging with Siemens in 1990.
1956
Burroughs buys Electrodata. Calculator manufacturer Burroughs gained
entry to the computer industry by purchasing the southern California
company Electrodata Corporation. The combined firm became a giant in
the calculating machine business and expanded into electronics and
digital computers as those technologies developed. Burroughs created
many computer systems in the 1960s and 1970s and eventually merged
with Sperry Rand (maker of Univac computers) to form Unisys.

ElectroData computer in use, 1955
1957
A group of engineers led by Ken Olsen left MIT's Lincoln Laboratory
to found a company based on the new transistor technology. In August,
they formally created Digital Equipment Corp. It initially set up shop
in a largely vacant woolen mill in Maynard, Mass., where all aspects
of product development — from management to manufacturing — took place.

Digital Equipment Corp.

In Minneapolis, the original Engineering Research Associates group led
by Bill Norris left Sperry Rand to form a new company, Control Data
Corp., which soon released its model 1604 computer.

CDC 1604
1963
Tandy Radio Shack is founded. Tandy
Radio Shack (TRS) was formed by the
1963 merger of Tandy Leather
Company and Radio Shack. TRS began
by selling a variety of electronic
products, mainly to hobbyists. The
TRS-80 Model I computer, introduced
in 1977, was a major step in
introducing home computers to the
public. Like the Commodore PET and
the Apple II, which were introduced
within months of the TRS-80, the
computer came assembled and ready
to run.
1965
Commodore Business Machines (CBM) is founded. Its founder, Jack
Tramiel, emigrated to the US after WWII, where he began repairing
typewriters. In 1965, he moved to Toronto and established Commodore
International, which also began making mechanical and electronic
calculators. In 1977, Commodore released the Commodore PET computer;
in 1981, the VIC-20; and, in 1982, the Commodore 64. CBM purchased
competitor Amiga Corporation in 1984. Despite being at one time the
largest single supplier of computers in the world, by 1984 internal
disputes and market pressures led to financial problems. The company
declared bankruptcy in 1994.

Commodore Business Machines founder Jack Tramiel
1968
Evans & Sutherland is formed. In 1968, David Evans and Ivan
Sutherland, both professors of computer science, founded a company to
develop a special graphics computer known as a frame buffer. This
device was a special high-speed memory used for capturing video.
Based in Salt Lake City, Utah, the two founders trained a generation
of computer graphics pioneers — either at E&S or at the University of
Utah computer science department. Sutherland left the firm in 1975,
and Evans retired in the early 1990s, but E&S continues today as a
major supplier of military and commercial graphics systems.

Ivan Sutherland and David Evans, 1969
1969
Xerox Corp. bought Scientific Data Systems for nearly $1 billion — 90
times the latter's earnings. The SDS series of minicomputers in the
early 1960s logged more sales than did Digital Equipment Corp. Xerox
changed the series to the XDS computers but eventually closed the
division and ceased to manufacture the equipment.
1970
Xerox opens Palo Alto Research Center (PARC). In 1970, Xerox
Corporation hired Dr. George Pake to lead a new research center in
Palo Alto, California. PARC attracted some of the United States' top
computer scientists and produced many groundbreaking inventions that
transformed computing — most notably the personal computer graphical
user interface, Ethernet, the laser printer, and object-oriented
programming. Xerox was unable to market the inventions from PARC, but
others did, including Steve Jobs (Apple), Bob Metcalfe (3Com), and
Charles Geschke and John Warnock (Adobe).

Engineers at PARC circa 1972
1971
RCA sells its computer division. RCA was founded in 1919 to make
vacuum tubes for radio, then a new invention. RCA began designing and
selling its own computers in the early 1950s, competing with IBM and
several other companies. By the 1970s, RCA, like other computer
makers, was struggling to compete against IBM. RCA made its machines
IBM-compatible, but ultimately even this strategy proved unsuccessful.
In 1971, RCA announced it would no longer build computers, selling its
computer business to Sperry Rand.

RCA Spectra 70 advertisement
1973
IMSAI is founded. In 1973, Bill Millard left his regular job in
management to found the consulting firm Information Management
Services, or IMS. The following year, while working on a client's
project, he developed a small computing system using the then-new
Intel 8080 microprocessor. He realized this computer might attract
other buyers and placed an advertisement in the hobbyist magazine
"Popular Electronics," offering it in kit form. The IMSAI 8080, as it
was known, sold briskly, and eventually about 20,000 units were
shipped. The company was eventually purchased by one of its dealers
and is today a division of the Fischer-Freitas Company, which still
offers reproductions of the original for sale to hobbyists.

IMSAI 8080 System
1975
Xerox closes its computer division. After acquiring computer maker
Scientific Data Systems (SDS) in 1969, Xerox redesigned SDS's
well-known Sigma line of computers. Xerox struggled against
competitors like IBM and in 1975 closed the division. Most of the
rights to the machines were sold to Honeywell.

Xerox Sigma-5
1980
Broderbund is founded. In 1980, brothers Doug and Gary Carlston formed
a company to market the games Doug had created. Their first games were
Galactic Empire, Galactic Trader, and Galactic Revolution. They
continued to have success with popular games such as Myst (1993) and
Riven (1997) and a wide range of home products such as Print Shop and
language tutors. In 1998, Broderbund was acquired by The Learning
Company, which, a year later, was itself acquired by Mattel, Inc.

Doug and Gary Carlston at Broderbund headquarters
1983
Thinking Machines is founded. Thinking Machines Corporation (TMC) was
formed by MIT graduate student Danny Hillis and others to develop a
new type of supercomputer. Their idea was to use many individual
processors of moderate power rather than one extremely powerful
processor. Their first machine, called the Connection Machine (CM-1),
had up to 65,536 processors and began shipping in 1986. TMC later
produced several larger computers with more powerful processors — the
CM-2 and CM-5. Competition from more established supercomputer firms
forced them into bankruptcy in 1993.

Connection Machine 2 with DataVault
1994
Netscape Communications Corporation is founded. Netscape was
originally founded as Mosaic Communications Corporation in April 1994
by Marc Andreessen, Jim Clark, and others. Its name was soon changed
to Netscape, and it delivered its first browser in October 1994. On
the day of Netscape's initial public offering in August 1995, its
share price went from $28 to $54 in the first few minutes of trading,
valuing the company at $2 billion. Netscape hired many of Silicon
Valley's programmers to provide new features and products and helped
begin the Internet boom of the 1990s.

Early Netscape diskette
Yahoo is founded. Founded by Stanford graduate students Jerry Yang and
David Filo, Yahoo started out as "Jerry's Guide to the World Wide Web"
before being renamed. Yahoo originally resided on two machines,
Akebono and Konishiki, both named after famous sumo wrestlers. Yahoo
would quickly expand to become one of the Internet's most popular
search engines.

Yahoo! founders Jerry Yang and David Filo, 2000

Components
1947
The Williams tube won the race for a practical random-access memory.
Sir Frederick Williams of Manchester University modified a cathode-ray
tube to paint dots and dashes of phosphorescent electrical charge on the
screen, representing binary ones and zeros. Vacuum tube machines, such
as the IBM 701, used the Williams tube as primary memory.

Williams tube
On December 23, William Shockley, Walter Brattain, and John Bardeen
successfully tested the point-contact transistor, setting off the
semiconductor revolution. Improved models of the transistor, developed
at AT&T Bell Laboratories, supplanted the vacuum tubes used in
computers at the time.

Point-contact transistor

1953
At MIT, Jay Forrester installed magnetic core memory on the Whirlwind
computer. Core memory made computers more reliable, faster, and
easier to make. Such a system of storage remained popular until the
development of semiconductors in the 1970s.

Core memory

1954
A silicon-based junction transistor, perfected by Gordon Teal of Texas
Instruments Inc., brought the price of this component down to $2.50. A
Texas Instruments news release from May 10, 1954, read, "Electronic
'brains' approaching the human brain in scope and reliability came
much closer to reality today with the announcement by Texas
Instruments Incorporated of the first commercial production of silicon
transistors, kernel-sized substitutes for vacuum tubes."

The company became a household name when the first transistor radio
incorporated Teal's invention. The radio, sold by Regency Electronics for
$50, launched the world into a global village of instant news and pop
music.

First production silicon junction transistors

1955
AT&T Bell Laboratories announced the first fully transistorized
computer, TRADIC. It contained nearly 800 transistors instead of
vacuum tubes. Transistors — completely cold, highly efficient
amplifying devices invented at Bell Labs — enabled the machine to
operate on fewer than 100 watts, or one-twentieth the power required
by comparable vacuum tube computers.

In this photograph, J. H. Felker (left) gives instructions to the
TRADIC computer by means of a plug-in unit while J. R. Harris places
numbers into the machine by flipping simple switches. The computer
occupied only 3 cubic feet.

Felker and Harris program TRADIC

1958
Jack Kilby created the first integrated circuit at Texas Instruments to
prove that resistors and capacitors could exist on the same piece of
semiconductor material. His circuit consisted of a sliver of germanium
with five components linked by wires.

Kilby integrated circuit

1959
Jean Hoerni's Planar process, invented at Fairchild Camera and
Instrument Corp., protected transistor junctions with a layer of
oxide. This improved reliability and, by allowing the printing of
conducting channels directly on the silicon surface, enabled Robert
Noyce's invention of the monolithic integrated circuit.

First Planar transistor

1961
Fairchild Camera and Instrument Corp. invented the resistor-transistor
logic (RTL) product, a set/reset flip-flop and the first integrated circuit
available as a monolithic chip.

RTL integrated circuit

1962
Fairchild Camera and Instrument Corp. produced the first widely accepted
epitaxial gold-doped NPN transistor. The NPN transistor served as the
industry workhorse for discrete logic.

Fairchild NPN transistor

1967
Fairchild Camera and Instrument Corp. built the first standard metal
oxide semiconductor product for data processing applications, an eight-
bit arithmetic unit and accumulator. In a MOS chip, engineers treat the
semiconductor material to produce either of two varieties of transistors,
called n-type and p-type.

MOS semiconductor

Using integrated circuits, Medtronic constructed the first internal
pacemaker.
1971
The first advertisement for a microprocessor, the Intel 4004, appeared in
Electronic News. Developed for Busicom, a Japanese calculator maker,
the 4004 had 2250 transistors and could perform up to 90,000 operations
per second in four-bit chunks. Federico Faggin led the design and Ted
Hoff led the architecture.

Intel 4004

1972
Intel's 8008 microprocessor made its debut. A vast improvement over its
predecessor, the 4004, its eight-bit word afforded 256 unique
arrangements of ones and zeros. For the first time, a microprocessor
could handle both uppercase and lowercase letters, all 10 numerals,
punctuation marks, and a host of other symbols.

Intel 8008
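The 256 figure follows directly from the eight-bit word: each added bit doubles the number of distinct patterns. A quick back-of-the-envelope check in Python (my own illustration, not from the original text):

```python
import string

# An 8-bit word has 2**8 = 256 distinct bit patterns.
word_values = 2 ** 8
print(word_values)  # 256

# 256 codes comfortably cover a full character set: 26 uppercase
# letters, 26 lowercase letters, 10 numerals, and punctuation.
charset = (string.ascii_uppercase + string.ascii_lowercase
           + string.digits + string.punctuation)
print(len(charset))  # 94 printable characters, well under 256
```

The 4004's four-bit word, by contrast, offered only 2**4 = 16 patterns, far too few for a character set.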

1976
Intel and Zilog introduced new microprocessors. Five times faster than its
predecessor, the 8008, the Intel 8080 could address four times as many
bytes for a total of 64 kilobytes. The Zilog Z-80 could run any program
written for the 8080 and included twice as many built-in machine
instructions.

Zilog Z-80

1979
The Motorola 68000 microprocessor exhibited a processing speed far
greater than its contemporaries. This high-performance processor found
its place in powerful workstations intended for the graphics-intensive
programs common in engineering.

Motorola 68000

California Institute of Technology professor Carver Mead and Xerox
Corp. computer scientist Lynn Conway wrote a manual of chip design,
"Introduction to VLSI Systems." Demystifying the planning of very large
scale integrated (VLSI) systems, the text expanded the ranks of
engineers capable of creating such chips. The authors had observed that
computer architects seldom participated in the specification of the
standard integrated circuits with which they worked. The authors
intended "Introduction to VLSI Systems" to fill a gap in the literature and
introduce all electrical engineering and computer science students to
integrated system architecture.

Introduction to VLSI Systems

1986
David Miller of AT&T Bell Labs patented the optical transistor, a
component central to digital optical computing. Called Self-ElectroOptic-
Effect Device, or SEED, the transistor involved a light-sensitive switch
built with layers of gallium arsenide and gallium aluminum arsenide.
Beams of light triggered electronic events that caused the light either to
be transmitted or absorbed, thus turning the switch on or off.

Within a decade, research on the optical transistor led to successful
work on the first all-optical processor and the first general-purpose all-optical
computer. Bell Labs researchers first demonstrated the processor there in
1990. A computer using the SEED also contained lasers, lenses, and fast
light switches, but it still required programming by a separate, non-optical
computer. In 1993, researchers at the University of Colorado unveiled the
first all-optical computer capable of being programmed and of
manipulating instructions internally.
Compaq beat IBM to the market when it announced the Deskpro 386, the
first computer on the market to use Intel's new 80386 chip, a 32-bit
microprocessor with 275,000 transistors on each chip. Running at 4
million operations per second and able to address 4 gigabytes of
memory, the 80386 gave PCs as much speed and power as older mainframes
and minicomputers.

The 386 chip brought with it the introduction of a 32-bit
architecture, a significant improvement over the 16-bit architecture of previous
microprocessors. It had two operating modes, one that mirrored the
segmented memory of older x86 chips, allowing full backward
compatibility, and one that took full advantage of its more advanced
technology. The new chip made graphical operating environments for IBM
PC and PC-compatible computers practical. The architecture that allowed
Windows and IBM OS/2 has remained in subsequent chips.

Compaq

1987
Motorola unveiled the 68030 microprocessor. A step up from the 68020,
it was an enhanced 32-bit microprocessor combining a central
processing unit core, a data cache, an instruction cache, an enhanced
bus controller, and a memory management unit in a single VLSI device —
all operating at speeds of at least 20 MHz.

Motorola 68030

1988
Compaq and other PC-clone makers developed the Extended Industry
Standard Architecture (EISA), which outperformed IBM's MicroChannel
Architecture while retaining compatibility with existing machines.
EISA used a 32-bit bus, or a means by which two devices can
communicate. The advanced data-handling features of EISA made it an
improvement over the 16-bit bus of the original Industry Standard
Architecture. IBM's competitors developed the EISA as a way to avoid
paying a fee to IBM for its MCA bus.

1989
Intel released the 80486 microprocessor and the i860 RISC/coprocessor
chip, each of which contained more than 1 million transistors. The RISC
microprocessor had a 32-bit integer arithmetic and logic unit (the part of
the CPU that performs operations such as addition and subtraction), a
64-bit floating-point unit, and a clock rate of 33 MHz.

The 486 chips remained similar in structure to their predecessors, the
386 chips. What set the 486 apart was its optimized instruction set, with
an on-chip unified instruction and data cache and an optional on-chip
floating-point unit. Combined with an enhanced bus interface unit, the
microprocessor doubled the performance of the 386 without increasing
the clock rate.

Intel 80486
Motorola announced the 68040 microprocessor, with about 1.2 million
transistors. Although promised for January 1990, technical
difficulties delayed shipment until 1991. A 32-bit, 25-MHz
microprocessor, the 68040 integrated a floating-point unit and
included instruction and data caches. Apple used this third generation
of 68000-series chips in its Macintosh Quadra computers.

Motorola 68040

1993
The Pentium microprocessor is released. The Pentium was the fifth
generation of the 'x86' line of microprocessors from Intel, the basis
for the IBM PC and its clones. The Pentium introduced several advances
that made programs run faster, such as the ability to execute several
instructions at the same time and support for graphics and music.

Intel Pentium Processor diagram



Computers
1939
Hewlett-Packard is founded. David Packard and Bill Hewlett founded
Hewlett-Packard in a Palo Alto, California garage. Their first product
was the HP 200A Audio Oscillator, which rapidly became a popular piece
of test equipment for engineers. Walt Disney Pictures ordered eight of
the 200B model to use as sound-effects generators for the 1940 movie
“Fantasia.”

David Packard and Bill Hewlett in their Palo Alto, California garage

1940
The Complex Number Calculator (CNC) is completed. In 1939, Bell
Telephone Laboratories completed this calculator, designed by researcher
George Stibitz. In 1940, Stibitz demonstrated the CNC at an American
Mathematical Society conference held at Dartmouth College. Stibitz
stunned the group by performing calculations remotely on the CNC
(located in New York City) using a Teletype connected via special
telephone lines. This is considered to be the first demonstration of remote
access computing.

Graphics & Games
1963
DAC-1 computer aided design program is released. In 1959, the General
Motors Research Laboratories appointed a special research team to
investigate the use of computers in designing automobiles. In 1960, IBM
joined the project, producing the first commercially-available Computer
Aided Design program, known as DAC-1. Out of that project came the
IBM 2250 display terminal as well as many advances in computer
timesharing and the use of a single processor by two or more terminals.

The DAC-1 System at GM Research Labs, 1965

1972
Pong is released. In 1966, Ralph Baer designed a ping-pong game for
his Odyssey gaming console. Nolan Bushnell played this game at a
Magnavox product show in Burlingame, California. Bushnell hired young
engineer Al Alcorn to design a car-driving game, but when it became
apparent that this was too ambitious for the time, he had Alcorn
design a version of ping-pong instead. The game was tested in bars in
Grass Valley and Sunnyvale, California, where it proved very popular.
Pong would revolutionize the arcade industry and launch the modern
video game era.

Original Atari Pong game screenshot

SuperPaint is completed. SuperPaint was the first digital computer
drawing system to use a frame buffer — a special high-speed memory —
and the ancestor of all modern paint programs. It could create
sophisticated animations, in up to 16.7 million colors, had adjustable
paintbrushes, video magnification, and used a graphics tablet for
drawing. It was designed by Richard Shoup and others at the Xerox Palo
Alto Research Center (PARC). Its designers won a technical Academy
Award in 1998 for their invention.

SuperPaint system in 1973
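The "16.7 million colors" figure corresponds to 24-bit color, eight bits for each of the red, green, and blue channels. A short sketch of the arithmetic (my gloss on the figure, not a detail from the original text):

```python
# 24-bit color: 8 bits for each of the three RGB channels.
bits_per_channel = 8
channels = 3
levels = 2 ** bits_per_channel   # 256 intensity levels per channel
colors = levels ** channels      # total distinct colors
print(colors)  # 16777216, i.e. about 16.7 million
```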

1977
Atari launches the Video Computer System game console. Atari released
the Atari Video Computer System (VCS) later renamed the Atari 2600.
The VCS was the first widely successful video game system, selling more
than twenty million units throughout the 1980s. The VCS used the 8-bit
MOS 6507 microprocessor and was designed to be connected to a home
television set. When the last of Atari’s 8-bit game consoles were made in
1990, more than 900 video game titles had been released.

Atari VCS Prototype

1986
Pixar is founded. Pixar was originally called the Special Effects
Computer Group at Lucasfilm (launched in 1979). The group created the
computer-animated segments of films such as “Star Trek II: The Wrath
of Khan” and “Young Sherlock Holmes.” In 1986, Apple Computer
co-founder Steve Jobs paid Lucasfilm $10 million to purchase the group
and renamed it Pixar. Over the next decade, Pixar made highly
successful (and Oscar-winning) animated films. It was bought by Disney
in 2006.

Pixar Headquarters

1989
The concept of virtual reality was the hot topic at Siggraph's 1989
convention in Boston. The Silicon Graphics booth featured the new
technology, designed by the computer-aided design software company
Autodesk and the computer company VPL. The term describes a
computer-generated 3-D environment that allows a user to interact with
the realities created there. The computer must calculate and display
sensory information quickly enough to fool the senses.

Howard Rheingold described virtual reality as "shared and objectively
present like the physical world, composable like a work of art, and as
unlimited and harmless as a dream." First practical for accomplishing
such tasks as flight simulation, virtual reality soon spread much
further, promising new ground in video games, education, and travel.
Computer users are placed into the virtual environment in a variety of
ways, from a large monitor to a head-mounted display or a glove.

1990
Video Toaster is introduced by NewTek. The Video Toaster was a video
editing and production system for the Amiga line of computers and
included custom hardware and special software. Much more affordable
than any other computer-based video editing system, the Video Toaster
was not only for home use. It was popular with public access stations and
was even good enough to be used for broadcast television shows like
Home Improvement.

Video Toaster installed at a local television station

1991
“Terminator 2: Judgment Day” opens. Director James Cameron’s sequel
to his 1984 hit “The Terminator” featured ground-breaking special
effects done by Industrial Light & Magic. Made for a record $100
million, it was the most expensive movie ever made at the time. Most
of this cost was due to the expense of computer-generated special
effects (such as image morphing) throughout the film. Terminator 2 is
one of many films that critique civilization’s frequent blind trust in
technology.

Original movie poster for Terminator 2: Judgment Day

1993
“Doom” is released. id Software released Doom in late 1993. An
immersive first-person shooter-style game, Doom became popular on
many different platforms before losing popularity to games like Halo and
Counter-Strike. Doom players were also among the first to customize the
game’s levels and appearance. Doom would spawn several sequels and a
2005 film.

Box Art for Doom

Timeline of Computer History

© 2006 Computer History Museum. All rights reserved.


1401 N. Shoreline Blvd., Mountain View CA 94043 Ph 650-810-1010

Networking
1960
AT&T designed its Dataphone, the first commercial modem, specifically
for converting digital computer data to analog signals for transmission
across its long distance network. Outside manufacturers incorporated Bell
Laboratories' digital data sets into commercial products. The
development of equalization techniques and bandwidth-conserving
modulation systems improved transmission efficiency in national and
global systems.

AT&T Dataphone

1964
Online transaction processing made its debut in IBM's SABRE reservation
system, set up for American Airlines. Using telephone lines, SABRE linked
2,000 terminals in 65 cities to a pair of IBM 7090 computers, delivering
data on any flight in less than three seconds.
JOSS (Johnniac Open Shop System) conversational time-sharing service
began on Rand's Johnniac. Time-sharing arose, in part, because the
length of batch turn-around times impeded the solution of problems.
Time sharing aimed to bring the user back into "contact" with the
machine for online debugging and program development.

JOSS configuration

1966
John van Geen of the Stanford Research Institute vastly improved the
acoustically coupled modem. His receiver reliably detected bits of data
despite background noise heard over long-distance phone lines. Inventors
developed the acoustically coupled modem to connect computers to the
telephone network by means of the standard telephone handset of the
day.

Acoustically coupled modem

1970
Citizens and Southern National Bank in Valdosta, Ga., installed the
country's first automatic teller machine.

Computer-to-computer communication expanded when the Department of
Defense established four nodes on the ARPANET: the University of
California Santa Barbara and UCLA, SRI International, and the
University of Utah. Viewed as a comprehensive resource-sharing
network, ARPANET's designers set out with several goals: direct use of
distributed hardware services; direct retrieval from remote,
one-of-a-kind databases; and the sharing of software subroutines and
packages not available on the users' primary computer due to
incompatibility of hardware or languages.

ARPANET topology

1971
The first e-mail is sent. Ray Tomlinson of the research firm Bolt,
Beranek and Newman sent the first e-mail when he was supposed to be
working on a different project. Tomlinson, who is credited with
deciding on the "@" sign for use in e-mail addresses, sent his message
over a military network called ARPANET. When asked to describe the
contents of the first e-mail, Tomlinson said it was "something like
'QWERTYUIOP.'"

Ray Tomlinson in 2001

1972
Steve Wozniak built his "blue box," a tone generator used to make free
phone calls. Wozniak sold the boxes in dormitories at the University
of California, Berkeley, where he studied as an undergraduate. "The
early boxes had a safety feature — a reed switch inside the housing
operated by a magnet taped onto the outside of the box," Wozniak
remembered. "If apprehended, you removed the magnet, whereupon it
would generate off-frequency tones and be inoperable ... and you tell
the police: It's just a music box."

Wozniak's "blue box"

1973
Robert Metcalfe devised the Ethernet method of network connection at
the Xerox Palo Alto Research Center. He wrote: "On May 22, 1973, using
my Selectric typewriter ... I wrote ... 'Ether Acquisition' ... heavy
with handwritten annotations — one of which was 'ETHER!' — and with
hand-drawn diagrams — one of which showed 'boosters' interconnecting
branched cable, telephone, and radio ethers in what we now call an
internet.... If Ethernet was invented in any one memo, by any one
person, or on any one day, this was it."

Robert M. Metcalfe, "How Ethernet Was Invented," IEEE Annals of the
History of Computing, Volume 16, No. 4, Winter 1994, p. 84.

Ethernet

1975
Telenet, the first commercial packet-switching network and civilian
equivalent of ARPANET, was born. The brainchild of Larry Roberts, Telenet
linked customers in seven cities. Telenet represented the first value-
added network, or VAN — so named because of the extras it offered
beyond the basic service of linking computers.

1976
The Queen of England sends her first e-mail. Elizabeth II, Queen of
the United Kingdom, sent an e-mail on March 26 from the Royal Signals
and Radar Establishment (RSRE) in Malvern as part of a demonstration
of networking technology.
1979
John Shoch and Jon Hupp at the Xerox Palo Alto Research Center
discover the computer "worm," a short program that searches a network
for idle processors. Initially designed to provide more efficient use of
computers and for testing, the worm had the unintended effect of
invading networked computers, creating a security threat.

Shoch took the term "worm" from the book "The Shockwave Rider," by
John Brunner, in which an omnipotent "tapeworm" program runs loose
through a network of computers. Brunner wrote: "No, Mr. Sullivan, we
can't stop it! There's never been a worm with that tough a head or
that long a tail! It's building itself, don't you understand? Already
it's passed a billion bits and it's still growing. It's the exact
inverse of a phage — whatever it takes in, it adds to itself instead
of wiping... Yes, sir! I'm quite aware that a worm of that type is
theoretically impossible! But the fact stands, he's done it, and now
it's so goddamn comprehensive that it can't be killed. Not short of
demolishing the net!" (247, Ballantine Books, 1975).

The Shockwave Rider

USENET established. USENET was invented as a means for providing mail
and file transfers using a communications standard known as UUCP. It
was developed as a joint project by Duke University and the University of
North Carolina at Chapel Hill by graduate students Tom Truscott, Jim Ellis,
and Steve Bellovin. USENET enabled its users to post messages and files
that could be accessed and archived. It would go on to become one of the
main areas for large-scale interaction for interest groups through the
1990s.

The first Multi-User Domain (or Dungeon), MUD1, goes on-line. Richard
Bartle and Roy Trubshaw, two students at the University of Essex, write a
program that allows many people to play against each other on-line.
MUDs become popular with college students as a means of adventure
gaming and for socializing. By 1984, there are more than 100 active
MUDs and variants around the world.

Richard Bartle and Roy Trubshaw circa 1999

1983
The ARPANET splits into the ARPANET and MILNET. Due to the success of
the ARPANET as a way for researchers in universities and the military to
collaborate, it was split into military (MILNET) and civilian (ARPANET)
segments. This was made possible by the adoption of TCP/IP, a
networking standard, three years earlier. The ARPANET was renamed the
“Internet” in 1995.

1985
The modern Internet gained support when the National Science
Foundation formed the NSFNET, linking five supercomputer centers at
Princeton University, Pittsburgh, University of California at San Diego,
University of Illinois at Urbana-Champaign, and Cornell University. Soon,
several regional networks developed; eventually, the government
reassigned pieces of the ARPANET to the NSFNET. The NSF allowed
commercial use of the Internet for the first time in 1991, and in 1995, it
decommissioned the backbone, leaving the Internet a self-supporting
industry.

The NSFNET initially transferred data at 56 kilobits per second, an
improvement on the overloaded ARPANET. Traffic continued to increase,
though, and in 1987, the NSF awarded Merit Network Inc., IBM, and MCI a
contract to expand the Internet by providing access points around the
country to a network with a bandwidth of 1.5 megabits per second. In
1992, the network upgraded to T-3 lines, which transmit information at
about 45 megabits per second.

The Whole Earth 'Lectronic Link (WELL) is founded. Stewart Brand and
Larry Brilliant started an on-line Bulletin Board System (BBS) to build a
“virtual community” of computer users at low cost. Journalists were given
free memberships in the early days, leading to many articles about it and
helping it grow to thousands of members around the world.

Stewart Brand and Larry Brilliant lecturing on the Well, 1999

1988
Robert Morris' worm flooded the ARPANET. Then-23-year-old Morris, the
son of a computer security expert for the National Security Agency, sent
a nondestructive worm through the Internet, causing problems for about
6,000 of the 60,000 hosts linked to the network. A researcher at
Lawrence Livermore National Laboratory in California discovered the
worm. "It was like the Sorcerer's Apprentice," Dennis Maxwell, then a
vice president of SRI, told the Sydney (Australia) Sunday Telegraph at
the time. Morris was sentenced to three years of probation, 400 hours of
community service, and a fine of $10,050.

Morris, who said he was motivated by boredom, programmed the worm
to copy itself repeatedly onto networked computers. The accumulating
copies eventually consumed enough of the computers' memories to
disable them.

ARPANET worm

1990
The World Wide Web was born when Tim Berners-Lee, a researcher at
CERN, the high-energy physics laboratory in Geneva, developed
HyperText Markup Language. HTML, as it is commonly known, allowed
the Internet to expand into the World Wide Web, using specifications he
developed such as URL (Uniform Resource Locator) and HTTP
(HyperText Transfer Protocol). A browser, such as Netscape or Microsoft
Internet Explorer, follows links and sends a query to a server, allowing a
user to view a site.
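At the wire level, "sends a query to a server" means writing a short plain-text message. The sketch below shows the shape of such a request in an HTTP/1.0 style; the host and path are purely illustrative:

```python
# A browser turns a URL like http://example.org/index.html into a
# plain-text HTTP request written over a network connection.
def build_request(host, path):
    # Method, path, and protocol version on the first line; headers
    # follow; a blank line ends the header section.
    return f"GET {path} HTTP/1.0\r\nHost: {host}\r\n\r\n"

print(build_request("example.org", "/index.html"))
```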

Berners-Lee based the World Wide Web on Enquire, a hypertext system
he had developed for himself, with the aim of allowing people to work
together by combining their knowledge in a global web of hypertext
documents. With this idea in mind, Berners-Lee designed the first World
Wide Web server and browser — available to the general public in 1991.
Berners-Lee founded the W3 Consortium, which coordinates World Wide
Web development.

Berners-Lee proposal

1993
The Mosaic web browser is released. Mosaic was the first commercial
software that allowed graphical access to content on the internet.
Developed by Eric Bina and Marc Andreessen at the University of Illinois's
National Center for Supercomputing Applications, Mosaic was originally
designed for a Unix system running the X Window System. By 1994, Mosaic was
available for several other operating systems such as the Mac OS,
Windows and AmigaOS.

Screen Capture from Original Mosaic Browser

Timeline of Computer History
About the Museum | Exhibits | Collections | What's Happening | Giving | About This Site
Privacy | Copyright | Feedback | Credits | Advanced Search | Site Map

© 2006 Computer History Museum. All rights reserved.


1401 N. Shoreline Blvd., Mountain View CA 94043 Ph 650-810-1010

People & Pop Culture
1945
On September 9th, Grace Hopper recorded the first actual computer
"bug" — a moth stuck between the relays and logged at 15:45 hours on
the Harvard Mark II. Hopper, a rear admiral in the U.S. Navy, enjoyed
successful careers in academia, business, and the military while making
history in the computer field. She helped program the Harvard Mark I and
II and developed the first compiler, A-0. Her subsequent work on
programming languages led to COBOL, a language specified to operate
on machines of different manufacturers.

Grace Hopper
1949
Thomas Watson Jr., speaking to an IBM sales meeting, predicted all
moving parts in machines would be replaced by electronics within a
decade.

1952
On election night, November 4, CBS News borrowed a UNIVAC to make a
scientific prediction of the outcome of the race for the presidency
between Dwight D. Eisenhower and Adlai Stevenson. The opinion polls
predicted a landslide in favor of Stevenson, but the UNIVAC's analysis of
early returns showed a clear victory for Eisenhower. Its sharp divergence
from public opinion made newscasters Walter Cronkite and Charles
Collingwood question the validity of the computer's forecast, so they
postponed announcing UNIVAC's prediction until very late.

Cronkite with UNIVAC

1954
Alan Turing was found dead at age 42. He had published his seminal
paper, "On Computable Numbers," in 1936, posed significant questions
about judging "human intelligence," and worked on programming and
the design of several computers during the course of his career.

A mathematical genius, Turing proved instrumental in code-breaking
efforts during World War II. His application of logic to that realm would
emerge even more significantly in his development of the concept of a
"universal machine."

Alan Turing

1955
First meeting of SHARE, the IBM users group, convened. User groups
became a significant educational force allowing companies to
communicate innovations and users to trade information.

1970
Vietnam War protesters attacked university computer centers. At the
University of Wisconsin, the toll was one human and four machines.

1982
Time magazine altered its annual tradition of naming a "Man of the Year,"
choosing instead to name the computer its "Machine of the Year." In
introducing the theme, Time publisher John A. Meyers wrote, "Several
human candidates might have represented 1982, but none symbolized
the past year more richly, or will be viewed by history as more significant,
than a machine: the computer."

His magazine, he explained, had chronicled the change in public opinion
with regard to computers. A senior writer contributed: "computers were
once regarded as distant, ominous abstractions, like Big Brother. In 1982,
they truly became personalized, brought down to scale, so that people
could hold, prod and play with them." At Time, the main writer on the
project completed his work on a typewriter, but Meyers noted that the
magazine's newsroom would upgrade to word processors within a year.

The use of computer-generated graphics in movies took a step forward
with Disney's release of "Tron." One of the first movies to use such
graphics, "Tron" also put computers at the center of its plot, following
the adventures of a hacker split into molecules and transported inside a
computer. Computer animation, done by III, Abel, MAGI, and Digital
Effects, accounted for about 30 minutes of the film.

1984
In his novel "Neuromancer," William Gibson coined the term
"cyberspace." He also spawned a genre of fiction known as "cyberpunk"
in his book, which described a dark, complex future filled with intelligent
machines, computer viruses, and paranoia.

Gibson introduced cyberspace as: "A consensual hallucination
experienced daily by billions of legitimate operators, in every nation, by
children being taught mathematical concepts... A graphic representation
of data abstracted from the banks of every computer in the human
system. Unthinkable complexity. Lines of light ranged in the nonspace of
the mind, clusters and constellations of data. Like city lights, receding..."
(p. 51).

Gibson's Neuromancer

1988
Pixar´s "Tin Toy" became the first computer-animated film to win an
Academy Award, taking the Oscar for best animated short film. A wind-up
toy first encountering a boisterous baby narrated "Tin Toy." To illustrate
the baby´s facial expressions, programmers defined more than 40 facial
muscles on the computer controlled by the animator.

Founded in 1986, Pixar made one of its primary projects a renderer
called RenderMan, a standard for describing 3-D scenes. RenderMan
describes objects, light sources, cameras, atmospheric effects, and other
information so that a scene can be rendered on a variety of systems. The
company continued on to other successes, including 1995's "Toy Story,"
the first full-length feature film created entirely by computer animation.

Still from Pixar's Tin Toy


Robots & Artificial Intelligence
1948
Norbert Wiener published "Cybernetics," a major influence on later
research into artificial intelligence. He drew on his World War II
experiments with anti-aircraft systems that anticipated the course of
enemy planes by interpreting radar images. Wiener coined the term
"cybernetics" from the Greek word for "steersman."

In addition to "cybernetics," historians note Wiener for his analysis of
brain waves and for his exploration of the similarities between the human
brain and the modern computing machine capable of memory
association, choice, and decision making.

Norbert Wiener

1959
MIT's Servomechanisms Laboratory demonstrated computer-assisted
manufacturing. The school's Automatically Programmed Tools project
created a language, APT, used to instruct milling machine operations. At
the demonstration, the machine produced an ashtray for each attendee.

APT ashtray

1961
UNIMATE, the first industrial robot, began work at General Motors.
Obeying step-by-step commands stored on a magnetic drum, the 4,000-
pound arm sequenced and stacked hot pieces of die-cast metal.

The brainchild of Joe Engelberger and George Devol, UNIMATE originally
automated the manufacture of TV picture tubes.

UNIMATE

1963
Researchers designed the Rancho Arm at Rancho Los Amigos Hospital in
Downey, California as a tool for the handicapped. The Rancho Arm's six
joints gave it the flexibility of a human arm. Acquired by Stanford
University in 1963, it holds a place among the first artificial robotic arms
to be controlled by a computer.

Rancho Arm

1965
A Stanford team led by Ed Feigenbaum created DENDRAL, the first expert
system, or program designed to execute the accumulated expertise of
specialists. DENDRAL applied a battery of "if-then" rules in chemistry and
physics to identify the molecular structure of organic compounds.

1968
Marvin Minsky developed the Tentacle Arm, which moved like an octopus.
It had twelve joints designed to reach around obstacles. The hydraulically
powered arm was controlled by a PDP-6 computer. Mounted on a wall, it
could lift the weight of a person.

Tentacle Arm

1969
Victor Scheinman's Stanford Arm made a breakthrough as the first
successful electrically powered, computer-controlled robot arm. By 1974,
the Stanford Arm could assemble a Ford Model T water pump, guiding
itself with optical and contact sensors. The Stanford Arm led directly to
commercial production. Scheinman went on to design the PUMA series of
industrial robots for Unimation, robots used for automobile assembly and
other industrial tasks.

Stanford Arm

1970
SRI International's Shakey became the first mobile robot controlled by
artificial intelligence. Equipped with sensing devices and driven by a
problem-solving program called STRIPS, the robot found its way around
the halls of SRI by applying information about its environment to a route.
Shakey used a TV camera, laser range finder, and bump sensors to
collect data, which it then transmitted to a DEC PDP-10 and PDP-15. The
computer radioed back commands to Shakey — who then moved at a
speed of 2 meters per hour.

SRI's Shakey

1974
David Silver at MIT designed the Silver Arm, a robotic arm to do
small-parts assembly using feedback from delicate touch and pressure
sensors. The arm's fine movements corresponded to those of human fingers.

Silver Arm

1976
Shigeo Hirose's Soft Gripper could conform to the shape of a grasped
object, such as a wine glass filled with flowers. The design Hirose
created at the Tokyo Institute of Technology grew from his studies of
flexible structures in nature, such as elephant trunks and snake spinal
cords.

Hirose's Soft Gripper

1978
Texas Instruments Inc. introduced Speak & Spell, a talking learning aid
for ages 7 and up. Its debut marked the first electronic duplication of the
human vocal tract on a single chip of silicon. Speak & Spell utilized linear
predictive coding to formulate a mathematical model of the human vocal
tract and predict a speech sample based on previous input. It
transformed digital information processed through a filter into synthetic
speech and could store more than 100 seconds of linguistic sounds.

Shown here are the four individuals who began the Speak & Spell
program: From left to right, Gene Frantz, Richard Wiggins, Paul
Breedlove, and George Brantingham.

Speak & Spell creators


1979
In development since 1967, the Stanford Cart successfully crossed a
chair-filled room without human intervention in 1979. Hans Moravec
rebuilt the Stanford Cart in 1977, equipping it with stereo vision. A
television camera, mounted on a rail on the top of the cart, took pictures
from several different angles and relayed them to a computer. The
computer gauged the distance between the cart and obstacles in its path.

Stanford Cart

1983
The Musical Instrument Digital Interface was introduced at the first North
American Music Manufacturers show in Los Angeles. MIDI is an industry-
standard electronic interface that links electronic music synthesizers. The
MIDI information tells a synthesizer when to start and stop playing a
specific note, what sound that note should have, how loud it should be,
and other information.
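The note messages the paragraph describes are short byte sequences. This sketch of the MIDI 1.0 "note on"/"note off" channel messages (a simplified illustration, not a full MIDI implementation) shows the three-byte layout:

```python
# A MIDI "note on" message is three bytes: a status byte (0x90 plus a
# 4-bit channel number), a note number (middle C is 60), and a
# velocity from 0-127 that indicates how loud the note should be.
def note_on(note, velocity, channel=0):
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

def note_off(note, channel=0):
    # Status byte 0x80 tells the synthesizer to stop the same note.
    return bytes([0x80 | (channel & 0x0F), note & 0x7F, 0])

print(note_on(60, 100).hex())   # 903c64
print(note_off(60).hex())       # 803c00
```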

Raymond Kurzweil, a pioneer in developing the electronic keyboard,
predicted that MIDI and other advances would make traditional musical
instruments obsolete. In the 21st century, he wrote in his
book, "The Age of Intelligent Machines," "There will still be acoustic
instruments around, but they will be primarily of historical interest, much
like harpsichords are today.... While the historically desirable sounds of
pianos and violins will continue to be used, most music will use sounds
with no direct acoustic counterpart.... There will not be a sharp division
between the musician and nonmusician."

MIDI


Software & Languages
1945
Konrad Zuse began work on Plankalkul (Plan Calculus), the first
algorithmic programming language, with an aim of creating the
theoretical preconditions for the formulation of problems of a general
nature. Seven years earlier, Zuse had developed and built the world's
first binary digital computer, the Z1. He completed the first fully
functional program-controlled electromechanical digital computer, the Z3,
in 1941. Only the Z4 — the most sophisticated of his creations —
survived World War II.

Konrad Zuse

1948
Claude Shannon's "A Mathematical Theory of Communication" showed
engineers how to code data so they could check for accuracy after
transmission between computers. Shannon identified the bit as the
fundamental unit of data and, coincidentally, the basic unit of
computation.
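A single parity bit is the simplest instance of coding data so its accuracy can be checked after transmission — a minimal sketch in that spirit, not a construction from Shannon's paper itself:

```python
# Even parity: append one bit so the total count of 1s is even.
# The receiver recomputes the count to detect a single flipped bit.
def add_parity(bits):
    return bits + [sum(bits) % 2]

def check_parity(bits):
    return sum(bits) % 2 == 0

word = add_parity([1, 0, 1, 1])   # -> [1, 0, 1, 1, 1]
print(check_parity(word))         # True: arrived intact
word[2] ^= 1                      # one bit corrupted in transit
print(check_parity(word))         # False: error detected
```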

Claude Shannon

1952
Grace Hopper completes the A-0 compiler. In 1952, mathematician
Grace Hopper completed what is considered to be the first compiler, a
program that allows a computer user to use English-like words instead of
numbers. Other compilers based on A-0 followed: ARITH-MATIC,
MATH-MATIC, and FLOW-MATIC.

Grace Hopper with UNIVAC 1

1953
John Backus completed Speedcoding for IBM's 701 computer. Although
Speedcoding demanded more memory and compute time, it trimmed
weeks off the programming schedule.

IBM 701

1955
Herbert Simon and Allen Newell unveiled Logic Theorist software that
supplied rules of reasoning and proved symbolic logic theorems. The
release of Logic Theorist marked a milestone in establishing the field of
artificial intelligence.

1956
In the mid-fifties resources for scientific and engineering computing were
in short supply and were very precious. The first operating system for the
IBM 704 reflected the cooperation of Bob Patrick of General Motors
Research and Owen Mock of North American Aviation. Called the GM-NAA
I/O System, it provided batch processing and increased the number of
completed jobs per shift with no increase in cost. Some version of the
system was used in about forty 704 installations.

At MIT, researchers began experimenting with direct keyboard input to
computers, a precursor to today's normal mode of operation. Doug Ross
wrote a memo advocating direct access in February; five months later,
the Whirlwind aided in such an experiment.

MIT Whirlwind

1957
Sperry Rand released a commercial compiler for its UNIVAC. Developed
by Grace Hopper as a refinement of her earlier innovation, the A-0
compiler, the new version was called MATH-MATIC. Earlier work on the A-
0 and A-2 compilers led to the development of the first English-language
business data processing compiler, B-0 (FLOW-MATIC), also completed in
1957. FLOW-MATIC served as a model on which to build with input from
other sources.

UNIVAC MATH-MATIC

A new language, FORTRAN (short for FORmula TRANslator), enabled a
computer to perform a repetitive task from a single set of instructions by
using loops. The first commercial FORTRAN program ran at
Westinghouse, producing a missing comma diagnostic. A successful
attempt followed.

IBM 704 FORTRAN

1959
ERMA, the Electronic Recording Method of Accounting, digitized checking
for the Bank of America by creating a computer-readable font. A special
scanner read account numbers preprinted on checks in magnetic ink.

ERMA characters
1960
A team drawn from several computer manufacturers and the Pentagon
developed COBOL, Common Business Oriented Language. Designed for
business use, early COBOL efforts aimed for easy readability of computer
programs and as much machine independence as possible. Designers
hoped a COBOL program would run on any computer for which a compiler
existed with only minimal modifications.

Howard Bromberg, an impatient member of the committee in charge of
creating COBOL, had a tombstone made out of fear that the language
had no future. However, COBOL has survived to this day.

COBOL design team

LISP made its debut as the first computer language designed for writing
artificial intelligence programs. Created by John McCarthy, LISP offered
programmers flexibility in organization.

LISP Programmer´s Reference


Quicksort is developed. Working for the British computer company Elliott
Brothers, C. A. R. Hoare developed Quicksort, an algorithm that would go
on to become one of the most widely used sorting methods in the world.
Quicksort repeatedly partitions the data around chosen elements called
"pivots," which allows it to sort quickly. C. A. R. Hoare was knighted by
Queen Elizabeth II in 2000.
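The pivot idea can be sketched in a few lines of Python — a modern illustration of the technique, not Hoare's original implementation:

```python
def quicksort(items):
    """Sort a list by recursively partitioning around a pivot."""
    if len(items) <= 1:
        return items
    pivot = items[0]
    # Everything smaller than the pivot goes left, the rest right;
    # sorting each side independently sorts the whole list.
    smaller = [x for x in items[1:] if x < pivot]
    larger = [x for x in items[1:] if x >= pivot]
    return quicksort(smaller) + [pivot] + quicksort(larger)

print(quicksort([5, 3, 8, 1, 9, 2]))  # [1, 2, 3, 5, 8, 9]
```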

C.A.R. Hoare circa 1965

1962
MIT students Slug Russell, Shag Graetz, and Alan Kotok wrote
SpaceWar!, considered the first interactive computer game. First played
at MIT on DEC's PDP-1, the game featured interactive, shoot-'em-up
graphics on a large-scope display that inspired future video games.
Dueling players fired at each other's spaceships and used early versions
of joysticks to maneuver away from the central gravitational force of a
sun as well as away from the enemy ship.

Spacewar! on PDP-1

1963
Ivan Sutherland published Sketchpad, an interactive, real-time computer
drawing system, as his MIT doctoral thesis. Using a light pen and
Sketchpad, a designer could draw and manipulate geometric figures on
the screen.

Sketchpad document

ASCII — American Standard Code for Information Interchange —
permitted machines from different manufacturers to exchange data.
ASCII consists of 128 unique seven-bit codes of ones and zeros. Each code
represents a letter of the English alphabet, an Arabic numeral, an
assortment of punctuation marks and symbols, or a function such as a
carriage return.
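Those seven-bit codes are still the foundation of modern character sets, so a quick illustration is possible in any language today — here in Python:

```python
# Every ASCII character maps to a code from 0 to 127. ord() returns
# the code; format(..., "07b") shows it as seven binary digits.
for ch in ["A", "a", "0", "\r"]:
    print(repr(ch), ord(ch), format(ord(ch), "07b"))
# 'A'  65  1000001
# 'a'  97  1100001
# '0'  48  0110000
# '\r' 13  0001101   (carriage return, a control function)
```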

ASCII code

1964
Thomas Kurtz and John Kemeny created BASIC, an easy-to-learn
programming language, for their students at Dartmouth College.

BASIC manual

1965
Object-oriented languages got an early boost with Simula, written by
Kristen Nygaard and Ole-Johan Dahl. Simula grouped data and instructions
into blocks called objects, each representing one facet of a system
intended for simulation.
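The object idea Simula pioneered survives in every object-oriented language today; a minimal modern illustration (in Python rather than Simula):

```python
# Data and the instructions that operate on it, grouped into one
# "object" — the organizing idea Simula introduced for simulations.
class Vehicle:
    def __init__(self, speed=0):
        self.speed = speed          # data held by the object

    def accelerate(self, delta):    # behavior bound to that data
        self.speed += delta

car = Vehicle()
car.accelerate(30)
print(car.speed)  # 30
```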

1967
Seymour Papert designed LOGO as a computer language for children.
Initially a drawing program, LOGO controlled the actions of a mechanical
"turtle," which traced its path with pen on paper. Electronic turtles made
their designs on a video display monitor.

Papert emphasized creative exploration over memorization of facts:
"People give lip service to learning to learn, but if you look at curriculum
in schools, most of it is about dates, fractions, and science facts; very
little of it is about learning. I like to think of learning as an expertise that
every one of us can acquire."

Seymour Papert

1968
Edsger Dijkstra's letter "Go To Statement Considered Harmful," published in
Communications of the ACM, fired the first salvo in the structured
programming wars. The ACM considered the resulting acrimony
sufficiently harmful that it established a policy of no longer printing
articles taking such an assertive position against a coding practice.

1969
The RS-232-C standard for communication permitted computers and
peripheral devices to transmit information serially — that is, one bit at a
time. The RS-232-C protocol spelled out a purpose for a serial plug's 25
connector pins.
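Sending one bit at a time means framing each byte so the receiver can find its edges. This sketch shows simplified asynchronous serial framing in that spirit — an illustration, not the full RS-232-C electrical specification:

```python
# Simplified asynchronous serial framing: one start bit (0), eight
# data bits sent least-significant-bit first, then one stop bit (1).
def frame_byte(value):
    data_bits = [(value >> i) & 1 for i in range(8)]  # LSB first
    return [0] + data_bits + [1]

# 'A' is 0x41 = 0b01000001
print(frame_byte(ord("A")))  # [0, 1, 0, 0, 0, 0, 0, 1, 0, 1]
```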
AT&T Bell Laboratories programmers Kenneth Thompson and Dennis
Ritchie developed the UNIX operating system on a spare DEC
minicomputer. UNIX combined many of the timesharing and file
management features offered by Multics, from which it took its name.
(Multics, a project of the mid-1960s, represented the first effort at
creating a multi-user, multi-tasking operating system.) The UNIX
operating system quickly secured a wide following, particularly among
engineers and scientists.

UNIX "license plate"

1972
Nolan Bushnell introduced Pong and his new video game company, Atari.

1976
Gary Kildall developed CP/M, an operating system for personal
computers. Widely adopted, CP/M made it possible for one version of a
program to run on a variety of computers built around eight-bit
microprocessors.

CP/M

1977
The U.S. government adopted IBM's data encryption standard, the key to
unlocking coded messages, to protect confidentiality within its agencies.
Available to the general public as well, the standard required an eight-
number key for scrambling and unscrambling data. The 70 quadrillion
possible combinations made breaking the code by trial and error unlikely.
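The quoted figure follows from the standard's 56-bit effective key: a brute-force attacker must try up to 2^56 keys, about 72 quadrillion — the number the text rounds to 70:

```python
# DES has a 56-bit effective key, so a trial-and-error attack faces
# 2**56 possible keys.
keyspace = 2 ** 56
print(f"{keyspace:,}")  # 72,057,594,037,927,936
```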

1979
Harvard MBA candidate Daniel Bricklin and programmer Robert Frankston
developed VisiCalc, the program that made a business machine of the
personal computer, for the Apple II. VisiCalc (for Visible Calculator)
automated the recalculation of spreadsheets. A huge success, VisiCalc
sold more than 100,000 copies in its first year.

Bob Frankston and Dan Bricklin


1981
MS-DOS, the Microsoft Disk Operating System and the basic software for
the newly released IBM PC, established a long partnership between IBM
and Microsoft, which Bill Gates and Paul Allen had founded only six years
earlier.

1982
Mitch Kapor developed Lotus 1-2-3, writing the software directly into the
video system of the IBM PC. By bypassing DOS, it ran much faster than
its competitors. Along with the immense popularity of IBM's computer,
Lotus owed much of its success to its working combination of
spreadsheet capabilities with graphics and data retrieval capabilities.

Kapor, who received his bachelor's degree in an individually designed
cybernetics major from Yale University in 1971, started Lotus
Development Corp. to market his spreadsheet and served as its president
and CEO from 1982 to 1986. He also has worked to develop policies that
maximize openness and competitiveness in the computer industry.

Lotus 1-2-3

1983
Microsoft announced Word, originally called Multi-Tool Word, and
Windows. The latter did not ship until 1985, although the company had
said it was on track for an April 1984 release. In a marketing blitz,
Microsoft distributed 450,000 disks demonstrating its Word program in
the November issue of PC World magazine.

1985
Aldus announced its PageMaker program for use on Macintosh
computers, launching an interest in desktop publishing. Two years later,
Aldus released a version for IBMs and IBM-compatible computers.
Developed by Paul Brainerd, who founded Aldus Corp., PageMaker
allowed users to combine graphics and text easily enough to make
desktop publishing practical.

Chuck Geschke of Adobe Systems Inc., which merged with Aldus in 1994,
remembered: "John Sculley, a young fellow
at Apple, got three groups together — Aldus, Adobe, and Apple — and
out of that came the concept of desktop publishing. Paul Brainerd of
Aldus is probably the person who first uttered the phrase. All three
companies then took everybody who could tie a tie and speak two
sentences in a row and put them on the road, meeting with people in the
printing and publishing industry and selling them on this concept. The net
result was that it turned around not only the laser printer but, candidly,
Apple Computer. It really turned around that whole business."

Aldus PageMaker

The C++ programming language emerged as the dominant
object-oriented language in the computer industry when Bjarne Stroustrup
published "The C++ Programming Language." Stroustrup, at AT&T Bell
Laboratories, said his motivation stemmed from a desire to write event-
driven simulations that needed a language faster than Simula. He
developed a preprocessor that allowed Simula style programs to be
implemented efficiently in C.

Stroustrup wrote in the preface to "The C++ Programming Language":
"C++ is a general purpose programming language designed to make
programming more enjoyable for the serious programmer. Except for
minor details, C++ is a superset of the C programming language. In
addition to the facilities provided by C, C++ provides flexible and efficient
facilities for defining new types.... The key concept in C++ is class. A
class is a user-defined type. Classes provide data hiding, guaranteed
initialization of data, implicit type conversion for user-defined types,
dynamic typing, user-controlled memory management, and mechanisms
for overloading operators.... C++ retains C's ability to deal efficiently with
the fundamental objects of the hardware (bits, bytes, words, addresses,
etc.). This allows the user-defined types to be implemented with a
pleasing degree of efficiency."

1987
Apple engineer William Atkinson designed HyperCard, a software tool that
simplifies development of in-house applications. HyperCard differed from
previous programs of its sort because Atkinson made it interactive rather
than language-based and geared it toward the construction of user
interfaces rather than the processing of data. In HyperCard, programmers
built stacks with the concept of hypertext links between stacks of pages.
Apple distributed the program free with Macintosh computers until 1992.

HyperCard users could look through existing HyperCard stacks as well as
add to or edit the stacks. As a stack author, a programmer employed
various tools to create his own stacks, linked together as a sort of slide
show. At the lowest level, the program linked cards sequentially in
chronological order, but the HyperTalk programming language allowed
more sophisticated links.

1989
Maxis released SimCity, a video game that helped launch a series of
simulators. Maxis cofounder Will Wright built on his childhood interest in
plastic models of ships and airplanes, eventually starting up a company
with Jeff Braun and designing a computer program that allowed the user
to create his own city. A number of other Sims followed in the series,
including SimEarth, SimAnt, and SimLife.

In SimCity, a player starts with an untouched earth. As the city's mayor
or planner, the player creates a landscape and then constructs buildings,
roads, and waterways. As the city grows, the mayor must provide basic
services like health care and education, as well as making decisions about
where to direct money and how to build a revenue base. Challenges
come in the form of natural disasters, airplane crashes, and monster
attacks.

Box Art for SimCity

1990
Microsoft shipped Windows 3.0 on May 22. Compatible with DOS
programs, the first successful version of Windows finally offered good
enough performance to satisfy PC users. For the new version, Microsoft
revamped the interface and created a design that allowed PCs to support
large graphical applications for the first time. It also allowed multiple
programs to run simultaneously on Intel 80386-based PCs.

Microsoft released Windows amid a $10 million publicity blitz. In addition
to making sure consumers knew about the product, Microsoft lined up a
number of other applications ahead of time that ran under Windows 3.0,
including versions of Microsoft Word and Microsoft Excel. As a result, PCs
moved toward the user-friendly concepts of the Macintosh, making IBM
and IBM-compatible computers more popular.

1991
The Linux operating system is introduced. Designed by Finnish university
student Linus Torvalds, Linux was released to several Usenet newsgroups
on September 17th, 1991. Almost immediately, enthusiasts began
developing and improving Linux, such as adding support for peripherals
and improving its stability. Linux is now one of several open source Unix-
like operating systems.

Linus Torvalds, 1991

Pretty Good Privacy is introduced. Pretty Good Privacy, or PGP, is an
e-mail encryption program. Its inventor, software engineer Phil
Zimmermann, created it as a tool for people to protect themselves from
intrusive governments around the world. Zimmermann posted PGP on
the Internet in 1991, where it was available as a free download. The
United States government, concerned about the strength of PGP, which
rivaled some of the best secret codes in use at the time, opened a
criminal investigation of Zimmermann but dropped it in 1996. PGP is now
one of the most widely used encryption systems for e-mail in the world.
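PGP's central idea is a hybrid scheme: a fresh symmetric session key encrypts the message, and the recipient's public key encrypts only that session key. The toy sketch below illustrates the shape of that scheme with deliberately tiny RSA numbers and a keyed XOR stream standing in for a real cipher; it is not cryptographically secure, and none of the values come from PGP itself.

```python
# Toy illustration of PGP's hybrid encryption (NOT real cryptography):
# a random session key encrypts the message; the recipient's public key
# encrypts the session key. Tiny textbook-RSA numbers for clarity.
import random

p, q = 61, 53                      # toy primes; real PGP keys are far larger
n, e = p * q, 17                   # recipient's public key (n, e)
d = pow(e, -1, (p - 1) * (q - 1))  # recipient's private exponent

def xor_stream(data: bytes, key: int) -> bytes:
    """Toy symmetric cipher: a keyed XOR stream (same call decrypts)."""
    random.seed(key)
    return bytes(b ^ random.randrange(256) for b in data)

# Sender: pick a session key, encrypt the message, then wrap the key.
session_key = random.randrange(2, n)
ciphertext = xor_stream(b"attack at dawn", session_key)
wrapped_key = pow(session_key, e, n)       # RSA-encrypt the session key

# Recipient: unwrap the session key with the private key, then decrypt.
recovered_key = pow(wrapped_key, d, n)
plaintext = xor_stream(ciphertext, recovered_key)
```

The hybrid design is why PGP scales: the slow public-key step touches only the short session key, while the bulk of the message goes through a fast symmetric cipher.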

Timeline of Computer History
About the Museum | Exhibits | Collections | What's Happening | Giving | About This Site
Privacy | Copyright | Feedback | Credits | Advanced Search | Site Map

© 2006 Computer History Museum. All rights reserved.


1401 N. Shoreline Blvd., Mountain View CA 94043 Ph 650-810-1010

Storage
1956
The era of magnetic disk storage dawned with IBM's shipment of a 305
RAMAC to Zellerbach Paper in San Francisco. The IBM 350 disk file served
as the storage component for the Random Access Method of Accounting
and Control. It consisted of 50 magnetically coated metal platters storing
5 million characters of data. The platters, stacked one on top of the
other, rotated on a common drive shaft.

IBM's Rey Johnson in front of RAMAC 350 Disk File

1961
IBM 1301 Disk Storage Unit is released. The IBM 1301 Disk Drive was
announced on June 2nd, 1961 for use with IBM’s 7000-series of
mainframe computers. Maximum capacity was 28 million characters and
the disks rotated at 1,800 R.P.M. The 1301 leased for $2,100 per month
or could be purchased for $115,500. The drive had one read/write arm
for each disk as well as flying heads, both of which are still used in
today’s disk drives.

The IBM 1301 Disk Storage System
1962
Virtual memory emerged from a team under the direction of Tom Kilburn
at the University of Manchester. Virtual memory permitted a computer to
use its storage capacity to switch rapidly among multiple programs or
users and is a key requirement for timesharing.
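The mechanism at the heart of virtual memory can be sketched as a page-table lookup; the page size and mappings below are illustrative (the Atlas itself used 512-word pages), not taken from any real machine.

```python
PAGE_SIZE = 4096        # illustrative; the Manchester Atlas used 512-word pages

# Hypothetical page table: virtual page number -> physical frame number.
page_table = {0: 7, 1: 3, 2: 9}

def translate(vaddr):
    """Translate a virtual address to a physical one via the page table."""
    page, offset = divmod(vaddr, PAGE_SIZE)
    if page not in page_table:
        raise KeyError("page fault: page %d not resident" % page)
    return page_table[page] * PAGE_SIZE + offset
```

A miss in the table is a page fault: the operating system brings the missing page in from backing store and retries the access, which is what lets many programs share a small physical memory.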

Tom Kilburn in front of Manchester Atlas console

IBM 1311 Disk Storage Drive is announced. Announced on October 11,
1962, the IBM 1311 was the first disk drive IBM made with a removable
disk pack. Each pack weighed about ten pounds, held six disks, and had a
capacity of 2 million characters. The disks rotated at 1,500 RPM and
were accessed by a hydraulic actuator with one head per disk.
The 1311 offered some of the advantages of both tapes and disks.

Four Views of the IBM 1311 Including Removable Disk Pack

1967
IBM 1360 Photo-Digital Storage System is delivered. In 1967, IBM
delivered the first of its photo-digital storage systems to Lawrence
Livermore National Laboratory. The system could read and write up to a
trillion bits of information, the first such system in the world. The 1360
used thin strips of film which were developed with bit patterns via a
photographic developing system housed in the machine. The system used
sophisticated error correction and a pneumatic robot to move the film
strips to and from a storage unit. Only five were built.

The IBM Photo Digital Storage System, code-named Cypress

1971
An IBM team, originally led by David Noble, invented the 8-inch floppy
diskette. It was initially designed for use in loading microcode into the
controller for the "Merlin" (IBM 3330) disk pack file. It quickly won
widespread acceptance as a program and data-storage medium. Unlike
hard drives, a user could easily transfer a floppy in its protective jacket
from one drive to another.

IBM 23FD 8

1978
The 5 1/4" flexible disk drive and diskette were introduced by Shugart
Associates in 1976. This was the result of a request by Wang Laboratories
to produce a disk drive small enough to use with a desktop computer,
since 8" floppy drives were considered too large for that purpose. By
1978, more than 10 manufacturers were producing 5 1/4" floppy drives.

Original Shugart SA400 5 1/4" Floppy Disk Drive

1980
Seagate Technology created the first hard disk drive for microcomputers,
the ST506. The disk held 5 megabytes of data, five times as much as a
standard floppy disk, and fit in the space of a floppy disk drive. The hard
disk drive itself is a rigid metallic platter coated on both sides with a thin
layer of magnetic material that stores digital data.

Seagate Technology grew out of a 1979 conversation between Alan
Shugart and Finis Conner, who had worked together at IBM. The two men
decided to found the company after developing the idea of scaling down a
hard disk drive to the same size as the then-standard 5 1/4-inch floppies.

Upon releasing its first product, Seagate quickly drew such big-name
customers as Apple Computer and IBM. Within a few years, it had sold 4
million units.

Shugart ST506 5MB Hard Disk Drive

1981
The first optical data storage disk had 60 times the capacity of a
5 1/4-inch floppy disk. Developed by Philips, the disk stored data as
marks burned by a laser that could not be overwritten, making it useful
for storing large quantities of information that didn't need to change.
Two years later, Philips created an erasable optical disk using special
material, a laser, and magnetism to combine the capacity of an optical
disk with the convenience of an option to erase and rewrite: the
magneto-optical (MO) system. MO drives had a short life, as they were
replaced by rewritable CDs and DVDs.

Sony introduced and shipped the first 3 1/2" floppy drives and diskettes
in 1981. The first significant company to adopt the 3 1/2" floppy for
general use was Hewlett-Packard in 1982, an event which was critical in
establishing momentum for the format and which helped it prevail over
the other contenders for the microfloppy standard, including 3", 3 1/4",
and 3.9" formats.

Sony 3 1/2

1983
The Bernoulli Box is released. Using disks that included the read/write
head inside them, the Bernoulli Box was a special type of disk drive that
allowed people to move large files between computers when few
alternatives (such as a network) existed. Allowing for almost twenty
times the amount of storage afforded by a regular floppy disk, the
cartridges came in capacities ranging from 35MB to 230MB.
Original Bernoulli Box

1984
Able to hold 550 megabytes of prerecorded data, CD-ROMs grew out of
regular CDs on which music is recorded. The first general-interest CD-
ROM product released after Philips and Sony announced the CD-ROM in
1984 was "Grolier's Electronic Encyclopedia," which came out in 1985.
The 9 million words in the encyclopedia only took up 12 percent of the
available space. The same year, computer and electronics companies
worked together to set a standard for the disks so any computer would be
able to access the information.

1994
The Iomega Zip Disk is released. The Zip Disk was based on Iomega’s
Bernoulli Box system of file storage that used special removable disk
packs. Using the Bernoulli system as a model, the initial Zip system
allowed 100MB to be stored on a cartridge roughly the size of a 3 ½ inch
floppy disk. Later versions increased the capacity of a single disk from
100MB to 250MB, and eventually 750MB.

Early Zip Drive with Disks


The history of MS-DOS is surprisingly long. It started off as QDOS (Quick
and Dirty Operating System), developed by Tim Paterson at Seattle
Computer Products for its 8086-based machines; Microsoft purchased it
to supply an operating system for IBM's new PC. This list is fairly
comprehensive, although a number of the more obscure versions of DOS
have been omitted.

Version  Date    Comments

1.0      1981    The original version of MS-DOS, a renamed version of QDOS
                 which had been purchased by an upstart company called Microsoft.
1.25     1982    Added support for double-sided disks. Previously the disk had to
                 be turned over to use the other side.
2.0      1983    Added support for IBM's 10 MB hard disk, directories, and
                 double-density 5.25" floppy disks with capacities of 360 KB.
2.11     1983    Support for foreign and extended characters was added.
3.0      1984    Support for high-density (1.2 MB) floppy disks and 32 MB hard
                 disks was added.
3.1      1984    Network support was added.
3.3      1987    Written to take advantage of IBM's PS/2 computer range. Added
                 support for high-density 3.5" floppy disks, more than one
                 partition on a hard disk (allowing use of disks bigger than
                 32 MB), and code pages.
4.0      1988    Provided XMS support, support for hard-disk partitions up to
                 2 GB, and a graphical shell. It also contained a large number of
                 bugs, and many programs refused to run on it.
4.01     1989    The bugs in version 4.0 were fixed.
5.0      1991    A major upgrade. It allowed parts of DOS to load into the high
                 memory area, and certain device drivers and TSRs to run in the
                 unused parts of the upper memory area between 640K and 1024K.
                 It added support for IBM's new 2.88 MB floppy disks, an improved
                 BASIC interpreter and text editor, a disk cache, an undelete
                 utility, and a hard-disk partition-table backup program. After
                 the problems with MS-DOS 4, it also provided a utility to make
                 programs think they were running on a different version of
                 MS-DOS.
5.0a     1992/3  A minor bug fix that dealt with possibly catastrophic problems
                 in UNDELETE and CHKDSK.
6.0      1993    A catch-up with Novell's DR-DOS 6. It added a disk-compression
                 utility called DoubleSpace, a basic anti-virus program, and a
                 disk defragmenter. It also finally included a MOVE command, an
                 improved backup program (MSBACKUP), and multiple boot
                 configurations. Memory management was improved by the addition
                 of MEMMAKER. A number of older utilities, such as JOIN and
                 RECOVER, were removed. The DOS Shell was released separately, as
                 Microsoft felt that there were too many disks.
6.2      1993    Extra security was built into DoubleSpace following complaints
                 of data loss. A new disk checker, SCANDISK, was introduced, as
                 well as improvements to DISKCOPY and SmartDrive.
6.21     1993    Following legal action by Stac Electronics, Microsoft released
                 this version with DoubleSpace removed. It came with a voucher
                 for an alternative disk-compression program.
6.22     1994    Microsoft licensed a disk-compression package called DoubleDisk
                 from VertiSoft Systems and renamed it DriveSpace, which was
                 included in this version.
7.0      1995    Part of the original version of Windows 95. It provides support
                 for long filenames when Windows is running, but removes a large
                 number of utilities, some of which are on the Windows 95 CD in
                 the \other\oldmsdos directory.
7.1      1997    Part of OEM Service Release 2 and later of Windows 95. The main
                 change is support for FAT32 hard disks, a more efficient and
                 robust way of storing data on large drives.

Microsoft first began development of the Interface Manager (subsequently renamed
Microsoft Windows) in September 1981.
Although the first prototypes used Multiplan and Word-like menus at the bottom of the
screen, the interface was changed in 1982 to use pull-down menus and dialogs, as used
on the Xerox Star.
Microsoft finally announced Windows in November 1983, under pressure from the
just-released VisiOn and the impending TopView.
This was after the release of the Apple Lisa, and before Digital Research announced
GEM, and before Quarterdeck's DESQ, the Amiga Workbench,
GEOS/GeoWorks Ensemble, IBM OS/2, NeXTstep, or even
Tandy's DeskMate.
Windows promised an easy-to-use graphical interface, device-independent graphics and
multitasking support.
The development was delayed several times, however, and Windows 1.0 did not hit
store shelves until November 1985. The selection of applications was sparse, and
Windows sales were modest.

The Windows 1.0 package included: MS-DOS Executive, Calendar, Cardfile, Notepad,
Terminal, Calculator, Clock, Reversi, Control Panel, PIF (Program Information File)
Editor, Print Spooler, Clipboard, RAMDrive, Windows Write, and Windows Paint.

Windows 1.0

More photos of Windows 1.01 are available; thanks to Oliver Schade.

Windows 1.0 advertising:

"Windows will instantly deliver you a more productive present.
And a leap into the future."
Windows 2.0, introduced in the fall of 1987, provided
significant usability improvements to Windows. With the
addition of icons and overlapping windows, Windows
became a viable environment for the development of major
applications (such as Excel, Word for Windows, Corel
Draw!, Ami, PageMaker, and Micrografx Designer), and
sales were spurred by the runtime ("Single Application
Environment") versions supplied by independent
software vendors.
In late 1987 Microsoft released Windows/386. While it
was functionally equivalent to its sibling, Windows/286,
in running Windows applications, it provided the
capability to run multiple DOS applications
simultaneously in extended memory. (When Windows/386 was
released, Microsoft renamed Windows 2.0 to Windows/286
for consistency.)

Windows/286 >>

Windows 3.0, released in May 1990, was a complete overhaul of the Windows
environment. With the capability to address memory beyond 640K and a much more
powerful user interface, independent software vendors started developing Windows
applications with vigor. The powerful new applications helped Microsoft sell more than
10 million copies of Windows, making it the best-selling graphical user interface in the
history of computing.

Windows 3.1

Windows 3.1, released in April 1992, provided significant improvements to Windows
3.0. In its first two months on the market, it sold over 3 million copies, including
upgrades from Windows 3.0.
Windows 3.11 added no new features but corrected some existing, mostly
network-related problems. It replaced Windows 3.1 at the retail and OEM levels, and
the upgrade was available free from ftp.microsoft.com.

Windows for Workgroups 3.1, released in October 1992, was the
first integrated Windows and networking package offered by
Microsoft. It provided peer-to-peer file and printer sharing
capabilities highly integrated into the Windows environment. The
simple-to-use-and-install networking allowed the user to specify
which files on the user's machine should be made accessible to
others; those files could then be accessed from other machines
running either Windows or DOS.
Windows for Workgroups also included two additional applications:
Microsoft Mail, a network mail package, and Schedule+, a
workgroup scheduler.
In November 1993, Microsoft shipped Windows for Workgroups
3.11.

Windows NT 3.1, released in July 1993, was Microsoft's platform of choice for
high-end systems. It was intended for use in network servers, workstations, and
software development machines; it would not replace Windows for DOS. While Windows
NT's user interface is very similar to that of Windows 3.1, it is based on an entirely
new operating system kernel.
Windows NT 3.5, released in September 1994, provided OLE 2.0, improved performance,
and reduced memory requirements. Windows NT 3.5 Workstation replaced Windows NT 3.1,
while Windows NT 3.5 Server replaced the Windows NT 3.1 Advanced Server.
Windows NT 4.0 ("Cairo") was announced as Microsoft's project for object-oriented
Windows, and a successor to the "Daytona" release of Windows NT.

Windows 95, released in August 1995, was a 32-bit system
providing full pre-emptive multitasking, advanced file systems,
threading, networking, and more. It included MS-DOS 7.0, but took
over from DOS completely after starting. It also included a
completely revised user interface.

Windows 95/98 >>


Windows CE has the look and feel of Windows 95 and NT. Users familiar with either of
these operating systems are able to instantly use Handheld PCs and Palm-size PCs.
Windows CE 1.0 devices appeared in November 1996. Over the next year,
approximately 500,000 Handheld PC units were sold worldwide.
Windows CE 2.0 became available in early 1998. It addressed most of the problems
experienced by Windows CE 1.0 users and added features that made the operating
system more viable for corporate rather than home users.
Windows CE 3.0 became available on June 15, 2000: an embedded operating system
with comprehensive development tools (Platform Builder 3.0 and eMbedded Visual
Tools 3.0) that enable developers to build rich embedded devices that demand dynamic
applications and Internet services. Windows CE 3.0 combines the flexibility and
reliability of an embedded platform with the power of Windows and the Internet.

Windows 98, released in June of 1998. Integrated Web browsing
gives the desktop a browser-like interface: you 'browse' everything,
including files on the local computer. Its features included:
- Active Desktop: lets you set up your desktop as a personal web
  page, complete with links and any web content, plus active desktop
  items, such as a stock ticker, that update automatically.
- Internet Explorer 4.0: a new browser that supports HTML 4.0 and
  has an enhanced user interface.
- ACPI: supports the OnNow specs for better power management of PCs.
- FAT32 with conversion utility: enhanced, efficient support for
  larger hard drives, including a utility to convert a FAT16
  partition to FAT32.
- Multiple display support: can expand the desktop onto up to 8
  connected monitors.
- New hardware support: the latest technology such as DVD, FireWire,
  USB, and AGP.
- Win32 Driver Model: uses the same driver model as Windows NT 5.0.
- Disk Defragmenter Wizard: an enhanced hard-drive defragmenter to
  speed up access to files and applications.

Windows NT 5.0 will include a host of new features. Like Windows 98, it will integrate
Internet Explorer 4.0 into the operating system. This new interface will be matched up
with the Distributed File System, which Microsoft says will provide "a logical way to
organize and navigate the huge volume of information an enterprise assembles on
servers, independent of where the servers are physically located."
As of November 1998, NT 5.0 became known as Windows 2000, making NT a
"mainstream" operating system.
On February 17, 2000, Microsoft released Windows 2000, which
provides an impressive platform of Internet, intranet, extranet,
and management applications that integrate tightly with Active
Directory. You can set up virtual private networks (secure,
encrypted connections across the Internet) with your choice of
protocol. You can encrypt data on the network or on disk. You can
give users consistent access to the same files and objects from
any network-connected PC. You can use the Windows Installer to
distribute software to users over the LAN.
On September 14, 2000, Microsoft released Windows Me, short
for Millennium Edition, which is aimed at the home user. The Me
operating system boasts some enhanced multimedia features, such
as an automated video editor and improved Internet plumbing. But
unlike Microsoft's Windows 2000 OS, which offers advanced
security, reliability, and networking features, Windows Me is
basically just an upgrade to the DOS-based code on which
previous Windows versions have been built.

WINDOWS XP was officially launched by Microsoft on October 25, 2001.
XP is a whole new kind of Windows for consumers. Under the hood, it
contains the 32-bit kernel and driver set from Windows NT and
Windows 2000. Naturally it has tons of new features that no previous
version of Windows has, but it also doesn't ignore the past: old DOS
and Windows programs will still run, and may even run better.

XP comes in two flavors: Home and Professional. XP Home is a $99
upgrade ($199 for the full version) and Professional is a $199 upgrade
($299 for the full version). Recognizing that many homes have more
than one PC, Microsoft also plans to offer discounts of $8 to $12 off
the price of additional upgrades for home users (the Open Licensing
Program is still available for business or home users who need 5 or
more copies). That's fortunate because you'll need the additional
licenses since the Product Activation feature makes it all but
impossible to install a single copy on more than one PC.

WINDOWS XP
And in 2002 came LindowsOS SPX, the first "Broadband OS."

An operating system built to take full advantage of broadband technology.

LindowsOS SPX is designed to fully utilize the world of tomorrow, where Internet connectivity is
bountiful and cheap, and computers are ubiquitous. For tomorrow's computing needs, computer
users need a computing solution that's affordable and beneficial, a system where software is
digitally transmitted, easy to deploy and highly customizable. Computing needs to be effortless, so
people spend less time working on computers and more time having computers work for them;
LindowsOS SPX, the broadband operating system, does all of this.

LindowsOS SPX provides an advanced digital experience at an affordable price. Applications can
all be digitally downloaded and installed at the click of a mouse. Individual machines can be
customized quickly and easily.
LINDOWSOS BRIDGES THE GAP TO COMPUTER OWNERSHIP WITH MICROTEL PCs
AND WALMART.COM

Microtel Computer Systems, Pre-Installed with LindowsOS, to Cost Less Than $300 at
Walmart.com

SAN DIEGO –June 17, 2002 — Lindows.com, Inc., whose mantra has been “Bringing Choice to
Your Computer,” is now delivering on its promises of choice by partnering with Microtel
Computer Systems to ship Lindows.com’s Operating System, LindowsOS, pre-installed on their
personal computers. For less than $300, computer-buyers can take advantage of LindowsOS and
Microtel’s offering at Walmart.com (NYSE and PCX: WMT), thereby bringing computer
ownership closer to those with limited resources.

LindowsOS, a Linux®-based operating system, formerly only available to Lindows.com
Insiders (www.lindows.com/signup), is now publicly available for the first time on
Microtel PCs.
Brian Kernighan, Dennis Ritchie, Ken Thompson, Linus Torvalds, Richard Stallman,
Bill Joy, Steve Jobs

Computer History Images

Charles Babbage designed the first
computer, starting in 1823. Though not completed until 1991, his Difference Engine
worked. Ada King, Countess of Lovelace and daughter of Lord Byron, wrote programs
for Babbage's proposed Analytical Engine, thus becoming the world's first programmer.

The ENIAC was the first successful electronic
digital computer. The fiftieth anniversary of ENIAC is fast approaching.

The IBM SSEC is something I know nothing about.

The IBM 360 was a revolutionary advance in computer system
architecture, enabling a family of computers covering a wide range of price and
performance.

The LGP-30 was built by Librascope, a division of General Precision, in the mid-1950s.
It was implemented with vacuum tubes and drum memory, and used a Flexowriter for
I/O. The instructions had three addresses: two for the operands and one for the next
instruction.

Digital Equipment Corporation's first computer was the PDP-1.

Spacewar! was the first video game; it was written by Steve "Slug" Russell at
MIT in 1961-62.

The PDP-6 was DEC's first 36-bit computer.

The PDP-8 was the world's first minicomputer, priced at an amazingly low
$18,000.

The DEC PDP-11 was a wildly successful minicomputer.

The DEC VAX 11/780 brought mainframe capability to the minicomputer
market.

HISTORY OF ELECTRONIC AND COMPUTER MUSIC
INCLUDING AUTOMATIC INSTRUMENTS AND COMPOSITION MACHINES

compiled and annotated by Dr. Kristine H. Burns

3rd century BC. The Hydraulis was invented by Ktesibios sometime in the
third century B.C. Ktesibios, the son of a Greek barber, was fascinated by
pneumatics and wrote an early treatise on the use of hydraulic systems for
powering mechanical devices. His most famous invention, the Hydraulis, used
water to regulate the air pressure inside an organ. A small cistern called the
pnigeus was turned upside down and placed inside a barrel of water. A set of
pumps forced air into the pnigeus, forming an air reservoir, and that air was
channeled up into the organ's action.

Greek Aeolian harp. This may be considered the first automatic instrument. It
was named for Aeolus, the Greek god of the wind. The instrument had two
bridges over which the strings passed. The instrument was placed in a window
where air current would pass, and the strings were activated by the wind current.
Rather than being of different lengths, the strings were all the same length and
tuned to the same pitch, but because of different string thicknesses, varying
pitches could be produced.

6th-5th centuries BC, Pythagoras discovered numerical ratios corresponding to
intervals of the musical scale. He associated these ratios with what he called
the "harmony of the spheres."

890 AD The Banu Musa brothers wrote an organ-building treatise; this was the
first written documentation of an automatic instrument.

ca. 995-1050, Guido of Arezzo, a composer, developed an early form of
solmization that used a system of mnemonics to learn "unknown songs." The
method involved the assignment of alphabetic representations, syllables, to
varying joints of the human hand. This system of mnemonics was apparently
adapted from a technique used by almanac makers of the time.

1400s The hurdy-gurdy, an organ-grinder-like instrument, was developed.

Isorhythmic motets were developed. These songs made use of patterns of
rhythms and pitches to define the composition. Composers like Machaut (14th
century), Dufay, and Dunstable (15th century) composed isorhythmic motets.
The duration and melody patterns, the talea and the color respectively, were not
of identical length, and the music developed from the different permutations of
pitch and rhythmic values. So if there were 5 durations and 7 pitches, the
pitches were lined up with the durations, and whatever pitches were left over
moved to the first duration values. The composer would permute through all
pitches and durations before the original pattern would begin again.
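The talea/color rotation described above can be sketched directly; the specific duration and pitch values below are invented for illustration, and the realignment point falls at the least common multiple of the two pattern lengths.

```python
from itertools import cycle, islice
from math import lcm

talea = [4, 2, 2, 1, 1]                      # 5 rhythmic values (assumed)
color = ["D", "E", "F", "G", "A", "B", "C"]  # 7 pitches (assumed)

# Pair pitches with durations, letting "leftover" pitches wrap onto the
# start of the talea; the original alignment returns after lcm(5, 7) notes.
notes = list(islice(zip(cycle(color), cycle(talea)),
                    lcm(len(talea), len(color))))
```

After 35 notes both cycles realign, exactly as the text describes the original pattern beginning again.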

Soggetto cavato, a technique of mapping letters of the alphabet into pitches, was
developed. This technique was used in Josquin's Mass based on the name of
Hercules, Duke of Ferrara. One application of soggetto cavato would be to take
the vowels in Hercules as follows: e=re=D; u=ut=C (in the solfege
system of do, re, mi, fa, etc., ut was the original do syllable); e=re=D. This pattern
of vowel-mapping could continue for first and last names, as well as towns and
cities.
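The vowel mapping can be sketched as a small lookup; only e=re=D and u=ut=C come from the text above, while the entries for a, i, and o follow the usual Guidonian solfege vowels and are my assumption.

```python
# Vowel -> (solfege syllable, pitch). Only 'e' and 'u' are given in the
# text above; 'a', 'i', and 'o' follow the usual Guidonian vowels (assumed).
SOLFEGE = {"u": ("ut", "C"), "e": ("re", "D"), "i": ("mi", "E"),
           "a": ("fa", "F"), "o": ("sol", "G")}

def soggetto_cavato(name):
    """Carve a melodic subject out of the vowels of a name."""
    return [SOLFEGE[ch] for ch in name.lower() if ch in SOLFEGE]
```

Applied to "Hercules," the vowels e, u, e yield re-ut-re (D-C-D), matching the pattern described above.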
1500s The first mechanically driven organs were built; water organs called
hydraulis were in existence.

Don Nicola Vicentino (1511-1572), Italian composer and theorist, invented the
Archicembalo, a harpsichord-like instrument with six keyboards and thirty-one
steps to an octave.

1600s Athanasius Kircher described in his book Musurgia Universalis (1650) a
mechanical device that composed music, called the Arca Musarithmica. He used
number and arithmetic relationships to represent scale, rhythm, and tempo
relations.

1624 English philosopher and essayist, Francis Bacon wrote about a scientific
utopia in the New Atlantis. He stated "we have sound-houses, where we practice
and demonstrate all sounds, and their generation. We have harmonies which you
have not, of quarter-sounds, and less slides of sounds."

1641 Blaise Pascal developed the first calculating machine.

1644 The Nouvelle invention de lever, a hydraulic engine, produced musical
sounds.

1738 Mechanical singing birds and barrel organs were in existence.

The Industrial Revolution flourished. There were attempts to harness steam
power for mechanical computation machines.

1761 Abbe Delaborde constructed a Clavecin Electrique, Paris, France.

Benjamin Franklin perfected the Glass Harmonica.

Maelzel, inventor of the metronome and friend of Beethoven, invented the
Panharmonicon, a mechanical instrument that imitated an orchestra.

1787 Mozart composed the Musikalisches Wurfelspiel (Musical Dice Game). This
composition was a series of precomposed measures arranged into random
eight-bar phrases. Each throw of a pair of dice selected an individual measure,
so after eight throws the first phrase was determined.
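The dice-game procedure can be sketched as follows; the measure labels in the lookup tables are invented for illustration, whereas Mozart's real tables list actual measure numbers from the score.

```python
import random

# Hypothetical measure tables: for each of the 8 positions in a phrase,
# every dice total from 2 to 12 selects one precomposed measure.
measure_tables = [{total: "m%d.%d" % (pos, total) for total in range(2, 13)}
                  for pos in range(8)]

def roll_phrase(tables, rng=random):
    """One throw of two dice per position; eight throws build a phrase."""
    return [tables[pos][rng.randint(1, 6) + rng.randint(1, 6)]
            for pos in range(len(tables))]

phrase = roll_phrase(measure_tables)   # eight chosen measures = one phrase
```

With eleven choices at each of eight positions, the scheme yields a very large space of possible phrases from a small set of precomposed measures.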

1796 Carillons, "a sliver of steel, shaped, polished, tempered and then screwed
into position so that the projections on a rotating cylinder could pluck at its free
extremity," were invented.

1830 Robert Schumann composed the Abegg Variations, op. 1. This composition
was named for one of his girlfriends. The principal theme is based on the letters
of her name, A-B-E-G-G; this was a later application of the soggetto cavato
technique.

1832 Samuel Morse invented the telegraph.

1833-34 Charles Babbage, a British scientist, built a portion of his Difference
Engine, a large mechanical computer. In 1834 he conceived the Analytical Engine, a
machine that was never realized. Ada Lovelace, daughter of Lord Byron, assisted
in the documentation of these fantastic devices.

1835 Schumann composed the Carnaval pieces, op. 9 , twenty-one short pieces
for piano. Each piece is based on a different character.

1850 D.D. Parmelee patented the first key-driven adding machine.

1859 David E. Hughes invented a typewriting telegraph utilizing a piano-like
keyboard to activate the mechanism.

1863 Hermann Helmholtz wrote the book On the Sensations of Tone as a
Physiological Basis for the Theory of Music. Historically, this book was one of the
foundations of modern acoustics (it completed the earlier work of Joseph
Sauveur).

1867 Hipps invented the Electromechanical Piano in Neuchatel, Switzerland. He
was the director of the telegraph factory there.

1876 Elisha Gray (an inventor of a telephone, along with Bell) invented the
Electroharmonic or Electromusical Piano; this instrument transmitted musical
tones over wires.

Koenig's Tonametric was invented. This instrument divided four octaves into 670
equal parts--this was an early instrument that made use of microtuning.

1877 Thomas Edison (1847-1931) invented the phonograph. To record, an
indentation was made on a moving strip of paraffin-coated paper tape by means
of a diaphragm with an attached needle. This mechanism eventually led to a
continuously grooved, revolving metal cylinder wrapped in tin foil.

Emile Berliner (1851-1929) developed and patented the cylindrical and disc
phonograph system, simultaneously with Edison.

Dorr E. Felt perfected a calculator with key-driven ratchet wheels that could
be moved by one or more teeth at a time.
1880 Alexander Graham Bell (1847-1922) financed his own laboratory in
Washington, D.C. Together with Charles S. Tainter, Bell devised and patented
several means for transmitting and recording sound.

1895 Julian Carrillo developed theories of microtones, including a 96-tone
scale, and constructed instruments to reproduce divisions as small as a
sixteenth tone. He demonstrated his instruments in New York in 1926. The
instruments included an Octavina for eighth tones and an Arpa Citera for
sixteenth tones. There are several recordings of Carrillo's music, especially
the string quartets.

1897 E.S. Votey invented the Pianola, an instrument that used a pre-punched,
perforated paper roll moved over a capillary bridge. The holes in the paper
corresponded to 88 openings in the board.

1898 Valdemar Poulsen (1869-1942) patented his "Telegraphone," the first magnetic recording machine.

1906 Thaddeus Cahill invented the Dynamophone, a machine that produced music from alternating currents generated by dynamos; it was the first additive synthesis device. The Dynamophone was also known as the Telharmonium. The instrument weighed over 200 tons and was designed to transmit sound over telephone wires; however, the wires were too delicate for all the signals. You can think of Cahill as the 'Father of Muzak.' The generators produced pure tones of various frequencies and intensities; volume control supplied dynamics. Articles appeared in McClure's Magazine that stated "democracy in music...the musician uses keys and stops to build up voices of flute or clarinet, as the artist uses his brushes for mixing color to obtain a certain hue...it may revolutionize our musical art..."

Lee De Forest (1873-1961) invented the Triode or Audion tube, the first vacuum
tube.

1907 Ferruccio Busoni (1866-1924) believed that the current musical system was severely limited, going so far as to declare instrumental music dead. His treatise on aesthetics, Sketch of a New Esthetic of Music, discussed the future of music.

1910 The first radio broadcast took place in New York City (the first licensed radio station followed in 1920).

1912 The Italian Futurist movement, founded by the poet Filippo Marinetti, took up music through the painter Luigi Russolo (1885-1947) and the composer Francesco Balilla Pratella, author of the manifesto Musica Futurista. The Futurist creed was "To present the musical soul of the masses, of the great factories, of the railways, of the transatlantic liners, of the battleships, of the automobiles and airplanes. To add to the great central themes of the musical poem the domain of the machines and the victorious kingdom of Electricity."
Henry Cowell (1897-1965) introduced tone clusters in piano music. The Banshee
and Aeolian Harp are good examples.

1914 The first concert of Futurist music took place. The "art of noises" concert
was presented by Marinetti and Russolo in Milan, Italy.

1920 Lev (Leon) Theremin, Russia, invented the Aetherophone (later called the Theremin or Thereminovox). The instrument used two vacuum-tube oscillators to produce beat notes: musical sounds were created by "heterodyning," mixing a fixed oscillator with a variable one whose frequency was altered by changing the capacitance between two elements. A vertical radio antenna controlled pitch, while a metal loop extending from the side controlled dynamics. The performer moved one hand in proximity to the antenna to change pitch while shaping the volume with the other hand near the loop. Many composers used this instrument, including Varese.
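The heterodyne principle behind the instrument can be sketched numerically: two radio-frequency oscillators are mixed, and the audible note is their difference frequency. The oscillator frequencies below are invented for illustration.

```python
# Sketch of heterodyning: mixing two radio-frequency oscillators
# yields an audible beat note at their difference frequency.
FIXED_HZ = 170_000          # fixed RF oscillator (illustrative value)

def audible(variable_hz):
    """Difference (beat) frequency produced by mixing the oscillators."""
    return abs(FIXED_HZ - variable_hz)

# Moving a hand nearer the pitch antenna detunes the variable
# oscillator, raising the beat note:
print(audible(170_440))     # 440 Hz, a concert A
```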

1922 Darius Milhaud (b. 1892) experimented with vocal transformation by phonograph speed changes.

Ottorino Respighi (1879-1936) called for a phonograph recording of nightingales in his Pini di Roma (Pines of Rome).

1926 Jorg Mager built an electronic instrument, the Spharophon, first presented at the Donaueschingen Festival (Georgy Rimsky-Korsakov, grandson of the composer, wrote some experimental works for this instrument). Mager later developed a Partiturophon and a Kaleidophon, both used in theatrical productions. All of these instruments were destroyed in W.W.II.

George Antheil (1900-1959) composed Ballet Mechanique. Antheil was an expatriate American living in France. The work was scored for pianos, xylophones, pianola, doorbells, and an airplane propeller.

1928 Maurice Martenot (1898-1980, France) built the Ondes Martenot (first called the Ondes Musicales). The instrument used the same basic idea as the Theremin, but instead of a radio antenna it used a moveable electrode to produce variations in capacitance: the performer wore a ring, attached to a wire, that passed along the keyboard. The instrument used subtractive synthesis. Composers such as Honegger, Messiaen, Milhaud, Dutilleux, and Varese all composed for the instrument.

Friedrich Trautwein (1888-1956, Germany) built the Trautonium. Composers such as Hindemith, Richard Strauss, and Varese wrote for it, although no recordings can be found.

1929 Laurens Hammond (b. 1895, USA) built instruments such as the Hammond Organ, Novachord, Solovox, and reverb devices in the United States. The Hammond Organ used 91 rotary electromagnetic disk generators driven by a synchronous motor with associated gears and tone wheels. It used additive synthesis.

1931 Ruth Crawford Seeger's String Quartet 1931 was composed. This is one of
the first works to employ extended serialism, a systematic organization of pitch,
rhythm, dynamics, and articulation.

Henry Cowell worked with Leon Theremin to build the Rhythmicon, an instrument
which could play metrical combinations of virtually unlimited complexity. With this
instrument Cowell composed the Rhythmicana Concerto.

Jorg Mager (Germany) was commissioned to create electronic bell sounds for
the Bayreuth production of Parsifal

1935 Allgemeine Elektrizitats-Gesellschaft (AEG) built and demonstrated the first Magnetophon (tape recorder).

1937 Orson Welles directed the "War of the Worlds" radio broadcast (first aired in 1938). Welles is often credited with bringing radio's fade and dissolve techniques to film sound in "Citizen Kane"; until then, most film directors had used blunt splices instead.

Electrochord (the electroacoustic piano) was built.

1938 Novachord built.

1939 Stream of consciousness films came about.

John Cage (1912-1992) began experimenting with indeterminacy. In his composition Imaginary Landscape No. 1, multiple performers are asked to perform on multiple record players, changing the variable speed settings.

1930s Plastic audio tape was developed.

The Sonorous Cross (an instrument like a Theremin) was built.

1941 Joseph Schillinger wrote The Schillinger System of Musical Composition. This book offered prescriptions for composition--rhythms, pitches, harmonies, etc. Schillinger's students included George Gershwin and Glenn Miller.

The Ondioline was built.

1944 Percy Grainger and Burnett Cross patented a machine that "freed" music from the constraints of conventional tuning systems and from the rhythmic inadequacies of human performers. This mechanical invention for composing "Free Music" used eight oscillators and synchronizing equipment in conjunction with photo-sensitive graph paper, with the intention that the projected notation could be converted directly into sound.

1947 Bell Labs developed and produced the solid state transistor.

Milton Babbitt's Three Compositions for Piano serialized all aspects of pitch,
rhythm, dynamics, and articulation.

The Solovox and the Clavioline were built.

1948 John Scott Trotter built a composition machine for popular music.

Hugh LeCaine (Canada) built the Electronic Sackbut, an instrument that actually sounded like a cello.

Pierre Schaeffer (b. 1910), a sound technician working at Radiodiffusion-Television Francaise (RTF) in Paris, produced several short studies in what he called musique concrete. In October 1948, Schaeffer's early studies were broadcast in a "concert of noises."

Joseph Schillinger wrote The Mathematical Basis of the Arts.

1949 Pierre Schaeffer and engineer Jacques Poullin worked on experiments in sound which they titled "musique concrete." In 1949-50, Schaeffer and Pierre Henry (b. 1927), along with Poullin, composed Symphonie pour un homme seul (Symphony for a Man Alone); the work premiered March 18, 1950.

Olivier Messiaen composed his Mode de valeurs et d'intensites (Mode of Durations and Intensities), a piano composition that "established 'scales' not only of pitch but also of duration, loudness, and attack."

The Melochord was invented by H. Bode.

1940s The following instruments were built: the Electronium Pi (actually used by a few German composers, including Brehme, Degen, and Jacobi), the Multimonica, the Polychord organ, the Tuttivox, the Marshall organ, and other small electric organs.

1950 The Milan Studio was established by Luciano Berio (b. 1925, Italy).

1951 onward Clara Rockmore performed on the Theremin in worldwide concerts.

Variations on a Door and a Sigh was composed by Pierre Henry.


The RTF studio was formally established as the Groupe de Musique Concrete, and the group opened itself to other composers, including Messiaen and his pupils Pierre Boulez, Karlheinz Stockhausen, and Jean Barraque. Boulez and Stockhausen left soon after because Schaeffer was not interested in using electronically-generated sounds, preferring to base everything on recordings.

John Cage's use of indeterminacy culminated with Music of Changes, a work based on the charts from the I Ching, the Chinese book of oracles.

Structures, Book Ia was one of Pierre Boulez' earliest attempts at employing a small amount of musical material, called cells (whether for use as pitches, durations, dynamics, or attack points), in a highly serialized structure.

1951-53 Eimert and Beyer (b. 1901) produced the first compositions using electronically-generated pitches. (A separate device of this period, built by Olson and Belar at RCA, mechanically produced melodies based on Markov analysis of Stephen Foster tunes.)
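The Markov idea mentioned above can be sketched as follows: tabulate note-to-note transitions from sample melodies, then walk the table to generate new tunes. The training melody here is an invented toy sequence, not an actual Foster song.

```python
import random
from collections import defaultdict

# Toy training melody (invented for illustration).
melody = ["C", "D", "E", "D", "C", "D", "E", "G", "E", "D", "C"]

# First-order Markov table: for each note, the notes that followed it.
transitions = defaultdict(list)
for a, b in zip(melody, melody[1:]):
    transitions[a].append(b)

def generate(start, length, rng):
    """Random walk through the transition table."""
    out = [start]
    for _ in range(length - 1):
        out.append(rng.choice(transitions[out[-1]]))
    return out

print(generate("C", 8, random.Random(0)))
```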

1952 The Cologne station of Nordwestdeutscher Rundfunk (later Westdeutscher Rundfunk) was founded by Herbert Eimert. He was soon joined by Stockhausen, and they set out to create what they called Elektronische Musik.

John Cage's 4'33" was composed. The composer was trying to liberate the performer and the composer from having to make any conscious decisions; therefore, the only sounds in this piece are those produced by the audience.

1953 Robert Beyer, Werner Meyer-Eppler (b. 1913), and Eimert began experimenting with electronically-generated sounds. Eimert and Meyer-Eppler taught at the Darmstadt Summer School (Germany) and gave presentations in Paris as well.

Louis and Bebe Barron set up a private studio in New York and provided electronic sound scores for films such as Bells of Atlantis and the sci-fi classic Forbidden Planet (1956).

Otto Luening (b. 1900, USA; d. 1996, USA) and Vladimir Ussachevsky (b. 1911, Manchuria; d. 1990, USA) presented their first concert at the Museum of Modern Art in New York on October 28. The program included Ussachevsky's Sonic Contours (created from piano recordings) and Luening's Fantasy in Space (using flute recordings). Following the concert, they were asked to appear on the Today Show with Dave Garroway. Musicians Local 802 raised a fuss because Luening and Ussachevsky were not members of the musicians' union.

1953-4 Karlheinz Stockhausen (b. 1928) used Helmholtz' research as the basis
of his Studie I and Studie II. He tried to build increasingly complex synthesized
sounds from simple pure frequencies (sine waves).
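The additive approach described above can be sketched as follows: a complex tone is built by summing sine-wave partials. The sample rate, partial frequencies, and amplitudes below are arbitrary illustrations, not values from the Studien.

```python
import math

SR = 8000  # sample rate in Hz (illustrative)

def additive(partials, seconds):
    """Sum sine waves; partials is a list of (frequency_hz, amplitude) pairs."""
    n = int(SR * seconds)
    return [
        sum(a * math.sin(2 * math.pi * f * t / SR) for f, a in partials)
        for t in range(n)
    ]

# A complex tone from three pure frequencies:
tone = additive([(200, 1.0), (500, 0.5), (1100, 0.25)], 0.01)
print(len(tone))  # 80 samples
```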
1954 The Cologne Radio Series "Music of Our Time" (October 19) used only
electronically-generated sounds by Stockhausen, Eimert, Pousseur, etc. The
pieces used strict serial techniques.

Dripsody was composed by Hugh LeCaine. The single sound source for this
concrete piece is a drip of water.

1955 Harry Olson and Herbert Belar, both working for RCA, invented the Electronic Music Synthesizer, aka the Olson-Belar Sound Synthesizer. This synth used sawtooth waves that were filtered to produce other timbres. The user programmed the synthesizer with a typewriter-like keyboard that punched commands into a 40-channel paper tape using binary code.
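The idea of programming a synthesizer with binary punched tape can be sketched as follows. The 4-bit pitch codes used here are invented for illustration and do not reflect the actual Olson-Belar 40-channel tape format.

```python
# Hypothetical 4-bit pitch codes (invented for illustration).
PITCH_CODE = {"C": 0b0001, "D": 0b0010, "E": 0b0011,
              "F": 0b0100, "G": 0b0101}

def punch(melody):
    """One 4-bit row of tape per note; a '1' stands for a punched hole."""
    return [format(PITCH_CODE[n], "04b") for n in melody]

print(punch(["C", "E", "G"]))  # ['0001', '0011', '0101']
```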

The Columbia-Princeton Studio started, with its beginnings mostly in the living
room of Ussachevsky and then the apartment of Luening.

Lejaren Hiller (1924-92) and Leonard Isaacson, of the University of Illinois, composed the Illiac Suite for string quartet, the first piece of computer-generated music. The piece was so named because it was composed using the ILLIAC computer at the University of Illinois.

1955-56 Karlheinz Stockhausen composed Gesang der Junglinge. This work used both concrete recordings of boys' voices and synthesized sounds. The original version was composed for five loudspeakers but was eventually reduced to four. The text is from the Benedicite (O all ye works of the Lord, bless ye the Lord), which appears in the Book of Daniel as the canticle sung by the three young men consigned to the fiery furnace by Nebuchadnezzar.

1956 Martin Klein and Douglas Bolitho used a Datatron computer called Push-
Button Bertha to compose music. This computer was used to compose popular
tunes; the tunes were derived from random numerical data that was sieved, or
mapped, into a preset tonal scheme.

Tokyo at Japanese Radio, an electronic studio established.

Luening and Ussachevsky wrote incidental music for Orson Welles' King Lear, City Center, New York.

1957 Of Wood and Brass was composed by Luening. Sound sources included
trumpets, trombones and marimbas.

Scambi, composed by Henri Pousseur, was created at the Milan Studio, Italy.

Warsaw at Polish Radio, an electronic studio established.

Munich, the Siemens Company, an electronic studio established.


Eindhoven, the Philips Company, an electronic studio established.

David Seville created the Chipmunks by playing recordings of human voices at double speed. Such electronic manipulation was rarely used again in rock for about ten years.

1958 Edgard Varese (1883-1965) composed Poeme Electronique for the World's
Fair, Brussels. The work was composed for the Philips Pavilion, a building
designed by the famous architect, Le Corbusier who was assisted by Iannis
Xenakis (who later became well-known as a composer rather than an architect).
The work was performed on ca. 425 loudspeakers, and was accompanied by
projected images. This was truly one of the first large-scale multimedia
productions.

Iannis Xenakis (b.1922) composed Concret PH. This work was also composed
for the Brussels World's Fair. It made use of a single sound source: amplified
burning charcoal.

Max Mathews, of Bell Laboratories, generated music by computers.

John Cage composed Fontana Mix at the Milan Studio.

London, BBC Radiophonic Workshop, an electronic studio established.

Stockholm, Swedish Radio, an electronic studio established.

The Studio for Experimental Music at the University of Illinois was established, directed by Lejaren Hiller.

Pierre Henry left the Groupe de Musique Concrete; the group reorganized as the Groupe de Recherches Musicales (GRM).

Gordon Mumma and Robert Ashley founded the Cooperative Studio for Electronic Music, Ann Arbor, MI (University of Michigan).

Luciano Berio composed Thema (Omaggio a Joyce). The sound source is a woman reading from Joyce's Ulysses.

1958-60 Stockhausen composed Kontakte (Contacts) for four-channel tape. There was a second version for piano, percussion and tape.

1958-9 Mauricio Kagel, an Argentine composer, composed Transicion II, the first piece to call for a live tape recorder as part of a performance. The work was realized in Cologne. Two musicians perform on a piano, one in the traditional manner, the other playing on the strings and wood. Two other performers use tape recorders, so that the work unites its present (live sounds) with its future (pre-recorded materials played back later) and its past (recordings made earlier in the performance).

Max Mathews, at Bell Labs, began experimenting with computer programs to create sound material. Mathews and Joan Miller, also at Bell Labs, wrote MUSIC4, the first widespread computer sound synthesis program. Versions I through III were experimental versions written in assembly language; Music IV and Music V were written in FORTRAN. MUSIC4 did not allow reentrant instruments (the same instrument becoming active again while it is already active); MUSIC5 added this. MUSIC4 required as many different instruments as the thickest chord, while MUSIC5 allowed a score to refer to an instrument as a template, which could then be called upon as many times as was necessary.
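The template idea described above can be sketched as a loose analogy in Python (this is not actual MUSIC V syntax): a single instrument definition is instantiated once per sounding note, so a chord needs one definition rather than one copy per voice.

```python
import math

def make_sine_instrument(sample_rate=8000):
    """Instrument template: each call to the returned function
    renders an independent note, like a MUSIC V instrument copy."""
    def play(freq, seconds, amp=1.0):
        n = int(sample_rate * seconds)
        return [amp * math.sin(2 * math.pi * freq * t / sample_rate)
                for t in range(n)]
    return play

instr = make_sine_instrument()

# Three simultaneous notes (a triad) from the one template,
# mixed by summing sample-by-sample:
chord = [sum(samples) for samples in zip(
    instr(261.63, 0.01), instr(329.63, 0.01), instr(392.00, 0.01))]
print(len(chord))  # 80 samples
```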

The Columbia-Princeton Electronic Music Center was formally established in January 1959, under the direction of Luening and Ussachevsky of Columbia and Milton Babbitt and Roger Sessions of Princeton. The group had applied through the Rockefeller Foundation, suggesting the creation of a University Council for Electronic Music and asking for technical assistants, electronic equipment, space, and materials to be made available to other composers free of charge; a grant of $175,000 over five years was made to Columbia and Princeton Universities.

The RCA Mark II synthesizer was built at the Columbia-Princeton Electronic Music Center (the original version was built for the artificial creation of human speech). The Mark II contained oscillators and noise generators. The operator had to give the synthesizer instructions on a punched paper roll to control pitch, volume, duration and timbre. The synth used a conventional equal-tempered twelve-note scale.

1960 Composers of more traditional orchestral music began to rebel, and many tried to get quasi-electronic sounds out of traditional instruments. Bruno Bartolozzi wrote a new book on extended instrumental techniques.

Morton Subotnick, Pauline Oliveros, and Ramon Sender established the San
Francisco Tape Music Center.

John Cage composed Cartridge Music, an indeterminate score for several performers applying gramophone cartridges and contact mics to various objects.

1961 The first electronic music concerts at the Columbia-Princeton Studio were
held; the music was received with much hostility from other faculty members.

Varese finally completed Deserts at the Columbia-Princeton Studio.

Fortran-based Music IV was used in the generation of "Bicycle Built for Two"
(Mathews).
Integrated circuits went into production; very large scale integration (VLSI) would follow in later decades.

Robert Moog met Herbert Deutsch, and together they created a voltage-
controlled synthesizer.

Luciano Berio composed Visage. This radio composition is based on the idea of
non-verbal communication. There are many word-like passages, but only one
word is spoken during the entire composition (actually heard twice), parole
(Italian for 'word'). Cathy Berberian, the composer's wife, was the performer.

James Tenney wrote the theoretical work Meta+Hodos in 1961 (META Meta+Hodos followed in 1975).

1962 Bell Labs mass-produced transistors, professional amplifiers, and power supplies.

PLF 2 was developed by James Tenney. This computer program was used to
write Four Stochastic Studies, Ergodos and others.

Iannis Xenakis composed Bohor for eight tracks of sound.

Milton Babbitt composed Ensembles for Synthesizer (1962-64) at the Columbia-Princeton Studio.

At the University of Illinois, Kenneth Gaburo composed Antiphony III, for chorus
and tape.

Paul Ketoff built the Synket. This synthesizer was built for composer John Eaton and was designed specifically as a live performance instrument.

1963 Lejaren Hiller and Robert Baker composed the Computer Cantata.

Babbitt composed Philomel at the Columbia-Princeton Studio. The story concerns Philomel, a woman without a tongue, who is transformed into a nightingale (based on a story by Ovid).

Mario Davidovsky composed Synchronism I for flute and tape. Davidovsky has since written many "Synchronisms" pieces, all for live instrument(s) and tape; they explore the synchronizing of events between the live part and the tape.

1964 The fully developed Moog was released. The modular idea came from the
miniaturization of electronics.

Gottfried Michael Koenig used PR-1 (Project 1), a computer program written in Fortran and implemented on an IBM 7090 computer. The purpose of the program was to provide data to calculate structure in musical composition; it was written to perform algorithmic serial operations on incoming data. The second version of PR-1 was completed in 1965.

Karlheinz Stockhausen composed Mikrophonie I, a piece that required six musicians to generate. Two performers play a large tam-tam, two others move microphones around the instrument to pick up different timbres, and the final two control the electronic processing.

Ilhan Mimaroglu, a Turkish-American composer, wrote Bowery Bum. This concrete composition used a rubber band as its single sound source and was based on a painting by Dubuffet.

1965 Hi-fi gear is commercially produced.

The first commercially-available Moog.

Varese died.

Karlheinz Stockhausen composed Solo. The composition used a tape recorder with moveable heads to produce variations in the delay between recording and playback, with live manipulation during performance.

Karlheinz Stockhausen composed Mikrophonie II for choir, Hammond organ, electronics and tape.

Steve Reich composed It's Gonna Rain. This is one of the first phase pieces.
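The phasing process behind these pieces can be sketched numerically: two copies of the same tape loop run at slightly different speeds, drift apart, and eventually realign. The loop durations below are invented for illustration.

```python
# Two machines play the same loop at slightly different speeds.
loop_a = 2.00   # seconds per pass of the loop on machine A
loop_b = 2.02   # machine B runs slightly slower (illustrative values)

# The loops realign when machine A has completed exactly one more
# pass than machine B: solve t/loop_a - t/loop_b = 1 for t.
realign = loop_a * loop_b / (loop_b - loop_a)
print(round(realign))   # 202 seconds until the loops line up again
```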

1966 The Moog Quartet offered world-wide concerts of (mainly) parlor music.

Herbert Brun composed Non Sequitur VI.

Steve Reich composed Come Out, another phase piece.

1967 Walter Carlos (later Wendy) composed Switched-On Bach (released 1968) using a Moog synthesizer.

Iannis Xenakis wrote Musiques Formelles (Formalized Music). The first discussion of granular synthesis, the clouds and grains of sound, is presented in this book.
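The grains-and-clouds idea can be sketched as follows: a sound is assembled from many tiny enveloped sine bursts, each with its own onset time and frequency. All parameters below are invented for illustration.

```python
import math
import random

SR = 8000  # sample rate in Hz (illustrative)

def grain(freq, dur):
    """One grain: a short sine burst shaped by a triangular envelope."""
    n = int(SR * dur)
    return [math.sin(2 * math.pi * freq * t / SR) *
            (1 - abs(2 * t / (n - 1) - 1)) for t in range(n)]

def cloud(n_grains, total_seconds, rng):
    """Scatter n_grains 20 ms grains at random times and frequencies."""
    out = [0.0] * int(SR * total_seconds)
    for _ in range(n_grains):
        g = grain(rng.uniform(200, 2000), 0.02)
        start = rng.randrange(len(out) - len(g))
        for i, s in enumerate(g):
            out[start + i] += s
    return out

samples = cloud(50, 1.0, random.Random(1))
print(len(samples))  # 8000 samples: one second of cloud
```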

Leon Kirchner composed String Quartet No. 3, the first piece with electronics to win the Pulitzer Prize.

Kenneth Gaburo composed Antiphony IV, a work for trombone, piccolo, choir and
tape.
Morton Subotnick composed Silver Apples of the Moon (title from Yeats), the first
work commissioned specifically for the recorded medium.

The Grateful Dead released Anthem of the Sun and Frank Zappa and the
Mothers of Invention released Uncle Meat. Both albums made extensive use of
electronic manipulation.

1968 Lejaren Hiller and John Cage composed HPSCHD.

Morton Subotnick composed The Wild Bull.

Hugh Davies compiled an international catalogue of electronic music.

1969 Terry Riley composed A Rainbow in Curved Air.

late 1960s The Sal-Mar Construction was built. The instrument was named for
composer Salvatore Martirano and designed by him. The Sal-Mar Construction
weighed over fifteen hundred pounds and consisted of "analog circuits controlled
by internal digital circuits controlled by the composer/performer via a touch-
control keyboard with 291 touch-sensitive keys."

Godfrey Winham and Hubert Howe adapted MUSIC IV for the IBM 7094 as MUSIC4B, written in assembly language. MUSIC4BF was a Fortran-language adaptation of MUSIC4B (one version was written by Winham, another by Howe).

Music V variants included MUSIC360 and MUSIC11, written for the IBM 360 and PDP-11 computers respectively by Barry Vercoe, Roger Hale, and Carl Howe at MIT.

GROOVE was developed by Mathews and F. Richard Moore at Bell Labs, and
was used to control analog synthesizers.

1970 Charles Wuorinen composed Time's Encomium, the first entirely electronic composition to win the Pulitzer Prize.

Charles Dodge composed Earth's Magnetic Field. This is a great example of mapping numerical statistics into musical data.

Steve Reich composed Four Organs.

1972 Pink Floyd's album The Dark Side of the Moon (released in March 1973) used ensembles of synthesizers, and also used concrete tracks as interludes between tunes.
1973 SAWDUST, a language by Herbert Brun, used functions including:
ELEMENT, LINK, MINGLE, MERGER, VARY, and TURN.

1974 The Mellotron was built. The instrument was an early sample player that used strips of pre-recorded tape. There were versions that played string sounds or flute sounds, and the instrument was used in movie soundtracks and on recordings.

Clara Rockmore released Theremin recordings.

1976 Composer Philip Glass collaborated with librettist Robert Wilson on Einstein on the Beach, a large-scale multimedia 'opera' in the minimalist style.

1977 The Institut de Recherche et Coordination Acoustique/Musique (IRCAM) opened in Paris under the direction of Pierre Boulez.

The Systems Concepts Digital Synthesizer (SCDS) was built by Peter Samson for CCRMA, with its signal generating and processing elements all executing in parallel and capable of running in real time. There are 256 digital oscillators; 128 signal modifiers (filters, reverb, amplitude scalers); a scratch-pad memory for communicating values between processing elements; and a large memory for reverberation and table storage.

1980 Philip Glass composed Satyagraha, another full scale opera in the
minimalist style.

1981 Larry Austin composed Canadian Coastlines, a composition that used a land map of Canada to determine textural, rhythmic, and melodic content.

Music V variants: newer developments include Cmusic (by F.R. Moore), so named because it is written entirely in the C programming language.

1985 HMSL, the Hierarchical Music Specification Language, was released. The basic organization of HMSL is a series of data structures called "morphs" (named for the flexible or morphological design of the software). Within the superstructure of these morphs there exist other data substructures named shapes, collections, structures, productions, jobs, players, and actions. These secondary types of morphs are used to control aspects of higher-level scheduling and routines.

Interactor, by Morton Subotnick and Mark Coniglio, was designed specifically for
live performance and score-following capabilities.

1986 Another Music V variant was released: Csound, by Barry Vercoe of MIT.


Jam Factory was written by programmer David Zicarelli. He was trying to create a program that would listen to MIDI input and 'improvise' immediately at some level of proficiency, while allowing the user (Zicarelli himself) to improve its ability.

Joel Chadabe, Offenhartz, Widoff, and Zicarelli began work on an algorithmic program that could be used as an improvisation environment. The performer could be seated at the computer and shape data in real time with "a set of scroll bars that changed the parameters of this algorithm, such as the size of the jump from one note to another, the lowest and highest note, etc." The original version was to be named "Maestro," then "RMan" (Random Manager), and finally, "M."

Music Mouse, written by Laurie Spiegel, was designed to be a stand-alone performance system. It may be used as a MIDI controller or as a performance station using the Macintosh internal sound. Unlike other programs for the Macintosh environment, Music Mouse was not intended to be used as a recorder/player program; instead, the program enables the user to "play" the computer. Check out the software at: http://www.dorsai.org/~spiegel/ls_programs.html

The Max program was written in the C language and was developed at IRCAM
by Miller Puckette. It was later scheduled for distribution by Intelligent Music (the
company that also distributed M and Jam Factory), but it was the Opcode
company that eventually released it. Miller Puckette's original intention was to
build a language that could control IRCAM's 4X synthesizer, and there was no
need for the graphical implementation. The graphics were added after a version
of Max for Macintosh computer using MIDI was proposed. Since 1989, David
Zicarelli has updated and expanded the program for the Macintosh environment.

Dolby SR introduced

R-DAT spec announced

Mackie Designs Inc. founded

Sonic Solutions founded

1987 Apple introduced MacII

first consumer DAT decks available

1988 Steve Reich composed Different Trains for string quartet and tape.

1989 Digidesign introduces Sound Tools

Mackie 1604 debuts


1990 Sony introduces writeable CD

1991 Sony develops MiniDisc

Alesis ADAT introduced

1992 Sony announces multimedia CD-ROM

Emagic founded

Macromedia founded

Spirit by Soundcraft introduced

1994 DVD introduced

1996 first MiniDisc multitracks introduced

1997 DVD-Audio standard develops

Computer Science Lab

An Illustrated History of Computers


Part 1
________________________________
___
John Kopplin © 2002

The first computers were people! That is, electronic computers (and the earlier
mechanical computers) were given this name because they performed the work
that had previously been assigned to people. "Computer" was originally a job
title: it was used to describe those human beings (predominantly women) whose
job it was to perform the repetitive calculations required to compute such things
as navigational tables, tide charts, and planetary positions for astronomical
almanacs. Imagine you had a job where hour after hour, day after day, you were
to do nothing but compute multiplications. Boredom would quickly set in, leading
to carelessness, leading to mistakes. And even on your best days you wouldn't
be producing answers very fast. Therefore, inventors have been searching for
hundreds of years for a way to mechanize (that is, find a mechanism that can
perform) this task.

This picture shows what were known as "counting tables" [photo courtesy IBM]

A typical computer operation back when computers were people.

The abacus was an early aid for mathematical computations. Its only value is
that it aids the memory of the human performing the calculation. A skilled abacus
operator can work on addition and subtraction problems at the speed of a person
equipped with a hand calculator (multiplication and division are slower). The
abacus is often wrongly attributed to China. In fact, the oldest surviving abacus
was used in 300 B.C. by the Babylonians. The abacus is still in use today, principally in the Far East. A modern abacus consists of rings that slide over rods,
but the older one pictured below dates from the time when pebbles were used for
counting (the word "calculus" comes from the Latin word for pebble).

A very old abacus


A more modern abacus. Note how the abacus is really just a
representation of the human fingers: the 5 lower rings on
each rod represent the 5 fingers and the 2 upper rings
represent the 2 hands.

In 1614 an eccentric (some say mad) Scotsman named John Napier published his invention of logarithms, which are a technology that allows multiplication to be performed via addition. The magic ingredient is the logarithm of each operand, which was originally obtained from a printed table. But Napier also invented an alternative to tables, where the logarithm values were carved on ivory sticks which are now called Napier's Bones.

An original set of Napier's Bones [photo courtesy IBM]

A more modern set of Napier's Bones
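The principle stated above, multiplication reduced to addition plus a table lookup, can be sketched as follows (the operands are arbitrary examples):

```python
import math

# Multiply 37 by 59 the logarithmic way: add the logs of the
# operands, then take the antilog of the sum.
x, y = 37.0, 59.0
log_sum = math.log10(x) + math.log10(y)   # addition replaces multiplication
product = 10 ** log_sum                   # the "antilog" table lookup
print(round(product, 6))                  # 2183.0 = 37 * 59
```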


Napier's invention led directly to the slide rule, first built in England in 1632 and
still in use in the 1960's by the NASA engineers of the Mercury, Gemini, and
Apollo programs which landed men on the moon.

A slide rule

Leonardo da Vinci (1452-1519) made drawings of gear-driven calculating machines but apparently never built any.

A Leonardo da Vinci drawing showing gears arranged for computing

The first gear-driven calculating machine to actually be built was probably the calculating clock, so named by its inventor, the German professor Wilhelm Schickard, in 1623. This device got little publicity because Schickard died of bubonic plague soon afterward.
Schickard's Calculating Clock

In 1642 Blaise Pascal, at age 19, invented the Pascaline as an aid for his father, who was a tax collector. Pascal built 50 of these gear-driven one-function calculators (they could only add) but couldn't sell many because of their exorbitant cost and because they really weren't that accurate (at that time it was not possible to fabricate gears with the required precision). Up until the present age when car dashboards went digital, the odometer portion of a car's speedometer used the very same mechanism as the Pascaline to increment the next wheel after each full revolution of the prior wheel. Pascal was a child prodigy: at the age of 12, he was discovered doing his version of Euclid's thirty-second proposition on the kitchen floor. Pascal went on to invent probability theory, the hydraulic press, and the syringe. Shown below is an 8 digit version of the Pascaline, and two views of a 6 digit version:
Pascal's Pascaline [photo © 2002 IEEE]

A 6 digit model for those who couldn't afford the 8 digit model
A Pascaline opened up so you can observe the gears and cylinders
which rotated to display the numerical result
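The carry mechanism just described (each wheel advancing its neighbor after a full revolution) is easy to model in modern code. The sketch below is purely illustrative; the function and variable names are invented for this example, not taken from any historical source:

```python
def pascaline_add(wheels, amount):
    """Add `amount` to a Pascaline-style register of decimal wheels.

    `wheels` lists the digits least-significant first, like the wheels
    of a car odometer. When a wheel passes 9 it rolls over to 0 and
    advances (carries into) the next wheel, exactly as described above.
    """
    carry = amount
    for i in range(len(wheels)):
        total = wheels[i] + carry
        wheels[i] = total % 10    # this wheel's new position
        carry = total // 10       # full revolutions passed to the next wheel
    return wheels

# An 8-digit Pascaline displaying 00000999; adding 1 ripples three carries:
register = [9, 9, 9, 0, 0, 0, 0, 0]
pascaline_add(register, 1)
print(register)  # [0, 0, 0, 1, 0, 0, 0, 0], i.e. 00001000
```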

Click on the "Next" hyperlink below to read about the punched card system that
was developed for looms and later applied to the U.S. census and then to
computers...


An Illustrated History of Computers


Part 2
________________________________
John Kopplin © 2002

Just a few years after Pascal, the German Gottfried Wilhelm Leibniz (co-inventor
with Newton of calculus) managed to build a four-function (addition, subtraction,
multiplication, and division) calculator that he called the stepped reckoner
because, instead of gears, it employed fluted drums having ten flutes arranged
around their circumference in a stair-step fashion. Although the stepped reckoner
employed the decimal number system (each drum had 10 flutes), Leibniz was the
first to advocate use of the binary number system which is fundamental to the
operation of modern computers. Leibniz is considered one of the greatest of the
philosophers but he died poor and alone.

Leibniz's Stepped Reckoner (have you ever heard "calculating"
referred to as "reckoning"?)
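Leibniz's advocacy of binary is worth a quick illustration. The sketch below simply shows one quantity in both notations using Python's built-in conversions; the particular number is arbitrary (1642, the year Pascal built the Pascaline):

```python
# The same quantity in the decimal system Leibniz's machine used
# and in the binary system he advocated.
n = 1642
print(bin(n))                 # 0b11001101010
print(int("11001101010", 2))  # 1642
```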

In 1801 the Frenchman Joseph Marie Jacquard invented a power loom that could
base its weave (and hence the design on the fabric) upon a pattern automatically
read from punched wooden cards, held together in a long row by rope.
Descendants of these punched cards have been in use ever since (remember
the "hanging chad" from the Florida presidential ballots of the year 2000?).
Jacquard's Loom showing the threads and the punched cards
By selecting particular cards for Jacquard's loom you defined the
woven pattern [photo © 2002 IEEE]
A close-up of a Jacquard card

This tapestry was woven by a Jacquard loom

Jacquard's technology was a real boon to mill owners, but put many loom
operators out of work. Angry mobs smashed Jacquard looms and once attacked
Jacquard himself. History is full of examples of labor unrest following
technological innovation yet most studies show that, overall, technology has
actually increased the number of jobs.
By 1822 the English mathematician Charles Babbage was proposing a steam
driven calculating machine the size of a room, which he called the Difference
Engine. This machine would be able to compute tables of numbers, such as
logarithm tables. He obtained government funding for this project due to the
importance of numeric tables in ocean navigation. By promoting their commercial
and military navies, the British government had managed to become the earth's
greatest empire. But in that era the British government was publishing a
seven-volume set of navigation tables which came with a companion volume of
corrections showing that the set had over 1000 numerical errors. It was
hoped that Babbage's machine could eliminate errors in these types of tables.
But construction of Babbage's Difference Engine proved exceedingly difficult and
the project soon became the most expensive government-funded project up to
that point in English history. Ten years later the device was still nowhere near
complete, acrimony abounded between all involved, and funding dried up. The
device was never finished.
A small section of the type of mechanism employed in Babbage's
Difference Engine [photo © 2002 IEEE]

Babbage was not deterred, and by then was on to his next brainstorm, which he
called the Analytic Engine. This device, large as a house and powered by 6
steam engines, would be more general purpose in nature because it would be
programmable, thanks to the punched card technology of Jacquard. But it was
Babbage who made an important intellectual leap regarding the punched cards.
In the Jacquard loom, the presence or absence of each hole in the card
physically allows a colored thread to pass or stops that thread (you can see this
clearly in the earlier photo). Babbage saw that the pattern of holes could be used
to represent an abstract idea such as a problem statement or the raw data
required for that problem's solution. Babbage saw that there was no requirement
that the problem matter itself physically pass thru the holes.

Furthermore, Babbage realized that punched paper could be employed as a
storage mechanism, holding computed numbers for future reference. Because of
the connection to the Jacquard loom, Babbage called the two main parts of his
Analytic Engine the "Store" and the "Mill", both terms borrowed from the weaving
industry. The Store was where numbers were held and the Mill was where they
were "woven" into new results. In a modern computer these same parts are
called the memory unit and the central processing unit (CPU).

The Analytic Engine also had a key function that distinguishes computers from
calculators: the conditional statement. A conditional statement allows a program
to achieve different results each time it is run. Based on the conditional
statement, the path of the program (that is, what statements are executed next)
can be determined based upon a condition or situation that is detected at the
very moment the program is running.

You have probably observed that a modern stoplight at an intersection between a
busy street and a less busy street will leave the green light on the busy street
until a car approaches on the less busy street. This type of street light is
controlled by a computer program that can sense the approach of cars on the
less busy street. That moment when the light changes from green to red is not
fixed in the program but rather varies with each traffic situation. The conditional
statement in the stoplight program would be something like, "if a car approaches
on the less busy street and the more busy street has already enjoyed the green
light for at least a minute then move the green light to the less busy street". The
conditional statement also allows a program to react to the results of its own
calculations. An example would be the program that the I.R.S. uses to detect tax
fraud. This program first computes a person's tax liability and then decides
whether to alert the police based upon how that person's tax payments compare
to his obligations.
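The stoplight rule above translates almost word for word into a conditional statement. The sketch below is purely illustrative; the names and the 60-second threshold come from the example in the text, not from any real traffic controller:

```python
def next_green(car_on_side_street, main_green_seconds):
    """Decide which street gets the green light, per the rule above."""
    if car_on_side_street and main_green_seconds >= 60:
        return "side street"   # move the green to the less busy street
    return "main street"       # otherwise the busy street keeps it

print(next_green(True, 75))    # side street
print(next_green(True, 20))    # main street
print(next_green(False, 75))   # main street
```

The moment the light changes is not fixed in the program; it depends on the inputs sensed while the program runs, which is exactly what distinguishes a conditional from a fixed sequence of steps.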

Babbage befriended Ada Byron, the daughter of the famous poet Lord Byron
(Ada would later become the Countess Lady Lovelace by marriage). Though she
was only 19, she was fascinated by Babbage's ideas and thru letters and
meetings with Babbage she learned enough about the design of the Analytic
Engine to begin fashioning programs for the still unbuilt machine. While Babbage
refused to publish his knowledge for another 30 years, Ada wrote a series of
"Notes" wherein she detailed sequences of instructions she had prepared for the
Analytic Engine. The Analytic Engine remained unbuilt (the British government
refused to get involved with this one) but Ada earned her spot in history as the
first computer programmer. Ada invented the subroutine and was the first to
recognize the importance of looping. Babbage himself went on to invent the
modern postal system, cowcatchers on trains, and the ophthalmoscope, which is
still used today to examine the eye.

The next breakthrough occurred in America. The U.S. Constitution states that a
census should be taken of all U.S. citizens every 10 years in order to determine
the representation of the states in Congress. While the very first census of 1790
had only required 9 months, by 1880 the U.S. population had grown so much that
the count for the 1880 census took 7.5 years. Automation was clearly needed for
the next census. The census bureau offered a prize for an inventor to help with
the 1890 census and this prize was won by Herman Hollerith, who proposed and
then successfully adapted Jacquard's punched cards to the purpose of
computation.

Hollerith's invention, known as the Hollerith desk, consisted of a card reader
which sensed the holes in the cards, a gear-driven mechanism which could count
(using Pascal's mechanism which we still see in car odometers), and a large wall
of dial indicators (a car speedometer is a dial indicator) to display the results of
the count.

An operator working at a Hollerith Desk like the one below


Preparation of punched cards for the U.S. census
A few Hollerith desks still exist today [photo courtesy The Computer
Museum]

The patterns on Jacquard's cards were determined when a tapestry was
designed and then were not changed. Today, we would call this a read-only
form of information storage. Hollerith had the insight to convert punched cards to
what is today called a read/write technology. While riding a train, he observed
that the conductor didn't merely punch each ticket, but rather punched a
particular pattern of holes whose positions indicated the approximate height,
weight, eye color, etc. of the ticket owner. This was done to keep anyone else
from picking up a discarded ticket and claiming it was his own (a train ticket did
not lose all value when it was punched because the same ticket was used for
each leg of a trip). Hollerith realized how useful it would be to punch (write) new
cards based upon an analysis (reading) of some other set of cards. Complicated
analyses, too involved to be accomplished during a single pass thru the cards,
could be accomplished via multiple passes thru the cards using newly punched
cards to remember the intermediate results. Unknown to Hollerith, Babbage had
proposed this long before.
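The multi-pass technique can be sketched as: read one deck, tabulate, punch a smaller deck of intermediate results, then read that deck on the next pass. Below is a hypothetical miniature in Python, where each "deck" is just a list of strings standing in for cards:

```python
from collections import Counter

# Pass 1: read the raw deck (one census record per "card") and punch
# a new, smaller deck holding one tally card per state.
raw_deck = ["OH", "NY", "OH", "PA", "NY", "OH"]
tally_deck = [f"{state} {count}" for state, count in Counter(raw_deck).items()]

# Pass 2: read only the intermediate deck to finish the analysis.
total = sum(int(card.split()[1]) for card in tally_deck)
print(tally_deck)  # ['OH 3', 'NY 2', 'PA 1']
print(total)       # 6
```

The intermediate deck "remembers" the results of pass 1 so that pass 2 never has to touch the much larger raw deck, which is the whole point of read/write card storage.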
Hollerith's technique was successful and the 1890 census was completed in only
3 years at a savings of 5 million dollars. Interesting aside: the reason that a
person who removes inappropriate content from a book or movie is called a
censor, as is a person who conducts a census, is that in Roman society the
public official called the "censor" had both of these jobs.

Hollerith built a company, the Tabulating Machine Company, which, after a few
buyouts, eventually became International Business Machines, known today as
IBM. IBM grew rapidly and punched cards became ubiquitous. Your gas bill
would arrive each month with a punch card you had to return with your payment.
This punch card recorded the particulars of your account: your name, address,
gas usage, etc. (I imagine there were some "hackers" in those days who would
alter the punch cards to change their bill). As another example, when you
entered a toll way (a highway that collects a fee from each driver) you were given
a punch card that recorded where you started and then when you exited from the
toll way your fee was computed based upon the miles you drove. When you
voted in an election the ballot you were handed was a punch card. The little
pieces of paper that are punched out of the card are called "chad" and were
thrown as confetti at weddings. Until recently all Social Security and other checks
issued by the Federal government were actually punch cards. The check-out slip
inside a library book was a punch card. Written on all these cards was a phrase
as common as "close cover before striking": "do not fold, spindle, or mutilate". A
spindle was an upright spike on the desk of an accounting clerk. As he
completed processing each receipt he would impale it on this spike. When the
spindle was full, he'd run a piece of string through the holes, tie up the bundle,
and ship it off to the archives. You occasionally still see spindles at restaurant
cash registers.

Two types of computer punch cards


Incidentally, the Hollerith census machine was the first machine to ever be
featured on a magazine cover.
Click on the "Next" hyperlink below to read about the first computers such as the
Harvard Mark 1, the German Zuse Z3 and Great Britain's Colossus...



An Illustrated History of Computers
Part 3
________________________________
John Kopplin © 2002

IBM continued to develop mechanical calculators for sale to businesses to help
with financial accounting and inventory accounting. One characteristic of both
financial accounting and inventory accounting is that although you need to
subtract, you don't need negative numbers, and you really don't have to multiply
since multiplication can be accomplished via repeated addition.
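That reduction of multiplication to repeated addition is easy to demonstrate. A minimal sketch (the function name is illustrative):

```python
def multiply_by_repeated_addition(a, b):
    """Multiply two non-negative integers using only addition,
    the way the accounting machines described above could."""
    product = 0
    for _ in range(b):   # add `a` to the running total, `b` times
        product += a
    return product

print(multiply_by_repeated_addition(7, 6))  # 42
```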

But the U.S. military desired a mechanical calculator more optimized for scientific
computation. By World War II the U.S. had battleships that could lob shells
weighing as much as a small car over distances up to 25 miles. Physicists could
write the equations that described how atmospheric drag, wind, gravity, muzzle
velocity, etc. would determine the trajectory of the shell. But solving such
equations was extremely laborious. This was the work performed by the human
computers. Their results were collected in ballistic "firing tables" published in
gunnery manuals. During World War II the U.S. military scoured the country
looking for (generally female) math majors to hire for the job of computing these
tables. But not enough humans could be found to keep up with the need for new
tables. Sometimes artillery pieces had to be delivered to the battlefield without
the necessary firing tables and this meant they were close to useless because
they couldn't be aimed properly. Faced with this situation, the U.S. military was
willing to invest in even harebrained schemes to automate this type of
computation.

One early success was the Harvard Mark I computer which was built as a
partnership between Harvard and IBM in 1944. This was the first programmable
digital computer made in the U.S. But it was not a purely electronic computer.
Instead the Mark I was constructed out of switches, relays, rotating shafts, and
clutches. The machine weighed 5 tons, incorporated 500 miles of wire, was 8
feet tall and 51 feet long, and had a 50 ft rotating shaft running its length, turned
by a 5 horsepower electric motor. The Mark I ran non-stop for 15 years, sounding
like a roomful of ladies knitting. To appreciate the scale of this machine note the
four typewriters in the foreground of the following photo.
The Harvard Mark I: an electro-mechanical computer

You can see the 50 ft rotating shaft in the bottom of the prior photo. This shaft
was a central power source for the entire machine. This design feature was
reminiscent of the days when waterpower was used to run a machine shop and
each lathe or other tool was driven by a belt connected to a single overhead shaft
which was turned by an outside waterwheel.
A central shaft driven by an outside waterwheel and connected to
each machine by overhead belts was the customary power
source for all the machines in a factory

Here's a close-up of one of the Mark I's four paper tape readers. A paper tape
was an improvement over a box of punched cards as anyone who has ever
dropped -- and thus shuffled -- his "stack" knows.
One of the four paper tape readers on the Harvard Mark I (you can
observe the punched paper roll emerging from the bottom)

One of the primary programmers for the Mark I was a woman, Grace Hopper.
Hopper found the first computer "bug": a dead moth that had gotten into the Mark
I and whose wings were blocking the reading of the holes in the paper tape. The
word "bug" had been used to describe a defect since at least 1889 but Hopper is
credited with coining the word "debugging" to describe the work to eliminate
program faults.

The first computer bug [photo © 2002 IEEE]

In 1953 Grace Hopper invented the first high-level language, "Flow-matic". This
language eventually became COBOL which was the language most affected by
the infamous Y2K problem. A high-level language is designed to be more
understandable by humans than is the binary language understood by the
computing machinery. A high-level language is worthless without a program --
known as a compiler -- to translate it into the binary language of the computer
and hence Grace Hopper also constructed the world's first compiler. Grace
remained active as a Rear Admiral in the Navy Reserves until she was 79
(another record).
The Mark I operated on numbers that were 23 digits wide. It could add or
subtract two of these numbers in three-tenths of a second, multiply them in four
seconds, and divide them in ten seconds. Forty-five years later computers could
perform an addition in a billionth of a second! Even though the Mark I had three
quarters of a million components, it could only store 72 numbers! Today, home
computers can store 30 million numbers in RAM and another 10 billion numbers
on their hard disk. Today, a number can be pulled from RAM after a delay of only
a few billionths of a second, and from a hard disk after a delay of only a few
thousandths of a second. This kind of speed is obviously impossible for a
machine which must move a rotating shaft and that is why electronic computers
killed off their mechanical predecessors.

On a humorous note, the principal designer of the Mark I, Howard Aiken of
Harvard, estimated in 1947 that six electronic digital computers would be
sufficient to satisfy the computing needs of the entire United States. IBM had
commissioned this study to determine whether it should bother developing this
new invention into one of its standard products (up until then computers were
one-of-a-kind items built by special arrangement). Aiken's prediction wasn't
actually so bad as there were very few institutions (principally, the government
and military) that could afford the cost of what was called a computer in 1947. He
just didn't foresee the micro-electronics revolution which would allow something
like an IBM Stretch computer of 1959:
(that's just the operator's console, here's the rest of its 33 foot length:)
to be bested by a home computer of 1976 such as this Apple I which sold for
only $600:

The Apple 1 which was sold as a do-it-yourself kit (without the lovely
case seen here)

Computers had been incredibly expensive because they required so much hand
assembly, such as the wiring seen in this CDC 7600:
Typical wiring in an early mainframe computer [photo courtesy The
Computer Museum]
The microelectronics revolution is what allowed the amount of hand-crafted
wiring seen in the prior photo to be mass-produced as an integrated circuit
which is a small sliver of silicon the size of your thumbnail.
An integrated circuit ("silicon chip") [photo courtesy of IBM]
The primary advantage of an integrated circuit is not that the transistors
(switches) are miniscule (that's the secondary advantage), but rather that millions
of transistors can be created and interconnected in a mass-production process.
All the elements on the integrated circuit are fabricated simultaneously via a
small number (maybe 12) of optical masks that define the geometry of each
layer. This speeds up the process of fabricating the computer -- and hence
reduces its cost -- just as Gutenberg's printing press sped up the fabrication of
books and thereby made them affordable to all.

The IBM Stretch computer of 1959 needed its 33 foot length to hold the 150,000
transistors it contained. These transistors were tremendously smaller than the
vacuum tubes they replaced, but they were still individual elements requiring
individual assembly. By the early 1980s this many transistors could be
simultaneously fabricated on an integrated circuit. Today's Pentium 4
microprocessor contains 42,000,000 transistors in this same thumbnail sized
piece of silicon.

It's humorous to remember that in between the Stretch machine (which would be
called a mainframe today) and the Apple I (a desktop computer) there was an
entire industry segment referred to as mini-computers such as the following
PDP-12 computer of 1969:
The DEC PDP-12
Sure looks "mini", huh? But we're getting ahead of our story.

One of the earliest attempts to build an all-electronic (that is, no gears, cams,
belts, shafts, etc.) digital computer occurred in 1937 by J. V. Atanasoff, a
professor of physics and mathematics at Iowa State University. By 1941 he and
his graduate student, Clifford Berry, had succeeded in building a machine that
could solve 29 simultaneous equations with 29 unknowns. This machine was the
first to store data as a charge on a capacitor, which is how today's computers
store information in their main memory (DRAM or dynamic RAM). As far as its
inventors were aware, it was also the first to employ binary arithmetic. However,
the machine was not programmable, it lacked a conditional branch, its design
was appropriate for only one type of mathematical problem, and it was not further
pursued after World War II. Its inventors didn't even bother to preserve the
machine and it was dismantled by those who moved into the room where it lay
abandoned.

The Atanasoff-Berry Computer [photo © 2002 IEEE]

Another candidate for granddaddy of the modern computer was Colossus, built
during World War II by Britain for the purpose of breaking the cryptographic
codes used by Germany. Britain led the world in designing and building
electronic machines dedicated to code breaking, and was routinely able to read
coded German radio transmissions. But Colossus was definitely not a general
purpose, reprogrammable machine. Note the presence of pulleys in the two
photos of Colossus below:
Two views of the code-breaking Colossus of Great Britain

The Harvard Mark I, the Atanasoff-Berry computer, and the British Colossus all
made important contributions. American and British computer pioneers were still
arguing over who was first to do what, when in 1965 the work of the German
Konrad Zuse was published for the first time in English. Scooped! Zuse had built
a sequence of general purpose computers in Nazi Germany. The first, the Z1,
was built between 1936 and 1938 in the parlor of his parents' home.
The Zuse Z1 in its residential setting

Zuse's third machine, the Z3, built in 1941, was probably the first operational,
general-purpose, programmable (that is, software controlled) digital computer.
Without knowledge of any calculating machine inventors since Leibniz (who lived
in the 1600's), Zuse reinvented Babbage's concept of programming and decided
on his own to employ binary representation for numbers (Babbage had
advocated decimal). The Z3 was destroyed by an Allied bombing raid. The Z1
and Z2 met the same fate and the Z4 survived only because Zuse hauled it in a
wagon up into the mountains. Zuse's accomplishments are all the more
incredible given the context of the material and manpower shortages in Germany
during World War II. Zuse couldn't even obtain paper tape so he had to make his
own by punching holes in discarded movie film. Because these machines were
unknown outside Germany, they did not influence the path of computing in
America. But their architecture is identical to that still in use today: an arithmetic
unit to do the calculations, a memory for storing numbers, a control system to
supervise operations, and input and output devices to connect to the external
world. Zuse also invented what might be the first high-level computer language,
"Plankalkul", though it too was unknown outside Germany.

Click on the "Next" hyperlink below to read about Eniac, Univac, IBM
mainframes, and the IBM PC...

An Illustrated History of Computers


Part 4
________________________________
John Kopplin © 2002

The title of forefather of today's all-electronic digital computers is usually awarded
to ENIAC, which stood for Electronic Numerical Integrator and Calculator. ENIAC
was built at the University of Pennsylvania between 1943 and 1945 by two
professors, John Mauchly and the 24-year-old J. Presper Eckert, who got
funding from the war department after promising they could build a machine that
would replace all the "computers", meaning the women who were employed
calculating the firing tables for the army's artillery guns. The day that Mauchly
and Eckert saw the first small piece of ENIAC work, the persons they ran to bring
to their lab to show off their progress were some of these female computers (one
of whom remarked, "I was astounded that it took all this equipment to multiply 5
by 1000").

ENIAC filled a 20 by 40 foot room, weighed 30 tons, and used more than 18,000
vacuum tubes. Like the Mark I, ENIAC employed paper card readers obtained
from IBM (these were a regular product for IBM, as they were a long established
part of business accounting machines, IBM's forte). When operating, the ENIAC
was silent but you knew it was on as the 18,000 vacuum tubes each generated
waste heat like a light bulb and all this heat (174,000 watts of heat) meant that
the computer could only be operated in a specially designed room with its own
heavy duty air conditioning system. Only the left half of ENIAC is visible in the
first picture, the right half was basically a mirror image of what's visible.
Two views of ENIAC: the "Electronic Numerical Integrator and
Calculator" (note that it wasn't even given the name of
computer since "computers" were people) [U.S. Army photo]
To reprogram the ENIAC you had to rearrange the patch cords that you can
observe on the left in the prior photo, and the settings of 3000 switches that you
can observe on the right. To program a modern computer, you type out a
program with statements like:

Circumference = 3.14 * diameter

To perform this computation on ENIAC you had to rearrange a large number of
patch cords and then locate three particular knobs on that vast wall of knobs and
set them to 3, 1, and 4.
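For contrast, here is that entire "program" on a modern machine. The diameter value is arbitrary, and the rounded 3.14 matches the statement in the text:

```python
# The computation that required re-cabling ENIAC, as a single statement.
diameter = 10.0
circumference = 3.14 * diameter
print(circumference)
```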
Reprogramming ENIAC involved a hike [U.S. Army photo]

Once the army agreed to fund ENIAC, Mauchly and Eckert worked around the
clock, seven days a week, hoping to complete the machine in time to contribute
to the war. Their war-time effort was so intense that most days they ate all 3
meals in the company of the army Captain who was their liaison with their military
sponsors. They were allowed a small staff but soon observed that they could hire
only the most junior members of the University of Pennsylvania staff because the
more experienced faculty members knew that their proposed machine would
never work.

One of the most obvious problems was that the design would require 18,000
vacuum tubes to all work simultaneously. Vacuum tubes were so notoriously
unreliable that even twenty years later many neighborhood drug stores provided
a "tube tester" that allowed homeowners to bring in the vacuum tubes from their
television sets and determine which one of the tubes was causing their TV to fail.
And television sets only incorporated about 30 vacuum tubes. The device that
used the largest number of vacuum tubes was an electronic organ: it
incorporated 160 tubes. The idea that 18,000 tubes could function together was
considered so unlikely that the dominant vacuum tube supplier of the day, RCA,
refused to join the project (but did supply tubes in the interest of "wartime
cooperation"). Eckert solved the tube reliability problem through extremely careful
circuit design. He was so thorough that before he chose the type of wire cabling
he would employ in ENIAC he first ran an experiment where he starved lab rats
for a few days and then gave them samples of all the available types of cable to
determine which they least liked to eat. Here's a look at a small number of the
vacuum tubes in ENIAC:
Even with 18,000 vacuum tubes, ENIAC could only hold 20 numbers at a time.
However, thanks to the elimination of moving parts it ran much faster than the
Mark I: a multiplication that required 6 seconds on the Mark I could be performed
on ENIAC in 2.8 thousandths of a second. ENIAC's basic clock speed was
100,000 cycles per second. Today's home computers employ clock speeds of
1,000,000,000 cycles per second. Built with $500,000 from the U.S. Army,
ENIAC's first task was to compute whether or not it was possible to build a
hydrogen bomb (the atomic bomb was completed during the war and hence is
older than ENIAC). The very first problem run on ENIAC required only 20
seconds and was checked against an answer obtained after forty hours of work
with a mechanical calculator. After chewing on half a million punch cards for six
weeks, ENIAC did humanity no favor when it declared the hydrogen bomb
feasible. This first ENIAC program remains classified even today.

Once ENIAC was finished and proved worthy of the cost of its development, its
designers set about to eliminate the obnoxious fact that reprogramming the
computer required a physical modification of all the patch cords and switches. It
took days to change ENIAC's program. Eckert and Mauchly next teamed up
with the mathematician John von Neumann to design EDVAC, which pioneered
the stored program. Because he was the first to publish a description of this
new computer, von Neumann is often wrongly credited with the realization that
the program (that is, the sequence of computation steps) could be represented
electronically just as the data was. But this major breakthrough can be found in
Eckert's notes long before he ever started working with von Neumann. Eckert
was no slouch: while in high school Eckert had scored the second highest math
SAT score in the entire country.

After ENIAC and EDVAC came other computers with humorous names such as
ILLIAC, JOHNNIAC, and, of course, MANIAC. ILLIAC was built at the University
of Illinois at Champaign-Urbana, which is probably why the science fiction author
Arthur C. Clarke chose to have the HAL computer of his famous book "2001: A
Space Odyssey" born at Champaign-Urbana. Have you ever noticed that you can
shift each of the letters of IBM backward by one alphabet position and get HAL?

ILLIAC II built at the University of Illinois (it is a good thing
computers were one-of-a-kind creations in those days; can
you imagine being asked to duplicate this?)
HAL from the movie "2001: A Space Odyssey". Look at the previous
picture to understand why the movie makers in 1968
assumed computers of the future would be things you walk
into.

JOHNNIAC was a reference to John von Neumann, who was unquestionably a
genius. At age 6 he could tell jokes in classical Greek. By 8 he was doing
calculus. He could recite books he had read years earlier word for word. He
could read a page of the phone directory and then recite it backwards. On one
occasion it took von Neumann only 6 minutes to solve a problem in his head that
another professor had spent hours on using a mechanical calculator. Von
Neumann is perhaps most famous (infamous?) as the man who worked out the
complicated method needed to detonate an atomic bomb.

Once the computer's program was represented electronically, modifications to
that program could happen as fast as the computer could compute. In fact,
computer programs could now modify themselves while they ran (such programs
are called self-modifying programs). This introduced a new way for a program to
fail: faulty logic in the program could cause it to damage itself. This is one source
of the general protection fault famous in MS-DOS and the blue screen of
death famous in Windows.
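Because the program and its data share one memory, a program can overwrite its own instructions. The toy stored-program machine below illustrates this; the instruction set is invented for this sketch and corresponds to no real computer:

```python
# A toy stored-program machine: the program lives in the same list
# ("memory") it can write to, so it may modify itself.
memory = [
    ("SET", 0),              # 0: acc = 0
    ("ADD", 5),              # 1: acc += 5
    ("POKE", 4, ("HALT",)),  # 2: overwrite instruction 4 with HALT
    ("ADD", 5),              # 3: acc += 5
    ("JUMP", 1),             # 4: loop forever... unless rewritten
]

acc, pc = 0, 0
while memory[pc][0] != "HALT":
    op = memory[pc]
    if op[0] == "SET":
        acc = op[1]; pc += 1
    elif op[0] == "ADD":
        acc += op[1]; pc += 1
    elif op[0] == "POKE":    # the self-modifying step: write into program memory
        memory[op[1]] = op[2]; pc += 1
    elif op[0] == "JUMP":
        pc = op[1]
print(acc)  # 10
```

Without the POKE, the JUMP at address 4 would loop forever; the program escapes only by rewriting one of its own instructions. That same power is what lets faulty logic damage a running program, as described above.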

Today, one of the most notable characteristics of a computer is the fact that its
ability to be reprogrammed allows it to contribute to a wide variety of endeavors,
such as the following completely unrelated fields:
• the creation of special effects for movies,
• the compression of music to allow more minutes of music to fit within the
limited memory of an MP3 player,
• the observation of car tire rotation to detect and prevent skids in an
anti-lock braking system (ABS),
• the analysis of the writing style in Shakespeare's work with the goal of
proving whether a single individual really was responsible for all these
pieces.

By the end of the 1950's computers were no longer one-of-a-kind hand built
devices owned only by universities and government research labs. Eckert and
Mauchly left the University of Pennsylvania over a dispute about who owned the
patents for their invention. They decided to set up their own company. Their first
product was the famous UNIVAC computer, the first commercial (that is, mass
produced) computer. In the 50's, UNIVAC (a contraction of "Universal Automatic
Computer") was the household word for "computer" just as "Kleenex" is for
"tissue". The first UNIVAC was sold, appropriately enough, to the U.S. Census
Bureau. UNIVAC was also the first computer to employ magnetic tape. Many
people still confuse a picture of a reel-to-reel tape recorder with a picture of a
mainframe computer.
A reel-to-reel tape drive [photo courtesy of The Computer Museum]

ENIAC was unquestionably the origin of the U.S. commercial computer industry,
but its inventors, Mauchly and Eckert, never achieved fortune from their work and
their company fell into financial problems and was sold at a loss. By 1955 IBM
was selling more computers than UNIVAC and by the 1960's the group of eight
companies selling computers was known as "IBM and the seven dwarfs". IBM
grew so dominant that the federal government pursued anti-trust proceedings
against them from 1969 to 1982 (notice the pace of our country's legal system).
You might wonder what type of event is required to dislodge an industry
heavyweight. In IBM's case it was their own decision to hire an unknown but
aggressive firm called Microsoft to provide the software for their personal
computer (PC). This lucrative contract allowed Microsoft to grow so dominant
that by the year 2000 their market capitalization (the total value of their stock)
was twice that of IBM, and a federal court found them to have maintained an
illegal monopoly.

If you learned computer programming in the 1970's, you dealt with what today
are called mainframe computers, such as the IBM 7094 (shown below), IBM
360, or IBM 370.

The IBM 7094, a typical mainframe computer [photo courtesy of IBM]


There were two ways to interact with a mainframe. The first was called time
sharing because the computer gave each user a tiny sliver of time in a round-
robin fashion. Perhaps 100 users would be simultaneously logged on, each
typing on a teletype such as the following:
The Teletype was the standard mechanism used to interact with a
time-sharing computer

A teletype was a motorized typewriter that could transmit your keystrokes to the
mainframe and then print the computer's response on its roll of paper. You typed
a single line of text, hit the carriage return button, and waited for the teletype to
begin noisily printing the computer's response (at a whopping 10 characters per
second). On the left-hand side of the teletype in the prior picture you can observe
a paper tape reader and writer (i.e., puncher). Here's a close-up of paper tape:

Three views of paper tape


After observing the holes in paper tape it is perhaps obvious why all computers
use binary numbers to represent data: a binary bit (that is, one digit of a binary
number) can only have the value of 0 or 1 (just as a decimal digit can only have
a value from 0 through 9). Something which can only take two states is very easy to
manufacture, control, and sense. In the case of paper tape, the hole has either
been punched or it has not. Electro-mechanical computers such as the Mark I
used relays to represent data because a relay (which is just a motor driven
switch) can only be open or closed. The earliest all-electronic computers used
vacuum tubes as switches: they too were either open or closed. Transistors
replaced vacuum tubes because they too could act as switches but were smaller,
cheaper, and consumed less power.
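To make the two-state idea concrete, here is a small illustrative sketch (my example, not the author's): a number stored as a row of 0/1 "switch" states, whether those switches are punched holes, relays, tubes, or transistors, and then read back by powers of two.

```python
# Each "switch" (hole punched or not, relay open or closed) stores one bit.
def to_bits(n, width=8):
    """Represent a non-negative integer as a list of 0/1 switch states."""
    return [(n >> i) & 1 for i in range(width - 1, -1, -1)]

def from_bits(bits):
    """Read the switches back: each position contributes its power of two."""
    value = 0
    for bit in bits:
        value = value * 2 + bit
    return value

print(to_bits(42))                           # [0, 0, 1, 0, 1, 0, 1, 0]
print(from_bits([0, 0, 1, 0, 1, 0, 1, 0]))   # 42
```

Eight two-state switches are enough to store any number from 0 to 255; add more switches and the range doubles with each one. The manufacturing problem reduces to building one reliable two-state device and then stamping out millions of copies.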

Paper tape has a long history as well. It was first used as an information storage
medium by Sir Charles Wheatstone, who used it to store Morse code that was
arriving via the newly invented telegraph (incidentally, Wheatstone was also the
inventor of the concertina).

The alternative to time sharing was batch mode processing, where the
computer gives its full attention to your program. In exchange for getting the
computer's full attention at run-time, you had to agree to prepare your program
off-line on a key punch machine which generated punch cards.
An IBM Key Punch machine which operates like a typewriter except it
produces punched cards rather than a printed sheet of paper

University students in the 1970's bought blank cards a linear foot at a time from
the university bookstore. Each card could hold only 1 program statement. To
submit your program to the mainframe, you placed your stack of cards in the
hopper of a card reader. Your program would be run whenever the computer
made it that far. You often submitted your deck and then went to dinner or to bed
and came back later hoping to see a successful printout showing your results.
Obviously, a program run in batch mode could not be interactive.

But things changed fast. By the 1990's a university student would typically own
his own computer and have exclusive use of it in his dorm room.
The original IBM Personal Computer (PC)

This transformation was a result of the invention of the microprocessor. A
microprocessor (uP) is a computer that is fabricated on an integrated circuit (IC).
Computers had been around for 20 years before the first microprocessor was
developed at Intel in 1971. The micro in the name microprocessor refers to the
physical size. Intel didn't invent the electronic computer. But they were the first to
succeed in cramming an entire computer on a single chip (IC). Intel was started
in 1968 and initially produced only semiconductor memory (Intel invented both
the DRAM and the EPROM, two memory technologies that are still going strong
today). In 1969 they were approached by Busicom, a Japanese manufacturer of
high performance calculators (these were typewriter sized units, the first shirt-
pocket sized scientific calculator was the Hewlett-Packard HP35 introduced in
1972). Busicom wanted Intel to produce 12 custom calculator chips: one chip
dedicated to the keyboard, another chip dedicated to the display, another for the
printer, etc. But integrated circuits were (and are) expensive to design and this
approach would have required Busicom to bear the full expense of developing 12
new chips since these 12 chips would only be of use to them.
A typical Busicom desk calculator

But a new Intel employee (Ted Hoff) convinced Busicom to instead accept a
general purpose computer chip which, like all computers, could be
reprogrammed for many different tasks (like controlling a keyboard, a display, a
printer, etc.). Intel argued that since the chip could be reprogrammed for
alternative purposes, the cost of developing it could be spread out over more
users and hence would be less expensive to each user. The general purpose
computer is adapted to each new purpose by writing a program which is a
sequence of instructions stored in memory (which happened to be Intel's forte).
Busicom agreed to pay Intel to design a general purpose chip and to get a price
break since it would allow Intel to sell the resulting chip to others. But
development of the chip took longer than expected and Busicom pulled out of the
project. Intel knew it had a winner by that point and gladly refunded all of
Busicom's investment just to gain sole rights to the device which they finished on
their own.

Thus was born the Intel 4004, the first microprocessor (uP). The 4004 consisted of
2300 transistors and was clocked at 108 kHz (i.e., 108,000 times per second).
Compare this to the 42 million transistors and the 2 GHz clock rate (i.e.,
2,000,000,000 times per second) used in a Pentium 4. One of Intel's 4004 chips
is said to still function aboard the Pioneer 10 spacecraft, now one of the
man-made objects farthest from the Earth. Curiously, Busicom went bankrupt and
never ended up using the ground-breaking microprocessor.

Intel followed the 4004 with the 8008 and 8080. Intel priced the 8080
microprocessor at $360, a pointed jab at IBM's famous System/360 mainframe,
which cost millions of dollars. The 8080 was employed in the MITS Altair
computer, which was the world's first personal computer (PC). It was personal
all right: you had to build it yourself from a kit of parts that arrived in the mail. This
kit didn't even include an enclosure and that is the reason the unit shown below
doesn't match the picture on the magazine cover.
The Altair 8800, the first PC

A Harvard freshman by the name of Bill Gates decided to drop out of college so
he could devote all his time to writing programs for this computer. This early
experience put Bill Gates in the right place at the right time once IBM decided to
standardize on the Intel microprocessors for their line of PCs in 1981. The Intel
Pentium 4 used in today's PCs is still compatible with the Intel 8088 used in
IBM's first PC.

If you've enjoyed this history of computers, I encourage you to try your own hand
at programming a computer. That is the only way you will really come to
understand the concepts of looping, subroutines, high- and low-level languages,
bits and bytes, etc. I have written a number of Windows programs which teach
computer programming in a fun, visually-engaging setting. I start my students on
a programmable RPN calculator where we learn about programs, statements,
program and data memory, subroutines, logic and syntax errors, stacks, etc.
Then we move on to an 8051 microprocessor (which happens to be the most
widespread microprocessor on earth) where we learn about microprocessors,
bits and bytes, assembly language, addressing modes, etc. Finally, we graduate
to the most powerful language in use today: C++ (pronounced "C plus plus").
These Windows programs are accompanied by a book's worth of on-line
documentation which serves as a self-study guide, allowing you to teach yourself
computer programming! The home page (URL) for this collection of software is
www.computersciencelab.com.

Bibliography:
"ENIAC: The Triumphs and Tragedies of the World's First Computer" by Scott
McCartney.
