
The history of computers is not just the story of a specialized machine. It's also the story of a great idea and the people who made it happen. Many historians place the origin of the modern computer with Charles Babbage. He created the Analytical Engine in the late 1860s. It was essentially a mechanical calculator made from gears, levers, and rods. Like modern calculators it could add, subtract, multiply, and divide, but it was not a computer. My definition of a computer is anything that fulfills the four main functions every computer has: computation, the ability to take numbers and do things with them; memory, the ability to remember and act on other parts of the machine; storage; and input and output.

Over time, many people would improve on Babbage's ideas until December of 1941, when Konrad Zuse, a German engineer, completed his Z3 calculator. It was the first program-controlled, general-purpose calculator that actually worked. The Z3 was electromechanical: it used telephone relays made of iron rods and wire coils, and these magnetic relays acted as switches in memory and computation. The Z3 used a binary system of ones and zeros; each unique combination of the digits 1 and 0 represents a letter of the alphabet or a number in the decimal system. A bit is one of these binary digits. The Z3's internal operation was based on Boolean logic, which works with simple true/false values and operations like AND, OR, and NOT. The binary system and Boolean logic are common to nearly all computers today.
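As a minimal illustration (a Python sketch using the modern 8-bit ASCII convention purely as an example, not the Z3's own word format), the snippet below shows how unique patterns of 1s and 0s can stand for a number or a letter, and how the Boolean operations AND, OR, and NOT act on single bits, the same true/false decisions a relay or tube switch can make.

    # Illustrative sketch only: modern 8-bit patterns, not the Z3's own format.
    def to_bits(n, width=8):
        """Return an integer's binary digits as a string, e.g. 5 -> '00000101'."""
        return format(n, f"0{width}b")

    # A decimal number and a letter, each reduced to a unique pattern of 1s and 0s.
    print(to_bits(19))          # 00010011  (the decimal number 19)
    print(to_bits(ord("A")))    # 01000001  (the letter 'A' in ASCII)

    # Boolean logic on single bits: the operations AND, OR, and NOT.
    for a in (0, 1):
        for b in (0, 1):
            print(f"a={a} b={b}  AND={a & b}  OR={a | b}  NOT a={1 - a}")

Running it prints the bit patterns for 19 and for the letter A, followed by the familiar truth tables for AND, OR, and NOT.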

While Zuse was quietly building the Z3, John Atanasoff at Iowa State University was getting tired of doing complicated calculations on adding machines, so he started looking for a solution. Atanasoff: “There was no machine in existence suitable for my purpose, and so I commenced to study the possibility of building a new machine.” One night, Professor Atanasoff became very frustrated. He was trying to think of a quick and effortless way to do calculations. He jumped in his car and drove for hours while trying to think of a solution. By the end of his drive he had created in his head a blueprint for a computer. Atanasoff: “During the next year or so I commenced to invent the elements of this structure.”
Atanasoff completed his computer, the ABC, in the spring of 1942. It had a rotating drum filled with capacitors for memory; capacitors are electronic components that hold electrical charges. It used vacuum tubes as switches that could represent numbers by turning on and off: a switch in one position can represent a zero and in the other a one. So it, too, used the binary system and Boolean logic. But the ABC was not easy to program.

The first easily programmable calculator in the United States was developed at Harvard University. The IBM Automatic Sequence Controlled Calculator became operational in August of 1944. It was more commonly known as the Harvard Mark 1. Because of World War II, IBM and Harvard were unaware of Zuse's work, so they assumed the Harvard Mark 1 was the first program-controlled calculator. It was 51 feet long and 8 feet high, and it was built from roughly 765,000 electromechanical parts. It was primarily used for solving military problems such as firing tables and atomic bomb calculations. The Navy sent Lieutenant Grace Hopper to program the Harvard Mark 1 and later the Mark 2 and Mark 3. She was a pioneer in computer programming. Once, one of her programs was running on the Mark 2 when it suddenly stopped. Hopper and her co-workers discovered a dead moth caught in one of the mechanical relays. This event gave rise to the term “debugging” a program.

The ENIAC was a phenomenal machine; if it was not quite a computer, it was a super calculator. The ENIAC project involved J. Presper Eckert as chief engineer and John Mauchly as senior consultant. They began working on the ENIAC in 1943 at the University of Pennsylvania. The ENIAC, or Electronic Numerical Integrator and Computer, could perform 5,000 additions and 360 multiplications per second. ENIAC filled a room about 30 by 50 feet and weighed nearly 30 tons. It was programmed by using switches and plug boards. Doing a firing table by hand might take 40 days; on ENIAC it could take a week or two to set up the problem, but then just 20 minutes to crank out the answer. It took a huge amount of electricity to run. Many scientists claimed that just turning ENIAC on would dim the lights of Philadelphia. ENIAC was completed in February of 1946. Many historians say it was not the first true computer, but it did open the door to the computer age.

Computer historians are still arguing over which computer was first. Was it the Z3, the ABC, ENIAC, EDVAC, EDSAC, or the Manchester Mark 1? Before ENIAC was completed, J. Presper Eckert and John Mauchly teamed up with the famous mathematician John von Neumann to work on a computer they called EDVAC. EDVAC, or Electronic Discrete Variable Automatic Computer, used 3,600 vacuum tubes and was supposed to be the first stored-program computer. That means the programs are stored in memory right along with the data to be processed.
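To picture what “stored in memory right along with the data” means, here is a toy sketch in Python. It illustrates only the general concept, with a made-up three-instruction machine; it is not the EDVAC's actual instruction set. The program and the data sit in one shared memory, and the machine simply reads numbers out of that memory, treating some as instructions and some as data.

    # Toy stored-program machine: instructions and data share one memory.
    # The opcodes (1=LOAD, 2=ADD, 3=STORE) are invented for this sketch only.
    memory = [
        1, 6,      # LOAD  the value at address 6 into the accumulator
        2, 7,      # ADD   the value at address 7 to the accumulator
        3, 8,      # STORE the accumulator into address 8
        40, 2, 0,  # data: 40, 2, and an empty slot for the result
    ]

    pc, acc = 0, 0                     # program counter and accumulator
    while pc < 6:                      # addresses 0-5 hold the program
        op, addr = memory[pc], memory[pc + 1]
        if op == 1:                    # LOAD
            acc = memory[addr]
        elif op == 2:                  # ADD
            acc += memory[addr]
        elif op == 3:                  # STORE
            memory[addr] = acc
        pc += 2

    print(memory[8])                   # prints 42: the result landed back in memory

Because the program is just more numbers in memory, a stored-program machine can load a new program the same way it loads new data, which is what set EDVAC-style designs apart from machines programmed with switches and plug boards.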

The EDVAC project began in 1944, long before these other machines, but its construction took a very long time. While EDVAC was being built, the University of Manchester in England began and completed their computer, the Manchester Mark 1. It was built by Max Newman, Freddie Williams, and others, who used many of the ideas of Eckert, Mauchly, and von Neumann in designing and building what many believe was the first truly programmable computer. The first program was run on June 21st, 1948. It would be four more years before EDVAC was completed. So, in principle, the Manchester Mark 1 was the first running programmable computer, but it was a prototype. Its commercial version, the Ferranti Mark 1, wouldn't operate until February of 1951. Some historians consider another British computer, the EDSAC, to be the first stored-program computer. Its construction was directed by Maurice Wilkes at Cambridge University. Wilkes took a trip to the United States, where he talked with Eckert, Mauchly, and others about computer design. Wilkes: “and I gradually found myself being gripped by the idea that, come what come may, we were going to build a computer in Cambridge.” The EDSAC was based on the EDVAC design; that's why Wilkes chose a similar name. EDSAC was made operational in May of 1949, after the Manchester Mark 1 prototype but before the Ferranti Mark 1 or the EDVAC.

J. Presper Eckert and John Mauchly left the EDVAC project in 1947. They formed a company to make UNIVAC: the Universal Automatic Computer. But before they could build the UNIVAC computer, Northrop Aircraft Company hired them to build a different computer: the BINAC. It was the first stored-program computer in the United States, but it was not a general-purpose computer; it was designed for military use. BINAC was two computers which carried out operations simultaneously and compared the results. They ran in real time and responded to input immediately. This was ideal for flight simulation. The BINAC computer was completed and demonstrated in August of 1949.

With BINAC completed, Eckert and Mauchly could finish building their own UNIVAC. UNIVAC was not a government-funded computer; it was the first general-purpose electronic stored-program computer to be produced and sold to businesses as well as the government. The first was sold to the US Census Bureau and delivered in March of 1951. UNIVAC also ran an early user-friendly programming language developed by Grace Hopper, who worked to make high-level languages the standard for programmers. Standards make life much easier for everybody; in the computer world they make things better for the vendors, the users, and everyone concerned with computers. UNIVAC was the first computer to come equipped with a magnetic tape drive, and it was the first to use buffer memory. It took a presidential election to make UNIVAC a household name: in 1952 it accurately predicted Eisenhower as the next president. J. Presper Eckert saw that computers would become smaller, faster, cheaper, and a big part of our lives. Eckert: “It will be possible for machines to take over many of the boring and repetitive tasks which are properly the work of machines, and human beings will then be left to use their creative ability to make use of the fruits of this more productive society.”

Back in 1949, Jay Forrester conceived of magnetic, or iron core, memory. A magnet can have a northern polarity or a southern polarity; one polarity would represent a 1 and the other a 0. After four years of development, Forrester installed the new memory in an MIT-designed computer called Whirlwind. Its operating speed doubled.

In the late 1940s Thomas Watson Sr., the chairman of IBM, thought that a handful of big and powerful computers was all our nation would ever need, so he kept IBM out of the computer business. His son, Thomas Watson Jr., realized the potential market for computers and eventually convinced his father to build one. In 1953, IBM introduced the Model 701. They were ready to compete with UNIVAC. Iron core memory was too new for IBM, so, like the UNIVAC, the 701 used vacuum tubes. IBM called the 701 an “electronic data processing machine”; they wanted to avoid the word “computer,” which people associated with UNIVAC. García: “UNIVAC was ‘computer,’ so if you said UNIVAC, what you meant was the concept of the computer. People would say, oh look, Los Alamos just bought a new IBM UNIVAC.” In May of 1954, IBM announced the Model 704. It used the new iron core memory and it had floating-point arithmetic. The 704 had a programming language created for it called Fortran; this would allow customers to do their own programming. Fortran is still in use today.

The transistor was invented at Bell Labs in December of 1947, but it would take years to refine it for computer use. García: “The transistor works as a switch, so it can be in an on or off position, much like a vacuum tube of earlier generations.” In 1957 Philco completed the first general-purpose transistorized computer; it was called SOLO and was built for the National Security Agency. There are some real benefits to using transistors: they're reliable, inexpensive, and small. Digital Equipment Corporation was the first company to take advantage of the transistor's size. They introduced the PDP-1 in 1960. It was the world's first interactive, transistorized minicomputer, and it was about the size of a refrigerator. While DEC was thinking small with its PDP-1, IBM was thinking big.

In April of 1964, IBM announced the System/360 computer. This was a major step for IBM in becoming the world leader in the business computer market. The System/360 was a family of nine processors and 70 peripheral devices; companies could buy a small system and upgrade it as they grew. By 1968 IBM had installed over 14,000 System/360 computers. This gave IBM the lead in computer manufacturing, and no other company could come close, not even UNIVAC.

DEC specialized in minicomputers. They introduced the PDP-8 in 1965; it was designed for smaller companies and institutions that didn't need or couldn't afford a mainframe computer. The PDP-8, or Programmed Data Processor-8, from Digital Equipment Corporation was a really important machine in that it was the first transistorized, mass-produced machine to be widely available. Businesses, universities, even high schools wanted them because they were so affordable: about $18,000. Transistors would eventually be combined into integrated circuits. In the summer of 1958, Jack Kilby started working at Texas Instruments. Most of the workers were on vacation and he hadn't been assigned to a project, so to avoid boredom he came up with his own project. He designed an integrated circuit that had components like transistors, resistors, and capacitors all built into one block of silicon. Kilby's innovation would revolutionize the computer world. Integrated circuits would increase the power and efficiency of future computers while greatly reducing their size.

Texas Instruments built the first integrated circuit computer for the Air Force. It weighed 10 ounces but had the same power as a 15-pound transistor computer. But it wouldn't be until 1965 that the integrated circuit, or chip, would be used in a commercial computer. Integrated circuits needed refinement, and that refinement would come from a company called Fairchild Semiconductor. Robert Noyce led a team that found a practical way of designing and manufacturing integrated circuits. Gordon Moore, who was also on the Fairchild team, determined that every 18 months the number of components that could be placed on an integrated circuit would double. This is known as Moore's law (a short compounding sketch follows this paragraph). Throughout the 1960s, integrated circuits were used in Minuteman missiles and for the Apollo space missions, which had a goal of landing a human on the moon: “one small step for man, one giant leap.”
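As a rough arithmetic illustration of that 18-month doubling (the starting count of 64 and the time spans below are arbitrary, made-up numbers, not historical figures):

    # Rough sketch of Moore's law as quoted above: components double every 18 months.
    def components(start, years, doubling_months=18):
        doublings = (years * 12) / doubling_months
        return start * 2 ** doublings

    for years in (3, 6, 9, 12):
        print(f"after {years:2d} years: about {components(64, years):,.0f} components")
    # after  3 years: about 256 components
    # after  6 years: about 1,024 components
    # after  9 years: about 4,096 components
    # after 12 years: about 16,384 components

Eight doublings in twelve years multiply the starting count by 256, which is the kind of compounding the law describes.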

As integrated circuits became smaller, it was realized that an entire computer could be placed on a chip. In 1971, Ted Hoff of Intel Corporation produced the first commercially available microprocessor: the Intel 4004. It contained approximately 2,500 transistors and it sparked the personal computer age. There were several machines that tried to be the first successful personal computer, but it was the one called the Altair that caught on. García: “The Altair is the first real mainstream personal computer, the one that got out the largest, was sold in mass quantities, and was available nationally.” The Altair was invented in 1974 at MITS, an electronics company in Albuquerque, New Mexico. It was created by the owner of the company, H. Edward Roberts; he's the person who coined the term “personal computer.” Roberts used the new Intel 8080 microprocessor and built the Altair around it. It had 4 kilobytes of memory and sold as a kit for $397 or fully assembled for $498. It was as powerful as any minicomputer available, but personal computers had three drawbacks: they needed mass storage, a simple programming language, and an operating system. The storage problem was overcome by using the IBM floppy disk. The software problem was overcome by two programmers: Paul Allen and Bill Gates. They wrote a version of BASIC (Beginner's All-purpose Symbolic Instruction Code) that worked well on the Altair. Gates and Allen formed a software company called Microsoft. The operating system for the PC was provided by Gary Kildall; he created CP/M, or Control Program for Micros. By the end of 1976 all the pieces were in place for a personal computer boom.

The first ripple of the boom came from a newly formed company called Apple Computer. The two founders of Apple were Steve Jobs and Steve Wozniak. They produced the Apple 1 single-board computer. It was simple yet effective and sold for the curious amount of $666.66. Encouraged by sales of their board, Jobs and Wozniak unveiled the Apple 2 computer in April of 1977. It was the first personal computer to generate color graphics. The Apple 2 included a keyboard, power supply, and attractive case. It came standard with 48K of memory and sold for $1,298. García: “The Apple 2 was an important machine because it was easy to use, so easy that children were introduced to it, typically at an elementary school level and, in the case of personal users, even earlier.” Apple had competition from nearly 30 other companies; with so much competition there was no single leader. The one company that could take control of the personal computer business was IBM, but they weren't interested in PCs yet. It was four years later, in August of 1981, when IBM introduced the IBM Personal Computer. The IBM PC came with 64 kilobytes of memory and sold for $1,364, or $2,880 fully equipped. The IBM PC also had a new operating system called PC-DOS, also known as MS-DOS. The MS stands for Microsoft, the same company that provided BASIC for the Altair.

Sales of all PCs were high and everyone was making money through most of 1983, but it didn't last long. By late 1983 sales were falling fast. Personal computers had been oversold; the software did not live up to expectations and the computers were not user-friendly. They were disappointing. To overcome this downturn in sales, Apple Computer came up with a risky plan: they put the fate of the company on one product. In January of 1984, Apple unveiled the Macintosh computer. The Mac sold for $2,495. It had a mouse, a keyboard, graphics, and icons. The graphical user interface looked something like an electronic office and was user-friendly. Susan Kare was the graphic artist who gave the Mac that friendly look. She created its various icons, such as the grabbing hand, the pointer, and the watch. The mouse was invented by Doug Engelbart at Stanford Research Institute. Engelbart: “Let's see, the idea for a mouse came to me around 1960; that all sort of unfolded for me in a period of about four or five minutes.” The Mac seemed revolutionary, but most of its features had been invented or developed years earlier, primarily at the Xerox Palo Alto Research Center. Xerox didn't know what to do with these features, but Apple did. In 1985 the answer for IBM PCs arrived: Microsoft created Windows, an operating system that would compete with the Mac. Windows made all PCs easier to use and more in demand, because there were more programs being written for IBM-compatible PCs than for Macs.

In the years that followed, computers continued growing in power, decreasing in cost, and shrinking in size. We have notebook-sized computers that offer full PC functionality in a package measuring just 8.5 by 11 inches. We have pocket PCs and even wristwatch PCs, and of course cell phones that connect us to the internet. The internet began as ARPANET in 1969, a system for scientific computers to share data. Eventually this led to the internet, which is a worldwide communication, business, and information system. Bill Gates: “It's a very key phenomenon that when something gets popular enough, it starts to reinforce its own popularity: the more people who use the internet, the more great content is there, and the more great content there is, the more people use it.”

García: “The future of computers is pretty much on a line, going toward that ultimate: fast, cheap, small. The question then, of course, becomes: well, what will that allow us to do?” Artificial intelligence is when a computer can communicate with a human like a human; someday we might have a computer friend, one that might quip, “By people I assume you mean persons, not People magazine.” Artificial intelligence requires a huge amount of memory, and holographic memory is one option: a piece of plastic the size of a CD holds 400 gigabytes of information, about 100 feature-length films. The future computer may be a quantum computer, which uses laser-controlled ions as bits. A quantum computer has a quantum bit, or qubit, whose state can be 1 or 0, or both 1 and 0 at the same time (a toy sketch follows this paragraph). This effectively allows it to run many calculations at the same time. The future computer may not be one computer but a network of unseen specialized computers surrounding you and working together to serve you.
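As a loose classical analogy for the qubit described above (a toy Python sketch, not a model of real laser-controlled ions or any actual quantum hardware), a qubit can be pictured as a pair of amplitudes whose squares give the odds of reading a 0 or a 1; measuring it forces a definite digit.

    import random

    # Toy picture of one qubit: two amplitudes, one for 0 and one for 1.
    # An equal superposition is "both 1 and 0 at once" until it is measured.
    amp0 = amp1 = 2 ** -0.5            # 1/sqrt(2) each

    p0, p1 = amp0 ** 2, amp1 ** 2      # probabilities of reading 0 or 1 (sum to 1)
    print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")

    # Measurement forces a definite classical bit.
    print("measured:", 0 if random.random() < p0 else 1)

With many qubits, those overlapping possibilities are what would let a quantum machine effectively explore many calculations at once.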
