
Anne Lorraine C. Tugab
Introduction to Computer
CS 1

A Brief History of Computers

Webster's Dictionary defines "computer" as any programmable
electronic device that can store, retrieve, and process data. The basic
idea of computing develops in the 1200s, when a Muslim cleric
proposes solving problems with a series of written procedures.

As early as the 1640's mechanical calculators are manufactured for

sale. Records exist of earlier machines, but Blaise Pascal invents the
first commercial calculator, a hand powered adding machine. Although
attempts to multiply mechanically were made by Gottfried Leibniz in
the 1670s, the first true multiplying calculator appears in Germany
shortly before the American Revolution.

In 1801 a Frenchman, Joseph-Marie Jacquard builds a loom that weaves

by reading punched holes stored on small sheets of hardwood. These
plates are then inserted into the loom, which reads (retrieves) the
pattern and creates (processes) the weave. Powered by water, this
"machine" came 140 years before the development of the modern
computer.

The 1890 census is tabulated on punch cards similar to the ones used
90 years earlier to create weaves. Developed by Herman Hollerith of
MIT, the system uses electric power (non-mechanical). The Hollerith
Tabulating Company is a forerunner of today's IBM.

Just prior to the introduction of Hollerith's machine the first printing

calculator is introduced. In 1892 William Burroughs, a sickly ex-teller,
introduces a commercially successful printing calculator. Although
hand-powered at first, Burroughs quickly introduces an electrically
driven model.

In 1925, unaware of the work of Charles Babbage, Vannevar Bush of

MIT builds a machine he calls the differential analyzer. Using a set of
gears and shafts, much like Babbage's designs, the machine can handle
simple calculus problems, but accuracy is a problem.

The period from 1935 through 1952 gets murky with claims and
counterclaims of who invents what and when. Part of the problem lies
in the international situation that makes much of the research secret.
Other problems include poor record-keeping and deception.

In 1935, Konrad Zuse, a German construction engineer, builds a

mechanical calculator to handle the math involved in his profession.
Shortly after completion, Zuse starts on a programmable mechanical
device, the Z1, which he completes in 1938.

John Vincent Atanasoff begins work on a digital computer in 1936 in

the basement of the Physics building on the campus of Iowa State. A
graduate student, Clifford (John) Berry assists. The "ABC" is designed
to solve linear equations common in physics. It displays some early
features of later computers including electronic calculations. He shows
it to others in 1939 and leaves the patent application with attorneys for
the school when he leaves for a job in Washington during World War II.
Unimpressed, the school never files the patent, and the ABC is
cannibalized by students.

The Enigma, a complex mechanical encoder, is used by the Germans

and they believe it to be unbreakable. Several people involved, most
notably Alan Turing, conceive machines to handle the problem, but
none are technically feasible. Turing proposes a "Universal Machine"
capable of "computing" any algorithm in 1937. That same year George
Stibitz creates his Model K(itchen), a conglomeration of otherwise
useless and leftover material, to solve complex calculations. He
improves the design while working at Bell Labs and on September 11,
1940, Stibitz uses a teletype machine at Dartmouth College in New
Hampshire to transmit a problem to his Complex Number Calculator in
New York and receives the results. It is the first example of a network.


First Generation - 1940-1956: Vacuum Tubes

The first computers used vacuum tubes for circuitry and magnetic
drums for memory, and were often enormous, taking up entire rooms.
They were very expensive to operate and in addition to using a great
deal of electricity, generated a lot of heat, which was often the cause
of malfunctions.
First generation computers relied on machine language, the lowest-
level programming language understood by computers, to perform
operations, and they could only solve one problem at a time. Input was
based on punched cards and paper tape, and output was displayed on
printouts.

The UNIVAC and ENIAC computers are examples of first-generation

computing devices. The UNIVAC was the first commercially produced
computer; the first unit was delivered to the U.S. Census Bureau in 1951.

Second Generation - 1956-1963: Transistors

Transistors replaced vacuum tubes and ushered in the second
generation of computers. The transistor was invented in 1947 but did
not see widespread use in computers until the late 1950s. The transistor
was far superior to the vacuum tube, allowing computers to become
smaller, faster, cheaper, more energy-efficient and more reliable than
their first-generation predecessors. Though the transistor still
generated a great deal of heat that subjected the computer to
damage, it was a vast improvement over the vacuum tube. Second-
generation computers still relied on punched cards for input and
printouts for output.

Second-generation computers moved from cryptic binary machine

language to symbolic, or assembly, languages, which allowed
programmers to specify instructions in words. High-level programming
languages were also being developed at this time, such as early
versions of COBOL and FORTRAN. These were also among the first
computers to store their instructions in memory, which moved from
magnetic drum to magnetic-core technology.
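
The jump from numeric machine language to symbolic assembly can be sketched with a toy assembler: a program that translates word-like mnemonics into the numeric codes a machine actually executes. The mnemonics, opcodes, and addresses below are invented for illustration and do not correspond to any historical machine.

```python
# Hypothetical opcode table for a toy machine (not a real instruction set).
OPCODES = {"LOAD": 0x1, "ADD": 0x2, "STORE": 0x3, "HALT": 0xF}

def assemble(lines):
    """Translate 'MNEMONIC operand' lines into (opcode, operand) pairs."""
    program = []
    for line in lines:
        parts = line.split()
        mnemonic = parts[0]
        operand = int(parts[1]) if len(parts) > 1 else 0
        program.append((OPCODES[mnemonic], operand))
    return program

# A symbolic program: load the value at address 10, add the value
# at address 11, store the result at address 12, then stop.
source = ["LOAD 10", "ADD 11", "STORE 12", "HALT"]
machine_code = assemble(source)
# machine_code now holds the numeric pairs a first-generation
# programmer would have had to write (and punch) entirely by hand.
```

The point of the sketch is the translation step itself: second-generation programmers wrote the readable lines in `source`, and a program like `assemble` produced the numbers.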

The first computers of this generation were developed for the atomic
energy industry.

Third Generation - 1964-1971: Integrated Circuits

The development of the integrated circuit was the hallmark of the third
generation of computers. Transistors were miniaturized and placed on
silicon semiconductor chips, which drastically increased the speed and
efficiency of computers.

Instead of punched cards and printouts, users interacted with third

generation computers through keyboards and monitors and interfaced
with an operating system, which allowed the device to run many
different applications at one time with a central program that
monitored the memory. Computers for the first time became accessible
to a mass audience because they were smaller and cheaper than their
predecessors.

Fourth Generation - 1971-Present: Microprocessors
The microprocessor brought the fourth generation of computers, as
thousands of integrated circuits were built onto a single silicon chip.
What in the first generation filled an entire room could now fit in the
palm of the hand. The Intel 4004 chip, developed in 1971, located all
the components of the computer - from the central processing unit and
memory to input/output controls - on a single chip.

In 1981 IBM introduced its first computer for the home user, and in
1984 Apple introduced the Macintosh. Microprocessors also moved out
of the realm of desktop computers and into many areas of life as more
and more everyday products began to use microprocessors.

As these small computers became more powerful, they could be linked

together to form networks, which eventually led to the development of
the Internet. Fourth generation computers also saw the development of
GUIs, the mouse and handheld devices.

Fifth Generation - Present and Beyond: Artificial Intelligence

Fifth generation computing devices, based on artificial intelligence, are
still in development, though there are some applications, such as voice
recognition, that are being used today. The use of parallel processing
and superconductors is helping to make artificial intelligence a reality.
Quantum computation and molecular and nanotechnology will radically
change the face of computers in years to come. The goal of fifth-
generation computing is to develop devices that respond to natural
language input and are capable of learning and self-organization.


ABACUS-A manual computing device consisting of a frame holding
parallel rods strung with movable counters.

NAPIER’S BONES-A set of graduated rods used to perform multiplication.



PASCAL’S CALCULATOR-A calculating machine developed in 1642 by

French mathematician Blaise Pascal. It could only add and subtract, but
gained attention because 50 units were placed in prominent locations
throughout Europe. Accountants expressed grave concern that they
might be replaced by the machine.