
A Brief History of the Computer

Computers and computer applications touch almost every aspect of our daily lives. Like many
ordinary objects around us, they deserve a clearer understanding of what they are. You may ask "What
is a computer?", "What is software?", or "What is a programming language?" First, let's examine
the history.

1. The history of computers starts about 2,000 years ago with the birth of the abacus, a wooden rack
holding two horizontal wires with beads strung on them.

2. Blaise Pascal is usually credited with building the first digital computer in 1642. It added numbers
entered with dials and was made to help his father, a tax collector. The basic principle of his
calculator is still used today in water meters and modern-day odometers.

(The contributions of Charles Babbage, Tommy Flowers and Alan Turing are recalled in Jeremy
Clarkson's 2004 BBC documentary Inventions That Changed the World.)

The history of computer development is divided into five main generations. Each generation is
characterized by a major technological development that fundamentally changed the way computers
operate, resulting in increasingly smaller, cheaper, more powerful, more efficient and more
reliable devices.

First Generation - 1940-1956: Vacuum Tubes

"First Generation" computers worked on zeros and ones, i.e., binary. All of these early computers
used vacuum tubes to perform their calculations. One development among these first computers was
the use of an internally stored program: binary programs were stored electronically inside the
machine.
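The idea that both data and programs can be stored as patterns of zeros and ones is easy to demonstrate with a short sketch (modern Python is used here purely for illustration, not as anything a first-generation machine could run):

```python
# Every value inside a computer is ultimately a pattern of bits (zeros and ones).
number = 42
bits = bin(number)           # convert the number to its binary representation
print(bits)                  # 0b101010

# The same bit pattern can be read back as a number.
print(int("101010", 2))      # 42

# Text is encoded the same way: each character maps to a numeric code,
# which in turn is stored as bits.
for char in "Hi":
    print(char, bin(ord(char)))
```

The key point, already present in those first machines, is that instructions are encoded the same way as data, so a program itself can be held in the computer's memory.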

Vacuum tubes were also used for circuitry, and magnetic drums for memory; these machines were
often enormous, taking up entire rooms. They were very expensive to operate and, in addition to
using a great deal of electricity, generated a lot of heat, which was often the cause of
malfunctions. First generation computers relied on machine language to perform operations, and
they could only solve one problem at a time. Input was based on punched cards and paper tape, and
output was displayed on printouts. The UNIVAC computers with their command-line interface are
examples of first-generation computing.

Second Generation - 1956-1963: Transistors

In 1947, Bell Laboratories invented the transistor. This creation sparked the production of a wave
of "Second Generation" computers. Using silicon was an improvement because it could withstand
higher temperatures. By using transistors in place of vacuum tubes, manufacturers could produce
more reliable computers. Using transistors was also less expensive than building a computer with
vacuum tubes.

People soon found that computers were very good at data processing. By feeding the computer
input via punch cards, the computer could easily sort the data and then print out the sorted
material. Computer companies started to produce two different types of computers. For scientists
and engineers, they built large, powerful computers which were good at performing calculations.
For banks and insurance companies, computer manufacturers produced smaller, fast computers
which were good at sorting and printing. Computer companies found that it was expensive to
produce two different lines of computers, so they set to work to develop a computer which could
perform both calculations and data processing equally well.

These were also the first computers that stored their instructions in their memory, which moved
from magnetic drum to magnetic core technology. The first computers of this generation were
developed for the atomic energy industry.

Third Generation - 1964-1971: Integrated Circuits

The first integrated circuit was made in 1958, and it came to power "Third Generation"
computers. This invention has led to the widespread use of computers today. Scientists found a
way to reduce the size of transistors so they could place hundreds of them on a small silicon
chip. This enabled computer manufacturers to build smaller computers.
At about this same time, the concept of a programming language was developed. Programmers could
now tell the computer to add two numbers by simply using the add command in the language. The
introduction of programming languages enabled this third generation of computers to contain
something called an operating system. The operating system keeps the various pieces of the
computer running together smoothly.
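The convenience that a programming language brought over machine language can be sketched as follows (Python is used purely as a stand-in; the languages of this era were the likes of FORTRAN and COBOL, and the "registers" below are a simplified illustration, not any real instruction set):

```python
# In a high-level language, adding two numbers is a single readable statement.
a, b = 3, 4
total = a + b
print(total)  # 7

# A machine-language program would instead spell out each hardware step:
# load a into a register, load b into another, add them, store the result.
# A tiny simulation of that style of programming:
registers = {}
registers["R1"] = a                                   # LOAD R1, a
registers["R2"] = b                                   # LOAD R2, b
registers["R1"] = registers["R1"] + registers["R2"]   # ADD  R1, R2
result = registers["R1"]                              # STORE result
print(result)  # 7
```

Both fragments compute the same sum; the high-level version simply hides the step-by-step register bookkeeping, which is exactly what made programming languages such an advance.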

Fourth Generation - 1971-Present: Microprocessors

In 1971, Intel created the first microprocessor. The microprocessor was a large-scale integrated
circuit which contained thousands of transistors. The transistors on this one chip were capable
of performing all of the functions of a computer's central processing unit. The reduced size,
reduced cost, and increased speed of the microprocessor led to the creation of the first personal
computers.

In 1976, the first Apple computer was built. Then, in 1981, IBM introduced its first personal
computer. The hardware changed: the computers were faster, had more memory, and were relatively
inexpensive. But the large increase in home use of computers has come about as the result of an
increase in the quantity and quality of the software available.

Originally, there was no software available, so some people wrote their own code. Companies now
produce software to help people do word processing, balance their checkbooks, and store phone
numbers. In addition to the many programs designed for adults, many software products are geared
towards children, in particular, video games. The first video games appeared in 1975, but they
were nothing like the games of today. The increased processing speed and memory in computers has
led to an increase in the quality of computer graphics.

As these small computers became more powerful, they could be linked together to form networks,
which eventually led to the development of the Internet. Fourth generation computers also saw the
development of the graphical user interface (GUI), Bluetooth, infrared and handheld devices.

Fifth Generation - Present and Beyond: Artificial Intelligence

Fifth generation computing devices, based on artificial intelligence, are still in development,
though some applications, such as voice recognition, are being used today. The use of artificial
intelligence technology will radically change the face of computers in years to come. The goal of
fifth-generation computing is to develop devices that respond to natural language input and are
capable of learning and self-organization.