
Brief History of Computers

The history of computers goes back thousands of years, beginning with the abacus. The earliest known abacus, the Sumerian abacus, dates from around 2700 BC and originated in Mesopotamia. But first we should ask: what is a computer? A computer is simply a machine that executes sequences of logical or arithmetic operations in accordance with a set of instructions. When we think of modern computers, we don't usually think of them as mere calculators; yet, at their core, that is precisely what they are.

Computers are commonly divided into five generations:

1st Generation: This period ran from 1940 to 1955, when machine language was developed for computers. These machines used vacuum tubes for circuitry and magnetic drums for memory. They were complicated, large, and expensive, and relied mostly on batch operating systems and punch cards. Magnetic tape and paper tape served as input and output media. Examples include the ENIAC, UNIVAC I, and EDVAC.

ENIAC: Electronic Numerical Integrator and Computer
UNIVAC I: Universal Automatic Computer I
EDVAC: Electronic Discrete Variable Automatic Computer

ENIAC - The first programmable, electronic, general-purpose digital computer, completed in 1946. Its function was to calculate artillery firing tables for the Ballistic Research Laboratory of the United States Army, in support of US troops during World War II.

UNIVAC I - The first general-purpose electronic digital computer designed in the United States for use in business applications. It was delivered on March 31, 1951.

EDVAC - One of the earliest computers to represent data in binary rather than decimal. It was built in the 1940s.
2nd Generation: The years 1957-1963 are referred to as the second generation of computers. In second-generation machines, assembly languages were employed, and high-level programming languages such as COBOL and FORTRAN appeared. Vacuum tubes gave way to transistors, making computers smaller, faster, and more energy-efficient, and programming advanced from binary machine code to assembly languages. Examples include the IBM 1620, IBM 7094, CDC 1604, and CDC 3600.


IBM 1620 - Announced on October 21, 1959, it was marketed as an inexpensive scientific computer. Its general purpose was data processing for small businesses and schools that needed solutions to complex problems, such as those in engineering and research.
IBM 7094 - Intended mainly for scientific computation, but also suitable for administrative and business purposes. It was introduced in January 1962.

CDC 1604 - Its functions included controlling weapons systems, real-time data processing, large-scale scientific problem solving, and commercial applications. It was introduced in October 1959.

CDC 3600 - Used for numerical calculations. Introduced in June 1963.

3rd Generation: The hallmark of this period (1964-1971) was the development of the integrated circuit. A single integrated circuit (IC) contains many transistors, which increases the power of a computer while simultaneously lowering its cost. These computers were quicker, smaller, more reliable, and less expensive than their predecessors. High-level programming languages such as FORTRAN (II through IV), COBOL, PASCAL, and PL/1 were utilized. Examples include the IBM 360 series, the Honeywell 6000 series, and the IBM 370/168.


IBM 360 - The first family of computers designed to cover a wide range of applications, from small to large, in both scientific and commercial contexts. It was announced on April 7, 1964.
Honeywell 6000 series - A "memory-oriented" system with a system controller in each memory module that sorted out requests from other system components. In production from 1970 to 1989.

IBM 370/168 - A large-scale system built with virtual storage features for use in scientific and commercial applications. Virtual storage effectively removes the constraints of processor storage from the system and problem programs. It was first produced on August 7, 1972.
4th Generation: The invention of the microprocessor brought along the fourth generation of computers, which dominated the years 1971-1980. Programming languages such as C, with C++ and Java arriving later, are associated with this generation. Examples include the STAR 1000, PDP 11, CRAY-1, CRAY X-MP, and Apple II. This was when computers began to be produced for home use.


STAR 1000 - Known as one of the first supercomputers, it was used to improve performance on scientific applications. It was made public in the early 1970s.

PDP 11 - A typical minicomputer for general-purpose computing, including timesharing, scientific, educational, medical, and business computing. It was introduced in 1970.

CRAY X-MP - A supercomputer designed to provide the greatest possible computational power from the smallest form factor of microchips, or microprocessor die.
5th Generation: These computers have been in use since 1980 and continue to be used today. This is the present and the future of the computer world. The defining aspect of this generation is artificial intelligence; parallel processing and superconductors are making this a reality and offer great scope for the future. Fifth-generation computers use ULSI (Ultra Large Scale Integration) technology and are the most recent and sophisticated machines. Programming languages such as C, C++, Java, .NET, and others are used. Examples include IBM and Pentium machines in desktop, laptop, notebook, and ultrabook form.


ULSI - Packs very large numbers of transistors onto a single microchip, providing great computational power in a small form factor. It was introduced around 1984.
