Evolution of Computers

Computers have been around far longer than many people imagine. The meaning of the word "computer" has shifted over the decades, but the electronic computer we think of today developed throughout the second half of the 20th century. Its popularity as a household item surged in the 1980s following the arrival of Apple and Microsoft operating systems that mixed graphics and text, replacing the text-only systems of the 1970s. By the 1990s, computers incorporated enhanced communication and multimedia applications and became an indispensable part of daily life for millions of people.

Early Computing

The original definition of the word "computer" was a person who made calculations. This definition goes back to the 1600s and extended into the mid-20th century, when the term "computer" began to refer to a machine. The computer is based on the same concept as the abacus, which goes back many centuries. Technology made a giant leap with punched cards, introduced by Joseph-Marie Jacquard in 1801. Interestingly, an early use of this system involved music: piano rolls assigned actions to notes on a piano, leading to the "player piano" in the 1870s. In 1835 Charles Babbage combined punched cards with a steam engine in the design of what he called an "analytical engine."

In the beginning, when the task was simply counting or adding, people used either their fingers or pebbles along lines in the sand. To simplify the process of counting, people in Asia Minor built a counting device called the abacus, which allowed users to do calculations using a system of sliding beads arranged on a rack.

With the passage of time, many computing devices such as Napier's bones and the slide rule were invented, though advances in computing devices came only over many centuries. In 1642, the French mathematician Blaise Pascal invented the first functional automatic calculator. The brass rectangular box, also called the Pascaline, used eight movable dials and could add sums of up to eight figures.

In 1694, the German mathematician Gottfried Wilhelm von Leibniz extended Pascal's design to perform multiplication and division and to find square roots. This machine is known as the Stepped Reckoner. Its only problem was that it lacked mechanical precision in its construction and was not reliable. The real beginning of the computer came with the English mathematician Charles Babbage in 1822. He proposed an engine to compute tables of values by the method of finite differences, called the Difference Engine, which would print results automatically. However, Babbage never quite built a fully functional Difference Engine, and in 1833 he quit working on it to concentrate on the Analytical Engine.
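
The Difference Engine mechanized the method of finite differences: once a polynomial's initial value and differences are set, every further table entry follows from additions alone. As a rough illustration in Python (the polynomial x^2 + x + 1 is an arbitrary example, not a historical one):

    # Tabulate a polynomial from its initial finite differences,
    # using additions only, as the Difference Engine did mechanically.
    def tabulate(initial_differences, steps):
        diffs = list(initial_differences)  # [f(0), first diff, second diff, ...]
        table = [diffs[0]]
        for _ in range(steps):
            # Each difference absorbs the next higher-order difference.
            for i in range(len(diffs) - 1):
                diffs[i] += diffs[i + 1]
            table.append(diffs[0])
        return table

    # f(x) = x^2 + x + 1: f(0) = 1, first difference 2, second difference 2.
    print(tabulate([1, 2, 2], 5))  # [1, 3, 7, 13, 21, 31]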
The basic design of the Analytical Engine included input devices in the form of perforated cards containing operating instructions; a "store" serving as memory for 1,000 numbers of up to 50 decimal digits each; a control unit that allowed instructions to be processed in any sequence; and an output device to produce printed results. Babbage borrowed the idea of punched cards for encoding the machine's instructions from Joseph-Marie Jacquard's loom.

In 1889, Herman Hollerith, who worked for the U.S. Census Bureau, also applied the Jacquard loom concept to computing. The start of World War II created a substantial need for computing capacity, especially for military purposes. One early success was the Mark I, built by IBM and Harvard's Howard Aiken in 1944. In 1946, J. Presper Eckert and John Mauchly of the Moore School of Electrical Engineering developed the ENIAC (Electronic Numerical Integrator and Computer). Later came the EDVAC (Electronic Discrete Variable Automatic Computer), an electronic computer whose stored-program design John von Neumann described. In 1949, Maurice Wilkes developed the EDSAC (Electronic Delay Storage Automatic Calculator). The Eckert-Mauchly Corporation manufactured the UNIVAC (Universal Automatic Computer) in 1951. By 1960, computers had been developed with access times of one microsecond and total capacities of 100,000,000 words. During the 1970s, the trend toward cheaper computers was made possible by integrated circuits (ICs) and microprocessors. Today, computers are built from Very Large Scale Integrated (VLSI) circuits, often programmed using ROM; such a machine can handle 32 bits at a time and process about 4,000,000 instructions per second.

Computer Generations

The term "computer" originally meant a person capable of performing numerical calculations with the help of a mechanical computing device. The evolution of computers started way back in the late 1930s, and binary arithmetic is at the core of computers of every era. The history of computers dates back to the invention of a mechanical adding machine in 1642. The abacus, an early computing tool, the invention of logarithms by John Napier, and the invention of the slide rule by William Oughtred were significant events in the evolution of computers from these early computing devices.
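
Since binary arithmetic underlies every one of these machines, a brief illustration may help. The Python sketch below adds two numbers the way simple adder hardware does, using only bitwise operations; it is purely illustrative:

    # Add two non-negative integers using only the bit operations
    # that adder circuitry implements: XOR for the sum, AND for carries.
    def binary_add(a, b):
        while b:
            carry = a & b     # bit positions where both operands are 1
            a = a ^ b         # sum of each bit pair, ignoring carries
            b = carry << 1    # carries shift one position to the left
        return a

    print(binary_add(0b1011, 0b0110))  # 17, i.e. 0b10001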

The first generation in the evolution of computers was characterized by the use of vacuum tubes. These computers were expensive and bulky. They used machine language for computing, could solve just one problem at a time, and did not support multitasking.

•  It was in 1937 that John V. Atanasoff devised the first digital electronic computer. Atanasoff and Clifford Berry came up with the ABC prototype in November 1939. Its computations were based on vacuum tubes, and it used regenerative capacitor memory.
•  Konrad Zuse's electromechanical "Z machines", especially the Z3 of 1941, were a notable achievement in the evolution of computers. The Z3 was the first machine to include binary and floating-point arithmetic and a considerable amount of programmability. Since it was proved in 1998 to be Turing complete, it is regarded as the world's first operational computer.
•  In 1943, the Colossus was secretly designed at Bletchley Park, Britain, to decode German messages. The Harvard Mark I of 1944 was a large-scale electromechanical computer with limited programmability. It was another step forward in the evolution of computers.

•  The U.S. Army's Ballistics Research Laboratory came up with the Electronic Numerical Integrator And Computer (ENIAC) in 1946. It came to be known as the first general-purpose electronic computer. However, it had to be rewired to change its programming, making its architecture inflexible. ENIAC's developers recognized the flaws in the architecture and developed a better one, known as the stored-program architecture or von Neumann architecture after John von Neumann, who first described it in 1945. Virtually all computer projects undertaken thereafter have used the von Neumann architecture; all computers use a "stored-program architecture", which is now part of the definition of the word "computer" (a minimal sketch of the idea appears after this list).
•  The U.S. National Bureau of Standards came up with the Standards Eastern Automatic Computer (SEAC) in 1950. Diodes handled all the logic, making it the first computer to base its logic on solid-state devices. IBM announced the IBM 702 Electronic Data Processing Machine in 1953. It was developed for business use and could also address scientific and engineering applications. Through the 1950s, all computers in use were vacuum-tube based.
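
As a rough sketch of the stored-program idea the list above refers to, the toy machine below keeps instructions and data in the same memory and runs a fetch-decode-execute loop. The three-opcode instruction set is hypothetical, not modeled on any historical machine:

    # A toy stored-program machine: program and data share one memory.
    LOAD, ADD, HALT = 0, 1, 2  # invented opcodes for this sketch

    def run(memory):
        acc, pc = 0, 0
        while True:
            op, operand = memory[pc], memory[pc + 1]  # fetch
            pc += 2
            if op == LOAD:        # load a memory cell into the accumulator
                acc = memory[operand]
            elif op == ADD:       # add a memory cell to the accumulator
                acc += memory[operand]
            elif op == HALT:      # stop and report the result
                return acc

    # The program occupies cells 0-4; its data live in cells 6 and 7.
    memory = [LOAD, 6, ADD, 7, HALT, 0, 20, 22]
    print(run(memory))  # 42

Because the program is just numbers in memory, it can be replaced or even modified without rewiring anything, which is exactly the flexibility ENIAC lacked.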
In the 1960s, transistor-based computers replaced vacuum tubes. Transistors made computers smaller, cheaper, and more energy efficient. But transistors emitted large amounts of heat inside the computer, which left these machines subject to damage. The use of transistors marked the second generation of computers. Computers belonging to this generation used punched cards for input and were programmed in assembly language.
•  The Stanford Research Institute brought about ERMA, the Electronic Recording Machine Accounting project, which dealt with automating the process of bookkeeping in banking.
•  In 1959, the General Electric Corporation delivered its ERMA computing system to the Bank of America in California.
The use of integrated circuits ushered in the third generation of computers: small transistors were placed on silicon chips, called semiconductors. This increased the speed and efficiency of computers. Operating systems became the human interface to computing operations, and keyboards and monitors became the input-output devices.
•  In 1965, DEC launched the first minicomputer, the PDP-8.
•  In 1969, the development of ARPANET began with the financial backing of the Department of Defense.
Thousands of integrated circuits built onto a single silicon chip made up a microprocessor. The introduction of microprocessors was the hallmark of the fourth generation of computers.
•  Intel produced large-scale integration (LSI) circuits in 1971. During the same year, Ted Hoff, working for Intel, introduced the 4-bit 4004 microprocessor.
•  In 1972, Intel introduced the 8008 microprocessor.
•  In 1973, Xerox came up with the Alto workstation at PARC. It consisted of a monitor, a graphical interface, a mouse, and an Ethernet card for networking.
•  Apple Computer brought out the Macintosh personal computer on January 24, 1984.
Fifth-generation computers are under development. They are to be based on principles of artificial intelligence and natural-language recognition, and developers are aiming at computers capable of organizing themselves. The evolution of computers continues.

Mechanical Information Processing

The company IBM grew out of the invention of the tabulator, crafted by Herman Hollerith in the late 1880s.
This was the first use of punched cards representing data as opposed to punched cards automating a
mechanical function like a player piano. The information processing world through the 1950s was based on
a combination of punched cards, the tabulator and key punch machines. The first calculators appeared in
the 1930s. Analog machines began to be replaced by the digital concept of zeroes and ones throughout
the World War II era. The first computer made for the masses was UNIVAC, made by Remington Rand in
1951. IBM introduced its mainframe computer the following year.

Computer Integration

Early Remington computers sold for over a million dollars per machine, but IBM made smaller, more affordable machines that became popular. In 1954 IBM began developing Fortran, one of the original computer programming languages, based heavily on mathematics. During the same decade, the development of the transistor, integrated circuits, and microprogramming led the way toward reducing computer size. Meanwhile, faster CPUs increased computer processing speed and improved memory expanded data storage. The arrival of microprocessors, introduced by Texas Instruments and Intel in the early 1970s, paved the way for miniaturized yet more powerful computers.

Rise of the PC

Up until the 1970s, computers were mainly used by business, government, and universities. Personal computers first appeared on the market in the late 1970s. Apple introduced the Apple I in 1976 and the Apple II the following year, ushering in the era of the masses using computers at home. From this point on, the software industry began to develop, with Microsoft and Apple as the primary companies. Microsoft became a software giant by marketing its DOS operating system with IBM computers beginning in 1981. Apple introduced the Macintosh in 1984, marking the beginning of mixed graphics and text and replacing systems that only displayed text. Ever since, Apple has called its computer system the "Mac" to differentiate itself from the rest of the PC market.

Multimedia Culture

In the 1990s the computer became common in almost every household, in contrast with the previous decade. Part of the reason for this surge in popularity was that by the 1990s much of the population had already become familiar with computers from school or work, since computers were by then considered a business necessity. Microsoft's Windows 95 operating system accelerated the mass adoption of computers, while the growth of the World Wide Web throughout the 1990s also helped attract interest. Soon, nearly every profession needed software to improve its product or service. By the first decade of the 2000s, Microsoft had introduced the XP and Vista operating systems while Apple offered the OS X series through Leopard. These developments, along with other popular software applications, meant that the average person now had access to robust multimedia tools.

The New Television

In the early 2000s computers started to become integrated with the television. YouTube.com became one of the top 10 most popular websites on the Internet and helped fuel a growing interest in watching videos, TV shows, and movies on a computer. Companies such as Amazon.com, Netflix, Blockbuster, and Walmart began to offer TV and movie downloads following Apple's quick rise to the top of the music retail industry with iTunes. Apple's iMovie software also made it easy for a novice video producer to create professional-quality videos.
