
History of computers

First Computers

ENIAC Computer

The first substantial computer was the giant ENIAC machine, built by John W. Mauchly and
J. Presper Eckert at the University of Pennsylvania. ENIAC (Electronic Numerical
Integrator and Computer) used a word of 10 decimal digits instead of binary ones like
some previous automated calculators/computers. ENIAC was also the first machine to use
more than 2,000 vacuum tubes; it used nearly 18,000 of them. Housing all
those vacuum tubes and the machinery required to keep them cool took up over 167
square meters (1,800 square feet) of floor space. The machine had punched-card
input and output and, arithmetically, one multiplier, one divider/square-rooter, and 20
adders employing decimal "ring counters," which served as adders and also as quick-access (0.0002 seconds) read-write register storage.
The executable instructions composing a program were embodied in the separate units
of ENIAC, which were plugged together to form a route through the machine for the
flow of computations. These connections had to be redone for each different problem,
together with presetting function tables and switches. This "wire-your-own"
instruction technique was inconvenient, and only with some license could ENIAC be
considered programmable; it was, however, efficient in handling the particular
programs for which it had been designed. ENIAC is generally acknowledged to be the
first successful high-speed electronic digital computer (EDC) and was productively
used from 1946 to 1955. A controversy developed in 1971, however, over the
patentability of ENIAC's basic digital concepts, the claim being made that another
U.S. physicist, John V. Atanasoff, had already used the same ideas in a simpler
vacuum-tube device he built in the 1930s while at Iowa State College. In 1973, the
court found in favor of the company advancing Atanasoff's claim, and Atanasoff received the
recognition he rightly deserved.

Progression of Hardware
In the 1950s, two devices would be invented that would improve the
computer field and set in motion the beginning of the computer
revolution. The first of these two devices was the transistor.
Invented in 1947 by William Shockley, John Bardeen, and Walter
Brattain of Bell Labs, the transistor was destined to end the days of
vacuum tubes in computers, radios, and other electronics.

Vacuum Tubes

The vacuum tube, used up to this time in almost all the computers and calculating
machines, had been invented by American physicist Lee De Forest in 1906. The
vacuum tube, which is about the size of a human thumb, worked by using large
amounts of electricity to heat a filament inside the tube until it was cherry red. One
result of heating this filament up was the release of electrons into the tube, which
could be controlled by other elements within the tube. De Forest's original device was
a triode, which could control the flow of electrons to a positively charged plate inside
the tube. A zero could then be represented by the absence of an electron current to the
plate; the presence of a small but detectable current to the plate represented a one.
Vacuum tubes were highly inefficient, required a great
deal of space, and needed to be replaced often. Computers of the 1940s and '50s had
some 18,000 tubes in them, and housing all these tubes and cooling the rooms from the
heat they produced was not cheap. The transistor promised to solve all of these
problems, and it did so. Transistors, however, had their problems too. The main
problem was that transistors, like other electronic components, needed to be soldered
together. As a result, the more complex the circuits became, the more complicated and
numerous the connections between the individual transistors became, and the more the
likelihood of faulty wiring increased.

Transistors
In 1958, this problem too was solved by Jack St. Clair Kilby of Texas Instruments. He
manufactured the first integrated circuit, or chip. A chip is really a collection of tiny
transistors which are connected together when the chip is manufactured. Thus, the
need for soldering together large numbers of transistors was practically nullified;
now connections were needed only to other electronic components. In addition to
saving space, the speed of the machine was now increased, since there was a
diminished distance that the electrons had to travel.

Circuit Board

Silicon Chip

Mainframes to PCs
The 1960s saw large mainframe computers become much more
common in large industries and with the US military and space
program. IBM became the unquestioned market leader in selling
these large, expensive, error-prone, and very hard to use machines.
A veritable explosion of personal computers occurred in the mid-1970s, culminating in
Steve Jobs and Steve Wozniak exhibiting the first Apple II at the First West Coast
Computer Faire in San Francisco in 1977. The Apple II boasted a built-in BASIC
programming language, color graphics, and a 4,100 character memory for only $1,298.
Programs and data could be stored on an everyday audio-cassette recorder. Before the
end of the fair, Wozniak and Jobs had secured 300 orders for the Apple II, and from
there Apple just took off.
Also introduced in 1977 was the TRS-80. This was a home computer manufactured by
Tandy Radio Shack. Its second incarnation, the TRS-80 Model II, came complete
with a 64,000 character memory and a disk drive to store programs and data on. At

this time, only Apple and TRS had machines with disk drives. With the introduction of
the disk drive, personal computer applications took off as a floppy disk was a most
convenient publishing medium for distribution of software.
IBM, which up to this time had been producing mainframes and minicomputers for
medium to large-sized businesses, decided that it had to get into the act and started
working on the Acorn, which would later be called the IBM PC. The PC was the first
computer designed for the home market which would feature modular design so that
pieces could easily be added to the architecture. Most of the components, surprisingly,
came from outside of IBM, since building it with IBM parts would have cost too
much for the home computer market. When it was introduced, the PC came with a
16,000 character memory, a keyboard from an IBM electric typewriter, and a
connection for a tape cassette player, for $1,265.
By 1984, Apple and IBM had come out with new models. Apple released the first-generation
Macintosh, the first widely successful personal computer to come with a graphical user
interface (GUI) and a mouse. The GUI made the machine much more attractive to
home computer users because it was easy to use. Sales of the Macintosh soared like
nothing ever seen before. IBM was hot on Apple's tail and released the 286-AT, which,
with applications like Lotus 1-2-3, a spreadsheet, and Microsoft Word, quickly
became the favourite of business concerns.
That brings us up to about ten years ago. Now people have their own personal
graphics workstations and powerful home computers. The average computer a person
might have in their home is more powerful by several orders of magnitude than a
machine like ENIAC. The computer revolution has been the fastest growing
technology in man's history.

History of computing

The history of computing is longer than the history of computing hardware and modern computing
technology and includes the history of methods intended for pen and paper or for chalk and slate,
with or without the aid of tables. The timeline of computing presents a summary list of major
developments in computing by date.

Concrete devices
Computing is intimately tied to the representation of numbers. But long before abstractions like the
number arose, there were mathematical concepts to serve the purposes of civilization. These
concepts are implicit in concrete practices such as:

one-to-one correspondence, a rule to count how many items, say on a tally stick, eventually
abstracted into numbers;

comparison to a standard, a method for assuring reproducibility in a measurement, for
example, the number of coins;

the 3-4-5 right triangle, a device for assuring a right angle using ropes with 12 evenly
spaced knots, for example (the arithmetic behind it is shown below).
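
The check behind the knotted rope is the Pythagorean relation for those side lengths:

    3^2 + 4^2 = 9 + 16 = 25 = 5^2

so a triangle whose sides measure 3, 4 and 5 knot-intervals must contain a right angle.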

Numbers
Eventually, the concept of numbers became concrete and familiar enough for counting to arise, at
times with sing-song mnemonics to teach sequences to others. All known languages have words for
at least "one" and "two" (although this is disputed: see Piraha language), and even some animals
like the blackbird can distinguish a surprising number of items.[1]
Advances in the numeral system and mathematical notation eventually led to the discovery of
mathematical operations such as addition, subtraction, multiplication, division, squaring, square root,
and so forth. Eventually the operations were formalized, and concepts about the operations became
understood well enough to be stated formally, and even proven. See, for example, Euclid's
algorithm for finding the greatest common divisor of two numbers.
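
As a small illustration of such a formalized operation, here is a minimal sketch of Euclid's algorithm in Python (the function name is ours; modern Python already provides this as math.gcd):

    def gcd(a, b):
        # Euclid's algorithm: repeatedly replace the pair (a, b) with (b, a mod b);
        # when the remainder reaches zero, the other value is the greatest common divisor.
        while b != 0:
            a, b = b, a % b
        return a

    print(gcd(1071, 462))  # prints 21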
By the High Middle Ages, the positional Hindu-Arabic numeral system had reached Europe, which
allowed for systematic computation of numbers. During this period, the representation of a
calculation on paper actually allowed calculation of mathematical expressions, and the tabulation
of mathematical functions such as the square root and the common logarithm (for use in
multiplication and division) and the trigonometric functions. By the time of Isaac Newton's research,
paper or vellum was an important computing resource, and even in our present time, researchers
like Enrico Fermi would cover random scraps of paper with calculation, to satisfy their curiosity about
an equation.[2] Even into the period of programmable calculators, Richard Feynman would
unhesitatingly compute any steps which overflowed the memory of the calculators, by hand, just to
learn the answer.[citation needed]

Early computation
Main article: Timeline of computing hardware 2400 BC to 1949
The earliest known tool for use in computation was the abacus, and it was thought to have been
invented in Babylon circa 2400 BC. Its original style of usage was by lines drawn in sand with
pebbles. Abaci, of a more modern design, are still used as calculation tools today. This was the first
known computer and the most advanced system of calculation known at that time, preceding
Greek methods by 2,000 years.[citation needed]

In 1110 BC, the south-pointing chariot was invented in ancient China. It was the first
known geared mechanism to use a differential gear, which was later used in analog computers.
The Chinese also invented a more sophisticated abacus from around the 2nd century BC known as
the Chinese abacus.[citation needed]
In the 5th century BC in ancient India, the grammarian Pāṇini formulated the grammar of Sanskrit in
3,959 rules known as the Ashtadhyayi, which was highly systematized and technical. Pāṇini used
metarules, transformations and recursions.[3]
In the 3rd century BC, Archimedes used the mechanical principle of balance (see Archimedes
Palimpsest#Mathematical content) to calculate mathematical problems, such as the number of
grains of sand in the universe (The Sand Reckoner), which also required a recursive notation for
numbers, the myriad myriad ... .
The Antikythera mechanism is believed to be the earliest known mechanical analog computer.[4] It
was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck
off the Greek island of Antikythera, between Kythera and Crete, and has been dated to circa 100 BC.
Mechanical analog computer devices appeared again a thousand years later in the medieval Islamic
world and were developed by Muslim astronomers, such as the mechanical geared astrolabe by Abū
Rayhān al-Bīrūnī,[5] and the torquetum by Jabir ibn Aflah.[6] According to Simon Singh, Muslim
mathematicians also made important advances in cryptography, such as the development
of cryptanalysis and frequency analysis by Alkindus.[7][8] Programmable machines were also invented
by Muslim engineers, such as the automatic flute player by the Banū Mūsā brothers,[9] and
Al-Jazari's humanoid robots[citation needed] and castle clock, which is considered to be the
first programmable analog computer.[10]
During the Middle Ages, several European philosophers made attempts to produce analog computer
devices. Influenced by the Arabs and Scholasticism, Majorcan philosopher Ramon Llull (1232-1315)
devoted a great part of his life to defining and designing several logical machines that, by combining
simple and undeniable philosophical truths, could produce all possible knowledge. These machines
were never actually built, as they were more of a thought experiment to produce new knowledge in
systematic ways; although they could make simple logical operations, they still needed a human
being for the interpretation of results. Moreover, they lacked a versatile architecture, each machine
serving only very concrete purposes. In spite of this, Llull's work had a strong influence on Gottfried
Leibniz (early 18th century), who developed his ideas further, and built several calculating tools
using them.
Indeed, when John Napier discovered logarithms for computational purposes in the early 17th
century, there followed a period of considerable progress by inventors and scientists in making
calculating tools. The apex of this early era of formal computing can be seen in the difference
engine and its successor the analytical engine (which was never completely constructed but was
designed in detail), both by Charles Babbage. The analytical engine combined concepts from his
work and that of others to create a device that if constructed as designed would have possessed
many properties of a modern electronic computer. These properties include such features as an
internal "scratch memory" equivalent to RAM, multiple forms of output including a bell, a graphplotter, and simple printer, and a programmable input-output "hard" memory of punch cards which it
could modify as well as read. The key advancement which Babbage's devices possessed beyond
those created before his was that each component of the device was independent of the rest of the
machine, much like the components of a modern electronic computer. This was a fundamental shift
in thought; previous computational devices served only a single purpose and had to be, at best,
disassembled and reconfigured to solve a new problem. Babbage's devices could be reprogrammed to
solve new problems by the entry of new data, and act upon previous calculations within the same
series of instructions. Ada Lovelace took this concept one step further, by creating a program for the
analytical engine to calculate Bernoulli numbers, a complex calculation requiring a recursive

algorithm. This is considered to be the first example of a true computer program, a series of
instructions that act upon data not known in full until the program is run.
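
Her Note G laid out such a recursive computation for the analytical engine. Purely as an illustration (this uses a standard modern recurrence for the Bernoulli numbers, not a transcription of Lovelace's actual program), a short Python sketch might look like this:

    from fractions import Fraction
    from functools import cache
    from math import comb

    @cache
    def bernoulli(m):
        # Standard recurrence: the sum over k = 0..m of C(m+1, k) * B_k is 0 for m >= 1, with B_0 = 1.
        if m == 0:
            return Fraction(1)
        return Fraction(-sum(comb(m + 1, k) * bernoulli(k) for k in range(m)), m + 1)

    print([str(bernoulli(m)) for m in range(9)])
    # ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']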
Several examples of analog computation survived into recent times. A planimeter is a device which
does integrals, using distance as the analog quantity. Until the 1980s, HVAC systems used air both
as the analog quantity and the controlling element. Unlike modern digital computers, analog
computers are not very flexible, and need to be reconfigured (i.e., reprogrammed) manually to switch
them from working on one problem to another. Analog computers had an advantage over early
digital computers in that they could be used to solve complex problems using behavioral analogues
while the earliest attempts at digital computers were quite limited.

Since computers were rare in this era, the solutions were often hard-coded into paper forms such
as nomograms,[11] which could then produce analog solutions to these problems, such as the
distribution of pressures and temperatures in a heating system.
None of the early computational devices were really computers in the modern sense, and it took
considerable advancement in mathematics and theory before the first modern computers could be
designed.
The Z3 computer, built in 1941 by German inventor Konrad Zuse, was the first working programmable,
fully automatic computing machine.
The ENIAC (Electronic Numerical Integrator And Computer) was the first electronic general-purpose
computer, announced to the public in 1946. It was Turing-complete,[citation needed] digital, and capable of
being reprogrammed to solve a full range of computing problems.

Navigation and astronomy


Starting with known special cases, the calculation of logarithms and trigonometric functions can be
performed by looking up numbers in a mathematical table, and interpolating between known cases.
For small enough differences, this linear operation was accurate enough for use
in navigation and astronomy in the Age of Exploration. The uses of interpolation have thrived in the
past 500 years: by the twentieth century Leslie Comrie and W.J. Eckert systematized the use of
interpolation in tables of numbers for punch card calculation.
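
A rough Python sketch of that table-plus-interpolation procedure (the function name and the four-figure logarithm values below are purely illustrative):

    from bisect import bisect_right

    def interpolate(table, x):
        # Linear interpolation between adjacent (x, f(x)) rows of a sorted table,
        # the way a navigator would read between two printed entries.
        xs = [row[0] for row in table]
        i = bisect_right(xs, x)
        if i == 0 or i == len(table):
            raise ValueError("x lies outside the tabulated range")
        (x0, y0), (x1, y1) = table[i - 1], table[i]
        return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

    log_table = [(2.0, 0.3010), (2.1, 0.3222), (2.2, 0.3424), (2.3, 0.3617)]
    print(interpolate(log_table, 2.15))  # about 0.3323; log10(2.15) is 0.3324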
In our time, even a student can simulate the motion of the planets, an N-body differential equation,
using the concepts of numerical approximation, a feat which even Isaac Newton could admire, given
his struggles with the motion of the Moon.
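
A one-planet version of that simulation, using a simple semi-implicit Euler step (the units and step size below are our own choices), fits in a few lines of Python:

    import math

    GM = 4 * math.pi ** 2          # G times the Sun's mass, in AU^3 per year^2

    x, y = 1.0, 0.0                # start 1 AU from the Sun
    vx, vy = 0.0, 2 * math.pi      # speed of a circular orbit, in AU per year
    dt = 0.001                     # time step, in years

    for _ in range(int(1.0 / dt)): # integrate for one year
        r = math.hypot(x, y)
        ax, ay = -GM * x / r**3, -GM * y / r**3
        vx, vy = vx + ax * dt, vy + ay * dt   # update velocity first (semi-implicit Euler)
        x, y = x + vx * dt, y + vy * dt       # then position

    print(x, y)                    # ends close to the starting point (1, 0) after one orbit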

Weather prediction

The numerical solution of differential equations, notably the Navier-Stokes equations, was an
important stimulus to computing, with Lewis Fry Richardson's numerical approach to solving
differential equations. To this day, some of the most powerful computer systems on Earth are used
for weather forecasts.
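
Richardson's approach replaced derivatives with differences on a grid. A toy illustration of that idea in Python, using the one-dimensional heat equation rather than the full weather equations:

    # Explicit finite differences for the 1-D heat equation u_t = alpha * u_xx,
    # a toy stand-in for the grid-based differencing used in early numerical forecasting.
    alpha, dx, dt = 1.0, 0.1, 0.004      # chosen so alpha*dt/dx**2 <= 0.5 (stable)
    u = [0.0] * 5 + [1.0] + [0.0] * 5    # a temperature spike on an 11-point grid

    for _ in range(100):
        u = [u[i] if i in (0, len(u) - 1)                 # hold the boundary values fixed
             else u[i] + alpha * dt / dx**2 * (u[i+1] - 2*u[i] + u[i-1])
             for i in range(len(u))]

    print(["%.3f" % v for v in u])       # the spike has diffused outward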

Symbolic computations
By the late 1960s, computer systems could perform symbolic algebraic manipulations well enough to
pass college-level calculus courses.
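
The same kind of symbolic manipulation can be demonstrated today with the third-party SymPy library for Python (a modern stand-in, not one of the systems of that era):

    import sympy as sp   # third-party library: pip install sympy

    x = sp.symbols('x')
    print(sp.diff(sp.sin(x) * sp.exp(x), x))   # derivative: e^x (sin x + cos x)
    print(sp.integrate(x**2 * sp.cos(x), x))   # antiderivative: x^2 sin x + 2x cos x - 2 sin x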

By: Yogita Mukhija
Batch: B-4
(1st Year)
