
World War II was a time of great technological advancement.

Radar was invented to defend Great Britain from the bombers of the German Luftwaffe.
Aviation was advanced by the jet
engine. And no one will forget the climax of the war with the world's first glimpse of
nuclear weapons over Hiroshima and Nagasaki. But none of these advancements were as
significant to the twentieth century as the electronic digital computer. There are many
stories about how the computer came into being, but the most exciting story is that of the
ENIAC. It begins with the outbreak of World War II in 1939 and a large problem faced by the
United States Army, one in dire need of a fast solution.

The machine designed by Drs. Eckert and Mauchly was a monstrosity. When it was finished,
the ENIAC filled an entire room, weighed thirty tons, and consumed two hundred kilowatts
of power. It generated so much heat that it had to be placed in one of the few rooms at the
University with a forced air-cooling system. Vacuum tubes, over 19,000 of them, were the
principal elements in the computer's circuitry. It also had fifteen hundred relays and
hundreds of thousands of resistors, capacitors, and inductors. All of these electronics were
held in forty-two panels nine feet tall, two feet wide, and one foot thick. They were
arranged in a "U" shape, with three panels on wheels so they could be moved around. An IBM
card reader and cardpunch were used respectively for input and output.
Pascal's triangle is named after the French mathematician and philosopher Blaise Pascal
(1623-62), who wrote a Treatise on the Arithmetical Triangle describing it. But Pascal was
not the first to draw out this triangle or to notice its amazing properties!

Pascal made several other important contributions to the history of mathematics, including
the first digital calculator,
which he designed to help his father, who was a tax collector.
Adding French currency was difficult, because the currency
consisted of livres, sols, and deniers, with 12 deniers in a sol
and 20 sols in a livre.
Pascal's machine, called the Pascaline, never was a great success, however. Fifty prototypes
were manufactured, but the machine did not sell well--perhaps because the only
arithmetical function it could perform was addition!
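The carrying rules the Pascaline had to mechanise can be sketched in a few lines of Python. This is a modern illustration of the arithmetic, not a description of the machine's gear mechanism:

```python
def add_currency(a, b):
    """Add two (livres, sols, deniers) amounts with the proper carries:
    12 deniers make a sol and 20 sols make a livre."""
    total_deniers = (a[0] + b[0]) * 240 + (a[1] + b[1]) * 12 + (a[2] + b[2])
    livres, rest = divmod(total_deniers, 240)   # 240 deniers per livre
    sols, deniers = divmod(rest, 12)
    return (livres, sols, deniers)

# 1 livre 15 sols 8 deniers plus 6 sols 7 deniers
print(add_currency((1, 15, 8), (0, 6, 7)))  # (2, 2, 3)
```

The mixed radix (12 and 20 rather than 10 throughout) is exactly what made adding French currency by hand so tedious.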

Blaise Pascal, French mathematician, born in 1623; inventor of the Pascaline digital
calculator and the namesake of Pascal's Triangle.

Long before Pascal, tenth-century Indian mathematicians described this array of numbers as
useful for representing the number of combinations of short and long sounds in poetic
meters. The triangle also appears in the writings of Omar Khayyam, the great eleventh-
century astronomer, poet, philosopher, and mathematician, who lived in what is modern-day
Iran.

The Chinese mathematician Chu Shih Chieh depicted the triangle and indicated its use in

providing coefficients for the binomial expansion of (x + y)^n in his 1303 treatise The
Precious Mirror of the Four Elements.
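The rule that generates the triangle, each entry being the sum of the two entries above it, takes only a few lines; row n then holds the binomial coefficients of (x + y)^n. A minimal Python sketch:

```python
def pascal_rows(n):
    """Yield the first n rows of Pascal's triangle."""
    row = [1]
    for _ in range(n):
        yield row
        # each inner entry is the sum of the two entries above it
        row = [1] + [row[i] + row[i + 1] for i in range(len(row) - 1)] + [1]

for r in pascal_rows(5):
    print(r)
# the last row printed, [1, 4, 6, 4, 1], gives the coefficients of (x + y)**4
```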
The Chinese people invented the abacus about 6,000 years ago. It has two decks. Many
people still use the abacus. You can see its framework as described below.

An abacus consists of a rectangular frame carrying a number of rods or wires. A transverse
bar (the centre bar) divides each of these rods into two unequal portions. On the upper smaller
portion of each rod are two beads and on the lower portion five beads. The whole acts as a
numerical register, each rod representing a decimal order as in our "Arabic" notation. There
is rather more space on the rods than is occupied by the beads, which are accordingly free
to slide a short distance towards or away from the centre bar. The positions of the beads
on a particular rod represent a digit in that particular decimal position.
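The reading rule just described (beads moved to the centre bar count, upper beads worth five and lower beads worth one, each rod a decimal place) can be sketched in Python; the two-upper, five-lower layout is the Chinese design described above:

```python
def rod_digit(upper_moved, lower_moved):
    """Digit shown on one rod: each upper bead moved to the centre bar
    counts 5, each lower bead moved to the bar counts 1."""
    assert 0 <= upper_moved <= 2 and 0 <= lower_moved <= 5
    return 5 * upper_moved + lower_moved

def abacus_value(rods):
    """Read the whole register; the leftmost rod is the most significant
    decimal place, as in "Arabic" notation."""
    value = 0
    for upper, lower in rods:
        value = value * 10 + rod_digit(upper, lower)
    return value

# three rods showing 1, 8 (one upper bead plus three lower), and 0
print(abacus_value([(0, 1), (1, 3), (0, 0)]))  # 180
```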

Below, a shopkeeper in China is using an abacus for doing calculations.
William Oughtred invented the slide rule in about the year 1620.

Logarithms were a great time saver, but there was still quite a lot of work required. The
mathematician had to look up two logs, add them together and then look for the number
whose log was the sum. Edmund Gunter soon reduced the effort by drawing a number line on
which the positions of numbers were proportional to their logs. The scale started at one
because the log of one is zero. Two numbers could be multiplied by measuring the distance
from the beginning of the scale to one factor with a pair of dividers, then moving the
dividers to start at the other factor and reading the number at the combined distance. In
the
years that followed, other people refined Oughtred's design into a sliding bar held in place
between two other bars. Circular slide rules and cylindrical/spiral slide rules also appeared
quickly. The cursor appeared on the earliest circular models but appeared much later on
straight versions. By the late 17th century, the slide rule was a common instrument with
many variations. It remained the tool of choice for many for the next three hundred years.
While great aids, slide rules were not particularly intuitive for beginners. A 1960 Pickett
manual said:

"When people have difficulty in learning to use a slide rule, usually it is not because
the instrument is difficult to use. The reason is likely to be that they don't
understand the mathematics on which the instrument is based, or the formulas they
are trying to evaluate. Some slide rule manuals contain relatively exhaustive
explanations of the theory underlying the operations. In this manual it is assumed
that the theory of exponents, of logarithms, of trigonometry, and of the slide rule
is known to the reader, or will be recalled or studied by reference to formal
textbooks on these subjects."
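The underlying principle, multiplying by adding distances proportional to the logarithms of the factors, can be sketched directly:

```python
import math

def slide_rule_multiply(a, b):
    """Multiply two numbers the way Gunter's scale and the slide rule do:
    add lengths proportional to the logs, then read the product back off
    the logarithmic scale."""
    combined_distance = math.log10(a) + math.log10(b)
    return 10 ** combined_distance

print(round(slide_rule_multiply(2, 3), 6))  # 6.0
```

A physical slide rule does the same addition with sliding scales instead of arithmetic, which is why its accuracy was limited to the two or three digits one could read by eye.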
Computer History

Year      Inventors / Inventions                           Description of Event
1944      Howard Aiken & Grace Hopper:                     The Harvard Mark I computer.
          Harvard Mark I Computer
1946      John Presper Eckert & John W. Mauchly:           20,000 vacuum tubes later...
          ENIAC 1 Computer
1947/48   John Bardeen, Walter Brattain &                  No, a transistor is not a computer,
          William Shockley: The Transistor                 but this invention greatly affected
                                                           the history of computers.
1951      John Presper Eckert & John W. Mauchly:           First commercial computer, able to
          UNIVAC Computer                                  pick presidential winners.
1958      Jack Kilby & Robert Noyce:                       Otherwise known as "The Chip".
          The Integrated Circuit
1981      IBM PC                                           From an "Acorn" grows a personal
                                                           computer revolution.
1984      Apple Macintosh Computer                         The more affordable home computer
                                                           with a GUI.
1985      Microsoft Windows                                Microsoft begins the friendly war
                                                           with Apple.

In 1822 Charles Babbage, now called the father of the modern computer, invented the
Difference Engine.
Late in 1820 the Astronomical Society of London commissioned on
behalf of its members the production of an accurate set of tables of
all the Greenwich stars, tables to reduce the observed positions of
these stars to their true positions. The ones that had been produced
officially by the Royal Observatory were not reliable enough and
were full of errors. As had always been the case such tables had to
be calculated manually. To minimise flaws in the process the method
used was to set two human computers first to work out each page
independently and then to compare their figures afterwards.
Members of the Society were asked to undertake the validation of
the results. Then Charles Babbage decided to make the Difference
Engine. This did not prove to be a success, as it could solve only
polynomials. But it did not only compute values; it also printed them.
The complete engine, which would have been room-sized, was planned to
be able to operate both on 6th-order differences with numbers of
about 20 digits, and on 3rd-order differences with numbers of 30
digits. Each addition would be done in two phases, the second one
taking care of any carries generated in the first. The output digits
would be punched into a soft metal plate, from which a plate for a
printing press could be made.
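The method of differences the engine mechanised is easy to sketch in Python: once the leading value and its finite differences are set up, every further table entry needs only additions, which is exactly what made the scheme suitable for machinery. The polynomial below is just an illustration:

```python
def initial_differences(f, start, order):
    """Leading value of f and its finite differences (order+1 numbers)."""
    vals = [f(start + i) for i in range(order + 1)]
    out = []
    while vals:
        out.append(vals[0])
        vals = [b - a for a, b in zip(vals, vals[1:])]
    return out

def tabulate(diffs, count):
    """Produce further table values using additions only, as the
    Difference Engine would: add each difference into the one above it."""
    diffs = list(diffs)
    table = []
    for _ in range(count):
        table.append(diffs[0])
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return table

poly = lambda x: x * x + x + 41   # a 2nd-order example polynomial
print(tabulate(initial_differences(poly, 0, 2), 5))  # [41, 43, 47, 53, 61]
```

For a polynomial of degree n, the nth differences are constant, so the whole tabulation reduces to repeated addition with carries, the two-phase addition described above.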

Portion of the calculating mechanism of the Difference Engine, made by the Science Museum
to verify the design of the basic adding element.
The first large-scale, automatic, general-purpose, electromechanical calculator was the
Harvard Mark I (AKA the IBM Automatic Sequence Controlled Calculator [ASCC]), conceived by
Howard Aiken in the late 1930s and implemented by Messrs. Hamilton, Lake, and Durfee of
IBM. The machine, sponsored by the US Navy, was intended to compute the elements of
mathematical and navigation tables -- the same purpose as intended by Babbage for the
Difference Engine. Aiken dedicated his early reports to Babbage, having been made aware
of the piece of the Difference Engine at Harvard in 1937. The ASCC was not a stored
program machine but instead was driven by a paper tape containing the instructions.

Grace Murray Hopper went to work for Aiken at Harvard in June 1944 and became
the third programmer on the Mark I. The two who preceded her, then called "coders", were
Ensigns Robert Campbell and Richard Bloch.
The Difference Engines were automatic, i.e. they did not rely (as did the manual calculators
that came before) on the continuous informed intervention of a human operator to get
useful results. The Difference Engines were the first designs to successfully embody
mathematical rule in mechanism. However, the Difference Engines were not general-purpose
machines. They could process numbers entered into them only by adding them in a particular
sequence. The Analytical Engine was not only automatic but also general purpose, i.e. it could
be "programmed" by the user to execute a repertoire of instructions in any required order.

Portion of the mill of the Analytical Engine with printing mechanism, under construction at
the time of Babbage's death.

The engine was envisaged as a universal machine for finding the value of almost any
algebraic function. The Analytical Engine is not a single physical machine but a succession of
designs that Babbage refined until his death in 1871. The Engine would have been vast. Had
it been built, it would have needed to be operated by a steam engine of some kind. Babbage
made little attempt to raise funds to
build the Analytical Engine. Instead he continued to work on simpler and cheaper methods
of manufacturing parts and built a small trial model which was under construction at the
time of his death. The movement to automate mathematical calculation in the nineteenth
century failed and the impetus to continue this work was largely lost with Babbage’s death.
From the vantage point of the modern computer age we are better placed to appreciate the
full extent to which Babbage was indeed the first pioneer of computing.

Charles Babbage (1791-1871) is widely regarded as the first computer pioneer and the great
ancestral figure in the history of computing. Babbage excelled in a variety of scientific and
philosophical subjects though his present-day reputation rests largely on the invention and
design of his vast mechanical calculating engines. His Analytical Engine conceived in 1834 is
one of the startling intellectual feats of the nineteenth century. The design of this machine
possesses all the essential logical features of the modern general-purpose computer.
However, there is no direct line of descent from Babbage’s work to the modern electronic
computer invented by the pioneers of the electronic age in the late 1930s and early 1940s
largely in ignorance of the detail of Babbage's work.
John Napier invented Napier's rods in the year 1617.

The working principle of the bones is explained in "Rabdologiae". The bones are an aid
for multiplication and division. Even square roots and powers could be done.

Here is explained how you can reduce multiplication to simple additions with Napier's
rods or bones. Each bone contains a multiplication table of a number from 0 to 9.

With the "/" on the rods you can see which numbers should be
added.
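A Python sketch of that procedure: each rod contributes a (tens, units) pair, split by the "/", for the chosen row of its table, and the result digits come from adding neighbouring diagonals with carries, right to left:

```python
def napier_multiply(number, digit):
    """Multiply `number` by a single digit as with Napier's bones:
    lay out one rod per digit of `number`, take row `digit` of each rod
    as a (tens, units) pair, then add along the diagonals."""
    rods = [divmod(int(c) * digit, 10) for c in str(number)]
    result_digits, carry = [], 0
    for i in range(len(rods) - 1, -1, -1):
        tens_to_the_right = rods[i + 1][0] if i + 1 < len(rods) else 0
        s = rods[i][1] + tens_to_the_right + carry   # one diagonal sum
        result_digits.append(s % 10)
        carry = s // 10
    leading = rods[0][0] + carry
    if leading:
        result_digits.append(leading)
    return int("".join(map(str, reversed(result_digits))))

print(napier_multiply(425, 6))  # 2550
```

Multi-digit multipliers were handled the same way, one row at a time, with the partial products added by hand, so the bones reduced multiplication to table lookup and addition.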
The transistor is an influential invention that changed the course of history for computers.
The first generation of computers used vacuum tubes; the second generation of computers
used transistors; the third generation of computers used integrated circuits; and the
fourth generation of computers used microprocessors. John Bardeen, William Shockley, and
Walter Brattain, scientists at the Bell Telephone Laboratories in Murray Hill, New Jersey,
were researching the behavior of crystals (germanium) as semiconductors in an attempt to
replace vacuum tubes and mechanical relays in telecommunications. The vacuum tube, used to
amplify music and voice, made long-distance calling practical, but the tubes consumed power,
created heat and burned out rapidly, requiring high maintenance. The team's research was
about to come to a fruitless end when a last attempt to try a purer substance as a contact
point led to the invention of the "point-contact" transistor amplifier. In 1956, the team
received the Nobel Prize in Physics for the invention of the transistor. A transistor is a
device composed of semiconductor material that can both conduct and insulate (e.g.
germanium and silicon). Transistors switch and modulate electronic current. Before
transistors, digital circuits were composed of vacuum tubes. The transistor was the first
device designed to act as both a transmitter, converting sound waves into electronic waves,
and resistor, controlling electronic current. The name transistor comes from the 'trans' of
transmitter and 'sistor' of resistor. John Bardeen and Walter Brattain took out a patent
for their transistor. William Shockley applied for a patent for the transistor effect and a
transistor amplifier. Transistors transformed the world of electronics and had a huge
impact on computer design. Transistors made of semiconductors replaced tubes in the
construction of computers. By replacing bulky and unreliable vacuum tubes with transistors,
computers could now perform the same functions, using less power and space.
Datamatic Corporation was created jointly by Honeywell and Raytheon in 1955. The
Datamatic 1000 was a large-scale, general-purpose electronic data processing machine. Core
memory was 24K decimal digits.

The CPU and its peripherals filled an entire room. The Datamatic could convert the data on
900 punched cards to magnetic tape in one minute. Punched data output was 100 cards per
minute. It could multiply 1,000 sets of 11 digit numbers in one second. It used 3-inch wide
Mylar plastic magnetic tape.
In July of 1980, IBM representatives met for the first time with Microsoft's Bill Gates to
talk about writing an operating system for IBM's new hush-hush "personal" computer. IBM
had been observing the growing personal computer market for some time. They had already
made one dismal attempt to crack the market with their IBM 5100. At one point, IBM
considered buying the fledgling game company Atari to commandeer Atari's early line of
personal computers. However, IBM decided to stick with making their own personal
computer line and developed a brand new operating system to go with it. The secret plans
were referred to as "Project Chess". The code name for the new computer was "Acorn". Twelve
engineers, led by William C. Lowe, assembled in Boca Raton, Florida, to design and build the
"Acorn". On August 12, 1981, IBM released their new computer, re-named the IBM PC. The
"PC" stood for "personal computer" making IBM responsible for popularizing the term "PC".

The first IBM PC ran on a 4.77 MHz Intel 8088 microprocessor. The PC came equipped with
16 kilobytes of memory, expandable to 256k. The PC came with one or two 160k floppy disk
drives and an optional color monitor. The price tag started at $1,565, which would be nearly
$4,000 today. What really made the IBM PC different from previous IBM computers was
that it was the first one built from off-the-shelf parts (an approach called open
architecture) and marketed by outside distributors (Sears, Roebuck and ComputerLand). The
Intel chip was chosen because IBM had already obtained the rights to manufacture Intel
chips. IBM had used the Intel 8086 in its Displaywriter Intelligent Typewriter in exchange
for giving Intel the rights to IBM's bubble memory technology.

Less than four months after IBM introduced the PC, Time Magazine named the computer
"Machine of the Year".
In December 1983, Apple Computer ran its famous "1984" Macintosh television
commercial on a small unknown station, solely to make the commercial eligible for awards
during 1984. The commercial cost $1.5 million and ran only once in 1983, but news and talk
shows everywhere replayed it, making TV history. The next month, Apple Computer ran the
same ad during the NFL Super Bowl, and millions of viewers saw their first glimpse of the
Macintosh computer. The commercial was directed by Ridley Scott, and the Orwellian scene
depicted the IBM world being destroyed by a new machine, the "Macintosh".

Could we expect anything less from a company that was now being run by the former
president of Pepsi-Cola? Steve Jobs, co-founder of Apple Computer, had been trying to hire
Pepsi's John Sculley since early 1983. In April of that year he succeeded. But Steve and
John discovered that they did not get along and one of John Sculley's first actions as CEO
of Apple was to boot Steve Jobs off the Apple "Lisa" project. The "Lisa" was the first
consumer computer with a graphical user interface, or GUI. Jobs then switched over to
managing the Apple "Macintosh" project begun by Jef Raskin. Jobs was determined that
the new "Macintosh" was going to have a graphical user interface, like the "Lisa" but at a
considerably lower cost.

Note: The early Mac team members (1979) consisted of Jef Raskin, Brian Howard, Marc
LeBrun, Burrell Smith, Joanna Hoffman and Bud Tribble. Others began working on the Mac
at later dates.
Sure, Deep Blue plays a mean game of chess. But what else can it do?

Deep Blue is at heart a massively parallel, RS/6000 SP-based computer system that was
designed to play chess at the grandmaster level. But the underlying RS/6000 technology is
being used to tackle complex "real world" problems like:

• Cleaning up toxic waste sites


• Forecasting the weather
• Modelling financial data
• Designing cars
• Developing innovative drug therapies

Not to mention running the occasional high-volume scalable WWW server.


What was the first computer and who built it?

It turns out that this is more a question of definition than a question of fact. The
computer, as we now understand the word, was very much an evolutionary development
rather than a simple invention. This article traces the sequence of the most important steps
in that development, and in the earlier development of digital calculators without
programmability. It may help you to decide for yourself whether you think the first
computer was the ABC, the V3 (aka Z3), the ENIAC, the SSEC, the Manchester Mark I, the
EDSAC, or perhaps yet another machine -- and how to apportion the honor of invention
among John Atanasoff, Charles Babbage, Presper Eckert, John Mauchly, Alan Turing, John
von Neumann, Konrad Zuse, and others.

"Who invented the computer?" is not a question with a simple answer. The real answer is
that many inventors contributed to the history of computers and that a computer is a
complex piece of machinery made up of many parts, each of which can be considered a
separate invention.
