
HISTORY OF COMPUTERS

The history of computers starts out about 2000 years ago, at the birth of the
abacus, a wooden rack holding two horizontal wires with beads strung on them. When these
beads are moved around, according to programming rules memorized by the user, all regular
arithmetic problems can be done. Another important invention around the same time was the
Astrolabe, used for navigation. Blaise Pascal is usually credited for building the first digital
computer in 1642. It added numbers entered with dials and was made to help his father, a tax
collector. In 1671, Gottfried Wilhelm von Leibniz invented a computer that was built in 1694. It
could add, and, after changing some things around, multiply. Leibniz invented a special stepped
gear mechanism for introducing the addend digits, and this is still being used. The prototypes
made by Pascal and Leibniz were not used in many places, and considered weird until a little
more than a century later, when Thomas of Colmar (A.K.A. Charles Xavier Thomas) created the
first successful mechanical calculator that could add, subtract, multiply, and divide. A lot of
improved desktop calculators by many inventors followed, so that by about 1890, the range of
improvements included:

 Accumulation of partial results
 Storage and automatic entry of past results (a memory function)
 Printing of the results

Each of these required manual installation. These improvements were mainly
made for commercial users, and not for the needs of science.

Mark I

A programmable, electromechanical calculator designed by Professor Howard
Aiken. Built by IBM and installed at Harvard in 1944, it strung 78 adding machines together to
perform three calculations per second. It was 51 feet long, weighed five tons and used paper
tape for input and typewriters for output. Made of 765,000 parts, it sounded like a thousand
knitting needles according to Rear Admiral Grace Hopper. The Mark I worked in decimal
arithmetic, not binary, but it could go for hours without intervention. At its dedication ceremony,
Aiken asserted that the Mark I was the modern embodiment of Babbage's Analytical Engine,
although it did not have a conditional statement in its programming repertoire. The experience
helped IBM develop its own computers a few years later.
UNIVAC Computer

The Universal Automatic Computer or UNIVAC was a computer milestone
achieved by Dr. J. Presper Eckert and Dr. John Mauchly, the team that invented the ENIAC
computer.

John Presper Eckert and John Mauchly, after leaving the academic environment
of The Moore School of Engineering to start their own computer business, found their first client
was the United States Census Bureau. The Bureau needed a new computer to deal with the
exploding U.S. population (the beginning of the famous baby boom). In April 1946, a $300,000
deposit was given to Eckert and Mauchly for the research into a new computer called the
UNIVAC.

The research for the project proceeded badly, and it was not until 1948 that the
actual design and contract was finalized. The Census Bureau's ceiling for the project was
$400,000. J. Presper Eckert and John Mauchly were prepared to absorb any overrun in costs in
hopes of recouping from future service contracts, but the economics of the situation brought the
inventors to the edge of bankruptcy.

In 1950, Eckert and Mauchly were bailed out of financial trouble by Remington
Rand Inc. (manufacturers of electric razors), and the "Eckert-Mauchly Computer Corporation"
became the "Univac Division of Remington Rand." Remington Rand's lawyers unsuccessfully
tried to re-negotiate the government contract for additional money. Under threat of legal action,
however, Remington Rand had no choice but to complete the UNIVAC at the original price.

On March 31, 1951, the Census Bureau accepted delivery of the first UNIVAC
computer. The final cost of constructing the first UNIVAC was close to one million dollars.
Forty-six UNIVAC computers were built for both government and business uses. Remington
Rand became the first American manufacturer of a commercial computer system. Their first
non-government contract was for General Electric's Appliance Park facility in Louisville,
Kentucky, which used the UNIVAC computer for a payroll application.
John von Neumann (English pronunciation: /vɒn ˈnɔɪmən/) (December 28, 1903
– February 8, 1957) was a Hungarian-born American mathematician who made major
contributions to a vast range of fields,[1] including set theory, functional analysis, quantum
mechanics, ergodic theory, continuous geometry, economics and game theory, computer science,
numerical analysis, hydrodynamics (of explosions), and statistics, as well as many other
mathematical fields. He is generally regarded as one of the greatest mathematicians in modern
history.[2] The mathematician Jean Dieudonné called von Neumann "the last of the great
mathematicians",[3] while Peter Lax described him as possessing the most "fearsome technical
prowess" and "scintillating intellect" of the century. [4] Even in Budapest, in the time that
produced geniuses like Theodore von Kármán (b. 1881), Leó Szilárd (b. 1898), Eugene Wigner
(b. 1902), and Edward Teller (b. 1908), his brilliance stood out.[5]

Von Neumann was a pioneer of the application of operator theory to quantum
mechanics, in the development of functional analysis, a principal member of the Manhattan
Project and the Institute for Advanced Study in Princeton (as one of the few originally
appointed), and a key figure in the development of game theory[1][6] and the concepts of cellular
automata[1] and the universal constructor. Along with Teller and Stanisław Ulam, von Neumann
worked out key steps in the nuclear physics involved in thermonuclear reactions and the
hydrogen bomb.

Charles Babbage (26 December 1791 - 18 October 1871) was an English
mathematician, analytical philosopher, mechanical engineer and (proto-) computer scientist
who originated the idea of a programmable computer. Parts of his uncompleted mechanisms are
on display in the London Science Museum. In 1991, working from Babbage's original plans, a
difference engine was completed, and functioned perfectly. It was built to tolerances achievable
in the 19th century, indicating that Babbage's machine would have worked. Nine years later, the
Science Museum completed the printer Babbage had designed for the difference engine; it
featured astonishing complexity for a 19th-century device.

Life

Charles Babbage was born in England, most likely at 44 Crosby Row, Walworth
Road, London. A blue plaque on the junction of Larcom Street and Walworth Road
commemorates the event. There was a discrepancy regarding the date of Babbage's birth, which
was published in The Times obituary as 26 December 1792. However, days later a nephew of
Babbage wrote to say that Babbage was born precisely one year earlier, in 1791. The parish
register of St. Mary's Newington, London, shows that Babbage was baptised on 6 January 1792.
Blaise Pascal (1623 - 1662)

From `A Short Account of the History of Mathematics' (4th edition, 1908) by W.
W. Rouse Ball.

Among the contemporaries of Descartes none displayed greater natural genius
than Pascal, but his mathematical reputation rests more on what he might have done than on
what he actually effected, as during a considerable part of his life he deemed it his duty to devote
his whole time to religious exercises.

Blaise Pascal was born at Clermont on June 19, 1623, and died at Paris on Aug.
19, 1662. His father, a local judge at Clermont, and himself of some scientific reputation, moved
to Paris in 1631, partly to prosecute his own scientific studies, partly to carry on the education
of his only son, who had already displayed exceptional ability. Pascal was kept at home in order
to ensure his not being overworked, and with the same object it was directed that his education
should be at first confined to the study of languages, and should not include any mathematics.
This naturally excited the boy's curiosity, and one day, being then twelve years old, he asked in
what geometry consisted. His tutor replied that it was the science of constructing exact figures
and of determining the proportions between their different parts. Pascal, stimulated no doubt by
the injunction against reading it, gave up his play-time to this new study, and in a few weeks had
discovered for himself many properties of figures, and in particular the proposition that the sum
of the angles of a triangle is equal to two right angles. I have read somewhere, but I cannot lay
my hand on the authority, that his proof merely consisted in turning the angular points of a
triangular piece of paper over so as to meet in the centre of the inscribed circle: a similar
demonstration can be got by turning the angular points over so as to meet at the foot of the
perpendicular drawn from the biggest angle to the opposite side. His father, struck by this
display of ability, gave him a copy of Euclid's Elements, a book which Pascal read with avidity
and soon mastered.

Joseph Jacquard, the son of a silk weaver, was born in Lyon in 1752. He
inherited his father's small weaving business, but trade was bad and the firm eventually went
bankrupt. In 1790 he was given the task of restoring a loom made by Jacques de Vaucanson. Although fifty
years old, it was one of the earliest examples of an automatic loom. Working on this loom led to
him developing a strong interest in the mechanization of silk manufacture.

The French Revolution brought a temporary halt to Jacquard's experiments. Jacquard fought on
the side of the Republicans but as soon as they achieved victory, he returned to work.

In 1801 he constructed a loom that used a series of punched cards to control the pattern of
longitudinal warp threads depressed before each sideways passage of the shuttle. Jacquard later
developed a machine where the punched cards were joined to form an endless loop that
represented the program for the repeating pattern used for cloth and carpet designs.
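The endless card loop described above is, in effect, a repeating program. A minimal sketch (hypothetical 4-hook cards, Python used purely for illustration) of how cycling a fixed chain of punched cards replays a weaving pattern:

```python
from itertools import cycle, islice

# Each "card" is one shuttle pass: 1 = hook (warp thread) raised, 0 = lowered.
# These two cards are hypothetical, chosen only to illustrate the idea.
cards = [
    (1, 0, 1, 0),
    (0, 1, 0, 1),
]

# Joining the cards into an endless loop means the pattern simply repeats
# for as many shuttle passes as the weaver needs.
passes = list(islice(cycle(cards), 6))  # six passes driven by two cards
for row in passes:
    print("".join("X" if hole else "." for hole in row))
```

The chain of cards is the program; the loom re-reads it indefinitely, which is why the text calls the loop a "program for the repeating pattern".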
GENERATIONS OF COMPUTER

First Generation (1940-1956) Vacuum Tubes

The first computers used vacuum tubes for circuitry and magnetic drums for
memory, and were often enormous, taking up entire rooms. They were very expensive to operate
and in addition to using a great deal of electricity, generated a lot of heat, which was often the
cause of malfunctions.

First generation computers relied on machine language, the lowest-level
programming language understood by computers, to perform operations, and they could only
solve one problem at a time. Input was based on punched cards and paper tape, and output was
displayed on printouts.

The UNIVAC and ENIAC computers are examples of first-generation computing
devices. The UNIVAC was the first commercially available computer, delivered to the U.S.
Census Bureau in 1951.

Second Generation (1956-1963) Transistors

Transistors replaced vacuum tubes and ushered in the second generation of
computers. The transistor was invented in 1947 but did not see widespread use in computers
until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to
become smaller, faster, cheaper, more energy-efficient and more reliable than their first-
generation predecessors. Though the transistor still generated a great deal of heat that subjected
the computer to damage, it was a vast improvement over the vacuum tube. Second-generation
computers still relied on punched cards for input and printouts for output.

Second-generation computers moved from cryptic binary machine language to
symbolic, or assembly, languages, which allowed programmers to specify instructions in words.
High-level programming languages were also being developed at this time, such as early
versions of COBOL and FORTRAN. These were also the first computers that stored their
instructions in their memory, which moved from a magnetic drum to magnetic core
technology. The first computers of this generation were developed for the atomic energy industry.
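The jump from binary machine language to assembly mnemonics can be made concrete with a toy example. The 8-bit instruction format and opcodes below are entirely hypothetical; the sketch only shows what an assembler does, namely translating symbolic words into the bit patterns the hardware decodes:

```python
# Hypothetical 8-bit instruction format: 4-bit opcode + 4-bit operand.
OPCODES = {"ADD": "0001", "SUB": "0010"}

def assemble(line):
    """Translate one symbolic instruction, e.g. "ADD 5", into its bit pattern."""
    mnemonic, operand = line.split()
    return OPCODES[mnemonic] + format(int(operand), "04b")

# A first-generation programmer would have written "00010101" by hand;
# an assembler lets a second-generation programmer write "ADD 5" instead.
print(assemble("ADD 5"))   # -> 00010101
print(assemble("SUB 3"))   # -> 00100011
```

High-level languages such as FORTRAN then added a further layer, compiling statements like `TOTAL = TOTAL + 5` down to many such instructions.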
Third Generation (1964-1971) Integrated Circuits

The development of the integrated circuit was the hallmark of the third
generation of computers. Transistors were miniaturized and placed on silicon semiconductor
chips, which drastically increased the speed and efficiency of computers.

Instead of punched cards and printouts, users interacted with third generation
computers through keyboards and monitors and interfaced with an operating system, which
allowed the device to run many different applications at one time with a central program that
monitored the memory. Computers for the first time became accessible to a mass audience
because they were smaller and cheaper than their predecessors.

Fourth Generation (1971-Present) Microprocessors

The microprocessor brought the fourth generation of computers, as thousands of
integrated circuits were built onto a single silicon chip. What in the first generation filled an
entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971,
located all the components of the computer—from the central processing unit and memory to
input/output controls—on a single chip.

In 1981 IBM introduced its first computer for the home user, and in 1984 Apple
introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers
and into many areas of life as more and more everyday products began to use microprocessors.

As these small computers became more powerful, they could be linked together to
form networks, which eventually led to the development of the Internet. Fourth generation
computers also saw the development of GUIs, the mouse and handheld devices.

Fifth Generation (Present and Beyond) Artificial Intelligence

Fifth generation computing devices, based on artificial intelligence, are still in
development, though there are some applications, such as voice recognition, that are being used
today. The use of parallel processing and superconductors is helping to make artificial
intelligence a reality. Quantum computation and molecular and nanotechnology will radically
change the face of computers in years to come. The goal of fifth-generation computing is to
develop devices that respond to natural language input and are capable of learning and self-
organization.
COMPUTER HARDWARE

Hardware is a comprehensive term for all of the physical parts of a computer, as
distinguished from the data it contains or operates on, and the software that provides
instructions for the hardware to accomplish tasks. The boundary between hardware and
software is slightly blurry - firmware is software that is "built-in" to the hardware, but such
firmware is usually the province of computer programmers and computer engineers in any case
and not an issue that computer users need to concern themselves with.

A typical computer (Personal Computer, PC) contains in a desktop or tower case the following
parts:

 Motherboard, which holds the CPU, main memory and other parts, and has slots for
expansion cards
 power supply - a case that holds a transformer, voltage control and fan
 storage controllers, of IDE, SCSI or other type, that control hard disk, floppy disk,
CD-ROM and other drives; the controllers sit directly on the motherboard (on-board) or on
expansion cards
 graphics controller that produces the output for the monitor
 the hard disk, floppy disk and other drives for mass storage
 interface controllers (parallel, serial, USB, FireWire) to connect the computer to external
peripheral devices such as printers or scanners
