
Chapter 1 – The Development of Computers and Information Systems

Pre-Computer Age and Calculating Machines


The abacus is one of the earliest calculating devices, invented over 2,000 years ago by Asian merchants to speed up calculation. It is a simple hand-held device for recording numbers or performing simple calculations.
Calculating machines were first introduced in the 17th century. In 1642, the first calculating machine that could perform addition and subtraction, a precursor of the digital computer, was devised by the French scientist, mathematician, and philosopher Blaise Pascal. This device employed a series of ten-toothed wheels, each tooth representing a digit from 0 to 9. The wheels were connected so that numbers could be added to each other by advancing the wheels by the correct number of teeth. In the 1670s the German philosopher and mathematician Gottfried Wilhelm Leibniz improved on this machine by devising one that could also multiply.
The next generation of calculating devices, the arithmometer, was invented in 1820 by Charles Xavier Thomas of France. It combined the features of the Leibniz calculator with newer engineering techniques.
The first mechanical calculator produced in the US was developed in the 1870s by Frank S. Baldwin. By improving on the Leibniz design, Baldwin produced a much smaller and lighter calculator.

The first commercial calculator that was both a calculating and a listing machine was developed in 1886 by William Seward Burroughs, an American bank clerk.

Punched Card Information Processing and the Analytical Engine


The French weaver and inventor Joseph-Marie Jacquard designed an automatic loom (Jacquard’s loom), which used thin, perforated wooden boards to control the weaving of complicated cloth designs. The concept of recording data in the form of holes punched in cards was later used in the design of punched-card information processing equipment. Another lesson learned from Jacquard was that work can be performed automatically if a set of instructions can be given to a machine to direct it in its operations. This idea became fundamental to the development of computers.
During the 1880s the American statistician Herman Hollerith, who worked at the US Bureau of the Census, conceived the idea of using perforated cards (punched cards similar to Jacquard’s boards) for processing data. Employing a system that passed punched cards over electrical contacts, he devised the Hollerith punched-card tabulating machine, which he used to speed up the compilation of statistical information for the 1890 United States census. Hollerith went on to establish the Tabulating Machine Company to manufacture and market his invention, which in 1911 merged with other organizations to form the Computing-Tabulating-Recording Company.
In 1924, after further acquisitions, the Computing-Tabulating-Recording Company was renamed International Business Machines Corporation (IBM). Thomas J. Watson, Sr., who had joined the company in 1914, built the foundering firm into an industrial giant. IBM soon became the country's largest manufacturer of time clocks and developed and marketed the first electric typewriter. In 1951 the company entered the computer field. Punched-card technology remained widely used until the mid-1950s.
Also in the 19th century, the British mathematician and
inventor Charles Babbage (referred to as the Father of the modern
computer) worked out the principles of the modern digital computer.
He conceived a number of machines, such as the Difference Engine and the Analytical Engine, forerunners of the modern computer, that were designed to handle complicated mathematical problems.
One of Babbage’s designs, the Analytical Engine, had many
features of a modern computer. It had an input stream in the form
of a deck of punched cards, a “store” for saving data, a “mill” for
arithmetic operations, and a printer that made a permanent record.
Babbage failed to put this idea into practice, though it may well
have been technically possible at that date.
Many historians consider Babbage and his associate, the mathematician Augusta Ada
Byron, Countess of Lovelace and daughter of the poet, Lord Byron, the true pioneers of the
modern digital computer. Lovelace provided complete details of exactly how the Analytical Engine was to work. Because she described some of the key elements of computer programming, she is referred to as the “world’s first computer programmer”.
Early Computers
Analogue computers began to be built in the late 19th century.
Early models calculated by means of rotating shafts and gears. Numerical
approximations of equations too difficult to solve in any other way were
evaluated with such machines. Lord Kelvin built a mechanical tide
predictor that was a specialized analogue computer. During World Wars I
and II, mechanical and, later, electrical analogue computing systems were
used as torpedo course predictors in submarines and as bombsight
controllers in aircraft. Another system was designed to predict spring
floods in the Mississippi River basin.

In the United States, a prototype electronic machine had been built as early as 1939 by John Atanasoff and Clifford Berry at Iowa State College. This prototype and later research, carried out quietly, led to the development of the Atanasoff-Berry Computer (ABC), which is considered the first electronic computing machine. It could perform only addition and subtraction, and it never became fully operational because of the inventors' involvement in US military efforts during World War II.
In 1944, Howard Aiken completed the MARK I computer (also known as the Automatic Sequence Controlled Calculator), the first electromechanical computer. It could solve mathematical problems 1,000 times faster than existing machines.
The first electronic computer to be made
operational was the Electronic Numerical Integrator and Computer (ENIAC). It was built in 1946 for the
US Army to perform quickly and accurately the
complex calculations that gunners needed to aim their
artillery weapons. ENIAC contained 18,000 vacuum
tubes and had a speed of several hundred
multiplications per minute, but originally its program
was wired into the processor and had to be manually
altered.
Scientists at the University of Cambridge in England designed the world’s first electronic computer to store its program of instructions, the Electronic Delay Storage Automatic Calculator (EDSAC). This gave more flexibility in the use of the computer. Two years later (1951), machines were built with program storage based on the ideas of the Hungarian-American mathematician John von Neumann, developed at the University of Pennsylvania. The instructions, like the data, were stored within a “memory”, freeing the computer from the speed limitations of the paper-tape reader during execution and permitting problems to be solved without rewiring the computer. This concept gave birth to the Electronic Discrete Variable Automatic Computer (EDVAC).
During World War II a team of scientists and
mathematicians, working at Bletchley Park, north of
London, created one of the first all-electronic digital
computers: Colossus. By December 1943, Colossus,
which incorporated 1,500 vacuum tubes, was operational.
It supported the Bletchley Park codebreakers, including the team headed by Alan Turing, in the largely successful attempt to read enciphered German radio messages; Colossus itself was used against the Lorenz teleprinter cipher rather than the Enigma code.
First Generation of Computers
 The first generation of computers (1951-1959) is characterized by the use of the vacuum tube; these machines were very large (a mainframe could occupy an entire room).
 The first business computer, the Universal Automatic Computer (UNIVAC I), was
developed in 1951. It was invented to improve information processing in business
organizations.
 In 1953, IBM produced the first of its computers, the IBM 701—a machine designed to
be mass-produced and easily installed in a customer’s building. The success of the 701
led IBM to manufacture many other machines for commercial data processing. The IBM
650 computer is probably the reason why IBM enjoys such a healthy share of today’s
computer market. The sales of IBM 650 were a particularly good indicator of how rapidly
the business world accepted electronic data processing. Initial sales forecasts were
extremely low because the machine was thought to be too expensive, but over 1,800 were
eventually made and sold.
 The invention of the integrated circuit (IC) by Jack S. Kilby of Texas Instruments in 1958 is considered one of the great inventions that changed how the world functions. It is at the heart of all electronic equipment today.
 Between 1959 and 1961, the Common Business-Oriented Language (COBOL) was developed under the leadership of Grace Murray Hopper. It is a verbose, English-like programming language. Its establishment as a required language by the United States Department of Defense, its emphasis on data structures, and its English-like syntax led to its widespread acceptance and usage, especially in business applications. It is a champion of standardized programming languages that are hardware independent. COBOL runs on many types of computers by means of a compiler, a tool that Hopper pioneered.
Second Generation of Computers
 The invention of the transistor marked the start of the second generation of computers (ca. 1954-1964), which were smaller in size (a mainframe could be the size of a closet). Second-generation computers used logical elements that were smaller, faster, and more versatile than were possible with vacuum-tube machines. Because transistors use much less power and have a much longer life, components became smaller, as did inter-component spacings, and systems became much less expensive to build. The Honeywell 400 computer was the first in the line of second-generation computers.
 In the 1950s and 1960s, only the largest companies could afford the six- to seven-digit price tags of mainframe computers. Digital Equipment Corporation introduced the PDP-8, which is generally considered the first successful minicomputer. It was an instant hit, and there was tremendous demand from business and scientific organizations.
Third Generation of Computers
 Although the first integrated circuit (IC) was invented earlier, during the era of first-generation computers, it was only in the late 1960s that it was introduced into computers, making it possible for many transistors to be fabricated on one silicon substrate, with interconnecting wires plated in place. The IC resulted in a further reduction in price, size, and failure rate. This was the start of the third generation of computers (mid-1960s to mid-1970s).
 Some historians consider the IBM System/360 family of computers the single most important innovation in the history of computers. It was conceived as a family of computers with upward compatibility: when a company outgrew one model, it could move up to the next model without worrying about converting its data. This made all previous computers obsolete.
 In 1964, Beginner's All-purpose Symbolic Instruction Code (BASIC), a high-level programming language, was developed by John Kemeny and Thomas Kurtz at Dartmouth College. BASIC gained its enormous popularity mostly because it can be learned and used quickly. The language has changed over the years, growing from a teaching language into a versatile and powerful language for both business and scientific applications.
 In 1969, two Bell Telephone Labs software engineers, Dennis Ritchie and Ken Thompson, who had been working on a multi-user computer system named Multics (Multiplexed Information and Computing Service), implemented a rudimentary operating system of their own, which they named Unics as a pun on Multics. Somehow, the name became UNIX. The most notable feature of this operating system is its portability: it can run on many types of computers, is machine-independent, and supports multi-user processing, multitasking, and networking. UNIX is used in high-end workstations and servers. It is written in the C language, which was developed by Ritchie.
Fourth Generation of Computers
 The introduction of large-scale integration of circuitry (more circuits per unit of space) marks the beginning of the fourth generation of computers. The base technology is still the IC, but it saw significant innovations in the two decades that followed. The computer industry experienced a mind-boggling succession of advances in the further miniaturization of circuitry, data communications, and the design of computer hardware and software. The microprocessor became a reality in the mid-1970s with the introduction of the large-scale integrated (LSI) circuit.
 Bill Gates and Paul Allen revolutionized the computer industry. They developed a version of the BASIC programming language for the first commercially available microcomputer, the MITS Altair. After successful completion of the project, the two formed Microsoft Corporation in 1975. Microsoft is now the largest and most influential software company in the world. Microsoft was given an enormous boost when its operating system software, MS-DOS, was selected for use by the IBM PC. Gates, now the wealthiest person in the world, provides the company’s vision of new product ideas and technologies.

 One important entrepreneurial venture during the early years was the Apple II personal computer, introduced in 1977. This event forever changed how society perceives computers: computing was now available to individuals and very small companies.

 IBM tossed its hat into the personal computer ring with its release of the IBM personal
computer in 1981. By the end of 1982, 835,000 units had been sold. When software
vendors began to orient their products to the IBM PC, many companies began offering
IBM PC-compatibles or clones. Today, the IBM PC and its clones have become a
powerful standard in the microcomputer industry.

 In 1982, Mitchell Kapor founded the Lotus Development Corporation, which was later acquired by IBM. It introduced an electronic spreadsheet product, Lotus 1-2-3, which gave the IBM PC credibility in the business marketplace. Sales of the IBM PC and Lotus 1-2-3 soared.

 In 1984, Apple introduced the Macintosh desktop computer with a very friendly graphical user interface (GUI). This was proof that computers could be easy and fun to use. GUIs began to change the complexion of the software industry. They changed the interaction between the user and the computer from a short, character-oriented exchange modeled on the teletypewriter to the now-famous WIMP interface (WIMP stands for windows, icons, menus, and pointing devices).

 It was in 1985 that Microsoft adopted the GUI in its Windows operating system for IBM PC-compatible computers. Windows did not enjoy widespread acceptance until 1990, with the release of Windows 3.0. It gave a huge boost to the software industry because larger, more complex programs could now be run on IBM PC compatibles. Subsequent releases made the PC even easier to use, fueling the PC explosion in the 1990s.

 In 1991, Linus Torvalds developed Linux, a reliable and compactly designed operating system that is an offshoot of UNIX and can run on many different hardware platforms. It is available free or at very low cost and is used as an alternative to the costly Windows operating system.

 IBM PC-compatible PCs started out using Intel microprocessor chips, followed by a succession of even more powerful chips. But not until the Intel Pentium, introduced in 1993, and its successors did PCs do much with multimedia (the integration of motion video, animation, graphics, sound, and so on). The emergence of the high-powered Intel Pentium processors and their ability to handle multimedia applications changed the way people view and use PCs.

 It was also in this year that millions of people began to tune in to the Internet for news. The World Wide Web (WWW), one of several Internet-based applications, came of age as Web traffic grew 341,634 percent. The Web is unique in that it enables Web pages to be linked across the Internet. A number of Internet browsers were introduced (e.g., Mosaic and Netscape Navigator, which were developed by Marc Andreessen, and Internet Explorer by Microsoft Corporation). These browsers enabled users to navigate the World Wide Web with ease. Today, the WWW is the foundation for most Internet communications and services. The World Wide Web itself was actually created in 1991 by Tim Berners-Lee, an engineer in Geneva, Switzerland.

Fifth Generation of Computers


 The fifth generation of computers is characterized by the very large-scale integrated
(VLSI) circuit (microchip), with many thousands of interconnected transistors etched
into a single silicon substrate. It is also characterized by network computers of all sizes,
the Internet, Intranets, and Extranets.

 The year 1996 marked the 50th year of computer history. The US Postal Service issued stamps that commemorated the 50th anniversary of ENIAC, the first full-scale computer, and the 50 years of computer technology that followed. It was during this year that the handheld computer was introduced, signaling to the world that tremendous computing power could be placed in the palm of your hand. Nowadays, millions of people rely on handhelds for a variety of personal information management applications, including e-mail.

 In 1999, the world was threatened by the Y2K problem, known as the millennium bug. It may have been one of the biggest challenges ever to confront the businesses of the world. For most of the 20th century, information systems had used only two digits to represent the year (e.g. 99 for 1999). When the 20th century ended and a new one began, non-compliant computers would interpret the date 01-01-00, meaning January 1, 2000, as January 1, 1900 (see the sketch at the end of this section). Y2K heightened management’s awareness of how critical information technology is to the operation of any organization.
 Jack Kilby’s first IC contained a single transistor. Tens of thousands of engineers around the world have built on his invention, such that each year our society is the beneficiary of smaller, more powerful, cheaper chips.
 One continuing trend in computer development is microminiaturization, the effort to
compress more circuit elements into smaller and smaller chip space. In 1999, scientists
developed a circuit the size of a single layer of molecules, and in 2000 IBM announced
that it had developed new technology to produce computer chips that operate five times
faster than the most advanced models to date.
 Researchers are also trying to speed up circuitry functions through the use of
superconductivity, the phenomenon of decreased electrical resistance observed in certain
materials at very low temperatures.
 Whether we are moving into a fifth generation of computing is a subject of debate since
the concept of generations may no longer fit the continual, rapid changes occurring in
computer hardware, software, data, and networking technologies. But in any case, we can
be sure that progress in computing will continue to accelerate and that the development of
Internet-based technologies and applications will be one of the major forces driving
computing in the 21st century.
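
To make the two-digit-year problem described above concrete, the following minimal Python sketch (not part of the original chapter; the function names and the windowing pivot value are illustrative assumptions) shows how a non-compliant system misreads the date 01-01-00, and how a simple century-windowing fix resolves it.

```python
from datetime import date

def parse_legacy_date(mm_dd_yy: str) -> date:
    """Interpret an MM-DD-YY date the way a non-Y2K-compliant system might."""
    month, day, yy = (int(part) for part in mm_dd_yy.split("-"))
    # Only two year digits were stored, and the century was assumed to be 1900.
    return date(1900 + yy, month, day)

def parse_windowed_date(mm_dd_yy: str, pivot: int = 30) -> date:
    """A common Y2K remediation: two-digit years below the pivot map to 20xx."""
    month, day, yy = (int(part) for part in mm_dd_yy.split("-"))
    century = 2000 if yy < pivot else 1900
    return date(century + yy, month, day)

if __name__ == "__main__":
    print(parse_legacy_date("01-01-00"))    # 1900-01-01 -- the millennium bug
    print(parse_windowed_date("01-01-00"))  # 2000-01-01 -- after windowing
```

Windowing was only one of several real-world remediations; another was expanding stored dates to four digits.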
