
INTRODUCTION:

Every day we see and use this piece of machinery called a computer. We play games, look for information on the internet, write to people, watch movies, listen to music and do our school work on this machine. But did you ever think where it came from? Did a computer just appear one day by magic? Did we find a computer in the wreckage of an alien spaceship? Your quest will be to use the internet to uncover the true origin of the computer. When you are finished, all secrets concerning the creation of the modern computer will be answered.

WHAT IS A COMPUTER?
A computer is an electronic device that executes software programs. It consists of two parts: hardware and software. The computer processes input through input devices such as the mouse and keyboard, and displays output through output devices such as the monitor and printer. Computers vary considerably in size, from very small to very large, and their speeds span a very wide range. Computers have become indispensable in today's world; millions of people use them all over the world.

HISTORY OF COMPUTER:
The basic idea of computing develops in the 1200s, when a Muslim cleric proposes solving problems with a series of written procedures. Webster's Dictionary defines "computer" as any programmable electronic device that can store, retrieve, and process data. The first computers were people! That is, electronic computers were given this name because they performed the work that had previously been assigned to people. "Computer" was originally a job title: it was used to describe those human beings whose job it was to perform the repetitive calculations required to compute such things as navigational tables, tide charts, and planetary positions for astronomical almanacs.

A typical computer operation back when computers were people.

The ABACUS was an early aid for mathematical computations. Its only value is that it aids the memory of the human performing the calculation. A skilled abacus operator can work on addition and subtraction problems at the speed of a person equipped with a hand calculator (multiplication and division are slower). The abacus is often wrongly attributed to China. In fact, the oldest surviving abacus was used in 300 B.C. by the Babylonians. A modern abacus consists of rings that slide over rods.

A more modern abacus. Note how the abacus is really just a representation of the human
fingers: the 5 lower rings on each rod represent the 5 fingers and the 2 upper
rings represent the 2 hands.

In 1614 an eccentric Scotsman named John Napier published his invention of logarithms, a technology that allows multiplication to be performed via addition. The magic ingredient is the logarithm of each operand, which was originally obtained from a printed table. But Napier also invented an alternative to tables: sticks of ivory carved with multiplication tables, which are now called Napier's Bones.

An original set of Napier's Bones

A more modern set of Napier's Bones

Napier's invention led directly to the slide rule, first built in England in 1632 and still in use in the 1960s by the NASA engineers of the Mercury, Gemini, and Apollo programs, which landed men on the moon.

A slide rule

In 1642 Blaise Pascal, at age 19, invented the Pascaline as an aid for his father, who was a tax collector. Pascal built fifty of these gear-driven, one-function calculators (they could only add).

Pascal's Pascaline

Just a few years after Pascal, the German Gottfried Wilhelm Leibniz managed to build a
four-function (addition, subtraction, multiplication, and division) calculator that he called
the stepped reckoner.

Leibniz's Stepped Reckoner

In 1801 a Frenchman, Joseph-Marie Jacquard, builds a loom that weaves by reading punched holes stored on small sheets of hardwood. These plates are then inserted into the loom, which reads (retrieves) the pattern and creates (processes) the weave. Powered by water, this "machine" came 140 years before the development of the modern computer.

Jacquard's Loom showing the threads and the punched cards

By 1822 the English mathematician Charles Babbage was proposing a steam-driven calculating machine the size of a room, which he called the Difference Engine. This machine would be able to compute tables of numbers, such as logarithm tables. He obtained government funding for this project due to the importance of numeric tables in ocean navigation.

The 1890 census is tabulated on punch cards similar to the ones used 90 years earlier to create weaves. Developed by Herman Hollerith of MIT, the system uses electric power (non-mechanical). The Hollerith Tabulating Company is a forerunner of today's IBM.

Just prior to the introduction of Hollerith's machine, the first printing calculator is introduced. In 1892 William Burroughs, a sickly ex-teller, introduces a commercially successful printing calculator. Although hand-powered at first, Burroughs quickly introduces an electric model.

In 1925, unaware of the work of Charles Babbage, Vannevar Bush of MIT builds a machine he calls the differential analyzer. Using a set of gears and shafts, much like Babbage's design, the machine can handle simple calculus problems, but accuracy is a problem.

The period from 1935 through 1952 gets murky with claims and counterclaims of who
invents what and when. Part of the problem lies in the international situation that makes
much of the research secret. Other problems include poor record-keeping, deception and
lack of definition.

In 1935 Konrad Zuse, a German construction engineer, builds a mechanical calculator to handle the math involved in his profession. Shortly after completing it, Zuse starts on a programmable machine, which he completes in 1938.

John Vincent Atanasoff begins work on a digital computer in 1936 in the basement of the
Physics building on the campus of Iowa State. A graduate student, Clifford (John) Berry
assists. The "ABC" is designed to solve linear equations common in physics. It displays
some early features of later computers including electronic calculations. He shows it to
others in 1939 and leaves the patent application with attorneys for the school when he
leaves for a job in Washington during World War II. Unimpressed, the school never files the application, and the ABC is cannibalized by students.

The Enigma, a complex mechanical encoder, is used by the Germans, who believe it to be unbreakable. Several people, most notably Alan Turing, conceive machines to handle the problem, but none are technically feasible. In 1937 Turing proposes a "Universal Machine" capable of "computing" any algorithm. That same year George Stibitz creates his Model K(itchen), a conglomeration of otherwise useless and leftover material, to solve complex calculations. He improves the design while working at Bell Labs, and on September 11, 1940, Stibitz uses a teletype machine at Dartmouth College in New Hampshire to transmit a problem to his Complex Number Calculator in New York and receives the results. It is the first example of a network.

The Enigma

First in Poland, and later in Great Britain and the United States, the Enigma code is broken. Information gained this way shortens the war. To break the code, the British, led by Turing, build the Colossus Mark I. The existence of this machine is a closely guarded secret of the British Government until 1970. The United States Navy, aided to some extent by the British, builds a machine capable of breaking not only the German code but the Japanese code as well.

In 1943 development begins in earnest on the Electronic Numerical Integrator And Computer (ENIAC) at the University of Pennsylvania. Designed by John Mauchly and J. Presper Eckert of the Moore School, the project gets help from John von Neumann and others. In 1944 the Harvard Mark I is introduced. Based on a series of proposals from Howard Aiken in the late 1930s, the Mark I computes complex tables for the U.S. Navy. It uses a paper tape to store instructions, and Aiken hires Grace Hopper ("Amazing Grace") as one of three programmers working on the machine. Thomas J. Watson Sr. plays a pivotal role involving his company, IBM, in the machine's development.

In 1945 von Neumann proposes the concept of a "stored program" in a paper that is never officially published. Two years later, with the Mark II stopped for repairs, Hopper finds a moth in one of the relays, possibly the cause of the problem. From this day on, Hopper refers to fixing the system as "debugging."

Work completes on ENIAC in 1946. Although only three years old, the machine is woefully behind in technology, but the inventors opt to continue while working on a more modern machine, the EDVAC. Programming ENIAC requires it to be rewired; a later version eliminates this problem. To make the machine appear more impressive to reporters during its unveiling, a team member (possibly Eckert) puts translucent spheres (halved ping-pong balls) over the lights. The US patent office will later recognize this as the first computer.

The next year scientists employed by Bell Labs complete work on the transistor (John
Bardeen, Walter Brattain and William Shockley receive the Nobel Prize in Physics in
1956), and by 1948 teams around the world work on a "stored program" machine. The
first, nicknamed "Baby", is a prototype of a much larger machine under construction in
Britain and is shown in June 1948.

Over the next five years the impetus for advances in computers comes mostly from government and the military. UNIVAC, delivered in 1951 to the Census Bureau, results in a tremendous financial loss to its manufacturer, Remington-Rand. The next year Grace Hopper, now an employee of that company, proposes "reusable software," code segments that could be extracted and assembled according to instructions in a "higher level language." The concept of compiling is born. Hopper would revise this concept over the next twenty years, and her ideas would become an integral part of all modern computers. CBS uses one of the 46 UNIVAC computers produced to predict the outcome of the 1952 Presidential Election. They do not air the prediction for three hours because they do not trust the machine.

IBM introduces the 701 the following year. It is the first commercially successful computer.

Addition time: 60 microseconds
Multiplication time: 456 microseconds
Memory: 2,048 36-bit words, using Williams tubes
Secondary memory: magnetic drum (8,192 words), plastic magnetic tape
Delivered: December 1952, to IBM World Headquarters (19 installed in total)

Small portion of the IBM 701

In 1956 FORTRAN is introduced (proposed in 1954, it takes nearly three years to develop the compiler). Two additional languages, LISP and COBOL, follow in 1957 and 1958. Other early languages include ALGOL and BASIC. Although never widely used, ALGOL is the basis for many of today's languages.

With the introduction of Control Data's CDC 1604 in 1958, the first transistor-powered computer, a new age dawns. Brilliant scientist Seymour Cray heads the development team. This year integrated circuits are introduced by two men, Jack Kilby and Robert Noyce, working independently. The second network is developed at MIT. Over the next three years computers begin affecting the day-to-day lives of most Americans. The addition of MICR characters at the bottom of checks becomes common.

In 1961 Fairchild Semiconductor introduces the first commercial integrated circuit. Within ten years all computers use these instead of discrete transistors. Formerly building-sized computers are now room-sized and considerably more powerful. The following year the Atlas becomes operational, displaying many of the features that make today's systems so powerful, including virtual memory, pipelined instruction execution, and paging. Designed at the University of Manchester, some of the people who developed Colossus two decades earlier make contributions.

On April 7, 1964, IBM introduces the System/360. While a technical marvel, the main feature of this machine is business-oriented: IBM guarantees the "upward compatibility" of the system, reducing the risk that a business would invest in outdated technology. Dartmouth College, where the first network was demonstrated 25 years earlier, moves to the forefront of the "computer age" with the introduction of TSS (Time Share System), a crude (by today's standards) networking system. It is the first Wide Area Network. In three years Randy Golden, President and Founder of Golden Ink, would begin working on this network.

Within a year MIT returns to the top of the intellectual computer community with the introduction of a greatly refined network that features shared resources and uses the first minicomputer (DEC's PDP-8) to manage telephone lines. Bell Labs and GE play major roles in its design.

In 1969 Bell Labs, unhappy with the direction of the MIT project, leaves and develops its own operating system, UNIX. One of the many precursors to today's Internet, ARPANET, is quietly launched. Alan Kay, who will later become a designer for Apple, proposes the "personal computer." Also in 1969, unhappy with Fairchild Semiconductor, a group of technicians begin discussing forming their own company. This company, formed the next year, would be known as Intel. The movie Colossus: The Forbin Project has a supercomputer as the villain. The next year, The Computer Wore Tennis Shoes is the first feature-length movie with the word "computer" in the title. In 1971, Texas Instruments introduces the first "pocket calculator." It weighs 2.5 pounds.

With the country embroiled in a crisis of confidence known as Watergate, in 1973 a little-publicized judicial decision takes the patent for the computer away from Mauchly and Eckert and awards it to Atanasoff. Xerox introduces the mouse. Proposals are made for the first local area networks.

In 1975 the first personal computer is marketed in kit form. The Altair features 256 bytes of memory. Bill Gates, with others, writes a BASIC interpreter for the machine. The next year Apple begins to market PCs, also in kit form, including a monitor and keyboard. The earliest RISC platforms become stable. In 1976, Queen Elizabeth goes on-line with the first royal email message.

During the next few years the personal computer explodes onto the American scene. Microsoft, Apple, and many smaller PC-related companies form (and some die). By 1977 stores begin to sell PCs. Continuing today, companies strive to reduce the size and price of PCs while increasing capacity. Entering the fray, IBM introduces its PC in 1981 (it is actually IBM's second attempt; the first failed miserably). Time selects the computer as its Machine of the Year in 1982.

You might wonder what type of event is required to dislodge an industry heavyweight. In
IBM's case it was their own decision to hire an unknown but aggressive firm called
Microsoft to provide the software for their personal computer (PC). This lucrative
contract allowed Microsoft to grow so dominant that by the year 2000 their market
capitalization (the total value of their stock) was twice that of IBM and they were
found in federal court to be running an illegal monopoly. In computer programming in the 1970s, you dealt with what today are called mainframe computers, such as the IBM 7094.

The IBM 7094, a typical mainframe computer

There were two ways to interact with a mainframe. The first was called time sharing
because the computer gave each user a tiny sliver of time in a round-robin fashion.
Perhaps 100 users would be simultaneously logged on, each typing on a teletype such as
the following:

The Teletype was the standard mechanism used to interact with a time-sharing computer

A teletype was a motorized typewriter that could transmit your keystrokes to the mainframe and then print the computer's response on its roll of paper. You typed a single line of text, hit the carriage return button, and waited for the teletype to begin noisily printing the computer's response (at a whopping 10 characters per second). On the left-hand side of the teletype in the prior picture you can observe a paper tape reader and writer (i.e., puncher). Here's a close-up of paper tape:

After observing the holes in paper tape, it is perhaps obvious why all computers use binary numbers to represent data: a binary bit (that is, one digit of a binary number) can only have the value 0 or 1 (just as a decimal digit can only have a value of 0 through 9). Something that can take only two states is very easy to manufacture, control, and sense.
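As a minimal sketch in Python (added here for illustration, not part of the original text), here is how a character of text can be viewed as a row of such binary holes; the 'o'/'.' rendering of punched and unpunched positions is just a visualization:

```python
# A minimal sketch: each character is stored as a pattern of binary bits,
# much like the rows of punched (1) and unpunched (0) holes on paper tape.
for ch in "HI":
    bits = format(ord(ch), "08b")                     # 8-bit code for the character
    holes = bits.replace("1", "o").replace("0", ".")  # 'o' = hole, '.' = no hole
    print(ch, bits, holes)
```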

The alternative to time sharing was batch mode processing, where the computer gives its
full attention to your program. In exchange for getting the computer's full attention at
run-time, you had to agree to prepare your program off-line on a key punch machine
which generated punch cards.

An IBM Key Punch machine which operates like a typewriter except it produces punched
cards rather than a printed sheet of paper

But things changed fast. By the 1990's a university student would typically own his own
computer and have exclusive use of it in his dorm room.

The original IBM Personal Computer (PC)

This transformation was a result of the invention of the microprocessor. A microprocessor (pictured above) is a computer that is fabricated on an integrated circuit (IC). Computers had been around for 20 years before the first microprocessor was developed at Intel in 1971. The "micro" in the name microprocessor refers to the physical size. Intel didn't invent the electronic computer, but they were the first to succeed in cramming an entire computer onto a single chip (IC).

INTEL:

Noyce, Moore, and Andrew Grove leave Fairchild and found Intel in 1968 to focus on random access memory (RAM) chips.
Question: if you can put transistors, capacitors, etc. on a chip, why couldn't you put a central processor on a chip?
In 1969 Ted Hoff designs the Intel 4004, the first microprocessor, drawing on the architecture of Digital's PDP-8; the chip ships in 1971.

The PDP-8, the first mass-produced minicomputer

Ed Roberts founds Micro Instrumentation and Telemetry Systems (MITS) in 1968. Popular Electronics puts the MITS Altair on the cover in January 1975 [Intel 8080]. Les Solomon's 12-year-old daughter, Lauren, was a Star Trek fan. He asked her what the name of the computer on the Enterprise was. She said, " 'Computer,' but why don't you call it Altair, because that is where they are going tonight!"

Altair 8800 Computer

THE ORIGINAL PENTIUM:

Pentium Vitals Summary Table

Introduction date: March 22, 1993
Process: 0.8 micron
Transistor count: 3.1 million
Clock speed at introduction: 60 and 66 MHz
Cache sizes: L1: 8K instruction, 8K data
Features: MMX added in 1997

The original Pentium is an extremely modest design by today's standards, and when it
was introduced in 1993 it wasn't exactly a blockbuster by the standards of its RISC
contemporaries, either. While its superscalar design (Intel's first) certainly improved on
the performance of its predecessor, the 486, the main thing that the Pentium had going
for it was x86 compatibility. In fact, Intel's decision to make enormous sacrifices of performance, power consumption, and cost for the sake of maintaining the Pentium's backwards compatibility with legacy x86 code was probably the most strategically important decision that the company has ever made.

The choice to continue along the x86 path inflicted some serious short- and medium-term
pain on Intel, and a certain amount of long-term pain on the industry as a whole (how
much pain depends on who you talk to), but as we'll see the negative impact of this
critical move has gradually lessened over time.

INTEL PROCESSORS

CPU      Year  Data bus (bits)  Addressable memory  MIPS
4004     1971        4                  1K           --
8008     1972        8                 16K           --
8080     1974        8                 64K           --
8088     1980        8                  1M          0.33
80286    1982       16                  1M          3
80386    1985       32                  4G          11
80486    1989       32                  4G          41
Pentium  1993       64                  4G          111

TRANSISTORS - BUILDING BLOCKS OF COMPUTERS:

Microprocessors contain many transistors:

ENIAC: 19,500 vacuum tubes and relays
Intel 8088 processor (the first PC): 29,000 transistors
Intel Pentium II processor: 7 million transistors
Intel Pentium III processor: 28 million transistors
Intel Pentium 4 processor: 42 million transistors

Logically, each transistor acts as an on-off switch. Transistors are combined to implement logic gates (AND, OR, NOT), and gates are combined to build higher-level structures: adders, multiplexors, decoders, registers, and so on.
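As a minimal sketch of that layering (Python, added for illustration; the gate functions are software models, not real hardware), the full adder below is built from nothing but AND, OR, and NOT:

```python
# A minimal sketch: logic gates combined into a higher-level structure, an adder.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

def XOR(a, b):                      # built only from AND, OR, NOT
    return AND(OR(a, b), NOT(AND(a, b)))

def full_adder(a, b, carry_in):
    """Add three one-bit inputs; return (sum bit, carry-out bit)."""
    partial = XOR(a, b)
    total = XOR(partial, carry_in)
    carry_out = OR(AND(a, b), AND(partial, carry_in))
    return total, carry_out

# 1 + 1 with carry-in 0 -> sum 0, carry 1 (binary 10 = decimal 2)
print(full_adder(1, 1, 0))
```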

ARTIFICIAL INTELLIGENCE:
The branch of computer science concerned with making computers behave like humans. The term was coined in 1956 by John McCarthy, then at Dartmouth College.

The intellectual roots of AI, and the concept of intelligent machines, may be found in Greek mythology. Intelligent artifacts have appeared in literature ever since, with mechanical devices actually (and sometimes fraudulently) demonstrating behavior with some degree of intelligence. After modern computers became available following World War II, it became possible to create programs that perform difficult intellectual tasks. Even more importantly, general-purpose methods and tools have been created that allow similar tasks to be performed.

Artificial Intelligence is, in our opinion, the most exciting area of computer science.
Artificial Intelligence (or AI, for short) is the name given to any attempt to have
computers gain attributes of the human mind.

Of course, this is a very vague statement, and much argument has taken place over what exactly constitutes AI (mathematicians and scientists hate vague statements). There are essentially two schools of thought: Strong AI and Weak AI.

Weak AI philosophers believe that computers, as advanced as they may get, will only be
able to seem intelligent. The responses of the computer may seem like intelligent actions,
but the Weak AI theory insists that the computers are just mindlessly manipulating data
to produce "intelligent" actions. Strong AI philosophers believe that computers someday
can be as intelligent as humans.

Of course, the heart of the discussion is this: What is intelligence? Normally, we declare
that humans are the standard for intelligence, but then, isn't human intelligence, at the
very basic level, just a bunch of mindless chemical reactions? Both Weak AI and Strong AI have committed supporters, many of them quite fanatical.

Aside from the philosophy of the Strong AI and the Weak AI debate, there is also
another, related, division in AI: Connectionism and Classicalism.

Classicalism is the AI found in chess programs, weather diagnostics, and language processors. Classicalism, also known as the top-down approach, approaches AI from the standpoint that human minds are computational by nature; that is, we manipulate data one piece at a time (serially) through built-in circuitry in our brains. So much of the classical approach to AI consists of things like minimax trees, preprogrammed databases, and prewritten code. Expert systems are another name for Classical AI.
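Here is a minimal sketch of the minimax idea just mentioned (Python, illustrative only; the tiny two-ply game tree is made up):

```python
# A minimal sketch: minimax search over a game tree given as nested lists,
# whose leaves are scores from the maximizing player's point of view.
def minimax(node, maximizing):
    if isinstance(node, (int, float)):      # leaf: a final position's score
        return node
    scores = [minimax(child, not maximizing) for child in node]
    return max(scores) if maximizing else min(scores)

# Our move leads to one of two positions; the opponent then picks the
# outcome worst for us. The best score we can guarantee is 3.
tree = [[3, 5], [2, 9]]
print(minimax(tree, maximizing=True))       # -> 3
```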

Connectionism is the newer form of AI. The problem with classicalism, connectionists say, is that it is too unlike the human mind. The human mind can learn, expand, and change, but many expert systems are too rigid and don't learn. Connectionism is the AI that the media really likes, which is why it features famous names like neural networks and parallel processing. Connectionism seems a step closer to the human mind, since it uses networks of nodes that resemble the human brain's network of neurons.
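A minimal sketch of one such node (Python, illustrative; the weights are hand-picked rather than learned, which a real network's training rule would do for us):

```python
# A minimal sketch: one artificial "neuron" of the kind connectionist
# networks are built from. It weighs its inputs, sums them, and fires
# (outputs 1) if the total clears a threshold.
def neuron(inputs, weights, threshold):
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# With these hand-picked weights the neuron behaves like an AND gate.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", neuron([a, b], [0.6, 0.6], threshold=1.0))
```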

Connectionism, however, also has its flaws. Connectionism is often inaccurate and slow, and so far it has failed to reach higher-level AI, such as language and some advanced logic, which humans seem to pick up easily in little time. Human intelligence isn't built from scratch, as connectionist systems often are. So for these higher-level tasks, Classicalism is by far better suited. Connectionism, however, is quite successful at modeling lower-level thinking like motor skills, face recognition, and some vision.

Currently, no computers exhibit full artificial intelligence (that is, are able to simulate human behavior). The greatest advances have occurred in the field of game playing. The best computer chess programs are now capable of beating humans: in May 1997, an IBM supercomputer called Deep Blue defeated world chess champion Garry Kasparov in a chess match.

In the end, both viewpoints are valid in both debates. Regardless of the viewpoint,
however, the most important goal for AI is that it helps us understand the mechanisms of
the human mind.

CASE STUDIES:
The following four case studies highlight application areas where AI
technology is having a strong impact on industry and everyday life.

1. AUTHORIZING FINANCIAL TRANSACTIONS:

Credit card providers, telephone companies, mortgage lenders, banks, and the U.S. Government employ AI systems to detect fraud and expedite financial transactions, with daily transaction volumes in the billions. These systems first use learning algorithms to construct profiles of customer usage patterns, and then use the resulting profiles to detect unusual patterns and take the appropriate action (e.g., disable the credit card). Such automated oversight of financial transactions is an important component in achieving a viable basis for electronic commerce.
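A minimal sketch of the profile-and-flag idea (Python, illustrative only; the purchase history is made up and real systems use far richer models):

```python
# A minimal sketch: build a "profile" of past transaction amounts, then
# flag any new amount that falls far outside that profile.
import statistics

def build_profile(amounts):
    return statistics.mean(amounts), statistics.stdev(amounts)

def is_unusual(amount, profile, z_cutoff=3.0):
    mean, stdev = profile
    return abs(amount - mean) > z_cutoff * stdev

history = [42.0, 55.5, 38.2, 61.0, 47.3, 52.8]   # hypothetical past purchases
profile = build_profile(history)
for charge in (49.99, 1500.00):
    print(charge, "unusual?", is_unusual(charge, profile))
```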

2. CONFIGURING HARDWARE AND SOFTWARE:

AI systems configure custom computer, communications, and manufacturing systems, guaranteeing the purchaser maximum efficiency and minimum setup time, while providing the seller with superhuman expertise in tracking the rapid technological evolution of system components and specifications. These systems detect incomplete and inconsistent orders, employing large bodies of knowledge that describe the complex interactions of system components. Systems currently deployed process billions of dollars of orders annually; the estimated value of the market leader in this area is over a billion dollars.
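A minimal sketch of such knowledge-based order checking (Python, illustrative; the component names and rules are entirely hypothetical):

```python
# A minimal sketch: a configurator encodes knowledge about component
# interactions and flags incomplete or inconsistent orders.
REQUIRES = {"cpu": ["motherboard"], "gpu": ["power_supply_650w"]}
CONFLICTS = [("case_mini", "gpu_full_length")]

def check_order(parts):
    problems = []
    for part in parts:
        for needed in REQUIRES.get(part, []):
            if needed not in parts:
                problems.append(f"{part} requires {needed}")
    for a, b in CONFLICTS:
        if a in parts and b in parts:
            problems.append(f"{a} conflicts with {b}")
    return problems or ["order is consistent and complete"]

print(check_order(["cpu", "gpu", "case_mini", "gpu_full_length"]))
```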

3. DIAGNOSING AND TREATING PROBLEMS:

Systems that diagnose and treat problems -- whether illnesses in people or problems in hardware and software -- are now in widespread use. Diagnostic systems based on AI technology are being built into photocopiers, computer operating systems, and office automation tools to reduce service calls. Stand-alone units are being used to monitor and control operations in factories and office buildings. AI-based systems assist physicians in many kinds of medical diagnosis, in prescribing treatments, and in monitoring patient responses. Microsoft's Office Assistant, an integral part of every Office 97 application, provides users with customized help by means of decision-theoretic reasoning.
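To give a flavor of decision-theoretic reasoning, here is a minimal sketch (Python, illustrative; the faults, beliefs, and utility numbers are made up): the system picks the action with the highest expected utility given its belief about each fault.

```python
# A minimal sketch: choose the repair action maximizing expected utility.
belief = {"paper_jam": 0.7, "toner_low": 0.3}     # P(fault | symptoms)

# utility[action][fault]: how good each action is if that fault is the truth
utility = {
    "clear_tray":    {"paper_jam": 10, "toner_low": -2},
    "replace_toner": {"paper_jam": -2, "toner_low": 10},
}

def expected_utility(action):
    return sum(belief[f] * utility[action][f] for f in belief)

best = max(utility, key=expected_utility)
print(best, round(expected_utility(best), 2))     # -> clear_tray 6.4
```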

4. SCHEDULING FOR MANUFACTURING:

The use of automatic scheduling for manufacturing operations is exploding as manufacturers realize that remaining competitive demands an ever more efficient use of resources. This AI technology -- supporting rapid rescheduling up and down the "supply chain" to respond to changing orders, changing markets, and unexpected events -- has shown itself superior to less adaptable systems based on older technology. The same technology has proven highly effective in other commercial tasks, including job shop scheduling and assigning airport gates and railway crews. It has also proven highly effective in military settings: DARPA reported that an AI-based logistics planning tool, DART, pressed into service for operations Desert Shield and Desert Storm, completely repaid DARPA's three decades of investment in AI research.
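A minimal sketch of the rescheduling idea (Python, illustrative; the jobs are made up, and "earliest deadline first" is one classic heuristic, not the method DART used):

```python
# A minimal sketch: greedy scheduling by earliest deadline first, a classic
# rule that minimizes the maximum lateness of the schedule.
jobs = [("order_A", 4, 10), ("rush_B", 2, 4), ("order_C", 3, 12)]  # (name, duration, deadline)

def schedule(jobs):
    plan, clock = [], 0
    for name, duration, deadline in sorted(jobs, key=lambda j: j[2]):
        clock += duration
        plan.append((name, clock, clock - deadline))  # finish time, lateness
    return plan

for name, finish, late in schedule(jobs):
    print(f"{name}: finishes at t={finish}, lateness={max(0, late)}")

# When a rush order arrives, add it to the list and re-run schedule() --
# the "rapid rescheduling" the paragraph describes.
```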

THE FUTURE:

AI began as an attempt to answer some of the most fundamental questions about human existence by understanding the nature of intelligence, but it has grown into a scientific and technological field affecting many aspects of commerce and society.

Even as AI technology becomes integrated into the fabric of everyday life, AI researchers
remain focused on the grand challenges of automating intelligence. Work is progressing
on developing systems that converse in natural language, that perceive and respond to
their surroundings, and that encode and provide useful access to all of human knowledge
and expertise. The pursuit of the ultimate goals of AI -- the design of intelligent artifacts;
understanding of human intelligence; abstract understanding of intelligence (possibly
superhuman) -- continues to have practical consequences in the form of new industries,
enhanced functionality for existing systems, increased productivity in general, and
improvements in the quality of life. But the ultimate promises of AI are still decades
away, and the necessary advances in knowledge and technology will require a sustained
fundamental research effort.

INFORMATION SYSTEM:
An information system is a mechanism that helps people collect, store, organize, and use information. Ultimately, the information system is the computer's reason for being.

Because there are so many types of information, and uses for it, many kinds of information systems have been developed.

THE PURPOSE OF AN INFORMATION SYSTEM:

An information system consists of three basic components:

THE PHYSICAL:

The physical means for storing information, such as a file cabinet or hard disk. Depending on the organization's needs, storage requirements can be tiny or enormous (involving a mainframe system with terabytes of disk space).

THE PROCEDURES:

The procedures for handling information that ensure its integrity. In any information system, regardless of size, it is important to follow data management procedures to eliminate duplicate entries, validate the accuracy and format of data, and avoid the loss of important information.

THE RULES:

The rules regarding information's use and distribution. In any organization, data is meant to be used for specific purposes, to achieve a desired result. For example, a sales organization will use its data to make decisions about prices, inventory, and account management. By establishing rules governing the use of its information, an organization preserves its resources rather than wasting them on manipulating data in useless ways. To ensure the security of their mission-critical data, many organizations set rules that limit the information that can be made available to certain workers, enabling workers to access only the most appropriate types of information for their jobs. Different people require different information to perform their jobs. The rules of the system govern what information should be distributed to whom, at what time, and in what format.

Although these three basic components are simple, a complete information system can be very complicated. In addition to the three basic components listed above, you might add the means of distributing information to different users, whether it is a system of desk trays or a modern local area network. Most of today's information systems also include tools for sorting, categorizing, and analyzing information.

TYPES OF INFORMATION SYSTEM:


OFFICE AUTOMATION SYSTEMS:

An office automation system uses computers and networks to carry out various operations, such as word processing, accounting, document management, and communication. The purposes of an office automation system are to manage information and, even more important, to help users handle certain information-related tasks more efficiently. In large organizations, simple tasks such as project scheduling, recordkeeping, and correspondence can become extremely time-consuming and labor-intensive. By using office automation tools, workers at all levels can reduce the amount of time and effort they spend on such mundane tasks, freeing them to handle more mission-critical jobs such as planning, designing, and selling. For this reason, nearly every complete information system has an office automation component.

Office automation systems can be built from off-the-shelf applications.

TRANSACTION PROCESSING SYSTEMS:

A transaction is a complete event, which may occur as a series of many steps, such as taking an order from a customer. You conduct business transactions all the time but may never have considered the steps that make up a typical transaction, all of which can be processed through an information system. A system that handles the processing and tracking of transactions is called a transaction processing system.
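A minimal sketch of a transaction as a series of steps (Python, illustrative; the order-taking steps are hypothetical): the key property is that the steps either all complete or the earlier ones are undone.

```python
# A minimal sketch: taking a customer order as a two-step transaction,
# with a rollback if a later step fails.
def take_order(inventory, product, qty, orders):
    if inventory.get(product, 0) < qty:
        raise ValueError(f"not enough {product} in stock")
    inventory[product] -= qty                            # step 1: reserve stock
    try:
        orders.append({"product": product, "qty": qty})  # step 2: record order
    except Exception:
        inventory[product] += qty                        # roll back step 1
        raise

inventory, orders = {"widget": 10}, []
take_order(inventory, "widget", 3, orders)
print(inventory, orders)   # {'widget': 7} [{'product': 'widget', 'qty': 3}]
```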

DECISION SUPPORT SYSTEMS:

A decision support system is a specialized application used to collect and present certain types of business data, which can be used to aid managers in the decision-making process. Business managers frequently use decision support systems to access the data in the company's transaction processing system. In addition, these systems can include or access other types of data, such as stock market reports or data about competitors. By compiling this kind of data, the decision support system can generate specific reports that managers can use in making mission-critical decisions.

Decision support systems are useful tools because they provide management with highly tailored, highly structured data about relevant issues. Many decision support systems are spreadsheet or database applications that have been customized for specific businesses. These powerful systems can import and analyze data in various formats, such as flat database tables or spreadsheets, two-dimensional charts, or multidimensional "cubes". They can generate reports quickly, based on existing data, and update those reports instantly as data changes.
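A minimal sketch of generating such a summary report from raw data (Python, illustrative; the sales rows are made up):

```python
# A minimal sketch: summarize transaction rows into the kind of report a
# decision support system produces; re-running it reflects new data instantly.
sales = [
    {"region": "North", "product": "laptop", "amount": 1200},
    {"region": "North", "product": "phone",  "amount": 650},
    {"region": "South", "product": "laptop", "amount": 2400},
]

def report_by(rows, key):
    totals = {}
    for row in rows:
        totals[row[key]] = totals.get(row[key], 0) + row["amount"]
    return totals

print(report_by(sales, "region"))   # {'North': 1850, 'South': 2400}
print(report_by(sales, "product"))  # {'laptop': 3600, 'phone': 650}
```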

MANAGEMENT INFORMATION SYSTEMS:

Within any business, workers at different levels need access to the same type of information, but they may need to view the information in different ways. At a call center, for example, a supervisor may need to see a daily report detailing the number of calls received, the types of requests made, and the production levels of individual staff members. A middle-level manager, such as a branch manager, may need to see only a monthly summary of the data. Managers at different levels may also need very different types of data.

A management information system is a set of software tools that enables managers to gather, organize, and evaluate information about a workgroup, department, or an entire organization. These systems meet the needs of three different categories of managers (executives, middle managers, and front-line managers) by producing a range of standardized reports drawn from the organization's database. An efficient management information system summarizes vast amounts of business data into information that is useful to each type of manager.

EXPERT SYSTEMS:

An expert system is a specialized application that performs tasks that would normally be done by a human, such as medical diagnosis or review of credit histories for loan approval. After analyzing the pertinent data, some expert systems produce a recommendation for a course of action. Some systems are empowered to make decisions and initiate actions, such as ordering a new shipment of a given product from a supplier when the current inventory falls below a given level.

Expert systems are increasingly used in the medical profession, enabling a physician to enter a patient's condition and receive suggested treatment options. A financial expert system might recommend that a particular client be granted a requested raise in his or her credit limit.
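A minimal sketch of the if-then rule style such systems use (Python, illustrative; the rules and thresholds are entirely made up):

```python
# A minimal sketch: each if-then rule encodes a piece of human expertise
# about approving a requested credit limit.
def credit_expert(score, income, requested_limit):
    if score < 600:
        return "deny: credit score too low"
    if requested_limit > income * 0.5:
        return "refer to human: requested limit is high relative to income"
    return "approve requested credit limit"

print(credit_expert(score=720, income=50_000, requested_limit=10_000))
print(credit_expert(score=580, income=80_000, requested_limit=5_000))
```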

THE COMPUTER TREND:

A BRIEF HISTORY:

The last two decades have marked an enormous increase in the number of home computers. With it, computer owners have invariably taken to entrepreneurship in many varied fields. Thanks to the growth of technology, computers, and the Internet, new methods have been developed for processing everyday business activities easily. Without the advent of technology, routine tasks would otherwise have taken an enormous amount of time and specialization. Undoubtedly, the computer represents the top technology development of the last century as it relates to businesses today, both large and small. Advances in the field of technology have created a vast number of business opportunities.

SOME STATISTICS:

In 2003, the U.S. Small Business Administration produced a report/survey that established conclusively that more than 75% of small businesses owned computers and had heavily invested in new technology. Let us try to understand what computers mean to businesses and how they contribute to increasing their productivity.

Computers are extremely important in today's businesses, and not only PCs: other kinds of computers matter too, most noticeably in the evolution of debit and credit cards, which are entirely electronic. They are the most convenient and safest way to carry your money. Then there are the PCs and fax machines that have made it much easier to receive documents, email information, look up information, and do much more that a company might need to do quickly. The standard PC is also a great way to store information, and special software can make our jobs easier and erase human error that might otherwise occur (accounting software comes to mind).

Computers and computer networks act as the central nervous system of today's enterprise. It is rare to visit a business and see it operating without a computer. Computers don't make mistakes (though software does), you can store file upon file without using much space, they are time-efficient, and if you are ever stuck on something at work, you can simply look it up on the internet, where it is most likely covered. Google indexes billions of pages, so there is plenty of information to go around.

Businesses these days use computers for everything: ordering products, shipping information, price checks in retail stores, and more, and they reduce the amount of human intervention, which shrinks the margin of error. Instead of having to pick up the phone and waste time being put on hold or transferred around until you get the right person, you simply send an email or order through a wholesaler's site online, with no waiting involved except the time it takes to get the product to you. And communicating with companies around the globe is much easier by email or teleconference, with no need to wait for the correct time of day. As an example of using computers in business: in Hong Kong (though many other places have this technology too), the "Octopus" card makes life far more convenient; by using the "Octopus" card, we can pay the fare on the bus, MTR, train, and ferry, and buy things from convenience stores, supermarkets, and so on.

It is true that business is expanding quite a bit because of the internet and its applications, which have become available to almost everyone in the world in such a short time. As a reminder of the scale of e-business and the role of computers in business: Bill Gates was reported, for the 11th time, to be the richest man in the world, with a net worth of 51 billion U.S. dollars. Obviously, a net worth like his is next to impossible to match through e-business or business in general. But consider this: the founders of Google, Sergey Brin and Larry Page, are the 16th richest in the world with 11 billion each, and the striking thing is that a year earlier their net worth was only 4 billion each. In one year at Google, they each made 7 billion dollars! So if you think that e-business, or business with computers, is not the way for you to make money, think it over, because we are entering a very new world in how computers are used, and obviously there is money out there, even if it is a little lower than 51 or 11 billion.

USE OF COMPUTERS IN BUSINESS WORLD:


The fundamental reasons for the popularity of computers with small businesses are their efficiency, speed, low procurement cost and, more than anything else, their capability to handle multiple tasks with little chance of error.

1. OFFICE ROUTINES:

Almost invariably, businesses loaded with the burden of increasing workloads and the pressures of staying lean fall back upon technology for most of their administrative tasks. This work includes, among other things, bookkeeping, inventory management, and email. The advent of the Internet has also substantially contributed to bringing down the costs of communication and marketing. In a nutshell, technology has reduced the overall cost of business operations.

2. NEW BUSINESS OPPORTUNITIES:

The explosion of the Internet and e-commerce has opened up a plethora of opportunities for all types of businesses. New management methodologies, such as Six Sigma, are easier to implement due to statistical software. Also, companies are able to train their own employees using in-house Six Sigma software programs and, as a result, save money on labor costs.

It is now possible to have many business functions operate on autopilot. This has opened
up new opportunities for software development companies and business consultants.
Another business trend that has opened up as a result of advancing technology is
outsourcing. It is now possible for a company in America to have its data entry and
customer service centers in overseas countries such as the UK. In this way, companies
can service their customers 24/7.

3. INDISPENSABLE COMPONENTS OF SMALL BUSINESSES:

It is difficult to think of a situation where businesses can do without technology and computers today. It is extremely difficult to say whether businesses depend on computers or computers created business opportunities.

4. INVENTORY CONTROL AND MANAGEMENT:

Inventory control and management is a crucial process, especially in establishments related to retail and production. Computers are used for recording all aspects of goods coming in, details of goods and services, distribution of stock, and storage details. Note that in small retail and production firms, simple computer software is generally used, whereas large corporations employ Enterprise Resource Planning (ERP) systems.

5. ACCOUNTS AND PAYROLL MANAGEMENT:

Accounting and payroll management is also an important part of the overall system in a company. In any kind of industry, computers are largely used for managing accounts of administration, sales, purchases, and invoices, as well as payroll management, which includes recording the financial details of employees. These are just some components of the accounts and payroll management system where computing is used.

6. DATABASE MANAGEMENT:

Database management is associated with the filing, recording, managing, and retrieval of data whenever required. For the smooth running of a business, it is very important that all procedures and details are stored. This storage of data is done with the help of large databases and servers, which have to be maintained on a regular basis. These databases and servers are managed through computers by appropriate authorities in a company.
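A minimal sketch of storing and retrieving such records (Python, illustrative; the table and rows are made up, using the standard library's sqlite3 module):

```python
# A minimal sketch: record business data in a database, then retrieve it
# whenever required.
import sqlite3

conn = sqlite3.connect(":memory:")            # a throwaway in-memory database
conn.execute("CREATE TABLE customers (name TEXT, city TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [("Acme Ltd", "Hong Kong"), ("Widget Co", "Chicago")])

# Retrieval on demand: all customers in a given city.
for row in conn.execute("SELECT name FROM customers WHERE city = ?",
                        ("Chicago",)):
    print(row[0])                             # -> Widget Co
conn.close()
```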

7. SOFTWARE DEVELOPMENT:

It can be said that for every computing need, software has to be used. Software can only be made using computers, and its purpose is to help businesses combine processes and carry out their work properly. Nowadays, ERPs are largely used in business to blend all their processes together and execute the output as expected. There are many other software and application packages that a business may need to use according to the nature of its work.

This is some general information on the use of computers in businesses. There are many other fields, such as security control, communication, research, budgeting and forecasting, and web management, where computers are essential. The impact of information technology on business has certainly changed the way businesses operate and has coordinated different practices of the firm to function collectively. Computer use is not only present in businesses; computers are even used in sectors like medicine and defense.

TRENDS IN E-BUSINESS:
While there are many technical and business trends, here are a few important ones.

a. E-BUSINESS IS CRUCIAL TO BUSINESS SUCCESS:

It is being proved in industry after industry that you must do e-business in order to be profitable, grow, and even survive. E-business has proved to be longer lasting than the fad of re-engineering.

b. TECHNOLOGY FOCUS IS ON E-BUSINESS:

If you consider the activities of the hardware, software, and network vendors, they are directed toward providing the tools for e-business. To them, e-business is a natural extension of their products and services. In addition, new technology offers greater support for wireless and land-based communications and networking -- keys to the capability to do e-business.

c. E-BUSINESS PRODUCES CUMULATIVE EFFECTS:

E-business is long lasting. As you implement e-business, the nature of your relationships with customers, suppliers, and employees changes. New dynamic forces are set loose that are not as easily controlled and dealt with as those of the standard internal business. Dependency on business departments continues, but there is a greater dependence on IT resources.

d. E-BUSINESS IMPLEMENTATION RESULTS IN MANY SUCCESSES AND FAILURES:

The literature abounds with examples of both success and failure. Failure tends to be more obvious and dramatic with e-business because it is more visible externally.

THE FUTURE OF COMPUTERS:

The history of computers and computer technology thus far has been a long and fascinating one, stretching back more than half a century to the first primitive computing machines. These machines were huge and complicated affairs, consisting of row upon row of vacuum tubes and wires, often encompassing several rooms to fit it all in.

As anyone who has looked at the world of computers lately can attest, the size of computers has been reduced sharply, even as the power of these machines has increased at an exponential rate. In fact, the cost of computers has come down so much that many households now own not just one but two, three, or even more PCs.

As the world of computers and computer technology continues to evolve and change, many people, from science fiction writers and futurists to computer workers and ordinary users, have wondered what the future holds for the computer and related technologies. Many things have been pictured, from robots in the form of household servants to computers so small they can fit in a pocket. Indeed, some of these predicted inventions have already come to pass, with the introduction of PDAs and robotic vacuum cleaners.

Beyond these innovations, however, there are likely to be many, many more. One of the
most important areas of research in the world of computers is that of artificial
intelligence. When many people think of artificial intelligence, they may picture fully
aware machines, complete with emotions, and the problems that can arise from them.
Even though this remains the goal of many artificial intelligence researchers, in fact
artificial intelligence technology is already in place and already serving the needs of
humans everywhere.

One of the most powerful uses of artificial intelligence thus far is in the world of speech
recognition. This powerful technology is already in place in call centers, banks,
brokerage centers, insurance companies and other businesses throughout the world.
While speech recognition is still imperfect, it has improved greatly in recent years, and in
the future many routine, and even non-routine, phone calls and telephone inquiries may
be handled completely without human intervention.

Robot technology has also come a long way, but it still has a long way to go. Robots in the future are unlikely to take human form, except in a few specialized applications. Instead, robots are likely to do a great deal of work that is simply too dangerous for humans. From spaceflight applications to search and rescue, robots are likely to continue down the learning curve they have already entered, further enhancing human lives and providing valuable services for a fraction of the cost of today's robot helpers.

Quantum computers are also likely to transform the computing experience, for both
business and home users. These powerful machines are already on the drawing board,
and they are likely to be introduced in the near future. The quantum computer is expected
to be a giant leap forward in computing technology, with exciting implications for
everything from scientific research to stock market predictions.

Nanotechnology is another important part of the future of computers, expected to have a profound impact on people around the globe. Nanotechnology is the process whereby matter is manipulated at the atomic level, providing the ability to "build" objects from their most basic parts. Like robotics and artificial intelligence, nanotechnology is already in use in many places, providing everything from stain-resistant clothing to better suntan lotion. These advances in nanotechnology are likely to continue in the future, making this one of the most powerful aspects of future computing.

And if history is any guide, some of the most powerful advances in the world of computers and computer technology are likely to be completely unforeseen. After all, some of the most powerful technologies of the past have taken us by surprise, so stay tuned for a truly fascinating future.

