WHAT IS A COMPUTER?
A computer is an electronic device that executes software programs. It consists of two parts: hardware and software. The computer accepts input through input devices such as a mouse and keyboard, and displays output through output devices such as a monitor and printer. Computers vary considerably in size, from very small to very big, and their speeds also span a very large range. Computers have become indispensable in today's world; millions of people use them all over the world.
HISTORY OF THE COMPUTER:
The basic idea of computing developed in the 1200s, when a Muslim cleric proposed solving problems with a series of written procedures. Webster's Dictionary defines "computer" as any programmable electronic
device that can store, retrieve, and process data. The first computers were people! That is,
electronic computers were given this name because they performed the work that had
previously been assigned to people. "Computer" was originally a job title: it was used to
describe those human beings whose job it was to perform the repetitive calculations
required to compute such things as navigational tables, tide charts, and planetary
positions for astronomical almanacs.
ABACUS was an early aid for mathematical computations. Its only value is that it aids the memory of the human performing the calculation. A skilled abacus operator can work on addition and subtraction problems at the speed of a person equipped with a hand calculator (multiplication and division are slower). The abacus is often wrongly attributed to China; in fact, the oldest surviving abacus was used in 300 B.C. by the Babylonians. A modern abacus consists of rings that slide over rods.
A more modern abacus. Note how the abacus is really just a representation of the human
fingers: the 5 lower rings on each rod represent the 5 fingers and the 2 upper
rings represent the 2 hands.
A more modern set of Napier's Bones
Napier's Bones, invented by the Scottish mathematician John Napier (better known for inventing logarithms), were a set of numbered rods that simplified multiplication. Napier's invention led directly to the slide rule, first built in England in 1632 and still in use in the 1960s by the NASA engineers of the Mercury, Gemini, and Apollo programs, which landed men on the moon.
A slide rule
In 1642 Blaise Pascal, at age 19, invented the Pascaline as an aid for his father, who was a tax collector. Pascal built 50 of these gear-driven one-function calculators (they could only add).
Pascal's Pascaline
Just a few years after Pascal, the German Gottfried Wilhelm Leibniz managed to build a
four-function (addition, subtraction, multiplication, and division) calculator that he called
the stepped reckoner.
Jacquard's Loom showing the threads and the punched cards
By 1822 the English mathematician Charles Babbage was proposing a steam-driven calculating machine the size of a room, which he called the Difference Engine. This machine would be able to compute tables of numbers, such as logarithm tables. He obtained government funding for this project due to the importance of numeric tables in ocean navigation.
The 1890 census is tabulated on punch cards similar to the ones used 90 years earlier to control weaving looms. Developed by Herman Hollerith of MIT, the system uses electric power rather than mechanical power. The Hollerith Tabulating Company is a forerunner of today's IBM.
Just prior to the introduction of Hollerith's machine, the first printing calculator is introduced. In 1892 William Seward Burroughs, a sickly ex-teller, introduces a commercially successful printing calculator. Although the first model is hand-powered, Burroughs quickly introduces an electric model.
In 1925, unaware of the work of Charles Babbage, Vannevar Bush of MIT builds a
machine he calls the differential analyzer. Using a set of gears and shafts, much like
Babbage, the machine can handle simple calculus problems, but accuracy is a problem.
The period from 1935 through 1952 gets murky with claims and counterclaims of who
invents what and when. Part of the problem lies in the international situation that makes
much of the research secret. Other problems include poor record-keeping, deception and
lack of definition.
John Vincent Atanasoff begins work on a digital computer in 1936 in the basement of the Physics building on the campus of Iowa State. A graduate student, Clifford (John) Berry, assists. The "ABC" is designed to solve linear equations common in physics. It displays some early features of later computers, including electronic calculation. Atanasoff shows it to others in 1939 and leaves the patent application with attorneys for the school when he leaves for a job in Washington during World War II. Unimpressed, the school never files, and the ABC is cannibalized by students.
First in Poland, and later in Great Britain and the United States, the Enigma code is broken, and the information gained shortens the war. The British, led by Alan Turing, build electromechanical code-breaking machines, and later the electronic Colossus Mark I to attack another German cipher. The existence of these machines is a closely guarded secret of the British Government until 1970. The United States Navy, aided to some extent by the British, builds machines capable of breaking not only German codes but Japanese codes as well.
In 1945 von Neumann proposes the concept of a "stored program" in a paper that is never officially published. Two years later, with the Harvard Mark II stopped for repairs, Grace Hopper notices a moth in one of the relays, possibly causing the problem. From this day on, Hopper refers to fixing the system as "debugging".
Work completes on ENIAC in 1946. Although its design is only three years old, the machine is already woefully behind technologically, but the inventors opt to finish it while working on a more modern machine, the EDVAC. Programming ENIAC requires it to be rewired; a later version eliminates this problem. To make the machine appear more impressive to reporters during its unveiling, a team member (possibly Eckert) puts translucent spheres (halved ping-pong balls) over the lights. The US patent office will later recognize ENIAC as the first computer.
The next year scientists employed by Bell Labs complete work on the transistor (John
Bardeen, Walter Brattain and William Shockley receive the Nobel Prize in Physics in
1956), and by 1948 teams around the world work on a "stored program" machine. The
first, nicknamed "Baby", is a prototype of a much larger machine under construction in
Britain and is shown in June 1948.
The impetus over the next five years for advances in computers comes mostly from the government and military. UNIVAC, delivered in 1951 to the Census Bureau, results in a tremendous financial loss to its manufacturer, Remington-Rand. The next year Grace Hopper, now an employee of that company, proposes "reusable software": code segments that could be extracted and assembled according to instructions in a "higher-level language." The concept of compiling is born. Hopper would revise this concept over the next twenty years, and her ideas would become an integral part of all modern computers. CBS uses one of the 46 UNIVAC computers produced to predict the outcome of the 1952 Presidential election, but does not air the prediction for three hours because it does not trust the machine.
With the introduction of Control Data's CDC 1604 in 1958, one of the first fully transistorized computers, a new age dawns. The brilliant scientist Seymour Cray heads the development team. The same year the integrated circuit is invented by two men, Jack Kilby and Robert Noyce, working independently. The second network is developed at MIT. Over the next three years computers begin affecting the day-to-day lives of most Americans; the addition of MICR characters at the bottom of checks becomes common.
In 1961 Fairchild Semiconductor introduces the commercial integrated circuit. Within ten years all computers use these instead of discrete transistors. Formerly building-sized computers are now room-sized, and are considerably more powerful. The following year the Atlas becomes operational, displaying many of the features that make today's systems so powerful, including virtual memory, pipelined instruction execution and paging. Designed at the University of Manchester, it benefits from contributions by some of the people who had developed Colossus during World War II.
On April 7, 1964, IBM introduces the System/360. While a technical marvel, the main feature of this machine is business oriented: IBM guarantees the "upward compatibility" of the system, reducing the risk that a business would invest in outdated technology. Dartmouth College, where the first network was demonstrated 25 years earlier, moves to the forefront of the "computer age" with the introduction of TSS (Time-Sharing System), a crude (by today's standards) networking system. It is the first wide area network. In three years Randy Golden, president and founder of Golden Ink, would begin working on this network.
Within a year MIT returns to the top of the intellectual computer community with the introduction of a greatly refined network that features shared resources and uses the first minicomputer (DEC's PDP-8) to manage telephone lines. Bell Labs and GE play major roles in its design.
In 1969 Bell Labs, unhappy with the direction of the MIT project, leaves and develops its own operating system, UNIX. One of the many precursors to today's Internet, ARPANET, is quietly launched. Alan Kay, who will later become a designer for Apple, proposes the "personal computer." Also in 1969, unhappy with Fairchild Semiconductor, a group of technicians begin discussing forming their own company; this company, formed the next year, would be known as Intel. The movie Colossus: The Forbin Project has a supercomputer as the villain. The next year, The Computer Wore Tennis Shoes is the first feature-length movie with the word "computer" in the title. In 1971, Texas Instruments introduces the first "pocket calculator." It weighs 2.5 pounds.
With the country embroiled in a crisis of confidence known as Watergate, in 1973 a little-publicized judicial decision takes the patent for the computer away from Mauchly and Eckert and awards it to Atanasoff. Xerox popularizes the mouse, invented earlier by Douglas Engelbart at SRI. Proposals are made for the first local area networks.
In 1975 the first personal computer is marketed in kit form. The Altair features 256 bytes of memory. Bill Gates, with others, writes a BASIC interpreter for the machine. The next year Apple begins to market its first computer, also in kit form; the buyer supplies a monitor and keyboard. The earliest RISC platforms become stable. In 1976, Queen Elizabeth goes on-line with the first royal email message.
During the next few years the personal computer explodes on the American scene. Microsoft, Apple and many smaller PC-related companies form (and some die). By 1977 stores begin to sell PCs. Continuing today, companies strive to reduce the size and price of PCs while increasing capacity. Entering the fray, IBM introduces its PC in 1981 (actually IBM's second attempt; the first failed miserably). Time selects the computer as its "Machine of the Year" for 1982.
You might wonder what type of event is required to dislodge an industry heavyweight. In IBM's case it was their own decision to hire an unknown but aggressive firm called Microsoft to provide the software for their personal computer (PC). This lucrative contract allowed Microsoft to grow so dominant that by the year 2000 their market capitalization (the total value of their stock) was twice that of IBM, and they were found in Federal Court to have maintained an illegal monopoly. If you did computer programming in the 1970s, you dealt with what today are called mainframe computers, such as the IBM 7094.
There were two ways to interact with a mainframe. The first was called time sharing, because the computer gave each user a tiny sliver of time in round-robin fashion. Perhaps 100 users would be simultaneously logged on, each typing on a teletype such as the following:
The Teletype was the standard mechanism used to interact with a time-sharing computer
A teletype was a motorized typewriter that could transmit your keystrokes to the
mainframe and then print the computer's response on its roll of paper. You typed a single
line of text, hit the carriage return button, and waited for the teletype to begin noisily
printing the computer's response (at a whopping 10 characters per second). On the left-hand side of the teletype in the prior picture you can observe a paper tape reader and writer (i.e., a puncher). Here's a close-up of paper tape:
After observing the holes in paper tape it is perhaps obvious why all computers use
binary numbers to represent data: a binary bit (that is, one digit of a binary
number) can only have the value of 0 or 1 (just as a decimal digit can only
have the value of 0 thru 9). Something which can only take two states is very
easy to manufacture, control, and sense.
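The two-state idea can be sketched in a few lines of Python. This is a minimal illustration of reading a column of holes as a binary number; the particular bit pattern is made up for the example:

```python
def bits_to_value(bits):
    """Interpret a list of 0/1 bits (most significant first) as an integer.

    On paper tape, a hole is a 1 and no hole is a 0; each column of
    holes encodes one character or value this way.
    """
    value = 0
    for bit in bits:
        value = value * 2 + bit  # shift left by one bit, add the new bit
    return value

# Four bit positions give 2**4 = 16 distinct values; a hypothetical
# tape column [1, 0, 1, 1] reads as 8 + 0 + 2 + 1 = 11.
print(bits_to_value([1, 0, 1, 1]))  # 11
print(2 ** 5)  # a 5-bit tape code can represent 32 distinct values
```

Because each position only needs to distinguish two states, the hardware (a hole punch, a relay, a transistor) stays simple, which is exactly the point the paragraph above makes.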
The alternative to time sharing was batch mode processing, where the computer gives its full attention to your program. In exchange for the computer's full attention at run-time, you had to agree to prepare your program off-line on a keypunch machine, which generated punched cards.
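The round-robin time slicing described above can be sketched as follows. This is a simplified model, not a real scheduler; the user names and amounts of work are invented for the example:

```python
from collections import deque

def round_robin(jobs, slice_units=1):
    """Grant each job a small time slice in turn, the way a time-sharing
    system cycled through its logged-in users.

    jobs maps a user's name to units of work remaining; the return value
    is the order in which time slices were granted.
    """
    queue = deque(jobs.items())
    schedule = []
    while queue:
        name, remaining = queue.popleft()
        schedule.append(name)          # this user gets the CPU briefly
        remaining -= slice_units
        if remaining > 0:
            queue.append((name, remaining))  # unfinished: back of the line
    return schedule

print(round_robin({"alice": 2, "bob": 1, "carol": 3}))
# -> ['alice', 'bob', 'carol', 'alice', 'carol', 'carol']
```

Each user gets the illusion of exclusive access because the slivers of time come around faster than a human can type; batch mode is the degenerate case where one job keeps the machine until it finishes.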
An IBM keypunch machine, which operates like a typewriter except that it produces punched cards rather than a printed sheet of paper
But things changed fast. By the 1990s a university student would typically own a computer and have exclusive use of it in his or her dorm room.
INTEL:
Robert Noyce and Gordon Moore leave Fairchild and found Intel in 1968, joined by Andrew Grove; the new company focuses on random access memory (RAM) chips.
Question: if you can put transistors, capacitors, and other components on a chip, why couldn't you put a central processor on a chip?
Ted Hoff designs the architecture of the Intel 4004, the first microprocessor, beginning in 1969 and drawing on the simple design of Digital's PDP-8; the chip ships in 1971.
THE ORIGINAL PENTIUM:
The original Pentium is an extremely modest design by today's standards, and when it
was introduced in 1993 it wasn't exactly a blockbuster by the standards of its RISC
contemporaries, either. While its superscalar design (Intel's first) certainly improved on
the performance of its predecessor, the 486, the main thing that the Pentium had going
for it was x86 compatibility. In fact, Intel's decision to make enormous sacrifices of
performance, power consumption, and cost for the sake of maintaining the Pentium's
backwards compatibility with legacy x86 code was probably the most strategically important decision that the company has ever made.
The choice to continue along the x86 path inflicted some serious short- and medium-term
pain on Intel, and a certain amount of long-term pain on the industry as a whole (how
much pain depends on who you talk to), but as we'll see the negative impact of this
critical move has gradually lessened over time.
INTEL PROCESSORS
Microprocessors contain many transistors
ARTIFICIAL INTELLIGENCE:
The branch of computer science concerned with making computers behave like humans. The term was coined in 1956 by John McCarthy at the Dartmouth Conference.
The intellectual roots of AI, and the concept of intelligent machines, may be found in
Greek mythology. Intelligent artifacts appear in literature since then, with
mechanical devices actually (and sometimes fraudulently) demonstrating
behaviour with some degree of intelligence. After modern computers became
available following World War II, it has become possible to create programs
that perform difficult intellectual tasks. Even more importantly, general
purpose methods and tools have been created that allow similar tasks to be
performed.
Artificial Intelligence is, in our opinion, the most exciting area of computer science.
Artificial Intelligence (or AI, for short) is the name given to any attempt to have
computers gain attributes of the human mind.
Of course, this is a very vague statement, and much argument has happened over what
exactly constitutes AI (mathematicians and scientists hate vague statements). There are
essentially two schools of thought: Strong AI and Weak AI.
Weak AI philosophers believe that computers, as advanced as they may get, will only be
able to seem intelligent. The responses of the computer may seem like intelligent actions,
but the Weak AI theory insists that the computers are just mindlessly manipulating data
to produce "intelligent" actions. Strong AI philosophers believe that computers someday
can be as intelligent as humans.
Of course, the heart of the discussion is this: what is intelligence? Normally, we declare that humans are the standard for intelligence, but then, isn't human intelligence, at the very basic level, just a bunch of mindless chemical reactions? Both the Weak AI and Strong AI positions have strong supporters, many of them quite fanatical.
Aside from the philosophy of the Strong AI and the Weak AI debate, there is also
another, related, division in AI: Connectionism and Classicalism.
Classicalism is the AI found in chess programs, weather diagnostics, and language processors. Classicalism, also known as the top-down approach, approaches AI from the standpoint that human minds are computational by nature; that is, we manipulate data one piece at a time (serially) through built-in circuitry in our brains. So much of the classical approach to AI consists of things like minimax trees, preprogrammed databases, and prewritten code. Expert systems are a well-known example of classical AI.
Connectionism is the newer form of AI. The problem with classicalism, connectionists say, is that it is too unlike the human mind. The human mind can learn, expand, and change, but many expert systems are too rigid and don't learn. Connectionism is the AI that the media really likes, which is why it includes famous names like neural networks and parallel processing. Connectionism seems a step closer to the human mind, since it uses networks of nodes that resemble the human brain's network of neurons.
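The node of such a network can be sketched as a single artificial neuron. This is a minimal illustration; the input values and weights below are arbitrary, and a real network would learn its weights rather than have them fixed:

```python
import math

def neuron(inputs, weights, bias):
    """A single artificial neuron: a weighted sum of inputs squashed
    through a sigmoid, loosely modeled on a biological neuron firing."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))  # activation between 0 and 1

# "Learning" means adjusting the weights and bias from examples;
# here they are fixed constants purely for illustration.
activation = neuron([1.0, 0.5], [0.8, -0.4], 0.1)
print(round(activation, 3))
```

A connectionist system wires thousands of these nodes together and adjusts the weights from data, which is why it can learn where a hand-coded expert system cannot.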
Connectionism, however, also has its flaws. It is often inaccurate and slow, and it has so far failed to reach higher-level AI, such as language and some advanced logic, which humans seem to pick up easily in little time. Human intelligence isn't built from scratch, as connectionist systems often are, so for those higher-level tasks classicalism is by far the better suited. Connectionism, however, is quite successful at modeling lower-level thinking such as motor skills, face recognition, and some vision.
Currently, no computers exhibit full artificial intelligence (that is, are able to simulate human behavior). The greatest advances have occurred in the field of game playing. The best computer chess programs are now capable of beating humans: in May 1997, an IBM supercomputer called Deep Blue defeated world chess champion Garry Kasparov in a chess match.
In the end, both viewpoints are valid in both debates. Regardless of the viewpoint,
however, the most important goal for AI is that it helps us understand the mechanisms of
the human mind.
CASE STUDIES:
The following four case studies highlight application areas where AI
technology is having a strong impact on industry and everyday life.
1. AUTHORIZING FINANCIAL TRANSACTIONS:
A logistics planning tool, DART, pressed into service for operations Desert Shield and Desert Storm, was credited with completely repaying three decades of investment in AI research.
THE FUTURE:
Even as AI technology becomes integrated into the fabric of everyday life, AI researchers
remain focused on the grand challenges of automating intelligence. Work is progressing
on developing systems that converse in natural language, that perceive and respond to
their surroundings, and that encode and provide useful access to all of human knowledge
and expertise. The pursuit of the ultimate goals of AI -- the design of intelligent artifacts;
understanding of human intelligence; abstract understanding of intelligence (possibly
superhuman) -- continues to have practical consequences in the form of new industries,
enhanced functionality for existing systems, increased productivity in general, and
improvements in the quality of life. But the ultimate promises of AI are still decades
away, and the necessary advances in knowledge and technology will require a sustained
fundamental research effort.
INFORMATION SYSTEM:
An information system is a mechanism that helps people collect, store, organize, and use information. Ultimately, the information system is the computer's reason for being.
Because there are so many types of information, and so many uses for it, many kinds of information systems have been developed.
THE PHYSICAL:
The physical means for storing information, such as a file cabinet or hard disk. Depending on the organization's needs, storage requirements can be tiny or enormous (involving a mainframe system with terabytes of disk space).
THE PROCEDURES:
THE RULE:
Although these three basic components are simple, a complete information system can be very complicated. In addition to the three basic components listed above, you might add the means of distributing information to different users, whether it is a system of desk trays or a modern local area network. Most of today's information systems also include tools for sorting, categorizing, and analyzing information.
and selling. For this reason, nearly any complete information system has an
office automation component.
Decision support systems are useful tools because they provide management with highly tailored, highly structured data about relevant issues. Many decision support systems are spreadsheet or database applications that have been customized for specific businesses. These powerful systems can import and analyze data in various formats, such as flat database tables, spreadsheets, two-dimensional charts, or multidimensional "cubes". They can generate reports quickly, based on existing data, and update those reports instantly as data changes.
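The kind of instantly updatable report described above can be sketched as a small grouping function. The sales records, the region names, and the column names here are all hypothetical, invented purely for illustration:

```python
# Hypothetical sales records, such as a decision support system might
# import from a flat database table.
sales = [
    {"region": "North", "amount": 1200},
    {"region": "South", "amount": 800},
    {"region": "North", "amount": 450},
]

def summarize(rows, key, value):
    """Group rows by one column and total another: a miniature version
    of the report a decision support system regenerates as data changes."""
    totals = {}
    for row in rows:
        totals[row[key]] = totals.get(row[key], 0) + row[value]
    return totals

print(summarize(sales, "region", "amount"))  # {'North': 1650, 'South': 800}
```

Rerunning `summarize` after new rows arrive is what "updating the report instantly" amounts to: the report is just a function of the current data.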
A management information system is a set of software tools that enables managers to gather, organize, and evaluate information about a workgroup, department, or an entire organization. These systems meet the needs of three different categories of managers (executives, middle managers, and front-line managers) by producing a range of standardized reports drawn from the organization's database. An efficient management information system summarizes vast amounts of business data into information that is useful to each type of manager.
EXPERT SYSTEMS:
Expert systems are increasingly being used in the medical profession, enabling a physician to enter a patient's condition and review suggested treatment options. A financial expert system might recommend that a particular client be granted a requested increase in his or her credit limit.
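The if-then core of such a financial expert system can be sketched as a short rule base. The rules, thresholds, and field names below are entirely hypothetical, made up for illustration:

```python
# Hypothetical rules: each pairs a condition on the client's facts
# with a recommendation. Thresholds are invented for the example.
RULES = [
    (lambda f: f["on_time_payments"] >= 24 and f["utilization"] < 0.3,
     "approve credit limit increase"),
    (lambda f: f["utilization"] >= 0.9,
     "decline: utilization too high"),
]

def consult(facts, default="refer to a human analyst"):
    """Fire the first matching rule, the basic inference step of a
    rule-based expert system; fall back to a default when none match."""
    for condition, recommendation in RULES:
        if condition(facts):
            return recommendation
    return default

print(consult({"on_time_payments": 30, "utilization": 0.2}))
# prints "approve credit limit increase"
```

Real expert system shells separate the rule base from the inference engine exactly like this, so domain experts can add rules without touching the engine.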
A BRIEF HISTORY:
SOME STATISTICS:
What computers mean to businesses, and how they contribute to increased productivity:
Computers are extremely important in today's businesses, and not only PCs but other kinds of computers as well. Most noticeable is the evolution of debit and credit cards, which are entirely electronic; they are among the most convenient and safest ways to carry money. Then there are the PCs and fax machines that have made it much easier to receive documents, email information, look things up, and do much more that a company might need to do quickly.
The standard PC is also a great way to store information, and special software can make jobs easier and reduce the human error that might otherwise occur, in accounting, for example.
Computers and computer networks act as the central nervous system of today's enterprise. It is rare to visit a business and see it operating without a computer. Computers themselves don't make mistakes (though software does); they can store file upon file without taking up much physical space; they are time-efficient; and if you are ever stuck on something at work, you can usually look it up on the Internet, where search engines such as Google index billions of pages.
It is true that business is expanding quite a bit because of the Internet and the applications it has made available to almost everyone in the world in such a short time. As a reminder of the role of computers in business: Bill Gates was reported, for the 11th time, to be the richest man in the world, with a net worth of 51 billion US dollars. His net worth is next to impossible to match, even through e-business and business in general. But consider this: the founders of Google, Sergey Brin and Larry Page, are the 16th richest in the world, with 11 billion each, and just a year earlier their net worth was only 4 billion each. In one year at Google they each made 7 billion dollars! So if you think that e-business, or business with computers, is not the way for you to make money, think it over, because we are entering a very new world in how computers are used, and there is clearly money out there, even if it is a little lower than 51 or 11 billion.
1. OFFICE ROUTINES:
It is now possible to have many business functions operate on autopilot. This has opened
up new opportunities for software development companies and business consultants.
Another business trend that has opened up as a result of advancing technology is
outsourcing. It is now possible for a company in America to have its data entry and
customer service centers in overseas countries such as the UK. In this way, companies
can service their customers 24/7.
for recording all aspects of the goods coming in, details of goods and services, distribution of stock, and storage details. Note that in small retail and production firms, simple computer software is generally used, whereas large corporations employ Enterprise Resource Planning (ERP) systems.
6. DATABASE MANAGEMENT:
7. SOFTWARE DEVELOPMENT:
5. TRENDS IN E-BUSINESS:
While there are many technical and business trends, here are a few important ones.
a. E-BUSINESS IS CRUCIAL TO BUSINESS SUCCESS:
As anyone who has looked at the world of computers lately can attest, the size of
computers has been reduced sharply, even as the power of these machines has increased
at an exponential rate. In fact, the cost of computers has come down so much that many
households now own not only one, but two, three or even more, PCs.
As the world of computers and computer technology continues to evolve and change,
many people, from science fiction writers and futurists to computer workers and ordinary
users, have wondered what the future holds for the computer and related technologies.
Many things have been pictured, from robots in the form of household servants to computers so small they can fit in a pocket. Indeed, some of these predicted inventions have already come to pass, with the introduction of PDAs and robotic vacuum cleaners.
Beyond these innovations, however, there are likely to be many, many more. One of the
most important areas of research in the world of computers is that of artificial
intelligence. When many people think of artificial intelligence, they may picture fully
aware machines, complete with emotions, and the problems that can arise from them.
Even though this remains the goal of many artificial intelligence researchers, in fact
artificial intelligence technology is already in place and already serving the needs of
humans everywhere.
One of the most powerful uses of artificial intelligence thus far is in the world of speech
recognition. This powerful technology is already in place in call centers, banks,
brokerage centers, insurance companies and other businesses throughout the world.
While speech recognition is still imperfect, it has improved greatly in recent years, and in
the future many routine, and even non-routine, phone calls and telephone inquiries may
be handled completely without human intervention.
Robot technology has also come a long way, but it still has a long way to go. Robots in the future are unlikely to take human form, except in a few specialized applications.
Instead, robots are likely to do a great deal of work that is simply too dangerous for
humans to accomplish. From spaceflight applications to search and rescue, robots are
likely to continue down the learning curve they have already entered, further enhancing
human lives and providing valuable services for a fraction of the cost of today's robot
helpers.
Quantum computers are also likely to transform the computing experience, for both
business and home users. These powerful machines are already on the drawing board,
and they are likely to be introduced in the near future. The quantum computer is expected
to be a giant leap forward in computing technology, with exciting implications for
everything from scientific research to stock market predictions.
And if history is to be any guide, some of the most powerful advances in the world of
computers and computer technology are likely to be completely unforeseen. After all,
some of the most powerful technologies of the past have taken us by surprise, so stay tuned for a truly fascinating future.