
Module One: Introduction to Computer Science
What is Computer Science?
Every day, we are surrounded by and exposed to significant advances in computing: high-speed supercomputers that perform over 8 quadrillion (10^15) mathematical calculations per second, artificial intelligence systems that can answer English questions faster and more accurately than humans, and tiny computer chips embedded in appliances, clothing, and even human bodies.

Technological breakthroughs will continue to arrive at an exponential rate and will continue to change the lives of everyone around the world. The goal of this course is to provide you with a basic understanding of Computer Science.

Definition of Computer Science


Learning Objectives

o Understand and convey the common misconceptions about Computer Science.
o Provide a definition of Computer Science that captures the breadth and various scopes of the discipline.
o Explain what an algorithm is and its relevance to Computer Science, and provide examples.
Many people can produce a reasonably accurate description of most scientific fields, such as Chemistry and Biology, without ever taking a course on the subject. Computer Science is one of the youngest scientific fields, and yet most people cannot intuitively provide an accurate description of Computer Science and the types of problems computer scientists strive to solve.

The following are some common misconceptions about this field:


1. Computer Science is the study of computers

Some of the earliest and most fundamental theoretical work in computer science took place between 1920 and 1940, years before the first computer system was developed. Even today, there are branches of computer science that are quite distinct from the study of actual computing machines. For example, the study of Human-Computer Interaction (HCI) focuses on how people use computers and on the user's experience. HCI combines a variety of disciplines other than computer science, such as psychology, social sciences, cognitive sciences, and many more.

2. Computer Science is the study of how to write computer programs


Many people are introduced to Computer Science by learning to write programs in programming languages such as C++, Java, or Python. This is an almost universal approach when introducing students to Computer Science at the entry level. It can create the misunderstanding that computer science is equivalent to computer programming.

While computer programming is extremely important and enables programmers and researchers to create, test, and implement new ideas, it is also just a tool (similar to the computer itself).

3. Computer Science is the study of the uses and applications of computers and software

When computer science is introduced without any programming, it may appear to focus only on the applications of computers and software. These include mainstream applications such as word processors, search engines, database systems, graphics and imaging software, web browsers, and mobile applications. All of these tools are widely used by programmers and non-programmers alike. While most professions use these computer applications in their jobs, computer scientists are responsible for designing, building, and testing those applications, among other things.

These misconceptions about computer science are not necessarily incorrect, but they are incomplete. Computers, programming languages, software, and applications are all part of computer science. So how can we define computer science in a way that captures the breadth and scope of the discipline in a single definition?

Algorithms
There are many different definitions of computer science, but one definition best encompasses the breadth and diversity of the field. Professors Norman Gibbs and Allen Tucker defined computer science as the study of algorithms: the task of a computer scientist is to design and develop algorithms to solve a range of important problems. This design process includes:

1. Formal and Mathematical Properties: Studying the behavior of algorithms to determine if they are
correct and efficient.
2. Hardware Realizations: Designing and building computer systems that are able to execute algorithms.
3. Linguistic Realizations: Designing programming languages and translating algorithms into these languages so they can be executed by the hardware.
4. Applications: Identifying important problems and designing correct and efficient software packages to solve these problems.

So what exactly is an algorithm? The dictionary defines the word algorithm as follows:

A procedure for solving a mathematical problem in a finite number of steps that frequently involves repetition of an operation; a step-by-step method for accomplishing a task.

An informal definition of an algorithm would be an ordered sequence of instructions that is guaranteed to solve a specific problem. It might look something like this:

o Step 1: Do something
o Step 2: Do something
o Step 3: Do something
o ...
o Step N: Stop, you are finished.

If you carefully follow this list of steps, you should have a solved problem once you reach the end. We use algorithms
all the time in our everyday lives, even though we don't call them algorithms. Whenever we follow a set of instructions
to assemble a child's toy, bake a cake, or go through a college registration process, we are following an algorithm that
will solve our problem or the task at hand.
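To make the idea concrete, here is a minimal sketch of such an algorithm written in Python (one of the languages mentioned earlier). The problem chosen -- averaging a list of exam scores -- and all names in it are illustrative, not taken from the text:

    # A small algorithm: compute the average of a list of exam scores.
    # Each statement is one ordered, unambiguous step, and the procedure
    # always halts after a finite number of operations.
    def average_score(scores):
        if len(scores) == 0:          # guard: an empty list has no average
            return 0
        total = 0                     # Step 1: start the running total at zero
        for score in scores:          # Steps 2..N-1: add each score in order
            total = total + score
        return total / len(scores)    # Step N: divide and stop -- finished

    print(average_score([70, 85, 90]))   # prints 81.66666666666667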

Example: An algorithm taken from the back of a shampoo bottle, instructing the user how to use the product:

o Step 1: Wet hair
o Step 2: Lather
o Step 3: Rinse
o Step 4: Repeat
There is a lot of ambiguity in this algorithm. In Step 4, which operations should be repeated? If we go back to Step 1, we unnecessarily re-wet the hair (since it will probably still be wet). If we go back to Step 3, our hair won't be any cleaner, since we did not reapply the shampoo. Step 4 is therefore too ambiguous about what we should repeat, which violates the well-ordered requirement of an algorithm. Also, this algorithm never stops: it will continue to repeat the process forever.
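Translated literally into code, this ambiguity becomes an endless loop. Below is a rough Python sketch of the contrast (the function names are invented for illustration); the corrected version repeats the lather-and-rinse steps a fixed number of times and then halts:

    def wet_hair():  print("wetting hair")
    def lather():    print("lathering")
    def rinse():     print("rinsing")

    # A literal reading of "Repeat" gives a loop with no stopping rule,
    # so it never halts and is therefore not a valid algorithm:
    #     while True:
    #         lather()
    #         rinse()

    # A well-ordered, finite version: wet once, then lather and rinse
    # exactly twice, then stop.
    wet_hair()
    for _ in range(2):
        lather()
        rinse()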

The following statements are too ambiguous and can leave us confused about which step we should execute next:

o Go back and do it again (Do what again?)
o Start over (From where?)
o If you understand this material, you may skip ahead (How far?)
o Do either Part 1 or Part 2 (How do I decide which one to do?)
We must be extremely precise in specifying the order of operations to be carried out. One way is to number the steps
and reference those numbers. For example:

o Go back to Step 3 and continue execution from that point
o Start over from Step 1
o If you understand this material, skip ahead to line 21
o If you are in Grade 12, do Part 1 beginning with Step 9; otherwise, do Part 2 beginning with Step 40.
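In a programming language, this kind of explicit step numbering becomes ordinary control flow. Here is a brief Python sketch of the last statement above (the grade value and the two part functions are made up for illustration):

    def do_part_1():  print("doing Part 1, beginning with Step 9")
    def do_part_2():  print("doing Part 2, beginning with Step 40")

    grade = 12   # illustrative input

    # "If you are in Grade 12, do Part 1; otherwise, do Part 2,"
    # expressed as an unambiguous branch:
    if grade == 12:
        do_part_1()
    else:
        do_part_2()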
Algorithms are composed of "operations". Operations must meet two criteria: they must be unambiguous and they
must be effectively computable.

An example of an algorithm to make a pie:

o Step 1: Make the crust
o Step 2: Make the cherry filling
o Step 3: Pour the filling into the crust
o Step 4: Bake at 350 degrees for 45 minutes
This algorithm would be fine for an experienced baker, but for a beginner, Steps 1 and 2 would be very confusing, while Steps 3 and 4 would be easy to execute. A more detailed algorithm would be:
o Step 1: Make the crust
o 1.1 Take 1 1/3 cups of flour
o 1.2 Sift the flour
o 1.3 Mix the sifted flour with 1/2 cup butter and 1/4 cup water
o 1.4 Roll into two 9-inch pie crusts
o Step 2: Make the cherry filling
o 2.1 Open a 16-ounce can of cherry pie filling and pour into a bowl
o 2.2 Add a dash of cinnamon and nutmeg, and stir.
Now most inexperienced bakers can follow these instructions and successfully make a pie. But what about small children -- would they be able to follow these instructions? You would probably have to further refine Step 1.2, since most children do not know how to sift flour. Overall, every step is now clearly worded and does not contain any ambiguous operations.
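This kind of stepwise refinement maps naturally onto functions that call smaller functions. Here is a rough Python sketch of the same idea (every function name below is invented for illustration; the print statements stand in for the real-world actions):

    # Each refined sub-step becomes its own small function.
    def take_flour():   print("1.1 take 1 1/3 cups of flour")
    def sift_flour():   print("1.2 sift the flour")
    def mix_dough():    print("1.3 mix flour with 1/2 cup butter and 1/4 cup water")
    def roll_crusts():  print("1.4 roll into two 9-inch pie crusts")

    def make_crust():                 # Step 1, refined into Steps 1.1-1.4
        take_flour()
        sift_flour()
        mix_dough()
        roll_crusts()

    def make_filling():               # Step 2, refined into Steps 2.1-2.2
        print("2.1 open a 16-ounce can of cherry pie filling")
        print("2.2 add a dash of cinnamon and nutmeg, and stir")

    def make_pie():                   # the top-level algorithm: four big steps
        make_crust()
        make_filling()
        print("3. pour the filling into the crust")
        print("4. bake at 350 degrees for 45 minutes")

    make_pie()

If a step is still too vague for the computing agent (say, a child who cannot sift flour), we refine that function further; the structure of the program mirrors the structure of the refinement.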

Why should operations be unambiguous?


When writing algorithms in computer science, most of the time our algorithms will be executed by a computing agent. That means a machine, robot, person, or thing will be carrying out the steps in the algorithm. Computers are only as intelligent as the algorithms we program them to follow. Therefore, if our algorithms are ambiguous, they will be faulty, and the computer may not understand what we are trying to instruct it to do.

It is not enough that an operation is understandable; operations must also be doable by the computing agent. For example, if an algorithm tells you to flap your arms and fly, you understand exactly what it is asking you to do, but you are incapable of flying, so the operation is not doable. The term effectively computable means there is a computational process that allows the computing agent to complete the operation successfully. Another important characteristic of an algorithm is that the result must be reached after a finite number of operations: the algorithm must reach the end of its operations and stop. If not, it would continue to loop forever!

Now that we have investigated all the important characteristics and traits of an algorithm, we can give a suitable definition:

An algorithm is a well-ordered collection of unambiguous computable operations that, when executed, produces a
result and halts in a finite amount of time.
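As one last illustration, here is a small Python algorithm that satisfies every part of this definition: the steps are well-ordered, each operation is unambiguous and effectively computable, and the loop always halts after a finite number of iterations. The problem -- finding the largest value in a list -- is my own example, not one from the text:

    def largest(values):
        # Precondition: values is a non-empty list of numbers.
        best = values[0]              # start with the first value
        for v in values[1:]:          # examine each remaining value once
            if v > best:              # an unambiguous, computable comparison
                best = v
        return best                   # produce a result and halt

    print(largest([3, 41, 12, 9, 74, 15]))   # prints 74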

Evolution of Computers
Learning Objectives
o Identify and describe some historical inventions that influenced the design of the modern computer.
o Identify some prominent people credited for ideas leading to the design of the modern computer.
o Gain a historical perspective of the evolution of computers.
o Understand the key characteristics that distinguish each generation.
Computers have infiltrated every aspect of our modern society and seem to be a relatively new phenomenon, but the
ideas behind computing and computation are not new. Although the tools we use have evolved, computing has been
around for centuries.

Today's computers do much more than simple computations. For example, many people perform banking transactions online, computers in supermarkets scan and calculate grocery bills, email is now a common means of communication, traffic in major cities and on highways is controlled by computers, police keep criminal records in computer databases, and you are taking a university course online.

But where did the idea of computing and computation come from? Humans have needed to quantify and count things for thousands of years. Throughout history, as trade increased and cultures became more complex, the need for more advanced counting tools became apparent. While the modern computer has existed for only half a century, the ideas behind its development have been forming for centuries. This lesson looks at a few key inventions that shaped and influenced the design of the modern computer.

Evolution of Computers: Early Period (up to 1940)

2500 BC: The Abacus

One early counting tool was the Abacus, invented thousands of years ago. It was used by the Chinese, Babylonians, and other early cultures to help with counting, adding, and subtracting. It was made of beads and wooden rods. The Abacus was a memory aid rather than a calculator: it did not perform any calculations but simply helped keep track of numbers while someone mentally calculated a result.

1642: Pascal's Pascaline

As a young man, Blaise Pascal had to help his father with his work as a tax collector. This was very tedious work, since all the calculations were done by hand, so Pascal decided to build a machine to help with these calculations. This mechanical device was eventually named the Pascaline. It used a system of gears and toothed wheels to perform addition and subtraction.

Reaction to the Pascaline was surprisingly negative. People were distrustful of a calculating machine -- was it accurate? Was it cheating them of their hard-earned money? Even more pressing was the worry that it would take away jobs and lead to unemployment. Unpopular as it was in its own time, the Pascaline led the way for the development of more complicated calculating machines by later generations of inventors.

1801: Jacquard Loom

In the early 1800s, Joseph-Marie Jacquard invented an automatic weaving loom. This machine was capable of automatically weaving complex patterns by following instructions from a series of punch cards. The cards were attached together to form a continuous loop, repeating the weaving pattern until the loom was stopped manually. The Jacquard Loom represents the first time the idea of a "stored" program was implemented: the same loom could weave different patterns simply by changing the loop of punch cards. Although looms were initially destroyed by weavers fearing for their jobs, the Jacquard Loom was widely used in the textile industry within ten years of its invention.

1821: Babbage's Difference Engine

Charles Babbage designed a device named the Difference Engine. Its purpose was to automate the production of mathematical
tables. At the time, these tables were calculated by hand and required enormous amounts of time and effort. The
Difference Engine was intended to be fully automatic, accurate to 31 decimal places, and provide printouts of its
resulting tables. Babbage began to build his Difference Engine, but ran out of funding long before its completion.

A model of Babbage's Difference Engine was built in the 1990s using his original plans. It works perfectly, highlighting the fact that Babbage's idea was well ahead of its time: the technology of his day simply did not provide the means to build such a precise and complex machine. Babbage's next project was even more ambitious -- to build the world's first general-purpose programmable computer. His dream machine was called the Analytical Engine. It was never built, but the plans described a machine conceptually very similar to today's computers.

1890: Hollerith's Tabulating Machine

The US Census Bureau had a problem: it was estimated that the 1890 census would take well over ten years to tabulate using the existing manual method. The Bureau held a competition to determine what new tabulating method would be used for that year's census, and Herman Hollerith's Tabulating Machine was the clear winner. His electrical system of punch cards and clock-like counters reduced the time needed to process the census data from ten years down to three months. Hollerith's Tabulating Machine proved successful for other statistical applications as well. In fact, Hollerith's company still exists today: in 1924, it became known as International Business Machines, or IBM.

Evolution of Computers: The First Computers (1940-1950)
By the twentieth century, technology had reached a point where it was feasible to build the type of machines once imagined by earlier visionaries such as Charles Babbage. These computers barely resemble today's computers, but they represented huge advancements in the technology of the time.

1939: Atanasoff-Berry Computer (ABC)

In 1939, John Atanasoff wanted a machine to help solve complicated mathematical equations. He developed what was almost the first electronic digital computer, but he never managed to completely finish building his machine, as he was distracted by projects brought on by the war. Atanasoff was a professor at Iowa State University at the time. For some reason, the university never filed the patent paperwork for the ABC, and the machine was left to gather dust in the basement of the physics building; when storage space was needed, the ABC was dismantled. After a lengthy court battle, a federal judge concluded in 1973 that Atanasoff was in fact the inventor of the first electronic digital computer and declared invalid the earlier patent claiming that the ENIAC was the first computer.

1943: The Colossus

The Colossus was created in 1943 by a team of mathematicians and engineers, led by the engineer Tommy Flowers, working at the top-secret Bletchley Park intelligence and code-breaking centre in England, where Alan Turing was also based. The Colossus was the first fully electronic computer. It was used throughout much of the Second World War to crack German military codes, and it played a significant role in the Allied victory, as the Germans did not know their messages were being intercepted and decoded.

The Colossus processed information at a rate of 5,000 characters per second. This was very fast: even modern desktop computers would take almost the same amount of time to perform the same decoding task. It was built using close to two thousand vacuum tubes, which are prone to failure, especially when powered on and off, so once powered on, the Colossus was never turned off during the entire war. In all, ten Colossi were built and used during the war. At the end of the war, all of the Colossi were dismantled and all technical diagrams burned to keep their existence secret. No one even acknowledged that the machines had existed until 1976, when secret documents were released by the British government.

1944: Harvard Mark I

The Mark I was the first automatic, general-purpose electromechanical calculator. It was sponsored by IBM and the
US Navy and intended for calculating ballistic trajectories for the military. It was designed by Howard Aiken and
Grace Hopper at Harvard University. This huge machine measured 55 feet long, 8 feet high, and weighed 5 tons. It
used punch cards as input and was extremely noisy. Its speed was remarkable -- it was able to do calculations five or
six times faster than a human!

These early computers were not nearly as reliable as today's computers. They often had hardware problems. One day, a
particular hardware failure had everyone puzzled. It was Dr. Grace Hopper who solved the problem -- she found that a
moth had managed to find its way inside the machine and caused it to stop working, hence the first "computer bug". To
this day, the term is still used to describe puzzling computer problems. While the Mark I was soon surpassed in terms
of speed by newer computers, it continued to be used for over 15 years, a much longer lifespan than most modern
computers!

1946: ENIAC

The ENIAC is often called the first electronic digital computer; it competes with the Atanasoff-Berry Computer for this title. Its inventors, John Mauchly and J. Presper Eckert, had spent some time in Atanasoff's lab examining the ABC, and it has been argued that their design ideas were simply modifications of the ABC.

While that first title is debatable, the ENIAC was definitely the first "general-purpose" digital computer. However, "reprogramming" the ENIAC meant unplugging and re-plugging some 6,000 wires in a new configuration. Reprogramming was a very tedious task, to say the least, but the ENIAC was the first computer to perform different tasks simply by changing its wiring.

The ENIAC weighed 30 tons and measured two stories high. It performed at least 500 times faster than previous calculating machines. One problem, however, was that it required replacement of some of its 18,000 vacuum tubes approximately every 7 minutes. Like the Mark I, the ENIAC was originally built to help the US military in its WWII efforts, but its official unveiling occurred two months after the war had ended.

Evolution of Computers: Modern Era (1950-Present)

In the last few decades, computer technology has advanced very quickly; a computer that was considered the most powerful available two years ago is now ancient by technology standards. Computer generations are an attempt to broadly group computers in terms of their technology and age. The dates defining each generation are approximations, and different sources use different dates, but the overall idea of computer generations remains the same.

First Generation (1950-1957)



The First Generation of computers dates back to the 1950s, when a single computer filled a large room and consumed as much energy as an entire block of homes. These machines used vacuum tubes and had the processing power of today's pocket calculators. Due to their size and cost, First Generation computers were used only by large corporations, research institutions, military facilities, and government departments. At the time, popular opinion held that only a few dozen computers would ever be required to meet the needs of the world.

A quote from a magazine in 1949 gives an idea of the predicted future of computers:

"Where a calculator on the ENIAC is equipped with 18,000 vacuum tubes and weighs 30 tons, computers in the future
may have only 1,000 vacuum tubes and perhaps weigh 1-1/2 tons." --Popular Mechanics, March 1949

Second Generation (1957-1965)

The Second Generation of computers is marked by the replacement of vacuum tubes with transistors. While each vacuum tube was the size of a light bulb, the new transistors were approximately the size of a thumbnail. The switch to transistors brought several improvements to computers:

o Increased speed
o Reduced size
o More energy efficient
o Much more reliable
These improvements helped make computers available to other markets. It was now possible for airlines and even
small businesses to purchase a computer. The primary programming languages of the time were FORTRAN (for
scientific applications), COBOL (for business needs), and BASIC (for educational purposes). Versions of these
languages can still be found today.

Third Generation (1965-1975)

The significant attribute of this generation was the invention of the integrated circuit, which combined transistors, wires, and other components onto one silicon chip. The circuits on this chip were thousands of times smaller than the original transistors, greatly reducing the size of computers once again.

Integrated circuits meant that computers could now be included in other machines, because the size of the computer was no longer a problem. The Third Generation saw the start of embedded computers: computers were incorporated into traffic lights, elevators, pocket calculators, and so on. To give an idea of the improvements in technology since the First Generation, picture a handheld calculator of this era: it weighed half a pound, it was faster than the ENIAC, and it cost 1/10,000th as much as the ENIAC.

Fourth Generation (1975-1985)


This generation is characterized by Large Scale Integration (LSI) and Very Large Scale Integration (VLSI). Hundreds of thousands (and later millions) of components could now fit onto one chip about the size of half a dime. This again greatly reduced the size and price of computers and increased their power and efficiency. LSI and VLSI allowed computers to be brought into people's homes: items like microwave ovens, television sets, and automobiles incorporated computers, and personal computers were introduced for home, office, and school use. This era also introduced computer networks, allowing users to communicate with each other using computers. Electronic mail became an important and age-defining application.


Fifth Generation (1985-Present)

The Fifth Generation is more difficult to define because it is where we are today. Some of its characteristics and developments include:

o artificial intelligence
o parallel processors (i.e., many processors working at once)
o superconductor technology (allowing electricity to flow with less resistance, thus improving the speed of information flow)
o smartphones, tablets, and handheld digital devices
o high-resolution graphics used in animation, imaging, movies, video games, and virtual reality
o wireless communications
o integrated digital devices combining data, television, telephone, camera, fax, and the Internet
o and many more...

Moore's Law

Since the development of the first integrated circuits in the 1950s, the number of transistors that fit on a chip has been doubling roughly every 24 months. This observation was first reported by Gordon Moore in 1965 and is now referenced as "Moore's Law": the capacity of computers has doubled roughly every two years since the 1950s with no significant increase in cost.

More transistors on a chip mean more speed and more power, and this is the reason for the enormous increase in computer performance over the past 50 years. Industry experts have repeatedly predicted that Moore's Law would last only another 10-15 years, but the development of new materials and new technologies has allowed the industry to continue at its phenomenal rate of improvement.
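To see what doubling every 24 months implies, here is a small back-of-the-envelope Python calculation (the 1965 starting count of 64 components is an illustrative assumption, not a figure from the text):

    # Project transistor counts under Moore's Law: double every 24 months.
    start_year, start_count = 1965, 64    # assumed starting point

    for year in range(start_year, 2026, 10):
        doublings = (year - start_year) / 2    # one doubling every 2 years
        count = start_count * 2 ** doublings
        print(f"{year}: ~{count:,.0f} transistors")

Thirty doublings between 1965 and 2025 multiply the count by about a billion, which is why exponential growth so quickly outruns intuition.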

A popular comparison highlights this advancement:

"If automotives progressed as fast as computer technology, today's car would have a 1/10 th inch engine, get 120,000
miles per gallon, run at a speed of 240,000 miles per hour, and cost $4.00."
