
LESSON 1: COMPUTER HARDWARE

LEARNING OBJECTIVES

 Define what a computer is
 Explain the different types of computers
 Know the different parts of a computer and its peripheral devices
 Know the history of the development of computers

WHAT IS A COMPUTER

A computer is a machine or device that performs processes, calculations and operations based on
instructions provided by a software or hardware program. It is designed to execute applications and
provides a variety of solutions by combining integrated hardware and software components.

The term computer is derived from the Latin term 'computare', which means to calculate; a computer is a
programmable machine. A computer cannot do anything without a program. It represents decimal
numbers internally as strings of binary digits. The word 'computer' usually refers to the central processing
unit plus internal memory.
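
To make the binary-representation idea above concrete, here is a small illustrative Python sketch (added purely for illustration; it is not part of the original module text):

    # A minimal sketch of the idea that computers represent decimal numbers
    # as strings of binary digits (0s and 1s).
    for number in (5, 42, 255):
        bits = format(number, "08b")          # fixed-width 8-bit binary string
        print(f"decimal {number:3d} -> binary {bits}")
    # decimal   5 -> binary 00000101
    # decimal  42 -> binary 00101010
    # decimal 255 -> binary 11111111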

The term "computer" was originally given to humans (human computers) who performed
numerical calculations using mechanical calculators, such as the abacus and slide rule. The term was
later given to a mechanical device as they began replacing the human computers. Today's computers are
electronic devices that accept data (input), process that data, produce output, and store (storage) the
results.

COMPUTER TYPES

Computers are classified into five main categories based on the size and power of the unit:

1. Personal Computer (PC) - A small, single-user computer based on a microprocessor. Defined
as a small, relatively inexpensive computer designed for an individual user. In price, personal
computers range anywhere from a few hundred pounds to over five thousand pounds. All are
based on the microprocessor technology that enables manufacturers to put an entire CPU on one
chip. Businesses use personal computers for word processing, accounting, desktop publishing,
and for running spreadsheet and database management applications. At home, the most popular
uses for personal computers are playing games and, more recently, surfing the Internet. Personal
computers are further classified into the following subtypes:
a. Tower Model - The term refers to a computer in which the power supply, motherboard,
and mass storage devices are stacked on top of each other in a cabinet. This is in
contrast to desktop models, in which these components are housed in a more compact
box. The main advantage of tower models is that there are fewer space constraints,
which makes installation of additional storage devices easier.
b. Desktop Model - A computer designed to fit comfortably on top of a desk, typically with
the monitor sitting on top of the computer. Desktop model computers are broad and low,
whereas tower model computers are narrow and tall. Because of their shape, desktop
model computers are generally limited to three internal mass storage devices. Desktop
models designed to be very small are sometimes referred to as slimline models.
c. Notebook Computer - An extremely lightweight personal computer. Notebook
computers typically weigh less than 6 pounds and are small enough to fit easily in a
briefcase. Aside from size, the principal difference between a notebook computer and a
personal computer is the display screen. Notebook computers use a variety of
techniques, known as flat panel technologies, to produce a lightweight and non-bulky
display screen. The quality of notebook display screens varies considerably. In terms of
computing power, modern notebook computers are nearly equivalent to personal
computers. They have the same CPUs, memory capacity, and disk drives. However, all
this power in a small package is expensive. Notebook computers cost about twice as
much as equivalent regular-sized computers. Notebook computers come with battery
packs that enable you to run them without plugging them in. However, the batteries need
to be recharged every few hours.
d. Laptop - A small, portable computer -- small enough that it can sit on your lap.
Nowadays, laptop computers are more frequently called notebook computers.

2. Workstation - A powerful, single-user computer. A workstation is like a personal computer, but it
has a more powerful microprocessor and, in general, a higher-quality monitor. It is a type of
computer used for engineering applications (CAD/CAM), desktop publishing, software
development, and other types of applications that require a moderate amount of computing power
and relatively high-quality graphics capabilities. Workstations generally come with a large, high-
resolution graphics screen, a large amount of RAM, built-in network support, and a graphical user
interface. Most workstations also have a mass storage device such as a disk drive, but a special
type of workstation, called a diskless workstation, comes without a disk drive. The most common
operating systems for workstations are UNIX and Windows NT. Like personal computers, most
workstations are single-user computers. However, workstations are typically linked together to
form a local-area network, although they can also be used as stand-alone systems.
3. Minicomputer - A multi-user computer capable of supporting up to hundreds of users
simultaneously. It is a midsize computer. In the past decade, however, the distinction between large
minicomputers and small mainframes has blurred, as has the distinction between small
minicomputers and workstations. In general, a minicomputer is a multiprocessing system
capable of supporting up to 200 users simultaneously.
4. Mainframe - A powerful multi-user computer capable of supporting many hundreds or thousands
of users simultaneously. Mainframe was a term originally referring to the cabinet containing the
central processor unit or "main frame" of a room-filling Stone Age batch machine. After the
emergence of smaller "minicomputer" designs in the early 1970s, the traditional big iron machines
were described as "mainframe computers" and eventually just as mainframes. Nowadays a
Mainframe is a very large and expensive computer capable of supporting hundreds, or even
thousands, of users simultaneously.
5. Supercomputer - An extremely fast computer that can perform hundreds of millions of
instructions per second. Supercomputer is a broad term for one of the fastest computers currently
available. Supercomputers are very expensive and are employed for specialized applications that
require immense amounts of mathematical calculations (number crunching). For example,
weather forecasting requires a supercomputer. Other uses of supercomputers include scientific
simulations, (animated) graphics, fluid dynamic calculations, nuclear energy research, electronic
design, and analysis of geological data (e.g., in petrochemical prospecting).
HISTORY OF COMPUTERS
DATE EVENTS
1801 In France, Joseph Marie Jacquard invents a loom that uses punched wooden
cards to automatically weave fabric designs. Early computers would use
similar punch cards.

1890 Herman Hollerith designs a punch card system to help tabulate the 1890 census,
accomplishing the task in just three years and saving the government $5
million. He establishes a company that would ultimately become IBM.

1936 Alan Turing presents the notion of a universal machine, later called the Turing
machine, capable of computing anything that is computable. The central
concept of the modern computer was based on his ideas.

1937 J.V. Atanasoff, a professor of physics and mathematics at Iowa State University, attempts to
build the first computer without gears, cams, belts or shafts.

1939 Hewlett-Packard is founded by David Packard and Bill Hewlett in a Palo Alto, California,
garage, according to the Computer History Museum.

1941 Atanasoff and his graduate student, Clifford Berry, design a computer that can solve 29
equations simultaneously. This marks the first time a computer is able to store information
in its main memory.

1943-44 Two University of Pennsylvania professors, John Mauchly and J. Presper Eckert, build the
Electronic Numerical Integrator and Calculator (ENIAC). Considered the grandfather of digital
computers, it fills a 20-foot by 40-foot room and has 18,000 vacuum tubes.

1946 Mauchly and Presper leave the University of Pennsylvania and receive funding from the
Census Bureau to build the UNIVAC, the first commercial computer for business and
government applications.

1947 William Shockley, John Bardeen and Walter Brattain of Bell Laboratories invent the transistor.
They discovered how to make an electric switch with solid materials and no need for a vacuum.

1953 Grace Hopper develops the first computer language, which eventually becomes known as
COBOL. Thomas Johnson Watson Jr., son of IBM CEO Thomas Johnson Watson Sr.,
conceives the IBM 701 EDPM to help the United Nations keep tabs on Korea during the war.

1954 The FORTRAN programming language, an acronym for FORmula TRANslation, is
developed by a team of programmers at IBM led by John Backus, according to the
University of Michigan.

1958 Jack Kilby and Robert Noyce unveil the integrated circuit, known as the computer chip. Kilby
was awarded the Nobel Prize in Physics in 2000 for his work.

1964 Douglas Engelbart shows a prototype of the modern computer, with a mouse and a
graphical user interface (GUI). This marks the evolution of the computer from a specialized
machine for scientists and mathematicians to technology that is more accessible to the
general public.

1969 A group of developers at Bell Labs produce UNIX, an operating system that addressed
compatibility issues. Written in the C programming language, UNIX was portable across
multiple platforms and became the operating system of choice among mainframes at large
companies and government entities. Due to the slow nature of the system, it never quite
gained traction among home PC users.

1971 Alan Shugart leads a team of IBM engineers who invent the "floppy disk," allowing data to be
shared among computers.
1973 Robert Metcalfe, a member of the research staff for Xerox, develops Ethernet for
connecting multiple computers and other hardware.

1974-1977 A number of personal computers hit the market, including Scelbi & Mark-8 Altair, IBM
5100, Radio Shack's TRS-80 — affectionately known as the "Trash 80" — and the
Commodore PET.

1975 The January issue of Popular Electronics magazine features the Altair 8800, described as
the "world's first minicomputer kit to rival commercial models." Two "computer geeks," Paul
Allen and Bill Gates, offer to write software for the Altair, using the new BASIC language.
On April 4, after the success of this first endeavor, the two childhood friends form their own
software company, Microsoft.

1976 Steve Jobs and Steve Wozniak start Apple Computers on April Fool's Day and roll out the
Apple I, the first computer with a single-circuit board, according to Stanford University.

1977 Radio Shack's initial production run of the TRS-80 was just 3,000. It sold like crazy. For
the first time, non-geeks could write programs and make a computer do what they wished.

1977 Jobs and Wozniak incorporate Apple and show the Apple II at the first West Coast
Computer Faire. It offers color graphics and incorporates an audio cassette drive for
storage.
1978 Accountants rejoice at the introduction of VisiCalc, the first computerized spreadsheet
program.
1979 Word processing becomes a reality as MicroPro International releases WordStar. "The
defining change was to add margins and word wrap," said creator Rob Barnaby in email to
Mike Petrie in 2000. "Additional changes included getting rid of command mode and
adding a print function. I was the technical brains — I figured out how to do it, and did it,
and documented it. "

1981 The first IBM personal computer, code-named "Acorn," is introduced. It uses Microsoft's
MS-DOS operating system. It has an Intel chip, two floppy disks and an optional color
monitor. Sears & Roebuck and Computerland sell the machines, marking the first time a
computer is available through outside distributors. It also popularizes the term PC.

1983 Apple's Lisa is the first personal computer with a GUI. It also features a drop-
down menu and icons. It flops but eventually evolves into the Macintosh. The
Gavilan SC is the first portable computer with the familiar flip form factor and
the first to be marketed as a "laptop."

1985 Microsoft announces Windows, according to Encyclopedia Britannica. This
was the company's response to Apple's GUI. Commodore unveils the Amiga
1000, which features advanced audio and video capabilities.

1985 The first dot-com domain name is registered on March 15, years before the
World Wide Web would mark the formal beginning of Internet history. The
Symbolics Computer Company, a small
Massachusetts computer manufacturer, registers Symbolics.com. More than
two years later, only 100 dot-coms had been registered.

1986 Compaq brings the Deskpro 386 to market. Its 32-bit architecture provides
speed comparable to mainframes.
1990 Tim Berners-Lee, a researcher at CERN, the high-energy physics laboratory in
Geneva, develops HyperText Markup Language (HTML), giving rise to the
World Wide Web.

1993 The Pentium microprocessor advances the use of graphics and music on PCs.

1994 PCs become gaming machines as "Command & Conquer," "Alone in the Dark
2," "Theme Park," "Magic Carpet," "Descent" and "Little Big Adventure" are
among the games to hit the market.
1996 Sergey Brin and Larry Page develop the Google search engine at Stanford
University.

1997 Microsoft invests $150 million in Apple, which was struggling at the time,
ending Apple's court case against Microsoft in which it alleged that Microsoft
copied the "look and feel" of its operating system.

1999 The term Wi-Fi becomes part of the computing language and users begin
connecting to the
Internet without wires.

2001 Apple unveils the Mac OS X operating system, which provides protected
memory architecture and pre-emptive multi-tasking, among other benefits. Not
to be outdone, Microsoft rolls out Windows XP, which has a significantly
redesigned GUI.

2003 The first 64-bit processor, AMD's Athlon 64, becomes available to the
consumer market.

2004 Mozilla's Firefox 1.0 challenges Microsoft's Internet Explorer, the dominant
Web browser. Facebook, a social networking site, launches.

2005 YouTube, a video sharing service, is founded. Google acquires Android, a
Linux-based mobile phone operating system.
2006 Apple introduces the MacBook Pro, its first Intel-based, dual-core mobile
computer, as well as an Intel-based iMac. Nintendo's Wii game console hits
the market.

2007 The iPhone brings many computer functions to the smartphone.

2009 Microsoft launches Windows 7, which offers the ability to pin applications to the
taskbar and advances in touch and handwriting recognition, among other
features.

2010 Apple unveils the iPad, changing the way consumers view media and
jumpstarting the dormant tablet computer segment.

2011 Google releases the Chromebook, a laptop that runs the Google Chrome OS.

2015 Apple releases the Apple Watch. Microsoft releases Windows 10.

2016 The first reprogrammable quantum computer was created. "Until now, there
hasn't been any quantum-computing platform that had the capability to
program new algorithms into their system. They're usually each tailored to
attack a particular algorithm," said study lead author Shantanu Debnath, a
quantum physicist and optical engineer at the University of Maryland, College
Park.

2017 The Defense Advanced Research Projects Agency (DARPA) is developing a
new "Molecular Informatics" program that uses molecules as computers.
"Chemistry offers a rich set of properties that we may be able to harness for
rapid, scalable information storage and processing," Anne Fischer, program
manager in DARPA's Defense Sciences Office, said in a statement. "Millions of
molecules exist, and each molecule has a unique three-dimensional atomic
structure as well as variables such as shape, size, or even color. This richness
provides a vast design space for exploring novel and multi-value ways to
encode and process data beyond the 0s and 1s of current logic-based, digital
architectures."

Activity 1. TRUE OR FALSE

Directions: Answer the following questions with True or False in the space provided.

1. In 1954, scientists were able to predict exactly what computers would look like today.

2. Logging off the computer will close any open programs.

3. Search engines make it harder to find information on the internet.

4. John Lovelace created a machine called the Analytical Engine. His ideas were some of
the first that led to the creation of computers

5. Charles Babbage created the first computer program. The program was made to
help the Analytical Engine calculate numbers.

6. Steve Jobs was known as the inventor of the modern computer. He actually created the
first fully electronic computer.

7. The invention was 1,000 times faster than any machine built before it. It was big and
known as Apple.

8. The transistor replaced vacuum tubes and made computers much smaller and faster.
9. Al Gore invented the internet.

10. The resistor was invented to help make computers much smaller and faster.

LESSON 2: COMPUTER PROCESSOR AND AUTHENTICATION/IDENTIFICATION FUNCTIONS

LEARNING OBJECTIVES

 Know what a Computer Processor is
 Know the types of Computer Processors
 Know the history of the development of the Computer Processor

COMPUTER PROCESSOR

A processor, or "microprocessor," is a small chip that resides in computers and other electronic
devices. Its basic job is to receive input and provide the appropriate output. While this may seem like a
simple task, modern processors can handle trillions of calculations per second.

A computer processor is the part of a computer that analyzes, controls and disperses data.
Commonly referred to as the central processing unit or CPU, a computer processor acts as the brain of
the computer, telling each program and application what to do at a specific time and interval. Modern
computer processors operate at speeds of 2.6 to 3.66 gigahertz, and the most advanced models are even
faster. The processor takes the form of a small microchip that fits into a socket on the motherboard. The
more powerful the processor, the faster and more efficiently the machine will run.

Besides the central processing unit, most desktop and laptop computers also include a GPU.
This processor is specifically designed for rendering graphics that are output on a monitor. Desktop
computers often have a video card that contains the GPU, while mobile devices usually contain a
graphics chip that is integrated into the motherboard. By using separate processors for system and
graphics processing, computers are able to handle graphic-intensive applications more efficiently.

FUNCTION OF A COMPUTER PROCESSOR

A microprocessor is a silicon chip containing millions of microscopic transistors. This chip
functions as the computer's brain. It processes the instructions or operations contained within executable
computer programs. Instead of taking instructions directly off the hard drive, the processor takes its
instructions from memory. This greatly increases the computer's speed.
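
To make the idea of taking instructions from memory concrete, the following is a toy Python sketch of a fetch-decode-execute loop. The three-instruction "program" and the instruction names (LOAD, ADD, STORE) are invented for illustration and do not correspond to any real processor's instruction set.

    # Toy sketch of the fetch-decode-execute cycle described above.
    memory = [
        ("LOAD", 7),     # put the value 7 into the accumulator
        ("ADD", 5),      # add 5 to the accumulator
        ("STORE", None), # copy the accumulator into a result variable
    ]

    accumulator = 0
    result = None
    program_counter = 0

    while program_counter < len(memory):
        opcode, operand = memory[program_counter]   # fetch the instruction from memory
        if opcode == "LOAD":                        # decode and execute it
            accumulator = operand
        elif opcode == "ADD":
            accumulator += operand
        elif opcode == "STORE":
            result = accumulator
        program_counter += 1                        # move on to the next instruction

    print(result)  # prints 12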

TYPES OF COMPUTER PROCESSOR

Modern processors are designed by two distinct companies: Intel and Advanced Micro Devices
(AMD). Intel processors are most commonly used in prefabricated computer systems, such as those from
Dell and HP. The company focuses on two different lines of processors: the Pentium and the Celeron.
Pentium processors are the larger microchip style that works in most desktops and some laptops. They
can handle high-demand processing, such as that found in 3D gaming, video editing and other
multimedia-intensive applications. Celeron processors are more compact models with the ability to run a
basic computer efficiently and cost-effectively. AMD's line of computer processors can be found in
prefabricated models but is most common in home-built systems or specially designed
machines. AMD was the first to build a 64-bit processor, capable of handling high-end applications with
graphics-intensive operations. The previous industry standard had been 32-bit processing. Some AMD
processors offer built-in virus protection.

FEATURES OF A COMPUTER PROCESSOR

Each processor has a clock speed, which is measured in gigahertz (GHz). Also, a processor has a front
side bus which connects it with the system's random access memory (RAM). CPUs also typically have
two or three levels of cache. Cache is a type of fast memory which serves as a buffer between RAM and
the processor. The processor's socket type determines the motherboard type where it can be installed. [6]
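
As a quick, hedged illustration, the Python standard library can report a few basic facts about the processor in your own machine (it does not expose clock speed or cache sizes directly, and the exact output varies from system to system):

    # Report basic processor facts using only the Python standard library.
    # The printed values depend entirely on the machine this runs on.
    import os
    import platform

    print("Processor name :", platform.processor() or "unknown")
    print("Architecture   :", platform.machine())
    print("Logical cores  :", os.cpu_count())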

When it comes to processors, size matters. Whether you're buying a new computer or upgrading your old
one, you must get the fastest processor you can afford. This is because the processor will become
obsolete very quickly. Choosing a 3.6 GHz processor over a 2 GHz today can buy you several years of
cheap computing time. Also check the speed of the front side bus (FSB) when purchasing your new
computer or CPU. A front side bus of 800 MHz or greater is essential for fast processing speeds. The
processor's cache is also important. Make sure it has at least 1 MB of last level cache if your computing
needs are average. If you're an extreme gamer or if you run intensive graphics programs, get the
processor with the largest cache that fits your budget. There can be hundreds of dollars' difference
between the cheapest processors and the most expensive ones. However, investing just a little extra
cash can get you a much better processor.
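
A rough back-of-the-envelope comparison shows why the clock-speed difference mentioned above matters. The sketch below assumes, purely for illustration, that each processor completes exactly one instruction per clock cycle, which real processors do not:

    # Compare how long a fixed workload would take at 2.0 GHz versus 3.6 GHz,
    # under the simplifying assumption of one instruction per clock cycle.
    workload = 10_000_000_000          # 10 billion instructions

    for clock_hz in (2.0e9, 3.6e9):
        seconds = workload / clock_hz  # instructions needed / instructions per second
        print(f"{clock_hz / 1e9:.1f} GHz -> {seconds:.2f} s")
    # 2.0 GHz -> 5.00 s
    # 3.6 GHz -> 2.78 s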

HISTORY OF DEVELOPMENT OF COMPUTER PROCESSOR

The earliest forms of computer processors were designed from vacuum tubes and electrical relays. By
the 1950s, these had been replaced by the advent of the transistor. These transistors were built onto
printed circuit boards (copper etched onto a non-conductive board) to which various components were
added. These computer processors were large and bulky, sometimes taking up whole rooms. During the
construction of the Apollo guidance computer for NASA, scientists were able to construct integrated circuits
that allowed large numbers of transistors to be manufactured into a single semiconductor. This was found
to be more reliable than previous models and much more compact. The microprocessor was invented by
Intel in 1971. The 4004 was as fast as its larger cousins but could be used in much smaller devices. With
the advent of the personal computer, the majority of processor technology uses the microprocessor
model.

Engineers and technicians routinely reach a point in processor design at which they face limits in
making the device faster. They have been challenged by size and materials. At one time, designers
believed they could not get past the 1 gigahertz speed level, yet that barrier was broken by the AMD
Athlon in 2000. The 64-bit barrier was broken by the same company in 2003. Processors have since
become dual-core and quad-core, meaning they are capable of handling nearly twice as much data
transfer and flow as a single core. Many motherboards now come equipped for two or more
processors to work in unison. The most advanced research being done uses new
technologies to expand the speed and capability of the processor. IBM has designed computer processor
technology using lasers, much like fiber optics. The Georgia Institute of Technology has developed
biological computer processors using the brain cells of leeches. Other scientists are developing ways to
pass data along gaseous phenomena.

Other lines of processors were used in older models of computers. Macintosh computers
specifically used their own line for many years, between 1984 and 2006; the company switched to Intel
processors in all its new machines after this period. During the early years of Apple Computer, 1984 to
1996, the company used Motorola-branded computer processors to handle its operating systems and
data flow. These were known as the 68000 series and featured processors with speeds between 16 and
33 megahertz. After 1996, Apple used IBM-designed processors in nearly all of its machines. These
ranged in speed from 66 megahertz to 2.5 gigahertz by 2006.

AUTHENTICATION AND IDENTIFICATION FUNCTION


• IDENTIFICATION
-the ability to identify uniquely a user of a system or an application that is running in the system.
• AUTHENTICATION
-the ability to prove that a user or application is genuinely who that person or what that application claims to be.

For example, consider a user who logs on to a system by entering a user ID and password. The system uses
the user ID to identify the user. The system authenticates the user at the time of logon by checking that the
supplied password is correct.
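
The logon example above can be sketched in a few lines of Python. This is only an illustration of the two steps, not a production authentication system; the user record, salt, and password below are made up for the example.

    # Identification: look up the claimed user ID.
    # Authentication: prove the password is correct by comparing salted hashes.
    import hashlib

    def hash_password(password: str, salt: bytes) -> bytes:
        # Store a derived hash rather than the plain password.
        return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

    SALT = b"example-salt"  # real systems use a random salt per user
    users = {"alice": hash_password("correct horse battery staple", SALT)}

    def log_on(user_id: str, password: str) -> bool:
        if user_id not in users:                                 # identification step
            return False
        return hash_password(password, SALT) == users[user_id]  # authentication step

    print(log_on("alice", "correct horse battery staple"))  # True
    print(log_on("alice", "wrong password"))                 # False
    print(log_on("bob", "anything"))                         # False (unknown user)
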
BIOMETRICS
Verifies an individual’s identity by analyzing a unique personal attribute or behavior
It is the most effective and accurate method for verifying identification. (also the most expensive)

TYPES OF BIOMETRICS
Fingerprint
are based on the ridge endings and bifurcations exhibited by the friction ridges and other minutiae of the finger.
(most common)
Palm Scan
are based on the creases, ridges, and grooves that are unique in each individual's palm.
Hand geometry
are based on the shape (length, width) of a person’s hand and fingers.
Hand Topography
based on the different peaks, valleys, overall shape and curvature
of the hand.
Retina Scan
is based on the blood vessel pattern of the retina at the back of the eyeball.
Iris Scan
is based on the colored portion of the eye that surrounds the pupil. The iris has unique patterns, rifts, colors,
rings, coronas and furrows
Facial Scan
based on the different bone structures, nose ridges, eye widths, forehead sizes and chin shapes of the face.
Voice Print
is based on the human voice.
SIGNATURE DYNAMICS
• is based on the electrical signals generated by the physical motion of the hand while signing a document.
KEYBOARD DYNAMICS

is based on electrical signals generated while the user types in the keys (passphrase) on the keyboard.
PASSWORD

A password is a protected string of characters that is used to authenticate an individual. (most common
form of system identification and authentication mechanism)

TWO TYPES OF PASSWORD

• Cognitive Passwords
-fact- or opinion-based information used to verify an individual's identity (e.g., mother's maiden name).
• One-Time or Dynamic Passwords
-a token-based system used for authentication purposes where the password is used only once. (used in
environments that require a higher level of security)
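
As a rough sketch of how a counter-based one-time password can be generated, the following Python code loosely follows the HOTP idea (a shared secret plus a counter, each value used only once). The secret key here is a placeholder, and real token systems add further safeguards.

    # Derive a short numeric one-time password from a shared secret and a counter.
    import hashlib
    import hmac
    import struct

    def one_time_password(secret: bytes, counter: int, digits: int = 6) -> str:
        mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
        offset = mac[-1] & 0x0F                                   # dynamic truncation
        value = int.from_bytes(mac[offset:offset + 4], "big") & 0x7FFFFFFF
        return str(value % (10 ** digits)).zfill(digits)

    secret = b"shared-secret-between-token-and-server"  # placeholder for illustration
    for counter in range(3):
        print(counter, one_time_password(secret, counter))  # a new code for each counter value
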
PASSPHRASE
• a sequence of characters that is longer than a password and in some cases, takes the place of a
password during an authentication process.
• It is more secure than passwords.
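
One rough way to see why a longer passphrase tends to be stronger is to estimate its entropy as length multiplied by log2 of the symbol-set size. This is only a simplified model (it assumes every symbol is chosen at random, which people rarely do):

    # Rough entropy estimate: bits = length * log2(size of the symbol set).
    import math

    def entropy_bits(length: int, alphabet_size: int) -> float:
        return length * math.log2(alphabet_size)

    print(round(entropy_bits(8, 94)))   # 8-character password over printable ASCII: about 52 bits
    print(round(entropy_bits(25, 27)))  # 25-character lowercase passphrase with spaces: about 119 bits
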
CRYPTOGRAPHIC KEYS
• Uses private keys and Digital Signatures
Provides a higher level of security than passwords.
MEMORY CARDS
• Hold information but cannot process it
• More secure than passwords but costly
(e.g., swipe cards, ATM cards)
SMART CARDS
Hold information, have the capability to process it, and can provide two-factor authentication
(something the user knows and something the user has)
• Categories of Smart Cards:
• Contact
• Contactless

TIMELINE OF COMPUTER PROCESSOR DEVELOPMENT



Year Event

1823 Baron Jöns Jacob Berzelius discovers silicon (Si), which today is the basic
component of processors.
1903 Nikola Tesla patented electrical logic circuits called "gates" or "switches" in 1903.

1947 John Bardeen, Walter Brattain, and William Shockley invent the first transistor at the
Bell Laboratories on December 23, 1947.

1948 John Bardeen, Walter Brattain, and William Shockley patent the first transistor in
1948.
1956 John Bardeen, Walter Brattain, and William Shockley were awarded the Nobel Prize
in physics for their work on the transistor.

1958 The first working integrated circuit was developed by Robert Noyce of Fairchild
Semiconductor and Jack Kilby of Texas Instruments. The first IC was demonstrated
on September 12, 1958. (Geoffrey Dummer is credited as being the first person to
conceptualize and build a prototype of the integrated circuit.)

1960 IBM developed the first automatic mass-production facility for transistors in New
York in 1960.
1968 Intel Corporation was founded by Robert Noyce and Gordon Moore in 1968.
1969 AMD (Advanced Micro Devices) was founded on May 1, 1969.

1971 Intel with the help of Ted Hoff introduced the first microprocessor, the Intel 4004 on
November 15, 1971. The 4004 had 2,300 transistors, performed 60,000 OPS
(operations per second), addressed 640 bytes of memory, and cost $200.00.

1972 Intel introduced the 8008 processor on April 1, 1972.

1974 Intel's improved microprocessor chip was introduced on April 1, 1974; the 8080 became a
standard in the computer industry.
1976 Intel introduced the 8085 processor in March 1976.
1978 The Intel 8086 was introduced on June 8, 1978.

1979 The Intel 8088 was released on June 1, 1979.

1979 The Motorola 68000, a 16/32-bit processor was released and was later chosen as
the processor for the Apple Macintosh and Amiga computers.

1982 The Intel 80286 was introduced on February 1, 1982.

1985 Intel introduced the first 80386 in October 1985.

1987 The SPARC processor was first introduced by Sun.
1988 The Intel 80386SX was introduced in 1988.
1989 Cyrix released their first coprocessors, the FasMath 83D87 and 83S87, in 1989. These
were x87 compatible and designed for 386 computers. The FasMath coprocessors
were up to 50% faster than the Intel 80387 processor.
1991 AMD introduced the AM386 microprocessor family in March 1991.

1991 Intel introduced the Intel 486SX chip in April in efforts to help bring a lower-cost
processor to the PC market selling for $258.00.

1992 Intel released the 486DX2 chip on March 2, 1992, with a clock doubling ability that
generates higher operating speeds.
1993 Intel released the Pentium processor on March 22, 1993. The processor was a 60 MHz
processor, incorporates 3.1 million transistors and sells for $878.00.
1994 Intel released the second generation of Intel Pentium processors on March 7, 1994.

1995 Cyrix released the Cx5x86 processor in 1995, in an attempt to compete with the Intel
Pentium processors.
1995 Intel introduced the Intel Pentium Pro in November 1995.

1996 Cyrix released their MediaGX processor in 1996. It combined a processor with sound
and video processing on one chip.
1996 Intel announced the availability of the Pentium 150 MHz with 60 MHz bus and 166 MHz
with 66 MHz bus on January 4, 1996.
1996 AMD introduced the K5 processor on March 27, 1996, with speeds of 75 MHz to 133
MHz and bus speeds of 50 MHz, 60 MHz, or 66 MHz. The K5 was the first processor
developed completely in-house by AMD.

1997 AMD released their K6 processor line in April 1997, with speeds of 166 MHz to 300
MHz and a 66 MHz bus speed.
1997 Intel Pentium II was introduced on May 7, 1997.
1998 AMD introduced their new K6-2 processor line on May 28, 1998, with speeds of 266
MHz to 550 MHz and bus speeds of 66 MHz to 100 MHz. The K6-2 processor was an
enhanced version of AMD's K6 processor.
1998 Intel released the first Xeon processor, the Pentium II Xeon 400 (512 K or 1 M cache,
400 MHz, 100 MHz FSB) in June 1998.
1999 Intel released the Celeron 366 MHz and 400 MHz processors on January 4, 1999.
1999 AMD released its K6-III processors on February 22, 1999, with speeds of 400 MHz or
450 MHz and bus speeds of 66 MHz to 100 MHz. It also featured an on-die L2 cache.
1999 The Intel Pentium III 500 MHz was released on February 26, 1999.
1999 The Intel Pentium III 550 MHz was released on May 17, 1999.
1999 AMD introduced the Athlon processor series on June 23, 1999. The Athlon would be
produced for the next six years in speeds ranging from 500 MHz up to 2.33 GHz.
1999 The Intel Pentium III 600 MHz was released on August 2, 1999.
1999 The Intel Pentium III 533B and 600B MHz was released on September 27, 1999.
1999 The Intel Pentium III Coppermine series was first introduced on October 25, 1999.
2000 On January 5, 2000, AMD released the 800 MHz Athlon processor.
2000 Intel released the Celeron 533 MHz with a 66 MHz bus processor on January 4, 2000.
2000 AMD first released the Duron processor on June 19, 2000, with speeds of 600 MHz to
1.8 GHz and bus speeds of 200 MHz to 266 MHz. The Duron was built on the same K7
architecture as the Athlon processor.
2000 Intel announces on August 28th that it will recall its 1.3 GHz Pentium III processors
due to a glitch. Users with these processors should contact their vendors for
additional information about the recall.
2001 On January 3, 2001, Intel released the 800 MHz Celeron processor with a 100 MHz bus.
2001 On January 3, 2001, Intel released the 1.3 GHz Pentium 4 processor.
2001 AMD announced a new branding scheme on October 9, 2001. Instead of identifying
processors by their clock speed, the AMD Athlon XP processors will bear monikers of
1500+, 1600+, 1700+, 1800+, 1900+, 2000+, etc. Each higher model number will
represent a higher clock speed.
2002 Intel released the Celeron 1.3 GHz with a 100 MHz bus and 256 kB of level 2 cache.

2003 Intel Pentium M was introduced in March 2003.


2003 AMD released the first single-core Opteron processors, with speeds of 1.4 GHz to 2.4
GHz and 1024 KB L2 cache, on April 22, 2003.
2003 AMD released the first Athlon 64 processor, the 3200+ model, and the first Athlon 64
FX processor, the FX-51 model, on September 23, 2003.

2004 AMD released the first Sempron processor on July 28, 2004, with a 1.5 GHz to 2.0 GHz
clock speed and 166 MHz bus speed.
2005 AMD released their first dual-core processor, the Athlon 64 X2 3800+ (2.0 GHz, 512 KB
L2 cache per core), on April 21, 2005.
2006 AMD released their new Athlon 64 FX-60 processor, featuring 2x 1024 KB L2 cache, on
January 9, 2006.
2006 Intel released the Core 2 Duo processor E6320 (4 M cache, 1.86 GHz, 1066 MHz FSB)
on April 22, 2006.
2006 Intel introduced the Intel Core 2 Duo processors with the Core 2 Duo processor E6300
(2 M cache, 1.86 GHz, 1066 MHz FSB) on July 27, 2006.
2006 Intel introduced the Intel Core 2 Duo processor for the laptop computer with the Core
2 Duo processor T5500, as well as other Core 2 Duo T series processors, in August
2006.
2007 Intel released the Core 2 Quad processor Q6600 (8 M cache, 2.40 GHz, 1066 MHz FSB)
in January 2007.
2007 Intel released the Core 2 Duo processor E4300 (2 M cache, 1.80 GHz, 800 MHz FSB) on
January 21, 2007.
2007 Intel released the Core 2 Quad processor Q6700 (8 M cache, 2.67 GHz, 1066 MHz FSB)
in April 2007.
2007 Intel released the Core 2 Duo processor E4400 (2 M cache, 2.00 GHz, 800 MHz FSB) on
April 22, 2007.
2007 AMD renamed the Athlon 64 X2 processor line to Athlon X2 and released the first in
that line, the Brisbane series (1.9 to 2.6 GHz, 512 KB L2 cache) on June 1, 2007.
2007 Intel released the Core 2 Duo processor E4500 (2 M cache, 2.20 GHz, 800 MHz FSB) on
July 22, 2007.
2007 Intel released the Core 2 Duo processor E4600 (2 M cache, 2.40 GHz, 800 MHz FSB) on
October 21, 2007.
2007 AMD released the first Phenom X4 processors (2 M cache, 1.8 GHz to 2.6 GHz, 1066
MHz FSB) on November 19, 2007.
2008 Intel released the Core 2 Quad processor Q9300 and the Core 2 Quad processor
Q9450 in March 2008.
2008 Intel released the Core 2 Duo processor E4700 (2 M cache, 2.60 GHz, 800 MHz FSB) on
March 2, 2008.
2008 AMD released the first Phenom X3 processors (2 M cache, 2.1 GHz to 2.5 GHz, 1066
MHz FSB) on March 27, 2008.
2008 Intel released the first of the Intel Atom series of processors, the Z5xx series, in April
2008. They are single core processors with a 200 MHz GPU.
2008 Intel released the Core 2 Duo processor E7200 (3 M cache, 2.53 GHz, 1066 MHz FSB)
on April 20, 2008.
2008 Intel released the Core 2 Duo processor E7300 (3 M cache, 2.66 GHz, 1066 MHz FSB)
on August 10, 2008.
2008 Intel released several Core 2 Quad processors in August 2008: the Q8200, the Q9400,
and the Q9650.
2008 Intel released the Core 2 Duo processor E7400 (3 M cache, 2.80 GHz, 1066 MHz FSB)
on October 19, 2008.
2008 Intel released the first Core i7 desktop processors in November 2008: the i7-920, the
i7-940, and the i7-965 Extreme Edition.
2009 AMD released the first Phenom II X4 (quad-core) processors (6 M cache, 2.5 to 3.7
GHz, 1066 MHz or 1333 MHz FSB) on January 8, 2009.
2009 AMD released the first Athlon Neo processor, the MV-40 model, (1.6 GHz and 512 KB
L2 cache) on January 8, 2009.
2009 Intel released the Core 2 Duo processor E7500 (3 M cache, 2.93 GHz, 1066 MHz FSB)
on January 18, 2009.
2009 AMD released the first Phenom II X3 (triple core) processors (6 M cache, 2.5 to 3.0
GHz, 1066 MHz or 1333 MHz FSB) on February 9, 2009.
2009 Intel released the Core 2 Quad processor Q8400 (4 M cache, 2.67 GHz, 1333 MHz FSB)
in April 2009.
2009 Intel released the Core 2 Duo processor E7600 (3 M cache, 3.06 GHz, 1066 MHz FSB)
on May 31, 2009.
2009 AMD released the first Athlon II X2 (dual-core) processors (1024KB L2 cache, 1.6 to
3.5 GHz, 1066 MHz or 1333 MHz FSB) in June 2009.

2009 AMD released the first Phenom II X2 (dual-core) processors (6 M cache, 3.0 to 3.5 GHz,
1066 MHz or 1333 MHz FSB) on June 1, 2009.

2009 AMD released the first Athlon II X4 (quad-core) processors (512 KB L2 cache, 2.2 to
3.1 GHz, 1066 MHz or 1333 MHz FSB) in September 2009.

2009 Intel released the first Core i7 mobile processor, the i7-720QM, in September 2009. It
uses the Socket G1 socket type, runs at 1.6 GHz, and features 6 MB L3 cache.

2009 Intel released the first Core i5 desktop processor with four cores, the i5-750 (8 M
cache, 2.67 GHz, 1333 MHz FSB), on September 8, 2009.

2009 AMD released the first Athlon II X3 (triple core) processors in October 2009.

2010 Intel released the Core 2 Quad processor Q9500 (6 M cache, 2.83 GHz, 1333 MHz FSB)
in January 2010.
2010 Intel released the first Core i5 mobile processors, the i5-430M and the i5-520E in
January 2010.
2010 Intel released the first Core i5 desktop processor over 3.0 GHz, the i5-650 in January
2010.
2010 Intel released the first Core i3 desktop processors, the i3-530, and i3-540 on January
7, 2010.

2010 Intel released the first Core i3 mobile processors, the i3-330M (3 M cache, 2.13 GHz,
1066 MHz FSB) and the i3-350M, on January 7, 2010.

2010 AMD released the first Phenom II X6 (hex/six core) processors on April 27, 2010.
2010 Intel released the first Core i7 desktop processor with six cores, the i7-970, in July
2010. It runs at 3.2 GHz and features 12 MB L3 cache.

2011 Intel released seven new Core i5 processors with four cores, the i5-2xxx series in
January 2011.
2017 Intel released the first desktop processor with 16 cores, the Core i9-7960X, in
September 2017. It runs at 2.8 GHz and features 22 MB L3 cache.
2017 Intel released the first desktop processor with 18 cores, the Core i9-7980XE, in
September 2017. It runs at 2.6 GHz and features 24.75 MB L3 cache.
2018 Intel released the first Core i9 mobile processor, the i9-8950HK, in April 2018. It uses
the BGA 1440 socket, runs at 2.9 GHz, has six cores, and features 12 MB L3 cache.
Activity 2: Identification
Directions: Identify the following. Write your answer in the space provided. Be ready for submission.

1. Refers to the technology industry company that invented the graphical user interface.

2. Refers to the transfer rate of a standard USB 2.0 device.

3. HyperTransport has all but replaced this in current hardware, but what used to be the
single most important factor in overall system speed?

4. When was the first commercial microprocessor introduced?

5. The width of the smallest wire on a computer chip is typically measured in _____________.

6. The external system bus architecture is created from the _________ architecture.

7. Examples of accumulator-based microprocessors are _________________.

8. A computer has a built-in system clock that emits millions of regularly spaced electric pulses per
__________ called clock cycles.

9. The circuitry that responds to and processes the basic instructions required to drive a
computer system is _____________.

10. The CPU controls the transfer of data between ___________ and other devices.

II. Answer the question: What can you say about the development of processors in terms of size, speed,
process, and durability?

LESSON 3: COMPUTER COMPONENTS


LEARNING OBJECTIVES

 Know what a Motherboard is
 Know the different parts and ports of a Motherboard
 Know the basic working principle of a Motherboard.
PARTS OF A COMPUTER

1. INPUT DEVICES - Data and instructions must enter the computer system before any
computation can be performed on the supplied data. The input unit that links the external
environment with the computer system performs this task. Data and instructions enter input units
in forms that depend upon the particular device used. For example, data is entered from a
keyboard in a manner similar to typing, and this differs from the way in which data is entered
through a mouse, which is another type of input device. However, regardless of the form in which
they receive their inputs, all input devices must provide a computer with data that are transformed
into the binary codes that the primary memory of the computer is designed to accept. This
transformation is accomplished by units called input interfaces. Input interfaces are designed
to match the unique physical or electrical characteristics of input devices to the requirements of
the computer system. (A short sketch after the list of devices below illustrates this binary encoding idea.)
• Keyboard is the most common and very popular input device
which helps to input data to the computer. The layout of the keyboard is like
that of traditional typewriter, although there are some additional keys
provided for performing additional functions
• Mouse is the most popular pointing device. It is a very famous
cursor-control device having a small palm size box with a round ball at its
base, which senses the movement of the mouse and sends
corresponding signals to the CPU when the mouse buttons are pressed.
Generally, it has two buttons called the left and the right button and a wheel
is present between the buttons. A mouse can be used to control the position
of the cursor on the screen, but it cannot be used to enter text into the
computer.
• Microphone is an input device to input sound that is then stored
in a digital form. The microphone is used for various applications such as
adding sound to a multimedia presentation or for mixing music.
• Joystick is also a pointing device, which is used to move the
cursor position on a monitor screen. It is a stick having a spherical ball at its
both lower and upper ends. The lower spherical ball moves in a socket. The
joystick can be moved in all four directions. The function of the joystick is
similar to that of a mouse. It is mainly used in Computer Aided Designing
(CAD) and playing computer games.
• Scanner is an input device which works much like a photocopy machine. It is
used when some information is available on paper and it is to be transferred to
the hard disk of the computer for further manipulation. Scanner captures images
from the source which are then converted into a digital form that can be stored
on the disk. These images can be edited before they are printed.
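
As mentioned in the introduction to input devices, everything typed at the keyboard ultimately reaches the computer as binary codes. The short Python sketch below illustrates the idea using ASCII codes; it is an illustration of the encoding concept, not of how any particular keyboard interface works.

    # Each character typed is mapped to a numeric code (ASCII here) and
    # stored as a pattern of bits.
    for character in "Hi!":
        code = ord(character)                  # numeric code for the character
        print(character, code, format(code, "08b"))
    # H 72 01001000
    # i 105 01101001
    # ! 33 00100001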

2. OUTPUT DEVICES - The job of an output unit is just the reverse of that of an input unit. It
supplies information and results of computation to the outside world. Thus it links the computer
with the external environment. As computers work with binary code, the results produced are also
in binary form. Hence, before the results are supplied to the outside world, they must be converted
to a human-acceptable (readable) form. This task is accomplished by units called output interfaces.
• Monitors, commonly called as Visual Display Unit
(VDU), are the main output device of a computer. It forms images from tiny
dots, called pixels that are arranged in a rectangular form. The sharpness of
the image depends upon the number of pixels.
• Printers are another common output device found in
homes and offices. In computing terms, they take electronic data stored on a
computer and generate a hard copy of it. Usually that means printing
images and text onto paper. There are numerous different types of printers,
with inkjet and laser printers being two of the most common. Modern printers
usually connect to a computer with a USB cable or via Wi-Fi.
• Computer speakers are hardware devices that
transform the signal from the computer's sound card into audio. Speakers are essential if you want a
louder sound, surround sound, fuller bass, or just better audio quality. External computer
speakers began to appear in stores in the early 1990s, when computer gaming, digital music, and
other forms of media became popular. Some computer speakers are wireless nowadays, connecting
to the computer via Bluetooth.
3. STORAGE UNIT - The data and instructions that are entered into the computer system through
input units have to be stored inside the computer before the actual processing starts. Similarly,
the results produced by the computer after processing must also be kept somewhere inside the
computer system before being passed on to the output units. Moreover, the intermediate results
produced by the computer must also be preserved for ongoing processing. The Storage Unit or
the primary / main storage of a computer system is designed to do all these things. It provides
space for storing data and instructions, space for intermediate results and also space for the final
results.
• Cache memory is a very high-speed semiconductor memory which can speed up the
CPU. It acts as a buffer between the CPU and the main memory. It is used to hold those
parts of data and programs which are most frequently used by the CPU. The parts of data
and programs are transferred from the disk to cache memory by the operating system,
from where the CPU can access them. (A short software sketch after this list illustrates the
buffering idea.)

• Primary memory holds only those data and instructions on which the computer is
currently working. It has a limited capacity, and data is lost when power is switched off. It
is generally made up of semiconductor devices. These memories are not as fast as
registers. The data and instructions required to be processed reside in the main memory.
It is divided into two subcategories, RAM and ROM.
• Secondary memory is also known as external memory or non-volatile memory. It is slower than
the main memory. It is used for storing data/information permanently. The CPU does not
access these memories directly; instead, they are accessed via input-output routines.
The contents of secondary memories are first transferred to the main memory, and then
the CPU can access them. Examples include disks, CD-ROMs, DVDs, etc.
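
To illustrate the cache-as-buffer idea mentioned in the list above, here is a small software analogy in Python using functools.lru_cache: recently used results are kept close at hand so repeated requests are served quickly. Hardware CPU cache is managed automatically by the processor, not by application code, so this is only an analogy.

    # Software analogy for a cache: keep recently used results nearby so that
    # repeated requests avoid the slow trip to the original source.
    import time
    from functools import lru_cache

    @lru_cache(maxsize=128)
    def slow_lookup(key: int) -> int:
        time.sleep(0.1)        # stands in for a slow access to "main memory"
        return key * key

    start = time.perf_counter()
    slow_lookup(7)             # first access: slow, goes to the slow source
    first = time.perf_counter() - start

    start = time.perf_counter()
    slow_lookup(7)             # second access: fast, served from the cache
    second = time.perf_counter() - start

    print(f"first access : {first:.3f} s")
    print(f"second access: {second:.6f} s")
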
4. CENTRAL PROCESSING UNIT - The main unit inside the computer is the CPU.
This unit is responsible for all events inside the computer. It controls all internal and external devices and
performs "Arithmetic and Logical operations". The operations a microprocessor performs are called the
"instruction set" of the processor. The instruction set is "hard wired" in the CPU and determines the
machine language for the CPU. The more complicated the instruction set is, the slower the CPU works.
Processors differ from one another by their instruction sets. If the same program can run on two different
computer brands, the brands are said to be compatible. Programs written for IBM-compatible computers
will not run on Apple computers because these two architectures are not compatible.
The Control Unit and the Arithmetic and Logic Unit of a computer system are jointly known as the
Central Processing Unit (CPU). The CPU is the brain of any computer system. In a human body, all major
decisions are taken by the brain and the other parts of the body function as directed by the brain.
Similarly, in a computer system, all major calculations and comparisons are made inside the CPU, and the
CPU is also responsible for activating and controlling the operations of the other units of the computer system.
Activity 3: TEST YOUR COMPREHENSION
Enumeration:

1. Examples of input devices
________________

2. Examples of output devices
________________

3. Two basic types of memory in the storage unit
________________

4. Briefly mention the steps involved in the execution of a program by the CPU
________________

LESSON 4: COMPUTER MOTHERBOARD
LEARNING OBJECTIVES

 Know what a Motherboard is
 Know the different parts and ports of a Motherboard
 Know the basic working principle of a Motherboard.
THE MOTHERBOARD

A computer has many components, each with their own roles and functions. The role of the
motherboard is to allow all these components to communicate with each other. Considering the fact that
all the other components are installed on the motherboard or connected to it, it is safe to say that the
motherboard is the central piece of a PC, the component that brings it all together.

DIFFERENT TYPES OF MOTHERBOARD

1. AT MOTHERBOARD - The oldest of the main boards, these motherboards were
used in earlier 286/386 or 486 computers. AT means the board contains advanced
technology (AT) power connectors. There are two power connectors of 6 pins each mounted on
AT motherboards. AT motherboards were available in the early '80s.

2. ATX MOTHERBOARDS - (Motherboard for P1/P2 processors)
ATX motherboards started in the '90s and are still available. The
ATX connector on the motherboard consists of a single connector. These boards are
used for P2/P3 or P4 processors.

COMPONENTS OF A MOTHERBOARD

1. EXPANSION SLOTS - Expansion slots have the role of letting you install additional components to
enhance or expand the functionality of your PC. You can install a TV tuner, a video capture card, a
better soundcard, etc. – you get the idea. These ports are located under the video card slot, and
come in the form of PCI slots (on older motherboards) or a scaled-down version of PCI-Express slots
(on newer motherboards). Some motherboards come with both types of expansion slots. The number
of slots usually depends on the format of the motherboard – larger motherboards (full ATX) have
more, while smaller formats (micro-ATX) have fewer, if any.

ISA slots. These were the oldest expansion slots in the history of motherboards. They were
found in AT boards and are identified by their black color. Conventional display cards or sound cards were
installed in these slots. The full form of ISA is Industry Standard Architecture, and it is a 16-bit bus.
• PCI Slots. The full form of PCI is Peripheral Component Interconnect. The PCI slot is one
of the important motherboard components today and is vastly used to install add-on cards
on the motherboard. The PCI supports 64-bit high-speed bus.
• PCI express. Also known as PCIe, these are the latest and the fastest component of the
motherboard to support add-on cards. It supports full duplex serial bus.

• AGP slot. Accelerated graphics port (AGP) is specifically used to install a latest graphics
card. AGP runs on a 32-bit bus and both PCIe and AGP can be used to install high-end
gaming display cards.
2. RAM (MEMORY) SLOTS - Located in the upper-right part of the motherboard, the memory slots are
used to house the computer's memory modules. The number of slots can vary, depending on the
motherboard, from 2, in low-end motherboards, all the way up to 8 memory slots, on high-end and
gaming motherboards.

• SIMM slots. The full form is Single In-line Memory Module. These slots were found in
older motherboards, up to 486 boards. The SIMM supports a 32-bit bus.
• DIMM slots. The full form of DIMM is Dual In-line Memory Module. These are the
latest RAM slots, which run on a faster 64-bit bus. The DIMMs used on laptop boards are
called SO-DIMMs.

3. CENTRAL PROCESSING UNIT (CPU) SOCKET - Another vital motherboard component is the CPU
socket which is used to install the processor on the
motherboard. Some important sockets are explained below.
• Socket7. It is a 321 pin socket that supported older processors like Intel Pentium
1/2/MMX, AMD k5/K6, and Cyrix M2.
• Socket370. It is a 370 pin socket that supports Celeron processors and Pentium-3
processors.
• Socket 775. It is a 775-pin socket that supports Intel dual-core, Core 2 Duo, Pentium 4 and Xeon
processors.
• Socket 1156. Found on latest types of motherboards, it is an 1156-pin socket that
supports the latest Intel i-3, i-5 and i-7 processors.
• Socket 1366. The socket is of 1366 pins and supports latest i-7 900 processors.
4. BASIC INPUT/OUTPUT SYSTEM (BIOS) - The full form of BIOS is Basic Input Output System. It is a
motherboard component in the form of an Integrated chip. This chip contains all the information and
settings of the motherboard, which you can modify by entering BIOS mode on your computer.
5. COMPLEMENTARY METAL-OXIDE SEMICONDUCTOR (CMOS) BATTERY -
The battery is a 3.0-volt lithium cell. The cell is responsible for retaining the BIOS settings when the computer is powered off.
6. POWER CONNECTORS - No computer component can operate without power, and a
motherboard is no exception. The power connector, commonly a 20 or 24-pin connector, can be situated
either near the right edge of the motherboard, or somewhere close to the processor socket on older
motherboards. This is where the power supply ‘s main connector gets attached, providing power to the
motherboard and all the other components.

Newer motherboards have an additional 4-pin or 8-pin connector near the processor, used to supply additional
power directly to the processor.

7. IDE AND SATA PORTS - IDE and SATA ports are used to provide connectivity for the storage
devices and optical drives. The IDE interface is somewhat outdated, so you shouldn't be surprised if
you see a lot of new motherboards coming without this type of port. It was replaced by the smaller and
much faster SATA interface, which has currently reached its third revision and is able to achieve maximum
speeds of up to 600 MB/s, as opposed to the IDE interface, which can reach a maximum of 133 MB/s.
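
Taking the quoted peak figures at face value (real-world transfers are slower than these theoretical maximums), a quick calculation shows the practical difference between the two interfaces:

    # Best-case time to move a 10 GB file at each interface's quoted peak rate.
    file_size_mb = 10_000  # 10 GB expressed in MB

    for name, rate_mb_per_s in (("IDE, 133 MB/s", 133), ("SATA III, 600 MB/s", 600)):
        seconds = file_size_mb / rate_mb_per_s
        print(f"{name}: about {seconds:.0f} seconds")
    # IDE, 133 MB/s: about 75 seconds
    # SATA III, 600 MB/s: about 17 seconds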

8. PROCESSOR SOCKET - The processor socket is the central piece of a motherboard, usually
being located near the center of the motherboard. It’s also the central piece because it holds the
processor – the brain of your computer.

9. NORTHBRIDGE AND SOUTHBRIDGE - If you have a look at your motherboard, chances are
you'll see a square metal component somewhere in the lower-right part of the board. This metal
component is actually a heatsink, and its role is to provide thermal protection for the Northbridge – one of
the most important components of a motherboard. The northbridge is responsible for coordinating the
data flow between the memory, the video card and the processor. A secondary chip, known as
Southbridge, has a similar function, coordinating the data flow between the processor and peripherals
such as sound cards or network cards.

10. CABINET CONNECTIONS - The cabinet in which the motherboard is installed has many buttons
that connect to the motherboard. Some of the common connectors are Power Switch, Reset Switch, Front
USB, Front Audio, Power indicator (LED) and HDD LED.

11. INPUT/OUTPUT INTERFACE CONNECTORS - The input-output interface
connects the computer to the outside world. It decodes the address and identifies the unique computer
peripheral with which a data transfer operation is to be executed. The interface also has to interpret the
command on the control bus so that the timing of data transfer is correct. One further very important
function of the input-output interface is to provide a physical electronic highway for the flow of data
between the computer data bus and the external peripheral.

Activity 4: Specs Design Challenge.

Prepare a matrix (table format) of a complete computer system unit specification. It is up to you what kind of
processor you want to include in the table.

LESSON 5: E-WASTE

LEARNING OBJECTIVES

 Learn about technological disposal
 Know what E-waste is
 Know the impacts of E-waste on society

What is E-waste?

Electronic waste, also called e-waste, refers to various forms of electric and electronic equipment that have
ceased to be of value to their users or no longer satisfy their original purpose.
Electronic waste (e-waste) products have exhausted their utility value through either redundancy,
replacement, or breakage and include both "white goods" such as refrigerators, washing machines, and
microwaves and "brown goods" such as televisions, radios, computers, and cell phones. Given that the
information and technology revolution has exponentially increased the use of new electronic equipment, it
has also produced growing volumes of obsolete products; e-waste is one of the fastest-growing waste
streams. Although e-waste contains complex combinations of highly toxic substances that pose a danger
to health and the environment, many of the products also contain recoverable precious materials, making
it a different kind of waste compared with traditional municipal waste.
Figure: an electronic system and a waste product

Globally, e-waste constitutes more than 5 percent of all municipal solid waste and is increasing with the
rise of sales of electronic products in developing countries. The majority of the world's e-waste is
recycled in developing countries, where informal and hazardous setups for the extraction and sale of
metals are common. Recycling companies in developed countries face strict environmental regulatory
regimes and an increasing cost of waste disposal and thus may find exportation to small traders in
developing countries more profitable than recycling in their own countries. There is also significant illegal
transboundary movement of e-waste in the form of donations and charity from rich industrialized nations
to developing countries. E-waste profiteers can harvest substantial profits owing to lax environmental
laws, corrupt officials, and poorly paid workers, and there is an urgent need to develop policies and
strategies to dispose of and recycle e-waste safely in order to achieve a sustainable future.

Impacts On Human Health

The complex composition and improper handling of e-waste adversely affect human health. A growing
body of epidemiological and clinical evidence has led to increased concern about the potential threat of e-
waste to human health, especially in developing countries such as India and China. The primitive
methods used by unregulated backyard operators to reclaim, reprocess, and recycle e-waste materials
expose the workers to a number of toxic substances. Processes such as dismantling components, wet
chemical processing, and incineration are used and result in direct exposure and inhalation of harmful
chemicals. Safety equipment such as gloves, face masks, and ventilation fans are virtually unknown, and
workers often have little idea of what they are handling.

For instance, in terms of health hazards, open burning of printed wiring boards increases the
concentration of dioxins in the surrounding areas. These toxins cause an increased risk of cancer if
inhaled by workers and residents. Toxic metals and poison can also enter the bloodstream during the
manual extraction and collection of tiny quantities of precious metals, and workers are continuously
exposed to poisonous chemicals and fumes of highly concentrated acids. Recovering resalable copper by
burning insulated wires causes neurological disorders, and acute exposure to cadmium, found in
semiconductors and chip resistors, can damage the kidneys and liver and cause bone loss. Long-term
exposure to lead on printed circuit boards and computer and television screens can damage the central
and peripheral nervous system and kidneys, and children are more susceptible to these harmful effects.

Environmental Impacts

Although electronics constitute an indispensable part of everyday life, their hazardous effects on the
environment cannot be overlooked or underestimated. The interface between electrical and electronic
equipment and the environment takes place during the manufacturing, reprocessing, and disposal of
these products. The emission of fumes, gases, and particulate matter into the air, the discharge of liquid
waste into water and drainage systems, and the disposal of hazardous wastes contribute to
environmental degradation. In addition to tighter regulation of e-waste recycling and disposal, there is a
need for policies that extend the responsibility of all stakeholders, particularly the producer.

Activity 5: Critical Thinking Analysis


Answer the question below on short bond paper as a printed output: two PowerPoint slides (or captured
pictures, for those without an internet connection). The third page will be your explanation, written using MS Word.

Develop the best idea to present e-waste.


Question: Why is it important to recycle e-waste?

LESSON 6: SOFTWARE CONCEPTS


LEARNING OBJECTIVES

• To understand fundamental software concepts


• To understand and identify different kinds of software
• To know the importance of debugging in computer systems and software development
• Understand the usage of CAPTCHA
• Identify and understand the various computer components
• To understand and identify security threats and the importance of security
Software

Software is a set of instructions, data or programs used to operate computers and
execute specific tasks. Opposite of hardware, which describes the physical aspects of a
computer, software is a generic term used to refer to applications, scripts and programs that run
on a device. Software can be thought of as the variable part of a computer and hardware the
invariable part.

Software is often divided into application software, or user downloaded programs that
fulfill a want or need, and system software, which includes operating systems and any program
that supports application software. The term middleware is sometimes used to describe
programming that mediates between application and system software or between two different
kinds of application software. For example, middleware could be used to send a remote work
request from an application in a computer that has one kind of operating system to an application
in a computer with a different operating system.

An additional category of software is the utility, which is a small, useful program with
limited capability. Some utilities come with operating systems. Like applications, utilities tend to be
separately installable and capable of being used independently from the rest of the operating
system.

Similarly, applets are small applications that sometimes come with the operating system
as accessories. They can also be created independently using Java or other programming
languages.

Software can be purchased or acquired in the following ways:

Shareware- usually distributed on a free or trial basis with the intention of sale when the
period is over.

Liteware- a type of shareware with some capabilities disabled until the full version is
purchased.

Freeware- can be downloaded for free but with copyright restrictions.

Public domain software- can be downloaded for free without restrictions.

Open source- a type of software where the source code is furnished and users agree not
to limit the distribution of improvements.
Today, much of the purchased software, shareware and freeware is directly downloaded
over the Internet. In these cases, software can be found on specific vendor websites or
application service providers. However, software can also be packaged on CD-ROMs or diskettes
and sold physically to a consumer.

Some general kinds of application software include:

Productivity software, which includes tools such as word processors and spreadsheets.

Presentation software, also known as slideware.

Graphics software.

CAD/CAM.

Vertical market or industry-specific software, for example, banking, insurance and
retail applications.

Examples and types of software

Below is a list of the different kinds of software a computer may have installed, with examples of
related programs.

It should be noted that although application software is thought of as a program, it can be
anything that runs on a computer.

A specialized type of software that allows hardware to run is firmware. This is a type of
programming that is embedded onto a special area of the hardware's nonvolatile memory, such
as a microprocessor or read-only memory, on a one-time or infrequent basis so that thereafter it
seems to be part of the hardware.

Software can be purchased at a retail computer store or online and come in a box containing all
the disks (floppy diskette, CD, DVD, or Blu-ray), manuals, warranty, and other documentation.

Software can also be downloaded to a computer over the Internet. Once downloaded, setup files
are run to start the installation process on your computer.

Free software

There are also a lot of free software programs available that are separated into different
categories.

• Shareware or trial software is software that gives you a few days to try the software before
you have to buy the program. After the trial time expires, you'll be asked to enter a code or
register the product before you can continue to use it.

• Freeware is completely free software that never requires payment, as long as it is not
modified.
• Open source software is similar to freeware. Not only is the program given away for free, but
the source code used to make the program is as well, allowing anyone to modify the program
or view how it was created.

How do you use computer software?

Once the software is installed on the computer hard drive, the program can be used anytime by
finding the program on the computer. On a Windows computer, a program icon is added to the
Start menu or Start screen, depending on your version of Windows.

How to maintain software

After the software is installed on your computer, it may need to be updated to fix any found
errors. Updating a program can be done using software patches. Once updates are installed, any
problems that may have been experienced in the program will no longer occur.

How is software created and how does it work?

A computer programmer (or several computer programmers) writes the instructions using a
programming language, defining how the software should operate on structured data. The
program is then either interpreted or compiled into machine code.

When I save a document, is that file also considered software?

When you create or edit a file using your software — a Microsoft Word document, for instance, or
a Photoshop image — that file is considered a "resource" or "asset" used by the software.
However, the file itself is not considered "software" even though it is an essential part of what
your software is doing.

What was the first piece of computer software?

The first software program that was held in electronic memory was written by Tom Kilburn. The
program calculated the highest factor of the integer 2^18 = 262,144, and was successfully executed
on June 21, 1948, at the University of Manchester, England. The computer that held that program
was called the SSEM (Small Scale Experimental Machine), otherwise known as the "Manchester
Baby." This event is widely celebrated as the birth of software.

SYSTEM SOFTWARE

This type of software allows direct interaction between the user and the
hardware components of the computer system. Because humans and machines speak
and understand different languages, there must be an interface that allows the end user to
interact with the computer system. System software is also called the main or alpha software
of a computer system because it handles the major operations of the running hardware.
System software is further divided into four major types:
1. The Operating System (OS) – It is a major program that is responsible for governing
and maintaining the cooperation of the components of the computer system, e.g.,
Microsoft Windows, Linux, Mac OS etc.

2. The Language Processor – The language of a computer system's hardware components is
not understandable by humans. These are the three languages involved in human-machine
interaction:

o Machine-level Language: the machine can only understand digital signals, or the
binary codes of 0's and 1's. This language is machine dependent.
o Assembly-level Language: this language is also referred to as the Low-level Language
(LLL); it forms a correspondence between machine-level instructions and general
assembly-level statements. Mnemonics are used to represent each low-level machine
instruction or operation code (op-code), e.g., ADD for adding two entities, HALT for
stopping a process, etc. It is also machine dependent and varies from processor to
processor.
o High-level Language: this is the language understandable by humans and is used to
program and code. It is easy to read and understand. Examples of this language are Java, C,
C++, Python etc.

Machine-level language is very complex; therefore users choose the high-level language
more often because of its convenience in terms of coding. The code must be converted to
machine language so the computer system can understand it. This conversion process is
performed by the Language Processor, which is made up of three components (a toy
illustration of an assembler is sketched right after this numbered list):

o Assembler: a language processor that converts assembly language into machine language
[assembly language → machine language].
o Compiler: this language processor converts a high-level language into machine-level
language in one go, and the execution is fast. It checks the errors of the whole program at once;
thus error detection is more difficult. Languages like C, C++ and Scala use a compiler.
o Interpreter: this language processor also converts a high-level language into machine-
level language, but line by line, thus the execution time is longer. However, error detection is
easier because it returns the line where the error occurred. Programming languages such
as Python, Ruby and Java use an interpreter.
3. The Device Drivers – these act as an interface between the various input-output devices and the
users or the operating system. Devices such as printers and external web cameras come with a
driver disk that must be installed before the device can be used in the system.
4. The BIOS – stands for Basic Input Output System; it is a small firmware that controls the
peripheral or input-output devices attached to the system. It is also responsible for starting the
OS by initiating the booting process and the POST (Power-On Self-Test).
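To make the idea of an assembler more concrete, here is a minimal sketch in Python. It is purely illustrative: the mnemonics, opcode values and sample program are invented for this lesson and do not correspond to any real processor.

# Hypothetical mnemonic-to-opcode table, invented for illustration only.
OPCODES = {
    "LOAD": 0x01,   # load a value into the accumulator
    "ADD":  0x02,   # add a value to the accumulator
    "HALT": 0xFF,   # stop the program
}

def assemble(source_lines):
    """Translate assembly-style lines like 'ADD 5' into (opcode, operand) pairs."""
    machine_code = []
    for line in source_lines:
        parts = line.split()
        mnemonic = parts[0].upper()
        operand = int(parts[1]) if len(parts) > 1 else 0
        machine_code.append((OPCODES[mnemonic], operand))
    return machine_code

program = ["LOAD 2", "ADD 5", "HALT"]
print(assemble(program))   # [(1, 2), (2, 5), (255, 0)]

A compiler performs this kind of translation for a whole high-level program in one go, while an interpreter translates and executes it line by line, which is why an interpreter can point to the exact line where an error occurs.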
Application Software

These are the basic software used to accomplish a particular action or task. These are
dedicated software, dedicated to performing simple, single tasks. For example, a single piece of
software cannot serve both a reservation system and a banking system. These are divided into
two types:
1. The General Purpose Application Software: These are the types of application software
that come built-in and ready to use, manufactured by a company or an individual. For
e.g.,
• Microsoft Excel – Used to prepare excel sheets.
• VLC Media Player – Used to play audio/video files.
• Adobe Photoshop – Used for designing and animation and many more.
2. The Specific Purpose Application Software: These are the type of software that is
customizable and mostly used in real-time or business environment. For e.g.,
• Ticket Reservation System
• Healthcare Management System
• Hotel Management System
• Payroll Management System
WHAT IS A WORD PROCESSOR?

Sometimes abbreviated as WP, a word processor is a software program capable of creating,
storing, and printing typed documents. Today, the word processor is one of the most frequently
used software programs on a computer, with Microsoft Word being the most popular word
processor.

Word processors can be used to create multiple types of files, including text files (.txt), rich text
files (.rtf), HTML files (.htm & .html), and Word files (.doc & .docx). Some word processors can
also be used to create XML files (.xml).

OVERVIEW OF A WORD PROCESSOR

In a word processor, you are presented with a blank white sheet as shown below. Text is
added to the document area and, after it has been inserted, can be formatted or adjusted to your
preference. Below is an example of a blank Microsoft Word window with areas of the window
highlighted.

FEATURES OF A WORD PROCESSOR

A word processor offers dozens of additional features that can give your document or other text a
more professional appearance. Below is a listing of some of the most popular features of a word
processor.

• Résumé - Create or maintain your résumé.

WHAT IS A PRESENTATION TOOL?

A presentation tool is a software package used to display information in the form of a
slide show. It has three major functions: an editor that allows text to be inserted and formatted, a
method for inserting and manipulating graphic images, and a slide-show system to display the
content. Presentation software can be viewed as enabling a functionally-specific category of
electronic media, with its own distinct culture and practices as compared to traditional
presentation media.

OVERVIEW OF A PRESENTATION TOOL

In a presentation tool, you are presented with a blank slide as shown below. Text is
added to the slide area and, after it has been inserted, can be formatted or adjusted to your
preference. Below is an example of a blank Microsoft PowerPoint window.
FEATURES OF A PRESENTATION TOOL

PowerPoint is the presentation software of the Microsoft Office software suite. One of the
most widely used office programs, PowerPoint has applications for personal use, academics and
business. Below are five features you should be using – if you aren't already. Learn everything
about these tips: they will improve your presentation skills and allow you to communicate your
message
successfully.

• ADDING SMART ART


SmartArt is a comprehensive and flexible business diagram tool that greatly improves
upon the ‘Diagram Gallery’ feature found in previous versions of Office. SmartArt can be used to
create professional diagrams that include pictures and text or combinations of the two. An obvious
use of SmartArt would be to create an organization chart but it can be used for many different
kinds of diagrams and even to provide some variety to slides using text bullet points.

• INSERTING SHAPES
If you need to include some sort of diagram in your presentation, then the quickest and easiest
way is probably to use SmartArt. However, it is important to be able to include shapes
independently of SmartArt and worth being familiar with the various Drawing Tool format options.

Not only will they be useful if you do need to manually draw a diagram (and SmartArt doesn’t suit
all diagrams), but they can also be applied to objects on a slide that you might not immediately
think of as shapes.

As you can see, the gallery of available shapes is very extensive. Once you have selected your
chosen shape, you can just click in your slide to insert a default version of the shape or, to set a
particular size and position, click and drag with the mouse to create the shape and size you want.

• INSERTING AN IMAGE

There are two content-type icons which appear in new content placeholders for inserting pictures.
You can Insert Picture from File or Insert Clip Art. Alternatively, the Illustrations group of the Insert
ribbon tab includes the same two tools.

Insert Picture from File allows you to browse to an image file saved somewhere on your system,
whereas Clip Art is held in an indexed gallery of different media types. Clip Art is not limited to
pictures; it also includes the following:

• Illustrations
• Photographs
• Video
• Audio

Once you have found the image you want to use, click on it to insert it into the current slide. You
can now re-size and move the image accordingly with further editing options available when you
right click the desired image.

• SLIDE TRANSITIONS
Properly used, slide transitions can make your presentations clearer and more interesting and,
when appropriate, more fun. Badly used, the effect of slide transitions can be closer to irritating or
even nauseating. Simple animation effects are often used to add interest to bullet point text. Much
more extreme animation effects are available but, in most cases, should be used sparingly if at
all.

Two main kinds of animation are available in a PowerPoint presentation: the transition from one
slide to the next and the animation of images/text on a specific slide.

In PowerPoint 2010 & 2013 there is also a separate Transitions ribbon tab that includes a gallery
of different transition effects. These can be applied to selected slides or all slides. If you want to
apply different transition effects to different groups of slides, then you might want to choose ‘Slide
Sorter’ view from the Presentation Views group of the View ribbon.

• ADDING ANIMATION
Whereas the transition effects are limited to a single event per slide, animations can be applied to
every object on a slide – including titles and other text boxes. Many objects can even have
animation applied to different components, for example each shape in a SmartArt graphic, each
paragraph in a text box and each column in a chart. Animations can be applied to three separate
‘events’ for each object:

• Entrance – how the object arrives on the slide


• Emphasis – an effect to focus attention on an object while it is visible
• Exit – how the object disappears from the slide

To apply an animation effect, choose the object or objects to be animated, then choose Animation
Styles or Add Animation from the Animations toolbar.

Where an animation is applied to an object with different components (for instance a SmartArt
graphic made up of several boxes), the Effect Options tool becomes available to control how each
component will be animated. So for example, your animation can be used to introduce elements
of an organization chart to your slide one by one.

EXAMPLE AND TOP USE OF PRESENTATION TOOL

• Presenting a topic
• Business presentations
• Teaching

SOME OTHER EXAMPLE OF PRESENTATION TOOLS

These are some of the other presentation tools that are available:

• VISME- is a cloud-based presentation tool that allows you to create highly visual
presentations to engage viewers and communicate your ideas. It features an intuitive,
drag-and-drop design method for creating presentations. The business version also
prioritizes brand consistency and company-wide image storage. When you or your
employees create a presentation, it will feature colors, logos and images that are on
brand for your organization.
• HAIKU DECK-is a platform that prioritizes simplicity. Business owners can create
elegant, basic presentations with high-quality images. The spartan approach allows for
connecting with audiences instead of losing them in information overload due to text-
heavy slides.
• PITCHERIFIC-is not only a presentation solution, but also a platform for building and
practicing your presentation. It's a template-based program that guides you through the
presentation creation process. Instead of drafting a few slides, Pitcherific prompts you to
write out the areas of each part of your speech. The outline for an elevator pitch, for
example, includes a hook, problem, solution and closing.
• CANVA-is an online platform that provides templates for a wide range of business-related
publications, like resumes, newsletters, business cards, media kits, brochures and
infographics. You can also use it to construct presentations.
• SLIDECAMP- provides slide templates for creating company presentations. You can
adjust color schemes, add company logos, import charts and data, build infographics, and
organize presentations into sections with SlideCamp. This is a great solution for
maintaining presentation consistency across multiple presentations from your
organization.
• POWTOON-is an animated presentation and video platform for creating short
informational videos and presentations about your brand or product. Explainer videos are
an important part of a brand's message, and Powtoon is an affordable tool for creating
animated videos and presentations to educate consumers and clients about your
business. You can easily edit presentations and videos, add voiceover, and build a
professional experience for your customers.
• VIDEOSCRIBE-is a whiteboard video presentation platform that allows small businesses
to customize their presentations to fit their needs. These videos, which feature a
whiteboard and hand that "draws" different objects and slides in the presentation, are
ideal for quick explainers and marketing videos on your business or product. You can
easily place objects, insert text, and even draw your own objects or text with
VideoScribe's platform.

• PREZI- is another template-based presentation solution that you can use to create
persuasive and engaging presentations with unique movement between "slides" and key
points. Prezi maps out your whole presentation on an overall track that you decide. When
you switch slides, it doesn't simply advance to the next one; it takes the viewer through
the track to the point that needs to be made. This allows your audience to visualize the
progression of your presentation. You can arrange content under different sections and
create an overview so your audience can see your entire presentation plan. This method
keeps the presentation organized and your audience engaged. You can also navigate
freely through your presentation – your track is not locked in and you can adjust when you
address which points as you're presenting.

WHAT IS A SPREADSHEET?

A spreadsheet or worksheet is a file made of rows and columns that help sort data,
arrange data easily, and calculate numerical data. What makes a spreadsheet software program
unique is its ability to calculate values using mathematical formulas and the data in cells. A good
example of how a spreadsheet may be utilized is creating an overview of your bank's balance.

OVERVIEW OF A SPREADSHEET
Below is a basic example of what a Microsoft Excel spreadsheet looks like, as well as all the
important features of a spreadsheet highlighted.

In the above example, this spreadsheet is listing three different checks, the date, their description,
and the value of each check. These values are then added together to get the total of $162.00 in
cell D6. That value is subtracted from the check balance to give an available $361.00 in cell D8.

Difference between a workbook, worksheet, and spreadsheet

Because the terms spreadsheet, workbook, and worksheet are so similar, there can be a
lot of confusion when trying to understand their differences. When you open Microsoft Excel (a
spreadsheet program), you're opening a workbook. A workbook can contain one or more different
worksheets that can be accessed through the tabs at the bottom of the worksheet you’re currently
viewing. What's often most confusing is that a worksheet is synonymous with a spreadsheet. In
other words, a spreadsheet and worksheet mean the same thing. However, most people only
refer to the program as a spreadsheet program and the files it creates as spreadsheet files or
worksheets.

Although spreadsheets are most often used with anything containing numbers, the uses of a
spreadsheet are almost endless. Below are some other popular uses of spreadsheets.

• Finance- Spreadsheets are ideal for financial data, such as your checking account
information, budgets, taxes, transactions, billing, invoices, receipts, forecasts, and any
payment system.
• Forms- Form templates can be created to handle inventory, evaluations, performance
reviews, quizzes, time sheets, patient information, and surveys.

• School and grades- Teachers can use spreadsheets to track students, calculate grades,
and identify relevant data, such as high and low scores, missing tests, and students who
are struggling.

• Lists- Managing a list in a spreadsheet is a great example of data that does not contain
numbers, but still can be used in a spreadsheet. Great examples of spreadsheet lists
include telephone, to-do, and grocery lists.
• Sports- Spreadsheets can keep track of your favorite player stats or stats on the whole
team. With the collected data, you can also find averages, high scores, and statistical
data. Spreadsheets can even be used to create tournament brackets.

What is an active worksheet?

An active worksheet is the worksheet that is currently open. For example, in the Excel picture
above, the sheet tabs at the bottom of the window show "Sheet1," "Sheet2," and "Sheet3," with
Sheet1 being the active worksheet. The active tab usually has a white background behind the tab
name.

How many worksheets open by default?


In Microsoft Excel 2016 and earlier and OpenOffice Calc, by default, there are three sheet tabs
that open (Sheet1, Sheet2, and Sheet3). In Google Sheets, your spreadsheets start with one
sheet (Sheet1).

In Microsoft Excel 365, by default, there is only one sheet tab that opens (Sheet1).

What is the length limit of a worksheet name?

Not to be confused with the file name, in Microsoft Excel, there is a 31 character limit for each
worksheet name.

How are rows and columns labeled?

In all spreadsheet programs, including Microsoft Excel, rows are labeled using numbers (e.g., 1
to 1,048,576). All columns are labeled with letters from A to Z, then with two letters. For example,
after the letter Z, the next column is AA, AB, AC, ..., AZ and then incrementing to BA, BB, BC,
etc., to the last column XFD.

When working with a cell, you combine the column with the row. For example, the very first cell is
in column A and on row 1, so the cell is labeled as A1.
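The lettering scheme described above (A to Z, then AA, AB, and so on up to XFD) is essentially a base-26 numbering system. The short Python sketch below, written only for illustration, converts a 1-based column number into its spreadsheet-style label:

def column_label(n):
    """Convert a 1-based column number into a spreadsheet column label (1 -> A, 27 -> AA)."""
    label = ""
    while n > 0:
        n, remainder = divmod(n - 1, 26)
        label = chr(ord("A") + remainder) + label
    return label

print(column_label(1))      # A
print(column_label(26))     # Z
print(column_label(27))     # AA
print(column_label(16384))  # XFD, the last column mentioned above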

Why not use a word processor instead of a spreadsheet?

While it may be true that some of the things mentioned above could be done in a word processor,
spreadsheets have a huge advantage over word processors when it comes to numbers. It would
be impossible to calculate multiple numbers in a word processor and have the value of the
calculation immediately appear. Spreadsheets are also much more dynamic with the data and
can hide, show, and sort information to make processing lots of information easier.

WHAT IS LIBRARY MANAGEMENT SOFTWARE?

Library Management Software is an application that allows automation of libraries
and book databases. The software is commonly used by libraries and librarians to be able to
manage and access their library resources through a single, computer-based platform. Such
applications make it easy for library staff to manage books and records. Self-service or web-
based library management applications allow users to efficiently search online libraries for a desired
book or material and read it online.

FEATURES OF A LIBRARY MANAGEMENT SOFTWARE

Library management software gives you a lot of options and features that you can
use in order to have a more accessible system.

• Acquisition management: helping a library keep track of new print and digital additions to
the collection
• Barcode scanning: simply being able to check items in and out
• Barcoding: the capacity to add a barcode to a new or damaged acquisition
• Catalog management: keeping track digitally of what is available in the collection or,
when interlibrary loan is relevant, in the broader available system
• Circulation management: tracking who has what and when items are due
• Fee collection: keeping track of fines owed to the library
• OPAC: access to an online public access catalogue of various public libraries

• Patron management: keeping track of information about patrons and their records
• Periodicals management: managing journals and magazines available digitally or in print
• Reserve shelf management: for libraries that allow teachers to keep items on reserve
• Search functions: allows patrons and librarians to complete a catalog search on various
levels
• Self check-in/check-out: allowing patrons to check their own items in and out
• Serials management: keeping track of the serials in the library.

Activity 6 (ASSIGNMENT)
Directions: Self-Assessment. Write your answers on a SEPARATE PAPER and determine the possible
answers.

1. What are examples of application software?

2. How do I pass a software engineer interview?

3. What is the most useful application software for you?

Note:

After this chapter, review for a while and take the midterm exam.

LESSON 7: UTILITY SOFTWARE


LEARNING OBJECTIVES
 Know what utility software is
 Know the types of utility software and their uses
 Know the basic principles and strategies of software utilities

Utility Software

Utility software, often referred to as a utility, is system software that is designed to help analyze,
configure, optimize or maintain a computer and enhance the computer's performance. It is a
program that performs a specific task, which is usually related to managing the system resources.
Utilities are sometimes also installed as memory-resident programs. Utility software usually
focuses on how the computer infrastructure (which includes computer hardware, application
software, the operating system and data storage programs) operates. These utilities range from
the small and simple to the large and complex, and can perform either a single task or multiple
tasks. Some of the functions performed by these utilities are data compression, disk
defragmentation, data recovery, management of computer resources and files, system diagnosis,
virus detection, and many more. Utility software has been designed specifically to help in the
management and tuning of the operating system, computer hardware and application software of a
system.

• It performs a specific and useful function to maintain and increase the efficiency of a
computer system

• Aids in keeping the computer free from unwanted software threats such as viruses or
spyware

• Adds functionality that allows the user to customize the desktop and user interface

• Manages computer memory and enhances performance. In general, these programs assist
the user to make and run their computer better. They are also used for password protection,
memory management, virus protection, and file compression in order to manage all the computer
functions, resources and files efficiently

Types of Utility Software

These are the different types of utility software.

Application Launchers – a computer program that helps a user to locate and start other
computer programs.

Antivirus software - Antivirus software, or anti-virus software (abbreviated to AV software), also
known as anti-malware, is a computer program used to prevent, detect, and remove malware.
Antivirus software was originally developed to detect and remove computer viruses, hence the
name. However, with the proliferation of other kinds of malware, antivirus software started to
provide protection from other computer threats.

Backup software – Backup software are computer programs used to perform backup; they
create supplementary exact copies of files, databases or entire computers. These programs may
later use the supplementary copies to restore the original contents in the event of data loss.

Batch renaming - Batch renaming is a form of batch processing used to rename multiple
computer files and folders in an automated fashion, in order to save time and reduce the amount
of work involved. Some sort of software is required to do this. Such software can be more or less
advanced, but most have the same basic functions.
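As a minimal sketch of what batch-renaming software does behind the scenes, the Python snippet below renames every .txt file in a folder by giving it a numbered prefix. The folder path and naming pattern are hypothetical examples only:

import os

def batch_rename(folder, extension=".txt", prefix="report"):
    """Rename every file with the given extension to prefix_001.ext, prefix_002.ext, ..."""
    files = sorted(f for f in os.listdir(folder) if f.endswith(extension))
    for index, name in enumerate(files, start=1):
        new_name = f"{prefix}_{index:03d}{extension}"
        os.rename(os.path.join(folder, name), os.path.join(folder, new_name))
        print(name, "->", new_name)

# Example call (hypothetical path):
# batch_rename("C:/Users/student/Documents/notes")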

Compiler - A compiler is a computer program that translates computer code written in one
programming language (the source language) into another language (the target language).

Data compression - In signal processing, data compression, source coding, or bit-rate reduction
involves encoding information using fewer bits than the original representation. Compression can
be either lossy or lossless. Lossless compression reduces bits by identifying and eliminating
statistical redundancy.
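The effect of lossless compression on repetitive data can be demonstrated with a few lines of Python using the standard zlib module. This is only a small illustration of the idea, not a description of any particular compression utility:

import zlib

# Highly repetitive data compresses well because of its statistical redundancy.
original = b"computer hardware and software " * 100
compressed = zlib.compress(original)
restored = zlib.decompress(compressed)

print(len(original), "bytes before compression")
print(len(compressed), "bytes after compression")
print(restored == original)   # True: lossless compression restores the exact original data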

Debugger – A debugger or debugging tool is a computer program that is used to test and debug
other programs (the "target" program).
Decompiler - A decompiler is a computer program that takes an executable file as input, and
attempts to create a high level source file which can be recompiled successfully. It is therefore the
opposite of a compiler, which takes a source file and makes an executable. Decompilers are
usually unable to perfectly reconstruct the original source code, and as such, will frequently
produce obfuscated code.

Diagnostic Program - A diagnostic program (also known as a Test Mode) is an automatic
computer program sequence that determines the operational status within the software,
hardware, or any combination thereof in a component, a system, or a network of systems.
Diagnostic programs ideally provide the user with guidance regarding any issues or problems
found during its operation.

Disk Cleaner – A disk cleaner is a utility that searches a drive for files that are no longer needed,
such as temporary files, caches and items left in the recycle bin, and removes them in order to
free up space on the disk. Disk cleaners are one of several kinds of disk utilities, alongside disk
checkers and disk space analyzers.

Disk Cloning - Disk cloning is the process of creating a 1-to-1 copy of a hard disk drive (HDD) or
solid-state drive (SSD), not just its files. Disk cloning may be used for upgrading a disk or
replacing an aging disk with a fresh one. In this case, the clone can replace the original disk in its
host computer.

Disk Compression - A disk compression software utility increases the amount of information that
can be stored on a hard disk drive of given size. Unlike a file compression utility, which
compresses only specified files—and which requires the user to designate the files to be
compressed—an on-the-fly disk compression utility works automatically through resident software
without the user needing to be aware of its existence.

Disk Defragmenter - In the maintenance of file systems, defragmentation is a process that
reduces the degree of fragmentation. It does this by physically organizing the contents of the
mass storage device used to store files into the smallest number of contiguous regions
(fragments). It also attempts to create larger regions of free space using compaction to impede
the return of fragmentation.

Disk Editor - A disk editor is a computer program that allows its user to read, edit, and write raw
data (at the character or hexadecimal, byte level) on disk drives (e.g., hard disks, USB flash disks
or removable media such as floppy disks); as such, they are sometimes called sector editors,
since the read/write routines built into the electronics of most disk drives require data to be read
and written in chunks of sectors (usually 512 bytes).

Disk Partitioning- Disk partitioning or disk slicing is the creation of one or more regions on
secondary storage, so that each region can be managed separately. These regions are called
partitions. It is typically the first step of preparing a newly installed disk, before any file system is
created.

Disk Repair Software - will look at the file system on disk, look for inconsistencies, and attempt
to correct them.
Data Security - refers to protective digital privacy measures that are applied to prevent
unauthorized access to computers, databases and websites. Data security also protects data
from corruption. Data security is an essential aspect of IT for organizations of every size and type.

Data security is also known as information security (IS).

Disk space analyzer – A disk space analyzer (or disk usage analysis software) is a software
utility for the visualization of disk space usage by getting the size for each folder (including sub-
folders) and files in a folder or drive. Most of these applications analyze this information to
generate a graphical chart showing disk usage distribution according to folders or other user
defined criteria.
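At its core, a disk space analyzer simply walks the folder tree and adds up file sizes. The Python sketch below shows that idea in miniature; the path in the example call is hypothetical:

import os

def folder_size(path):
    """Return the total size, in bytes, of all files under the given folder."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            file_path = os.path.join(root, name)
            if os.path.isfile(file_path):        # skip broken links and missing files
                total += os.path.getsize(file_path)
    return total

# Example call (hypothetical path):
# print(folder_size("C:/Users/student/Documents") / (1024 * 1024), "MB")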

Disk utility - A disk utility is a utility program that allows a user to perform various functions on a
computer disk, such as disk partitioning and logical volume management, as well as multiple
smaller tasks such as changing drive letters and other mount points, renaming volumes, disk
checking, and disk formatting, which are otherwise handled separately by multiple other built-in
commands.

File archiver- A file archiver is a computer program that combines a number of files together into
one archive file, or a series of archive files, for easier transportation or storage. File archivers may
employ lossless data compression in their archive formats to reduce the size of the archive.
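Python's standard zipfile module can illustrate what a file archiver does: several files are combined into a single compressed archive. The file names below are placeholders and are assumed to exist in the current folder:

import zipfile

# Combine several (hypothetical) files into one compressed archive file.
with zipfile.ZipFile("lesson_files.zip", "w", compression=zipfile.ZIP_DEFLATED) as archive:
    for name in ["lesson6.docx", "activity6.xlsx", "slides.pptx"]:
        archive.write(name)          # add each file to the single archive

# Later, the archive contents can be listed or extracted.
with zipfile.ZipFile("lesson_files.zip") as archive:
    print(archive.namelist())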

File comparison - In computing, file comparison is the calculation and display of the differences and
similarities between data objects, typically text files such as source code.
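Python's built-in difflib module gives a quick feel for how a file-comparison (diff) utility reports differences between two versions of a text; the sample lines are made up for this illustration:

import difflib

old_lines = ["Utility software helps maintain a computer.", "It performs a specific task."]
new_lines = ["Utility software helps maintain and optimize a computer.", "It performs a specific task."]

# unified_diff marks removed lines with '-' and added lines with '+'.
for line in difflib.unified_diff(old_lines, new_lines, fromfile="old.txt", tofile="new.txt", lineterm=""):
    print(line)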

File Compression - a software program that is used to compress or
decompress files. Most often such a software program is used to compress files of various
formats that are no longer being actively used and reduce their size so that they take up about
40 percent less space on hard disk.

File manager - A file manager or file browser is a computer program that provides a user
interface to manage files and folders. The most common operations performed on files or groups
of files include creating, opening (e.g. viewing, playing, editing or printing), renaming, moving or
copying, deleting and searching for files, as well as modifying file attributes, properties and file
permissions.

File synchronization- File synchronization (or syncing) in computing is the process of ensuring
that computer files in two or more locations are updated via certain rules.

Filesystem-level encryption- often called file-based encryption, FBE, or file/folder encryption, is
a form of disk encryption where individual files or directories are encrypted by the file system
itself.

Hex editor- A hex editor (or binary file editor or byte editor) is a computer program that allows for
manipulation of the fundamental binary data that constitutes a computer file. The name 'hex'
comes from 'hexadecimal': a standard numerical format for representing binary data.
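A hex editor essentially shows every byte of a file as a two-digit hexadecimal number. The small Python sketch below prints such a view for the first bytes of a file; the file name in the example call is a placeholder:

def hex_view(path, length=32):
    """Print the first `length` bytes of a file as hexadecimal values, 16 per row."""
    with open(path, "rb") as f:          # binary mode gives the raw bytes
        data = f.read(length)
    for offset in range(0, len(data), 16):
        row = data[offset:offset + 16]
        print(f"{offset:08X}  " + " ".join(f"{byte:02X}" for byte in row))

# Example call (placeholder file name):
# hex_view("example.bin")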

HookSafe - a hypervisor-based lightweight system that safeguards a computer's kernel from
rootkit attacks.
Linker- In computing, a linker or link editor is a computer utility program that takes one or more
object files generated by a compiler or an assembler and combines them into a single executable
file, library file, or another 'object' file.

Memory tester- Memory testers are specialized test equipment used to test and verify memory
modules.

Network Managers – used to monitor, maintain and provision computer networks. It helps you to
keep track of the network's bandwidth, availability, performance and hardware.

Network utility- Network utilities are software utilities designed to analyze and configure various
aspects of computer networks. The majority of them originated on Unix systems, but several later
ports to other operating systems exist.

Registry cleaner- A registry cleaner is a class of third party software utility designed for the
Microsoft Windows operating system, whose purpose is to remove redundant items from the
Windows registry.

Screensaver- A screensaver (or screen saver) is a computer program that blanks the screen or
fills it with moving images or patterns when the computer is not in use. The original purpose of
screensavers was to prevent phosphor burn-in on CRT and plasma computer monitors (hence
the name).

Source-to-source compiler - (S2S compiler), transcompiler or transpiler is a type of compiler
that takes the source code of a program written in a programming language as its input and
produces the equivalent source code in the same or a different programming language. A source-
to-source compiler translates between programming languages that operate at approximately the
same level of abstraction, while a traditional compiler translates from a higher-level programming
language to a lower-level programming language.

System monitor - is a hardware or software component used to monitor system resources and
performance in a computer system. Among the management issues regarding use of system
monitoring tools are resource usage and privacy.

System profiler - is a program that can provide detailed information about the software installed
and hardware attached to a computer. Typically, workstations and personal computers have had
system profilers as a common feature since the mid-1990s.

Uninstaller- also called a deinstaller, is a variety of utility software designed to remove other
software or parts of it from a computer. It is the opposite of an installer. Uninstallers are useful
primarily when software components are installed in multiple directories, or where some software
components might be shared between the system being uninstalled and other systems that
remain in use.

MORE SOFTWARE CONCEPTS

1. Concept of Booting

The term boot is used to describe the process taken by the computer when turned on that loads
the operating system and prepares the system for use.
Booting, boot up, and start-up are all synonymous terms and generally describe the long
list of things that happen from the pressing of the power button to a fully-loaded and ready-to-use
session of an operating system, like Windows.

What Goes On During the Boot Process?

From the very beginning, when the power button is pressed to turn the computer on, the
power supply unit gives power to the motherboard and its components so that they can play their
part in the whole system.

The first part of the next step of the boot process is controlled by BIOS and begins after
the POST. This is when POST error messages are given if there's a problem with any of the
hardware.

Following the display of various information on the monitor, like the BIOS manufacturer
and RAM details, BIOS eventually hands the boot process over to the master boot code, that
hands it to the volume boot code, and then finally to the boot manager to handle the rest.

This is how BIOS finds the right hard drive that has the operating system. It does this by
checking the first sector of the hard drives it identifies. When it finds the right drive that has a boot
loader, it loads that into memory so that the boot loader program can then load the operating
system into memory, which is how you use the OS that's installed to the drive.

In newer versions of Windows, BOOTMGR is the boot manager that's used.

That boot process explanation you just read is a very simplistic version of what happens,
but it gives you some idea of what's involved.

There are two types of booting:

1. Cold Booting: the system is started from its initial, powered-off state. The user presses the
power button, the startup instructions are read from the ROM, and the operating
system is automatically loaded into the system.

2. Warm Booting: the system restarts while it is already running, without first being powered
off completely, for example when it restarts automatically because of a power fluctuation or
when the user chooses Restart. Because the system does not start from its initial state,
files that were not properly saved beforehand may be damaged, so the chances of
damaging the system are greater.

2. Security

Security software is a broad term that encompasses a suite of different types of software that
deliver data and computer and network security in various forms. Security software can protect a
computer from viruses, malware, unauthorized users and other security exploits originating from
the Internet. Types of security software include anti-virus software, firewall software, network
security software, Internet security software, malware/spamware removal and protection
software, cryptographic software, and more.
In end-user computing environments, anti-virus and anti-spam software is the most common type
of software used, whereas enterprise users add a firewall and intrusion detection system on top of
it.
Types of Security Software:
• Access Control
• Anti Key Logger
• Anti Malware
• Anti Spyware
• Anti Subversion Software
• Anti-tamper software
• Antivirus software
• Cryptographic software
• Computer Aided Dispatch (CAD)
• Firewall
• Intrusion detection system (IDS)
• Intrusion prevention system (IPS)
• Log management software
• Records Management
• Sandbox
• Security information and event management (SIEM)
3. Debugging

In the context of software engineering, debugging is the process of fixing a bug in the software. In
other words, it refers to identifying, analyzing and removing errors. This activity begins after the
software fails to execute properly and concludes by solving the problem and successfully testing
the software. It is considered to be an extremely complex and tedious task because errors need
to be resolved at all stages of debugging.

Debugging is the process of detecting and removing existing and potential errors (also
called 'bugs') in a software code that can cause it to behave unexpectedly or crash. To prevent
incorrect operation of a software, debugging is used to find and resolve bugs or defects. When
various subsystems or modules are tightly coupled, debugging becomes harder as any change in
one module may cause more bugs to appear in another. Sometimes it takes more time to debug
a program than to code it.

Debugging Process: Steps involved in debugging are:

• Problem identification and report preparation.
• Assigning the report to a software engineer to verify that the defect is genuine.
• Defect analysis using modeling, documentation, finding and testing candidate flaws, etc.
• Defect resolution by making the required changes to the system.
• Validation of corrections.

Debugging Strategies:
1. Study the system for a longer duration in order to understand it. This helps the debugger
to construct different representations of the system to be debugged, depending on the need.
The system is also studied actively to find recent changes made to the software.
2. Backward analysis of the problem, which involves tracing the program backward from the
location of the failure message in order to identify the region of faulty code. A detailed study of
the region is conducted to find the cause of defects.

3. Forward analysis of the program involves tracing the program forward using
breakpoints or print statements at different points in the program and studying the results.
The region where the wrong outputs are obtained is the region that needs to be focused on to
find the defect.
4. Using past experience of debugging software with problems similar in nature. The
success of this approach depends on the expertise of the debugger.
Debugging Tools:
Debugging tool is a computer program that is used to test and debug other programs. A lot of
public domain software like gdb and dbx are available for debugging. They offer console-based
command line interfaces. Examples of automated debugging tools include code based tracers,
profilers, interpreters, etc.
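As a small illustration of the forward-analysis strategy described above, the Python snippet below traces a deliberately buggy function with a print statement; the function and its bug are invented purely for demonstration, and the commented-out pdb line shows where an interactive debugger could take over:

def average(values):
    total = 0
    for v in values:
        total += v
        print("DEBUG: running total =", total)   # print statements placed along the execution path
    # import pdb; pdb.set_trace()                # uncomment to step through the remaining lines interactively
    return total / (len(values) - 1)             # bug: should divide by len(values)

print(average([2, 4, 6]))   # prints 6.0 instead of the expected 4.0, pointing to the faulty line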
5. THREATS
Herein, the term “threat” is defined as any kind of software potentially or directly capable of
inflicting damage to a computer or network and compromising the user's information or rights
(that is, malicious and other unwanted software). In a wider sense, the term "threat" may be used
to indicate any type of potential danger to the security of the computer or network (that is,
vulnerabilities that can result in hacker attacks).

All of the program types stated below have the ability to endanger the user’s data or
confidentiality. Programs that do not conceal their presence (e.g. spam distribution software and
various traffic analyzers) are usually not considered as computer threats, although they can
become threats under certain circumstances.

In Doctor Web classification, all threats are divided according to the level of severity into two
types:
•Major threats – classic computer threats that may perform destructive and illegal actions in
the system on their own (erase or steal important data, crash networks, etc.). This type of
computer threats consists of software that is traditionally referred to as malware (malicious
software), that is, viruses, worms and Trojans.
•Minor threats – computer threats that are less dangerous than major threats, but may be
used by a third person to perform malicious activity. Also, mere presence of minor threats in
the system indicates its low protection level. Among IT security specialists this type of
computer threats is sometimes referred to as grayware or PUP (potentially unwanted
programs) and consists of the following program types: adware, dialers, jokes, riskware, hack
tools.
6. CAPTCHA
What does CAPTCHA mean?
CAPTCHA stands for Completely Automated Public Turing test to tell Computers and Humans
Apart. In other words, CAPTCHA determines whether the user is real or a spam robot.
CAPTCHAs stretch or manipulate letters and numbers, and rely on human ability to determine
which symbols they are.
How Does a CAPTCHA Work?

CAPTCHAs were invented to block spammy software from posting comments on pages or
purchasing excess items at once. The most common form of CAPTCHA is an image with several
distorted letters. It is also common to choose from a variety of images where you need to select a
common theme.

The internet and computers run on code rather than natural language. Computers find it
difficult to understand human languages because of the strange and intricate rules those languages
take on, along with the slang that humans use.

Who Uses CAPTCHA?


CAPTCHA is used on a variety of websites that want to verify that the user is not a robot. First
and foremost, CAPTCHA is used for verifying online polls. In 1999, Slashdot created a poll that
asked visitors to choose the graduate school that had the best program for computer science.
Students from the universities Carnegie Mellon and MIT created bots, or automated programs to
repeatedly vote for their schools.

These schools received thousands of votes, while other schools only hit a few hundred.
CAPTCHA came into play so that users could not take advantage of the polling system.

Another use of CAPTCHA is for registration forms on websites such as Yahoo! Mail or Gmail
where people can create free accounts. CAPTCHAs prevent spammers from using bots to create
a plethora of spam email accounts.

Ticket websites such as TicketMaster also use CAPTCHA to prevent ticket scalpers from over
purchasing tickets for large events. This allows legitimate customers to purchase tickets fairly and
keeps scalpers from placing thousands of ticket orders.

Lastly, web pages or blogs that contain message boards or contact forms use CAPTCHA to
prevent spammy messages or comments. It does not prevent against cyberbullying, but does
prevent bots from posting messages automatically.

Do CAPTCHAs Work?
Unfortunately, as technology and hackers become more advanced, so do their scamming tactics.
While CAPTCHA is safe for the most part, cybercriminals have begun incorporating CAPTCHA
into their false or fraudulent websites to make their scams more believable.

Here are some ways that cyber criminals can trick internet users:

• The scam contains intriguing messages on your newsfeed. Ex. KIM KARDASHIAN NEVER
BEFORE SEEN VIDEO LEAKED. Once you click on this post, you will need to enter a fake
CAPTCHA code and be directed to a landing page. At this time, a virus takes over your
account.
• The scam contains an outlandish title ex. GIRL ACCIDENTALLY TEXTS MOM INSTEAD OF
BOYFRIEND that intrigues users to read a story. The link leads to a fake news site where
software hacking may begin.
How CAPTCHA Prevents Scammers
CAPTCHA has a variety of applications for keeping websites and users secure. These include but
are not limited to:

• Protecting email addresses from scammers
• Protecting website registrations
• Protecting online polling
• Protecting against email worms/junk mail
• Preventing dictionary attacks
• Preventing comment spamming on blogs

7. History of software

From massive machines like the ENIAC computer to smartphones and other mobile devices,
computing has seen incredibly rapid technological change. These hardware upgrades would mean
little, however, without the accompanying birth and growth of software development.

From operating systems and spreadsheets to mobile apps and games, you interact with software
every time you use a computer. Here’s a brief overview of the origins of software development and
the current state of the field.

What Is Software?

Simply put, software is the interface between computer systems and the humans who use them.
Software consists of programming instructions and data that tell the computer how to execute
various tasks. These days, instructions are generally written in a higher-level language, which is
easier to use for human programmers, and then converted into low-level machine code that the
computer can directly understand.
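As an illustration (not part of the original lesson), Python's built-in dis module can show the gap between a high-level instruction and the lower-level instructions a machine actually follows; here the interpreter's bytecode stands in for machine code.

import dis

def add(x, y):
    # One high-level, human-readable instruction.
    return x + y

# Show the lower-level instructions the Python interpreter actually executes.
dis.dis(add)

Running this prints instructions such as LOAD_FAST and RETURN_VALUE, which are much closer to what the machine works with than the original source line.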

The Early Days of Software

Computer scientist Tom Kilburn is responsible for writing the world’s very first piece of software,
which was run at 11 a.m. on June 21, 1948, at the University of Manchester in England. Kilburn
and his colleague Freddie Williams had built one of the earliest computers, the Manchester Small-
Scale Experimental Machine (also known as the “Baby”). The SSEM was programmed to perform
mathematical calculations using machine code instructions. This first piece of software took “only”
52 minutes to correctly compute the greatest proper divisor of 2 to the power of 18 (262,144).
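For a sense of what that first program did, here is a rough Python sketch of the same task, assuming the commonly described approach of testing candidate factors downward and replacing division with repeated subtraction, since the machine had no divide instruction; the original, of course, was written directly in machine code.

def highest_proper_factor(n):
    # Test candidates downward from n - 1, using repeated subtraction
    # in place of division, as the original machine had to do.
    candidate = n - 1
    while candidate > 0:
        remainder = n
        while remainder >= candidate:
            remainder -= candidate
        if remainder == 0:
            return candidate
        candidate -= 1

print(highest_proper_factor(2 ** 18))   # 131072

A modern laptop finishes this in a fraction of a second, which puts the Baby's 52 minutes into perspective.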

For decades after this groundbreaking event, computers were programmed with punch cards in
which holes denoted specific machine code instructions. Fortran, one of the very first higher-level
programming languages, was originally published in 1957. The next year, statistician John Tukey
coined the word “software” in an article about computer programming. Other pioneering
programming languages like Cobol, BASIC, Pascal and C arrived over the next two decades.

The Personal Computing Era

In the 1970s and 1980s, software hit the big time with the arrival of personal computers. Apple
released the Apple II, its revolutionary product, to the public in April 1977. VisiCalc, the first
spreadsheet software for personal computing, was wildly popular and known as the Apple II’s
killer app. The software was written in specialized assembly language and appeared in 1979.

Other companies like IBM soon entered the market with computers such as the IBM PC, which
first launched in 1981. The next year, Time magazine selected the personal computer as its Man
of the Year. Again, software for productivity and business dominated these early stages of
personal computing. Many significant software applications, including AutoCAD, Microsoft Word
and Microsoft Excel, were released in the mid-1980s.

Open-source software, another major innovation in the history of software development, first
entered the mainstream in the 1990s, driven mostly by the use of the internet. The Linux kernel,
which became the basis for the open-source Linux operating system, was released in 1991.
Interest in open-source software spiked in the late 1990s, after the 1998 publication of the source
code for the Netscape Navigator browser, mainly written in C and C++. Also noteworthy is the
release of Java by Sun Microsystems in 1995.

The Mobile Device

The world's very first mobile phone call was made on April 3, 1973. In 1993 IBM released the first
publicly available “smartphone”, and in 1996 Palm OS hit the market, bringing PDAs to the
masses. In 1999, RIM released the first BlackBerry 850 device and quickly became the
world's fastest-growing company. Then, in 2007, Apple changed computing with the release of
the iPhone. This is when mobile computing really found its place and mobile applications began
to explode. Mobile apps are now a major part of development using languages like Swift and
Java.

Software Development Today

Today, software has become ubiquitous, even in places that you might not expect it, from crock
pots to nuclear submarines. Some programming languages, like C and Cobol, have survived the
test of time and are still in use. Other languages, such as Java and Python, are somewhat
younger and have been used in countless software development projects. Still others, such as
Apple’s Swift programming language for iOS or the open-source Go language, are relatively new and exciting.

Activity 7: Finding Solutions


Answer the following questions on a separate short bond paper and be ready for submission.

1. What are the uses of utility programs?

2. Provide a tabulated comparison of application software and utility software.

3. How does CAPTCHA prevent data corruption and spam? Explain your answer.

LESSON 8: ORGANIZATIONAL LEARNING


LEARNING OBJECTIVES

• Know what organizational learning is


• Learn the types of organizational learning
WHAT IS ORGANIZATIONAL LEARNING?

Organizational learning is a learning process within organizations that involves the interaction of
individual and collective (group, organizational, and inter-organizational) levels of analysis and leads to
achieving organizations’ goals. In simple words, organizational learning is when an organization
continually improves itself through gaining knowledge and experience over time.

Organizational learning is not a one-man job; it is more effective when done in teams. It
requires systematic integration and collective interpretation of new knowledge that leads to collective
action and involves risk-taking through experimentation. Organizational learning is important for all companies,
as the creation, retention and transfer of knowledge within the organization will strengthen the
organization as a whole.

An idea or product is conceived, the company creates it, and then the company must reflect.
It is through this reflection on both process and outcome that learning occurs. When
organizations dedicate time and resources to developing a learning culture and implementing
organizational learning, they become more competitive. This is why learning from experience is so
important: an organization that retains more knowledge about best practices will be much better
able to adapt.

One enabler of organizational learning is a Learning Management System (LMS), an
incredibly valuable tool for training and for evaluating results. Basically, with a management system in place,
you can track who is learning what and how far they have progressed. An LMS allows goals
and metrics to be outlined so that learning can be handled just like any other project.
While LMSs were at one time used only by large companies, which had both the budget and the large
workforce that required a tool to help manage their learning process, nowadays learning management
systems are used by companies of all sizes as a way to monitor the success and impact of workforce
training.

Self-guided training technology can be combined with an LMS as well. While it tracks and trains
workers on new technology or information, it can internally grade their progress and proficiency and
present this information in whatever form is most useful. From this, you can build your metrics and
track the progress of learners very effectively. With these metrics, you can spot problems and balance
the pace without the annoyances you would otherwise contend with.

Organizational Learning Theory: The Three Types of Learning

Argyris and Schon (1996) identify three levels of learning which may be present in the organization:
 Single loop learning: Consists of one feedback loop when strategy is modified in response to an unexpected
result (error correction). E.g. when sales are down, marketing managers inquire into the cause, and tweak the
strategy to try to bring sales back on track.
 Double loop learning: Learning that results in a change in theory-in-use. The values, strategies, and
assumptions that govern action are changed to create a more efficient environment. In the above example,
managers might rethink the entire marketing or sales process so that there will be no (or fewer) such
fluctuations in the future.
 Deuterolearning: Learning about improving the learning system itself. This is composed of structural and
behavioral components which determine how learning takes place. Essentially deuterolearning is therefore
"learning how to learn."

ACTIVITY 8: SIMULATION EXERCISES (use short bond paper)


From organizational learning theory we can infer the following issues which may affect knowledge
management and knowledge management systems.

1. Prepare a proposal plan for one company that offers possible solutions to the different issues of concern.

LESSON 9: DEVELOPMENT OF PRODUCTIVE TEAMS


LEARNING OBJECTIVES

• Define what a productive team is


• Know what software and hardware mean in the context of productive teams
• Know tips for being productive

DEVELOPMENT OF PRODUCTIVE TEAMS

It's never easy to lead a team, whether it's made up of 10 or 100 members. Grouping
different types of people with different temperaments can often lead to confrontations and miscommunication,
and can hinder performance in the workplace; it can be maddening. Handled with a little tact,
however, a team can achieve great professional goals. Getting everyone on the same page is one thing,
but having them work together to accomplish a shared objective is no small feat.

Even if your group is already productive, there are still several practices you can adopt to bring efficiency in
the workplace to a whole new level. Before we jump to particular methods, let's get into the depth of
something that will be of real use to us. Let's first define efficiency and productivity.

Efficiency means a level of performance at which a system uses the lowest volume of input to
produce the highest volume of output.

Productivity, on the other hand, is a measure of production efficiency. It can be expressed as
the ratio of the outputs produced to the inputs used in the production process.
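As a small illustration of that ratio, the snippet below computes productivity as output divided by input; the figures and the units (features per person-hour) are assumptions chosen purely for the example.

# Illustrative figures only: productivity = output produced / input used.
features_delivered = 12      # output over the period
person_hours_spent = 480     # input over the same period

productivity = features_delivered / person_hours_spent
print(f"{productivity:.3f} features per person-hour")   # 0.025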

These two concepts are needed in order to attain something important. In this case, the
development of productive teams comes with two parts: the software and the hardware. Together they keep
an organization or company, and the people working in it, going.

SOFTWARE AND HARDWARE


It's a real challenge to stay productive individually in today's environment, and the same
applies to team productivity. There are many ways to increase the speed of development without
increasing the number of employees or the number of hours on the working day. But using the
latest, most imaginative tools won't make your team more productive. It's just one element of the
equation. Tools alone are not enough; what a team needs is a coherent system that allows each developer to
make the most of their days and, eventually, to achieve their goals as a team.
Moreover, there are many practical tips that can help anyone improve their skills and work habits
in order to be more productive.
Defining Productivity
If you have a warehouse, it's quite easy to calculate efficiency. Yet when it comes to your team,
you can't measure efficiency the way you would in a supply chain.
According to Charles Duhigg:
“Productivity is about making certain choices in certain ways. The way we choose to see ourselves and
frame daily decisions; the stories we tell ourselves, and the easy goals we ignore; the sense of
community we build among teammates; the creative cultures we establish as leaders. These are the
things that separate the merely busy from the genuinely productive.”
Productivity means getting the results you want with less time and effort. It's not about working harder
but about working more thoughtfully. Because your developers are not machines, it is much harder to measure
their performance. The productivity of knowledge workers cannot be measured in simple efficiency terms.
Quality is often more important than quantity, so the sheer amount of time spent on a project or the number
of lines of code written has little to do with how successful the developer really was.
TAKE A DATA-DRIVEN APPROACH TO INCREASE TEAM
PRODUCTIVITY
First, you need to see clearly where the majority of the members of your group spend their time.
This will give you an overview of the greatest time-wasters, and the most important ones you can begin to
address. You can ask your developers to guess how they spend their time or ask them to track their
working days for a couple of weeks.
This way, you can see which activities developers spend the most time on, and you can even attach a
financial value to it, seeing not only the wasted hours but also the money. It is recommended that such
surveys be performed anonymously, as this helps prevent bias in the data.
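A minimal sketch of this idea in Python: aggregate anonymised time-tracking entries per activity and attach an assumed hourly rate, so the biggest time sinks show up both in hours and in money. The sample entries and the rate are invented for illustration.

from collections import defaultdict

# Sample (anonymised) time-tracking entries: (activity, hours). Assumed data.
entries = [
    ("meetings", 6.0), ("coding", 22.0), ("meetings", 4.5),
    ("code review", 5.0), ("context switching", 7.5),
]
hourly_rate = 40.0   # assumed average cost per hour

totals = defaultdict(float)
for activity, hours in entries:
    totals[activity] += hours

# List activities from largest to smallest, in hours and in money.
for activity, hours in sorted(totals.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{activity:<18} {hours:5.1f} h   ${hours * hourly_rate:8.2f}")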
SEE THE BIG PICTURE: WHERE IS YOUR TEAM HEADING?
There is a need to express priorities clearly; otherwise there is ambiguity. Your team members
may be busy spinning their wheels without producing meaningful goal-oriented results.
A better plan means that your goals are more likely to be met, so setting goals is crucial to the
success of your team.
Defining targets: make sure that these goals are SMART and that the team makes its own estimates.
When programmers make the estimates themselves, they tend to be a little more efficient than in
situations where the manager sets the estimates without even consulting them. SMART stands for:
Specific
Your goal should be clear and specific, otherwise you will not be able to concentrate your
energies or be motivated to achieve it. Try to answer the five "W" questions when drawing up your goal:
o What do I want to accomplish?
o Which resources or limits are involved?
o Who is involved?
o Where is it located?
o Why is this goal important?
Specific answers the questions "what is to be achieved?" and "how are you going to know it is
finished?", and explains the outcomes of the work to be done (the end product). The definition is written in
such a way that anyone reading the objective is likely to interpret it in the same way. One way to
ensure an objective is specific is to make sure that the manner in which it is stated can be observed.
Observable means that someone can see or hear someone doing something (physically observe).
Measurable
Measurable goals are crucial, so you can track your progress and stay motivated. Assessing
progress helps you stay focused, reach your goals, and experience the thrill of getting closer to your
target.
Measurable answers the question, "how do you know it meets expectations?", and
describes the target using assessable terms (quantity, cost, duration, quality, deadlines, etc.). It refers to
the degree to which something can be measured against some norm. A quantitative goal uses
quantities, percentages, etc.
Attainable
In order to be successful, the goal always needs to be realistic and achievable. In other words,
it should stretch your skills but still remain possible. When you set an achievable goal, you can find
previously missed opportunities and resources that can bring you closer to it.
Achievable answers the questions, "Can the individual do it?", "Can the person accomplish the
achievable goal?", "Does he/she have the expertise, knowledge and ability to fulfill expectations?". It also
answers the question "Can the time frame, potential and assets be given?".
Relevant
This step is to ensure that your goal is important to you and aligns with other relevant goals as well. We
all need support and help in achieving our goals, but maintaining control over them is crucial. Make
sure your goals also move others forward, while you still retain the opportunity to accomplish your own goal.
Relevant answers the questions, "should it be done?", "why?", and "what will the effect be?".
Timely
Every goal needs a target date, so that you have a deadline to focus on and something to work towards.
This aspect of the SMART framework helps prevent everyday tasks from taking precedence over your
longer-term goals.
Time-oriented answers the question, "When is it going to be done?" It refers to the fact that an objective
includes endpoints and checkpoints. A project can sometimes have only an end point or a due date.
Sometimes the end point or due date is the real end of the task, or sometimes the end point of one task is
another starting point. Sometimes a project has several benchmarks or check points to assist you or
others in determining how well something is going before it is done so that changes or adjustments can
be made if necessary, to ensure that the end result meets expectations. Sometimes due dates and
deadlines exist simply to create a sense of urgency that helps an employee complete something.
GIVE GOOD FEEDBACK AND MOTIVATE THEM
Team leaders need to provide input to team members as frequently as possible, helping them
develop further and ensuring that they remain productive in order to achieve the goal of the team. Holding
weekly or bi-weekly sprints will help the team stay on the right track and provide information about the
project's current state, recommendations for better allocation of time and where to concentrate.
Also, studies show that a well-motivated software development team that manages its time well
can deliver ten times more than a non-motivated team. Once developers (and workers in general) are
inspired, productivity increases. Your team members need to feel like they're part of something important
and have a major impact on the project and industry. Giving developers opportunities to grow will help
them stay motivated. You need to figure out the best motivation for each of them, because every developer is
different. If there's a meaningful incentive involved, everyone works better.
PRAISE A JOB WELL DONE
Different things work to improve productivity and efficiency at work for different
employees, but for a large number of them it is something as simple as being recognized for their
contributions. Nothing can add to productivity if an employee thinks their effort is not being properly
acknowledged. Praising employees in front of the whole team can work wonders: rather than just a
digital congratulatory phrase, this public act of gratitude encourages others in the group to do their best.
In a company, it promotes a healthy work culture that further improves group performance.
KNOWING ONE’S STRENGTHS AND WEAKNESSES
Each person has certain hidden talents and abilities that can be put to good use. Therefore,
it becomes the responsibility of a leader or group to identify those talents and keep them in mind when
assigning tasks. Learning each member's skill set is the backbone of creating a competitive group. Making them
use their strengths will help make your workplace more productive and better than ever before.
Activity 9: Real Challenge. Thinking of an Illustrative Project
In the figure below, think about what could be the best idea for applying software and hardware concepts to team
productivity. List down the possible ideas for productivity development and explain your concept.

LESSON 10: MODELING OF HUMAN COMPETENCIES

LEARNING OBJECTIVES

• Define competency model


• Learn the benefits of the human competency model
• Know the types of human competency models
• Know the ways a competency model can be applied
• Learn the ways of developing a competency model
COMPUTER ENGINEER

Computer engineers combine expertise in software design and implementation with fundamental
engineering skills - a highly valuable skill set. They design and maintain websites, networks, massive
databases, and other applications, in the process developing and integrating new software and hardware.
Computer engineers specialize in the development of computer systems designed to carry out specific
functions in real-time, known as embedded systems, which operate airplanes, cell phones, vending
machines, medical equipment, etc. Their vast computing knowledge and problem-solving skills make
computer engineers an invaluable asset to industry. As individuals and corporations become more
dependent on computers, there is an abundant need for well-trained computer engineers.
MODELING OF HUMAN COMPETENCIES

A competency model is a guideline developed by a Human Resource department that sets out the
specific skills, knowledge and behavioral requirements that enable an employee to perform their job
successfully.

A job description and a competency model sound almost alike because they both seem to
describe what an employee is required to do in the job. The difference is that a job description is a general
summary of the skills required for a job, whereas a competency model provides specific behaviors that an
employee must do on the job in order to be successful.

BENEFITS OF HUMAN COMPETENCY MODEL

Here are some of the benefits of implementing the human competency model:

• Sets a concrete direction for workforce performance that aligns with organizational goals and
strategies.
• Enables HR to have a concrete understanding of all employee abilities and skills.
• Enables HR and Training to more accurately identify learning & development (L&D) needs.
• Allows employees to take ownership of the skills and behaviors required of them in their roles.
• Empowers organizations to keep track of the skills employees have, so that strategy and
planning can work towards the skills that may be needed in the future.
• Provides a consistent and fair system of measurement for performance evaluation.

TYPES OF HUMAN COMPETENCY MODEL

Competencies can be broken down into helpful categories to better understand the type of
information that might be included, such as:

1. Core competencies
Core competencies include the baseline skills required by the organization for all employees;
these are the basic things that employees must fulfill. This will vary from company to company, as it
depends on the values, philosophy and goals of each organization, but can include basic requirements
like communication skills or teamwork.
2. Functional competencies
Functional competencies are job-specific skills and behaviors that are unique for each role. For
example, a competency for a restaurant waiter may be the ability to effectively handle customer
complaints, where a competency for an accountant may be the ability to analyze a specific type of
financial data in order to prepare reports.
3. Leadership competencies

Leadership competencies are often used for supervisory and management-related roles, although
they can be applied to any job position that requires an employee to lead others. They include leadership skills
and behaviors like decision-making abilities.
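If these categories were captured in an HR system, a very simple data representation might look like the sketch below; the role and the competencies listed are illustrative assumptions, not a prescribed model.

# The role and competencies below are illustrative assumptions only.
competency_model = {
    "role": "Software Engineer",
    "core": ["communication", "teamwork"],
    "functional": ["designs and reviews code", "debugs production issues"],
    "leadership": ["mentors junior engineers", "guides technical decisions"],
}

# Print each category of the model in turn.
for category in ("core", "functional", "leadership"):
    print(category.title() + ":", ", ".join(competency_model[category]))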

APPLICATION OF COMPETENCY MODEL

Competency models are used for a variety of HR practices, including:

Recruitment
Fully developed competency models are often used for the development of job postings. When they are
well-defined and clear, organizations have a better chance of finding more closely matched candidates.

Talent/Performance Management

Defining what success should look like within the organization boils down to the performance of the
workforce; a competency model can define what performance success should look like for each role
within an organization. This benchmark helps HR to connect the function of each job with organizational
goals and also ensure that the talent of employees is developed.

Performance Appraisal

Competency models provide the framework needed to properly assess employees during a performance
review; both the employee and employer have a clearly defined list of behaviors and skills to work from.

WAYS IN DEVELOPING COMPETENCY MODEL

1. Determine what kind of process works for your organization


The research and development involved in creating well-defined competencies for a position can be
lengthy; it takes time to understand what is needed for each position. Due to today’s fast-paced and ever-
changing business environment, it could be beneficial for some organizations to have a shorter and more
intensive method of development. Competency models that are also designed as flexible can likewise
accommodate future changes.

2. Research available competency information

Developing competencies requires more than vague statements about what the job position will entail. In
addition, the functional competencies need to reflect what “great” performance should be, not just the baseline
skills for “acceptable” performance.

Previously developed competencies for similar roles should be identified and used as a guideline, as well
as related role documentation, background information and organizational core competencies.

3. Interview relevant business units and executives


Interviews with relevant stakeholders provide the insight needed for the role’s required competencies.
Executives can provide the key organizational core competencies needed for the role that reflect both the
values, philosophy and goals of the organization. Managers and high-performers from relevant business
departments can be interviewed to find out the key skills and behaviors that are necessary and successful
for those roles. When interviewing, the focus should be on what skills and behaviors make for a
top-performing employee in that role.

4. Establish the core competencies

The core competencies should reflect the baseline behaviors and skills required by the organization. How
should employees act and contribute as part of the organization so that they can integrate into the
company’s work culture and philosophy? Utilize the research and interview content from executives and
relevant organizational stakeholders.

5. Establish job-specific competencies


Job-specific competencies should reflect the unique role skills and behaviors as outlined by departmental
managers and top-performers at the research stage. What did these individuals need to know and do to
perform well in their role?

6. Establish leadership competencies, where needed

When management-related competencies need to be drafted, it should be assumed that the individuals
are already familiar with the core competencies set out by the organization.

The focus should be on unique leadership attributes and skills. These can be determined by the executive
and senior management level at the research stage.

7. Finalize the competency list

Organize the findings, but avoid being unrealistic when narrowing down competencies. If the list is
unrealistic, it could hinder recruitment initiatives and scare away potential applicants. If the list is too vague
or not specific enough, it could result in an influx of candidates who are not well suited to the position;
it will also not help employees to achieve organizational goals.

ACTIVITY 10. CAREER CHALLENGE


Put your answers on short bond paper and be ready for submission.
In modelling human competencies, prepare a matrix table of the different job classifications related to
computer engineering in terms of task, job specialization/skills, certifications, and knowledgeability. The jobs are as
follows:

Hardware Engineer
Simulation Project Engineer
Technical Support Engineer
Software Engineer
Network Admin Engineer

Note: Review for a while and be ready for the Final Exam.

Congratulations, you finished the module!
