
MODULE LIVING IN THE IT ERA – IT01

CHAPTER 1: INTRODUCTION TO INFORMATION


COMMUNICATION AND TECHNOLOGY

Objectives:
a) Discuss the importance of ICT in our daily lives.
b) Identify the advantages and disadvantages of ICT.
c) Discover the uses of ICT.

Lesson 1: Overview of ICT


Information and communications technology (ICT) is an extensional term for information
technology (IT) that stresses the role of unified communications and the integration of
telecommunications (telephone lines and wireless signals) and computers, as well as the necessary
enterprise software, middleware, storage, and audiovisual systems that enable users to access,
store, transmit, and manipulate information.
The term ICT is also used to refer to the convergence of audiovisual and telephone
networks with computer networks through a single cabling or link system. There are large
economic incentives to merge the telephone network with the computer network system using
a single unified system of cabling, signal distribution, and management. ICT is an umbrella term
that includes any communication device, encompassing radio, television, cell phones, computer
and network hardware, satellite systems and so on, as well as the various services and appliances
associated with them, such as video conferencing and distance learning.
ICT is a broad subject and the concepts are evolving. It covers any product that will store,
retrieve, manipulate, transmit, or receive information electronically in a digital form (e.g.,
personal computers, digital television, email, or robots). Theoretical differences between
interpersonal-communication technologies and mass-communication technologies have been
identified by the philosopher Piyush Mathur. The Skills Framework for the Information Age is one of
many models for describing and managing competencies for ICT professionals in the 21st
century.


The phrase "information and communication technologies" has been used by academic
researchers since the 1980s. The abbreviation "ICT" became popular after it was used in a report
to the UK government by Dennis Stevenson in 1997, and then in the revised National Curriculum
for England, Wales and Northern Ireland in 2000. However, in 2012, the Royal Society
recommended that the use of the term "ICT" should be discontinued in British schools "as it has
attracted too many negative connotations". Since 2014 the National Curriculum has used the
word "computing", which reflects the addition of computer programming to the curriculum.
Variations of the phrase have spread worldwide. The United Nations has created a
"United Nations Information and Communication Technologies Task Force" and an internal
"Office of Information and Communications Technology"

For more about what ICT is, please check the link provided:
https://www.youtube.com/watch?v=7Q67Poh7cGA&ab_channel=DilshanSoftLab

Lesson 2: Brief History of ICT


Recently it has become popular to broaden
the term to explicitly include the field of electronic
communication so that people tend to use the
abbreviation ICT (Information and
Communications Technology).

The term "information technology"


evolved in the 1970s. Its basic concept, however,
can be traced to the World War II alliance of the
military and industry in the development of
electronics, computers, and information theory. After the 1940s, the military remained the major
source of research and development funding for the expansion of automation to replace
manpower with machine power.
Since the 1950s, four generations of computers have evolved. Each generation reflected
a change to hardware of decreased size but increased capabilities to control computer
operations. The first generation used vacuum tubes, the second used transistors, the third used
integrated circuits, and the fourth used integrated circuits on a single computer chip. Advances
in artificial intelligence that will minimize the need for complex programming characterize the
fifth generation of computers, still in the experimental stage.


The first commercial computer was the UNIVAC I, developed by John Eckert and John W.
Mauchly in 1951. It was used by the Census Bureau to predict the outcome of the 1952
presidential election. For the next twenty-five years, mainframe computers were used in large
corporations to do calculations and manipulate large amounts of information stored in
databases. Supercomputers were used in science and engineering, for designing aircraft and
nuclear reactors, and for predicting worldwide weather patterns. Minicomputers came on to the
scene in the early 1980s in small businesses, manufacturing plants, and factories.
In 1975, the Massachusetts Institute of Technology developed microcomputers. In 1976, Tandy
Corporation's first Radio Shack microcomputer followed; the Apple microcomputer was
introduced in 1977. The market for microcomputers increased dramatically when IBM introduced
the first personal computer in the fall of 1981. Because of dramatic improvements in computer
components and manufacturing, personal computers today do more than the largest computers
of the mid-1960s at about a thousandth of the cost.
Computers today are divided into four categories by size, cost, and processing ability. They are
supercomputer, mainframe, minicomputer, and microcomputer, more commonly known as a
personal computer. Personal computer categories include desktop, network, laptop, and
handheld.
Current development
Every day, people use computers in new ways. Computers are increasingly affordable;
they continue to be more powerful as information-processing tools as well as easier to use.
Computers in Business:
One of the first and largest applications of computers is keeping and managing business
and financial records. Most large companies keep the employment records of all their workers in
large databases that are managed by computer programs. Similar programs and databases are
used in such business functions as billing customers; tracking payments received and payments
to be made; and tracking supplies needed and items produced, stored, shipped, and sold. In fact,
practically all the information companies need to do business involves the use of computers and
information technology.
On a smaller scale, many businesses have replaced cash registers with point-of-sale (POS)
terminals. These POS terminals not only print a sales receipt for the customer but also send
information to a computer database when each item is sold to maintain an inventory of items on
hand and items to be ordered. Computers have also become very important in modern factories.
Computer-controlled robots now do tasks that are hot, heavy, or hazardous. Robots are also used
to do routine, repetitive tasks in which boredom or fatigue can lead to poor quality work.
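
To make this flow concrete, the following is a minimal sketch, written in Python, of the inventory
update a POS terminal performs on each sale; the item names, prices, and reorder level are
hypothetical examples, not part of the module.

    # Minimal sketch of a point-of-sale (POS) update: each sale prints a receipt
    # line and adjusts the inventory record so stock on hand stays current.
    # Item names, prices, and the reorder level are hypothetical examples.
    inventory = {"milk": 40, "bread": 25}      # items currently on hand
    REORDER_LEVEL = 10                         # when to flag an item for reordering

    def sell(item, quantity, price):
        inventory[item] -= quantity            # update the stock record for this sale
        print(f"RECEIPT: {quantity} x {item} @ {price:.2f} each")
        if inventory[item] <= REORDER_LEVEL:   # low stock: flag the item to be ordered
            print(f"ORDER: restock {item} (only {inventory[item]} left)")

    sell("bread", 20, 2.50)                    # prints a receipt line and a restock notice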


Computers in Medicine:
Information technology plays an important role in medicine. For example, a scanner takes
a series of pictures of the body by means of computerized axial tomography (CAT) or magnetic
resonance imaging (MRI). A computer then combines the pictures to produce detailed three-
dimensional images of the body's organs. In addition, the MRI produces images that show
changes in body chemistry and blood flow.

Computers in Science and Engineering:


Using supercomputers, meteorologists predict future weather by using a combination of
observations of weather conditions from many sources, a mathematical representation of the
behavior of the atmosphere, and geographic data.
Computer-aided design and computer-aided manufacturing programs, often called CAD/CAM,
have led to improved products in many fields, especially where designs tend to be very detailed.
Computer programs make it possible for engineers to analyze designs of complex structures such
as power plants and space stations.
Integrated Information Systems:
With today's sophisticated hardware, software, and communications technologies, it is
often difficult to classify a system as belonging uniquely to one specific application program.
Organizations increasingly are consolidating their information needs into a single, integrated
information system. One example is SAP, a German software package that runs on mainframe
computers and provides an enterprise-wide solution for information technologies. It is a
powerful database that enables companies to organize all their data into a single database, then
choose only the program modules or tables they want. The freestanding modules are customized
to fit each customer's needs.

For more about the history of ICT, please check the link provided:
https://www.youtube.com/watch?v=UsOIbarTNmY&ab_channel=CreativeArt%27sOfLife


Lesson 3: ICT and Everyday Life


Here are some examples of situations where ICT is having an impact on our everyday lives:
1. Finance
Every time you use a debit or credit card the shop till uses a terminal connected to other
computers via a network. Your identification details are automatically transferred from your card
to your bank or credit card company for verification, and your balance adjusted accordingly. This
also applies if you are shopping online, or over the phone (when booking a cinema ticket, for
example). ATMs (also known as cashpoints) allow you to check your bank balance or withdraw
cash from wherever you are in the world. The machines are networked to a central computer,
which has records of your account in a filing system known as a database. Many banks also
provide banking services via the internet, minimizing the need for customers to visit a branch.
Financial services have undergone huge changes in recent years as a result of the development
of IT systems. This has led to the need for increased security procedures to combat new types of
fraud. It has also led to changes in many areas of commerce; for example, the role of travel agents
has changed as more people book their own holidays directly online.
Some types of business have disappeared completely as online and computer-based information
have taken their place. For example, you rarely see door-to-door insurance salesmen these days!
Similarly, new types of business have been created, such as online auctions like eBay. Existing
business types have been transformed through the use of IT systems, for example the
development of online booksellers such as Amazon.
2. The internet
As well as impacting on the commercial world, the internet has had an enormous impact
on all areas of life. While there are still people in many parts of the world who do not have access
to an internet connection, the majority of people in the developed world now have access either
at home or at work, and have the opportunity to use online information resources, or
communicate with others using email, instant messaging or discussion groups. New online
communities have developed and existing communities have created new ways of
communicating. However, issues of identity and security have become a concern. New
technologies have engendered new types of crime, including identity theft and financial frauds.
These problems have fostered the development of new security technologies.
The internet has become a major factor in enabling information sharing and has had a
huge impact on the availability of information of all kinds. Material on the internet reflects widely
differing viewpoints and sources: from official news bulletins to unofficial rumours, and from
commercial megastores to community portals. The internet has revolutionised the way

Page | 5
MODULE LIVING IN THE IT ERA – IT01

information can be published, raising questions about the authority and regulation of content.
Because of the way the internet has been designed, no individual government, company or
person has control over it.
3. Entertainment
The world of entertainment is constantly evolving with the advent of new technologies.
Digital broadcasting has changed the way we experience television, with more interactive
programming and participation. Digital cameras, printers and scanners have enabled more
people to experiment with image production. Computer gaming has been an important influence
in the development of graphical interfaces. Technology has been at the forefront of changes in
the production and distribution of music, as well as in the ways in which people can access and
listen to music.
4. Public services
In the UK, in many NHS trusts, patient records are easily shared between departments within a
hospital. These electronic patient records may soon be transferable across the whole health
service, so that medical staff can access them from any part of the NHS. In some places, especially
remote rural areas, doctors may be able to make use of computer networks to make a diagnosis
if they are unable to see the patient in person.
Passenger information is increasingly available via networked computers: for example train
timetables, information in stations and airports, real-time information over the internet.
Networked communication systems are also crucial in the control of transport systems, from
traffic lights and pedestrian crossings to air traffic control and train signals.

ICT Careers and Job Types


Business Analyst
Business analysts examine an organisation (or part of a business) to determine how to better
achieve goals. Almost always, there's a strong information technology component. That's
because IT is integral to modern business operations. For example, analysts may scope out the
potential effects of changing computer software. Analysts need to be adaptable because job
requirements vary from company to company. To become a business analyst, you’ll probably
need to obtain an entry-level position in the field and build a career from there. Business
education in addition to advanced IT training confers an advantage.
Job titles: business analyst, business and technology analyst, business development manager
(ICT/networking), ICT business analyst, IT continuity risk analyst, manager (business systems
maintenance), pre-sales customer technology strategist, reporting analyst, reporting and insights
specialist, senior data business analyst, senior digital reporting analyst, senior forecast analyst,
senior insights analyst, team leader (IT business systems).

Computer Service Technician


Computer service technicians (also referred to as computer repair technicians) repair computer
hardware and software. Some of the common tasks are replacing defective components,
removing spyware and viruses, disassembling hardware, and running diagnostic tests. If a job in
this field is your goal, start getting as much experience as you can in assembling and repairing
computers. CompTIA A+ certification is a helpful qualification. Also consider completing a
program at a tech school or college.
Job titles: CSI technician, computer service technician, field technician, ICT service technician, ICT
support technician, IT support technician, IT systems technician, onsite support technician.

Cyber Security Specialist


Cyber security specialists protect the security of computer systems and networks. They need
broad technical knowledge since security is an important consideration across most parts of a
modern computer system. An IT-related degree is normally required for cyber security specialist
jobs. Experience is critical for all but graduate or assistant positions, and certifications may give
you a strong advantage over other applicants. Cyber security specialists enjoy an excellent
average salary. Demonstrated expertise in a difficult field can place you in a commanding career
position.
Job titles: cyber security analyst, deputy director (operational cyber security), director (service
operations and security), ICT risk and security specialist, ICT security analyst, ICT security
specialist, information security manager, information security officer, IT security consultant, IT
security engineer, IT security operations officer, IT security operations specialist, IT security
specialist, security sales specialist (cyber security), senior systems officer (security).

Data Analyst
These professionals develop insight and gain information through the collection, analysis and
interpretation of data. They work for businesses and other types of organizations, identifying and
helping to solve problems. As a data analyst, you'll use programming and computer software
skills to complete statistical analysis of data. If you want to start a career as a data analyst, learn
some programming languages and get a bachelor's degree in Information Technology and Data
Analysis.
Job titles: academic data analyst, associate data analyst, data analyst, data analyst – digital, data
classification analyst, data quality analyst, digital data analyst, junior data analyst, marketing data
analyst, master data analyst, people data analyst, privacy and data protection senior analyst,
property data analyst, senior data analyst

Data scientist
A data scientist is in the same broad career stream as a data analyst (see above). Perhaps the
main difference is that data scientists are expected to use advanced programming skills more
routinely. They don't just gain insights from data, but also do things like building complex
behavioural models using big data. You can transition from being a data analyst to a data
scientist. A master's degree in data science is also a way to get into this line of work.
Job titles: data analyst / scientist, data engineer, data science consultant, data scientist, data
scientist – machine learning, director – data science, junior data analyst / scientist, lead data
scientist, lecturer – data science, senior data analyst / scientist

Database Administrator
Database administrators (DBAs) handle database security, integrity, and performance. They
ensure data standards are consistent, data is accessible by users as needed, and they solve any
problems encountered by users. These professionals might also be involved in database planning
and development. A degree in an IT-related field is usually required and it’s useful to
have programming experience. Experienced DBAs have strong applied knowledge of database
operating systems and technologies.
Job titles: database administrator, e-health systems administrator, ICT database administrator,
information management officer, senior information management specialist.

Database Analyst
Database analysts design, evaluate, review, and implement databases. In doing so, they organise
and analyse collected information. They’re often hired to update and maintain existing
databases. To gain employment in this field, you generally need a degree in computer science or
another IT field. Software development experience is also required for some jobs. Useful
strengths include data modelling, database query creation, and the PHP, HTML, CSS, JavaScript,
and SQL languages.
Job titles: asset knowledge systems analyst, data analyst and information manager, database
analyst, database coordinator/analyst.

Hardware Engineer
Hardware engineers (also referred to as computer hardware engineers) oversee the manufacture
and installation of computer systems, servers, circuit boards, and chips, as well as the testing of
equipment. They also work with routers, printers, and keyboards. People wanting a career in this
lucrative field require a degree in computer engineering. Depending on the employer, a degree
in electrical engineering or computer science might be an acceptable alternative. Creativity and
good communications skills are useful complements to technical skills.
Job titles: computer hardware engineer, hardware engineer, hardware test engineer, research
assistant/junior engineer.

IT consultant
IT consultants are professionals with significant IT experience and the confidence to find work by
competing for service contracts. While they’re often independent contractors, regular
employment is sometimes available with large manufacturers of software and computing
equipment; software and systems houses; and management consultancy firms. IT consultants
can find clients across most industries. You can choose to specialise in fields such as security,
software for a specific market, internet solutions, or web design.
Job titles: associate technical specialist, environmental management information systems (emis)
consultant, ICT contracts specialist, ICT project support officer, ICT security consultant, IT
consultant, Oracle application technical consultant, senior IT recruitment consultant, senior
technical specialist, senior technology specialist, test consultant.

IT manager
IT managers are responsible for the electronic networks and IT teams of organisations. They
ensure information system requirements are fulfilled. The job can be mainly supervisory at senior
levels within large organisations. For a small business, it can instead be very hands-on. IT managers
can work within organisations or as consultants doing discrete projects. Several years of
experience in the field is normally required to take on a senior role and you can benefit from
doing an IT management masters.
Job titles: chief technology manager, client delivery manager, ICT category manager, ICT
coordinator, ICT project manager, ICT program director, ICT procurement officer, ICT resource
officer, information and communication technology (ICT) officer, information technology
coordinator, IT administrator, IT manager, IT project administrator, project manager (information
systems), program director, technical operations manager.

Multimedia Developer
Multimedia developers are skilled in computer programming and visual artistry. They design
software and create multimedia applications by generating and manipulating animations, graphic
images, text, sound, and video. Some examples of applications include multimedia presentations,
educational and entertainment products, and computer-based interactive training. You might
consider this career if you're an IT graduate strong in visual arts. While a degree is useful, many
people also start work in the field with only a relevant certificate.
Job titles: digital content producer, eContent development specialist, multimedia coordinator,
multimedia developer, multimedia producer, multimedia specialist, software developer, web
producer.

Network Administrator
This professional manages and troubleshoots computer networks. The network administrator is
responsible for organising and maintaining computer systems. He or she is often at the highest
level of an organisation’s technical staff. To become a network administrator, you’ll need a
degree in an IT-related field. Employers also look for network-specific experience.
Specialised certification in network administration might also be required. Most professionals in
this area complete high-level training in specific hardware or software used in the network.
Job titles: ICT network and systems administrator, network administrator, network and systems
administrator, network infrastructure administrator.

Network Engineer
Network engineers design and set up networks. Duties may include placing physical equipment,
setting up electronic equipment needed to activate equipment, and determining the appropriate
antenna to ensure the best possible coverage. A career in this field frequently requires a
computer science or closely related degree. Specialised certification is worth pursuing as it gives
you an advantage in job search. Network engineers enjoy impressive salaries.
Job titles: ICT network and systems engineer, network engineer, network project specialist,
senior network engineer.

Computer programmer
While software developers design applications, it’s programmers who write the code needed for
programs to function. Programmers also test software and update existing software. Many are
employed by software companies. Necessary soft skills include problem solving, reading
comprehension, active listening, attention to detail, and critical thinking. You might consider
entering this field if you enjoy working with code for extended periods and testing the power of
programming languages. As experience is an important asset, it’s helpful to do an internship or
gain other hands-on experience while completing your formal education.
Job titles: digital back end developer, game programmer, graduate analyst / programmer,
machine programmer, programmer, SAS programmer, senior analyst programmer, SQL
programmer, test consultant, UI programmer.

Software Analyst
Software analysts bring software solutions to the people. They are the ones who connect the
work of software developers to the use of software in the workplace. They help organisations
develop software solutions to fit their needs. To succeed in this field, you should be strong at
both computer programming and dealing with people. Many software analyst jobs require a
degree in computer science or a related discipline. Some employers might additionally ask for
expertise in the industry (such as finance or healthcare). A related role to software analysis is ICT
software sales.
Job titles: enterprise solution architect, ICT sales representative, implementation analyst, lead
application analyst, national applications specialist, research intelligence analyst, senior
application analyst, software analyst.


Software Developer
Employers may use the term “software developer” interchangeably with “software engineer”.
However, be aware that a “software engineering” job might specifically require you to apply
engineering principles to software creation. Professionals in software development create and
build out software. They provide detailed instructions and guidelines for the programmers who
write the code. Occasionally, developers will code themselves. A bachelor’s degree is required
for most positions in this field, which produces excellent salaries.
Job titles: applications support engineer, data visualisation developer, enterprise reporting and
ETL developer, ICT applications development specialist, ICT engineer, ICT senior drupal
developer, python developer, python/integration developer, senior software engineer, senior
user experience designer, software application integrator, software developer, software
engineer, team leader (applications support), technical lead (applications delivery).

Systems Administrator
Systems administrators (or managers) configure, maintain, and ensure the continued reliability
of computer systems. They mostly deal with multi-user computers, including servers. An
organisation’s system administrator manages IT infrastructure, including servers and network
equipment. The role is essential to the successful operation of any company with a computer
system. A degree in a field such as information technology or computer science is often required
for administrator positions.
Job titles: client services and information officer, ICT network and systems administrator, ICT
systems administrator, ICT systems manager, information and user support officer, linux systems
administrator, people systems administrator, senior Windows system administrator, software
administrator, systems administrator, system administration support officer, systems operation
manager.

Systems Analyst
Systems analysts use their expertise to introduce computer systems, or to modify existing
systems as a way to boost technical efficiency and business productivity. For a given job, the
starting point may be to assess the client’s system requirements. You then formulate solutions
based on the latest technologies and considering the budget constraint. A computer science,
information management systems, or other IT-related degree is necessary to make you
competitive in this field. You also need relevant work experience, as well as programming
knowledge and project management skills.

Job titles: applications support analyst, asset knowledge systems analyst, cluster IT specialist,
eServices systems team lead, ICT procurement sourcing analyst, ICT support analyst, ICT
systems/data support analyst, ICT systems trainer, incident response analyst, infra support
analyst, senior spatial information team leader (IT business systems), support analyst, technical
analyst officer, systems analyst.

Systems Engineer
Systems engineers design, set up and manage computer systems. They often work closely with
programmers, administrators and engineers. These professionals not only develop and test but
also evaluate personal computers, circuits, software, and other system elements. If you want to
become a systems engineer, you’ll probably need a computer science, information technology,
or engineering degree. You’ll also need to develop excellent communication and organisation
management skills.
Job titles: control systems engineer, ICT network and systems engineer, ICT systems engineer,
senior/principal ICT engineer, senior systems engineer, systems administration field support
engineer, systems developer (database applications), systems engineer, senior support engineer.

Tech Support
Tech support workers (help desk technicians) give essential technical support and
troubleshooting services to end-users. In-house technicians provide support exclusively for
employees of the company, while remote help desk technicians provide technical support to
customers (mostly online). The job requires a strong understanding of software and computer
hardware, and excellent communication skills. The role is generally considered entry-level where
you provide customer service directly (and doesn’t necessarily require an IT degree). Senior
positions are also available where you organise and manage support teams and/or systems.
Job titles: desktop administrator, ICT helpdesk technician, ICT on-site support engineer, ICT
service desk officer, ICT service support officer, ICT support officer, IT service desk analyst,
desktop support technician, field service technician, field tech coordinator, IT support specialist,
school technical officer, tech support, technical support officer, technology support officer.

Web Developer
Web developers design and establish websites. They are skilled in both programming and the
design of pages, navigation and user interfaces. Knowledge of search engine optimisation
techniques is often important. Some jobs in this field require a bachelor’s degree in a relevant
field, but all demand experience (which can be easily gained by creating a website). Web
developers can find work in a variety of different workplaces since many different types of
organisations need a strong web presence.
Job titles: e-learning specialist, freelance web designer, frontend web developer, junior web
designer, python developer, python integration developer, quality assurance technician,
responsive web developer, SEM assistant, web designer, web developer, web developer
internship.

For more about ICT careers, please check the link provided:
https://www.youtube.com/watch?v=rQI1GWgCZew

REFERENCES

https://en.wikipedia.org/wiki/Information_and_communications_technology#:~:text=Information%20and%20communications%20technology%20(ICT,enterprise%20software%2C%20middleware%2C%20storage%20and
https://wiki.nus.edu.sg/display/cs1105groupreports/History+of+ICT#:~:text=The%20term%20%22information%20technology%22%20evolved,%2C%20computers%2C%20and%20information%20theory
https://www.open.edu/openlearn/ocw/mod/oucontent/view.php?id=2846&printable=1
https://mallory.com.au/information-technology-jobs-descriptions/


CHAPTER 2: SOFTWARE

Objectives:
a) Identify the different types of software.
b) Discuss the differences between applications and programs.
c) Discover the major functions of operating system.

Lesson 1: Software Overview


Software
Software, which is abbreviated as SW or S/W, is a set of programs that enables the hardware to
perform a specific task. All the programs that run the computer are software. Software can be of
three types: system software, application software, and programming software.

1) System Software
The system software is the main software that runs the computer. When you turn on the
computer, it activates the hardware and controls and coordinates their functioning. The
application programs are also controlled by system software. An operating system is an example
of system software.
2) Application Software:
Application software is a set of programs designed to perform a specific task. It does not control
the working of a computer as it is designed for end-users. A computer can run without application
software. Application software can be easily installed or uninstalled as required. It can be a single
program or a collection of small programs. Microsoft Office Suite, Adobe Photoshop, and any
other software like payroll software or income tax software are application software. As we
know, they are designed to perform specific tasks. Accordingly, they can be of different types
such as:


Word Processing Software: This software allows users to create, edit, format, and
manipulate text and more. It offers many options for writing documents, creating
images, and more. Examples include MS Word, WordPad, and Notepad.
Spreadsheet Software: It is designed to perform calculations, store data, create charts,
etc. It has rows and columns, and the data is entered in the cell, which is an intersection
of a row and column, e.g., Microsoft Excel.
Multimedia Software: This software is developed for editing video, audio, and text. It
allows you to combine text, video, audio, and images. Thus, you can improve a text
document by adding photos, animations, graphics, and charts through multimedia
software. For example, VLC media player, Windows Media Player, etc.
Enterprise Software: This software is developed for business operational functions. It is
used in large organizations where the volume of business is very large. It can be used
for accounting, billing, order processing, and more. Examples include CRM (Customer
Relationship Management), BI (Business Intelligence), ERP (Enterprise Resource
Planning), SCM (Supply Chain Management), and customer support systems.
3) Programming Software:
It is a set or collection of tools that help developers in writing other software or programs. It
assists them in creating, debugging, and maintaining software or programs or applications. We
can say that these are facilitator programs that help translate programming languages such as
Java, C++, and Python into machine language code. So, such software is not used by end-users. For example,
compilers, linkers, debuggers, interpreters, text editors, etc. This software is also called a
programming tool or software development tool.
Some examples of programming software include:
Eclipse: It is a Java language editor.
Coda: It is a programming language editor for Mac.
Notepad++: It is an open-source editor for Windows.
Sublime Text: It is a cross-platform code editor for Linux, Mac, and Windows.
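
As a small illustration of the translation such tools perform, the sketch below (in Python) uses the
language's built-in compiler and the dis module to turn a line of source text into lower-level
bytecode instructions. This is only an analogy for the compilation described above: a C compiler
would produce machine code rather than Python bytecode.

    # Minimal sketch: programming software translating source text into
    # lower-level instructions. Python's built-in compiler produces bytecode,
    # standing in for the machine-code translation a C compiler would perform.
    import dis

    source = "total = 2 + 3\nprint(total)"
    code_object = compile(source, "<example>", "exec")  # source text -> code object
    dis.dis(code_object)                                # list the translated instructions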

For more about software, please check the link provided:
https://www.youtube.com/watch?v=MSA3WsGeTNI&ab_channel=IAmDevGrant


Lesson 2: Program
What is a program?
A computer program is a set of instructions, and the term can be used as a verb as well as a
noun. As a verb, it refers to the process of creating a software program using a programming
language. As a noun, it refers to an application, program, or application software used to perform
a specific task on the computer. For example, Microsoft PowerPoint is an application that
provides a way to create documents related to presentations. Likewise, a browser is an
application that allows us to browse any website.
Difference between Applications and programs
Every application can be called a program, but not every program is an application. An application
is a collection of programs designed to help end-users achieve a purpose. These programs
communicate with each other to perform tasks or activities. An application cannot exist without
programs, and it functions to carry out end-user commands. A program, on the other hand, is a
collection of instructions that tells the computer what task to perform.
What is the purpose of a program?
A program enables the computer to perform a particular operation. Without application
software (programs), a computer can still operate using the operating system, but it cannot
perform any specific task. For example, if you want to create a Word document, you have to
install Microsoft Word on your computer. It is a program or application software that instructs
the computer how to create, edit, and save a document or a file.
Basic functions of a program
The function of a program depends upon the type of program. For example, the function of the
Microsoft Excel program is to create, edit, and view documents related to calculation and data
analysis, while the function of an internet browser is to find information on the World Wide Web
and display it on the screen. Basically, a program is designed to execute a particular task or
function. For example, an Excel program is able to create a document, but it cannot find
information on the World Wide Web like a browser can.
What was the first program?
Tom Kilburn wrote the first software program to be held in electronic memory. It was successfully
executed at the University of Manchester, England, on 21 June 1948. The program computed the
highest proper factor of 2^18 = 262,144. The computer was called the Small-Scale Experimental
Machine (SSEM), also known as the Manchester Baby. This occurrence is considered the birth of
the first software.
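
As an illustration, a present-day re-creation of that calculation might look like the short Python
sketch below. It finds the highest proper factor of 2^18 by counting downward with trial division;
this conveys the idea of the computation but is not the Baby's original, deliberately slow
subtraction-based routine.

    # Minimal sketch of the Manchester Baby's task: find the highest proper
    # factor of 2**18 = 262,144. Illustrative only; the 1948 program used a
    # repeated-subtraction method rather than the division used here.
    n = 2 ** 18             # 262,144
    factor = n - 1
    while n % factor != 0:  # count down until a number divides n exactly
        factor -= 1
    print(factor)           # 131072, i.e. 2**17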


Examples of computer programs


Today, there are various types of programs available for mobile phones, computers, and other
devices. The list below gives some examples of programs, each with its category and a brief
description.

Google Chrome (Internet browser): An internet browser introduced by Google on 11 December
2008. It is used to retrieve information available on the World Wide Web and display it on the
device screen. It provides various features to help users, such as tabbed browsing, synchronization
with Google services and accounts, spell check, and automatic translation of web pages.
Additionally, it has a search bar, or Omnibox, which allows users to search for any query.

C (Programming language): A general-purpose programming language used to develop software.
It was released in 1972 after being developed at Bell Labs by Dennis Ritchie. It is widely used for
writing complex programs such as Python, Git, the Oracle database, etc. Its features include
simplicity and efficiency, portability, a rich library, extensibility, and high speed.

Skype (Chat and VoIP): A program that allows users to chat and make VoIP (Voice over Internet
Protocol) calls anywhere in the world. A Skype user can call another Skype user anywhere in the
world for free.

Adobe Photoshop (Photo editor): An image editing program that runs on macOS or Windows
computers. It supports many file formats, such as JPEG, Targa, GIF, BMP, and HEIF. It provides
users with many tools to create, edit, and enhance the quality of an image, including real-life
painting and creating an animated GIF from an image or short video files.

Microsoft Word (Word processor): A word processor program developed by Charles Simonyi and
Richard Brodie and published by Microsoft. It was introduced on 25 October 1983. You can use
the Word program on Microsoft Windows, Android, Apple iOS, and Apple macOS. Furthermore,
it can also be run on the Linux OS with the help of WINE.

FileZilla (FTP): An open-source software program that allows users to transfer files from a local
computer to a remote computer. It is available as a client version as well as a server version. Its
features include a transfer queue, a site manager, file and folder views, and directory comparison.

Microsoft Excel (Spreadsheet): A software program that provides a spreadsheet to create
documents related to calculation, data analysis, and more. It was released by Microsoft on
30 September 1985; while it was in the development phase, its code name was Odyssey. If you
want to create a monthly budget report, salary sheet, bill order, and more, you can use the
Microsoft Excel program.

Microsoft PowerPoint (Presentation): A part of Microsoft Office that is bundled with Microsoft
Word and Excel. It is used to create a presentation made up of different types of slides and is
widely used in school and business presentations. For example, if you want to create a
presentation of your document to show at your college or at any organization, you can use the
Microsoft PowerPoint program.

Mozilla Thunderbird (E-mail client): An open-source e-mail client that allows users to send,
receive, and manage their e-mail on Microsoft Windows, Linux, macOS, and other supported
systems. It gives users the option to retrieve e-mail from their e-mail provider with the help of
IMAP or POP3, and users can send e-mail using the Simple Mail Transfer Protocol (SMTP).

Norton Anti-Virus (Antivirus): An anti-virus software product developed for computer security by
Symantec Corporation in 1991. It uses heuristics and signatures to detect viruses. Furthermore,
it is distributed by Symantec as a download, a boxed copy, and OEM software.

Audacity (Audio software): An open-source software program that enables users to record and
edit sound clips. It can run on the macOS, Linux, and Windows operating systems and is available
for free under the General Public License (GPL).

Adobe Acrobat (PDF reader): An application introduced by Adobe that is used to create, view,
manage, print, and manipulate files in PDF (Portable Document Format).

CommCentral (Fax/Voice/Phone): A program that enables users to receive faxes, including
voicemail, on their personal computers.

Adobe Dreamweaver (HTML editor): A software program used to design web pages, released by
Macromedia in 1997. It is a full-fledged HTML and programming editor that offers users a
WYSIWYG (what you see is what you get) interface to create and edit web pages. It supports
HTML, CSS, JavaScript, and XML, as well as human languages such as English, French, Spanish,
Chinese, Japanese, and Russian.

For more about programs, please check the link provided:
https://www.youtube.com/watch?v=taO7nfcoCT0&ab_channel=TheTechTrain

Lesson 3: Operating System


Operating System
As the name suggests, an operating system is a type of software
without which you cannot operate or run a computer. It acts as
an intermediary or translation system between computer
hardware and application programs installed on the computer. In
other words, you cannot directly use computer programs with
computer hardware without having a medium to establish a
connection between them.
Besides this, it is also an intermediary between the computer user
and the computer hardware as it provides a standard user
interface that you see on your computer screen after you switch on your computer. For example,
the Windows and the Mac OS are also operating systems that provide a graphical interface with
icons and pictures to enable users to access multiple files and applications simultaneously.
So, although the operating system is itself a program or software, it allows users to run other
programs or applications on the system. We can say that it works behind the scenes to run your
computer.
Major Functions of Operating System:
o Memory management: It manages both the primary and secondary memory such as
RAM, ROM, hard disk, pen drive, etc. It checks and decides the allocations and
deallocation of memory space to different processes. When a user interacts with the
system and the CPU needs to perform read or write operations, the OS decides the
amount of memory to be allocated for loading the program instructions and data into
RAM. After the program terminates, that memory area is freed and is ready to be
allocated to other programs by the OS.
o Processor Management: It facilitates processor management, where it decides the order
for the processes to access the processor as well as decides the processing time to be
allocated for each process. Besides this, it monitors the status of processes, frees the
processor when a process has finished executing, and then allocates it to a new process.
o Device/ hardware management: The operating system also contains drivers to manage
devices. A driver is a type of translation software that allows the operating system to
communicate with devices, and there are different drivers for different devices as each
device speaks a different language.
o Run software applications: It offers the environment to run or use software applications
developed to perform specific tasks, for example, Ms Word, Ms Excel, Photoshop, etc.
o Data management: It helps in data management by offering and displaying directories
for data management. You can view and manipulate files, folders, e.g., you can move,
copy, name, or rename, delete a file or a folder.
o Evaluates the system's health: It gives us an idea about the performance of the hardware
of the system. For example, you can see how busy the CPU is, how fast the data is
retrieved from the hard disk, etc.
o Provides user interface: It acts as an interface between the user and the hardware. It can
be a GUI where you can see and click elements on the screen to perform various tasks. It
enables you to communicate with the computer even without knowing the computer's
language.


o I/O management: It manages the input output devices and makes the I/O process smooth
and effective. For example, it receives the input provided by the user through an input
device and stores it in the main memory. Then it directs the CPU to process this input and
accordingly provides the output through an output device such as a monitor.
o Security: It has a security module to protect the data or information stored in the
memories of the computer against malware and unauthorized access. Thus, it not only
manages your data but also helps to protect it.
o Time Management: It helps the CPU in time management. The OS kernel keeps checking the
frequency of processes that request CPU time. When two or more processes that are
equally important compete for the CPU time, then the CPU time is sliced into segments
and allocated to these processes in a round-robin fashion to prevent a single process from
monopolizing the CPU.
o Deadlock Prevention: Sometimes a resource that is supposed to be shared by two or
more processes is held by one process, due to which the other processes cannot continue.
This situation is known as deadlock. The OS prevents this situation from arising by carefully
distributing the resources among the different processes (see the sketch after this list).
o Interrupt Handling: OS also responds to interrupts, which are signals generated by a
program or a device to seek the attention of the CPU. The OS checks the priority of the
interrupt, and if it is more important than the currently running process, it stops the
execution of the current process, preserves the CPU's state, and then executes the
requested process. Afterwards, the CPU returns to the state where it was stopped.
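
The sketch below, referred to in the Deadlock Prevention point above, is a minimal Python
illustration (with hypothetical process and resource names) of the circular wait that defines a
deadlock: each process holds a resource the other is waiting for, so neither can continue.

    # Minimal sketch of a deadlock's circular wait: P1 holds R1 and wants R2,
    # while P2 holds R2 and wants R1. Names are hypothetical examples.
    holds = {"P1": "R1", "P2": "R2"}   # resource currently held by each process
    wants = {"P1": "R2", "P2": "R1"}   # resource each process is waiting for

    def is_deadlocked(process):
        """Follow the wait-for chain; revisiting a process means a circular wait."""
        seen = set()
        while True:
            wanted = wants.get(process)
            owner = next((p for p, r in holds.items() if r == wanted), None)
            if owner is None:
                return False           # the wanted resource is free, so no deadlock
            if owner in seen:
                return True            # chain looped back: circular wait (deadlock)
            seen.add(owner)
            process = owner

    print(is_deadlocked("P1"))         # True - each process waits on the other forever

An operating system prevents this by controlling how resources are granted so that such a cycle
never forms.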
Types of Operating System:
1) Batch Processing Operating System:


The interaction between a user and the computer does not occur in this system. The user is
required to prepare jobs on punch cards in the form of batches and submit them to the computer
operator. The computer operator sorts the jobs or programs and keeps similar programs or jobs
in the same batch and run as a group to speed up processing. It is designed to execute one job at
a time. Jobs are processed on a first-come, first-serve basis, i.e., in the order of their submission
without any human intervention.
For example, the credit card bill generated by banks is an example of batch processing. A separate
bill is not generated for each credit card purchase, rather a single bill that includes all purchases
in a month is generated through batch processing. The bill details are collected and held as a
batch, and then it is processed to generate the bill at the end of the billing cycle. Similarly, in a
payroll system, the salaries of employees of the company are calculated and generated through
the batch processing system at the end of each month.
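
The short Python sketch below (with made-up job names) illustrates the first-come, first-served
idea: submitted jobs wait in a queue and are executed one at a time, in the order of submission,
with no user interaction in between.

    # Minimal sketch of batch processing: jobs queue up in submission order and
    # are executed one at a time with no human intervention. Job names are made up.
    from collections import deque

    batch = deque(["payroll run", "credit-card billing", "sales report"])

    while batch:
        job = batch.popleft()          # take the oldest submitted job first
        print(f"running: {job}")
        print(f"finished: {job}")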
Advantages of Batch processing operating system:
o Repeated jobs can be completed easily without any human intervention
o Hardware or system support is not required to input data in batch systems
o It can work offline, so it causes less stress on the processor as it knows which task to
process next and how long the task will last.
o It can be shared among multiple users.
o You can set the timing of batch jobs so that when the computer is not busy, it can start
processing the batch jobs such as at night or any other free time.
Disadvantages of batch processing operating systems:
o You need to train the computer operators for using the batch system.
o It is not easy to debug this system.
o If any error occurs in one job, the other jobs may have to wait for an uncertain time.
2) Time Sharing Operating System:


As the name suggests, it enables multiple users located at different terminals to use a computer
system and to share the processor's time simultaneously. In other words, each task gets time to
get executed, and thus all tasks are executed smoothly.
Each user gets a share of the processor's time, as if they were using a single dedicated system.
The duration of time allocated to a task is called a quantum or time slice; when this duration is
over, the OS starts the next task.
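
A minimal Python sketch of this time-slicing idea follows; the task names, remaining run times,
and quantum of 2 time units are made-up values used only to show how the OS cycles through
the queue.

    # Minimal sketch of time sharing: each task runs for at most one quantum
    # (time slice); unfinished tasks go to the back of the queue and the OS
    # cycles until all work is done. Task names and durations are made up.
    from collections import deque

    QUANTUM = 2                                  # time units per turn
    tasks = deque([("editor", 5), ("browser", 3), ("compiler", 4)])

    clock = 0
    while tasks:
        name, remaining = tasks.popleft()
        run = min(QUANTUM, remaining)            # run for one slice or until done
        clock += run
        remaining -= run
        print(f"t={clock}: {name} ran {run} unit(s), {remaining} left")
        if remaining > 0:
            tasks.append((name, remaining))      # not finished: back of the queue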
Advantages of time sharing operating system:
o It reduces CPU idle time and thus makes it more productive.
o Each process gets the chance to use the CPU.
o It allows different applications to run simultaneously.
Disadvantages of time sharing operating system:
o It requires a special operating system as it consumes more resources.
o Switching between tasks may hang up the system as it serves lots of users and runs lots
of applications at the same time, so it requires hardware with high specifications.
o It is less reliable.
3) Distributed Operating System:

It uses or runs on multiple independent processors (CPUs) to serve multiple users and multiple
real-time applications. The communication between processors is established through many
communication lines such as telephone lines and high-speed buses. The processors may differ
from each other in terms of size and function.


The availability of powerful microprocessors and advanced communication technology has
made it possible to design, develop, and use the distributed operating system. Besides this, it is
an extension of the network operating system that supports a high level of communication and
integration of machines on the network.
Advantages of distributed operating system:
o Its performance is higher than a single system as resources are being shared.
o If one system stops working, malfunctions, or breaks down, other nodes are not affected.
o Additional resources can be added easily.
o Shared access to resources like printers can be established.
o Delay in processing is reduced to a greater extent.
o Data sharing or exchange speed is high, owing to the use of electronic mail.
Disadvantages of distributed operating system:
o Security issue may arise due to sharing of resources
o Few messages may be lost in the system
o Higher bandwidth is required in case of handling a large amount of data
o Overloading issue may arise
o The performance may be low
o The languages which are used to set up a distributed system are not well defined yet
o They are very costly, so they are not easily available.
4) Network Operating System:
As the name suggests, this OS connects computers and devices
to a local area network and manages network resources. The
software in a NOS enables the devices of the network to share
resources and communicate with each other. It runs on a server
and allows shared access to printers, files, applications,
and other networking resources and functions over a LAN.
Besides this, all users in the network are aware of each other's
underlying configuration and individual connections. Examples:
Ms Windows Server 2003 and 2008, Linux, UNIX, Novell
NetWare, Mac OS X, etc.


Advantages of network operating system:


o The servers are centralized and can be accessed remotely from distant locations and
different systems.
o It is easy to integrate advanced and recent technologies and hardware in this system.
Disadvantages of network operating system:
o The servers used in the system may be expensive.
o The system depends on the central location and requires regular monitoring and
maintenance.
5) Real-Time Operating System:
It is developed for real-time applications where data
should be processed in a fixed, small duration of time. It
is used in an environment where multiple processes are
supposed to be accepted and processed in a short time.
RTOS requires quick input and immediate response, e.g.,
in a petroleum refinery, if the temperature gets too high
and crosses the threshold value, there should be an
immediate response to this situation to avoid an
explosion. Similarly, this system is used to control
scientific instruments, missile launch systems, traffic
lights control systems, air traffic control systems, etc.
This system is further divided into two types based on the time constraints:
Hard Real-Time Systems:
These are used for the applications where timing is critical or response time is a major factor;
even a delay of a fraction of the second can result in a disaster. For example, airbags and
automatic parachutes that open instantly in case of an accident. Besides this, these systems lack
virtual memory.
Soft Real-Time Systems:
These are used for applications where timing or response time is less critical. Here, the failure to
meet the deadline may result in a degraded performance instead of a disaster. For example, video
surveillance (cctv), video player, virtual reality, etc. Here, the deadlines are not critical for every
task every time.
Advantages of real-time operating system:
o Output is higher and quicker owing to the maximum utilization of devices and the system


o Task shifting is very quick, e.g., 3 microseconds, due to which it seems that several tasks
are executed simultaneously
o Gives more importance to the currently running applications than the queued application
o It can be used in embedded systems like in transport and others.
o It is free of errors.
o Memory is allocated appropriately.
Disadvantages of real-time operating system:
o A fewer number of tasks can run simultaneously to avoid errors.
o It is not easy for a designer to write complex and difficult algorithms or proficient
programs required to get the desired output.
o Specific drivers and interrupt signals are required to respond to interrupts quickly.
o It may be very expensive due to the involvement of the resources required to work.
Generations of Operating System:
The first generation (1945 to 1955):
It was the time before the Second World War when the digital computer was not developed, and
there were calculating engines with mechanical relays at this point in time. Later mechanical
relays were replaced by vacuum tubes, as relays were very slow. But the performance issue was
not resolved even with vacuum tubes; besides, these machines were too bulky and large, as they
were made of tens of thousands of vacuum tubes.
Furthermore, each of the machines was designed, programmed, and maintained by a single
group of people. Programming languages and operating systems were not yet known, and absolute machine language was used for programming.
These systems were designed for numerical calculations. The programmer was required to sign up for a block of time and then insert a plugboard into the computer. In the 1950s, punch cards were introduced, which improved computer performance: programmers could write programs on punch cards and read them into the system, while the rest of the procedure remained the same.
The second generation (1955 to 1965):
This generation started with the introduction of transistors in the mid-1950s. The use of
transistors made the computers more reliable, and they began to be sold to customers. These
machines were called mainframes. Only big organizations and government corporations could afford them. With these machines, the programmer was required to write the program on paper and then punch it onto cards. The cards were taken to the input room and handed over to an operator to get the output. The printer provided the output, which was taken to the output room. These steps made it a time-consuming task, so the batch system was adopted to address this issue.
In a batch system, the tasks were collected in a tray in the form of batches in the input room and
read onto a magnetic tape, which was taken to the machine room, where it was mounted on a
tape drive. Then, using a special program, the operator would read the first task or job from the tape and run it, and the output was generated onto a second tape. The OS automatically read the next job from the tape, and jobs were completed one by one. After the completion of the batch,
the input and output tapes were taken off, and the next batch was started. The printouts were
taken from the output tape. It was mainly used for engineering and scientific calculations. The
first operating systems used in this generation of computers were FMS (the Fortran Monitor System) and IBSYS, and FORTRAN was used as a high-level language.
The third generation (1965 to 1979):
This generation began with the introduction of the IBM System/360 family of computers in 1964. In this generation, transistors were replaced by silicon chips (integrated circuits), and operating systems were developed for multiprogramming; some of them even supported batch processing, time sharing, and real-time processing at the same time.
The fourth-generation operating system (1979 to Present):
This generation of OS started with the introduction of personal computers and workstations.
Chips that contain thousands of transistors were introduced in this generation that made possible
the development of personal computers that supported the growth of networks and thus the
development of network operating systems and distributed operating systems. DOS, Linux, and
Windows operating systems are a few examples of OS of this generation.

For more knowledge about operating systems, please check the link provided:
https://www.youtube.com/watch?v=rWp3dSrWx-w&ab_channel=ClayDeskE-Learning

REFERENCES

https://www.javatpoint.com/software
https://www.javatpoint.com/program
https://www.javatpoint.com/operating-system


CHAPTER 3: COMPUTER HARDWARE

Objectives:
a) Identify the commonly used hardware in a computer.
b) Discuss the input and output devices.
c) Discover the components of a Central Processing Unit.

Lesson 1: Hardware Overview


Hardware
Hardware, which is abbreviated as HW, refers to all physical components of a computer system,
including the devices connected to it. You cannot create a computer or use software without
using hardware. The screen on which you are reading this information is also hardware.
What is a hardware upgrade?
A hardware upgrade refers to new hardware, a replacement for old hardware, or additional hardware that improves the performance of the existing hardware. Common examples are a RAM upgrade, which increases the computer's total memory, and a video card upgrade, where the old video card is removed and replaced with a new one.
Some of the commonly used hardware in your computer are described below:
1) Motherboard:
The motherboard is generally a thin circuit board that holds together almost all parts of a
computer except input and output devices. All crucial hardware like CPU, memory, hard drive,
and ports for input and output devices are located on the motherboard. It is the biggest circuit
board in a computer chassis.
It allocates power to all hardware located on it and enables them to communicate with each
other. It is meant to hold the computer's microprocessor chip and let other components connect
to it. Each component that runs the computer or improves its performance is a part of the
motherboard or connected to it through a slot or port.
There can be different types of motherboards based on the type and size of the computers. So,
a specific motherboard can work only with specific types of processors and memory.
Components of a Motherboard:


CPU Slot: It is provided to install the CPU. It is a link between a microprocessor and a
motherboard. It facilitates the use of the CPU and prevents damage when the CPU is installed or removed. Furthermore, it is provided with a lock to prevent CPU movement and a heat sink to dissipate the extra heat.
RAM Slot: It is a memory slot or socket provided in the motherboard to insert or install the RAM
(Random Access Memory). There can be two or more memory slots in a computer.
Expansion Slot: It is also called the bus slot or expansion port. It is a connection or port on the
motherboard, which provides an installation point to connect a hardware expansion card; for example, you can purchase a video expansion card and install it into the expansion slot to add new video capability to the computer. Some of the common expansion slots in a computer
are AGP, AMR, CNR, PCI, etc.
Capacitor: It is made of two conductive plates, and a thin insulator sandwiched between them.
These parts are wrapped in a plastic container.
Inductor (Coil): It is an electromagnetic coil made of a conducting wire wrapped around an iron
core. It acts as an inductor or electromagnet to store magnetic energy.
Northbridge: It is an integrated circuit that allows communications between the CPU interface,
AGP, and memory. Furthermore, it also allows the southbridge chip to communicate with the
RAM, CPU, and graphics controller.
USB Port: It allows you to connect hardware devices like mouse, keyboard to your computer.
PCI Slot: It stands for Peripheral Component Interconnect slot. It allows you to connect the PCI
devices like modems, network hardware, sound, and video cards.
AGP Slot: It stands for Accelerated Graphics Port. It provides the slot to connect graphics cards.
Heat Sink: It absorbs and disperses the heat generated in the computer processor.
Power Connector: It is designed to supply power to the motherboard.
CMOS Battery: CMOS stands for complementary metal-oxide-semiconductor. The battery powers a small memory that stores the BIOS settings such as time, date, and hardware settings.
2) Monitor:
A monitor is the display unit of a computer on which the processed data, such as text, images,
etc., is displayed. It comprises the screen circuitry and the case which encloses this circuitry. The
monitor is also known as a visual display unit (VDU).
Types of Monitors:


1. CRT Monitor: It has cathode ray tubes which produce images in the form of video signals.
Its main components are electron gun assembly, deflection plate assembly, glass
envelope, fluorescent screen, and base.
2. LCD Monitor: It is a flat panel screen. It uses liquid crystal display technology to produce
images on the screen. Advanced LCDs have thin-film transistors with capacitors and use
active-matrix technology, which allows pixels to retain their charge.
3. LED Monitor: It is an advanced version of an LCD monitor. Unlike an LCD monitor, which
uses cold cathode fluorescent light to backlight the display, it has LED panels, each of
which has lots of LEDs to display the backlight.
4. Plasma Monitor: It uses plasma display technology that allows it to produce high resolutions of up to 1920 x 1080, a wide viewing angle, a high refresh rate, an outstanding contrast ratio, and more.
3) Keyboard:
It is the most important input device of a computer. It is designed to allow you to input text, characters, and other commands into a computer, desktop, tablet, etc. It comes with different
sets of keys to enter numbers, characters, and perform various other functions like copy, paste,
delete, enter, etc.
Types of Keyboards:
1. QWERTY Keyboards
2. AZERTY Keyboards
3. DVORAK Keyboards
4) Mouse:
It is a small handheld device designed to control or move the pointer (computer screen's cursor)
in a GUI (graphical user interface). It allows you to point to or select objects on a computer's
display screen. It is generally placed on a flat surface as we need to move it smoothly to control
the pointer. Types of Mouse: Trackball mouse, Mechanical Mouse, Optical Mouse, Wireless
Mouse, etc.
Main functions of a mouse:
o Move the cursor: The main function of the mouse is to move the cursor on the screen.
o Open or execute a program: It allows you to open a folder or document and execute a program. You move the cursor onto the folder and double-click it to open it.
o Select: It allows you to select text, a file, or any other object.
o Hovering: Hovering is the act of moving the mouse cursor over a clickable object. While hovering over an object, the computer can display information about it without any mouse button being pressed (see the event-handling sketch after the lists below).
o Scroll: It allows you to scroll up or down while viewing a long webpage or document.
Parts of a mouse:
o Two buttons: A mouse is provided with two buttons for right click and left click.
o Scroll Wheel: A wheel located between the right and left buttons, which is used to scroll up and down, and to zoom in and out in some applications like AutoCAD.
o Battery: A battery is required in a wireless mouse.
o Motion Detection Assembly: A mouse can have a trackball or an optical sensor to provide
signals to the computer about the motion and location of the mouse.
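
The functions above (moving, clicking, hovering, and scrolling) reach software as events. As a small, hypothetical illustration of how a program receives them, the Python/Tkinter sketch below shows the pointer position while hovering and draws a dot on each left click; it is not tied to any particular mouse hardware.

import tkinter as tk

root = tk.Tk()
root.title("Mouse event demo")
canvas = tk.Canvas(root, width=300, height=200, bg="white")
canvas.pack()

def on_hover(event):
    # Fired whenever the pointer moves over the canvas (hovering).
    root.title(f"Pointer at ({event.x}, {event.y})")

def on_click(event):
    # Fired on a left-button click; draw a small dot where the user clicked.
    canvas.create_oval(event.x - 3, event.y - 3, event.x + 3, event.y + 3, fill="black")

canvas.bind("<Motion>", on_hover)
canvas.bind("<Button-1>", on_click)
root.mainloop()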

For more knowledge about computer hardware, please check the link provided:
https://www.youtube.com/watch?v=_2MB8F9JSa8&ab_channel=EyeonTech

Lesson 2: Input and Output Devices


Input Devices
An input device enables the user to send data, information, or control signals to a computer. The
Central Processing Unit (CPU) of a computer receives the input and processes it to produce the
output.
Some of the popular input devices are:
1. Keyboard
2. Mouse
3. Scanner
4. Joystick
5. Light Pen
6. Digitizer
7. Microphone
8. Magnetic Ink Character Recognition (MICR)
9. Optical Character Reader (OCR)
10. Digital Camera
11. Paddle
12. Steering Wheel
13. Gesture recognition devices
14. Light Gun
15. Touch Pad
16. Remote
17. Touch screen
18. VR
19. Webcam
20. Biometric Devices

1) Keyboard
The keyboard is a basic input device that is used to enter data into a computer or any other
electronic device by pressing keys. It has different sets of keys for letters, numbers, characters,
and functions. Keyboards are connected to a computer through USB or a Bluetooth device for
wireless communication.
Types of keyboards: There can be different types of keyboards based on the region and language
used. Some of the common types of keyboards are as follows:
i) QWERTY Keyboard:

It is the most commonly used keyboard with computers in modern times. It is named after the first six letters of the top row of keys and is popular even in countries that do not use a Latin-based alphabet. It is so popular that some people think that it is the only type of keyboard to use
with computers as an input device.
ii) AZERTY Keyboard:

It is considered the standard French keyboard. It was developed in France as an alternative layout to the QWERTY layout and is mainly used in France and other European countries. Some countries have manufactured their own versions of AZERTY.


Its name is derived from the first six letters that appear on the top-left row of the keyboard. The Q and W keys of the QWERTY layout are interchanged with the A and Z keys in the AZERTY layout. Furthermore, on an AZERTY keyboard the M key is located to the right of the L key.
The AZERTY keyboard differs from the QWERTY keyboard not only in the placement of letters but also in many other ways; e.g., it places emphasis on accented characters, which are required for writing European languages like French.
iii) DVORAK Keyboard:

This type of keyboard layout was developed to increase the typing speed by reducing the finger
movement while typing. The most frequently used letters are kept in the home row to improve typing speed.

2) Mouse
The mouse is a hand-held input device which is used to move cursor or pointer across the screen.
It is designed to be used on a flat surface and generally has left and right button and a scroll wheel
between them. Laptop computers come with a touchpad that works as a mouse. It lets you
control the movement of the cursor or pointer by moving your finger over the touchpad. Some mice come with integrated features, such as extra buttons that can be assigned different functions.
The mouse was invented by Douglas C. Engelbart in 1963. Early mice had a roller ball integrated
as a movement sensor underneath the device. Modern mouse devices come with optical
technology that controls cursor movements by a visible or invisible light beam. A mouse is
connected to a computer through different ports depending on the type of computer and type
of a mouse.
Common types of the mouse:
i) Trackball Mouse:
It is a stationary input device that has a ball mechanism to move the pointer or cursor on the screen. The ball is half inserted in the device and can be easily rolled with a finger, thumb, or palm to move the pointer on the screen. The device has a sensor to detect the rotation of the ball. It remains stationary; you don't need to move it on the operating
surface. So, it is an ideal device if you have limited desk space as you
don't need to move it like a mouse.


ii) Mechanical Mouse:


It has a system of a ball and several rollers to track its movement.
It is a corded type of mouse. A mechanical mouse can be used for high-performance work. The drawback is that dust tends to get into the mechanics, so it requires regular cleaning.
iii) Optical Mouse:
An optical mouse uses optical electronics to track its movement. It is
more reliable than a mechanical mouse and also requires less
maintenance. However, its performance is affected by the surface on
which it is operated. A plain, non-glossy mouse mat should be used for best results. A rough surface may cause problems for the optical recognition system, and a glossy surface may reflect the light incorrectly and thus cause tracking issues.
iv) Cordless or Wireless Mouse:

As the name suggests, this type of mouse lacks cable and uses
wireless technology such as IrDA (infrared) or radio (Bluetooth or
Wi-Fi) to control the movement of the cursor. It is used to improve
the experience of using a mouse. It uses batteries for its power
supply.

3) Scanner
The scanner takes pictures and pages of text as input. It scans a picture or a document, which is then converted into a digital format or file and displayed on the screen as output. Optical character recognition techniques can also be used to convert scanned text into editable digital text. Some of the common types of scanners are as follows:
Types of Scanner:
i) Flatbed Scanner:
It has a glass pane and a moving optical CIS or CCD array. The document or image is placed face down on the glass pane; the light illuminates the pane and moves across it, scanning the document and thus producing its digital copy. You will need a transparency adapter while scanning transparent slides.


ii) Handheld Scanner:

It is a small manual scanning device which is held in the hand and rolled over the flat image that is to be scanned. The drawback of this device is that the hand should be steady while scanning; otherwise, it may distort the image. One of the most commonly used handheld scanners is the barcode scanner, which you would have seen in shopping stores.
iii) Sheetfed Scanner:

In this scanner, the document is inserted into the slot provided in the
scanner. The main components of this scanner include the sheet-feeder,
scanning module, and calibration sheet. The light does not move in this
scanner. Instead, the document moves through the scanner. It is suitable
for scanning single-page documents, not for thick objects like books,
magazines, etc.
iv) Drum Scanner:

A drum scanner has a photomultiplier tube (PMT) to scan images; it does not have a charge-coupled device like a flatbed scanner. The photomultiplier tube is extremely sensitive to light. The image is placed on a glass cylinder (the drum), and the light moves across the image, producing a reflection of the image which is captured by the PMT and processed. These scanners have high resolution and are suitable for detailed scans.
v) Photo Scanner:

It is designed to scan photographs. It has the high resolution and color depth which are required for scanning photographs.
Some photo scanners come with in-built software for
cleaning and restoring old photographs.


4) Joystick

A joystick is also a pointing input device like a mouse. It is made up of a stick with a spherical base. The base is fitted in a socket that allows free movement of the stick. The movement of the stick controls the cursor or pointer on the screen.
The first joystick was invented by C. B. Mirick at the U.S. Naval Research Laboratory. A joystick can be of different types, such as displacement joysticks, finger-operated joysticks, hand-operated joysticks, isometric joysticks, and more. With a joystick, the cursor keeps moving in the direction the stick is tilted until the stick is upright again, whereas with a mouse, the cursor moves only when the mouse moves.

5) Light Pen

A light pen is a computer input device that looks like a pen. The
tip of the light pen contains a light-sensitive detector that
enables the user to point to or select objects on the display
screen. Its light sensitive tip detects the object location and
sends the corresponding signals to the CPU. It is not compatible
with LCD screens, so it is not in use today. It also helps you draw
on the screen if needed. The first light pen was invented around
1955 as a part of the Whirlwind project at the Massachusetts
Institute of Technology (MIT).

6) Digitizer

A digitizer is a computer input device that has a flat surface and usually comes with a stylus. It enables the user to draw images and graphics using the stylus, as we draw on paper with a pencil.
The images or graphics drawn on the digitizer appear on the
computer monitor or display screen. The software converts the
touch inputs into lines and can also convert handwritten text
to typewritten words.


It can be used to capture handwritten signatures and data or images from taped papers.
Furthermore, it is also used to receive information in the form of drawings and send output to a
CAD (Computer-aided design) application and software like AutoCAD. Thus, it allows you to
convert hand-drawn images into a format suitable for computer processing.

7) Microphone
The microphone is a computer input device that is used to input the
sound. It receives the sound vibrations and converts them into audio
signals or sends them to a recording medium. The audio signals are
converted into digital data and stored in the computer. The
microphone also enables the user to telecommunicate with others. It
is also used to add sound to presentations and with webcams for
video conferencing. A microphone can capture audio waves in
different ways; accordingly, the three most common types are
described below:
i) Dynamic:

It is the most commonly used microphone and has a simple design. It has a magnet wrapped by a metal coil, with a thin sheet on the front end of the magnet. The sheet transfers vibrations from sound waves to the coil, and from the coil to electric wires, which transmit the sound as an electrical signal.
ii) Condenser:

It is designed for audio recording and has a very sensitive and flat
frequency response. It has a front plate called the diaphragm and a back
plate parallel to the front plate. When sound hits the diaphragm, it
vibrates the diaphragm and alters the distance between the two
plates. The changes in distance are transmitted as electric signals.


iii) Ribbon:

It is known for its reliability. It has a thin ribbon made of aluminum, duraluminum, or nanofilm suspended in a magnetic field. The sound
waves cause vibrations in the ribbon, which generate a voltage
proportional to the velocity of the vibration. The voltage is transmitted
as an electrical signal. Early ribbon microphones had a transformer to
increase the output voltage, but modern ribbon microphones come
with advanced magnets to produce a strong signal.

8) Magnetic Ink Character Recognition (MICR)


The MICR input device is designed to read text printed with magnetic ink. MICR is a character recognition technology
that makes use of special magnetized ink which is sensitive to
magnetic fields. It is widely used in banks to process the cheques
and other organizations where security is a major concern. It can
process three hundred cheques in a minute with hundred-percent
accuracy. The details on the bottom of the cheque (MICR No.) are
written with magnetic ink. A laser printer with MICR toner can be
used to print the magnetic ink.
The device reads the details and sends them to a computer for processing. A document printed in
magnetic ink is required to pass through a machine which magnetizes the ink, and the magnetic
information is then translated into characters.

9) Optical Character Reader (OCR)


The OCR input device is designed to convert scanned images of handwritten, typed, or printed text into digital text.
It is widely used in offices and libraries to convert documents
and books into electronic files.
It processes and copies the physical form of a document using
a scanner. After copying the document, the OCR software converts it into a two-color (black and white) version called a bitmap. The bitmap is then analyzed for light and dark areas: the dark areas are recognized as characters, and the light areas are identified as the background. OCR is widely used to convert hard-copy legal or historic documents into PDFs. The converted documents can be edited if required, just as we edit documents created in MS Word.
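
Simple OCR workflows can be scripted. As a hedged example, assuming the third-party pytesseract package and the Tesseract OCR engine are installed (neither is mentioned in this module), and using a hypothetical file name, the Python snippet below extracts the text from a scanned page:

from PIL import Image      # pip install pillow
import pytesseract         # pip install pytesseract (also requires the Tesseract engine)

# "scanned_page.png" is a placeholder for any scanned image of printed text.
image = Image.open("scanned_page.png")
text = pytesseract.image_to_string(image)
print(text)  # the recognized text can now be edited or saved like any other document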

10) Digital camera:


It is a digital device as it captures images and records videos digitally
and then stores them on a memory card. It is provided with an image
sensor chip to capture images, as opposed to film used by traditional
cameras. Besides this, a camera that is connected to your computer
can also be called a digital camera.
It has photosensors to record light that enters the camera through
the lens. When the light strikes the photosensors, each of the sensors returns an electrical current, which is used to create the images.

11) Paddle:

It is a simple input device that is widely used in games.


It is a wheel that is held by hand and looks like the volume knob on a stereo that is used to increase or decrease the volume. A paddle moves or controls the cursor or other objects in the game in a back-and-forth motion. It is widely used as an alternative to the
joystick. Besides this, the term paddle also refers to
many handheld devices designed to control a function
in an electronic device, computer, etc.

12) Steering wheel:


It is used as an input device in racing video games such as car
racing games or in driving programs as virtual simulators to
steer a vehicle. It works like the real steering wheel by
allowing you to take a right or left turn. A steering wheel may
be provided with acceleration and brake pedal devices and a
mechanism for shifting gears. Thus, it makes racing games
more adventurous and entertaining.


13) Gesture recognition devices:


These devices take human gestures as input. There
are many such devices that respond to gestures. For
example, Kinect is one such device that observes the
movement of a player's body and interprets these
movements as inputs to video games. This feature is
also available in certain tablets and smartphones
where you can perform certain tasks such as taking
pictures using finger gestures such as swiping, pinching, etc.

14) Light Gun:


As the name suggests, it is a pointing input device that is
designed to point at and shoot the targets on the screen in
a video game, an arcade game, etc. The light gun was used for the first time on the MIT Whirlwind computer. When the
gun is pointed at the target on the screen and the trigger
is pulled, the screen goes blank for a fraction of a second.
During this moment, the photodiode, which is present in
the barrel, determines where the gun is pointed. For example, shooting ducks in a duck hunt
game.
15) Touchpad:
It is usually found in laptops as a substitute for the mouse. It
allows you to move or control the cursor on the screen using
your finger. Just like a mouse, it also has two buttons for
right and left click. Using the touchpad, you can perform all
the tasks that you do with a mouse, such as selecting an
object on the screen, copy, paste, delete, open a file or
folder, and more.

16) Remote:
It is a hardware device designed to control the
functioning of a device, e.g., a TV remote that can be
used to change channels or increase or decrease the volume from a distance without leaving the seat. The first cordless TV remote was invented by Dr. Robert Adler of Zenith in 1956. The remote sends electromagnetic waves to communicate with the device. These waves can be infrared rays, radio
waves, etc.

17) Touch screen:


It is the display screen of a device such as a smartphone,
tablet, etc., that allows users to interact or provide
inputs to the device by using their finger. Today, most electronic devices come with a touchscreen as an alternative to a mouse for navigating a graphical user interface. For example, by touching, you can unlock your phone, open emails, open files, play videos, etc. Besides this, it is used in lots of devices such as cameras, car GPS units, fitness machines, etc.
The concept of the touch screen was first introduced and published by E.A. Johnson in 1965. The
first touch screen was developed at the beginning of the 1970s by CERN engineers Frank Beck
and Bent Stumpe.

18) VR:
VR stands for virtual reality. It is an artificial or virtual environment which is generated by
computers. A person can interact with virtual objects of
this artificial environment using some input devices such
as headsets, gloves, headphones, etc. For example, he or
she can find himself or herself walking on a beach,
watching a football match, walking in the sky, etc.,
without actually doing all this.

19) Webcam:
Any camera which is connected to a computer is called a webcam.
The in-built camera provided on a computer can also be considered a
webcam. It is an input device as it can take pictures, and can be used
to record videos if required. The pictures and videos are stored in
the computer memory and can be displayed on the screen if
required. Although it works almost the same as the digital camera, it
is different from a digital camera, as it is designed to take compact digital photos that can be uploaded easily to webpages and shared with others through the internet.
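
Programs can read frames from a webcam directly. As an illustrative sketch, assuming the third-party opencv-python package is installed (an assumption, not something specified in this module), the snippet below grabs one frame from the default webcam and saves it as an image file:

import cv2  # pip install opencv-python

capture = cv2.VideoCapture(0)           # 0 selects the default webcam
ok, frame = capture.read()              # grab a single frame
capture.release()

if ok:
    cv2.imwrite("snapshot.jpg", frame)  # "snapshot.jpg" is an arbitrary file name
    print("Saved one webcam frame to snapshot.jpg")
else:
    print("No webcam frame could be read")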

20) Biometric Devices:


Biometrics refers to a process in which a person is identified through his or her biological features, such as fingerprints, the iris of the eye, face structure, etc. It is done by using biometric devices, which
can be of different types based on their scanning features and abilities, such as:
i) Face Scanner:

It is designed to identify a person by scanning his or her face. It takes the facial measurements of a person, for example, the distance between the eyes, nose, and mouth, and accordingly confirms the identity of the person. Besides this, it is smart enough to differentiate
between a person's picture and the real person.
ii) Hand Scanner:

The hand of a person can also be used to verify his or her identity as
every person has a unique pattern of veins in the palm, just like
fingerprints. This device takes advantage of this feature; it identifies
a person by scanning the palm of his or her hand. It uses infrared light to scan the veins' patterns and the blood flowing in them. The palm's vein pattern is even more distinctive than fingerprints.
iii) Fingerprint Scanner:
It scans the fingerprints to identify people or for biometric
authentication. This device is developed, keeping in mind the fact
that no two persons in the world can have the same fingerprints.
It is widely used in companies as a fingerprint attendance system
to mark the attendance of employees. This type of scanner captures the pattern of valleys and ridges found on a finger and stores it in memory or a database. When you press your finger
on the given space, it verifies the identity by using its pattern-
matching software.


iv) Retina or Iris Scanner:


It scans the retina or iris of a person's eye to confirm the
identity. This device is more secure than others as it is next
to impossible to copy the retina or iris. It works by mapping
the blood vessel patterns of the eye's retina. The blood vessels of the retina absorb light more easily than the surrounding tissue, so they can be identified with appropriate lighting.
In this scan, a beam of low-energy infrared light falls on the retina through the scanner's
eyepiece. Then, the software captures the network of blood vessels in the retina and uses it to
verify a person's identity.
v) Voice Scanner:
It records the voice of a person and digitizes it to create a
distinctive voice print or template. The voiceprints are
stored in the database, and are used to verify the voice of
a person to confirm his or her identity. The person is
required to speak in the normal or same voice that was
used to create the voice template. It is not very reliable, as it can be fooled by using a tape recording.

For more knowledge about input and output devices, please check the link provided;
https://www.youtube.com/watch?v=jzwa-
HegLk4&ab_channel=Micro%3AbitEducationalFoundation

Lesson 3: Central Processing Unit (CPU)


Central Processing Unit (CPU)
A Central Processing Unit is also called a processor, central processor, or microprocessor.
It carries out all the important functions of a computer. It receives instructions from both the
hardware and active software and produces output accordingly. It runs all important programs, such as the operating system and application software. The CPU also helps input and output devices to communicate with each other. Owing to these features of the CPU, it is often referred to as the brain of the computer.
CPU is installed or inserted into a CPU socket located on the motherboard. Furthermore, it is
provided with a heat sink to absorb and dissipate heat to keep the CPU cool and functioning
smoothly.


Generally, a CPU has three components:


o ALU (Arithmetic Logic Unit)
o Control Unit
o Memory or Storage Unit

Control Unit: The control unit is the circuitry that uses electrical signals to direct the computer system to execute already stored program instructions. It takes instructions from memory and
then decodes and executes these instructions. So, it controls and
coordinates the functioning of all parts of the computer. The
Control Unit's main task is to maintain and regulate the flow of information across the processor.
It does not take part in processing and storing data.
ALU: It is the arithmetic logic unit, which performs arithmetic and logical functions. Arithmetic
functions include addition, subtraction, multiplication, division, and comparisons. Logical functions mainly include selecting, comparing, and merging data. A CPU may contain more
than one ALU. Furthermore, ALUs can be used for maintaining timers that help run the computer.
Memory or Storage Unit / Registers: This unit is commonly associated with random access memory (RAM) and the CPU's registers. It temporarily stores data, programs, and the intermediate and final results of processing, acting as a temporary storage area for the data used to run the computer.
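
To make the interplay of the control unit, ALU, and storage concrete, here is a deliberately simplified, hypothetical Python sketch of the fetch-decode-execute cycle. Real CPUs do this in hardware, so the instruction set and the single register below are invented purely for illustration.

# A toy program held in "memory": (operation, operand) pairs of an invented instruction set.
memory = [("LOAD", 7), ("ADD", 5), ("MUL", 3), ("PRINT", None), ("HALT", None)]

accumulator = 0       # a single register standing in for the storage unit
program_counter = 0   # the control unit tracks which instruction comes next

while True:
    opcode, operand = memory[program_counter]   # fetch
    program_counter += 1
    # Decode and execute: arithmetic goes to the "ALU", control flow to the control unit.
    if opcode == "LOAD":
        accumulator = operand
    elif opcode == "ADD":
        accumulator += operand
    elif opcode == "MUL":
        accumulator *= operand
    elif opcode == "PRINT":
        print("Result:", accumulator)           # prints 36, i.e. (7 + 5) * 3
    elif opcode == "HALT":
        break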
What is CPU Clock Speed?
The clock speed of a CPU or a processor refers to the number of instructions it can process in a
second. It is measured in gigahertz. For example, a CPU with a clock speed of 4.0 GHz means it
can process 4 billion instructions in a second.
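
As a quick worked example of that relationship (a rough simplification, since real CPUs may complete more or fewer than one instruction per clock cycle):

clock_speed_hz = 4.0e9       # 4.0 GHz = 4.0 x 10^9 clock cycles per second
instructions_per_cycle = 1   # the simplifying assumption used in the text above

instructions_per_second = clock_speed_hz * instructions_per_cycle
print(f"{instructions_per_second:.0e} instructions per second")  # prints 4e+09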
Types of CPU:
CPUs are mostly manufactured by Intel and AMD, each of which manufactures its own types of
CPUs. In modern times, there are lots of CPU types in the market. Some of the basic types of CPUs
are described below:
Single Core CPU: Single Core is the oldest type of computer CPU, which was used in the 1970s. It
has only one core to process different operations. It can start only one operation at a time; the
CPU switches back and forth between different sets of data streams when more than one
program runs. So, it is not suitable for multitasking as the performance will be reduced if more
than one application runs. The performance of these CPUs is mainly dependent on the clock
speed. It is still used in various devices, such as smartphones.


Dual Core CPU: As the name suggests, Dual Core CPU contains two cores in a single Integrated
Circuit (IC). Although each core has its own controller and cache, they are linked together to work
as a single unit and thus can perform faster than the single-core processors and can handle
multitasking more efficiently than Single Core processors.
Quad Core CPU: This type of CPU comes with two dual-core processors in one integrated circuit
(IC) or chip. So, a quad-core processor is a chip that contains four independent units called cores.
These cores read and execute the instructions of the CPU. The cores can run multiple instructions simultaneously, thereby increasing the overall speed for programs that are compatible with parallel processing.
Quad Core CPU uses a technology that allows four independent processing units (cores) to run in
parallel on a single chip. Thus by integrating multiple cores in a single CPU, higher performance
can be generated without boosting the clock speed. However, the performance increases only
when the computer's software supports multiprocessing. The software which supports
multiprocessing divides the processing load between multiple processors instead of using one
processor at a time.
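
As an illustrative (not prescriptive) example of software dividing work across cores, the Python sketch below uses the standard multiprocessing module to spread a calculation over the available processors:

from multiprocessing import Pool, cpu_count

def square(n):
    # A small unit of work handed to whichever core is free.
    return n * n

if __name__ == "__main__":
    numbers = range(1, 1_000_001)
    with Pool(processes=cpu_count()) as pool:   # one worker process per available core
        results = pool.map(square, numbers)     # the work is split across the workers
    print(f"Summed {len(results):,} squares on {cpu_count()} cores: {sum(results)}")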
History of CPU:
Some of the important events in the development of CPU since its invention till date are as
follows:
o In 1823, Baron Jons Jackob Berzelius discovered silicon that is the main component of CPU
till date.
o In 1903, Nikola Tesla got gates or switches patented, which are electrical logic circuits.
o In December 1947, John Bardeen, William Shockley, and Walter Brattain invented the first
transistor at the Bell Laboratories and got it patented in 1948.
o In 1958, the first working integrated circuit was developed by Robert Noyce and Jack
Kilby.
o In 1960, IBM established the first mass-production facility for transistors in New York.
o In 1968, Robert Noyce and Gordon Moore founded Intel Corporation.
o AMD (Advanced Micro Devices) was founded in May 1969.
o In 1971, Intel introduced the first microprocessor, the Intel 4004, with the help of Ted
Hoff.
o In 1972, Intel introduced the 8008 processor; in 1978, the Intel 8086 was introduced; and in June 1979, the Intel 8088 was released.
o In 1979, a 16/32-bit processor, the Motorola 68000, was released. Later, it was used as a
processor for the Apple Macintosh and Amiga computers.
o In 1987, Sun introduced the SPARC processor.
o In March 1991, AMD introduced the AM386 microprocessor family.
o In March 1993, Intel released the Pentium processor. In 1995, Cyrix introduced the
Cx5x86 processor to give competition to Intel Pentium processors.
o In January 1999, Intel introduced the Celeron 366 MHz and 400 MHz processors.
o In April 2005, AMD introduced its first dual-core processor.
o In 2006, Intel introduced the Core 2 Duo processor.
o In 2007, Intel introduced different types of Core 2 Quad processors.
o In April 2008, Intel introduced the first series of Intel Atom processors, the Z5xx series.
They were single-core processors with a 200 MHz GPU.
o In September 2009, Intel released the first Core i5 desktop processor with four cores.
o In January 2010, Intel released many processors such as Core 2 Quad processor Q9500,
first Core i3 and i5 mobile processors, first Core i3 and i5 desktop processors. In the same
year in July, it released the first Core i7 desktop processor with six cores.
o In June 2017, Intel introduced the first Core i9 desktop processor.
o In April 2018, Intel released the first Core i9 mobile processor.

For more knowledge about CPU, please check the link provided;
https://www.youtube.com/watch?v=FB8bphbLuX0&ab_channel=rasgo.official

REFERENCES


CHAPTER 4: SOCIAL IMPLICATIONS OF ICT


IN THE SOCIETY

Objectives:
a) Discuss the social implications of ICT in the society.
b) Identify the negative effects of technology to the society.
c) Discover how ICT is developing social businesses.

Lesson 1: The social impact of ICT


Information communications technology
(ICT) has the power to transform society. It plays
a key role in each of the United Nations’
Sustainable Development Goals, providing the
infrastructure needed to achieve them. It also
enables financial inclusion through m-
commerce and allows people to connect with
millions instantaneously.
The impact of ICT on business is particularly significant. It empowers people to share knowledge
and advice instantaneously and set up an online shop or website at a low cost, dramatically
lowering the barriers to starting a business. As such, it is an important enabler of change and ICT
maturity is closely linked to economic growth.
Advances in technology have always been used by for-profits to increase revenue. However,
government bodies and NGOs have struggled to successfully apply them for social good. An
emerging type of business, the social business, is bridging the gap between the two.
Technology Has Made Our Lives Far Easier and Better Through Better Communication
Technology has made the communication aspect much easier and better for us humans. Earlier (a couple of decades ago), we had to wait days, and in some cases even months, for a message to arrive. We can clearly see the change which has taken place.


Now, all it takes is a few taps on a smartphone to send out an email or message to our loved ones or office colleagues. The user experience and interfaces have drastically improved with modern technology.
With Technology Advertising Has Been Made Easier
Technology has touched nearly every aspect of our lives, and one such aspect is advertising. Nowadays, owing to the rise of digital technology and online marketing, advertising has become far easier and more convenient than it was before.
Some examples are Facebook marketing and Google Ads.
Amazing Change in Travel Industry
The travel industry has been impacted by technology in a huge way. There are Google Maps, Google Earth, and so on, which users can operate as per their convenience. Moreover, there have been new and upcoming business models in the travel sector, such as car rentals, where a person can hire a luxury car or a mid-level car according to their needs. Overall, we can say that things are becoming more and more interesting.
Technology Has Made Learning Easier and Efficient
There is no doubt that, with the surge of the internet over the last couple of years, it has become very easy for people to search the internet and get the necessary information. Owing to this, learning and picking up new information about any subject has become much easier.
Role of Technology in Data Storing
A couple of decades ago, retrieving data was a very tough process, as people had to scour various files and hand-pick the right one by narrowing the search down. Now, it is not like that at all: all you need to do is save the data on your computer, tablet, or even smartphone. Whenever you need it, you can search for the specific file, and within seconds it will be in your hands. This is not just time-saving; it has also made our lives significantly easier.

For more knowledge about the social impact of ICT, please check the link provided:
https://www.youtube.com/watch?v=JFabyFx5dWw&ab_channel=TeacherJessebelTe
ves


Lesson 2: Negative effects of technology


People are more connected than ever, thanks in large part to rapid advancements in
technology. While some forms of technology may have made positive changes in the world, there
is evidence for the negative effects of technology and its overuse, as well.
Social media and mobile devices may lead to psychological and physical issues, such as
eyestrain and difficulty focusing on important tasks. They may also contribute to more serious
health conditions, such as depression. The overuse of technology may have a more significant
impact on developing children and teenagers.
Psychological effects
Overuse or dependence on technology may have adverse psychological effects, including:
Isolation
Technologies, such as social media, are designed to bring people together, yet they may have the
opposite effect in some cases.
A 2017 study of young adults aged 19–32 years found that people with higher social media use were more than three times as likely to feel socially isolated as those who did not use social media as often.
Finding ways to reduce social media use, such as setting time limits for social apps, may help
reduce feelings of isolation in some people.
Depression and anxiety
The authors of a 2016 systematic review discussed the link between social networks and mental
health issues, such as depression and anxiety.
Their research found mixed results. People who had more positive interactions and social support
on these platforms appeared to have lower levels of depression and anxiety.
However, the reverse was also true. People who perceived that they had more negative social
interactions online and who were more prone to social comparison experienced higher levels of
depression and anxiety.
So, while there does appear to be a link between social media and mental health, a significant
determining factor is the types of interactions people feel they are having on these platforms.
Physical health effects
Technology use may increase the risk of physical issues as well, including:
Eyestrain


Technologies, such as handheld tablets, smartphones, and computers, can hold a person’s
attention for long periods. This may lead to eyestrain.
Symptoms of digital eyestrain can include blurred vision and dry eyes. Eyestrain may also lead to
pains in other areas of the body, such as the head, neck, or shoulders.
Several technological factors may lead to eyestrain, such as:
• screen time
• screen glare
• screen brightness
• viewing too close or too far away
• poor sitting posture
• underlying vision issues

Taking regular breaks away from the screen may reduce the likelihood of eyestrain.
Anyone regularly experiencing these symptoms should see an optometrist for a checkup.
The 20-20-20 rule for digital viewing
When using any form of digital screen for longer periods of time, the American Optometric Association recommends using the 20-20-20 rule.
To use the rule, after every 20 minutes of screen time, take a 20-second break to look at
something at least 20 feet away.
Doing this may help reduce the strain on the eyes from staring at a screen for a continuous period.
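
One simple way to apply the rule is a periodic reminder. The hypothetical Python sketch below prints a prompt every 20 minutes; it only illustrates the timing and is not a substitute for a proper break-reminder application.

import time

WORK_MINUTES = 20    # look away after every 20 minutes of screen time
BREAK_SECONDS = 20   # spend 20 seconds looking at something about 20 feet away

while True:
    time.sleep(WORK_MINUTES * 60)
    print("20-20-20 break: look at something about 20 feet away for 20 seconds.")
    time.sleep(BREAK_SECONDS)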
Poor posture
The way many people use mobile devices and computers may also contribute to incorrect
posture. Over time, this may lead to musculoskeletal issues.
Many technologies promote a “down and forward” user position, meaning the person is hunched
forward and looking down at the screen. This can put an unnecessary amount of pressure on the
neck and spine.
A 5-year study in the journal Applied Ergonomics found an association between texting on a
mobile phone and neck or upper back pain in young adults.
The results indicated the effects were mostly short term, though some people continued to have
long-term symptoms.
However, some studies challenge these results.
A 2018 study in the European Spine Journal found that the posture of the neck while texting
made no difference in symptoms such as neck pain.


This study concluded that texting and “text neck” did not influence neck pain in young adults.
However, the study did not include a long-term follow-up.
It may be that other factors influence neck pain, as well, such as age and activity levels.
Correcting posture problems while using technology may lead to an overall improvement in
posture and strength in the core, neck, and back.
For example, if a person finds themselves sitting in the same position for hours at a time, such as
sitting at a desk while working, regularly standing or stretching may help reduce strain on the
body.
Additionally, taking short breaks, such as walking around the office every hour, may also help
keep the muscles loose and avoid tension and incorrect posture.
Sleep problems
Using technology too close to bedtime may cause issues with sleep. This effect has to do with the
fact that blue light, such as the light from cell phones, e-readers, and computers, stimulates the
brain.
Authors of a 2014 study found that this blue light is enough to disturb the body’s natural circadian
rhythm. This disturbance could make it harder to fall asleep or lead to a person feeling less alert
the next day.
To avoid the potential impact of blue light on the brain, people can stop using electronic devices
that emit blue light in the hour or two before bedtime.
Gentle activities to wind down with instead, such as reading a book, doing gentle stretches, or
taking a bath, are alternatives.
Reduced physical activity
Most everyday digital technologies are sedentary. More extended use of these technologies
promotes a more sedentary lifestyle, which is known to have negative health effects, such as
contributing to:
• obesity
• cardiovascular disease
• type 2 diabetes
• premature death
Finding ways to take breaks from sedentary technologies may help promote a more active
lifestyle.
Other forms of technology may help, however.


Research from 2017 indicates that active technologies, such as app notifications, emails, and
wearable technologies that promote exercise may reduce short-term sedentary behavior.
This could help people set healthful patterns and become more physically active.

For more knowledge about negative effects of technology, please check the link
provided;
https://www.youtube.com/watch?v=qUO4Bsv8tYA&ab_channel=TaylorHofstrandBu
nn

Lesson 3: ICT and the social business


Social businesses are driven by a social cause, but seek financial stability in order to
further their impact. ICT is playing a central role in the emergence and development of social
businesses. These companies are driven by a social cause, but seek financial stability through
profit making in order to further their impact. This combination is allowing them to effectively
utilize technology for good.
The impact of ICT infrastructure on social businesses cannot be overstated. It has made social impact affordable and scalable, and it enables new ways to connect to and engage with local communities (a key characteristic of the social business).
ICT is developing social businesses in three main ways:
Affordability
Starting any business can be financially challenging. But for social entrepreneurs, whose
primary intent is to engage with local communities rather than to make profit, it can be
particularly daunting. In the initial stages, it can also be difficult to convince investors to part with
their money for a social cause.
ICT solutions have decreased set-up costs in an unprecedented way. This helps social
entrepreneurs to make it through this uncertain period without major investments or losses –
and advice is only the click of a button away.
Scalability
ICT infrastructure allows us to connect instantaneously with millions. For social
entrepreneurs, this means that their initiatives aren’t just limited to one community; they can
easily reach the people they want to empower and spread their message far and wide.
Many social businesses also utilize ICT solutions to optimize processes, reduce costs and increase
accuracy, enabling the business to be scaled up faster. For example, Sanergy in Nairobi, Kenya,
uses radio-frequency identification sensors to alert the waste team when a toilet needs to be emptied. SiembraViva in Medellín, Colombia, is developing solutions that will allow them to
monitor harvests remotely and alert farmers to problems quickly.
Community
To ensure lasting empowerment, social businesses work from within the community. As
previously highlighted, ICT allows social entrepreneurs to continuously connect with the people
they wish to empower in a direct and engaging way.
Online channels are also an incredibly powerful way to broadcast business messages. Although
the presence of a social business is very much on the ground within the community, its story can
be told online through webpages and social media, reaching a global audience.

For more knowledge about benefits of ICT in Business, please check the link
provided;
https://www.youtube.com/watch?v=3iAp9me4P1c&ab_channel=TheProcessConsultant

REFERENCES

https://www.medicalnewstoday.com/articles/negative-effects-of-technology#physical-health-effects
https://www.ericsson.com/en/reports-and-papers/networked-society-insights/social-business/social-impact-of-ict#:~:text=It%20empowers%20people%20to%20share,closely%20linked%20to%20economic%20growth.


CHAPTER 5: DIGITAL TECHNOLOGY


AND MEDIA LITERACY

Objectives:
a) Discuss the meaning of digital technology and media literacy.
b) Identify the impact of digital media on society.
c) Discover the different challenges of digital media.

Lesson 1: Digital Media


Digital media means any
media that are encoded in
machine-readable formats. Digital
media can be created, viewed,
distributed, modified and
preserved on digital electronics
devices. Digital can be defined as
any data represented with a series
of digits, and Media refers to a
method of broadcasting or communicating information. Together, digital media refers to any
information that is broadcast to us through a screen. This includes text, audio, video, and graphics that are transmitted over the internet for viewing on a screen.
Examples of digital media include software, digital images, digital video, video games, web pages
and websites, social media, digital data and databases, digital audio such as MP3, electronic
documents and electronic books. Digital media often contrasts with print media, such as printed
books, newspapers and magazines, and other traditional or analog media, such as photographic
film, audio tapes or video tapes.
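
The phrase "represented with a series of digits" can be made concrete: every digital medium ultimately reduces to binary digits (bits). A tiny Python example encoding one word of text into bits:

text = "media"
encoded = text.encode("utf-8")                      # the text becomes a sequence of bytes
bits = " ".join(f"{byte:08b}" for byte in encoded)  # each byte shown as 8 binary digits
print(bits)  # 01101101 01100101 01100100 01101001 01100001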
Digital media has had a significantly broad and complex impact on society and culture. Combined
with the Internet and personal computing, digital media has caused disruptive innovation in
publishing, journalism, public relations, entertainment, education, commerce and politics. Digital
media has also posed new challenges to copyright and intellectual property laws, fostering an
open content movement in which content creators voluntarily give up some or all of their legal
rights to their work. The ubiquity of digital media and its effects on society suggest that we are
at the start of a new era in industrial history, called the Information Age, perhaps leading to a
paperless society in which all media are produced and consumed on computers. However, challenges to a digital transition remain, including outdated copyright laws, censorship, the
digital divide, and the spectre of a digital dark age, in which older media becomes inaccessible to
new or upgraded information systems.
History of Digital Media
Machine-readable codes and instructions were first conceptualized by Charles Babbage in the early 1800s. Babbage imagined that these codes would provide instructions for his Difference Engine and Analytical Engine, machines he had designed to solve the problem of error in calculations. Between 1842 and 1843, Ada Lovelace, a mathematician, wrote the first instructions for calculating numbers on Babbage's engines. Lovelace's instructions are now believed to be the first computer program. Although the machines were designed to perform analysis tasks, Lovelace anticipated the possible social impact of computers and programming, writing: "For, in so distributing and combining the truths and the formulae of analysis ... the relations and the nature of many subjects in that science are necessarily thrown into new lights, and more profoundly investigated ... there are in all extensions of human power, or additions to human knowledge, various collateral influences, besides the main and primary object attained." Other early machine-readable media include instructions for pianolas and weaving looms.
It is estimated that in the year 1986 less than 1% of the world's media storage capacity was digital
and in 2007 it was already 94%. The year 2002 is assumed to be the year when humankind was
able to store more information in digital than in analog media (the "beginning of the digital age").

For more knowledge about digital media, please check the link provided:
https://www.youtube.com/watch?v=XnkFYKTDCvU&ab_channel=UHDigitalMedia

Lesson 2: Impact of Digital Media


The digital revolution
Since the 1960s, computing power and storage capacity have increased exponentially,
largely as a result of MOSFET scaling which enables MOS transistor counts to increase at a rapid
pace predicted by Moore's law. Personal computers and smartphones put the ability to access,
modify, store and share digital media in the hands of billions of people. Many electronic devices,
from digital cameras to drones have the ability to create, transmit and view digital media.
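
Moore's law is often summarized as transistor counts doubling roughly every two years. A back-of-the-envelope calculation (an approximation for illustration, not an exact industry figure) shows how quickly that compounds:

transistors_1971 = 2_300   # roughly the transistor count of the Intel 4004
years = 2020 - 1971
doublings = years / 2      # Moore's law: about one doubling every two years

estimate_2020 = transistors_1971 * 2 ** doublings
print(f"{doublings:.1f} doublings -> roughly {estimate_2020:,.0f} transistors")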


Combined with the World Wide Web and the Internet, digital media has transformed 21st
century society in a way that is frequently compared to the cultural, economic and social impact
of the printing press. The change has been so rapid and so widespread that it has launched an
economic transition from an industrial economy to an information-based economy, creating a
new period in human history known as the Information Age or the digital revolution.

The transition has created some uncertainty about definitions. Digital media, new media,
multimedia, and similar terms all have a relationship to both the engineering innovations and
cultural impact of digital media. The blending of digital media with other media, and with cultural
and social factors, is sometimes known as new media or "the new media." Similarly, digital media
seems to demand a new set of communications skills, called transliteracy, media literacy, or
digital literacy. These skills include not only the ability to read and write—traditional literacy—
but the ability to navigate the Internet, evaluate sources, and create digital content. The idea
that we are moving toward a fully digital, paperless society is accompanied by the fear that we
may soon—or currently—be facing a digital dark age, in which older media are no longer
accessible on modern devices or using modern methods of scholarship. Digital media has a
significant, wide-ranging and complex effect on society and culture.
Disruption in industry
Compared with print media, the mass media, and other analog technologies, digital media are
easy to copy, store, share and modify. This quality of digital media has led to significant changes
in many industries, especially journalism, publishing, education, entertainment, and the music
business. The overall effect of these changes is so far-reaching that it is difficult to quantify. For
example, in movie-making, the transition from analog film cameras to digital cameras is nearly
complete. The transition has economic benefits to Hollywood, making distribution easier and
making it possible to add high-quality digital effects to films. At the same time, it has affected the
analog special effects, stunt, and animation industries in Hollywood. It has imposed painful costs
on small movie theaters, some of which did not or will not survive the transition to digital. The
effect of digital media on other media industries is similarly sweeping and complex.
Between 2000 and 2015, print newspaper advertising revenue fell from $60 billion to nearly $20 billion. Even Sunday, one of the most popular days for papers, has seen a 9% circulation decrease, to the lowest level since 1945.
In journalism, digital media and citizen journalism have led to the loss of thousands of jobs in
print media and the bankruptcy of many major newspapers. But the rise of digital journalism has
also created thousands of new jobs and specializations. E-books and self-publishing are changing
the book industry, and digital textbooks and other media-inclusive curricula are changing primary
and secondary education.
In academia, digital media has led to a new form of scholarship, also called digital scholarship,
making open access and open science possible thanks to the low cost of distribution. New fields
of study have grown, such as digital humanities and digital history. It has changed the way
libraries are used and their role in society. Every major media, communications and academic
endeavor is facing a period of transition and uncertainty related to digital media.
Often the magazine or publisher has a digital edition, which refers to an electronically formatted version identical to the print version. The main benefit to the publisher is cost: avoiding the expense of printing and delivery brings an additional benefit for the company.
Since 2004, there has been a decrease in newspaper industry employment, with only about 40,000 people currently working in the sector. According to Alliance for Audited Media and publisher data, during the 2008 recession certain magazines lost over 10% of their print sales and earned only about 75% of their previous advertising sales.
However, in 2018, 35% of major newspapers' advertising revenue came from digital ads. In contrast, mobile versions of newspapers and magazines came in second, with huge growth of 135%. The New York Times has noted a 47% year-over-year rise in its digital subscriptions. 43% of adults often get news from news websites or social media, compared with 49% for television.
Pew Research also asked respondents if they got news from a streaming device on their TV – 9%
of U.S. adults said that they do so often.
Individual as content creator
Digital media has also allowed individuals to be much more active in content creation. Anyone
with access to computers and the Internet can participate in social media and contribute their
own writing, art, videos, photography and commentary to the Internet, as well as conduct
business online. The dramatic reduction in the costs required to create and share content has led to a democratization of content creation, as well as the creation of new types of content, like blogs, memes and video essays. Some of these activities have also been labelled citizen journalism. This spike in user-created content is due to the development of the internet as well as the way in which users interact with media today. The release of technologies such as mobile devices allows for easier and quicker access to all things media. Many media creation tools that
were once available to only a few are now free and easy to use. The cost of devices that can
access the Internet is steadily falling, and personal ownership of multiple digital devices is now
becoming the standard. These elements have significantly affected political participation. Digital
media is seen by many scholars as having played a role in the Arab Spring, and crackdowns on the use of digital and social media by embattled governments are increasingly common. Many governments
restrict access to digital media in some way, either to prevent obscenity or in a broader form of
political censorship.
Over the years, YouTube has grown to become a website of user-generated media. This content is oftentimes not mediated by any company or agency, leading to a wide array of personalities and opinions online. Over the years, YouTube and other platforms have also shown their monetary potential, with the top 10 YouTube performers each generating over 10 million dollars a year. Many of these YouTube channels now use a multi-camera setup like we would see on TV, and many of these creators have started their own digital companies as their personalities grow.
Personal devices have also seen an increase over the years. There are over 1.5 billion tablet users in the world right now, and that number is expected to grow slowly; about 20% of people in the world regularly watched content using tablets in 2018.
User-generated content raises issues of privacy, credibility, civility and compensation for cultural,
intellectual and artistic contributions. The spread of digital media, and the wide range of literacy
and communications skills necessary to use it effectively, have deepened the digital divide
between those who have access to digital media and those who don't.
The rise of digital media has also made consumers' audio collections more precise and personalized. It is no longer necessary to purchase an entire album if the consumer is ultimately interested in only a few audio files.
Copyright challenges
Digital media pose several challenges to the current copyright and intellectual property laws. The
ease of creating, modifying and sharing digital media makes copyright enforcement a challenge,
and copyright laws are widely seen as outdated. For example, under current copyright law,
common Internet memes are probably illegal to share in many countries. Legal rights are at least
unclear for many common Internet activities, such as posting a picture that belongs to someone
else to a social media account, covering a popular song on a YouTube video, or writing fanfiction.
Over the last decade, the concept of fair use has been applied to many online media.
Copyright challenges reach all parts of digital media. Even personal content creators on YouTube must be careful to follow the guidelines set by copyright and IP laws, as YouTube creators can very easily be demonetized for their content. Oftentimes digital creators lose monetization on their content, have their content deleted, or are criticized for their content. Most of the time this is due to accidentally using a copyrighted audio track or background footage that is copyrighted by another company.
To resolve some of these issues, content creators can voluntarily adopt open or copyleft licenses,
giving up some of their legal rights, or they can release their work to the public domain. Among
the most common open licenses are Creative Commons licenses and the GNU Free
Documentation License, both of which are in use on Wikipedia. Open licenses are part of a
broader open content movement that pushes for the reduction or removal of copyright
restrictions from software, data and other digital media. To facilitate the collection and
consumption of such licensing information and availability status, tools have been developed like
the Creative Commons Search engine (mostly for images on the web) and Unpaywall (for
scholarly communication).
Additional software has been developed in order to restrict access to digital media. Digital rights management (DRM) is used to lock material and allows users to use that media only in specific cases.
For example, DRM allows a movie producer to rent a movie at a lower price than selling the
movie, restricting the movie rental license length, rather than only selling the movie at full price.
Additionally, DRM can prevent unauthorized sharing or modification of media.
Digital media is a numerical, networked and interactive system of links and databases that allows us to navigate from one bit of content or webpage to another.
One form of digital media that is becoming a phenomenon is in the form of an online magazine
or digital magazine. What exactly is a digital magazine? Due to the economic importance of digital
magazines, the Audit Bureau of Circulations integrated the definition of this medium in its latest
report (March 2011): a digital magazine involves the distribution of a magazine content by
electronic means; it may be a replica. This is an outdated definition of what a digital magazine is.
A digital magazine should not be, in fact, a replica of the print magazine in PDF, as was common
practice in recent years. It should, rather, be a magazine that is, in essence, interactive and
created from scratch to a digital platform (Internet, mobile phones, private networks, iPad or
other device). The barriers for digital magazine distribution are thus decreasing. At the same time
digitizing platforms are broadening the scope of where digital magazines can be published, such
as within websites and on smartphones. With the improvement of tablets, digital magazines are becoming visually enticing and readable publications with rich graphic arts.
For more knowledge about impact of digital media, please check the link provided;
https://www.youtube.com/watch?v=V2mrvhMY4QA&ab_channel=MediaSmarts

Lesson 3: Media Literacy
What is Media Literacy?
Media are powerful forces in the lives of youth. Music, TV, video games, magazines and
other media all have a strong influence on how we see the world, an influence that often begins
in infancy. To be engaged and critical media consumers, kids need to develop skills and habits
of media literacy. These skills include being able to access media on a basic level, to analyze it in
a critical way based on certain key concepts, to evaluate it based on that analysis and, finally, to
produce media oneself. This process of learning media literacy skills is media education.
‘Media’ (and its singular form ‘medium’) is from the Latin medius, meaning ‘middle’ or ‘between
two things.’ The Canadian Marshall McLuhan (1911–80) was the first to use this term to mean
‘means of mass communication.’
Media literacy is defined by the Trent Think Tank on Media Literacy as ‘the ability to decode,
analyze, evaluate, and produce communication in a variety of forms.’ According to the
Information Competence Project at California Polytechnic State University, a person who is media
literate:
• has the ability to assess the credibility of information received as well as the credibility of
the information source;
• has the ability to recognize metaphor and uses of symbols in entertainment, advertising,
and political commentary;
• has the ability to discern between appeals to emotion and logic, and recognizes covert
and overt appeals;
• is sensitive to verbal as well as visual arguments;
• has the ability to use critical faculties to assess the truth of information gleaned from
various sources.
The empowerment approach was advocated by Johnson, in ‘Digital Literacy: Re-Thinking
Education and Training in a Digital World’:
• Media literacy is essential for citizenship.
• The media are powerful social and cultural forces.
• The media are social constructions.
• Audiences are active creators of their own meaning.
The Ancient Greeks believed it was vital for a democratic society and government to have literate
and educated citizens. According to the empowerment approach, it is equally important in the
digital information age to be media literate – to be able to understand, evaluate,
and use digital, multimedia information. As McLuhan noted, the new media are new
languages and one must be fluent in those languages to be considered media literate.
For more knowledge about media literacy, please check the link provided;
https://www.youtube.com/watch?v=GIaRw5R6Da4&ab_channel=MediaLiteracyNow
REFERENCES

https://en.wikipedia.org/wiki/Digital_media
https://www.digitallogic.co/blog/what-is-digital-media/#:~:text=Digital%20media%20is%20any%20form,social%20media%2C%20and%20online%20advertising.
https://mediasmarts.ca/digital-media-literacy/general-information/digital-media-literacy-fundamentals/media-literacy-fundamentals
CHAPTER 6: THE INTERNET AND THE WORLD WIDE WEB
Objectives:
a) Differentiate the internet and its components.
b) Identify how the internet works and its advantages.
c) Discuss the differences between internet, intranet and extranet.

Lesson 1: The Internet
Internet
Internet is a global network that connects
billions of computers across the world with each
other and to the World Wide Web. It uses
standard internet protocol suite (TCP/IP) to
connect billions of computer users worldwide. It
is set up by using cables such as optical fibers and
other wireless and networking technologies. At
present, the internet is the fastest means of sending or exchanging information and data between computers across the world.
It is believed that the internet was developed by the Defense Advanced Research Projects Agency (DARPA) of the United States, and that it was first connected in 1969.
Why is the Internet Called a Network?
Internet is called a network as it creates a network by connecting computers and servers across
the world using routers, switches and telephone lines, and other communication devices and
channels. So, it can be considered a global network of physical cables such as copper telephone
wires, fiber optic cables, tv cables, etc. Furthermore, even wireless connections like 3G, 4G, or
Wi-Fi make use of these cables to access the Internet.
The internet is different from the World Wide Web: the World Wide Web is a collection of interlinked websites and web pages that are accessed over the internet. So, the internet is the backbone of the web, as it provides the technical infrastructure to establish the WWW and acts as a medium to transmit information from one computer to another. The web uses browsers to display on the client the information it fetches from web servers.
The internet is not owned by a single person or organization entirely. It is a concept based on
physical infrastructure that connects networks with other networks to create a global network
of billions of computers. As of 12 August 2016, there were more than 300 crore (3 billion) internet users across the world.
For more knowledge about the internet, please check the link provided:
https://www.youtube.com/watch?v=Dxcc6ycZ73M&ab_channel=Code.org

Lesson 2: How does the internet work?
Before understanding this let us understand some basics related to internet:
The internet works with the help of clients and servers. A device such as a laptop, which is
connected to the internet is called a client, not a server as it is not directly connected to the
internet. However, it is indirectly connected to the internet through an Internet Service Provider
(ISP) and is identified by an IP address, which is a string of numbers. Just like you have an address
for your home that uniquely identifies your home, an IP address acts as the shipping address of
your device. The IP address is provided by your ISP, and you can see what IP address your ISP has
given to your system.
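As a small illustration of the idea that every connected device has an address, the sketch below asks Python for the machine's hostname and the IP address the operating system reports for it; note that on a typical home network this will usually be a private LAN address rather than the public address assigned by the ISP.

```python
import socket

# Print this machine's hostname and the address the operating system reports
# for it. On a home network this is usually a private LAN address
# (for example 192.168.x.x), not the public IP assigned by the ISP.
hostname = socket.gethostname()
print("Hostname:", hostname)
print("Local IP:", socket.gethostbyname(hostname))
```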
A server is a large computer that stores websites. It also has an IP address. A place where a large number of servers are stored is called a data center. The server accepts requests sent by the client through a browser over a network (the internet) and responds accordingly.
To access the internet we need a domain name, which represents an IP address number, i.e.,
each IP address has been assigned a domain name. For example, youtube.com, facebook.com,
paypal.com are used to represent the IP addresses. Domain names are created as it is difficult for
a person to remember a long string of numbers. However, the internet does not understand the domain name; it understands the IP address. So, when you enter a domain name in the browser's address bar, the internet has to look up the IP address of that domain name in a huge phone book, which is known as the DNS (Domain Name System).
For example, if you have a person's name, you can find his phone number in a phone book by
searching his name. The internet uses the DNS server in the same way to find the IP address of
the domain name. DNS servers are managed by ISPs or similar organizations.
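To make the phone-book analogy concrete, the short sketch below asks the system's DNS resolver for the IP address behind a domain name using Python's standard library; the domain used is only an example.

```python
import socket

# Look up the IP address behind a domain name using the system's DNS resolver.
# "example.com" is just a placeholder domain for illustration.
domain = "example.com"
ip_address = socket.gethostbyname(domain)
print(f"{domain} resolves to {ip_address}")
```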
Now, after understanding the basics, let us see how the internet works.
When you turn on your computer and type a domain name in the browser's address bar, your browser sends a request to the DNS server to get the corresponding IP address. After getting the IP address, the browser forwards the request to the respective server.
Once the server gets the request to provide information about a particular website, the data starts flowing. The data is transferred through optical fiber cables in digital format, in the form of light pulses. As the servers are placed at distant locations, the data may have to travel thousands of miles through optical fiber cable to reach your computer.
The optical fiber is connected to a router, which converts the light signals into electrical signals.
These electrical signals are transmitted to your laptop using an Ethernet cable. Thus, you receive
the desired information through the internet, which is actually a cable that connects you with
the server.
Furthermore, if you are using wireless internet over Wi-Fi or mobile data, the signals from the optical cable are first sent to a cell tower, and from there they reach your cell phone in the form of electromagnetic waves.
The internet is managed by ICANN (Internet Corporation for Assigned Names and Numbers)
located in the USA. It manages IP addresses assignment, domain name registration, etc.
The data transfer is very fast on the internet. The moment you press enter you get the
information from a server located thousands of miles away from you. The reason for this speed
is that the data is sent in the binary form (0, 1), and these zeros and ones are divided into small
pieces called packets, which can be sent at high speed.
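The sketch below imitates that last step in a very simplified way: it splits a message into fixed-size chunks, much as data is broken into packets before being sent, and then reassembles them. The 10-byte packet size and the message are arbitrary example values, and real packets also carry headers.

```python
# Simplified illustration of packetization: split a message into fixed-size
# chunks and reassemble them. Real packets also carry headers with addresses
# and sequence numbers; the 10-byte size here is an arbitrary example value.
message = b"Hello, this is data travelling over the internet!"
packet_size = 10

packets = [message[i:i + packet_size] for i in range(0, len(message), packet_size)]
for number, payload in enumerate(packets):
    print(f"packet {number}: {payload}")

reassembled = b"".join(packets)
assert reassembled == message  # the receiver puts the pieces back together
```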
Advantages of the Internet:
o Instant Messaging: You can send messages or communicate with anyone using the internet, such as through email, voice chat, video conferencing, etc.
o Get directions: Using GPS technology, you can get directions to almost every place in a
city, country, etc. You can find restaurants, malls, or any other service near your
location.
o Online Shopping: It allows you to shop online; you can buy clothes and shoes, and book movie tickets, railway tickets, flight tickets, and more.
o Pay Bills: You can pay your bills online, such as electricity bills, gas bills, college fees, etc.
o Online Banking: It allows you to use internet banking in which you can check your
balance, receive or transfer money, get a statement, request cheque-book, etc.
o Online Selling: You can sell your products or services online. It helps you reach more
customers and thus increases your sales and profit.
o Work from Home: In case you need to work from home, you can do it using a system
with internet access. Today, many companies allow their employees to work from
home.
o Entertainment: You can listen to online music, watch videos or movies, play online
games.
o Cloud computing: It enables you to connect your computers and internet-enabled
devices to cloud services such as cloud storage, cloud computing, etc.
o Career building: You can search for jobs online on different job portals and send your CV through email if required.
For more knowledge about how the internet works, please check the link provided:
https://www.youtube.com/watch?v=x3c1ih2NJEg&ab_channel=LearnEngineering

Lesson 3: Intranet and Extranet
Intranet
The intranet is a private network that belongs
to a particular organization. It is designed for
the exclusive use of an organization and its
associates, such as employees, customers,
and other authorized people. It offers a
secure platform to convey information and
share data with authorized users.
Confidential information, database, links,
forms, and applications can be made
available to the staff through the intranet. So,
it is like a private internet or an internal
website that is operating within an organization to provide its employees access to its
information and records. Each computer in an intranet is identified by a unique IP address.
It is based on internet protocols (TCP/IP) and is protected from unauthorized access with firewalls
and other security systems. The firewall monitors the incoming and outgoing data packets to
ensure they don't contain unauthorized requests. So, users on the intranet can access
the internet, but the internet users can't access the intranet if they are not authorized for it.
Furthermore, to access the intranet, the authorized user is required to be connected to
its LAN (Local Area Network).
Some of the benefits of the intranet are:
o It is cheap and easy to implement and run, and is safer than the internet and extranet.
o It streamlines communication that enables the company to share its data, information,
and other resources among employees without any delay. The entire staff can receive
company's announcements, ask questions, and access internal documents.
o It provides a secure space to store and develop applications to support business
operations.
o It improves the efficiency of the company by speeding up workflow and reducing errors.
Thus, it helps achieve targets by completing the tasks on time.
o It offers a testing platform for new ideas before they are uploaded on the company's internet webpage. Thus, it helps maintain the credibility of the company.
o Information is shared in real-time, or updates are reflected immediately to all the
authorized users.
o Modern intranets also offer a mobile app that allows employees to stay connected on the
go.
o It aids in project management and tracking workflow and teams' progress.
o It can work with mobile devices, which means it can provide information that exists on
intranet directly to mobile devices of employees such as phones, tablets, etc.
o It can also be used to motivate employees, facilitate employee recognition, and to reward
them for performing beyond expectations.
How the Intranet Works:
Intranet basically comprises three components: a web server, an intranet platform, and
applications. The web server is hardware that contains all the intranet software and data. It
manages all requests for files hosted on the server, finds the requested files, and then delivers them to the user's computer.
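To make the web-server role more concrete, here is a minimal sketch using Python's built-in http.server module. It simply serves the files in the current directory to browsers on the local network, which is only a toy stand-in for a real intranet web server.

```python
# Minimal file-serving web server: a toy stand-in for the intranet web server
# described above. It serves files from the current directory on port 8000.
from http.server import HTTPServer, SimpleHTTPRequestHandler

server = HTTPServer(("0.0.0.0", 8000), SimpleHTTPRequestHandler)
print("Serving files at http://localhost:8000 ...")
server.serve_forever()
```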
The intranet platform, which is
software, allows communication tools,
collaboration apps, and databases to
work seamlessly with each other. It is
tailored to the specific needs of a
business.
The applications are required to enable
users to work smoothly. They are the
computing tools that allow users to do
their work, communicate, and
coordinate with each other and
retrieve and store information.
Furthermore, the user who wants to access the intranet is required to have a special network
password and should be connected to the LAN. A user who is working remotely can gain access
to the intranet through a virtual private network (VPN) that allows them to sign in to the intranet
to access the information.
Disadvantages of Intranet:
o It may be costly to set up an Intranet due to hidden costs and complexity.
o If the firewall does not work properly or is not installed, the intranet can be hacked by someone.
o High-security passwords are required, which cannot be guessed by outside users.
o There is always a fear of losing control over the intranet.
o Sometimes document duplication may happen, which can cause confusion among employees.
o You have to give access to multiple users, so you may find it hard to control this network.
Examples of Intranet:
Educational Intranet: It is generally found in a school, college, etc. For example, a school intranet is intended to allow teaching staff to communicate with each other and get information about upcoming updates such as exam dates, school functions, holidays, etc.
Real Estate Intranet: The intranet of a real estate company allows its sales team to have access
to all important brochures, templates, forms that they may need to close a sale. Employees also
remain up to date with important events like meetings, training, sessions, etc. It can also be used
to share motivational messages with the team.
Health Care Intranet: In the healthcare sector, in big hospitals, the Intranet helps health care
professionals to work as a team to provide proper care and treatment to their patients. Doctors can share reports and treatment procedures, and bills and claims can be settled easily without moving from one department to another.
IT Sector Intranet: In the IT sector there is always a lot of information that needs to be shared
with all the employees at one go. It may be related to a project that needs to be completed within
the given time frame, such as guidelines, terms and conditions, and rules that are to be followed
while working on a project.
Difference between Intranet and Internet:

o Internet: It is a medium, such as optical fiber cable, that connects billions of computers with each other to establish a worldwide network. Intranet: It is a small, private network, as it belongs to a specific organization.
o Internet: It has billions of users, as it is a public network with a worldwide presence. Intranet: It has limited users.
o Internet: It is not as safe as an intranet. Intranet: It is a safer network than the internet.
o Internet: It can be accessed or used by anyone with an internet-enabled device, such as a laptop or mobile phone. Intranet: Only authorized persons can use this network.
o Internet: It offers a wide range of information, such as news, blogs, websites, etc. Intranet: It offers limited information related to its organization's work, policies, updates, etc.
o Internet: It is not owned by a single person or an organization. Intranet: It can be owned by a person or an organization.
Extranet
Extranet is a part of an organization's intranet. It is a communication network that is based on
internet protocols (TCP/IP). It provides controlled access to a firm's intranet to its trading partners,
customers, and other businesses. So, it is a private network that securely shares internal
information and operations of a firm with authorized people outside the firm without giving
access to the company's entire network. The users are required to have IDs, passwords, and other
authentication mechanisms to access this network.
Some of the benefits of extranet:
o It acts as a single interface between the company and its trading partners.
o It automates the firm's processes, such as automatically placing an order with suppliers when inventory drops.
o It improves customer service by providing customers a platform to resolve their queries
and complaints.
o It enables the firm to share information with trading partners without engaging in paper-
based publishing processes.
o It streamlines business processes that are repetitive in nature, such as ordering from a
vendor on a regular basis.
How is Extranet Established?
It is set up in the form of a Virtual Private Network as it is prone to security threats due to the
use of the internet to connect outsiders to an organization's intranet. VPN can assure you a safe
network in a public network such as the internet. The transmission control protocol (TCP) and
internet protocol (IP) are used for the data transfer.
VPN assures secure transactions based on Internet Protocol Security Architecture (IPSEC)
protocol as it provides an extra security layer to TCP/IP protocol, which is used for data transfer
in the extranet. In this layer, the original IP packet is encapsulated inside a new IP packet.
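The following is a purely conceptual sketch of that encapsulation step, not an actual IPsec implementation: the original packet, with its private addresses, simply becomes the payload of a new packet that carries the public addresses of the VPN gateways (all addresses shown are example values).

```python
# Conceptual sketch of tunneling/encapsulation, NOT a real IPsec implementation.
# The original ("inner") packet becomes the payload of a new ("outer") packet.
inner_packet = {
    "src": "10.0.0.5",       # private address inside the organization's network
    "dst": "10.0.0.20",
    "payload": "order data",
}

outer_packet = {
    "src": "203.0.113.1",    # example public addresses of the VPN gateways
    "dst": "198.51.100.7",
    "payload": inner_packet,  # in IPsec this payload would also be encrypted
}

print(outer_packet)
```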
Furthermore, to provide more security to the intranet, the following two measures are also taken by an organization:
o Firewall: It prevents unauthorized users from accessing the extranet.
o Passwords: They also prevent unauthorized users, including the company's own employees, from accessing the data stored on its server.
Limitations of Extranet:
o Hosting: If you host extranet pages on your own server, it requires a high-bandwidth internet connection, which may be very expensive.
o Security: You need extra firewall security if you host it on your own server. It increases
the workload and makes security mechanism very complex.
o Dependency: It is dependent on the internet as outsiders cannot access information
without using the internet.
o Less Interaction: It reduces the face-to-face interaction between customers, business partners, vendors, etc., which results in poorer relationship building.
For more knowledge about internet, intranet and extranet, please check the link provided;
https://www.youtube.com/watch?v=wjocrAP0t60&ab_channel=GreggLearning
REFERENCES
https://www.javatpoint.com/internet
https://www.javatpoint.com/intranet
CHAPTER 7: THE WORLD WIDE WEB

Objectives:
a) Discuss the definition and components of the World Wide Web.
b) Identify the differences between the internet and the World Wide Web.
c) Discover how the World Wide Web works.

Lesson 1: World Wide Web
What is World Wide Web?
World Wide Web, which is also known as a Web, is a collection of websites or web pages stored
in web servers and connected to local computers through the internet. These websites contain
text pages, digital images, audio, videos, etc. Users can access the content of these sites from any part of the world over the internet using devices such as computers, laptops, cell phones, etc. The WWW, along with the internet, enables the retrieval and display of text and media on your device.
The building blocks of the Web are
web pages which are formatted in
HTML and connected by links called
"hypertext" or hyperlinks and
accessed by HTTP. These links are
electronic connections that link
related pieces of information so that
users can access the desired
information quickly. Hypertext offers the advantage of selecting a word or phrase from the text and thus accessing other pages that provide additional information related to that word or phrase.
A web page is given an online address called a Uniform Resource Locator (URL). A particular
collection of web pages that belong to a specific URL is called a website,
e.g., www.facebook.com, www.google.com, etc. So, the World Wide Web is like a huge electronic
book whose pages are stored on multiple servers across the world.
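To give a sense of what a URL actually contains, the sketch below uses Python's urllib.parse to break an example web address into its parts; the address itself is only an illustrative value.

```python
from urllib.parse import urlparse

# Break an example URL into its parts; the address is an arbitrary illustration.
url = "https://www.example.com/articles/www-history?page=2"
parts = urlparse(url)

print(parts.scheme)  # "https" - the protocol used to fetch the page
print(parts.netloc)  # "www.example.com" - the website (domain name)
print(parts.path)    # "/articles/www-history" - which page on the site
print(parts.query)   # "page=2" - extra parameters sent to the server
```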
Small websites store all of their web pages on a single server, but big websites or organizations place their web pages on different servers in different countries, so that when users in a country visit the site they can get the information quickly from the nearest server.
So, the web provides a communication platform for users to retrieve and exchange information
over the internet. Unlike a book, where we move from one page to another in sequence, on the World Wide Web we follow a web of hypertext links to visit a web page and from that web page
to move to other web pages. You need a browser, which is installed on your computer, to access
the Web.
Difference between World Wide Web and Internet:
Some people use the terms 'internet' and 'World Wide Web' interchangeably. They think they
are the same thing, but it is not so. Internet is entirely different from WWW. It is a worldwide
network of devices like computers, laptops, tablets, etc. It enables users to send emails to other
users and chat with them online. For example, when you send an email or chat with someone online, you are using the internet.
But, when you open a website like google.com for information, you are using the World Wide Web, a network of servers over the internet. You request a webpage from your computer using a browser, and the server returns that page to your browser. Your computer is called a client: it runs a program (the web browser) and asks the other computer (the server) for the information it needs.
For more knowledge about internet vs www, please check the link provided;
https://www.youtube.com/watch?v=CX_HyY3kbZw&ab_channel=TheTechCave

Lesson 2: History of the World Wide Web
History of the World Wide Web:
The World Wide Web was invented by a British scientist, Tim Berners-Lee in
1989. He was working at CERN at that time. Originally, it was developed by
him to fulfill the need of automated information sharing between scientists
across the world, so that they could easily share the data and results of their
experiments and studies with each other.
CERN, where Tim Berners-Lee worked, is a community of more than 1700 scientists from more than 100 countries. These scientists spend some time on the CERN site, and the rest of the time they work at their universities and national laboratories in their home countries, so there was a need for reliable communication tools so that they could exchange information.
The internet and hypertext were available at this time, but no one had thought about how to use the internet to link or share one document with another. Tim focused on three main technologies that could
make computers understand each other, HTML, URL, and HTTP. So, the objective behind the
invention of WWW was to combine recent computer technologies, data networks, and hypertext
into a user-friendly and effective global information system.
How the Invention Started:
In March 1989, Tim Berners-Lee took the initiative towards the invention of WWW and wrote the
first proposal for the World Wide Web. Later, he wrote another proposal in May 1990. After a
few months, in November 1990, along with Robert Cailliau, it was formalized as a management
proposal. This proposal had outlined the key concepts and defined terminology related to the
Web. In this document, there was a description of "hypertext project" called World Wide Web in
which a web of hypertext documents could be viewed by browsers. His proposal included the
three main technologies (HTML, URL, and HTTP).
In 1990, Tim Berners-Lee was able to run the first Web server and browser at CERN to
demonstrate his ideas. He used a NeXT computer to develop the code for his Web server and put
a note on the computer "The machine is a server. Do Not Power It DOWN!!" So that it was not
switched off accidentally by someone.
In 1991, Tim created the world's first website and Web Server. Its address was info.cern.ch, and
it was running at CERN on the NeXT computer. Furthermore, the first web page address
was http://info.cern.ch/hypertext/WWW/TheProject.html. This page had links to the
information related to the WWW project, and also about the Web servers, hypertext description,
and information for creating a Web server.
The Web Grows:
The NeXT computer platform was accessible to only a few users. Later, the development of a 'line-mode' browser, which could run on any system, started. In 1991, Berners-Lee introduced his WWW software with the 'line-mode' browser, web server software and a library for developers.
In March 1991, it was available to colleagues who were using CERN computers. After a few months, in August 1991, he introduced the WWW software on internet newsgroups, and it generated interest in the project across the world. The graphical interface for the internet was first introduced to the public on 6 August 1991 by Tim Berners-Lee, and on 23 August 1991 it became available to everyone.
Becoming Global:
The first Web server came online in December 1991 in the United States. At this time, there were
only two types of browsers; the original development version which was available only on NeXT
machines and the 'line-mode' browser which was easy to install and run on any platform but was
less user-friendly and had limited power.
For further improvement, Berners-Lee asked other developers via the internet to contribute to
its development. Many developers wrote browsers for the X Window System. The first web server outside Europe was introduced at Stanford University in the United States in 1991. In the same year, there were only ten known web servers across the world.
Later at the beginning of 1993, the National Center for Supercomputing Applications (NCSA)
introduced the first version of its Mosaic browser. It ran in the X Window System environment.
Later, the NCSA released versions for the PC and Macintosh environments. With the introduction
of user-friendly browsers on these computers, the WWW started spreading tremendously across
the world.
Eventually, the European Commission approved its first web project in the same year with CERN
as one of its partners. In April 1993, CERN made the source code of WWW available on a royalty-
free basis and thus made it free software. Royalty-free means one has the right to use copyright
material or intellectual property without paying any royalty or license fee. Thus, CERN allowed
people to use the code and web protocols for free. The technologies that were developed to make the WWW became open source, allowing people to use them for free. Eventually, people started creating websites for online businesses, to provide information and other similar
purposes.
At the end of 1993, there were more than 500 web servers, and the WWW accounted for 1% of the total internet traffic. In May 1994, the First International World Wide Web Conference was held at CERN; it was attended by around 400 users and developers and is popularly known as the "Woodstock of the Web." In the same year, telecommunication companies started providing internet access, and people had access to the WWW at their homes.
In the same year, one more conference was held in the United States, attended by over 1000 people. It was organized by the NCSA and the newly formed International WWW Conference Committee (IW3C2). At the end of that year (1994), the World Wide Web had around 10,000 servers and 10 million users. The technology was continuously improved to fulfill growing needs, and security and e-commerce tools would soon be added.
Open standards:
The main objective was to keep the Web an open standard for all rather than a proprietary
system. Accordingly, CERN sent a proposal to the Commission of the European Union under the
ESPRIT program "WebCore." This project's objective was to form an international consortium in
collaboration with the Massachusetts Institute of Technology (MIT) in the US. In 1994, Berners-Lee left CERN, joined MIT, and established the International World Wide Web Consortium (W3C), and
a new European partner was needed for W3C.
The European Commission approached the French National Institute for Research in Computer
Science and Controls (INRIA), to substitute the CERN's role. Eventually, in April 1995, INRIA
became the first European W3C host and in 1996 Keio University of Japan became another host
in Asia.
In 2003, ERCIM (European Research Consortium in Informatics and Mathematics) replaced INRIA
for the role of European W3C Host. Beihang University was announced as the fourth Host by W3C
in 2013. In September 2018, there were over 400 member organizations around the world.
Since its inception, the Web has changed a lot and is still changing today. Search engines have
become more advanced at reading, understanding, and processing information. They can easily
find the information requested by users and can even provide other relevant information that
might interest users.
For more knowledge about the history of the world wide web, please check the link provided;
https://www.youtube.com/watch?v=WlryJFlyr10&ab_channel=Mashable

Lesson 3: How the World Wide Web Works?
How the World Wide Web Works?
Now, we have understood that WWW is a collection of websites connected to the internet so
that people can search and share information. Now, let us understand how it works!
The Web works as per the internet's basic client-
server format as shown in the following image. The
servers store and transfer web pages or
information to user's computers on the network
when requested by the users. A web server is a
software program which serves the web pages
requested by web users using a browser. The
computer of a user who requests documents from
a server is known as a client. The browser, which is installed on the user's computer, allows users to view the retrieved documents.
All the websites are stored in web servers. Just as someone lives on rent in a house, a website
occupies space on a server and remains stored there. The server serves the website's web pages whenever a user requests them, and the website owner has to pay the hosting price for the same.
The moment you open the browser and type a URL in the address bar or search something on
Google, the WWW starts working. There are three main technologies involved in transferring
information (web pages) from servers to clients (computers of users). These technologies include
Hypertext Markup Language (HTML), Hypertext Transfer Protocol (HTTP) and Web browsers.
Hypertext Markup Language (HTML):
HTML is a standard markup language which is used for creating
web pages. It describes the structure of web pages through HTML
elements or tags. These tags are used to organize the pieces of
content such as 'heading,' 'paragraph,' 'table,' 'Image,' and more.
You don't see HTML tags when you open a webpage as browsers
don't display the tags and use them only to render the content of
a web page. In simple words, HTML is used to display text, images,
and other resources through a Web browser.
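As a small illustration of how tags mark up structure, the sketch below feeds a tiny made-up HTML snippet (written inline as a Python string) to Python's built-in html.parser and prints the tags and text it finds.

```python
from html.parser import HTMLParser

# A tiny made-up page: tags such as <h1> and <p> describe the structure.
page = "<html><body><h1>My Page</h1><p>Hello, web!</p></body></html>"

class TagPrinter(HTMLParser):
    def handle_starttag(self, tag, attrs):
        print("start tag:", tag)

    def handle_data(self, data):
        print("content  :", data)

TagPrinter().feed(page)
```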
Web Browser:
A web browser, which is commonly known as a browser, is a program that displays text, data,
pictures, videos, animation, and more. It provides a software interface that allows you to click
hyperlinked resources on the World Wide Web. When you double click the Browser icon installed
on your computer to launch it, you get connected to the World Wide Web and can search Google
or type a URL into the address bar.
In the beginning, browsers were used only for browsing due to their limited potential. Today,
they are more advanced; along with browsing you can use them for e-mailing, transferring
multimedia files, using social media sites, and participating in online discussion groups and more.
Some of the commonly used browsers include Google Chrome, Mozilla Firefox, Internet Explorer,
Safari, and more.
Hypertext Transfer Protocol (HTTP):
Hyper Text Transfer Protocol (HTTP) is an
application layer protocol which enables WWW
to work smoothly and effectively. It is based on
a client-server model. The client is a web
browser which communicates with the web
server which hosts the website. This protocol
defines how messages are formatted and
transmitted and what actions the Web Server
and browser should take in response to different commands. When you enter a URL in the
browser, an HTTP command is sent to the Web server, and it transmits the requested Web Page.
When we open a website using a browser, a connection to the web server is opened, and the
browser communicates with the server through HTTP and sends a request. HTTP is carried over
TCP/IP to communicate with the server. The server processes the browser's request and sends a
response, and then the connection is closed. Thus, the browser retrieves content from the server
for the user.
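To see that request/response cycle in code, the short sketch below performs an HTTP GET request with Python's standard urllib and prints the status code and the beginning of the returned page; example.com is used purely as a demonstration address.

```python
from urllib.request import urlopen

# Perform an HTTP GET request; example.com is only a demonstration address.
with urlopen("http://example.com/") as response:
    print("Status:", response.status)      # e.g. 200 for a successful request
    body = response.read().decode("utf-8")
    print(body[:200])                       # the first part of the returned HTML
```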
For more knowledge about how the world wide web works, please check the link provided;
https://www.youtube.com/watch?v=hJHvdBlSxug&ab_channel=Academind

REFERENCES
https://www.javatpoint.com/what-is-world-wide-web
CHAPTER 8: INTRODUCTION TO SOCIAL MEDIA
Objectives:
a.) Discuss the brief history of Social Media.
b.) Provide an overview and definition of social media; and
c.) Discuss the different types of social media.
Lesson 1: Overview
The International Telecommunication Union (ITU)—the United Nations
specialized agency for information and communication technologies (ICTs)—has
observed that: “Social media has emerged in recent years as an essential tool for
hundreds of millions of Internet users
worldwide and a defining element of the Internet generation. By one estimate, over half (62 per cent) of adults worldwide already use social media.”
According to the Nielsen’s Social Media Report 2012, product research is a dominant use for
social media: Social media’s influence on purchase intent is strong across all regions, but
strongest among online consumers in the Asia-Pacific, Latin America and Middle East/Africa
markets. Thirty per cent of online consumers in the Middle East/Africa region and 29% in Asia-
Pacific use social media on a daily basis to learn more about brands/products/services, with one-
third of respondents in both regions connecting on a weekly basis.
Introduction - Brief History of Social Media

Social Media before 1900
The earliest methods of communicating across great distances
used written correspondence delivered by hand from one
person to another. In other words, letters. The earliest form
of postal service dates back to 550 B.C., and this primitive
delivery system would become more widespread and
streamlined in future centuries.
In 1792, the telegraph was invented. This allowed messages
to be delivered over a long distance far faster than a horse and
rider could carry them. Although telegraph messages were
short, they were a revolutionary way to convey news and information.
Although no longer popular outside of drive-through banking, the pneumatic post, developed in
1865, created another way for letters to be delivered quickly between recipients. A pneumatic
post utilizes underground pressurized air tubes to carry capsules from one area to another.
Two important inventions also reshaped communication in the 1800s: the telephone, patented in 1876, and the radio, developed in the 1890s.
Both technologies are still in use today, although the modern versions are much more
sophisticated than their predecessors. Telephone lines and radio signals enabled people to
communicate across great distances instantaneously, something that mankind had never
experienced before.
Social Media in the 20th Century
Technology began to change very rapidly in the 20th century. After the first supercomputers were created in the 1940s, scientists and engineers began to develop ways to create networks between those computers, and this would later lead to the birth of the Internet.
The earliest forms of the Internet, such as CompuServe, were developed in the 1960s. Primitive
forms of email were also developed during this time. By the 70s, networking technology had
improved, and 1979’s UseNet allowed users to communicate through a virtual newsletter.
By the 1980s, home computers were becoming more common and social media was becoming
more sophisticated. Internet Relay Chat, or IRC, was first used in 1988 and continued to be popular well into the 1990s.
The first recognizable social media site, Six Degrees, was created in 1997. It enabled users to
upload a profile and make friends with other users. In 1999, the first blogging sites became
popular, creating a social media sensation that’s still popular today.
Social Media Today
After the invention of blogging, social
media began to explode in popularity.
Sites like MySpace and LinkedIn gained
prominence in the early 2000s, and sites
like Photobucket and Flickr facilitated
online photo sharing. YouTube came out
in 2005, creating an entirely new way for
people to communicate and share with
each other across great distances.
By 2006, Facebook and Twitter both became available to users throughout the world. These sites
remain some of the most popular social networks on the Internet. Other sites like Tumblr, Spotify,
Foursquare and Pinterest began popping up to fill specific social networking niches.
Today, there is a tremendous variety of social networking sites, and many of them can be linked
to allow cross-posting. This creates an environment where users can reach the maximum number
of people without sacrificing the intimacy of person-to-person communication. We can only speculate about what the future of social networking may look like in the next decade or even 100 years from now, but it seems clear that it will exist in some form for as long as humans are alive.
For more knowledge about history of social media, please check the link provided;
https://www.youtube.com/watch?v=cw0jRD7mn1k&ab_channel=Techquickie

Lesson 2: What is Social Media?

Social media is a group of Internet-based
applications that build on the ideological and
technological foundations of Web 2.0, and that
allow the creation and exchange of user-
generated content.
A related definition of social media is that they
are platforms that provide users the ability and
tools to create and publish their own mini web
sites or web pages.
There are also definitions in “techno-biological” terms:
Social media is an ever-growing and evolving collection of online tools and toys, platforms and
applications that enable all of us to interact with and share information. Increasingly, it’s both
the connective tissue and neural net of the web.
A useful definition is one that identifies its essential elements:
Social media is best understood as a group of new kinds of online media, which share most or all
of the following characteristics:
➢ Participation – Social media encourages contributions and feedback from everyone who is
interested. It blurs the line between media and audience.
➢ Openness – Most social media services are open to feedback and participation. They
encourage voting, comments and the sharing of information. There are rarely any barriers to accessing and making use of content, and password-protected content is frowned on.
➢ Conversation – Whereas traditional media is about “broadcast” (content transmitted or distributed to an audience), social media is better seen as a two-way conversation.
➢ Community – Social media allows communities to form quickly and communicate
effectively. Communities share common interests, such as a love of photography, a political
issue or a favourite TV show.
➢ Connectedness – Most kinds of social media thrive on their connectedness, making use of
links to other sites, resources and people.
For more knowledge about social media, please check the link provided;
https://www.youtube.com/watch?v=uFxX3tXLaGU

Lesson 3: Types of Social Media
Different Types of Social Media

Social Networks
➢ These sites provide services that allow the users to connect with people of similar
interests and background.
➢ On a social network, users usually have a profile, groups, messaging, photos and multiple other ways to interact with other users.
➢ eg: Facebook, WhatsApp, WeChat, Tumblr, Twitter, Instagram, Google+, Skype, Viber, Line, SnapChat, MySpace, LinkedIn, Friendster.
Bookmarking Sites
➢ These sites allow the users to save, organize, and manage links to websites and any
number of online resources.
➢ On these sites, the users have the ability to tag
links, making those links easier to search for and
find again, or share with other followers.
➢ eg: StumbleUpon, Pinterest, Dribble, Pocket,
Digg, Reddit, Slashdot, scoop.it
Social News
➢ These sites allow the users to post news links, articles, videos, pictures, and other items that link to outside articles.
➢ Users vote on the items and the items with the


highest number of votes are displayed the most
prominently with the most visibility on the sites.
➢ Eg: Digg, Reddit, Propeller, Frak, Slashdot,
Metafilter, Mixx, Shoutwire, Newsvine, Linkfilter,
NewsCloud.
Media Sharing
➢ These sites allow the users to upload, and share different types of media, such as
pictures and video.
➢ Most of these sites also have social features, like the ability to create profiles and to comment on other material.
➢ eg: YouTube, Flickr, Vimeo, Instagram, Snapchat
Micro Blogging
➢ This site allows the users to submit short written entries, photos, or media, such as
links to product and service sites, as well as
links to other social media sites.
➢ These short updates can be seen by anyone
subscribed to receive the updates. eg:
Twitter, FriendFeed, Tumblr, Dailybooth,
Youtube

Blogging, Blog Comments and Online Forums

➢ Blogs – are sites of information and discussion.
➢ Blog Comments – are usually centered around the specific subject of the attached blog.
➢ Online Forum – is a site that allows the users to
engage in conversations by posting and responding to community messages.
➢ eg: Warrior Forum, Wicked Fire, Digital point Forum,
Business Forum
For more knowledge about types of social media, please check the link provided;
https://www.youtube.com/watch?v=LgarEgc3PTc&ab_channel=BusinessWales%2FBusnesCymru
REFERENCES

➢ https://www.slideshare.net/NoirPiggott/a-brief-history-of-social-media-by-c-piggott-2012
➢ https://www.epa.gov/sites/production/files/2014-03/documents/social-media.pdf
➢ http://143.127.10.117/content/en/us/enterprise/media/security_response/whitepapers/the_risks_of_social_networking.pdf
➢ http://www.unapcict.org/sites/default/files/2018-12/M11_Final_Web%20%281%29.pdf
➢ https://smallbiztrends.com/2013/05/the-complete-history-of-social-media-infographic.html
➢ https://www.brandwatch.com/blog/10-popular-social-bookmarking-websites/
➢ https://makeawebsitehub.com/social-media-sites/
➢ https://issaasad.org/list-top-50-social-news-websites/
CHAPTER 9: INTRODUCTION TO CYBER SECURITY

Objectives:
a) Discuss the importance of Cybersecurity.
b) Identify the goals of Cybersecurity.
c) Discover the different types of cyber-attacks and attackers.
Lesson 1: Cyber Security Overview
Cyber Security Introduction
"Cybersecurity is primarily about people, processes, and technologies working together
to encompass the full range of threat reduction, vulnerability reduction, deterrence,
international engagement, incident response, resiliency, and recovery policies and activities,
including computer network operations, information assurance, law enforcement, etc."
Cybersecurity is the protection of internet-connected systems, including hardware, software, and data, from cyber attacks. It is made up of two words: cyber and security. Cyber relates to the technology, which includes systems, networks, programs and data, whereas security relates to protection, which includes systems security, network security, and application and information security.
It is the body of technologies, processes, and practices designed to protect networks, devices,
programs, and data from attack, theft, damage, modification or unauthorized access. It may also
be referred to as information technology security.
We can also define cybersecurity as
the set of principles and practices
designed to protect our computing
resources and online information
against threats. Due to the heavy dependency on computers in modern industry, which store and transmit an abundance of confidential and essential information about people, cybersecurity is a critical function and a needed safeguard for many businesses.
Why is cybersecurity important?
We live in a digital era in which our private information is more vulnerable than ever before. We
all live in a world which is networked together, from internet banking to government
infrastructure, where data is stored on computers and other devices. A portion of that data can
be sensitive information, whether that be intellectual property, financial data, personal
information, or other types of data for which unauthorized access or exposure could have
negative consequences.
Cyber-attack is now an international concern and has raised fears that hacks and other security
attacks could endanger the global economy. Organizations transmit sensitive data across
networks and to other devices in the course of doing business, and cybersecurity describes the
discipline of protecting that information and the systems used to process or store it.
As the volume of cyber-attacks grows, companies and organizations, especially those that deal
with information related to national security, health, or financial records, need to take steps to
protect their sensitive business and personal information.
History of Cyber Security
The origin of cybersecurity began with a research project. It only came into existence because of
the development of viruses.
How did we get here?

In 1969, Leonard Kleinrock, a professor at UCLA, and his student, Charley Kline, sent the first
electronic message from the UCLA SDS Sigma 7 host computer to Bill Duvall, a programmer at
the Stanford Research Institute. This is a well-known story and a moment in the history of the
digital world. The message sent from UCLA was the word "login." The system crashed after they
typed the first two letters, "lo." Since then, the story has sometimes been retold as if the
programmers had typed the portentous message "lo and behold," when in fact "login" was the
intended message. Those two letters changed the way we communicate with one another.


In the 1970s, Robert (Bob) Thomas, a researcher for BBN Technologies in Cambridge,
Massachusetts, created the first computer worm (virus). He realized that it was possible for a
computer program to move across a network, leaving a small trail (series of signs) wherever it
went. He named the program Creeper, and designed it to travel between Tenex terminals on the
early ARPANET, printing the message "I'M THE CREEPER: CATCH ME IF YOU CAN."
An American computer programmer named Ray Tomlinson, the inventor of email, was also
working for BBN Technologies at the time. He saw this idea and liked it. He tinkered (an act of
attempting to repair something) with the program and made it self-replicating, the first
computer worm. He then wrote another program named Reaper, the first antivirus software,
which would find copies of Creeper and delete them.
Where are we now?
After Creeper and Reaper, cyber-crimes became more powerful. As computer software and
hardware developed, security breaches also increased. With every new development came an
aspect of vulnerability, or a way for hackers to work around methods of protection. In 1986, the
Russians were among the first to use cyber power as a weapon. Marcus Hess, a German
citizen, hacked into 400 military computers, including processors at the Pentagon. He intended
to sell secrets to the KGB, but an American astronomer, Clifford Stoll, caught him before that
could happen.
In 1988, an American computer scientist, Robert Morris, wanted to check the size of the internet.
He wrote a program for testing the size of the internet. This program went through networks,
invaded Unix terminals, and copied itself. The program became the first famous network virus,
named the Morris worm or internet worm. The Morris worm could infect a computer
multiple times, and each additional process would slow the machine down, eventually to the
point of being unusable. Robert Morris was charged under the Computer Fraud and Abuse Act.
The act itself led to the founding of the Computer Emergency Response Team, a non-profit
research centre for issues that could endanger the internet as a whole.
Nowadays, viruses are deadlier, more invasive, and harder to control. We have already
experienced cyber incidents on a massive scale, and these examples are only a few of many; they
are enough to prove that cybersecurity is a necessity for corporations and small businesses alike.


For more knowledge about cybersecurity, please check the link provided:
https://www.youtube.com/watch?v=inWWhr5tnEA&ab_channel=Simplilearn

Lesson 2: Cyber Security Goals


Cyber Security Goals
The objective of Cybersecurity is to protect information from being stolen, compromised or
attacked. Cybersecurity can be measured by at least one of three goals-
1. Protect the confidentiality of data.
2. Preserve the integrity of data.
3. Promote the availability of data for authorized users.

These goals form the confidentiality, integrity, availability (CIA) triad, the basis of all security
programs. The CIA triad is a security model that is designed to guide policies for information
security within the premises of an organization or company. This model is also referred to as
the AIC (Availability, Integrity, and Confidentiality) triad to avoid the confusion with the Central
Intelligence Agency. The elements of the triad are considered the three most crucial components
of security.
The CIA criteria are the ones most organizations and companies use when they install a new
application, create a database, or guarantee access to some data. For data to be completely
secure, all of these security goals must come into effect. These are security policies that all work
together, and it would therefore be wrong to overlook any one of them.
The CIA triad are-
1. Confidentiality
Confidentiality is roughly equivalent to privacy
and avoids the unauthorized disclosure of
information. It involves the protection of data,
providing access for those who are allowed to
see it while disallowing others from learning
anything about its content. It prevents essential
information from reaching the wrong people
while making sure that the right people can get
it. Data encryption is a good example to ensure
confidentiality.


Tools for Confidentiality


Encryption
Encryption is a method of transforming information to
make it unreadable for unauthorized users by using an
algorithm. The transformation of data uses a secret key
(an encryption key) so that the transformed data can only
be read by using another secret key (decryption key). It
protects sensitive data such as credit card numbers by
encoding and transforming data into unreadable cipher
text. This encrypted data can only be read by decrypting
it. Asymmetric-key and symmetric-key are the two
primary types of encryption.
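
To make the idea concrete, here is a minimal sketch of symmetric-key encryption in Python,
assuming the third-party cryptography package is installed; the key and the sample message
are illustrative only.

from cryptography.fernet import Fernet

# The same secret key encrypts and decrypts (symmetric-key encryption).
key = Fernet.generate_key()
cipher = Fernet(key)

plaintext = b"Card number: 4111 1111 1111 1111"   # illustrative sensitive data
ciphertext = cipher.encrypt(plaintext)            # unreadable cipher text
recovered = cipher.decrypt(ciphertext)            # readable only with the key

print(ciphertext)
print(recovered == plaintext)   # True

In an asymmetric-key scheme, by contrast, the encryption key and the decryption key are
different, so the party that encrypts data never needs to know the secret decryption key.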
Access control
Access control defines rules and policies for limiting access to a system or to physical or virtual
resources. It is a process by which users are granted access and certain privileges to systems,
resources or information. In access control systems, users need to present credentials before
they can be granted access such as a person's name or a computer's serial number. In physical
systems, these credentials may come in many forms, but credentials that can't be transferred
provide the most security.
Authentication
An authentication is a process that ensures and confirms a user's identity or role that someone
has. It can be done in a number of different ways, but it is usually based on a combination of-
o something the person has (like a smart card or a radio key for storing secret keys),
o something the person knows (like a password),
o something the person is (like a human with a fingerprint).

Authentication is a necessity for every organization because it enables organizations to keep
their networks secure by permitting only authenticated users to access their protected resources.
These resources may include computer systems, networks, databases, websites and other
network-based applications or services.
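
As an illustration of the "something the person knows" factor, the following sketch (standard-library
Python only; the sample password is hypothetical) stores a salted, deliberately slow password hash
and then verifies a login attempt against it.

import hashlib, hmac, os

def hash_password(password, salt=None):
    # PBKDF2-HMAC-SHA256 with a random salt and many iterations slows guessing.
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password, salt, stored):
    # Recompute the hash and compare in constant time.
    _, candidate = hash_password(password, salt)
    return hmac.compare_digest(candidate, stored)

salt, stored = hash_password("correct horse battery staple")          # at sign-up
print(verify_password("correct horse battery staple", salt, stored))  # True
print(verify_password("wrong guess", salt, stored))                   # False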
Authorization

Authorization is a security mechanism which gives permission to do or have something. It is used
to determine whether a person or system is allowed access to resources, based on an access
control policy, including computer programs, files, services, data and application features. It is
normally preceded by authentication for user identity verification. System administrators are typically


assigned permission levels covering all system and user resources. During authorization, a system
verifies an authenticated user's access rules and either grants or refuses resource access.
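
The sketch below (plain Python; the roles, resources, and actions are invented for illustration)
shows the step that follows authentication: the system consults an access control policy and
either grants or refuses the requested access.

# A toy access control policy: role -> resource -> allowed actions.
POLICY = {
    "admin": {"payroll.db": {"read", "write"}, "website": {"read", "write"}},
    "staff": {"website": {"read"}},
}

def is_authorized(role, resource, action):
    # Grant access only if the policy lists this action for this role/resource.
    return action in POLICY.get(role, {}).get(resource, set())

print(is_authorized("admin", "payroll.db", "write"))   # True  -> access granted
print(is_authorized("staff", "payroll.db", "read"))    # False -> access refused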

Physical Security
Physical security describes measures designed to deny unauthorized access to IT assets such as
facilities, equipment, personnel, resources and other properties, and to protect them from damage.
It protects these assets from physical threats including theft, vandalism, fire and natural disasters.

2. Integrity
Integrity refers to the methods for ensuring that data is real, accurate and safeguarded from
unauthorized user modification. It is the property that information has not been altered in an
unauthorized way, and that the source of the information is genuine.
Tools for Integrity
Backups

Backup is the periodic archiving of data. It is a process of
making copies of data or data files to use in the event that
the original data or data files are lost or destroyed. It is also
used to make copies for historical purposes, such as for
longitudinal studies, statistics or for historical records, or to
meet the requirements of a data retention policy. Many
applications, especially in a Windows environment,
produce backup files using the .BAK file extension.
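
A very small example of the idea, using only the Python standard library (the file names are
hypothetical): copy a file into a backup folder under a timestamped .BAK name so the original
can be restored if it is lost or corrupted.

import shutil
from datetime import datetime
from pathlib import Path

def back_up(path, backup_dir="backups"):
    # Copy the file into backup_dir with a timestamped .BAK name.
    source = Path(path)
    target_dir = Path(backup_dir)
    target_dir.mkdir(exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    target = target_dir / f"{source.stem}-{stamp}.BAK"
    shutil.copy2(source, target)   # copy2 also preserves file metadata
    return target

# back_up("records.xlsx")   # e.g. creates backups/records-20240101-120000.BAK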
Checksums
A checksum is a numerical value used to verify the integrity of a file or a data transfer. In other
words, it is the computation of a function that maps the contents of a file to a numerical value.
They are typically used to compare two sets of data to make sure that they are the same. A
checksum function depends on the entire contents of a file. It is designed in a way that even a
small change to the input file (such as flipping a single bit) is likely to result in a different output
value.
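
For example, the short Python sketch below (file names are illustrative) computes a SHA-256
checksum over a file's entire contents; two copies match only if their checksums are identical.

import hashlib

def file_checksum(path):
    # Read the file in chunks and map its full contents to one hash value.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# original = file_checksum("report.pdf")
# received = file_checksum("report_downloaded.pdf")
# print(original == received)   # True only if the two files are identical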

Data Correcting Codes


It is a method for storing data in such a way that small changes can be easily detected and
automatically corrected.
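
As a concrete illustration, the classic Hamming(7,4) code stores 4 data bits together with 3 parity
bits so that any single flipped bit can be located and corrected automatically. The sketch below is
a minimal Python version of that idea.

def hamming_encode(d1, d2, d3, d4):
    # Three parity bits each cover a different subset of the data bits.
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]   # 7-bit codeword

def hamming_decode(code):
    c = list(code)
    # The syndrome bits point directly at the position of a single-bit error.
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    error_pos = 4 * s3 + 2 * s2 + s1       # 0 means no error detected
    if error_pos:
        c[error_pos - 1] ^= 1              # flip the corrupted bit back
    return [c[2], c[4], c[5], c[6]]        # recovered data bits

word = hamming_encode(1, 0, 1, 1)
word[4] ^= 1                               # simulate a single-bit error
print(hamming_decode(word))                # [1, 0, 1, 1] - corrected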

3. Availability


Availability is the property in which information is accessible and modifiable in a timely fashion
by those authorized to do so. It is the guarantee of reliable and constant access to our sensitive
data by authorized people.

Tools for Availability


o Physical Protections
o Computational Redundancies
Physical Protections
Physical safeguards keep information available even in the event of physical challenges. They
ensure that sensitive information and critical information technology are housed in secure areas.
Computational redundancies

It is applied as fault tolerance against accidental faults. It provides computers and storage devices
that serve as fallbacks in the case of failures.

For more knowledge about cybersecurity goals, please check the link provided;
https://www.youtube.com/watch?v=azLckMQtbs0&ab_channel=GrantCollins

Lesson 3: Types of Cyber Attacks and Attackers


Types of Cyber Attacks
A cyber-attack is an exploitation of computer systems and networks. It uses malicious
code to alter computer code, logic or data and lead to cybercrimes, such as information and
identity theft.
We are living in a digital era. Nowadays, most people use computers and the internet. Due to
this dependency on digital things, illegal computer activity is growing and changing like any
other type of crime.
Cyber-attacks can be classified into the following categories:
Web-based attacks
These are the attacks which occur on a website or
web applications. Some of the important web-based
attacks are as follows-
1. Injection attacks
It is the attack in which some data will be injected into
a web application to manipulate the application and fetch the required information.


Example: SQL Injection, Code Injection, Log Injection, XML Injection, etc.
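
To make the mechanism concrete, the sketch below (standard-library sqlite3; the table and the
attacker's input are invented for illustration) contrasts a query that pastes user input into the
SQL text with a parameterized query that treats the input as a plain value.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0), ('root', 1)")

user_input = "alice' OR '1'='1"   # crafted input supplied by an attacker

# VULNERABLE: the input becomes part of the SQL text, so the injected
# OR '1'='1' clause changes the query's logic and returns every row.
unsafe = f"SELECT * FROM users WHERE name = '{user_input}'"
print(conn.execute(unsafe).fetchall())              # leaks all users

# SAFE: the parameterized query treats the whole input as a literal name.
safe = "SELECT * FROM users WHERE name = ?"
print(conn.execute(safe, (user_input,)).fetchall()) # returns no rows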
2. DNS Spoofing
DNS Spoofing is a type of computer security hacking whereby data is introduced into a DNS
resolver's cache, causing the name server to return an incorrect IP address and diverting traffic to
the attacker's computer or any other computer. DNS spoofing attacks can go on for a long period
of time without being detected and can cause serious security issues.
3. Session Hijacking
It is a security attack on a user session over a protected network. Web applications create cookies
to store the state and user sessions. By stealing the cookies, an attacker can have access to all of
the user data.
4. Phishing
Phishing is a type of attack which attempts to steal sensitive information like user login
credentials and credit card number. It occurs when an attacker is masquerading as a trustworthy
entity in electronic communication.
5. Brute force
It is a type of attack which uses a trial-and-error method. This attack generates a large number of
guesses and validates them to obtain actual data like a user password or personal identification
number. This attack may be used by criminals to crack encrypted data, or by security analysts to
test an organization's network security.
6. Denial of Service
It is an attack which is meant to make a server or network resource unavailable to users. It
accomplishes this by flooding the target with traffic or sending it information that triggers a crash.
It uses a single system and a single internet connection to attack a server. It can be classified into
the following-
Volume-based attacks- Its goal is to saturate the bandwidth of the attacked site, and it is measured
in bits per second.
Protocol attacks- They consume actual server resources, and are measured in packets per second.
Application layer attacks- Its goal is to crash the web server, and it is measured in requests per
second.
7. Dictionary attacks

This type of attack uses a stored list of commonly used passwords and validates them against a
target to obtain the original password.
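
The sketch below (the word list and target hash are invented for illustration) shows how such an
attack validates its guesses against a stolen, unsalted hash, and hence why the salted, slow hashes
discussed earlier and uncommon passwords are the standard defenses.

import hashlib

common_passwords = ["123456", "password", "qwerty", "letmein"]
stolen_hash = hashlib.sha256(b"letmein").hexdigest()   # an unsalted stored hash

for guess in common_passwords:
    # Each candidate is hashed and compared with the stolen value.
    if hashlib.sha256(guess.encode()).hexdigest() == stolen_hash:
        print("Password recovered:", guess)
        break

A brute-force attack works in the same way, except that it generates its guesses exhaustively
instead of reading them from a prepared list.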


8. URL Interpretation
It is a type of attack where certain parts of a URL are changed so that a web server delivers web
pages the attacker is not authorized to browse.
9. File Inclusion attacks
It is a type of attack that allows an attacker to access unauthorized or essential files which are
available on the web server, or to execute malicious files on the web server, by making use of the
include functionality.
10. Man in the middle attacks
It is a type of attack that allows an attacker to intercept the connection between client and
server and act as a bridge between them. Due to this, an attacker will be able to read, insert and
modify the data in the intercepted connection.

System-based attacks

These are the attacks which are intended to compromise a computer or a computer network.
Some of the important system-based attacks are as follows-

1. Virus
It is a type of malicious software program that spreads throughout the computer files without the
knowledge of the user. It is a self-replicating malicious computer program that replicates by
inserting copies of itself into other computer programs when executed. It can also execute
instructions that cause harm to the system.

2. Worm
It is a type of malware whose primary function is to replicate itself to spread to uninfected
computers. It works in much the same way as a computer virus. Worms often originate from email
attachments that appear to be from trusted senders.
3. Trojan horse
It is a malicious program that causes unexpected changes to computer settings and unusual
activity, even when the computer should be idle. It misleads the user as to its true intent. It appears
to be a normal application, but when opened/executed some malicious code runs in the
background.


4. Backdoors
It is a method that bypasses the normal authentication process. A developer may create a
backdoor so that an application or operating system can be accessed for troubleshooting or other
purposes.
5. Bots
A bot (short for "robot") is an automated process that interacts with other network services.
Some bot programs run automatically, while others only execute commands when they receive
specific input. Common examples of bot programs are crawlers, chatroom bots, and malicious
bots.
Types of Cyber Attackers
In computer and computer networks, an attacker is the individual or organization who performs
the malicious activities to destroy, expose, alter, disable, steal or gain unauthorized access to or
make unauthorized use of an asset.
As Internet access becomes more pervasive across the world, and each of us spends more
time on the web, the number of attackers grows as well. Attackers use every tool and technique
they can to attack us and gain unauthorized access.
There are four types of attackers which are described below-
Cyber Criminals
Cybercriminals are individuals or groups of people who use
technology to commit cybercrime with the intention of
stealing sensitive company information or personal data
and generating profits. Today, they are the most
prominent and most active type of attacker.
Cybercriminals use computers in three broad ways to
commit cybercrimes-
o Select the computer as their target- In this, they
attack other people's computers to commit
cybercrime, such as spreading viruses, data theft,
identity theft, etc.
o Use the computer as their weapon- In this, they use the computer to commit conventional
crime such as spam, fraud, illegal gambling, etc.
o Use the computer as their accessory- In this, they use the computer to steal data
illegally.


Hacktivists
Hacktivists are individuals or groups of hackers who carry out malicious activity to promote a
political agenda, religious belief, or social ideology. According to Dan Lohrmann, chief security
officer for Security Mentor, a national security training firm that works with states, "Hacktivism
is digital disobedience. It's hacking for a cause." Hacktivists are not like cybercriminals who hack
computer networks to steal data for cash. They are individuals or groups of hackers who work
together and see themselves as fighting injustice.
State-sponsored Attacker

State-sponsored attackers have particular objectives aligned with either the political, commercial
or military interests of their country of origin. These types of attackers are not in a hurry. Their
government organizations have highly skilled hackers who specialize in detecting vulnerabilities
and exploiting them before the holes are patched. It is very challenging to defeat these attackers
due to the vast resources at their disposal.
Insider Threats
The insider threat is a threat to an organization's security or data that comes from within. These
types of threats usually come from employees or former employees, but may also arise from
third parties, including contractors, temporary workers, and customers.
Insider threats can be categorized below-

Malicious-
Malicious threats are attempts by an insider to
access and potentially harm an organization's data,
systems or IT infrastructure. These insider threats
are often attributed to dissatisfied employees or ex-
employees who believe that the organization
wronged them in some way, and who therefore
feel justified in seeking revenge.
Insiders may also become threats when they are
recruited by malicious outsiders, either through
financial incentives or extortion.
Accidental-
Accidental threats are threats which are accidentally caused by insider employees. In this type of
threat, an employee might accidentally delete an important file or inadvertently share
confidential data with a business partner, going beyond the company's policy or legal requirements.


Negligent-
These are the threats in which employees try to get around the policies an organization has put in
place to protect endpoints and valuable data. For example, if the organization has strict policies for
external file sharing, employees might try to share work on public cloud applications so that they
can work at home. There may be no malicious intent behind these acts, but they can open the
organization up to dangerous threats nonetheless.

For more knowledge about cyber-attacks, please check the link provided;
https://www.youtube.com/watch?v=NDcEOW8r0xc&ab_channel=RobotsNet

REFERENCES

https://www.javatpoint.com/cyber-security-introduction

https://www.javatpoint.com/history-of-cyber-security
https://www.javatpoint.com/cyber-security-goals
https://www.javatpoint.com/types-of-cyber-attacks
https://www.javatpoint.com/types-of-cyber-attackers


CHAPTER 10: LEGAL AND ETHICAL USE OF TECHNOLOGY

Objectives:
a.) Discuss the foundation of computer ethics.
b.) Identify the future concerns in computer ethics and internet
privacy issues.
c.) Discover the ten commandments of computer ethics.

Lesson 1: Introduction

Computer ethics is a part of practical philosophy


concerned with how computing professionals should make
decisions regarding professional and social conduct.
Margaret Anne Pierce, a professor in the Department of
Mathematics and Computers at Georgia Southern
University has categorized the ethical decisions related to
computer technology and usage into three primary
influences:
1. The individual's own personal code.
2. Any informal code of ethical conduct that exists in
the work place.
3. Exposure to formal codes of ethics.
Foundation

The term computer ethics was first coined by Walter Maner, a professor at Bowling
Green State University. Maner noticed ethical concerns that were brought up during his Medical
Ethics course at Old Dominion University became more complex and difficult when the use of
technology and computers became involved. The conceptual foundations of computer ethics are
investigated by information ethics, a branch of philosophical ethics promoted, among others, by
Luciano Floridi.
History
The concept of computer ethics originated in the 1940s with MIT professor Norbert
Wiener, the American mathematician and philosopher. While working on anti-aircraft artillery
during World War II, Wiener and his fellow engineers developed a system of communication
between the part of a cannon that tracked a warplane, the part that performed calculations to
estimate a trajectory, and the part responsible for firing. Wiener termed the science of such
information feedback systems, "cybernetics," and he discussed this new field with its related


ethical concerns in his 1948 book, Cybernetics. In 1950, Wiener's second book, The Human Use
of Human Beings, delved deeper into the ethical issues surrounding information technology and
laid out the basic foundations of computer ethics.
A bit later during the same year, the world's first computer crime was committed. A programmer
was able to use a bit of computer code to stop his banking account from being flagged as
overdrawn. However, there were no laws in place at that time to stop him, and as a result he was
not charged. To make sure another person did not follow suit, an ethics code for computers was
needed.
In 1973, the Association for Computing Machinery (ACM) adopted its first code of ethics. SRI
International's Donn Parker, an author on computer crimes, led the committee that developed
the code.
In 1976, medical teacher and researcher Walter Maner noticed that ethical decisions are much
harder to make when computers are added. He noticed a need for a different branch of ethics
for when it came to dealing with computers. The term "computer ethics" was thus invented.
In 1976 Joseph Weizenbaum made his second significant addition to the field of computer ethics.
He published a book titled Computer Power and Human Reason, which talked about how artificial
intelligence is good for the world; however it should never be allowed to make the most
important decisions as it does not have human qualities such as wisdom. By far the most
important point he makes in the book is the distinction between choosing and deciding. He
argued that deciding is a computational activity while making choices is not and thus the ability
to make choices is what makes us humans.
At a later time during the same year Abbe Mowshowitz, a professor of Computer Science at the
City College of New York, published an article titled "On approaches to the study of social issues
in computing." This article identified and analyzed technical and non-technical biases in research
on social issues present in computing.
During 1978, the Right to Financial Privacy Act was adopted by the United States Congress,
drastically limiting the government's ability to search bank records.
During the next year Terrell Ward Bynum, the professor of Philosophy at Southern Connecticut
State University as well as Director of the Research Center on Computing and Society there,
developed curriculum for a university course on computer ethics. Bynum was also editor of the
journal Metaphilosophy. In 1983 the journal held an essay contest on the topic of computer ethics
and published the winning essays in its best-selling 1985 special issue, “Computers and Ethics.”

In 1984, the United States Congress passed the Small Business Computer Security and Education
Act, which created a Small Business Administration advisory council to focus on computer
security related to small businesses.


In 1985, James Moor, Professor of Philosophy at Dartmouth College in New Hampshire, published
an essay called "What is Computer Ethics?" In this essay Moor states that computer ethics
includes the following: "(1) identification of computer-generated policy vacuums, (2) clarification
of conceptual muddles, (3) formulation of policies for the use of computer technology, and (4)
ethical justification of such policies."
During the same year, Deborah Johnson, Professor of Applied Ethics and Chair of the Department
of Science, Technology, and Society in the School of Engineering and Applied Sciences of the
University of Virginia, got the first major computer ethics textbook published. Johnson's textbook
identified major issues for research in computer ethics for more than 10 years after publication
of the first edition.

In 1988, Robert Hauptman, a librarian at St. Cloud University, came up with "information ethics",
a term that was used to describe the storage, production, access and dissemination of
information. Near the same time, the Computer Matching and Privacy Act was adopted and this
act restricted United States government programs identifying debtors.
In the year 1992, ACM adopted a new set of ethical rules called "ACM code of Ethics and
Professional Conduct" which consisted of 24 statements of personal responsibility.
Three years later, in 1995, Krystyna Górniak-Kocikowska, a Professor of Philosophy at Southern
Connecticut State University, Coordinator of the Religious Studies Program, as well as a Senior
Research Associate in the Research Center on Computing and Society, came up with the idea that
computer ethics will eventually become a global ethical system and soon after, computer ethics
would replace ethics altogether as it would become the standard ethics of the information age.

In 1999, Deborah Johnson revealed her view, which was quite contrary to Górniak-Kocikowska's
belief, and stated that computer ethics will not evolve but rather be our old ethics with a slight
twist.
After the 20th century, as a result of much debate over ethical guidelines, many organizations such
as ABET offer ethics-related accreditation to university or college programs such as "Applied and
Natural Science, Computing, Engineering and Engineering Technology at the associate, bachelor, and
master levels" to try and promote quality work that follows sound ethical and moral guidelines.
In 2018, The Guardian and The New York Times reported that the personal data of 87 million
Facebook users had been harvested and shared with Cambridge Analytica.
In 2019 Facebook started a fund to build an ethics center at the Technical University of Munich,
located in Germany. This was the first time that Facebook funded an academic institute for
matters regarding computer ethics.

For more knowledge about computer ethics, please check the link provided;
https://www.youtube.com/watch?v=cFszY5bTx5s


Lesson 2: Future Concerns and Internet Privacy

Future Concerns
Computer crime, privacy, anonymity, freedom, and intellectual property fall under
topics that will be present in the future of computer ethics.
➢ Ethical considerations have been linked to the Internet of Things (IoT), with many physical
devices being connected to the internet.
➢ Virtual cryptocurrencies raise questions about the balance of the purchasing relationship
between buyer and seller.
➢ Autonomous technology, such as self-driving cars, may be forced to make human-like decisions.
There is also concern over how autonomous vehicles would behave in different countries with
different cultural values.
➢ Security risks have been identified with cloud-based technology, with every user interaction
being sent to and analyzed at central computing hubs. Artificial intelligence devices like the
Amazon Alexa and Google Home collect personal data from users while at home and upload it to
the cloud. Apple's Siri and Microsoft's Cortana smartphone assistants collect user information,
analyze the information, and then send the information back to the user.
Internet privacy
Privacy is one of the major issues that has emerged since the internet has become part
of many aspects of daily life. Internet users hand over personal information in order to sign up or
register for services without realizing that they are potentially setting themselves up for invasions
of privacy.
Another example of privacy issues, with concern to Google, is tracking searches. There is a feature
within searching that allows Google to keep track of searches so that advertisements will match
your search criteria, which in turn means using people as products. Google was sued in 2018 due
to tracking user location without permission.
There is an ongoing discussion about what privacy and privacy enforcement measures imply.
With the increase in social networking sites, more and more people are allowing their private
information to be shared publicly. On the surface, this may be seen as someone listing private
information about them on a social networking site, but below the surface, it is the site that could
be sharing the information (not the individual). This is the idea of an opt-in versus opt-out
situation. There are many privacy statements that state whether there is an opt-in or an opt-out
policy. Typically an opt-in privacy policy means that the individual has to tell the company issuing
the privacy policy if they want their information shared or not. Opt-out means that their
information will be shared unless the individual tells the company not to share it.


A whole industry of privacy and ethical tools has grown over time, giving people the choice to
not share their data online. These are often open source software, which allows the users to
ensure that their data is not saved to be used without their consent.
Identifying issues
Identifying ethical issues as they arise, as well as defining how to deal with them, has traditionally
been problematic. In solving problems relating to ethical issues, Michael Davis proposed a unique
problem-solving method. In Davis's model, the ethical problem is stated, facts are checked, and
a list of options is generated by considering relevant factors relating to the problem. The actual
action taken is influenced by specific ethical standards.

For more knowledge about Future Concerns and Internet Privacy, please check the
link provided; https://www.youtube.com/watch?v=yG4JL0ZRmi4

Lesson 3: Ten Commandments of Computer Ethics


The Ten Commandments of Computer Ethics
were created in 1992 by the Washington,
D.C.-based Computer Ethics Institute. The
commandments were introduced in the paper "In
Pursuit of a 'Ten Commandments' for Computer Ethics"
by Ramon C. Barquin as a means to create "a set of
standards to guide and instruct people in
the ethical use of computers." They follow the Internet Advisory Board's memo on ethics from
1987. The Ten Commandments of Computer Ethics copies the archaic style of the Ten
Commandments from the King James Bible.
The Ten Commandments of Computer Ethics

1. Thou shalt not use a computer to harm other people.


2. Thou shalt not interfere with other people's computer work.
3. Thou shalt not snoop around in other people's computer files.
4. Thou shalt not use a computer to steal.
5. Thou shalt not use a computer to bear false witness.
6. Thou shalt not copy or use proprietary software for which you have not paid (without
permission).
7. Thou shalt not use other people's computer resources without authorization or proper
compensation.
8. Thou shalt not appropriate other people's intellectual output.


9. Thou shalt think about the social consequences of the program you are writing or the
system you are designing.
10. Thou shalt always use a computer in ways that ensure consideration and respect for other
humans.
Exegesis
• Commandment 1
Simply put: Do not use the computer in ways that may harm other people.
Explanation: This commandment says that it is unethical to use a computer to harm another user.
It is not limited to physical injury. It includes harming or corrupting other users' data or files. The
commandment states that it is wrong to use a computer to steal someone's personal information.
Manipulating or destroying files of other users is ethically wrong. It is unethical to write programs,
which on execution lead to stealing, copying or gaining unauthorized access to other users' data.
Being involved in practices like hacking, spamming, phishing or cyber bullying does not conform
to computer ethics.
• Commandment 2
Simply put: Do not use computer technology to cause interference in other users' work.

Explanation: Computer software can be used in ways that disturb other users or disrupt their
work. Viruses, for example, are programs meant to harm useful computer programs or interfere
with the normal functioning of a computer. Malicious software can disrupt the functioning of
computers in more ways than one. It may overload computer memory through excessive
consumption of computer resources, thus slowing its functioning. It may cause a computer to
function wrongly or even stop working. Using malicious software to attack a computer is
unethical.
• Commandment 3
Simply put: Do not spy on another person's computer data.
Explanation: We know it is wrong to read someone's personal letters. On the same lines, it is
wrong to read someone else's email messages or files. Obtaining data from another person's
private files is nothing less than breaking into someone's room. Snooping around in another
person's files or reading someone else's personal messages is the invasion of his privacy. There
are exceptions to this. For example, spying is necessary and cannot be called unethical when it is
done against illegitimate use of computers. For example, intelligence agencies working on
cybercrime cases need to spy on the internet activity of suspects.
• Commandment 4
Simply put: Do not use computer technology to steal information.


Explanation: Stealing sensitive information or leaking confidential information is as good as


robbery. It is wrong to acquire personal information of employees from an employee database
or patient history from a hospital database or other such information that is meant to be
confidential. Similarly, breaking into a bank account to collect information about the account or
account holder is wrong. Illegal electronic transfer of funds is a type of fraud. With the use of
technology, stealing of information is much easier. Computers can be used to store stolen
information.
• Commandment 5
Simply put: Do not contribute to the spread of misinformation using computer technology.

Explanation: Spread of information has become viral today, because of the Internet. This also
means that false news or rumors can spread speedily through social networking sites or emails.
Being involved in the circulation of incorrect information is unethical. Mails and pop-ups are
commonly used to spread the wrong information or give false alerts with the only intent of selling
products. Mails from untrusted sources advertising certain products or spreading some hard-to-
believe information, are not uncommon. Direct or indirect involvement in the circulation of false
information is ethically wrong. Giving wrong information can hurt other parties or organizations
that are affected by that particular theme.
• Commandment 6
Simply put: Refrain from copying software or buying pirated copies. Pay for software unless it is
free.
Explanation: Like any other artistic or literary work, software is copyrighted. A piece of code is
the original work of the individual who created it. It is copyrighted in his/her name. In case of a
developer writing software for the organization she works for, the organization holds the
copyright for it. Copyright holds true unless its creators announce it is not. Obtaining illegal copies
of copyrighted software is unethical and also encourages others to make copies illegally.
• Commandment 7
Simply put: Do not use someone else's computer resources unless authorized to.

Explanation: Multi-user systems have user specific passwords. Breaking into some other user's
password, thus intruding his/her private space is unethical. It is not ethical to hack passwords for
gaining unauthorized access to a password-protected computer system. Accessing data that you
are not authorized to access or gaining access to another user's computer without her permission
is not ethical.
• Commandment 8


Simply put: It is wrong to claim ownership on a work which is the output of someone else's
intellect.
Explanation: Programs developed by a software developer are the developer's property. If the
developer is working with an organization, they are the organization's property. Copying them and
propagating them in one's own name is unethical. This applies to any creative work, program or
design. Establishing ownership of a work which is not yours is ethically wrong.
• Commandment 9
Simply put: Before developing a software, think about the social impact it can have.

Explanation: Looking at the social consequences that a program can have, describes a broader
perspective of looking at technology. A piece of computer software, on release, can reach millions. Software
like video games and animations or educational software can have a social impact on their users.
When working on animation films or designing video games, for example, it is the programmer's
responsibility to understand his target audience/users and the effect it may have on them. For
example, a computer game for kids should not have content that can influence them negatively.
Similarly, writing malicious software is ethically wrong. A software developer/development firm
should consider the influence their code can have on the society at large.
• Commandment 10
Simply put: In using computers for communication, be respectful and courteous with the fellow
members.

Explanation: The communication etiquette we follow in the real world applies to communication
over computers as well. While communicating over the Internet, one should treat others with
respect. One should not intrude on others' private space, use abusive language, make false
statements or pass irresponsible remarks about others. One should be courteous while
communicating over the web and should respect others' time and resources. Also, one should be
considerate with a novice computer user.

For more knowledge about ten commandments of computer ethics, please check the
link provided; https://www.youtube.com/watch?v=fVYH-O_Il0g

REFERENCES

https://en.wikipedia.org/wiki/Computer_ethics
https://en.wikipedia.org/wiki/Ten_Commandments_of_Computer_Ethics
