
MTTM-04/2021

3. Discuss the different generations of computers, mentioning the key features of each generation.

Computers are now an integral part of everyday life, especially for the generation that has grown up amid the global desktop and laptop revolution since the 1980s.

The history of the computer goes back several decades, however, and there are five definable generations of computers.

Each generation is defined by a significant technological development that changes fundamentally how
computers operate – leading to more compact, less expensive, but more powerful, efficient and robust
machines.

1940 – 1956: First Generation – Vacuum Tubes

These early computers used vacuum tubes as circuitry and magnetic drums for memory. As a result they
were enormous, literally taking up entire rooms and costing a fortune to run. Vacuum tubes were
inefficient, drew huge amounts of electricity and generated a great deal of heat, which caused ongoing
breakdowns.

These first generation computers relied on ‘machine language’ (the most basic programming language,
which can be understood directly by computers). They were limited to solving one problem at a time.
Input was based on punched cards and paper tape, and output came out on print-outs. The two notable
machines of this era were the UNIVAC and ENIAC machines – the UNIVAC was the first ever commercial
computer, purchased in 1951 by the US Census Bureau.

1956 – 1963: Second Generation – Transistors

The replacement of vacuum tubes by transistors saw the advent of the second generation of computing.
Although first invented in 1947, transistors weren’t used significantly in computers until the end of the
1950s. Despite still subjecting computers to damaging levels of heat, they were hugely superior to vacuum
tubes, making computers smaller, faster, cheaper and far less power-hungry. Input and output still relied
on punched cards and print-outs.

The language evolved from cryptic binary code to symbolic (‘assembly’) languages, which meant
programmers could create instructions in words. Around the same time, high-level programming languages
were being developed (early versions of COBOL and FORTRAN). Transistor-driven machines were also the
first computers to store instructions in their own memory, moving from magnetic drum to magnetic core
technology. The early versions of these machines were developed for the atomic energy industry.
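
To make this evolution concrete, here is a brief, hedged sketch (not part of the original answer) of the same operation – adding two numbers – at the levels of language described above. The binary and assembly lines are hypothetical illustrations of their styles, not any real machine’s instruction set:

```python
# Illustrative sketch only - the machine-code and assembly lines below are
# hypothetical examples of their styles, not a real instruction set.
#
# First generation - machine language (raw, machine-specific binary):
#   00000010 00000001 00000011    # an 'add' opcode followed by operand bits
#
# Second generation - symbolic (assembly) language, instructions in words:
#   ADD R1, R2                    # add the contents of two registers
#
# High-level languages (early COBOL and FORTRAN onwards) let the
# programmer state the intent directly:
result = 1 + 2
print(result)  # -> 3
```
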
1964 – 1971: Third Generation – Integrated Circuits

By this phase, transistors were being miniaturised and put on silicon chips (semiconductors) to form
integrated circuits. This led to a massive increase in the speed and efficiency of these machines. These
were the first computers where users interacted using keyboards and monitors which interfaced with an
operating system – a significant leap up from punched cards and print-outs. This enabled the machines to
run several applications at once, under a central program which monitored memory.

As a result of these advances which again made machines cheaper and smaller, a new mass market of
users emerged during the ‘60s.

1972 – 2010: Fourth Generation – Microprocessors

This revolution can be summed up in one word: Intel. The chip-maker developed the Intel 4004 chip in 1971,
which positioned all the computer’s components (CPU, memory, input/output controls) onto a single chip.
What filled a room in the 1940s now fitted in the palm of the hand. The Intel chip housed thousands of
integrated circuits. The year 1981 saw the first IBM computer specifically designed for home use, and 1984
saw the Macintosh introduced by Apple. Microprocessors even moved beyond the realm of computers and
into an increasing number of everyday products.

The increased power of these small computers meant they could be linked together to create networks,
which ultimately led to the birth and rapid evolution of the Internet. Other major advances during this
period include the graphical user interface (GUI), the mouse and, more recently, astounding advances in
laptop capability and hand-held devices.

2010 – present: Fifth Generation – Artificial Intelligence

Computer devices with artificial intelligence are still in development, but some of these technologies are
beginning to emerge and be used such as voice recognition.

AI is being made possible by the use of parallel processing and superconductors. Looking to the future,
computers will be radically transformed again by quantum computation, molecular technology and
nanotechnology.

The essence of the fifth generation will be using these technologies to ultimately create machines which
can process and respond to natural language, and which have the capability to learn and organise
themselves.

6. Define Systems Analysis. Explain the System Life Cycle. 20

Systems development is a systematic process which includes phases such as planning, analysis, design,
deployment, and maintenance. Here, we will primarily focus on:

 Systems analysis
 Systems design

Systems Analysis

It is the process of collecting and interpreting facts, identifying problems, and decomposing a system into
its components.
System analysis is conducted for the purpose of studying a system or its parts in order to identify its
objectives. It is a problem-solving technique that improves the system and ensures that all the
components of the system work efficiently to accomplish their purpose.
Analysis specifies what the system should do.

Systems Design

It is the process of planning a new business system, or replacing an existing system, by defining its
components or modules to satisfy specific requirements. Before planning, you need to understand the old
system thoroughly and determine how computers can best be used to make it operate efficiently.
System design focuses on how to accomplish the objective of the system.
System Analysis and Design (SAD) mainly focuses on:

 Systems
 Processes
 Technology

The System Life Cycle


The system life cycle is a series of stages that are worked through during the development of a new
information system.
A lot of time and money can be wasted if a system is developed that doesn’t work properly or doesn’t do
exactly what is required of it.
A new system is much more likely to be successful if it is carefully planned and developed.
Feasibility study

The first stage of the system life cycle.

This is an investigation carried out by a systems analyst to find out what the main problems are with the
existing system, and whether it is technically possible and cost-effective to solve these problems by
developing a computer-based solution.
Feasibility report contents

 A description of the existing system outlining what is being done and how it is being done;
 A set of problem statements describing exactly what the problems are with the existing system;
 A set of system objectives which describe what the new system must be able to do;
 A description of some alternative solutions;
 A description of the technical, economic, legal and social factors that have been considered;
 A recommended course of action.

Analysis
During the analysis stage, systems analysts investigate the existing system to identify exactly what its
problems are.
Systems analysts will use a variety of fact-finding methods to gather information, for example:

 Questionnaires
 Interviews
 Observation
 Examining documents

Design

 Alternative possible solutions are identified
 Alternative solutions are evaluated
 The best solution is identified

A design specification is produced containing information about:

 Input
 Output
 Data storage
 User interface
 Backup and recovery procedures
 Security procedures

Test plan

A test plan is also drawn up at this stage. A typical test plan sets out, for each test, the test number, the
data to be entered, the type of test (normal, extreme or erroneous), the expected result and the actual
result.


Implementation

This stage involves:

 Setting up the system so that it matches the design specification
 Testing carried out using the test plan to make sure that all the parts of the system work correctly
with normal, extreme and erroneous data

 Normal test data is used to check that a system can handle the sort of data that would be expected
during day-to-day use
 Extreme test data is used to check that a system can cope with data that lies on the boundaries of
what is acceptable
 Erroneous (or exceptional) test data is used to check that a system can identify data that is wrong
and reject it

Testing using normal, extreme and erroneous data
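
The idea can be illustrated with a short, hedged sketch (not part of the original answer): a validation routine for a hypothetical exam-mark field, exercised with all three types of test data.

```python
# A minimal sketch, assuming a hypothetical field that accepts whole-number
# marks from 0 to 100. It shows how normal, extreme and erroneous test data
# each exercise the validation rule.

def validate_mark(value):
    """Accept a mark only if it is an integer from 0 to 100."""
    if not isinstance(value, int) or isinstance(value, bool):
        return False              # erroneous data: wrong type is rejected
    return 0 <= value <= 100      # extreme data sits exactly on these boundaries

assert validate_mark(67) is True      # normal data: expected day-to-day value
assert validate_mark(0) is True       # extreme data: lower boundary
assert validate_mark(100) is True     # extreme data: upper boundary
assert validate_mark(-5) is False     # erroneous data: out of range
assert validate_mark("abc") is False  # erroneous data: wrong type
print("all tests passed")
```
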


Installing the new system
Might include:

 Installing any new hardware and software;
 Transferring data from the existing system to the new one;
 Training users how to operate the new system.

Producing documentation
Technical documentation

 the system design specification;
 systems flowcharts;
 data flow diagrams;
 a description of the various parts of the system and what each one does;
 screen layouts and user interface designs;
 the test plan.

User documentation

 a description of what the system is designed to do;
 minimum hardware and software requirements of the system;
 instructions on how to load and run the system;
 detailed instructions on how to operate each part of the system;
 error messages, their meaning and how to deal with them;
 where to get more help, such as telephone support lines and on-line tutorials.

Post-implementation review
Carried out after the new system has been running for a few weeks or months to identify any
modifications that may need to be made.
Maintenance

A new information system may need to be changed due to:

 Changes in the needs of the users
 Problems not found during testing
 Improvements required in the way the system works

7. Describe the importance and role of computers in the Management Decision-Making Process. 20
Information plays a vital role in decision-making. Even to take very simple decisions, we need information.
To understand the role played by information in decision-making, we have to understand how decisions
are taken. Decision-making is basically a process that includes the following stages, each of which depends
on information:

 Identification and structuring of the problem/opportunity – One needs information to identify a
problem and put it in a structured manner. Without information about a problem or opportunity, the
decision-making process does not even start.
 Putting the problem/opportunity in context – Without information about the context in which the
problem has occurred, one cannot take any decision on it. In a way, the information about the context
defines the problem.
 Generation of alternatives – Information is a key ingredient in the generation of alternatives for
decision-making. One has to have information about possible solutions to generate alternatives.
 Choice of the best alternative – Based on the information about the suitability of the alternatives, a
choice is made to select the best alternative.

Information is thus very important for taking decisions.
Imagine a simple decision like the one a driver makes when he puts on the brakes to stop a speeding
vehicle on seeing a child crossing the road. The driver decides on braking based on a great deal of
information processing that happens in his brain; at every stage of the decision-making he uses
information that he captures visually. All decisions are like this.
First we get information about a problem and format it into a structure, and then we factor in information
about the context in which the problem has occurred. In the above case, had the driver seen that the child
was about to complete the crossing with only a few steps left, he would probably not have braked to a stop
but only slowed down, calculating that by the time the vehicle reached the crossing point, the child would
already have passed. So if the problem was structured as ‘how not to hit the child crossing the road?’, then
with the child in the middle of the road the driver would have braked, but with the child at (say) ninety per
cent completion of the crossing, the driver would only have slowed down and not braked to a stop.
Therefore, we see that the context has a major role in decision-making, and information is required both
about the problem and about the context in which the problem occurred. The next stage for the
decision-maker is to generate alternatives. In the driver’s case, such possible alternatives would be:

a. to stop by braking
b. to slow down
c. to take a sharp turn towards the left or right to avoid the child
d. to press the horn so that the child crosses the road fast
e. to drive the vehicle onto the footpath and off the road to avoid collision, etc.

So the decision-maker generates these possible solutions to the problem at hand based on information
about such possible solutions. Each of the alternatives represents a possible solution, which one can
generate only if one has information about it. In the case of the driver, obviously, he needs knowledge and
information to generate these alternatives; for example, to stop by braking, the driver would need to know
that braking stops the vehicle. If he were unaware of this crucial information he would not have been able
to generate this alternative. So information is vital for the generation of alternatives. For the choice part
also, the decision-maker needs information about the suitability of each alternative in order to decide
which is the ‘best’. In our example, the driver calculates the ‘payoff’ for each alternative based on his
calculation of the outcome, which again is based on information, and selects the ‘best’ option that solves
the problem. Thus, we can see that information is the key to the decision-making process; without
information, and the right kind of information, decision-making is not possible. Information plays a crucial
role in every stage of the decision-making process.
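
As a hedged illustration (not part of the original answer), the ‘choice’ stage can be sketched in a few lines of code: the decision-maker can rank alternatives only because payoff information about each one exists. The alternatives and payoff values below are hypothetical.

```python
# A minimal sketch of the 'choice of best alternative' stage. Each payoff is
# a hypothetical score the decision-maker assigns using information about
# the likely outcome of that alternative.
alternatives = {
    "brake to stop": 0.9,
    "slow down": 0.7,
    "swerve left or right": 0.3,
    "sound the horn": 0.4,
}

# Without this payoff information, no ranking - and hence no choice - is possible.
best = max(alternatives, key=alternatives.get)
print(best)  # -> brake to stop
```
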
Decision-making is the most important task of managers in an organization. Therefore, to enable
managers to take good-quality decisions, it is very important to provide them with the right kind of
information. Information management in organizations therefore assumes a special significance. In most
organizations, business or otherwise, a systematic, systems-based method is used for information
management. Systems-based information management works best in a computerized environment, and
such a computer-based information management system is normally called a ‘Management Information
System (MIS)’, which supplies information to managers, enabling them to take informed decisions. It is
worth mentioning here that an MIS does not necessitate the use of computer-based technology, but the
use of computers and information technology makes an MIS suitable for business organizations in a
competitive environment, as it helps to provide timely and accurate information. An MIS run manually,
without the help of computers, can rarely be either timely or accurate.
8. Define Hacking. Discuss the precautions that are to be adopted against Hacking. 20
Hacking is an attempt to exploit a computer system or a private network inside a computer. Simply put, it
is the unauthorised access to or control over computer network security systems for some illicit purpose.

To better describe hacking, one needs first to understand hackers. One can easily assume them to be
intelligent and highly skilled in computers. In fact, breaking a security system requires more intelligence
and expertise than actually creating one. There are no hard and fast rules whereby we can categorize
hackers into neat compartments. However, in general computer parlance, we call them white hats, black
hats and grey hats. White hat professionals hack to check their own security systems to make them more
hack-proof; in most cases, they are part of the same organisation. Black hat hackers hack to take control
over a system for personal gain. They can destroy or steal data, or even prevent authorized users from
accessing the system. They do this by finding loopholes and weaknesses in the system. Some computer
experts call them crackers instead of hackers. Grey hat hackers comprise curious people who have just
about enough computer language skills to enable them to hack a system and locate potential loopholes in
the network security system. Grey hats differ from black hats in the sense that the former notify the admin
of the network system about the weaknesses discovered in the system, whereas the latter look only for
personal gain. All kinds of hacking are considered illegal barring the work done by white hat hackers.
Ten precautions that can be adopted against hacking:

1. Update your OS and other software frequently, if not automatically. This keeps hackers from
accessing your computer through vulnerabilities in outdated programs (which can be exploited by
malware). For extra protection, enable Microsoft product updates so that the Office Suite will be
updated at the same time. Consider retiring particularly susceptible software such as Java or
Flash, especially as many sites and services continue to move away from them.

2. Download up-to-date security programs, including anti-malware software with multiple
technologies for protecting against spyware, ransomware, and exploits, as well as a firewall, if your
OS didn’t come pre-packaged with it. (You’ll want to check if your OS has both firewall and antivirus
built in and enabled by default, and whether those programs are compatible with additional
cybersecurity software.)

3. Destroy all traces of your personal info on hardware you plan on selling. Consider using a
disk-wiping utility such as DBAN to erase your hard drive; this makes the information much more
difficult to recover for anyone looking to pillage your recycled devices. If the information you’d like
to protect is critical enough, removing the platters where the information is stored and then
destroying them is the way to go.

4. Do not use open Wi-Fi on your router; it makes it too easy for threat actors to steal your
connection and download illegal files. Protect your Wi-Fi with an encrypted password, and consider
refreshing your equipment every few years. Some routers have vulnerabilities that are never
patched. Newer routers allow you to provide guests with segregated wireless access. Plus, they
make frequent password changes easier.

5. Speaking of passwords: password protect all of your devices, including your desktop, laptop,
phone, smartwatch, tablet, camera, lawnmower…you get the idea. The ubiquity of mobile devices
makes them especially vulnerable. Lock your phone and make the timeout fairly short. Use
fingerprint lock for the iPhone and passkey or swipe for Android. “It’s easy to forget that mobile
devices are essentially small computers that just happen to fit in your pocket and can be used as a
phone,” says Jean-Philippe Taggart, Senior Security Researcher at Malwarebytes. “Your mobile
device contains a veritable treasure trove of personal information and, once unlocked, can lead to
devastating consequences.”

6. Sensing a pattern here? Create difficult passwords, and never use the same ones across multiple
services. If that’s as painful as a stake to a vampire’s heart, use a password manager like LastPass or
1Password. For extra hacker protection, ask about two-step authentication. Several services have
only recently started to offer 2FA, and they require the user to initiate the process. Trust us, the
extra friction is worth it. Two-factor authentication makes taking over an account that much more
difficult, and on the flip side, much easier to reclaim should the worst happen.
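
As a small, hedged illustration (not from the original text), a strong, unique password can be generated with a few lines of standard-library Python rather than invented by hand:

```python
# A minimal sketch using Python's standard 'secrets' module, which is
# designed for cryptographically strong random choices.
import secrets
import string

def generate_password(length=16):
    """Return a random password of letters, digits and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # different every run; keep it in a password manager
```
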

7. Come up with creative answers for your security questions. People can now figure out your
mother’s maiden name or where you graduated from high school with a simple Google search.
Consider answering like a crazy person. If Bank of America asks, “What was the name of your first
boyfriend/girlfriend?” reply, “Your mom.” Just don’t forget that’s how you answered when they ask
you again.

8. Practice smart emailing. Phishing campaigns still exist, but cybercriminals have become much
cleverer than that Nigerian prince who needs your money. Hover over links to see their actual URLs
(as opposed to just seeing words in hyperlink text). Also, check to see if the email is really from the
person or company claiming to have sent it. If you’re not sure, pay attention to awkward sentence
construction and formatting. If something still seems fishy, do a quick search on the Internet for the
subject line. Others may have been scammed and posted about it online.
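
The ‘hover over links’ advice boils down to comparing the domain the link text displays with the domain the link actually targets. Here is a hedged sketch of that check (the function name and example URLs are hypothetical; requires Python 3.9+ for str.removeprefix):

```python
# A minimal sketch: flag a link whose visible text names one domain while
# its real target (the href) points somewhere else.
import re
from urllib.parse import urlparse

def looks_like_phish(display_text, href):
    """Return True if the visible text shows a domain that differs from the target."""
    match = re.search(r"(?:[\w-]+\.)+[a-z]{2,}", display_text.lower())
    if not match:
        return False  # the visible text names no domain, so nothing to compare
    shown = match.group(0).removeprefix("www.")
    actual = urlparse(href).netloc.lower().removeprefix("www.")
    return shown != actual

print(looks_like_phish("www.mybank.com", "https://evil.example.net/login"))  # True
print(looks_like_phish("Click here", "https://www.mybank.com/login"))        # False
```
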

9. Some websites will ask you to sign in with a specific service to access features or post a comment.
Ensure the login option isn’t a sneaky phish, and if you’re giving permission to an app to perform a
task, ensure you know how to revoke access once you no longer need it. Old, abandoned
connections from service to service are an easy way to see your main account compromised by
spam.

10. Keep sensitive data off the cloud. “No matter which way you cut it, data stored on the cloud
doesn’t belong to you,” says Taggart. “There are very few cloud storage solutions that offer
encryption for ‘data at rest.’ Use the cloud accordingly. If it’s important, don’t.”

9. Critically analyse the impact of computerisation on society. 20

Computing technologies, like most other forms of technology, are not socially neutral. They affect and are
themselves affected by society. Computers have changed the way people relate to one another and their
living environment, as well as how humans organize their work, their communities, and their time. Society,
in turn, has influenced the development of computers through the needs people have for processing
information. The study of these relationships has come to be known as "social informatics."

Computing technology has evolved as a means of solving specific problems in human society. The earliest
kinds of computational devices were the mechanical calculators developed by Blaise Pascal (1623–1662) in
1645 and Gottfried Leibniz (1646–1716) in 1694 for solving the navigational and scientific problems that
began to arise as Europe entered a new and heightened period of scientific development and international
commerce. In 1801 Joseph-Marie Jacquard (1752–1834) invented perhaps the first type of programmed
machine, called the Jacquard loom, in order to automate the weaving of cloth with patterns. Jacquard was
motivated by the desire of capitalists in the early Industrial Age to reduce the cost of producing their
goods through mass production in factories.

The twentieth century saw the development of scientific research and engineering applications that
required increasingly complex computations. Urgent military needs created by World War II spurred the
development of the first electronic computers; the devices in use today are the descendants of these
room-sized early efforts to streamline military planning and calculation. The needs and desires of society
have subsequently influenced the development of a vast array of computing technologies,
including supercomputers, graphics processors, games, digital video and audio, mobile computing
devices, and telephones.

In the twenty-first century, computers are used in almost every facet of society, including (but not limited
to) agriculture, architecture, art, commerce and global trade, communication, education, governance, law,
music, politics, science, transportation, and writing. In general, computing technologies have been applied
to almost every situation falling into one of two categories. The first category covers applications that
require the organization, storage, and retrieval of large amounts of information such as library catalogs or
bank records. The second category includes applications that require the coordination of complex
processes, like the control of machinery involved in the manufacture of cars or the printing of books and
newspapers.

Impact of Computers on Work

One of the ways that computers have made an impact on society is in how people have organized
themselves in workplace groups in relationship to computers. The earliest computers were developed to
perform specific tasks in science, engineering, or warfare that had previously been done by hand. Soon
general-purpose computers could automate almost any information processing task required to manage
an organization, such as payroll processing and record management. However, since early generation
computers were relatively expensive, all of an organization's information processing tasks were typically
centralized around the one large computer it could afford. Departments and people in such organizations
would likewise be organized in a centralized fashion to facilitate their access to the computer. Companies
with centralized information processing, for example, usually had most of their administrative offices in
the same geographic location as their computer resources.

Subsequent developments in computing technology changed the way companies organized people who
perform similar tasks. The advent of computer networking and lower cost minicomputers enabled entire
organizations that were once centralized around a single computer to rearrange themselves into
geographically dispersed divisions. The integration of telecommunications with computing allowed people
in remote places such as branch offices to use computers located in distant parts of their organization. This
decentralization continued with the advent of the personal computer. PCs provided a low-cost way for
large organizations to transform themselves further by redistributing information processing
responsibilities to small departments and individuals in many locations.

Not only have computers changed the way in which workplaces structure their tasks and workers, they
have also dramatically changed the work itself. Computer-aided manufacturing (CAM) was first introduced
in the 1950s with numerically controlled machines. These and other forms of computer-based automation
have been associated with the loss of jobs and certain skills, and the need to master new skills. Since the
middle of the twentieth century, computer-controlled devices have gradually eliminated certain types of
jobs and the need for people to perform particular skills. As a consequence, workers have had to learn new
skills in order to continue working in environments that increasingly depend on computers.

One major result has been the shift of some economies, such as that of the United States, from
manufacturing to service jobs. Entirely new categories of jobs have been created to support and
implement computer technology. In addition, the ease of networking computers has led businesses to
relocate jobs to remote locations. For example, a number of companies now hire computer programmers
who are located in other countries, such as India, in order to save on labor costs. Within the United States,
increasing numbers of companies allow employees to work from their homes or work centers away from
the corporate headquarters. These so-called telecommuters are able to communicate with their
employers and deliver their work using the Internet.

The advent of e-mail, the World Wide Web, and other Internet technologies has perhaps made the most
significant impact on the social fabric of American society. People can now communicate with others in
remote places, easily, affordably, and often anonymously. They can search for, share, and transfer more
information, and more quickly, than ever before. People distributed across remote locations can organize
themselves into "virtual communities" based on shared interests, regardless of their geographic locations.
The Internet has also changed the way both education and entertainment can be delivered into private
homes and public spaces.

Effects of the Computer Age

Psychologists have long been interested in observing and analyzing the way humans interact with
computers. Research in human-computer interaction has studied how people read and process
information presented to them on computer screens, the types of input errors people are most likely to
make when using different computer systems, and the effectiveness of various kinds of input devices such
as keyboards, mice, and light pens. Psychological issues have also been identified in how people behave
toward other people when they use computing technologies such as e-mail and how they behave toward
computers. Studies have shown, for example, that people use the anonymity that e-mail and other
Internet technologies afford to construct alternate identities for themselves. Other studies indicate that
people often apply the same rules of social behavior, such as politeness, toward computers as they would
to other people.

The impact of computers on lifestyles has largely paralleled the impact of computing on social
organization, work, and personal communication. The effect has become more pronounced as personal
computing devices become increasingly more commonplace in American society. In particular, computers
coupled with telecommunications technologies enable many people to live and work more independently
and remotely than ever before. Individuals using personal computers can publish books, make airline
reservations, and hold meetings to share information with any number of people across the globe. Some
observers view these developments positively, while others are concerned that the widespread use of
computers has led to lifestyles that contain increasing amounts of work.
